Artificial Intelligence’s Future Depends on Software-Defined Network IT


I’d like to imagine that one day my grandkids will read about artificial intelligence (AI) in their digital history books, or via cranial implants, or whatever the future equivalent is. Just a few chapters after Newton’s apple or Apollo 13, they’d pore over details of AI’s early days. Future generations will likely look back at these early efforts with a wiser, aged perspective. Because, let’s not kid ourselves: making AI commonplace will take a lot of work. And nobody will feel that pressure more than IT’s back-end folks.

First, consider the scale of this new era, which, by the way, is upon us now. Gartner has ranked AI among 2017’s top strategic trends. In everything from self-driving cars to virtual assistants, AI will play an increasingly important role. But it’s not only consumers who will be affected. Gartner also predicts that within three to five years, 50% of all analytics will be AI-powered. Any decision that requires business intelligence can benefit, and that covers plenty of enterprise use cases. Just look at customer service, where businesses can use natural language processing to facilitate better interactions and analyze user patterns to build richer customer profiles. There’s also talk of enterprise resource planning (ERP) solutions, the most legacy of applications, being revolutionized and made more efficient. And while I believe near-term AI efforts will focus on ‘augmentation’ rather than outright ‘replacement’ of business decisions, it’s critical for businesses to look beyond the near term.

These piecemeal examples are fascinating, but they hardly scratch the surface. We’ve got to expand our horizons to paint a complete picture of the future. With so many ways to apply AI, we’ll eventually move toward an all-encompassing intelligent digital mesh. Think of it along these lines: every touch-point humans have with technology can be used to gather data. Connect all of those dots and tremendous insights will emerge. We’re talking about reorganizing 21st-century life in a major way. Some experts even believe the growth of AI could mirror Moore’s Law; I believe it may even eclipse it in a multi-factor, exponential way as technological advances accelerate.

Now, business leaders and software developers may find this exciting, but there is an obvious reality check. Those more reserved about technological progress may find it scary (without going deep into the details, I’ll note that those fears may be misplaced: AI is a tool humans will remain in charge of, at least for now). But for network architects, all of this will simply seem daunting. How will networks handle such massive quantities of data?

Here, perhaps, history can offer us a lesson. Society has faced significant restructuring before, and Paris’ renovation in 1853 stands out as a striking example. That year, Georges-Eugène Haussmann was commissioned by Napoleon III to re-architect the city. Previously, Paris had been a sprawling medieval expanse whose lack of coherent design fostered overcrowding, disease, and crime. Haussmann regulated districts in an orderly fashion, created grand avenues and boulevards, and laid the foundations of the modern city as we know it. He built forward-thinking infrastructure for growth.

The renovation of Paris can be likened to today’s expanding network architecture and infrastructure. With huge quantities of data to be uploaded to the cloud and elsewhere for AI, IT faces a similar overcrowding problem. Sure, providers are enhancing their infrastructure for AI. Google’s cloud has AI capabilities built in, IBM has released APIs for AI, and Amazon is committed to AI as well. Their new features are valuable, forward-looking, and critical, speeding up AI for many different business cases. But there’s one lesson to take from Haussmann’s renovation of Paris: solving overcrowding may require an entirely new architecture. Otherwise, how will any network admin or DevOps group cope with the sheer scale of data for AI?

Luckily, a new wave of solutions addresses precisely this problem. It starts from the belief that enterprises should have fast, agile, and cloud-connected networks. Let’s face it: without optimizing flows of traffic, AI will only clog up the digital veins and arteries of enterprises. Unless data can be pulled in from anywhere and quickly integrated, companies won’t be able to take advantage of the digital mesh I mentioned earlier, and the APIs, cloud services, and on-premises business data that feed AI won’t work together holistically.
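To make “optimizing flows of traffic” concrete, here is a minimal sketch of the kind of path-selection logic a software-defined network might apply to a data-hungry AI flow. The link names, metrics, and weights are illustrative assumptions, not taken from any real product:

```python
# Hypothetical sketch: scoring WAN links to steer a bandwidth-heavy AI data
# flow. Link names, metric values, and weights are invented for illustration.

def score_link(link, latency_weight=0.6, bandwidth_weight=0.4):
    """Lower latency and more free bandwidth yield a higher score."""
    latency_score = 1.0 / (1.0 + link["latency_ms"])   # lower latency -> closer to 1
    bandwidth_score = link["free_mbps"] / 1000.0       # normalize against 1 Gbps
    return latency_weight * latency_score + bandwidth_weight * bandwidth_score

def pick_path(links):
    """Select the best available link for the flow."""
    return max(links, key=score_link)

links = [
    {"name": "mpls",      "latency_ms": 20, "free_mbps": 100},
    {"name": "broadband", "latency_ms": 35, "free_mbps": 600},
    {"name": "lte",       "latency_ms": 60, "free_mbps": 50},
]

best = pick_path(links)
print(best["name"])  # broadband wins: its free capacity outweighs its latency
```

In a real controller this decision would be re-evaluated continuously from live telemetry rather than from a static table, but the principle is the same: policy-driven software, not manual routing, decides where the data goes.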

The good news is you don’t have to burn your hardware. Unlike re-architecting a city, software can virtualize equipment for greater control and order. For example, imagine centrally managing your network flows and provisioning new sites without typing a line of code; SD-WAN was built for exactly this holistic yet agile approach. Need to make sure remote branches communicate with the cloud that’s processing artificial intelligence algorithms? Use a software-defined edge. And make sure your AI insights are easily available and quickly analyzed, with all the necessary data, by taking advantage of end-to-end network monitoring. Even if you must rely on external providers, monitoring cloud and SaaS products can ensure AI functions work the way you want: fast and effectively.
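The centralized, declarative style described above can be sketched as follows. This builds the kind of site configuration an admin might push to an SD-WAN controller; the endpoint, field names, and policy schema are all assumptions for illustration, as real controllers each have their own APIs:

```python
import json

# Hypothetical sketch of centrally provisioning a branch through an SD-WAN
# controller's REST API. The URL, field names, and policy schema below are
# assumed for illustration; any real controller's API will differ.

CONTROLLER_URL = "https://sdwan-controller.example.com/api/v1/sites"  # assumed

def build_site_config(site_name, cloud_region):
    """Build a declarative site config: the controller, not the admin,
    translates this intent into device-level settings on the branch edge."""
    return {
        "site": site_name,
        "edges": [{"model": "virtual", "ha": True}],
        "policies": [
            {   # Steer AI/analytics traffic toward the cloud region
                # running the models, over the best-quality link.
                "match": {"app_category": "ai-analytics"},
                "action": {"forward_to": f"cloud:{cloud_region}",
                           "path_selection": "best-quality"},
            },
            {   # Everything else takes the default internet path.
                "match": {"app_category": "default"},
                "action": {"forward_to": "internet",
                           "path_selection": "load-balance"},
            },
        ],
        # End-to-end monitoring: export telemetry so AI-related flows
        # can be observed and troubleshot centrally.
        "monitoring": {"telemetry": "enabled", "export_interval_s": 60},
    }

config = build_site_config("branch-042", "us-east1")
print(json.dumps(config, indent=2))
# In practice this payload would be POSTed to CONTROLLER_URL with an
# authenticated HTTP client; here we only build and inspect the intent.
```

The point of the sketch is the shape of the workflow: the admin declares intent once, centrally, and the software-defined layer handles the per-device plumbing at every branch.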

We’re ultimately talking about a new networking approach: software definition is the key to unlocking infrastructure that supports AI. That is what will be written in (or uploaded to) the history books. Ultimately, when my grandkids learn about the rise of AI, there will surely be breathtaking passages about human advancements we can’t even imagine now. But IT infrastructure will also be featured, and for me, that’s the most exciting part.
