IT Trends—What to Expect in 2015
In late 2014, I took a moment to think about a few trends enterprise IT should expect to see for 2015. I’d like to explore these trends a little further.
Network functions virtualization takes off, even without SDN
2015 will see continued development of SDN technologies, and buyer confusion will not abate as the incumbent switch and router vendors jockey for position. But NFV, already widely deployed by service providers, will make its way into ‘classical’ enterprise networks without the need for an SDN refresh (which, curiously, may itself require new hardware). Virtualized network functions allow organizations to provision networks dynamically, on demand, wherever they’re needed, independent of any underlying fabric.
While SDN continues to attract much attention, actual enterprise deployment remains low. Like ‘cloud,’ ‘SDN’ has been defined and redefined in various ways, often to advance particular agendas, leaving potential buyers struggling to justify the costs and disruptions that an SDN deployment typically incurs. Fortunately, the path toward NFV is much clearer. Modern general-purpose CPUs deliver performance equivalent to or better than custom hardware, and software can now replace nearly every ‘middle box’ that sits somewhere in a network. Originally of interest mostly to service providers, NFV allows enterprises to separate network functions from location, ‘projecting’ applications and data wherever and whenever needed.
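To make the ‘middle box in software’ idea concrete, here is a minimal sketch of a virtualized network function: a TCP proxy written in plain Python, the kind of forwarding element that once required dedicated hardware. Ports, host names, and the absence of any policy logic are illustrative assumptions, not a reference to any specific NFV product.

```python
import socket
import threading

def pipe(src, dst):
    """Copy bytes one way until the connection closes."""
    while True:
        data = src.recv(4096)
        if not data:
            break
        dst.sendall(data)
    src.close()
    dst.close()

def run_proxy(listen_port, backend_host, backend_port):
    """Accept clients and relay traffic to a backend, entirely in software."""
    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    listener.bind(("0.0.0.0", listen_port))
    listener.listen(5)
    while True:
        client, _ = listener.accept()
        backend = socket.create_connection((backend_host, backend_port))
        # One thread per direction; a production VNF would add policy,
        # inspection, logging, and failure handling at this point.
        threading.Thread(target=pipe, args=(client, backend), daemon=True).start()
        threading.Thread(target=pipe, args=(backend, client), daemon=True).start()
```

Because the function is just a process, it can be instantiated wherever the fabric needs it, which is the property NFV trades on.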
Data breaches grow larger and more frequent
Unfortunately, the relentless pace of data breaches in 2014 will continue in 2015. Traditional security tactics, such as relying on ‘hardened’ perimeters and rigid mobile device management, will do little to slow down the bad guys. Enterprises should shift investments and spend more on detection and response. Visibility across all applications, networks, and devices is the first critical step toward improving overall security posture. Establishing a baseline of what’s ‘normal’ helps to better isolate actual threats and respond accordingly.
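The baselining idea above can be sketched in a few lines: learn what ‘normal’ looks like for a metric, then flag observations that deviate sharply. The three-sigma threshold and the notion of a flat sample list are illustrative assumptions; real detection pipelines use far richer models.

```python
from statistics import mean, stdev

def find_anomalies(samples, threshold=3.0):
    """Flag samples more than `threshold` standard deviations from the mean.

    `samples` is any list of numeric observations (e.g. logins per hour,
    bytes egressed per host). Returns the outlying values.
    """
    if len(samples) < 2:
        return []
    mu = mean(samples)
    sigma = stdev(samples)
    if sigma == 0:
        return []  # perfectly uniform history: nothing stands out
    return [x for x in samples if abs(x - mu) > threshold * sigma]
```

Even this toy version illustrates the shift in posture: rather than asking whether a perimeter held, it asks whether behavior changed.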
The high-profile attacks of 2014 completely evaded perimeter-style security controls. Attacks that target people—always more vulnerable than technology—will remain in the headlines well beyond 2015. Despite this, security budgets won’t grow much, because demonstrating ROI on security investments can be a futile exercise (‘How much money did we make with that firewall again?’). Now is the time to reallocate budgets to be better prepared for when the inevitable happens. Improved detection capabilities can help limit how far an attacker penetrates a network. A well-tuned (and practiced) response process guides the organization to recover quickly and in a controlled fashion, minimizing errors and omissions and returning to profitability as soon as possible.
Hybrid architectures become the norm
Even though cloud computing and third-party hosting will continue their rapid expansion, on-premises IT will remain a reality for 2015 and beyond. The resulting hybrid infrastructure stack will create challenges for most organizations—including architectural ‘collisions,’ where design patterns for on-premises development and deployment don’t translate well (or at all) into the cloud. Working through these challenges will require more sophisticated models, policies, identity/access controls, and coding practices to ensure that end-user needs are met consistently across all platforms.
Immediacy is the theme. The dynamic nature of modern business requires that IT develop skills and methods for providing effective solutions quickly. Agile development methods, modeling and simulation tools, and a DevOps philosophy can all help an organization outflank and beat its competition. Software that offers abstraction layers for common tasks such as authentication/authorization, interprocess communication, and service chaining can help reduce dependencies on particular deployment architectures, making it easier to move application stacks across platforms.
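The abstraction-layer point can be sketched as follows: if application code depends only on an interface, the platform-specific implementation behind it can be swapped when a workload moves. All class and method names here are hypothetical, standing in for whatever directory or token service an organization actually runs.

```python
from abc import ABC, abstractmethod

class AuthProvider(ABC):
    """Platform-neutral authentication interface the application codes to."""
    @abstractmethod
    def authenticate(self, user, credential):
        ...

class OnPremDirectoryAuth(AuthProvider):
    """Stand-in for an on-premises directory (e.g. LDAP) lookup."""
    def __init__(self, directory):
        self.directory = directory  # dict of user -> secret, for illustration
    def authenticate(self, user, credential):
        return self.directory.get(user) == credential

class CloudTokenAuth(AuthProvider):
    """Stand-in for a cloud token-validation service."""
    def __init__(self, valid_tokens):
        self.valid_tokens = valid_tokens
    def authenticate(self, user, credential):
        return credential in self.valid_tokens

def login(provider, user, credential):
    # Application logic depends only on AuthProvider, not on where it runs.
    return provider.authenticate(user, credential)
```

Moving the stack from on-premises to cloud then means constructing a different provider, with no change to the calling code.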
Decision-making becomes primarily driven by actionable analytics
As visibility, control, and optimization come to hybrid networks, it will become increasingly important to build an analytics-driven infrastructure that can take action when problems occur anywhere in the network. Starting in 2015 and continuing beyond it, more IT organizations will instrument their network architectures with predictive analytics to create self-correcting, self-generating networks that respond to business needs and intent.
Well-instrumented infrastructures provide the foundation for introducing automation. Such automation helps infrastructures react to changing demands without manual intervention (and reduces the errors that occur whenever humans touch technology). Visibility tools can help discover and map dependencies in application workloads, a necessary element for true workload portability. Furthermore, rich analytics supports the recommended shift in security investment toward detection and response.
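The ‘analytics to action’ loop described above can be sketched as a simple rule evaluator: metrics come in, and when one crosses its threshold, a named remediation is triggered instead of a page to a human. The metric names, thresholds, and action names are illustrative assumptions.

```python
def evaluate(metrics, rules):
    """Return remediation actions triggered by the current metrics.

    `metrics` maps metric name -> current value.
    `rules` maps metric name -> (threshold, action name); an action fires
    when its metric exceeds its threshold.
    """
    actions = []
    for name, (threshold, action) in rules.items():
        if metrics.get(name, 0) > threshold:
            actions.append(action)
    return actions
```

In a real deployment the returned actions would feed an orchestrator (scaling a tier out, rolling a release back) rather than a list, closing the loop from observation to correction.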
Location transforms from a constraint into a feature
The technologies that will emerge in 2015—full stack virtualization, pervasive visibility, and hybrid deployments—create a form of infrastructure mobility that allows organizations to optimize for the location of data, applications, and people. Regulatory policies that govern data location will cease to be an impediment, and rapid access to that data will become possible for anyone, regardless of where they happen to reside. Organizations that adopt these technologies will achieve new kinds of competitive advantage as a result.
The industry is at the threshold of a significant change, one poised to eliminate location as a constraint in most decisions. With the help of modern tools designed for the purpose, IT organizations can liberate themselves from the limits of distance and location. Applications and data can be placed wherever is optimal for the business and can be moved quickly as necessary. Users will have a consistent performance experience. Administrators and developers can retain visibility into application behavior regardless of how much distance separates users from their data.