"Cloud Heroes" Speak at Structure 2016—Insights, Trends, Recommendations & More

It was “Cloudy with a (great) chance of insights” at the Structure 2016 Conference last week in San Francisco, CA. The 2-day conference covered topics relevant to everyone from the established enterprise to the burgeoning startup and all those in between. Structure has long chronicled the emergence and evolution of cloud computing, gathering technology experts to discuss everything from digital transformation to distributed computing to architecting for the edge to choosing the right cloud mix for business success.

Having attended this conference a few years ago, I was encouraged and amazed by the insights from these technology leaders (dare I say “cloud heroes”) and by how businesses are innovating within their respective spaces. To many, the voices here, and not just those of the speakers, are heroes of technology: pushing boundaries, providing visibility into complex architectures, and breaking through the status quo.

It is impossible to encapsulate all of the ideas, arguments, opinions and recommendations presented across the 25+ sessions, but common themes did emerge throughout the conference. Of course, this is my opinion, filtered through the salient points that I wanted (or didn’t want) to hear. Others have reported more journalistically on various sessions, and I have provided links to those where appropriate. Still, I saw several recurring topics:

  • Clouds galore
  • The growth of containers
  • How IoT is still the wild, wild west
  • Government struggling & somewhat succeeding to stay current
  • The importance of machine learning and AI
  • Next generation of networking
  • Microservices as critical differentiation
  • Disrupting (again)

I found that the underlying current of many of the discussions was that disruption continues to happen within business and Information Technology. And while this may be stating the obvious, the disruption is building into a massive groundswell of activity. Where in the past companies were proud to say they HAD a cloud strategy, those same companies are now on second- and third-generation evolutions of their implementations, well beyond the planning stages. And with the breadth and depth of cloud technologies now available, the truly disruptive are weaving their technologies together in quite creative ways.

Clouds galore

Joe Weinman, author of Cloudonomics and Digital Disciplines and the snarky emcee of the event, kicked off Structure 2016 with a series of examples of how well-known companies like Pinterest, Dropbox, Instagram, Netflix, Walmart and others have evolved their cloud strategies over time. While the lesson may be clear (to NOT evolve is to die), what wasn’t immediately obvious until Weinman presented these examples was that every cloud implementation is unique and driven by company-specific factors. A full list of the company cloud-evolution examples can be seen here.

From pure public or private cloud, to hybrid, to on-prem, to multi-cloud, to fog/edge computing, what truly stood out was that in many cases a blended approach was the ideal solution. In fact, Weinman half-jokingly stated that “the hybrid multi-cloud fog is the future,” a phrase he repeated later in the conference. What struck me after hearing other sessions was how accurate that statement actually was: not in the words used, but in the idea of matching multiple cloud technologies to the needs of the business.

Similarly, Seth Bodnar, Chief Digital Officer of GE Transportation, compared the evolution of train infrastructure to that of manufacturing with the difference being that "the conveyor belt is thousands of miles long" and compute capacity is on the edge (e.g., in the locomotive). In these specific use cases, it is a multi-cloud AND multi-networked approach, but carefully mapped to the business needs.

The growth of containers

One extremely common topic in many of the discussions at Structure was “containers.” A few years ago at Structure, they were hardly discussed. Now containers, thanks to the work of Docker, CoreOS and others, are an extremely common technology in business use. Instead of building for a particular environment like a single cloud or server, containers let developers architect their application abstracted away from the hosting environment. This brings portability between physical and cloud environments because not only is the application packaged up, but so are all of its dependencies, such as libraries, configuration files and other binaries. CIO has a good primer on what containers are.
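
To make that abstraction concrete, here is a minimal sketch using the Docker SDK for Python; the ./app directory and image tag are hypothetical placeholders for an application packaged alongside its dependencies with a Dockerfile.

```python
# A minimal sketch, assuming a hypothetical ./app directory that
# contains a Dockerfile packaging the application together with its
# libraries, configuration files and other binaries.
import docker  # pip install docker

client = docker.from_env()

# Build an image: the app plus all of its dependencies, frozen together.
image, _ = client.images.build(path="./app", tag="myapp:1.0")

# The same image now runs identically on a laptop, an on-prem server,
# or any cloud that can run containers; the hosting environment is
# abstracted away.
output = client.containers.run("myapp:1.0", remove=True)
print(output)
```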

Scott Guthrie, EVP of Microsoft Cloud and Enterprise Group, pointed out a trend he sees within Microsoft Azure. While many consider containers to be “bleeding edge,” Guthrie said DevTest has been adopting container implementations rapidly. In fact, to drive adoption by Azure developers, Microsoft recently open sourced its Azure Container Service engine. 

Scott Johnston, COO of Docker, said their growth has been explosive and is often tied to hybrid deployments. Eighteen months ago, companies were merely pushing toward the public cloud, said Johnston; now 75% of enterprise deployments are hybrid. Johnston touts the fact that dev teams can build applications once and abstract them away from the environment, which is perfect for the hybrid cloud: companies can deploy containers based on security, compliance and/or economics. Another interesting note from Johnston was that the average lifespan of a container is less than 5 minutes. This ephemeral nature of containers strikes me as a fair parallel to the advantage of cloud computing in general: it is on-demand by nature.
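
To illustrate that ephemerality, here is a toy sketch, again using the Docker SDK for Python, of a container that exists only for the seconds its single task takes; the alpine image and echo command are placeholders for real work.

```python
# A toy illustration of the short-lived container pattern Johnston
# described: spin up, do one unit of work, and disappear.
import docker  # pip install docker

client = docker.from_env()

# remove=True tells Docker to delete the container as soon as it exits,
# so its entire lifespan is the few seconds the command takes to run.
result = client.containers.run(
    "alpine:latest",
    ["sh", "-c", "echo processed one batch"],
    remove=True,
)
print(result.decode())
```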

The panel of Google, Mesosphere and CoreOS, discussing “the container of containers” (e.g., cloud providers and others that host containers), insightfully added that architectures will vary based on what is in the container, underlining the fact that clouds must remain open to ensure an extensible strategy.

Lastly, PayPal’s SVP & CTO Sri Shivananda said PayPal relies heavily on containers as they are more granular, more secure, can be automated, and in combination with clustering, drive incredible efficiencies for the company.

How IoT is still the Wild, Wild West

Still fresh on people’s minds was the recent DDoS attack that crippled DNS provider Dyn. As many know, that attack was powered by an army of insecure IoT devices using default or hard-coded usernames and passwords. Logically, there was much discussion about the future of IoT, with the first words coming from IBM Fellow Mac Devine, who called out the fact that there is currently no structured operating system for IoT. With IoT devices moving data to the edge, IBM wants to create a cognitive loop that allows machine learning to eliminate blind spots in the data. In the past, data was locked in the data center like gold and you had to be “privileged” to access it, Devine suggested; now it should simply be used the way oil is, he added.

In a particularly thoughtful session with Google’s Scott Jenson and Ping Identity’s Paul Madsen, moderated by Stacey Higginbotham, the panel outlined new approaches to governing IoT infrastructure, including embracing standards. The belief is that many IoT devices are not built to scale: they rely on passwords and setup within specific apps. The panelists agreed that new IoT devices must be meaningful, that they should be functionality-based, and that they should advertise or broadcast their specific functionality. Additionally, networks must be smart enough to converse with and self-organize these connected devices. Lastly, there is the belief that IoT devices should not have passwords, but rather some form of OAuth or token-based authentication, and that there should be mechanisms to collect the “data exhaust” from every single device into a data silo for future analysis and learning.
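
As a thought experiment, here is a hypothetical Python sketch of those two ideas: a device broadcasting its functionality to the network, and minting a token instead of shipping with a password. The port, capability schema, and token scheme are my own illustrative assumptions, not any real IoT standard.

```python
# A hypothetical sketch, not a real IoT protocol: the port, schema,
# and token scheme below are illustrative assumptions.
import json
import secrets
import socket

CAPABILITY = {
    "device": "thermostat-42",
    "functions": ["read_temperature", "set_setpoint"],
    # A rotating token replaces a default or hard-coded password.
    "auth": {"scheme": "bearer-token"},
}

def advertise(port: int = 9999) -> None:
    """Broadcast this device's functionality on the local network."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(json.dumps(CAPABILITY).encode(), ("255.255.255.255", port))
    sock.close()

def issue_token() -> str:
    """Mint a one-time bearer token; no password ever ships on the device."""
    return secrets.token_urlsafe(16)

advertise()
print("session token:", issue_token())
```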

Government struggling & somewhat succeeding to stay current

At Structure 2016 there were two sessions specific to government: a presentation by Bernd Verst, an Innovation Specialist at 18F, and a conversation with Arlette Hart, CISO of the FBI. My two takeaways from these sessions were: 1) government innovation is dramatically hampered by regulation, and 2) equally dramatic are the efforts to speed the implementation and adoption of cloud and other technologies.

For starters, Verst described 18F as a digital consultancy for the government, created to help agencies build out digital services like cloud.gov, FEC.gov and analytics.usa.gov. The 18F team is tackling the problem of timely delivery (or lack thereof) of digital services, which pervades governmental systems. Their solution is to approach digital challenges with swiftness and agility. According to Verst, speed is the new security, and therefore the teams develop in small, fast iterations. They have also adopted a new approach of building platforms with automation and open source, using certified public cloud providers and avoiding IaaS lock-in. By selecting certified clouds like AWS GovCloud, which has already been through the FedRAMP assessment process, they are able to practically templatize their approach for other branches of government. Verst and the 18F team are working to streamline agile development to get through over 4,400 pages of regulations more quickly.

The FBI’s Arlette Hart echoed similar IT challenges within the FBI itself. These challenges are equally complex, as the bureau must not only support mission enablement for agents (i.e., the customer) but also maintain legacy systems, which are frequently slower to update. According to Hart, the FBI also has to fight through regulations and must move through these processes as fast as possible while still accounting for insider threats (e.g., WikiLeaks and Snowden) and ever-present external threats. Unlike commercial organizations, the FBI doesn’t monetize its risks; it must classify data at various levels to gauge the impact of risk.

The importance of machine learning and AI

Artificial Intelligence (AI) and Machine Learning (ML) also proved to be hot topics at Structure. Urs Hölzle of Google outlined three bottlenecks currently facing cloud computing: 1) the lack of cheap, fast storage putting pressure on the interconnect, 2) ML itself becoming a pressure point, and 3) the network itself. Intel’s Naveen Rao discussed the movement toward predictive analytics and the importance of the evolution from ML to Deep Learning. Deep Learning, according to Rao, takes vast data sets and trains neural networks on them, which in turn produces new kinds of results. Rao said that in 2015, ML workloads ran on only 7% of servers, and that within that group, 63% of the work was classic ML and 37% Deep Learning. He postulates that Deep Learning will increase dramatically in the coming years and will allow businesses to uncover new features from data automatically, using iterative algorithms to figure things out.
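
For readers who want the intuition behind “iterative algorithms figuring things out,” here is a toy gradient-descent loop in NumPy; deep learning scales this same predict-measure-adjust cycle to many-layered networks and vastly larger data sets. The data here is synthetic and purely illustrative.

```python
# A toy sketch: an iterative algorithm recovers hidden structure
# (the weights) from data alone. Deep learning applies the same loop
# to multi-layer networks at enormous scale.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))            # 200 samples, 3 features
true_w = np.array([2.0, -1.0, 0.5])      # the pattern hidden in the data
y = X @ true_w + rng.normal(scale=0.1, size=200)

w = np.zeros(3)                          # start knowing nothing
for _ in range(500):                     # iterate: predict, measure, adjust
    grad = X.T @ (X @ w - y) / len(y)    # gradient of mean squared error
    w -= 0.1 * grad

print(w.round(2))                        # approaches [ 2.  -1.   0.5]
```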

On the VC panel with Sunil Dhaliwal (Amplify Partners), Creighton Hicks (Kleiner Perkins Caufield & Byers) and Peter Wagner (Wing Venture Capital), the shared belief was that large volumes of business and organizational decisions will increasingly be made via ML/AI, decisions that would typically overwhelm a human operator. The panel also saw a dramatic shift from traditional application behavior of submitting and retrieving data (the old way) to AI-driven apps that make recommendations or predictions to suggest a behavior.

Lastly, Vinod Khosla, founder of Khosla Ventures, said that AI is only interesting if it can manipulate infrastructure, and controversially mentioned that AI could be highly leveraged to potentially replace 80% of IT folks.

Next generation of networking

When it came to networking, which I believe to be the lifeblood of infrastructure, opinions varied. Jason Forrester, CEO of SnapRoute, believes that while networking is more important than in previous years, networking solutions actually need to provide fewer features. He expanded on this, saying that while there may be fewer protocols in the future, there will be more intelligence built into networks using AI.

John Donovan, Chief Strategy Officer at AT&T, boiled it down to a simple statement about software-defined networking (SDN): AT&T wants to move its entire network to software because SDN services are faster and easier to consume. However, Guido Appenzeller, VMware’s Chief Technology Strategy Officer of Networking and Security, took a different view, particularly of the term “SDN,” stating that it has been “overused to death” and that today networking has become software, so the term is redundant.

Microservices as critical differentiation

Of equal importance to containers, AI/ML, and clouds in this modern, digitally disruptive era are microservices, and there were plenty of discussions about their value to modern business strategy. In fact, I feel the microservices of today can be likened to the introduction of cloud computing in the past. Jeetu Patel, CSO and SVP of Platform at Box, says yet another disruption is underway. Every company is becoming a digital company in some way, says Patel, and companies are no longer competing against their direct competitors but against everyone else in terms of the user experience. What companies own is the curation of the user experience through a combination of microservices. First companies built and delivered apps, then they delivered experiences, and now companies are essentially platforms, says Patel.

Let me translate what Patel is saying into a simple example: if your user experience requires a mapping function, your customers are better served by your accessing Google’s mapping API/microservice, for example, than by your company reinventing mapping and presenting a sub-par experience. As your business curates the user experience, choose best-of-breed third-party microservices to enrich it and focus on your core competencies.
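
A hedged sketch of that “curate, don’t reinvent” pattern might look like the following; the endpoint, parameters, and key are hypothetical stand-ins for whichever mapping microservice you choose (Google’s Geocoding API is one real option, but this is not its exact interface).

```python
# A minimal sketch, assuming a hypothetical mapping microservice.
import requests  # pip install requests

MAPS_ENDPOINT = "https://maps.example.com/geocode"  # hypothetical URL

def geocode(address: str, api_key: str) -> dict:
    """Delegate mapping to a third-party microservice instead of
    building geocoding in-house."""
    resp = requests.get(
        MAPS_ENDPOINT,
        params={"address": address, "key": api_key},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()

# Example usage (requires a real endpoint and key):
# print(geocode("1 Market St, San Francisco, CA", api_key="demo-key"))
```

Your application then spends its engineering effort on the experience around the result, not on the mapping itself.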

Disrupting (again)

If we had been playing Bingo at Structure 2016, it would have been easy to fill a row with the number of times “disruption” was mentioned. Each topic focused on new ways to innovate or disrupt the status quo. Whether it is the government looking to dramatically reduce the delivery time of digital services by employing an agile approach, or an established cloud player like Amazon continuing to build innovative services like Lambda, which lets users run code at any scale without provisioning servers, technological growth and innovation are explosive. I don’t believe this is a single point in time or a "disruption event." This is a disruptive era. Forget about the status quo; it’s no longer relevant.
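
For the curious, a minimal Lambda-style function in Python looks like the sketch below. The handler(event, context) signature is the standard entry point Lambda invokes; the event shape shown is an illustrative assumption.

```python
# A minimal AWS Lambda handler sketch: no servers to provision or
# manage, AWS runs and scales this function on demand.
import json

def handler(event, context):
    """Invoked by Lambda per request; the event payload is hypothetical."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```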

And if we had been playing blackout Bingo, the ever-dynamic Bryan Cantrill, Joyent’s CTO, would have let you fill your board multiple times. Honestly, I wouldn’t do Cantrill’s presentation justice by trying to recap it all in a few sentences. If you haven’t seen Cantrill in action, I recommend watching his presentation from this year’s Structure. His talk wove politics into technology and back again, which was refreshing for all of us with our post-election-results hangover. His slides are available on SlideShare as well.

Summing it all up

In my mind, each of the panelists and presenters at Structure 2016 is a hero in their area of expertise. They have the courage to journey into the unknown, taking risks along the way and learning as they go. But more importantly, they share their knowledge. The pace of innovation and disruption is exponential. To disrupt, compete and win in today’s business marketplace, companies must leverage the technologies, procedures and business rules of others; only by blending these uniquely will they be transformative and successful.

While blended cloud implementations, containers, IoT, ML/AI, and microservices are critical, I propose that of equal if not greater importance is understanding how they interconnect and perform, and having the ability to seamlessly connect these disparate environments and services into a holistic vision. I circle back to the network being the lifeblood of IT. Without it, there is failure. When it performs poorly, customers are dissatisfied. And when it works well, companies can focus on innovation and differentiation.
