As we enter 2019, what stands out is how trends in business and technology are connected by common themes. For example, artificial intelligence (AI) is at the heart of trends in development, data management and delivery of applications and services at the edge, core and cloud. Also essential are containerisation as a critical enabling technology and the increasing intelligence of Internet of Things (IoT) devices at the edge. Navigating the tempests of transformation are developers, whose requirements drive the creation of new paradigms and technologies that they must then master in pursuit of long-term competitive advantage.
AI will get its early start mostly in the clouds
Still at an early stage of development, AI technologies will process massive amounts of data, and most of that processing will happen in public clouds.
A rapidly growing body of AI software and service tools – mostly in the cloud – will make AI development easier and easier. This will enable AI applications to deliver high performance and scalability, both on and off premises, and to support multiple data access protocols and varied new data formats. Accordingly, the infrastructure supporting AI workloads will also have to be fast, resilient, and automated. While AI will certainly become the next battleground for infrastructure vendors, most new development will be aimed at the cloud.
IoT: Don’t phone home. Figure it out.
Edge devices will get smarter and more capable of making processing and application decisions in real time.
Traditional IoT devices have been built around an inherent “phone home” paradigm: collect data, send it for processing, wait for instructions. But even with the advent of 5G networks, real-time decisions can’t wait for data to make the round trip to a cloud or data centre and back – and the rate of data growth keeps accelerating. As a result, data processing will have to happen close to where the data is consumed, which will intensify the demand for more processing capabilities at the edge. IoT devices and applications – with built-in services such as data analysis and data reduction – will get better, faster and smarter about deciding what data requires immediate action, what data gets sent home to the core or to the cloud, and even what data can be discarded.
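To make the act/forward/discard triage concrete, here is a minimal, hypothetical sketch of edge-side decision logic. The `Reading` type, the `triage` function and the thresholds are all illustrative assumptions, not part of any real IoT platform.

```python
from dataclasses import dataclass

ALERT, FORWARD, DISCARD = "alert", "forward", "discard"

@dataclass
class Reading:
    sensor_id: str
    value: float

def triage(reading: Reading, alert_above: float = 90.0,
           noise_below: float = 5.0) -> str:
    """Decide locally what to do with a reading instead of phoning home."""
    if reading.value >= alert_above:
        return ALERT      # requires immediate action at the edge
    if reading.value <= noise_below:
        return DISCARD    # treated as noise: never leaves the device
    return FORWARD        # sent home to the core or cloud for later analysis

readings = [Reading("temp-1", 97.2), Reading("temp-1", 2.1), Reading("temp-1", 41.0)]
decisions = [triage(r) for r in readings]
```

The point of the sketch is that the decision happens on the device, in one pass over the data, with only the “forward” fraction ever consuming network bandwidth.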
The demand for highly simplified IT services will drive continued abstraction of IT resources and the commoditisation of data services.
Remember when car ads began boasting that your first tune-up would be at 100,000 miles? (Well, it eventually became sort of true.) The point is, hardly anyone spends weekends changing their own oil or spark plugs or adjusting timing belts anymore. You turn on the car, it runs. You don’t have to think about it until you get a message saying something needs attention. Pretty simple. The same expectations are developing for IT infrastructure, starting with storage and data management: developers don’t want to think about it, they just want it to work. “Automagically,” please. Especially with containerisation and “serverless” technologies, the trend toward abstraction of individual systems and services will drive IT architects to design for data and data processing and to build hybrid, multi-cloud data fabrics rather than just data centres. With the application of predictive technologies and diagnostics, decision makers will rely more and more on extremely robust yet “invisible” data services that deliver data when and where it’s needed, wherever it lives. These new capabilities will also automate the brokerage of infrastructure services as dynamic commodities, shuttling containers and workloads to and from the most efficient service provider solutions for the job.
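Brokering infrastructure as a dynamic commodity can be pictured as a simple constrained choice: pick the cheapest provider that still meets a workload’s requirements. The sketch below is purely illustrative – the provider names, prices and latency figures are invented, and a real broker would weigh many more dimensions (data locality, compliance, egress costs).

```python
# Hypothetical provider catalogue; all figures are made up for illustration.
providers = [
    {"name": "cloud-a", "price_per_hour": 0.12, "latency_ms": 40},
    {"name": "cloud-b", "price_per_hour": 0.09, "latency_ms": 120},
    {"name": "on-prem", "price_per_hour": 0.15, "latency_ms": 5},
]

def broker(latency_budget_ms: float) -> str:
    """Return the cheapest provider that satisfies the latency budget."""
    eligible = [p for p in providers if p["latency_ms"] <= latency_budget_ms]
    if not eligible:
        raise ValueError("no provider meets the latency budget")
    return min(eligible, key=lambda p: p["price_per_hour"])["name"]

batch_target = broker(200)    # latency-tolerant batch job: cost wins
realtime_target = broker(10)  # latency-sensitive job: only the edge qualifies
```

Run continuously against live pricing and telemetry, this kind of policy is what would let workloads shuttle “automagically” between providers without a human in the loop.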
Building for multi-cloud will be a choice (and you know what choices come with…)
Hybrid, multi-cloud will be the default IT architecture for most larger organisations while others will choose the simplicity and consistency of a single cloud provider.
Containers will make workloads extremely portable. But data itself can be far less portable than compute and application resources, and that affects the portability of runtime environments. Even if you solve for data gravity, data consistency, data protection, data security and all the rest, you can still face platform lock-in: the cloud provider-specific services you write against aren’t portable across clouds at all. As a result, smaller organisations will either develop in-house capabilities as an alternative to cloud service providers, or they’ll choose the simplicity, optimisation and hands-off management that come from buying into a single cloud provider. And you can count on service providers to develop new differentiators to reward those who choose lock-in. Larger organisations, on the other hand, will demand the flexibility, neutrality and cost-effectiveness of being able to move applications between clouds. They’ll leverage containers and data fabrics to break lock-in, ensure total portability, and control their own destiny. Whatever path they choose, organisations of all sizes will need to develop policies and practices to get the most out of their choice.
The container promise: really cool new stuff
Container-based cloud orchestration will enable true hybrid cloud application development.
Containers promise, among other things, freedom from vendor lock-in. While containerisation technologies like Docker will continue to have relevance, the de facto standard for multi-cloud application development will be – at the risk of stating the obvious – Kubernetes. But here’s the cool stuff: new container-based cloud orchestration technologies will enable hybrid cloud application development, meaning new applications will be developed for both public cloud and on-premises use cases: no more porting applications back and forth. This will make it easier and easier to move workloads to where data is being generated, rather than moving the data to the workloads, as has traditionally been the case.