by Kevin Hannah, Director of Product Operations, Kazuhm

The Changing Definition of “Cloud”

The message of achieving IT nirvana by moving to the cloud continued to ring loud throughout 2018. But in the face of practical realities that included overrunning budgets1, security concerns2, performance issues due to network latency3, and an ever-increasing skills gap, the emphasis shifted from public cloud to hybrid cloud, with organizations encouraged to take advantage of both public and private deployments; indeed, 80% reported “repatriating workloads back to on-premise systems”4. The public cloud providers have been forced to embrace this fact, as evidenced by Amazon announcing Outposts to bring their hardware into customer data centers, followed more recently by Google with Anthos, and by the further recognition that “some customers have certain workloads that will likely need to remain on-premises for several years”5.

The number of “cloud” options has continued to increase, and there is no one-size-fits-all; so what we were really talking about at the end of last year was any variant on xyz cloud (public, private, multi, and hybrid).

But wait. The “fog” is rolling in. Or, as Gartner would say, “the Edge will eat the Cloud”6. The coming tsunami of Edge and Internet-of-Things (IoT) deployments behind both of these statements is driving organizations away from a single-threaded focus on “cloud” and requires another rethinking of our definitions. Add the ability to run workloads on the desktop, and the truly disparate constituent parts of this ever-expanding compute continuum mean that xyz cloud just doesn’t cut it anymore.

Adding a version number, e.g. Cloud 2.0, is lackluster. And although IDC’s “3rd platform”7 builds on an evolution from mainframe/greenscreen, through client/server, to cloud/browser, and comes somewhat closer, I see it as muddying the waters by weaving in social business and big data analytics, which are not intrinsically part of a compute continuum.

Is it Cloud, is it Edge, or is it both? I believe we need new terminology, best characterized as a Next Generation Grid of heterogeneous, connected compute resources.


Containers as the “Life Blood” of Digital Transformation need a Heart

Despite the hybrid/multi-cloud push in 2018 and the lauded growth rates in spend and adoption, the reality is somewhat different: “the so-called rush to the cloud is not, at present, much of a stampede, at all”; by 2021 only 15% of IT budgets will be going to the (public) cloud8.

Cloud this year is “still only used for around 10-20% of applications and workloads”, according to 451 Research9, and this doesn’t even differentiate between production and non-production.

The drip became a trickle in 2018, but to reach flood stage workloads will need to move freely across the entire compute continuum: from desktop, to legacy server, to private cloud, to public cloud, to the Edge and the IoT beyond. In other words, Containers. So it is no surprise that Forrester predicts “2019 will be the year that enterprises widely adopt container platforms as they become a key component in digital transformation initiatives”10. A recent survey of IT professionals by Kazuhm supports this, with 75% of respondents predicting they would increase their use of containers in 2019.

However, it is not just a case of organizations simply rolling out containerized application workloads. It matters that the right workloads are deployed onto the right resources for the right reasons (including cost, performance, security/compliance, and even more esoteric vectors such as “data gravity”11 that root the location at which data is processed). In other words, Optimal Workload Placement. We have already explored the breadth of resources; adding a myriad of workload types and business reasons compounds the complexity exponentially.
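To make the idea concrete, here is a minimal sketch of rule-based workload placement: each candidate resource is scored against a workload’s weighted requirements for cost and latency, with a hard constraint for workloads that must stay on-premises (e.g. for data-gravity or regulatory reasons). All names, weights, and numbers are invented for illustration; this is not Kazuhm’s engine or any real product API.

```python
# Toy "optimal workload placement": score each candidate resource
# against a workload's weighted requirements. Everything here is an
# illustrative assumption, not a real orchestration API.

def score(resource, workload):
    """Lower is better: weighted sum of cost and latency.
    Resources that violate a hard constraint are disqualified."""
    if workload.get("requires_on_prem") and not resource["on_prem"]:
        return float("inf")  # e.g. data-gravity / compliance constraint
    return (workload["cost_weight"] * resource["cost_per_hour"]
            + workload["latency_weight"] * resource["latency_ms"])

def place(workload, resources):
    """Pick the lowest-scoring (best) resource for this workload."""
    return min(resources, key=lambda r: score(r, workload))

resources = [
    {"name": "public-cloud", "cost_per_hour": 0.90, "latency_ms": 80, "on_prem": False},
    {"name": "desktop-node", "cost_per_hour": 0.05, "latency_ms": 15, "on_prem": True},
    {"name": "edge-gateway", "cost_per_hour": 0.20, "latency_ms": 5,  "on_prem": True},
]

# A latency-sensitive workload that must remain on-premises.
workload = {"cost_weight": 1.0, "latency_weight": 0.5, "requires_on_prem": True}
print(place(workload, resources)["name"])  # → edge-gateway
```

Even this toy version shows the compounding complexity: every new placement vector (compliance, data gravity, GPU availability) adds another weight or constraint to balance across every resource type.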

The use of AI and the cloud have seen parallel growth, the latter an enabler through collecting, storing, processing, and analyzing the vast volumes of rich data needed to feed AI algorithms. But again, AI at the Edge is set to take center stage, as issues with latency, bandwidth, and persistent connectivity (reliability) compound the problems the cloud already has with privacy, security, and regulatory concerns, and with economics. What were we saying about cloud being inadequate as an overarching term…

That aside, now is the time to apply AI inward. I believe 2019 will be marked as the start of the evolution of AI-enabled Orchestration of container workloads, the pumping heart of digital transformation.
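What might “AI-enabled orchestration” mean in practice? One minimal sketch, under the assumption that the orchestrator learns from observed runtimes rather than static rules, is an epsilon-greedy feedback loop: mostly place the workload on the resource with the best measured average, occasionally explore alternatives. The class and resource names are hypothetical, purely for illustration.

```python
import random

# Toy epsilon-greedy placement loop: learn which resource runs a
# workload fastest from observed runtimes. Illustrative only; not a
# real orchestrator API.

class LearningPlacer:
    def __init__(self, resources, epsilon=0.1):
        self.avg = {r: 0.0 for r in resources}  # running mean runtime
        self.n = {r: 0 for r in resources}       # observation counts
        self.epsilon = epsilon

    def choose(self):
        # Mostly exploit the best-known resource, sometimes explore.
        if random.random() < self.epsilon:
            return random.choice(list(self.avg))
        return min(self.avg, key=self.avg.get)

    def observe(self, resource, runtime):
        # Incremental update of the mean observed runtime.
        self.n[resource] += 1
        self.avg[resource] += (runtime - self.avg[resource]) / self.n[resource]

random.seed(0)
# Hypothetical "true" mean runtimes (seconds) per resource class.
true_runtime = {"cloud": 9.0, "edge": 3.0, "desktop": 6.0}
placer = LearningPlacer(true_runtime)
for _ in range(500):
    r = placer.choose()
    placer.observe(r, random.gauss(true_runtime[r], 0.5))
print(min(placer.avg, key=placer.avg.get))  # → edge
```

Real AI-enabled orchestration would of course fold in far more signal (cost, compliance, data gravity, time of day), but the shape is the same: a closed loop where placement decisions improve from their own measured outcomes.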

The future is AI-enabled Orchestration for Optimal Workload Placement on the Next Generation Grid.


You hear that Mr. Anderson?… that is the sound of inevitability…

My parting thought for this future: “AWS wants to rule the world”12. So did IBM, the biggest American tech company by revenue in 1998; 20 years later, it is not even among the top 30 companies in the Fortune 500. The cycle of technology change continues to turn, but at an ever faster pace. Perhaps Cloud today, gone tomorrow?



1 Source: Cloud trends in 2019: Cost struggle, skills gap to continue

2 Source: What’s Coming for Cloud Security in 2019?

3 Source: Cloud 2.0: What Does It Mean for Your Digital Strategy?

4 Source: Businesses Moving from Public Cloud Due To Security, Says IDC Survey

5 Source: Amazon Web Services Announces AWS Outposts

6 Source: Gartner, The Edge will Eat the Cloud

7 Source: IDC

8 Source: ‘Big four’ set for assault on cloud market

9 Source: Sky’s the limit in global race to adopt cloud

10 Source: Predictions 2019: What to Expect in the Cloud/Container World

11 Source: Defying data gravity: How can organizations escape cloud vendor lock-in?

12 Source: AWS wants to rule the world

