The Cloud: the third time is a charm

Most don’t realize it, but the current trend is the third time that “cloud computing” has achieved preeminence, having failed twice before.

Cloud computing has been with us much longer than most of us are aware, unless you are a historian or have lived through it. I can only claim to have witnessed the latter part, the last ten years. So instead of a detailed history, let’s tell the story of cloud computing in broad strokes and get some insight into why, this time, it is here to stay. We can think of three overlapping waves that led to the current state:

The First Wave of Cloud Computing – the 60’s

I recall that in the early nineties (just when I was starting to use the brand-new Windows NT at my first job, DEC had gone broke, and Unix, not Linux, was THE standard for distributed computing) I came across an odd, old book from 1966 that was being discarded by the library: The Challenge of the Computer Utility by Douglas F. Parkhill.

I hesitated taking it home; it did not fit anything I knew about computing. However, this short book was fascinating. It talked about computing power as something similar to electricity or water, something that could be shared and channeled for use by millions, and about the economics behind it. I kept the book, feeling that the technology behind it was already three decades old, but that the economics part of it still seemed relevant.

Only in the early 2010’s, with the coming of Cloud Computing, did I finally become aware of the importance of that book, as I got more and more into these new things called the Azure and Amazon clouds.

A funny fact is that the book itself gives a short history of computing in its first four chapters (chapter four: “Early Computing Utilities”), and only in chapter five does it move on to the then-latest state of the art in time-sharing systems. Chapter six covers the economic aspects of computing utilities, and is visionary in anticipating all that is going on now with the evolving pricing models of AWS and Azure.

Despite the optimism of Parkhill and others, the concept of a public computing utility was still in its infancy considering the technical complexity of the problem. It continued to evolve as private time-sharing systems, with DEC/UNIX systems accelerating in the 70’s and becoming mainstream in the 80’s.

The Second Wave of Cloud Computing – the 90’s

The second overlapping wave came by capitalizing on technical advancements as a foundation to evolve the economics of utility computing. Perot, with EDS, was one of the early proponents of selling computer time, not hardware, to customers. He took the lead in the 60’s, and by the early 90’s the outsourcing industry had become a complex services segment of IT, with hundreds of companies that would not only sell computer time but also take over your IT services as a whole.

It became a mantra of the day to focus on the core business and outsource all non-core functions to these service bureaus, as they had sometimes been called since the early days. That meant sharing not only computing infrastructure with other businesses, but services as well.

Most of these services were initially mainframe based, such as IBM’s or Unisys’. IBM itself was a big provider of this kind of integrated IT service, and it was not resting on its laurels with just mainframe shared services, having branched out to embrace the PC revolution it had helped start. That services business was Gerstner’s major success and turned the company around.

All this focus on core services might have been a consequence of an initially well-intentioned reengineering idea, but it ended up being pricey, due mostly to vendor lock-in and lack of competition.

The lowering of computing hardware prices brought on by ruthless commoditization from Compaq and Dell was in full swing; circa ’98 I recall working with company after company on migrations from service bureaus to in-house client-server systems. Still, around 1999 it seemed that the “computing utility” world was IBM’s and HP’s to share, and the typical view of the world was of a customer plugged into the Matrix mainframe.

In the meantime, the technology continued to evolve with rapid progress in virtualization in the late 90’s (as shown by the stellar rise of VMware as the de facto standard), in clustering, and in distributed computing (culminating with the Internet).

Most companies opted for in-house farms of inexpensive servers running virtualization software such as VMware. The advent of Linux, replacing all things UNIX, and of the Xen hypervisor set the stage for the displacement of all previous contenders (IBM, HP and Sun) and for the arrival of AWS.

The Third Wave of Cloud Computing – the 10’s

How a company that had started selling books online ten years before became the major proponent of selling computing as a utility (now called everywhere “public Cloud” services) is a fascinating topic in itself. It was not without precedent, however: the infamous Enron had already set its sights on trading bandwidth the way it traded oil, gas, and electricity. Amazon pretty much applied its existing ability to sell anything to sell one more thing: the excess capacity of its data centers.

It took an amazing turnaround effort from Microsoft, hiring Ray Ozzie, who championed the cloud and what came to be Azure, and realigning itself for growth. Today you pretty much talk about three clouds: two major contenders, AWS the giant and Azure coming up fast, with Google Cloud a distant third. IBM, HP, and Dell/EMC are now the dwarfs:

Gartner Magic Quadrant for Cloud Infrastructure as a Service, Worldwide June 2017 

All this effort has paid off for Amazon and Microsoft. Why is this third time a charm? Because by 2008 all the technological components were in place to provide reliable public computing utility services. From Pfister’s book (originally about clusters, but those are the technical foundation of high availability), here are the standard technical reasons why you want it: performance, availability, price/performance ratio, the ability to grow incrementally, and scaling.

According to Pfister, there is also the “scavenging of unused computing cycles” as a reason why you would want computing clusters, and therefore the Cloud. On top of the technical platform, Amazon also had the logistics and economics patterns figured out, and now so do the other providers.

The Cloud also summarizes and commoditizes all the best implementation patterns around, from software design patterns to architectural patterns to its own cloud computing patterns. It is a massive collation and redistribution of knowledge, and on this foundation rests its stability.

Public computing utilities are here to stay. Cloud Computing now finally has a chance to fulfill the potential envisioned at MIT in the early 60’s.
