Jason J. Gullickson

On cloud computing

I’ve been making an analogy lately between IBM’s predicted world need for about 5 computers and the current cloud computing situation.  Turns out I’m not the only one looking at it this way; in fact, this article takes the same position, interestingly, before the rise of the clouds:

http://www.theguardian.com/technology/2008/feb/21/computing.supercomputers

There certainly are a lot of utopian ways to view this model, but I think seeing this from the 1940s IBM perspective sheds important light on the subject.

IBM didn’t see a world market for 5 different computers; IBM saw a world market for 5 IBM computers.  This might sound obvious, or the distinction subtle, but if you’re familiar with how companies used to own IBM mainframes (which is to say, they didn’t; they leased them from IBM) you’ll appreciate why it matters.

Consider the personal computer revolution.  Why was it a revolution?  Before personal computers, the average person was unlikely to interact directly with a computer on a regular basis.  If they did, it was likely as a data-entry professional or possibly an “operator”.  It was very unlikely that they were a programmer; the number of programmers worldwide at the time would have struggled to fill a reasonably-sized conference hall.  The computers that existed were largely leased from IBM or a handful of smaller manufacturers, or were custom, purpose-built machines at universities or private research institutions.  The range of applications amounted mostly to accounting systems and other business information management, like tracking insurance policies.  There was some scientific computing happening, but mostly at universities, where only graduate students were allowed any sort of creative access to the machines.

In 1971 Intel created the first mass-produced microprocessor, which bundled the most interesting parts of a stored-program computer into a single chip.  For some reason they released it at a price point accessible to the typical electronics enthusiast (most of whom were building ham radios at the time), and a few of them began building their own computers.  Given the scarcity of computer resources at the time, there was enormous motivation for anyone who wanted to do something new with a computer to make these projects successful.  It’s hard to imagine what that must have felt like: suddenly having unfettered access to something that was otherwise locked up in universities or enslaved to the most boring and monotonous tasks.

For the most part the established computer industry ignored the whole thing, as it was obvious (to them) that these toy computers would never become capable of competing with the “Big Iron” that was IBM’s bread-and-butter at the time.  However, a few companies were paying attention (some growing right inside the garages and basements of these early microcomputer hackers), and it was these companies that would have the vision to bring the microcomputer to the masses in the early 1980s.

Suddenly, machines that just five years earlier were available only to the richest private companies and the most advanced university students were being placed in the hands of laymen and even children.  The number of programmers exploded, and along with it the breadth and depth of computer applications.  Over time these toy computers gained strength and began to penetrate the professional world as well; a generation of hackers who had grown up with entire computers at their disposal was not going to suddenly go back to sharing a single machine with the entire company.

My friend Alan summed this up nicely during a word processing class in tech school.  He was assigned a workstation that shared a single dot-matrix printer with two other stations, and was told this was because he kept picking the station with the dedicated laser printer.

The instructor explained, “What if you work for a company that only has one dot-matrix printer?  You’ll be glad you know how to work with a shared printer then!”

To which Alan replied: “I’m not going to work for some poor-ass company that can’t afford to buy me a laser printer.”

This might sound quaint (especially in an era when a printer can be purchased for less than the price of the ink it consumes), but the point it illustrates is clear: once someone has experienced the power of dedicated resources, they resist going back, and the experience they have gained gives them a choice about where they spend their time and talent.

But what does all this have to do with cloud computing?  As we consume more of the conveniences of the shared computer, we simultaneously lose the control, autonomy and expressiveness of the personal computer.  The needs of the many outweigh the needs of the few or the one, and the increased security and stability demanded of the cloud compromise the openness and flexibility that allowed new uses and applications to emerge from the personal computer, uses that simply were not possible in the pre-microcomputer world.

A more sinister aspect of this consolidation is that it is happening under the corporate umbrella of a very small but very rich collection of companies, companies whose bread-and-butter is exactly the kind of technology that emerges from the creative use of computers.  Can it be proven that these companies will use this position to squelch competition that might arise from their own clouds?  Perhaps not, but history indicates that any other outcome is unlikely.  Furthermore, malice aside, large monolithic systems such as these are known to develop systemic vulnerabilities due to their lack of internal diversity, and as we become ever more dependent on these services (either directly or by using services built on top of them) we drive the statistical probability of a catastrophic failure toward 100%.
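
To make that last point concrete, here is a minimal sketch of the arithmetic (in Python, with made-up numbers): assuming some small, independent per-year probability of a provider-wide catastrophic failure, the odds of experiencing at least one such failure compound toward certainty the longer everything depends on that single provider.

```python
# Illustrative only: assumes a fixed, independent 1%-per-year chance of a
# catastrophic, provider-wide failure (both numbers are made up).

def p_at_least_one_failure(p_per_year, years):
    """Probability of at least one catastrophic failure within `years` years."""
    return 1.0 - (1.0 - p_per_year) ** years

p = 0.01
for years in (1, 10, 50, 100):
    print(f"{years:>3} years: {p_at_least_one_failure(p, years):.1%}")
# prints:
#   1 years: 1.0%
#  10 years: 9.6%
#  50 years: 39.5%
# 100 years: 63.4%

# By contrast, the chance that N *independent* providers all fail in the
# same year is p**N: with three of them, 0.01**3 is one in a million.
```

The exact numbers don’t matter; the point is that consolidation replaces many small, uncorrelated failures with one large, correlated one.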

Fortunately, there are alternative architectures that can provide the key conveniences of cloud computing while eliminating the costs and risks outlined above.  But the first step in pursuing them is to raise awareness of the true cost of cloud computing in its current form; until then, the viability of other approaches will be artificially deflated compared to the “blinders on” view of cloud computing as it exists today.

- Jason