I wanted to take a few minutes to pen some of my thoughts about the so-called "Cloud" revolution that has been underway for a few years in our IT world. For all practical purposes, the concept of the "Cloud" is as old as the time when all computing was centralized. Remember the days of the good old monochromatic screens and mainframes? While I don't want to reduce the Cloud revolution to that level of simplicity, in essence the Cloud really is a bottling of the past, albeit in a fancy bottle. The fancy bottle being the richness of the user interface brought about by the sophistication of the browser and the thin client, enabled by Java or .Net, coupled with automation in the provisioning of infrastructure.
To understand the impact of the Cloud is to first understand its evolution. While I would not put the Cloud in the same transformative category as Java or Windows, it nevertheless is a very important culmination of many incremental steps.
In the beginning, most of the computing world had to be content with a monolithic type of infrastructure, where you had this big hunk of a machine called a mainframe occupying all of your data center space, and everyone connected via dumb terminals and submitted jobs. Then came the age of Windows and PCs, and with it the advent of the Client-Server architecture, where thick clients provided a rich user interface and ran completely on Windows, while most of the compute and logic was handled by the backend servers. These backend servers could have been anything, but in most cases people were moving away from mainframes to less expensive mid-range UNIX servers that offered more horsepower and flexibility than the one-dimensional, centralized mainframe computing model. One of the key deficiencies of the Client-Server model was the need to redeploy and maintain software on many user PCs (clients) with every change to the application, and the fact that the different software loads some users had on their PCs would sometimes interfere with how the user interface worked. Client-Server was nonetheless an important paradigm, wherein for the first time you could break up your application and distribute it to run across multiple machines.
Then came Java, a revolution in how we write, manage and deploy application code. For the first time you could write your code once and run it anywhere, on any platform, without any changes, and the code would behave exactly the same. In the beginning, though, Java tried to emulate the Client-Server model by pushing out thick client versions called Java Applets: you would force a download of the Applet when you first visited a URL, and that Applet would then run within the JVM on your PC, providing a similar look and feel to the old Windows-based client that was, say, built in VB or PowerBuilder. On the slow networks of that time (the 1990s) these Applets would take forever to download, causing frustration, and this led to the evolution of the thin client we now have. The thin client would not have come about without the changing capabilities of the browser and HTML, together with advances in JavaScript (the client-side scripting language) and new design patterns in Java. All of these pushed almost all logic, compute and even how the user interface looked to the backend, and the rendering of user interfaces was primarily done by JavaScript, Flash, etc., which are lightweight and can provide any amount of richness to the user interface. So today, for all practical purposes, all you really need is a browser to do anything you used to do in the old Client-Server days, and the good part is you no longer need to install anything on your laptop or PC; whatever plug-ins or add-ons you need will be prompted for by the browser, downloaded and enabled without breaking a sweat. So in essence we have come full circle: from centralized computing during the mainframe days, to distributed Client-Server computing during the Windows revolution, and back to centralized computing combined with a rich UI all pushed from the backend during the ongoing Java revolution. And this is where the Cloud story starts.
Once the paradigm of deploying and managing shifted back to the centralized model, where you could blow out changes en masse to your user base by simply deploying new code on your backend application servers without the need to touch any user's PC or laptop, the impetus to create and deploy several such farms of applications started taking shape. With the dot-com bust, and the fact that in most data centers servers are not used more than 15% of the time, awareness of such poor utilization rates grew and the need to harness the wasted compute power became paramount. This gave rise to virtualization and to companies like VMware and Citrix. The idea was simple: create the illusion of multiple machines on the same single machine, each with its own IP address and OS; for all practical purposes, unless you were told, you would have assumed it was a standalone machine and not a virtual machine. This ability to virtualize the physical machine meant you not only increased utilization rates of servers from a meager 15% to 90%+, it also meant that you could shift these virtual machines, and everything contained in them, to a totally different physical machine as the need arose (for example, if the physical server hosting your virtual machine was oversubscribed and became overloaded) without missing a heartbeat. In essence virtualization decoupled the application from the infrastructure; the whole data center became one large blob of compute power, albeit made up of several physical servers and even more virtual servers, all available for any application. Your application could run anywhere, anytime, and use whatever resources it needed on demand. This was the beginning of the "Cloud".
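To make that decoupling concrete, here is a minimal sketch of moving a running virtual machine between physical hosts. It assumes the open-source libvirt toolkit and its Python bindings, plus a hypothetical second host URI and guest name; none of these come from the article, and real setups vary.

```python
import libvirt  # Python bindings for the libvirt virtualization toolkit (assumed)

# Connect to the hypervisor on this host and on a second, hypothetical host.
src = libvirt.open("qemu:///system")
dst = libvirt.open("qemu+ssh://host2.example.com/system")

# Each guest ("domain") looks like a standalone machine with its own OS and IP.
for dom in src.listAllDomains():
    print(dom.name(), "active:", bool(dom.isActive()))

# Live-migrate one guest to the other physical server, e.g. because this
# host is oversubscribed; the application inside keeps running throughout.
guest = src.lookupByName("app-server-01")  # hypothetical VM name
guest.migrate(dst, libvirt.VIR_MIGRATE_LIVE, None, None, 0)
```

The point is not the particular toolkit but the decoupling it illustrates: the guest, its OS and its application move as one unit, and the physical server underneath becomes interchangeable.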
Companies like Amazon very quickly saw the immense opportunity to rent out their servers when they did not have enough internal demand for their huge infrastructure. Since compute infrastructure is always built for a peak that happens only a few times a year (Amazon's peak, for example, is the holiday season, Nov-Jan), it sits relatively idle the rest of the year. So with the advent of virtualization, companies like Amazon were quick to develop their own wrappers, or to put it simply, scripts and logic that would provision their existing infrastructure as small virtual servers on demand for external users. These then evolved such that they could provision all the components a typical application would need, from the virtual machines that hosted your web servers, to application servers, database servers and storage. And they charged you only for what you used, with metering that measured your consumption. The ability to provision a whole platform within minutes, without the intervention of network admins, system admins and DBAs, meant that as a consumer your costs went way down; in addition you did not own a single piece of physical hardware, you could pay as you go, and all of this came with very reliable availability (the providers had already baked in redundancy when they built these systems for their own use) and the ability to scale as needed. This is the Cloud revolution, where any kid can compete with large companies for pennies on the dollar, because he is no longer shut out of the infrastructure the big boys play with; more importantly, he can afford it with his pocket money. Software thus developed and consumed all runs in such data centers filled with thousands of virtual machines, but to you and me, when we click the next link on a web page it still works the same, and we have thousands of apps to choose from, all brought to us because of the low barriers to entry for the developers who make those apps. All thanks to the Cloud.
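As an illustration of how low that barrier has become, here is a minimal sketch of renting a small virtual server on demand. It assumes Amazon's EC2 service and the boto3 Python SDK, with a placeholder machine image id; the article does not name any particular tooling, and other providers work along the same lines.

```python
import boto3  # AWS SDK for Python (an assumption; the article names no tooling)

ec2 = boto3.client("ec2", region_name="us-east-1")

# Ask the provider for one small virtual server; no network admin, sysadmin
# or DBA is involved, and the meter only runs while the instance exists.
resp = ec2.run_instances(
    ImageId="ami-12345678",   # placeholder image id
    InstanceType="t2.micro",  # a small, inexpensive instance size
    MinCount=1,
    MaxCount=1,
)
instance_id = resp["Instances"][0]["InstanceId"]
print("Provisioned", instance_id)

# Release it when done; billing stops and you never owned any hardware.
ec2.terminate_instances(InstanceIds=[instance_id])
```

A few lines like these, plus pay-as-you-go pricing, are what let a kid with pocket money stand up the same class of infrastructure the big boys use. Next time we will see how many companies are missing out on the Cloud revolution.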