Many users of computer technology, and for that matter many technology creators and administrators, complain about the rapid pace of change in information technology. The most recent example of a new technology trend bursting upon the scene is cloud computing. Setting a record for going from "what is it?" to "I've got to have it," cloud computing seems to many people to represent a revolution in how computing will be done in the future. It's important, however, to understand that, despite its sudden arrival, cloud computing is actually the latest manifestation of well-established trends, each of which has brought new benefits and new challenges to those working in IT. Crucial to understand is that cloud computing signifies a movement away from an IT-centric product focus and signals a re-engagement with computing users, made possible by those long-established trends.

Cloud computing is the latest and hottest technology trend going. Many people see it as crucial to the next step in enterprise computing and as an inevitable development in internal data centers. But what is it? It doesn't take long, in examining the buzz around cloud computing, to realize that the definition of cloud computing is, well, cloudy. There are tons of definitions around, and each day brings someone else's definition to the mix. At HyperStratus, instead of adding to the cacophony of definitions, we refer to the one promulgated in the February 2009 report by the UC Berkeley Reliable Adaptive Distributed Systems Laboratory. This organization, also known as the RAD Lab, identified three characteristics of cloud computing:

- The illusion of infinite computing resources available on demand, thereby eliminating the need for cloud computing users to plan far ahead for provisioning.
- The elimination of an upfront commitment by cloud users, thereby allowing companies to start small and increase hardware resources only when their needs increase.
- The ability to pay for the use of computing resources on a short-term basis as needed (for example, processors by the hour and storage by the day) and to release them as needed, thereby rewarding conservation by letting machines and storage go when they are no longer useful.

What do these three characteristics mean in real-world environments?