Who Is Doing What?
Let's begin by reviewing who is doing what in the IT marketplace. Competing marketing terms (similar though they may be) are used to describe utility computing:
IBM introduced on-demand computing.
Hewlett-Packard (HP) uses the term utility data center (UDC). This comes from the company that claims to have invented the concept, or at least to have first written about it, more than 20 years ago.
Sun Microsystems calls it N1, a virtualized version of the network and data center.
Microsoft announced its Dynamic Systems Initiative (DSI), which proposes to unify hardware, software, and service vendors around an open software architecture that enables customers to harness the power of industry-standard hardware and brings simplicity, automation, and flexibility to IT operations.
There are even more. The terminology becomes either more descriptive or more abstract, depending on your perspective, as the different vendors move from vision statements to actual offerings:
Virtual data center. Sun promotes this term the most, although the other vendors each describe the same sweet spot in their own way. It means pooling resources so that they appear to be one big machine.
Autonomic computing. IBM is the main proponent of this technology, although other vendors offer variations on the idea. Think self-healing, self-managing networks and systems.
Adaptive infrastructure. This is HP's version of utility computing.
Grid computing. Dozens to hundreds of individual systems (PCs, workstations, servers) connected via LAN or WAN to solve compute- or data-intensive problems, now evolving from scientific uses to more practical business applications (see the sketch at the end of this section).
Dynamic data center (DDC). In addition to the main DSI announcement, Microsoft announced that it will showcase the concept of a DDC that it developed jointly with HP. The DDC features a combination of HP servers, software, storage, and networking hardware connected according to a prescribed network architecture. Microsoft software dynamically assigns, provisions, and centrally manages the DDC resources.
Web services. An overused term in its own right, but one that is actually better understood as part of the larger utility computing concept. Its initial promise is to automate communication between disparate applications, taking advantage of evolving open standards such as XML (sketched below). But where it is headed goes well beyond software communication protocols, to delivering real services, even going beyond "software as a service" to "business process as a service."
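To make that promise concrete, here is a minimal sketch, in Python and not tied to any vendor's actual product, of two disparate applications exchanging an XML message. The element names and the build_request and handle_request functions are invented for illustration; a real web service would add a transport such as HTTP or SOAP and a published interface description.

```python
# A minimal sketch (not any vendor's actual API): two applications exchanging
# a structured XML message. Element names such as PurchaseOrder, Item, and
# Quantity are invented for illustration.
import xml.etree.ElementTree as ET

def build_request(item_id: str, quantity: int) -> bytes:
    """The 'client' application encodes its request as XML."""
    order = ET.Element("PurchaseOrder")
    ET.SubElement(order, "Item").text = item_id
    ET.SubElement(order, "Quantity").text = str(quantity)
    return ET.tostring(order, encoding="utf-8")

def handle_request(payload: bytes) -> str:
    """A separate 'service' application, possibly on another platform,
    parses the XML and acts on it without sharing any code with the client."""
    order = ET.fromstring(payload)
    item = order.findtext("Item")
    qty = int(order.findtext("Quantity"))
    return f"Accepted order for {qty} x {item}"

if __name__ == "__main__":
    message = build_request("disk-array-42", 3)   # sent over HTTP in practice
    print(message.decode("utf-8"))
    print(handle_request(message))
```

Because both sides agree only on the XML format, either application could be rewritten in a different language or hosted on a different platform without breaking the exchange.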
Utility computing can weave these technologies together so that users can mix and match them to meet their specific needs and requirements.
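The grid computing entry above is perhaps the easiest of these ideas to make concrete. The following toy sketch, again in Python and not any vendor's grid product, splits a compute-intensive job (counting primes) into independent chunks and farms them out to a pool of workers. Here the workers are local processes on a single machine; a real grid would dispatch the same kind of chunks to dozens or hundreds of PCs, workstations, or servers over a LAN or WAN and gather the results.

```python
# A toy, single-machine stand-in for the grid computing idea: a compute-intensive
# job is split into independent tasks and farmed out to a pool of workers.
# On a real grid the workers would be separate systems reached over a LAN or WAN;
# here they are local processes.
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """One independent task: count primes in [start, stop) by trial division."""
    start, stop = bounds
    count = 0
    for n in range(max(start, 2), stop):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # Carve the overall problem into chunks, one per (simulated) grid node.
    chunks = [(i, i + 100_000) for i in range(0, 1_000_000, 100_000)]
    with ProcessPoolExecutor() as pool:
        total = sum(pool.map(count_primes, chunks))
    print(f"Primes below 1,000,000: {total}")
```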