Evolution of the Data Center: Moving Toward Consolidation
The need for consolidation in the data center didn't just occur overnight; we have been building up to it for a long time. In this chapter, we review the evolution of today's data center and explain how we have managed to create the complex information technology (IT) environments that we typically see today. This chapter presents the following topics:
"Consolidation Defined"
"History of the Data Center"
"Complexity in the Data Center"
Consolidation Defined
According to Webster's College Dictionary, consolidation is the act of bringing together separate parts into a single or unified whole. In the data center, consolidation can be thought of as a way to reduce or minimize complexity. If you can reduce the number of devices you have to manage, and if you can reduce the number of ways you manage them, your data center infrastructure will be simpler. With a simpler infrastructure, you should be able to manage your data center more effectively and more consistently, thereby reducing the cost of managing the data center and reducing your total cost of ownership (TCO).
When we first started working on consolidation methodologies in 1997, we focused on server and application consolidation; the goal was to run more than one application in a single instance of the operating system (OS). Since then, the scope has widened to the point that virtually everything in the corporate IT environment is now a candidate for consolidation, including servers, desktops, applications, storage, networks, and processes.
History of the Data Center
Over the last 40 years, the data center has gone through a tremendous evolution. It really wasn't that long ago that commercial computers didn't exist at all. To better understand how we arrived at a point where consolidation has become necessary, it's worth taking a look at the evolution of today's computing environment.
The following sections address, in a historical context, the roles that mainframes, minicomputers, and distributed computing systems have played in the evolution of the data center. However, it is important to note that many of the qualities mentioned still affect the choices IT architects make today. While mainframes remain the first choice of many large corporations for running very large, mission-critical applications, the flexibility and affordability of other options have undoubtedly altered the design and functionality of the data centers of today and the future.
The Role of Mainframes
Mainframes were the first computers to gain wide acceptance in commercial settings. Unlike today, when IBM is the sole remaining mainframe vendor, there were once several mainframe manufacturers. Because IBM has always been dominant in that arena, the major players were collectively known as IBM and the BUNCH (Burroughs, UNIVAC, NCR, Control Data, and Honeywell). These companies dominated the commercial computing market for many years and were the data processing mainstay for virtually all major U.S. corporations.