The Server Layer
Above the storage layer in our simplified diagram is the server layer, or computing infrastructure. This layer consists of computer "servers" (a term that includes mainframes, minicomputers, and personal computer-based systems) that process data to produce useful information.
Simply explained, a server utilizes a central processing unit (CPU) to execute the programmed instructions submitted to it as software. For basic operations, the CPU acts in accordance with a software-based operating system. An operating system (OS) is the program that manages all the other programs in a computer. Popular operating systems range from OS/390 on IBM mainframes, to UNIX variants, Linux, and Microsoft Windows NT or Windows 2000 on midrange and server systems, to Microsoft Windows or Apple OS on desktop systems.
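To make the relationship between an application and its host operating system concrete, the short Python sketch below (an illustration of ours, not drawn from the text) simply asks the OS to identify itself; because the operating system abstracts the underlying hardware, the same script runs unchanged on Windows, Linux, UNIX variants, or the Apple OS.

    import platform

    # Ask the operating system hosting this program to describe itself.
    print("Operating system :", platform.system())    # e.g. "Windows" or "Linux"
    print("OS release       :", platform.release())
    print("Processor type   :", platform.machine())   # e.g. "x86_64"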
The programs managed by the operating system are called applications. Applications range from complex and specialized business transaction processing systems to generic databases, spreadsheets, electronic mail, and word processors. A server is said to "host" the application it executes.
There continues to be a bit of a bias in the technology industry that places the server at the center of the "IT universe." This concept, which affords the server infrastructure precedence over other infrastructure layers, dates back to the earliest days of corporate computing and is reflected in the designation of the department responsible for corporate IT as "data processing" or "information systems." The bias is easily explained by the dominant architecture of early corporate IT shops: a mainframe typically sat at the center of the "glass house" of the data center, where it controlled data storage, hosted all applications, operated all peripherals (such as printers, tape drives, and other devices), and managed access through a network of dedicated user terminals. In this configuration, the server would seem to be the "king."
The arrival of personal computers (PCs) and of local area networks (LANs) changed this mainframe-centric universe, however. PCs enabled computing technology to be distributed beyond the confines of the mainframe data center, out into departmental settings and onto the desktops of individual users. LANs allowed the distributed computers (desktops, minicomputers, and mainframes) to be united into networks and to share resources with one another, if necessary. New applications were designed to capitalize on this distributed computing "platform," leading eventually to the observation by Sun Microsystems' CEO, Scott McNealy, that the network had become the computer.
The bottom line is that the server layer, or computing infrastructure, today constitutes a key component of IT service provisioning, but by no means the only or most significant one. The functions of this layer are numerous, as summarized in Figure 1-3. First and foremost, architects need to evaluate the appropriateness of operating environments and platform hardware to meet specific application requirements.
Figure 1-3  Systems infrastructure systems.
According to Dan Kusnetzky, vice president of Systems Software Research for International Data Corporation (IDC) in Framingham, Massachusetts, approximately 5.7 million operating environment shipments were made to consumers worldwide in 1999. Microsoft Windows NT commanded approximately 38 percent of all shipments, while the open source UNIX operating system "look-alike," Linux, accounted for 25 percent. The balance of operating system environments comprised a mix of Novell Corporation's Netware, various versions of UNIX, and a small percentage of other OS products. A summary of IDC's findings is provided in Table 1-1.
Table 1-1  1999 Operating Environment Shipments Worldwide*

    Microsoft NT Server     38%
    Linux                   25%
    Novell Netware          18%
    UNIX                    15%
    Other                    4%

    Total Shipments:        5.7 million copies
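For readers who want the percentages translated back into rough unit counts, the hedged Python sketch below simply multiplies each share in Table 1-1 by the 5.7 million total; because the published shares are rounded, the results are estimates only.

    # Rough unit counts derived from the rounded IDC shares in Table 1-1.
    TOTAL_SHIPMENTS = 5700000  # "5.7 million copies"

    shares = {
        "Microsoft NT Server": 0.38,
        "Linux": 0.25,
        "Novell Netware": 0.18,
        "UNIX": 0.15,
        "Other": 0.04,
    }

    for name, share in shares.items():
        print(f"{name:<20} {share:>4.0%}  ~{round(TOTAL_SHIPMENTS * share):,} copies")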
Kusnetzky corrects a popular misconception that all operating environments are general purpose in nature. Based on IDC research, he observes that organizations "tend to use different operating environments for very different purposes."
According to Kusnetzky, the four uses most often cited by companies deploying Microsoft NT and Novell Netware operating environments "are file and print services, electronic messaging, communications services, and database support, in that order." By contrast, companies fielding UNIX servers rank "database support as their number one use for the operating system," followed by electronic messaging and custom application development. "While functionally similar," he points out, "different operating environments fill very different application niches."
For example, Kusnetzky observes that Linux, although certainly capable of supporting a broad range of applications, "is primarily used to support Web servers. In contrast to UNIX, less than 10 percent of companies use Linux to host databases."
Sorting through the options for server architecture(s) and OS environment(s), then fielding a well-managed server infrastructure that can scale as company needs dictate, remains a key determinant of an effective systems layer in corporate IT services. Also, like the storage infrastructure, the server infrastructure needs to be designed with acceptable levels of redundancy, failover capabilities, physical security, and other disaster avoidance technologies to protect the operation of servers and to provide availability at desired levels. A sturdy server management capability is also required to monitor thresholds of server performance so that problems can be identified early and resolved before they cause avoidable downtime.
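As a sketch of what monitoring thresholds of server performance might look like in practice (our illustration, assuming the third-party psutil package and placeholder threshold values rather than any tool named in the text), the following Python fragment samples a few basic metrics and flags any that cross a limit.

    # Minimal server-health check: sample CPU, memory, and disk usage and
    # flag anything that crosses an illustrative threshold. Requires the
    # third-party psutil package; the thresholds are placeholders that a
    # real monitoring policy would tune per server role.
    import psutil

    THRESHOLDS = {
        "cpu_percent": 85.0,      # sustained CPU utilization, %
        "memory_percent": 90.0,   # physical memory in use, %
        "disk_percent": 80.0,     # filesystem capacity used, %
    }

    def sample_metrics():
        return {
            "cpu_percent": psutil.cpu_percent(interval=1),
            "memory_percent": psutil.virtual_memory().percent,
            "disk_percent": psutil.disk_usage("/").percent,
        }

    def check(metrics):
        # Return only the metrics that meet or exceed their thresholds.
        return {name: value for name, value in metrics.items()
                if value >= THRESHOLDS[name]}

    if __name__ == "__main__":
        for name, value in check(sample_metrics()).items():
            print(f"WARNING: {name} at {value:.1f} exceeds {THRESHOLDS[name]:.1f}")

In a production setting this kind of check would feed an alerting or trending system rather than printing warnings, but the principle is the same: measure against agreed thresholds so problems surface before they cause avoidable downtime.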