- The Computer of the Future Meets Reality
- Information Technology Is Your Business
- Public Sector Recognizes Critical Value of IT
- IT Can Disable an Organization
- Rapidly Shifting Business and Technological Requirements
- E-Business Meets Aging Hierarchies and Infrastructures
- No Easy Answers to Difficult Legacy Challenge
- Business Agility and Legacy Systems
- Redesigning Business Processes—Enabling the Agile Enterprise
- The Evolution of Legacy Computing Architectures
- The Business Case for Legacy Architecture Transformation
- Crafting a Strategy to Address the Legacy Architecture Challenge
- Taking on the Legacy Challenge
1.10 The Evolution of Legacy Computing Architectures
Legacy architectures comprise hardware, software, and data. Old or obsolete hardware is readily replaced, as long as the software can be migrated to another machine and reused. Software portability has therefore been a major consideration since the 1960s, when IBM introduced the IBM 360 computer. Prior to that time, most computers were not multifunctional, and software that ran on one computer would typically not run on a different one. The IBM 360 changed all of that.
The IBM 360 allowed an enterprise to run multiple applications on the same machine. This made the 360 immediately popular. What sustained its popularity, along with IBM's rise to prominence in the computer industry, was the concept of upward compatibility.
Upward compatibility, pioneered by the IBM 360 along with its successor, the IBM 370, allowed a company to develop software that could be readily ported to the next generation of hardware. As a key component of this strategy, certain application languages were designed to be portable across hardware environments of different manufacturers. An application system is used to solve business or user-related problems. This is in contrast to operating system software, which is used to make a computer function at a basic level. The most prominent application programming language to emerge from this era was COBOL.
COBOL, the Common Business Oriented Language, was created in 1959 by the CODASYL committee, building on earlier compiler work led by Admiral Grace Hopper. COBOL is a third-generation language, meaning it is more understandable to humans than earlier machine and assembler languages. More important, COBOL compilers were built for most hardware platforms, so developers could run the same software across any number of hardware environments. Today, there are an estimated 200 billion lines of COBOL software, accounting for roughly 60 percent of the total software deployed worldwide [10].
Upward compatibility is a crucial concept in understanding how legacy systems evolved. It allowed most application systems built during the 1960s or thereafter to be moved from an older hardware platform to the next generation of hardware platform without rewriting the source code. The source code is what the programmer maintains. A compiler turns source code into software that the computer can execute. This concept has been the basis for computing languages for the past 40 years.
Many of these application systems are still running in production at countless organizations around the world. Applications have a very long life. In fact, many COBOL applications are older than some of the people working on them. While hardware can be replaced, legacy software and the data processed by that software live on for many decades.
Legacy systems are not defined by age, language, platform, or data structure type. If an application system is functioning in a production environment within an enterprise, it can be considered a legacy system. Today's new system is tomorrow's legacy system. C and Visual Basic applications were once new, but now they are legacy systems. This is not due to their age, but rather to the fact that they are running in a production environment.
Legacy applications include handcrafted systems built in-house, third-party (leased) software packages, systems generated by other software, and systems managed by third parties. These third parties include outsourcing firms and application service providers (ASPs). Third-party packages represent a minority of legacy applications, but even these packages were mostly handcrafted and therefore pose a major challenge both to the vendors responsible for maintaining and enhancing them and to the companies using them.
Legacy systems run on mainframe, mid-range, or distributed computers and may be written in any one of hundreds of computing languages. COBOL, for example, runs on mainframe, mid-range, and networked computers and has been updated to accommodate the latest distributed user interfaces. Other major languages include Assembler, FORTRAN, PL/I, C, C++, LISP, Visual Basic, and Java, along with markup languages such as XML and well over 400 others.
The term architecture refers to how computer systems were designed and implemented. It also can describe how a system can be invoked or used and how it shares data, interacts with the user, and communicates with other systems. Early software architects lacked the understanding we have today when they originally designed and constructed legacy systems and data structures. This is because, at just over 50 years old, IT is a young industry compared to architecture or engineering, disciplines that are thousands of years old.
Application architecture describes the design and construction methodology, implied or explicit, behind an application. Legacy application architectures typically have the following attributes.
- Humans cannot understand how the system functions.
- The system is hard to modify with confidence that a given change is correct.
- Business logic is hard to distinguish from logic that controls data access, user interface, and environmental management functions.
- Business logic is redundantly and inconsistently defined within and across systems.
- The system lacks functional or technical documentation, or both.
- It is difficult to integrate the system with other systems not built under the same architecture.
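The entanglement described in these attributes can be sketched in a few lines of Python (all names here are hypothetical). The first function mixes data access, a business rule, and user-interface output in one routine, the style typical of legacy code; the second isolates the business rule so it can be modified with confidence.

```python
# Hypothetical illustration of the entangled style common in legacy code:
# one routine mixes data access, business rules, and presentation.
def process_order_entangled(order_id, db, screen):
    row = db.read(order_id)                 # data access
    total = row["qty"] * row["price"]       # business rule
    if total > 1000:
        total *= 0.95                       # discount rule, buried inline
    screen.write(f"Total: {total:.2f}")     # user interface
    return total

# The same business rule isolated from data access and presentation;
# it can now be changed, tested, and reused on its own.
def order_total(qty, price, discount_threshold=1000, discount=0.95):
    total = qty * price
    return total * discount if total > discount_threshold else total
```

Separating the rule also removes one source of redundant, inconsistent business logic: other programs can call `order_total` instead of re-implementing the calculation inline.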
Legacy application architectures are not only a major challenge to organizations; they are proliferating at a surprising rate. Each year, more than 5 billion new lines of COBOL code alone are added to legacy portfolios [10]. At the same time, legacy applications are increasingly the focal point for supporting internal and external initiatives. These initiatives almost universally require sharing and integrating functionality and data across barriers built into these early systems decades ago. Legacy applications rarely fulfill these requirements.
Because legacy applications specify access points to legacy data along with the business logic for processing that data, they cannot be eliminated or easily replaced. Most legacy systems do not integrate well with other legacy systems or with new systems because each system tends to access and process data in its own unique way. This is one main reason that enterprise application integration (EAI) emerged as a means of "triggering" legacy system transactions from Web-based front ends to access and process legacy data.
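The "triggering" pattern can be illustrated with a minimal Python sketch; the function names and the fixed-format record layout are invented for illustration, and a real EAI product would route the request through a message broker or transaction gateway rather than a local call. An adapter invokes a stand-in legacy transaction and translates its output into a structure a Web front end can consume.

```python
# Stands in for a legacy transaction that returns a fixed-format record
# (8-char account, 7-digit balance in cents, 2-char sign indicator).
def legacy_inquiry(acct: str) -> str:
    return f"{acct:<8}0001250CR"

def get_balance(acct: str) -> dict:
    """Trigger the legacy transaction and translate its fixed-format
    output into a structure a Web front end can consume."""
    record = legacy_inquiry(acct)
    return {
        "account": record[:8].strip(),
        "balance_cents": int(record[8:15]),
        "sign": record[15:17],  # "CR" or "DB" in this sketch
    }
```

The adapter leaves the legacy transaction untouched, which is exactly why EAI is attractive as a quick fix and also why it leaves the underlying architecture problem in place.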
The data architecture within an enterprise is typically a derivative of the application architecture. In a perfect world, the data architecture would reflect business requirements and facilitate easy access from any application across the value chain, which includes internal business units as well as distribution and supply chains. The term data architecture refers to how enterprise data was designed and implemented. Legacy data tends to have many of the following characteristics.
- Data is defined and stored redundantly across multiple stovepipe business units and applications.
- The same or similar data is defined inconsistently across multiple systems.
- The same data terminology may be used to define different data across multiple applications and business units.
- Data integrity may be poor, with fields containing information they should not.
- Data may not be easily accessible by modern systems or through user-based inquiries.
- Data cannot be readily shared across systems, business units, and organizational boundaries.
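These characteristics become concrete when two stovepipe systems describe the same customer. The Python sketch below, with hypothetical field names and records, shows the translation work that every integration project must repeat when data is defined inconsistently across systems.

```python
# Hypothetical records for the same customer from two stovepipe systems.
# The same data is defined inconsistently: different field names,
# formats, and conventions.
billing_rec = {"CUST-NO": "00042", "CUST-NAME": "SMITH, JOHN", "BAL": "1250"}
shipping_rec = {"customer_id": 42, "name": "John Smith", "region": "NE"}

def unify(billing, shipping):
    """Map both layouts onto one shared view -- the kind of translation
    a consistent enterprise data architecture would make unnecessary."""
    assert int(billing["CUST-NO"]) == shipping["customer_id"]
    return {
        "customer_id": shipping["customer_id"],
        "name": shipping["name"],
        "balance_cents": int(billing["BAL"]),
        "region": shipping["region"],
    }
```

Every application that needs a cross-functional view must carry some version of this mapping, which is why redundant and inconsistent data definitions multiply integration cost across the enterprise.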
Businesses need access to legacy data and system functionality in ways that legacy architectures cannot support. EAI tools help to a degree, but EAI is a stopgap measure and not a long-term solution. Legacy systems based on stovepipe infrastructures simply cannot support real-time, cross-functional business requirements in zero-latency fashion. A Web-based order processing system should, for example, return a customer order confirmation immediately. Back-end procurement systems, however, may take days to confirm a customer order, if that functionality exists at all.
Even if back-end systems could respond immediately, legacy data architectures cannot support an integrated, consistent, and readily accessible view of enterprise data on demand. Legacy applications are needed to manage essential business data defined in legacy data structures. They accomplish this by applying business logic that most humans have long since forgotten even existed.
Legacy system and data architectures are valuable business assets and should be recognized as such when IT planning takes place. If the role of legacy systems in projects is assessed openly and candidly, executives will discover that they have a better set of options at their disposal for deploying priority business and information initiatives. As this occurs, the enterprise will need to identify more effective solutions to the legacy architecture roadblock than it has found to date.