Software Development Has Always Been Difficult
- Software's Difficult Past
- What Makes Software Development So Difficult?
- Harris Kern's Enterprise Computing Institute
In a previous article, we defined "10 commandments" for successful software development. But why is successful software development so difficult? The answer lies in the unique combination of people, processes, and technology that must come together for a software development project to succeed. If you understand the dynamics of this combination, you'll start to understand why there has never been, and never will be, any "silver bullet" in software development. By learning from the past, we can try to avoid making the same mistakes in the future.
Software's Difficult Past
Let's take a brief look at the history of modern software and identify some of the difficulties surrounding successful software development.
In the 1970s, development backlogs for corporate IT departments averaged 18 months or longer. Since IT was usually the only department with the resources to develop software, it held a monopoly and often wasn't concerned about service levels or prices. In the 1980s, developers struggled with PCs, DOS, and 64KB memory limitations. In the 1990s, just as many software developers thought they were starting to understand client/server software, widespread use of the World Wide Web set expectations for point-and-click access to any piece of corporate data. Software and network infrastructures struggled to catch up with web technology that, practically overnight, made even the newest client/server software architectures obsolete. One thing, however, has remained constant over time: There are no silver bullets in software development.
Successful software development starts with good requirements and good software architecture, long before the first line of code is ever written:
Processes. Because software is easy to modify compared with hardware, it's all too easy for users to keep changing the requirements. Yet changing even a single line of code can wreak havoc on a program, especially a poorly designed one. You also need to start planning for testing, production rollout, and maintenance of your software early in the project lifecycle, or you'll never catch up.
People. You need more than just a few good software developers for a successful project. You also need system administrators and other support staff, such as database administrators, in your development organization. As you schedule and budget a project, you must make programmer skills the largest weighting factor, more so than the language, development tool, operating system, and hardware choices you also have to make.
Technology. If COBOL programmers in the 1970s had planned for users accessing their programs through a web-based front end in the year 2001, imagine where we would be today!
In the 1970s, IT departments running large mainframes controlled most corporate software development projects. The mainframe was the infrastructure for the enterprise-computing environment, and COBOL was the language of choice. Any department with an adequate budget, and willing to wait through the average IT department's 18-month programming backlog, could have the application it wanted developed or modified. Software was difficult to develop, if for no other reason than that development was so tightly controlled by a small group of people with the necessary skills and access to expensive computers. In reality, much of the perceived unresponsiveness of centralized IT organizations was not due to any lack of software development skills or organizational structure; it was simply a result of the software architectures imposed by COBOL and mainframes.
Mainframe-based enterprise software applications, such as payroll processing, were typically monolithic programs in which even simple changes were difficult to implement. The complicated structure of such programs usually limited the number of people who could modify them to their original developers. It was cost-prohibitive to have a new developer learn enough about a large mainframe program to modify it. (This became painfully obvious when many organizations tried to update 1970s code to make it Year 2000 compliant.) Instead, development managers would simply wait for the original developers to finish their current tasks, and then assign them to go back and modify their earlier work. COBOL technology was well understood by the developers who programmed in it. Even in the rather simplified model of centralized mainframe development organizations, however, people and process issues already carried as much weight as technology issues in their impact on the success of software development.
In the 1980s, inexpensive PCs and the popularity of simpler programming languages such as BASIC led to the start of IT decentralization. Even small departments with no formal IT staff could purchase a PC, figure out the details of DOS configuration files, and get a department member with a technical background to learn BASIC. There was no longer an absolute requirement to wait for a centralized IT organization to develop your software program. Suddenly, large companies had dozens or perhaps even hundreds of "unofficial" IT departments springing up, with no backlog to work through, which could immediately start developing stand-alone applications. The only required infrastructure was a PC and plenty of floppy disks for backing up programs. Software seemed easy for a moment, at least until a program grew larger than 64KB or needed more than a single floppy drive's worth of storage. Even the year 2000 was only a far-off concern that crossed a few developers' minds. Most PC applications couldn't access mainframe data, but most developers were too busy installing the latest operating system upgrade to care. Software development was still difficult; we were just too busy learning about PCs to notice.
One effect of the 1980s PC boom on software development was the creation of "islands of automation." While the software program on a stand-alone PC might have been very useful to its user, such programs often led to duplicated work and lower productivity for the organization as a whole. One of the biggest productivity losses was duplicate data entry, which occurred when a stand-alone system could not communicate with a centralized system and both systems required the same data. Many organizations still suffer from "multiple data entry" today, and it continues to challenge software developers, who must reconcile input errors when collecting and merging data. This process, referred to as data cleansing, is particularly relevant in one of the hottest new fields of software, data warehousing; it is a well-known problem to anyone trying to build a large data warehouse from multiple sources. Electronically connecting islands of automation, rather than solving the problem, simply increases the volume of data that must be combined from the various systems. As with many development-related problems, the answer lies not in simply interconnecting diverse systems, but in doing so with a common software architecture that prevents such problems in the first place.
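To make the data-cleansing idea concrete, here is a minimal sketch in Python of merging customer records from two hypothetical system extracts. The file names, column names, and normalization rules are illustrative assumptions, not part of the original article; a real cleansing pipeline would add matching heuristics, audit trails, and manual review of conflicts.

```python
import csv

def normalize(record):
    """Normalize the fields that tend to diverge between duplicate entries.
    Column names are assumptions for this sketch."""
    return {
        "customer_id": record["customer_id"].strip(),
        "name": " ".join(record["name"].split()).title(),
        "email": record["email"].strip().lower(),
    }

def merge_sources(paths):
    """Combine records from several CSV extracts, keeping one row per customer_id
    and flagging rows that disagree after normalization."""
    merged = {}
    conflicts = []
    for path in paths:
        with open(path, newline="") as f:
            for raw in csv.DictReader(f):
                rec = normalize(raw)
                key = rec["customer_id"]
                if key in merged and merged[key] != rec:
                    conflicts.append((merged[key], rec))  # set aside for manual review
                else:
                    merged[key] = rec
    return list(merged.values()), conflicts

# Example usage (file names are hypothetical):
# records, conflicts = merge_sources(["mainframe_export.csv", "pc_app_export.csv"])
```

Even this toy version shows why simply wiring systems together isn't enough: the cleansing rules live in one place only because the merge does, which is the "common software architecture" point made above.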
In the 1990s, corporations started to worry about centralized software development again. Microsoft Windows replaced DOS as the dominant operating system and brought a graphical user interface to stand-alone applications, along with a new level of programming complexity. Business managers realized that stand-alone PC applications might meet the needs of one department, but did little to solve enterprise-wide business and information-flow problems. At the same time, UNIX finally matured to the point that it brought mainframe-level reliability to client/server systems. This helped connect some of those PC islands of automation, but at a cost: MIS directors often found themselves supporting three separate development staffs for mainframes, UNIX, and PCs.
In the second half of the 1990s, our kids suddenly started teaching us about the World Wide Web and the Internet. Almost overnight, network infrastructure went from connecting to the laser printer down the hall to downloading multi-megabyte files from a web server halfway across the world. With a few clicks, anyone who could figure out how to use a mouse could get stock quotes and Java-enabled stock graphs in a web browser. A few more clicks to register on the site, and you could be completing e-commerce transactions to buy or sell that same stock. With the explosion of the Internet and its inherent ease of use, the same expectations were instantly set for accessing corporate data, approximately 80% of which is still stored on mainframes!