What Was That For?
Now, why did I feel it necessary to provide a terse history of the hardware trail to the Web in an article about software platforms? For two reasons: mess and evolution.
First, the free-love PC has been to the stodgy mainframe much as democracy is to the communist state. No one will ever claim that democracy is an orderly, efficient mode of governance, nor will anyone claim that checking the version number of each system DLL is any fun. However, no one I care to associate with would call for a return to either aristocracy or dumb terminals. The problem comes in the gray realm between the two possibilities. Instead of performing financial transactions on mainframes, consuming clock cycles on workstations through atomic calculations, and using personal computers as expensive typewriters, Web developers must be able to target a range of products with varying standards and functionality.
Second, as hardware technology has improved and operating systems have become more sophisticated, evolution has moved toward a balance between functionally rigid mainframes and self-contained, application-based desktops. If you need to store the international bank records of millions of customers, a mainframe is the only choice. If you need to check your email, the possibilities now range from phone to pager to Internet appliance, and so on. Just as the aforementioned Unix administrators scripted multiple operations into a single unit of execution, PC users now retrieve music and information from a number of Web sites. Why should they not also be able to access the functionality of those Web sites without having to purchase or install new software for every new functional whim?
Now, this is not to be confused with standards-based development of traditional behaviors and design patterns with a middleware platform in mind. An acronym soup representing different faces of competing and complementary technologies now pervades e-space, providing a semblance of order and a couple of targets at which developers can shoot. HTML is one such standard; XML is another. But standards come, standards go, and standards evolve. What survives are the practices (and companies) that provide the clearest benefit both to those providing a service and to those consuming one.
The idea of Web services was conceived to make it possible to transfer a wider array of goods and information between providers and consumers without needing to transfer unnecessary overhead in the form of software.
Any idea that lubricates such transactions is a survivable one. Take money, for example. Bartering worked really well until you needed a bucket of snow (don't ask) and all you had to trade for it was suntan lotion. Such is the case with Web services. If the beach is only a vacation for you, why should you have to purchase lotion, an umbrella, a towel, a jeep, and a boat just to visit? Although the lotion may sound perfectly reasonable, what if it were available only in 50-gallon drums? This notion is of increasing importance as the future begins to take shape. More people than ever want to participate in services or to have access to wide bases of knowledge. And what happens when a market (in this case, software and computing in general) becomes a volume business? A lower percentage of your consumers will be willing to pay traditional prices when they don't engage in traditional usage.
The future of computing (and, more or less, the premise of this article) is a seemingly natural extension of existing practices and technologies that has eluded many a platform designer to date: to take the software objects (relatively independent bits of functionality) that are currently bought and sold in a wide and varied market, and convert them into readily accessible services.
In more practical terms, how many pieces of software truly must exist to compute mortgage interest? (Actually, many, but let's not dwell on that.) Wouldn't it be better to have a single piece of software capable of computing mortgage interest in all its variations, taking different variables into account as needed? This is the notion of services. Add the capability to perform the complex operation many times per second and either store or report the result, and you have what passes for a Web application.
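To make the notion concrete, here is a minimal sketch of what that single piece of software might look like. The language (Python), the function names, and the parameters are illustrative choices of mine rather than anything prescribed here; the math is simply the standard fixed-rate amortization formula.

```python
def monthly_payment(principal, annual_rate, years, payments_per_year=12):
    """Periodic payment on a fixed-rate loan: one routine, any set of inputs."""
    r = annual_rate / payments_per_year      # interest rate per period
    n = years * payments_per_year            # total number of payments
    if r == 0:
        return principal / n                 # degenerate zero-interest case
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

def total_interest(principal, annual_rate, years, payments_per_year=12):
    """Interest paid over the life of the loan, derived from the same core."""
    n = years * payments_per_year
    payment = monthly_payment(principal, annual_rate, years, payments_per_year)
    return payment * n - principal

# A $200,000 loan at 6% over 30 years:
print(round(monthly_payment(200_000, 0.06, 30), 2))   # roughly 1199.10 per month
print(round(total_interest(200_000, 0.06, 30), 2))    # roughly 231676 in interest
```

One routine, called with different arguments, stands in for a shelf of single-purpose programs; the service merely decides which of those variables to expose to its consumers.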
At the heart of this utopia lie the basic design principles of encapsulation and componentization. Although these are good practices in general application development, they are essential in the realm of Web services, where speed is of the essence and you want to trouble users with only the information they truly desire.
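As a rough illustration of those principles, here is how the mortgage routine above might be wrapped as a bare-bones service using only Python's standard library. The port, URL, and parameter names are inventions of mine, not any particular platform's API; the point is only that the consumer supplies the few values it cares about and receives a single number back, with nothing to purchase or install beyond an HTTP client.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse

def monthly_payment(principal, annual_rate, years, payments_per_year=12):
    """Same amortization formula as before, now encapsulated behind the service."""
    r = annual_rate / payments_per_year
    n = years * payments_per_year
    if r == 0:
        return principal / n
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

class MortgageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Accept only the inputs the consumer chose to send; default the rest.
        query = parse_qs(urlparse(self.path).query)
        principal = float(query.get("principal", ["0"])[0])
        rate = float(query.get("rate", ["0"])[0])
        years = int(query.get("years", ["30"])[0])

        body = json.dumps(
            {"monthly_payment": round(monthly_payment(principal, rate, years), 2)}
        ).encode()

        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Example request: http://localhost:8000/?principal=200000&rate=0.06&years=30
    HTTPServer(("localhost", 8000), MortgageHandler).serve_forever()
```

The calculation itself stays encapsulated on the server; the component boundary the user sees is nothing more than a URL and a handful of named parameters.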