The Business of Technology, Part 2: New Technology, Old Ideas
- Business Enters the Digital Age
- The Tried and True
- Harris Kern's Enterprise Computing Institute
...the final causes of all social changes and political revolutions are to be sought not in men's brains, not in men's insights into eternal truth and justice, but in changes in the methods of production and exchange.
Friedrich Engels, Socialism: Utopian and Scientific (1972, International Publishers Co.)
This passage exemplifies an innate understanding of the importance of technology to everyday life. Friedrich Engels was an economist, not a technologist, yet his definition of an economy is clearly grounded in technology: "methods of production" combined with commerce. That observation alone was not the historical revelation that made Engels famous, for Adam Smith had written much the same thing 70 years earlier. It was Engels' insight into the relationship between these two things: he realized that technology was not the result, but the determining factor, of how people lived their lives.
Prior to the assembly line, all goods, agrarian or manufactured, depended on a distinct and specialized labor pool. This was good for labor, but not for the consumer, because these pools could produce only so many goods. Relatively speaking, anything produced prior to the assembly line was quite expensive. But with the advent of the assembly line, manufactured goods could be made cheaply and in vast quantities. At the beginning of the 19th century, the vast majority of American households made their own clothing. By the end of the century, making one's own clothing, at least for everyday wear, was relatively rare. Today it's almost unheard of. The New England textile makers made fortunes supplying the United States and eventually the world with high quality but inexpensive shirts, pants, and dresses.
After the success of the textile industry, almost every form of manufacture turned to the assembly line. Soon almost all consumer goods, from bathtubs to automobiles, were rolling off assembly lines. In conjunction with the explosion of textile manufacturing, artificial dyes were being developed to supplement natural dyes, which were expensive and limited in quantity. This spurred intense study of and research into pure chemistry. Eventually, petroleum products, first used to replace whale oil, were subjected to chemical processes, and the nascent organic chemical industry was born. Electricity was harnessed and exploited to meet the new demands of a well-dressed populace with more money in its pockets than ever before. The Industrial Revolution began in the 18th century, and by the end of the 19th century it was in full swing.
This unprecedented consumer demand drove a number of disparate industries to innovate. These innovations combined to form follow-on innovations that led to whole new industries. In turn, these innovations in manufacturing, chemicals, electricity, and magnetism led to a vast array of inventions in a phenomenally short period of time. From 1859 to 1900, the air brake, the telephone, the light bulb, the electric generator, the Bessemer process, acetylene, the typewriter, the zipper, celluloid film, the movie projector, and scores of other innovations were born. After the turn of the century, economists no longer had to differentiate one industry's innovations from another, but could combine products and services in economic terms to describe the country's entire economic output. With the advent of the New York Stock Exchange in 1817 came the financial market's ability to "value" a company (whether a manufacturer, a chemical company, or an entertainment concern) by its products as well as by its ability to innovate. The market for technology was born, along with the modern notions of progress that we still accept today: market share, industrial growth, and R&D. (Our most recent notion of progress is Internet speed.)
By the 1920s, at least in the United States, these concepts of modern business took firm economic root, and the course of the 20th century was set. Countries and governments rose and fell according to the industrial standards they were able to set and meet. The United States became a world power not just because of its military might, but because of the American system of manufacturing. More than the doughboy, the American assembly line turned the tide in World War I. But it wasn't just America that was following this course; although the world in general was still very provincial, and peaceful intercultural relationships scarce, the modern global economy was eventually born, and the European colonial economy slowly headed toward a self-destructive end.
The 1930s veered a bit off course. Most traditional businesses still had a 19th century perspective on what business was about. Even though their economies had become interdependent, a railroad man concerned himself only with railroads, farmers only with their farms, bankers only with their customers and finances, and manufacturers only with what they manufactured. And with the collapse of the stock market in 1929, most businesses did what any 19th century company would have done: cut their losses and started from scratch.
But because the world economy had become completely intertwined by this time, the drastic measures one business used to save itself only hurt the other businesses that supported it. Without this support, businesses fell further into the hole. Soon the hole became a chasm, resulting in the Great Depression. Historians and economists have, of course, documented the political ramifications in great detail. (For more information on the Great Depression, see www.amatecon.com/greatdepression.html.) But the Great Depression, in spite of all its hardships, finally crystallized, at least in the minds of many economists and political theorists in the United States, just how interdependent business all over the world had become. As a side note, Keynesian economics was the result of acting on this analysis, and modern macroeconomic theory was born.
The crux of modern macroeconomics (and microeconomics, for that matter) is the concept of the consumer, the producer, and the pricing model. If a firm is in business, its purpose is to generate a profit. Let's consider a simple example using old-fashioned dollars to measure a profit. If company A spends $2 to make a widget and sells it for $3, it makes a $1 profit. The consumer, on the other hand, is out to maximize value. If the consumer has $3 to spend, he or she wants to get the most that $3 can provide. One day, our consumer happens to be looking for a widget. If $2 can buy a widget of the same quality as the $3 widget, he or she will buy the $2 widget and have $1 left; this $1 is referred to as added value. Now the important point is, where did the $2 widget come from? If it costs $2 to make a widget, how can a company sell it for $2 and make a profit? Simple. Company B figures out how to make the widget for $1.90. True, company B makes only a $0.10 profit per widget, but it presumably sells more widgets. If it sells ten times as many widgets as company A, company B profits just as handsomely.
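To make the arithmetic concrete, here is a minimal sketch in Python. The costs and prices come from the example above; the sales volumes are illustrative assumptions.

```python
# A minimal sketch of the widget pricing example above. Costs and
# prices are from the text; the sales volumes are assumptions.

def profit(unit_cost: float, unit_price: float, units_sold: int) -> float:
    """Total profit: margin per widget times widgets sold."""
    return (unit_price - unit_cost) * units_sold

# Company A: makes a widget for $2.00, sells it for $3.00.
profit_a = profit(unit_cost=2.00, unit_price=3.00, units_sold=100)

# Company B: makes the same widget for $1.90, sells it for $2.00.
# Its margin is a tenth of A's, so it needs ten times the volume
# to earn a comparable profit.
profit_b = profit(unit_cost=1.90, unit_price=2.00, units_sold=1000)

print(f"Company A profit: ${profit_a:.2f}")  # $100.00
print(f"Company B profit: ${profit_b:.2f}")  # $100.00

# The consumer's added value: the dollar left over after buying
# the $2.00 widget instead of the $3.00 one.
print(f"Added value per widget: ${3.00 - 2.00:.2f}")  # $1.00
```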
Without going into pure economic theory, let's simply say technology is responsible for the $1.90 widget. There are many ways to trim $0.10 off the cost of producing a widget, including hiring less expensive labor, installing cheaper machinery, and finding lower rent. But the focus of our theory of economics is added value through technology. Company A's widget press can produce 100 widgets an hour; if it tries to produce any more, its main piston overheats and breaks down. Company A uses the Acme widget press, which has been the standard in the industry for decades. Company B, however, uses the Digital Wizard widget press, which uses a microprocessor to regulate just how many stamps the pressing piston makes in an hour. If the rate exceeds 100, the microprocessor automatically slows the piston to a more acceptable level. With this safety system in place, the Digital Wizard widget press can run much longer during the course of a business day, producing considerably more widgets at about the same price as the Acme widget press. If the overall incremental cost reduction is $0.20 per widget, then not only can company B reduce the price of its widgets by $0.10, it can still make a $0.10 profit on each of the extra units sold. This, from company B's perspective, is a strategic advantage created by technology!
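A quick sketch shows how longer daily run time translates into a lower unit cost. The 100-widget-per-hour rate comes from the example; the daily run-time hours and operating costs below are illustrative assumptions.

```python
# Illustrative sketch of the widget-press comparison above. The
# 100-widgets-per-hour rate is from the text; run-time hours and
# daily operating costs are assumed for illustration.

def unit_cost(daily_operating_cost: float, rate_per_hour: int,
              hours_per_day: float) -> float:
    """Spread a fixed daily operating cost over the day's output."""
    widgets_per_day = rate_per_hour * hours_per_day
    return daily_operating_cost / widgets_per_day

# Acme press: overheating limits it to, say, 8 productive hours a day.
acme = unit_cost(daily_operating_cost=1600.0, rate_per_hour=100,
                 hours_per_day=8)

# Digital Wizard press: the microprocessor's throttling keeps the
# piston healthy, allowing, say, 10 hours a day at slightly more cost.
wizard = unit_cost(daily_operating_cost=1800.0, rate_per_hour=100,
                   hours_per_day=10)

print(f"Acme unit cost:   ${acme:.2f}")               # $2.00
print(f"Wizard unit cost: ${wizard:.2f}")             # $1.80
print(f"Reduction per widget: ${acme - wizard:.2f}")  # $0.20
```

Under these assumptions, the $0.20 reduction leaves room for both the $0.10 price cut and the $0.10 of extra profit described above.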
This is an oversimplification, and a fanciful one at that, but it does explain the idea of generating added value by implementing technology. In this case, I used a digital example, but a different processing method or any number of other things could have created the added value. At the turn of the 20th century, technology was seen as an amalgamation of mechanical doodads and machinery to do work, the strategic advantage being gained by simply increasing production rather than by doing work more efficiently. But by World War II, the idea that technology, in and of itself, could generate real economic value was settling into the American business mindset. By the end of World War II, this concept was ready to take off.
The technical innovations created by industries around the world in the latter half of the 19th century were exploited to their fullest in the two world wars of the early 20th century. By the end of World War II, it was obvious that the 19th century way of life was gone. Instead of empires, kingdoms, reichs, and colonies, the business world would become thousands of corporations, each looking for its own strategic advantage in an acknowledged world economy. Note that both of the world wars had been "technological" wars. World War I introduced long-distance aeronautics, chemical warfare, and the foundations of wireless technology and radio. World War II produced an even more diverse explosion of new technologies into everyday life, including radar, the jet engine, the atomic bomb, helicopters, television, and of course the electronic computer.
Business Enters the Digital Age
The tabulating machines of the 1890s had proved useful not only in census taking but also to the military for calculating artillery trajectories. Since the amount of artillery in use during World War I alone was staggering, faster, more reliable calculation had become essential to the aggressively militaristic economies of the early 20th century. The problem of calculating numerical tables remained largely mechanical, however, until just before World War II. Germany began experimenting with completely electronic calculating machines in the 1930s, but because they relied on the then-new vacuum tube technologies, these electronic calculators were excessively large and expensive. But the electronic technology race was on!
By the end of World War II, the age of technology was ready to take hold. The power of ENIAC (an early electronic computer) and other machines like it was put to immediate use by the military as the first line of defense in the Cold War. But by the mid-20th century, the electronic digital computer was able to divert some of its power from military to commercial use. In the late 1950s, IBM and American Airlines introduced the SABRE application for the express purpose of real-time flight reservations, meaning that if a seat was booked in New York for a flight to Los Angeles at 12 noon Eastern time, a travel agent in Los Angeles would know about it at 9:00 a.m. Pacific time, and vice versa. This ensured that the same seat could not be booked twice, a considerable strategic advantage for American Airlines, as double-booking flights was a serious problem for the burgeoning transcontinental airline business of the 1950s. The implementation of SABRE was a smashing success, so much so that it's still in use today.
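The heart of that advantage is a simple invariant: one shared record of each seat, checked and updated atomically no matter which office asks. SABRE's actual architecture was vastly more sophisticated, but a toy sketch of the core idea (all names here are hypothetical) might look like this:

```python
# Toy sketch of the double-booking guarantee described above. This
# is not SABRE's actual design, just the core idea: one shared,
# atomically updated record of every seat.
import threading

class ReservationSystem:
    def __init__(self) -> None:
        self._booked = set()             # holds (flight, seat) pairs
        self._lock = threading.Lock()

    def book(self, flight: str, seat: str) -> bool:
        """Atomically claim a seat; return False if already taken."""
        with self._lock:
            if (flight, seat) in self._booked:
                return False             # the second agent is refused
            self._booked.add((flight, seat))
            return True

system = ReservationSystem()
print(system.book("AA-100", "14C"))  # True: New York agent gets the seat
print(system.book("AA-100", "14C"))  # False: Los Angeles agent is refused
```

Because both agents consult the same record under the same lock, the race that produced double bookings in the paper-and-telephone era simply cannot occur.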
From that point on, real-time processing offered an instant strategic advantage to any firm whose business crossed more than one time zone, and computers became a necessity for business. Thirty-four years after the Computing-Tabulating-Recording Company changed its name to IBM, "Big Blue" generated revenues of well over a billion dollars. The Information Age was here to stay.
But although technology, digital technology in particular, was now generally accepted as a sound, sometimes even necessary business stratagem, the cost of entry was still pretty steep. During the 1950s and 1960s, the cost of an IBM computer was astronomical. International and transcontinental companies could justify such an expenditure because of the advantages of doing business in real time, but local businesses, and businesses where real-time processing didn't matter much, were unable to make this computing investment quickly. These smaller local companies, firms, businesses, colleges, universities, and government agencies all turned to the digital computer in a variety of ways, and all had to somehow justify the expense. With the mood of the nation decidedly progressive, any excuse would do. If you couldn't buy a computer from IBM, you could lease one for a few thousand dollars a month. This was palatable to corporations that could directly replace labor in areas such as accounting and finance. For a few thousand dollars per month of computer time, a company could replace 20 accountants (their salaries, pensions, benefits, and sundry "people" issues) with a system willing to work 24 hours a day, able to calculate thousands of times faster, and, for the most part, rarely making mistakes. This was the stuff of both science and popular fiction. In reality, though, the cost savings were slow to accrue: the 20 accountants were merely replaced by 20 programmers, operators, managers, and technicians to support the new colossus. But what board of directors could resist making such a beast a part of its modern corporate arsenal?
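The board's back-of-the-envelope arithmetic is easy to reconstruct. Every dollar figure below is an illustrative assumption in period dollars, not historical data:

```python
# Hypothetical 1950s cost justification. Every dollar figure below
# is an illustrative assumption, not historical data.

accountants = 20
cost_per_accountant = 6_000             # assumed salary + benefits per year
labor_replaced = accountants * cost_per_accountant        # $120,000/yr

lease_per_month = 5_000                 # "a few thousand dollars a month"
support_staff = 110_000                 # programmers, operators, managers
machine_total = lease_per_month * 12 + support_staff      # $170,000/yr

print(f"Labor replaced:  ${labor_replaced:,}/yr")   # $120,000/yr
print(f"Machine + staff: ${machine_total:,}/yr")    # $170,000/yr

# As the text notes, once the 20 programmers, operators, and
# technicians are counted, the "replacement" often cost more than
# the accountants it displaced, so the savings were slow to accrue.
```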
So, by the 1970s and 1980s, computer systems, especially the mainframe, were well established in the international business community. By then, however, the fiction and romance had given way to experience and bottom-line common sense from a generation that had used these systems both successfully and unsuccessfully. Anyone who knew even the most basic computer programming languages was in very high demand, although the managers hiring or even interviewing these people sometimes had very little knowledge of computers or programming. By this time, the mainframe had become like the family car: relied on daily, with little idea of how it worked.
Now that we have established that business and technology are almost synonymous in the 20th century, we can take up one of two general points of view: either we ourselves are the purveyors of technology (that is, our product or service is technology), or our organization's main business is not technology but rather the use of digital technology to gain a strategic advantage in our particular market. There is, of course, a third option: an economy composed of enterprises that produce technology by relying on other companies' technologies. But this breed of business is still relatively young in the world of digital technology.