Does Technology Have a Will of Its Own?
We know that weather behaves in certain ways, and an understanding of weather patterns helps forecasters do a better job. For example, when a mass of cold air collides with a mass of warm, humid air, we can expect storms. Observation and analysis then suggest how violent such storms are likely to be, and where and when they will probably occur. Students of technology have long debated and studied the question of whether technology, too, has an internal dynamic of its own, one in which patterns of behavior can be identified and assembled into a body of knowledge about how it evolves. If we could forecast the behavior of technology, the benefits would be enormous. One could predict when new capabilities would arrive, when to adopt or drop a device or technology, when to invest, and so forth. The issue is so important that students of technology have emerged in many disciplines: history, economics, philosophy, literature, political science, engineering, science, and even cultural anthropology and sociology. As American society became increasingly aware of the importance of technology during the Cold War, American experts joined with those from Western Europe, and later East Asia, to study the nature of technology. Historians have conducted some of the best studies on the issue.
The term often used to capture the essence of the debate is technological determinism. As one student of how historians study the issue defined it, "determinism asserts, in the first place, that technological change occurs within a purely internal frame of reference, that norms of internal and functional efficiency are the only ones governing it." Put another way, "determinism asserts that modern technological change follows a fixed and necessary sequence." The essence of the debate is whether or not technology evolves according to its own internal set of laws, regardless of what people might choose to do. The debate is intensified and broadened by claims that a technology itself did something. For example, the statement that the car created suburbia, or that the Internet globalized the economy, attributes enormous power to a specific technology. Other examples come from various walks of life: atomic bombs took away the ability of the United States to declare war, computers made IBM the largest IT supplier of the late 20th century, hand calculators have ruined children's ability to do math, automation has destroyed millions of jobs. Does an invention take on a life of its own? Two students of the subject, Leo Marx and Merritt Roe Smith, have made a very important point: the mass media (and, I would add, those writing for business managers and public officials) have been the most vociferous in assuming or arguing that technology takes on a life of its own. The majority of historians, on the other hand, including these two, have argued a different case. Most historians see a variety of influences affecting the nature of technology and how it evolves: the values people hold, the attitudes of those who develop technologies, existing technical and economic realities, market conditions, and other political and social priorities.
Historians who have looked at hundreds of instances of technological evolution have found many influences at work, thereby providing a great deal of evidence that technology does not necessarily have a life of its own. These influences have ranged from social attitudes to political and legal choices, from the circumstance of events and economics to available materials and knowledge. One simple example illustrates their perspective. In the 1940s, military airplanes suffered a high number of catastrophes as wings fell off in flight, owing to faulty mathematical calculations made during their design. In the early 1950s, the U.S. Air Force mandated that the wings of all aircraft designed for its use be engineered using software created for that purpose, because the calculations and engineering would be more accurate. The software, called Computer Aided Design, or CAD, was expensive at the time and not easy to use, and the computers it ran on were not as efficient as engineers would have liked. Many would have preferred simply to design with blueprints and slide rules. But the military made the choice. One consequence was that wings stopped falling off airplanes. Another was that during the 1950s, 1960s, and 1970s, enormous advances were made first in CAD software and then in a by-product called Computer Aided Manufacturing (CAM). Once you had digital data on a design, it was not a great leap in logic to use that data to instruct machines to make parts that were exactly what the blueprints called for, with little or no human intervention. CAD would not have developed as early as it did had it not been for the U.S. Air Force, and who knows how much later CAM would have arrived. The engineering and mathematical possibilities of using software were always there, but other influences determined how the technology would evolve. That is why, for example, the earliest and most extensive users of CAD/CAM were in the U.S. aerospace industry. In time, this technology diffused into all manufacturing industries and around the world.
The issue of determinism spills over into discussions about technology and society. It is not uncommon to see comments like "Americans have a tendency to invest in technology when they have a problem to solve." I have made comments like that all through this book. But is the tail wagging the dog here? True, Americans have a tendency to rely heavily on technology, a behavior that has been reinforced over time by numerous successes with technological innovations. Does that mean that Americans will always reach for technology? Does it mean that they never have relied, and never will rely, on alternatives? Are Americans prisoners of a pattern of behavior in which technology takes on a life of its own? In other words, is the process so deterministic that people become captive to the way technology changes, or to its seizing control over patterns of behavior? Some scholars have argued that this has in fact been the case.
At the risk of making an outrageously gross generalization, a solid look at the debate has to lead one to the conclusion that the majority of students of technological evolution do not buy the strict argument of the determinists. The more they study the issue, the more they see various influences at work in shaping the course of technological evolution. Sometimes a breakthrough in science is at work, such as the sudden appearance of the transistor. In other circumstances, political and military realities drive a development, as with the creation of the atomic bomb. In still other situations, economic opportunities provide the incentive, as in the case of how PCs have evolved. The reverse can occur too, as in the slow evolution of television technology, or in commercial aircraft flying at essentially the same speeds as 25 years ago; in neither case were there economic incentives to change. U.S. automotive companies were guilty of the same behavior during the 1960s and 1970s when it came to designing vehicles that consumed less fuel, resisting change until Japanese competition forced their hand.
The form that a particular technology takes is also influenced to a great extent by cultural and social attitudes. For example, the Japanese do not have a great deal of space in their offices and homes; therefore, much of the consumer electronics they have built is small, in order to fit into those spaces. Consumer electronics in the United States tended to be much larger because space was not such a critical factor. It was only after the Japanese came to dominate the American consumer electronics industry that products such as stereos and TVs available in the U.S. shrank in size. Americans love big vehicles, so it is no surprise that over the course of nearly a century the largest number of models of big cars and trucks came out of the United States. European "18-wheelers" are often 12- or 14-wheelers, and their cabs are smaller than American truck cabs. The smallest automobiles in the world are regularly made outside the U.S.
For reasons we need not go into here, European book publishers have published paperbacks more often than their American counterparts. American publishers like to publish hardback versions of books first and paperback editions later. Both have access to the same printing and binding technology, and both publish the same kinds of material. At the end of the 20th century, many large European publishing conglomerates moved into the American market, buying up well-established U.S. firms, as when Pearson acquired Prentice Hall. They then imposed European styles on their American acquisitions. That is one reason why, for example, you are now beginning to see business books published in the U.S. appearing first as paperbacks. Yet the technology to produce another format was always there. Although one can argue that U.S. publishers were very late to play the digital game, they now get it: they are producing digital products such as books on tape, chapters on the Internet, and CDs, and they are publishing first editions directly on the Web before going to paper.
Is this another example of the American tendency to apply technology when it is not needed? The answer, I believe, is simply no. This society does reach out to technology for answers because so many times in the past technology has helped solve problems and improve performance, or enhance profit and pleasure. But Americans have also been just as alert to the problems posed by technology and have not hesitated to back away from it. The closure of so many nuclear power plants in the U.S. in the 1990s, the demand that automobile engines burn fuel more cleanly, and the political strategy of avoiding the use of atomic weapons during the Cold War are just a few illustrations of this society's ability to constrain its use of technology at times. In short, nationally and at the personal level, technologies are picked or rejected for a variety of reasons outside the control of the technology itself. The same has always applied to information and its tools. That is why, for example, new information-handling tools have to earn their acceptance. While Americans are quick to see the benefits of a new tool, that does not mean they adopt it blindly. It is their understanding of information technologies that gives this society the expertise to judge when the time is right to embrace new things.
The reason for going through this discussion of technological determinism is to point out that in theory, and sometimes in practice, one does not have to adopt a technology, give up something that works, or tolerate a specific form of a device. These are choices to be made. America's love of information and its technological forms is not inevitable. The historical reality is that it could change, and in all probability will, in ways both predictable and unpredictable. But because we have choices, and because choices normally rest on assumptions and perceptions of possibilities, any forecast of the future of information and its technologies should begin with some assumptions.