The Future of Technology, Part 2
- The Era of Personalization: Into the Crystal Ball
- The Second Industrial Revolution
- The Rise of the Knowledge-Based Economy
- Harris Kern's Enterprise Computing Institute
Part 1 of this series described some of the hazards and rewards of technology forecasting in the Digital Age, a common requirement for any IT manager. Businesses must be able to determine what type(s) of technology they need to create, utilize, or disseminate in order to maximize return on investment not only in computers, networks, and software, but also in less visible technology costs such as support staffing, hidden R&D requirements, training, customization, loss of productivity during upgrades, and so on.
We've concluded that options analysis is the most reliable way of predicting the possible impact of new technologies. To take advantage of technology improvements, we have to be able to predict "potentialities," which requires some comprehension not only of how modern electronics have changed and are changing our lives and our businesses, but also of what effect those changes will have on future technologies.
The Era of Personalization: Into the Crystal Ball
The transistor paved the way for miniaturized electronics. Miniaturized electronics shrank digital computers, which once filled entire rooms; now computers can fit in the palm of your hand. Using technology to develop and enhance existing technologies has allowed us to make chips that carry more memory than the original ENIAC (the first general-purpose electronic computer built in the United States) in a square millimeter or less. When it comes to memory chips, knowledge really is power. A single silicon wafer can hold billions of transistors. So what are we going to do with all this power? We can assume that we'll continue down our current path of using advances in technology to make other technologies more useful.
Currently, personal digital assistants (PDAs) are all the rage. PDAs are small, handheld devices that allow us to store and gather information anywhere, anytime. They can store a fair amount of data, run relatively small programs, and interface directly with larger desktop computer systems. The cutting-edge models allow Internet and email access. Now that we can use PDAs to access Internet-based applications and email from anywhere, it's a relatively safe assumption that voice will eventually be added, blurring the line between digital telephones with Internet access and the common analog telephone. When this occurs, we'll have a completely interconnected personal communications device. The device will become smaller and smaller, replacing cumbersome button pads and handwriting-input displays with a voice-command system. In parallel, digital video will be transmitted, error-free, over wireless connections. We'll actually be able to see the people with whom we're speaking.
Once the device is the proper size, say about the size of a large wristwatch, we will strap it on and go about our business. Want to phone a friend? Press a button, raise the watch to your lips, and simply say "Dick Tracy." This may sound whimsical, but start paying close attention to business articles on the advancement of wireless and PDA technologies. In the next year, we'll begin to see actual examples of what was once fanciful. If it can be conceived, it can become real. The motivation is simple: What would we pay for a device like this? If the elements of these technologies exist, it's just a matter of time before someone has both the imagination and the resources to put them together. Because digital logic chips are produced on an almost microscopic scale, programmable "intelligence" will allow almost anything, from coffeemakers to digital alarm clocks, to become a "smart appliance." The question then becomes, do we need our appliances to be any smarter than they already are? Do you want your refrigerator to order groceries if it detects you're running low on milk? Perhaps not. It's up to the real world and the good old-fashioned "invisible hand" of Adam Smith to decide.
In the meantime, the iron that turns itself off and the oven that can cut its own temperature a few degrees after 20 minutes would be immensely useful items to anyone who uses either appliance on a regular basis. But how do you interface with irons and ovens? Button pads are simple, easy, and already work well with some appliances (microwave ovens, for example), but a button pad would seem oddly out of place on an iron. And even if buttons could be embedded more unobtrusively, would the average consumer want button pads on everything? Where do we start setting limits? Basic human instinct, more than any other force, has driven the direction of technology. Somehow a "collective consciousness" has separated the genius from the crackpot.
So, what if we could talk to our appliances? I'm not interested in chatting with my clothes dryer, but I'd like to be able to give my kitchen appliances simple, direct commands such as, "Tea. Earl Grey. Hot." Ah, there we go again, off to the realm of science fiction. The technology to do this already exists; we're simply waiting for voice-recognition software to catch up with the hardware. And even if this technology were on the mass market right now, we would still want our teapots to look like teapots and not like the office desktop PC. But what if our standard home desktop system could handle sophisticated voice-recognition software and also communicate with various appliances and conveniences? What if we simply used the following phrase: "Computer? Tea. Earl Grey. Hot." We now go from the plausible to the very probable. Let the desktop become like Star Wars' R2D2 and turn our complex English-based commands into simple digital infrared signals and binary beeps; that's how electronic alarms and garage door openers communicate every day. In fact, given the time, the money, and of course the inclination, I could wire my own house right now to make my coffee, start the shower, cook the oatmeal, and wake me up at 6:30 every morning.
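To make the idea concrete, here is a minimal sketch in Python of what such a desktop "hub" might do, assuming a speech-recognition front end has already turned the spoken phrase into text. The appliance names and codes are invented for illustration, not a description of any real product:

```python
# A hypothetical "hub": map a recognized phrase to the short digital
# code a transmitter would send to an appliance, much as a garage
# door opener sends a fixed signal. All names and codes are made up.

APPLIANCE_CODES = {
    ("computer", "tea"): 0x01,     # imaginary kettle command
    ("computer", "coffee"): 0x02,  # imaginary coffeemaker command
}

def dispatch(phrase):
    """Return the appliance code for a recognized phrase, or None."""
    words = [w.strip(".,?!").lower() for w in phrase.split()]
    for (wake_word, appliance), code in APPLIANCE_CODES.items():
        if wake_word in words and appliance in words:
            return code
    return None  # unrecognized command

print(dispatch("Computer? Tea. Earl Grey. Hot."))  # prints 1, the kettle code
```

The hard part, of course, is not this lookup table; it's the voice recognition in front of it and the transmitter behind it.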
Now let's take the next logical leap. The idea of intelligent devices starts as a personal one, such as a television or a coffeepot. The idea is limited to our personal realm. But what happens when we go beyond our world to the world at large? When we add wired and wireless communication to our technology-immersion scenario of the 21st century, not only can we talk to the teapot, but we can do it from 2,000 miles away! Will we have communication at the speed of thought? Will we be able to transport around the globe at the speed of light? If this is what we're able to do here on earth, then what won't we be able to do in outer space? Will we have intergalactic starships? These are noble notions, but a bit beyond the scope of this article. The question for this article is this: How will these technologies affect our personal and business lives, both as individuals and as organizations? Our level of knowledge at any given point can be reaffirmed, reassessed, or reevaluated by an event happening anywhere in the world, at any time. With such enormous reach, how in the world are we going to assess the richness of all the information we can access and put to work for us?
As these questions are answered one at a time, humankind will witness subtle but profound changes in our economic and social thinking. We'll begin to merge into a truly interglobal society, not just an economic "global village." Decisions will be made on a global scale. We'll watch television shows from Moscow, buy shoes from Mali, and make a date with someone in Singapore. And it won't require excessive wealth to do so. True, the idea of first, second, and third worlds won't go away simply because of technology, but technology will increasingly define the borders among the three; the economics of technology, more often than not, are simply based on accessibility. Once technology is obtained, its value is based on its ability to act in a predictable and reliable fashion.
Unfortunately, a good deal of current economic thought centers on control and predictability. This can be good when kept in the context of what can be controlled and predicted, but in business, as well as in life, control and predictability usually make up the smallest portion of the pie. DEC was a prime example of a company trying to control the uncontrollable. After all, networks exist only because someone installed them. Certain companies, which began putting together our current telecommunications systems at the turn of the last century, have almost as much power to re-create those telecommunications systems in the new century. But the challenge becomes how to do so without losing out on the investment of the prior century. This is why wooden telephone poles with copper wire, conceived 100 years ago, still exist and remain the dominant form of telecommunications, at least in the U.S.
Whether pushed by federal intervention or the demands of the marketplace, some form of broadband Internet access will eventually be as common in most households as the standard telephone line is today. As broadband becomes more prevalent, our personal world can link into the real Net. Although humankind will continue to be separated by thousands of native tongues, the Internet will define some level of standardization across the globe. Some of those standards will be hard-fought and perhaps some even foolish, but they'll exist for the world as a whole! This alone will be the shining pinnacle of technology's role in making the world a better place. This worldwide standardization should make our lives simpler—our experiences, for better or worse, will become much more universal.
The time is coming when information technology will be everywhere and in everything. This doesn't seem to narrow the gamut of possible directions in which technology can go, but in truth, it actually does. Eventually, three fields will drive universal standardization and the ubiquitous reach of information technology: telecommunications, computation, and imaging. Let's try to catch a glimpse of where this convergence should be going in real terms. We'll start by taking a close look at how technology has been converging.
Information technology in the last half of the 20th century affected information in four stages:
Collecting, processing, and presenting data. We're most familiar with this use, for example, in statistical tabulation of the U.S. census since the late 19th century (the original Hollerith census).
Interpreting data into information. Only in the last 20 years have we begun to consider the immense amount of data we create and store every day. From databases sold to telemarketers to artificial intelligence systems, raw data, if formatted and used correctly, can become value-added knowledge (see the sketch after this list). In other words, we're rapidly becoming a society that never forgets. Almost any small item of data, once recorded and stored, will always be there for somebody to access.
Generating knowledge. This, in my opinion, is the current borderline. Right now, the technology we have available to gather and store knowledge doesn't necessarily create new knowledge. That still requires human intervention, because the actual border between data and knowledge is very difficult to define. After all, a silicon chip can contain data, but the technology that creates the chip is also knowledge. How do we capture that differentiation on a chip?
Generating wisdom. When we can generate more wisdom from a computer than we can from our own experience on earth, will this be a step forward, or a step backward?
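As a toy illustration of the second stage, interpreting data into information, here is a minimal Python sketch; the records and field names are made up for the example:

```python
# Raw transaction records (data) are collapsed into a per-customer
# summary (information) that a person or program can act on.
from collections import defaultdict

raw_data = [
    {"customer": "A", "item": "tea", "price": 4.50},
    {"customer": "A", "item": "kettle", "price": 30.00},
    {"customer": "B", "item": "coffee", "price": 6.00},
]

def summarize(records):
    """Total spend per customer: data in, information out."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["customer"]] += rec["price"]
    return dict(totals)

print(summarize(raw_data))  # {'A': 34.5, 'B': 6.0}
```

The leap from this kind of summary to genuinely new knowledge, the third stage above, is the part that still requires human intervention.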
Not only will humankind's societies begin to merge, with notable exceptions of course, but so will its technologies. As with our cellular phone/PDA example, the line between technologies has begun to blur. Once digital integration becomes widespread beyond teapots and toasters, the digital revolution will be complete.
With the convergence of technology and our lives, will we be able to tell where users start and technology ends? Of course, the most rapid changes will first creep into the way we do business and then into our personal lives. Whether we're at work, at home, or both, we'll most likely be using the same networks. You may think this is the case now, since telephone networks are generally the same for a business or the home. But there are different services, different prices, and obviously different ways of making a connection, provided by an ever-increasing variety of companies. As the Internet grows, and broadband Internet in particular, the equivalent of the "Baby Bell" will be the ISP, but with a dramatic difference: Whereas a Baby Bell has a localized monopoly on all telephone access in its area, ISPs can't "control" access to the Internet. And as the demand for Internet access continues to heat up, so will the competition between ISPs. This competition will spur an ever-increasing swell in productivity among media companies, network providers, and all the companies that will provide the technological infrastructure of the Internet Age, surpassing the productivity gains of the steel boom at the end of the 19th century.
Slowly but surely, every piece of information that can be digitized will become accessible, recordable, and processable on the Internet. Beyond the obvious security and privacy issues, this means that the Internet will allow a single individual to renew a driver's license; order groceries; send invitations to the latest party; and watch personalized television, old movies, and home videos. The home entertainment center, which now exists as a group of units (television, VCR, stereo, DVD, etc.) merely placed together in the same area, will become a single, universal, digital, individually programmable unit.
As a result of this personalization, the industrial revolution of the 21st century will be driven by the opposite of the force that drove the industrial revolutions of the 19th and 20th centuries. Until now, the focus of most economies has been to produce goods and services inexpensively in mass quantities. The assembly line has been the focal point for increasing output by decreasing costs. But as information is gathered about our day-to-day working and business lives, the true measure of success in the information economy will be the degree of personalization that manufacturers and service providers can embed into their products.
This personal information, though sometimes intrusive, should be benign. It should still enable a coffeemaker not only to brew coffee in the morning, but to brew it to just the right temperature every time and, perhaps, to measure out the right amount of coffee to brew (a feature I anxiously await). This more acute degree of electronic knowledge will slowly take precedence over the more common attempt to combine as much knowledge into specific categories as possible. Although artificial intelligence has been available for years, it has never developed a strong niche in any product or service. As a result, computers can store vast amounts of data, but cannot store the instinct, intuition, or insight needed to use that knowledge in ad hoc situations.
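To picture what embedding a preference profile in a product might look like, here is a minimal Python sketch of a hypothetical smart coffeemaker; the class, field names, and numbers are all invented for illustration:

```python
# A stored profile drives brew temperature and dose, rather than
# one-size-fits-all defaults. Purely illustrative.
from dataclasses import dataclass

@dataclass
class BrewProfile:
    temperature_c: float   # the user's preferred brew temperature
    grams_per_cup: float   # coffee dose the machine measures out
    cups: int

def brew(profile: BrewProfile) -> str:
    """Describe the brew this profile would produce."""
    dose = profile.grams_per_cup * profile.cups
    return (f"Brewing {profile.cups} cup(s) at {profile.temperature_c} C "
            f"using {dose:.0f} g of coffee")

print(brew(BrewProfile(temperature_c=93.0, grams_per_cup=10.0, cups=2)))
```

The product's value here lies not in the brewing hardware but in the few bytes of preference data it remembers about its owner.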
But artificial intelligence on a smaller, more personalized scale is useful in a variety of ways and is gradually becoming widely implemented. This "soft intelligence," commonly delivered via "cookies" and home entertainment boxes, is the subtle key to the coming era of personalization. Cookies are simple files that a web site asks to be stored on the local system it is talking to across the network. They hold the information needed to tell information providers whose PC they're talking to, and possibly what that PC's user wants. It is hoped that, with some restraints, information providers can personalize their content to our tastes. As tools to access information evolve, our personal choices will evolve as well. Our entertainment centers will soon begin to aggregate all of our home activities through a central area; we will access all this information from one common appliance, the "flat screen."
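For the curious, here is a minimal sketch of that cookie exchange using Python's standard http.cookies module; the cookie names and values are invented, and real providers of course do far more with them:

```python
# Sketch of the cookie mechanism described above. Only the
# http.cookies usage is real; the names and values are illustrative.
from http.cookies import SimpleCookie

# Server side: ask the browser to store an identifier and a taste hint.
outgoing = SimpleCookie()
outgoing["visitor_id"] = "abc123"        # hypothetical identifier
outgoing["content_pref"] = "technology"  # hypothetical preference
print(outgoing.output())  # the Set-Cookie headers the server would send

# A later request: parse the Cookie header the browser sends back,
# so the provider knows whose PC it's talking to and what to show.
incoming = SimpleCookie("visitor_id=abc123; content_pref=technology")
print(incoming["content_pref"].value)    # -> technology
```

Multiply that tiny exchange across every site and appliance we touch, and you have the raw material of the era of personalization.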