Software Issues 2: The Future of Programming
What are the prospects for programming in the future? With technology advancing so rapidly, what will programming be like in 10 years, or 300 years? Will programming as we know it be forgotten like pyramid construction, silent filmmaking, and Viking shipbuilding? Of course, anyone attempting to make guesses about the future can only base their thoughts on the present and past. Given the short history of programming and what we know about its fundamental nature, what can we say about its future?
When you want to see the future, one good place to start looking is science fiction, in the crystal ball called television. We've all seen the future of software development on Star Trek. Ensign Paris aboard the Starship Voyager is able to program a detailed, interactive new holodeck simulation by pressing a few keys on a touchpad. Seven of Nine routinely reprograms the deflector array to save the ship within the last few seconds of each show. Neither is a specialized software programmer.
Is this the future of programming? Will technology provide software development tools so sophisticated that even a nonprogrammer can communicate such immensely complex specifications with so little effort?
Consider another Star Trek icon, Commander Data from "Star Trek: The Next Generation." He was a man-made android so sophisticated that he qualified as a legitimate life form. The possibility of such an incredible technological achievement is apparently not far-fetched. Scientists today are already deeply immersed in refining the requisite technologies, including nanotechnology, microcircuitry, and robotics. Futurists tell us that it's only a matter of time until these fields are sufficiently mature to build such sophisticated machines.
If recent history is any indication, the futurists are probably too conservative when it comes to hardware. Not very long ago, computer experts were predicting that clock speeds would soon reach their theoretical maximum of 100 MHz. Of course, we now know that reality has blown past those predictions by an order of magnitude, even though they seemed so optimistic at the time.
Can software keep pace with the breakneck speed of hardware advances? Will software continue to benefit from faster processing speeds, or will it gradually become the limiting member of the hardware/software team?
One thing is for sure: If software development doesn't become much more efficient than it is now, poor Commander Data will be a physical marvel but an intellectual simpleton. There would simply be no way to write enough code using current practices to give him his level of artificial intelligence. (Although the term artificial intelligence may itself be nothing more than human bias. If something shows intelligence, it's simply intelligent; what's artificial about it? Is a dog, or even a human child, artificially intelligent because we train the dog or teach the child?)
Anyway, about the same time that all the 100 MHz predictions circulated, software gurus were making predictions of their own. They predicted that software development would soon be radically different. Language translators would convert code from one language to another transparently, effectively removing any language dependencies. They predicted that soon there would be "fifth-generation" languages that would make current languages obsolete relics.
Yet software languages remain essentially unchanged. Certainly there are new vendors, new languages, and improved operating systems and libraries, but these improvements are evolutionary, not revolutionary. Nothing on the software horizon even hints at advancement potential comparable to that of emerging hardware technologies like, say, organic circuitry.
Software is all built on the same old if-then-else coding logic that goes back to assembly language. "Expert systems" were pretty much glorified if-then-else generators. Even the radically strange neural network technologies were based on if-then-else logic and math operations. The most modern languages and code generators still just package good old if-then-else statements beneath a thick layer of hype.
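To make the point concrete, here is a minimal sketch in Python (the rules, weights, and function names are invented purely for illustration) showing how both an "expert system" rule and a single neural-network neuron reduce to if-then-else statements plus ordinary arithmetic.

```python
# A toy "expert system": the rules are nothing but if-then-else statements.
def diagnose(temperature_f, has_cough):
    if temperature_f > 100.4 and has_cough:   # Rule 1
        return "flu"
    elif temperature_f > 100.4:               # Rule 2
        return "fever"
    else:                                     # default rule
        return "healthy"

# A toy neural-network "neuron": a weighted sum (plain arithmetic)
# followed by an if-then-else playing the role of the activation.
def neuron(inputs, weights, threshold):
    total = sum(x * w for x, w in zip(inputs, weights))
    if total > threshold:
        return 1
    else:
        return 0

print(diagnose(101.2, True))                     # -> "flu"
print(neuron([1, 0, 1], [0.5, 0.9, 0.4], 0.8))   # -> 1  (0.5 + 0.4 = 0.9 > 0.8)
```

Strip away the terminology and the training procedures, and the executable core is still conditionals and arithmetic.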
What else could software possibly do? It's difficult even to conceptualize a language that isn't essentially if-then-else in the final analysis. That's how we think. But hardware is not similarly constrained. We can build machines based on radically different technologies from our own. Imagine if machines were limited to flesh and blood. How much could they advance? They would hit an immovable ceiling inherent in their basic organic composition.
Does software have a similar ceiling? Have we already reached it? It's reasonable to ponder whether it's even possible to teach a machine to think differently than we think. Even if we were able to think up a new way of thinking, how would we communicate it? Our brains can think in if-then-else logic and perform simple computations. We do associations better than machines, and computers have better memories and compute faster, but we both think the same way. Could we possibly teach machines to think in a new way that we can't? There may be an inescapable limitation preventing it.
Well, so what if our software languages are constrained by if-then-else thinking? Won't fantastically large memory capacities and processing speeds make that point moot? Our Commander Data will have virtually unlimited memory capacity and instant, infallible recall. He'll be able to process a superhuman number of if-then-else logical operations each microsecond. We can compensate for quality with quantity.
However, if the future computer processes if-then-else logic statements at incredible rates, how would all that logic get programmed? How many analysts would it take to develop all the if-then-else rules of intelligence? How many planners would it take to organize it all? How many programmers would it take, working around the clock, to program all that logic using future languages that are little different from the ones we use today?
We tend to always assume that technology will find an unanticipated solution, but technology may not be able to save us this time. Software may indeed prove to be the limiting factor in our technological progress. Remember that programming is not essentially a technological challenge. It's a communications challenge. We haven't been able to improve significantly on spoken and written communications since the first humans grunted something meaningful. We can transmit words at incredible rates, but it would still take ages for humans to write them or read them on either end. Software development faces the very same insurmountable barrier—us.
All this is not intended to claim that we can't do better, or that we won't make tremendous strides. I hope and believe that we will. There are certainly lots of people doing innovative, groundbreaking work in software development. However, in order to make the kind of strides we need to keep up with technology, it's useful to recognize that the challenges before us are not purely technical. We face fundamental challenges that are intimately tied to our own communications limitations and are extremely difficult to overcome.
In the worst case, think of it as job security. Our jobs aren't likely to be eliminated by technology. Quite the contrary, our challenges will get tougher as hardware begs for ever greater intelligence. Writers have been around forever, and despite moving from quill and parchment to word processors, their craft is essentially unchanged. In the same way, software writers will be around for the unforeseeable future, still using our tiny if-then-else vocabulary to communicate with the computers of the future.
About the Author
Tyson Gill is the director of information technology at Alitum, Inc., in San Diego, California. He also teaches Visual Basic and Microsoft .NET programming at the University of California, San Diego. He is well known for his influential presentations on design, architecture, planning, and coding. Tyson is the author of Visual Basic 6: Error Coding and Layering (Prentice Hall, 2000, ISBN 0-13-017227-8) and the upcoming title Planning Smarter: Creating Blueprint-Quality Software Specifications (Prentice Hall, 2001).