Languages
How many spoken languages has our species uttered? Some 6,000 remain today, despite the language-pruning effects of the last century's global competitive pressures, yet some experts assert that all but 250 to 600 of them will be extinct by the end of the current century.[20] While it might be more efficient from an economic perspective if all humans spoke one and only one language, how immeasurable would the loss be?
Imagine our world today without the influence of Plato, Aristotle, or Archimedes. Imagine Western legal systems without Exodus or the New Testament. Though their languages are officially extinct, civilization today is ineffably indebted to the written words of ancestral giants, many of whose tongues would otherwise be silent in the modern world. Language conveys thought beyond the essence of message-passing. It provides the foundation for worldview, which cannot be expressed with words but rather between them. Alas, the loss of human language diversity is accelerating, a consequence of our shrinking planet.
Computer languages may be subject to Darwinian selection mechanisms, but once compiled, the binary code will run as long as a processor exists that can execute the target instruction set. In a metaphorical sense the language is still in use, although not "spoken." This too presents a problem for our collective well-being. While spoken languages disappear if not used by a large enough body of humans, computer languages can live on for years after the last line of code is compiled. As such, the maintenance of aging code can become problematic, especially when the larger fitscape rewards developers who may be fluent in whatever language happens to be popular at the moment.
An example of this exposure was the much-hyped Y2K problem, which never materialized because so many organizations made considerable efforts to ensure that applications were Y2K-aware. But the lessons of COBOL (which is still in use, albeit rarely the language of choice for new NDC development projects) and of language paradigm persistence should be clear: the code we write today may last a lot longer than we anticipate. As such, our choice of language is as important as any other project choice we make.
From the 1970s through the 1990s, many computer programs were written that used only a two-character field to contain the year of any particular date, to save memory and storage. This practice was based on the assumption that the data, and the code written to manipulate that data, would not survive past the turn of the century. Unfortunately, developers who made this assumption were wrong. When the calendar turned from 1999 to 2000, applications designed to use only two characters to represent the year were suddenly confused, because not only was data storage involved in the shorthand but algorithms were as well. Previously sound code would suddenly break, or so the Y2K story went. But a cascading set of system crashes did not occur. Indeed, the calendar turned without much notice from the IT community. Perhaps due to the investments made by large organizations to correct the problem in advance, or perhaps due to overestimates of the amount of Y2K-injured code, the problem in retrospect appears to have been a nonproblem.
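Whatever the reason, the failure mode itself is easy to illustrate. The following minimal Java sketch is hypothetical, written for this discussion rather than drawn from any actual Y2K-era system; it shows how the two-character shorthand infected arithmetic, not just storage:

```java
// Hypothetical illustration of the Y2K two-digit-year shorthand;
// not code from any real system.
public class Y2KSketch {

    // Computes elapsed years from two-digit year fields such as "87" or "00".
    // The hidden assumption: both dates fall within the same century.
    static int yearsElapsed(String storedYear, String currentYear) {
        return Integer.parseInt(currentYear) - Integer.parseInt(storedYear);
    }

    public static void main(String[] args) {
        System.out.println(yearsElapsed("87", "99")); // 12  -- correct in 1999
        System.out.println(yearsElapsed("87", "00")); // -87 -- nonsense in 2000
    }
}
```

Any code that compared, sorted, or subtracted such year fields shared the same exposure, which is why remediation meant auditing algorithms as well as widening database columns.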
NDC efforts have brought computer languages into the forefront of research as well as coolness. For example, the Java programming language arguably bootstrapped a new generation of Internet applications. With the Java specification, the essence of C++ without its shortcomings was married to byte code and the promise of Write Once, Run Anywhere, and new models for applications that would span the public networks could be imagined. The Java platform was a solid step in the direction of a network-aware platform. The fact that Microsoft's C# (pronounced "C sharp") platform is a close syntactic and runtime mirror of Java platform designs attests to the visionary appeal that brought Gosling's invention such dramatic acclaim.[21]
But just as nothing stands alone in the Network Age, nothing stands still. The Java platform has evolved considerably since "dancing Duke" first graced a Web page.[22] And since languages are the first line of expressive capability, it follows that computer languages in a general sense should continue to evolve.
As with all the other categories of NDC, language evolution represents its own complex fitscape, containing layers within layers: a computer language ultimately relies on an underlying theory of computing to provide a usable tool set from which NDC developers can implement their algorithms of choice. Examples of language R&D that NDC developers should at least be cognizant of going forward include the following:
- Java/C#: Object-oriented, virtual-machine-interpreted languages
The Java platform started evolving from the moment it was announced. From its modest beginning came a standard version (J2SE), an enterprise-aware version (J2EE), a version targeting smaller and mobile devices (J2ME), a community process to enable standardization (the Java Community Process), and much more. The dream of Write Once, Run Anywhere, once a holy grail of computer labs worldwide, unfolded thanks to a C++ type of syntax, a virtual machine, baked-in security considerations, and the collective imaginations of millions of developers around the world. Java (and Microsoft's C#.NET) is one aspect of NDC languages that continues to evolve because of research and fitscape development pressures.
- XML (eXtensible Markup Language): A language for high-level language creation
With XML came the ability to create HTML-like higher-level languages to serve specific needs. Portable data, supported by the portable behavior of the Java platform, proved a compelling vision. But XML provides only the basis for creating metadata-centric approaches to data standardization. The hard part is the creation of domain-specific extensions; such efforts require communities of agreement, not unlike the JCP. An analogy would be a new spoken language: if I decided to invent a new language, it would provide no benefit to anyone unless others understood and spoke the same tongue. Now imagine that this new language must be created and agreed to by a committee of individuals who will ostensibly use it to compete with each other. That's the inherent problem XML faces (a brief parsing sketch of an invented vocabulary follows this list).
- The Fox Project: A strongly typed intermediate language with proof-carrying code
This language layer was pioneered at Carnegie Mellon University. Funded by the Defense Advanced Research Projects Agency (DARPA) of the U.S. government, the Fox Project has the goal of developing language support for building safe, highly composable, reliable systems. Reaching that goal is deemed to require a return to the mathematical basis for programming languages. While the current emphasis has been on ensemble composition in embedded systems, the Fox Project is a comprehensive program of research applying the theoretical foundations of programming languages to development tools and techniques for systems in general. Interesting features of the Fox Project include the following:
  - Typed intermediate languages, which extend the benefits of type safety enjoyed by higher-level source languages to the intermediate and target languages of a compiler
  - Certifying compilers, which provide a foundation for trust-free code dissemination, by which code can be shared in an untrusted environment without sacrifice of safety
  - Proof-carrying code, a technique by which a host computer can automatically verify that code provided by an untrusted agent is safe to execute
- π-calculus: A theoretical computing model for mobile code
Robin Milner of the University of Cambridge is one of the early advocates of this theoretical model. The π-calculus recognizes communication as a fundamental component of any theoretical computing model; it differs from other models of communicating behavior primarily in its recognition and treatment of mobility: the movement of pieces of data inside a program is treated exactly the same as the transfer of a message (which can be an entire program) across the Internet. The π-calculus also differs from other models in its capacity to simply classify behavioral equivalence among entities, as well as in its patterns of interactive behavior. It holds that previous theory (that is, classical automata theory), upon which most of today's parsers are built, does not appear to be correct when an automaton's actions consist of reactions between it and another automaton. In other words, a fundamental network-awareness needs to be part and parcel of the computational theories upon which our languages (and hence our systems) should be built (the calculus's core reduction rule is shown after this list).
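To make the XML item's point about communities of agreement concrete, here is a minimal Java sketch using the standard JAXP parser. The catalog/part vocabulary is invented for this illustration; the parse succeeds for any well-formed document, but the element names carry no meaning unless the parties exchanging them agree on one:

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class VocabularySketch {
    public static void main(String[] args) throws Exception {
        // An invented domain-specific vocabulary. XML guarantees only that
        // the document is well formed; the meaning of <catalog> and <part>
        // exists only by agreement among those who exchange it.
        String doc = "<catalog>"
                   + "<part sku='A-100'><price currency='USD'>9.95</price></part>"
                   + "</catalog>";

        Document dom = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(doc.getBytes("UTF-8")));

        // The structure is machine-readable even though the semantics are not.
        System.out.println("parts: "
                + dom.getElementsByTagName("part").getLength());
    }
}
```

The parser settles syntax for free; everything a committee must still fight over, what a "part" is, which attributes are mandatory, what the price means, lies outside XML itself.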
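The heart of the π-calculus can likewise be shown in its single reaction rule, reproduced below in standard notation for illustration: a process sending name z on channel x meets a process waiting to receive on x, and the received name is substituted into the receiver's continuation. Because z may itself be a channel, connectivity can move between processes, which is the mobility described above.

```latex
% Standard pi-calculus reaction rule (illustrative; notation follows Milner).
% \bar{x}<z>.P sends name z on channel x, then continues as P.
% x(y).Q waits on channel x, binds the received name to y, continues as Q.
\[
  \bar{x}\langle z \rangle . P \;\mid\; x(y).Q
  \;\longrightarrow\;
  P \;\mid\; Q\{z/y\}
\]
```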
Each of these NDC language developments represents a different layer of computer language implementation, just as each represents the evolving nature of languages in the current period.
These examples of language evolution in NDC have three things in common:
- Research on all began in earnest in or after 1995 (year zero of the Network Age).
- All introduce a heretofore absent network-awareness into computer languages.
- All reexamine communication approaches between distributed computing nodes.