- Introduction (1 minute)
- Cooper's Quick History of Programming (6 minutes)
- The World of Objects (3 minutes)
Cooper's Quick History of Programming (6 minutes)
When I first started programming with Java, almost five years ago, the Internet was still in its infancy in terms of application development. I've been developing enterprise applications since the early 1980s, programming in BASIC, MACRO-11, Fortran, PL/1, Pascal, LISP, Ada, and of course C. Back then, the trick wasn't just knowing how to program, but deciding what language you needed to learn. There were plenty, and it was anybody's guess as to which one(s) would catch on.
In the days before the Internet, before client/server, even before UNIX workstations, IBM and Digital Equipment Corporation (DEC) dominated the world of computer technology. Whether your industry was engineering, finance, science, or something else, chances are that if there was any "IT" going on, it was based on one of these two platforms. Other players were around, such as Burroughs, Tandem, and Data General, but they didn't have nearly the market share of Big Blue or DEC. So if you were a programmer, chances are that you programmed on one of these brands, confined to the programming language(s) that came with the particular model you worked on.
IBM was clearly the leader in number-crunching technology, but its programming languages (CICS, DBM, etc.) were a bit esoteric, though based on sound computer-science principles. DEC, and the VAX series in particular, was the programmer's dream. A DEC VAX could be loaded with the programming languages of your choice, provided that you could afford it. The programming interface was a simple command line, with an English-based command interpreter. Building a program was simply a matter of writing and editing the program, typing the program's name to compile it, typing link to link it, and run to run it. Although the device-specific VMS operating system (OS) didn't allow programmers to access the OS directly without specific system privileges, it did provide a wonderful sense of power, freedom, and ownership, all at the same time. A single VAX usually came with three or four of the current "hot" programming languages.
In the early 80s, manufacturers and engineering firms favored Fortran, financial companies and banks leaned toward Pascal, and various other companies followed their own unique paths. A comprehensive commodities-trading application used by the Chicago commodity exchanges was written entirely in Ada! But this was all before C became the de facto standard and started to dominate the IT landscape.
In case you're wondering, I'm mentioning all this as a precursor to explaining why we are where we are today, deeply embedded in the language of the 21st century and, more importantly, the technology of the 21st century: Java. Studying Java will also give us the opportunity to study object-oriented design and programming (OOD/OOP). OOD/OOP is leading the way for software to become ever-present in just about anything that can run it, from cell phones to toasters. And until a technologist or programmer understands why this is so, it's very difficult to fully grasp the power and scope of today's application development, which was only a dream just 15 years ago.
The UNIX Revolution in the late 80s brought about two major achievements in computer science and application development:
"Open" operability. A computer operating system and programming language needn't necessarily belong to a single company or manufacturer.
- Network programming. Programmers could write and execute programs that "talked" to other computers. Of course, this technology had existed for years before Sun, Hewlett-Packard, and others came along to truly exploit it, but those earlier technologies were developed with specific hardware in mind. SNA ran only on IBM computers, DECNET only on DEC equipment. But TCP/IP could run on anything that could support UNIX, and eventually that meant just about anything with a computer processor (see the sketch following this list).
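To make that second point concrete, here's a quick sketch of what basic TCP/IP network programming looks like in Java. The host name and port are just placeholders; the point is that any machine speaking TCP/IP can be on the other end, regardless of manufacturer or operating system.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;

// Minimal TCP client sketch: open a socket to a (placeholder) host and port,
// send one line of text, and print whatever comes back.
public class SimpleTcpClient {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("example.com", 7);   // port 7: the classic "echo" service
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(socket.getInputStream()))) {
            out.println("Hello from another machine");       // send one line
            System.out.println("Reply: " + in.readLine());   // read the reply, if any
        }
    }
}
```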
These two achievements created the current technology revolution, with the Internet being just the most obvious tip of an iceberg that most of us are barely aware of. Once computers could "talk" to other computers over vast digital networks, it was a natural extension to devise ways of sending programmable objects from one computer to the next.
Think of it this way. Classical computer science teaches that computers are able to store information electronically, 1s and 0s in discrete locations called bits and bytes, comprising what's known as a computer's memory. Through digital electronics, this information can be found and processed, performing specific mathematical functions and either passing on the results or storing them in memory. The information itself is known as data, and the processes are routines, subroutines, and procedures. In the past, modular languages viewed data as separate, discrete entities that were processed as such. Languages such as BASIC, Fortran, and even C declared this data as constants, variables, and/or libraries. Once the operating system stores this data in memory, it's there for as long as the program is running. (You probably know all this. Hold on, I'll get to the good stuff in a second.)
The differences in programming languages are usually based on how they reference, read, and interact with the data once it's created and stored. For instance, old macro languages could access the actual registers that made reference copies of stored data. C had reference pointers that did basically the same thing, but hid the details behind an ampersand (&) or asterisk (*). This is why programming languages are called languages: they do the same things, but each has its own syntax and forms of communicating.
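Java hides that plumbing behind object references, but the underlying idea is the same. Here's a small sketch, with names chosen only for illustration, showing two variables referring to one piece of stored data:

```java
// Two variables holding references to the same object in memory, much as
// C exposes addresses explicitly through & and *.
public class ReferenceDemo {
    public static void main(String[] args) {
        StringBuilder original = new StringBuilder("data");
        StringBuilder alias = original;          // copies the reference, not the data

        alias.append(" shared");                 // changes the one underlying object

        System.out.println(original);            // prints "data shared"
        System.out.println(original == alias);   // true: both refer to the same object
    }
}
```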