- The Human Need to Compute
- Early Sequence of Numerical Knowledge
- Key Inventions Relevant to Software
- The Impact of Software on People and Society
- Summary
Key Inventions Relevant to Software
The inventions listed in the previous tables are all important in one way or another. However, in thinking about the inventions that had the greatest impact on software, those discussed in the following sections are the most critical.
Alphabetic Languages
Information recorded using pictograms such as Egyptian hieroglyphs is elegant and beautiful and has produced some wonderful calligraphy, but such systems do not lend themselves to rapid data entry and computerization. The same is true of information recorded using ideograms such as Chinese characters and Japanese kanji (which is based on Chinese characters). With thousands of distinct symbols, typing becomes extremely difficult.
During World War II, the Japanese “Purple” cipher machine actually used two American Underwood typewriters, and its plain text was entered using English characters. Alphabetic languages allow the fastest typed entry.
Binary and Decimal Numbers and Zero
Computers and software can process numbers in any base, such as binary, octal, decimal, or hexadecimal. However, electronic circuits for performing arithmetic are somewhat easier to design using binary. Octal, or base 8, numbers convert easily to and from binary because each octal digit corresponds to exactly three bits. (Some Native American tribes used octal counting because they counted the gaps between the fingers rather than the fingers themselves.) Several computer lines, such as DEC’s PDP series, used octal notation.
Hexadecimal, or base 16, numbers are also used in computers and are convenient because each hex digit corresponds to exactly four bits, so an 8-bit byte is always two hex digits. However, the bulk of day-to-day human calculation uses decimal, or base 10, numbers. Decimal numbers are somewhat analogous to the QWERTY keyboard: not optimal, but so widely used that switching to something else would be too expensive to consider.
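As a minimal sketch (the value chosen is arbitrary and purely illustrative), the following Python lines show why octal and hexadecimal are convenient companions to binary: each octal digit covers exactly three bits and each hex digit exactly four, so one byte is always two hex digits.

```python
# The same one-byte value shown in the four bases discussed above.
value = 202  # arbitrary example value that fits in a single 8-bit byte

print(format(value, "08b"))  # '11001010' -> binary
print(format(value, "o"))    # '312'      -> octal: 3-bit groups 11|001|010
print(format(value, "d"))    # '202'      -> decimal
print(format(value, "02X"))  # 'CA'       -> hexadecimal: 4-bit groups 1100|1010
```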
The decimal point seems to have originated in India during the ninth century, but it was John Napier who made the concept important in Western mathematics around 1620. Napier also invented logarithms and an interesting manual calculator called “Napier’s bones.” Logarithms were used in the first slide rules and hence are an important background topic for analog computation.
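The slide-rule connection rests on the identity log(a) + log(b) = log(a × b): multiplying two numbers reduces to adding lengths on logarithmic scales. A minimal sketch of that principle in Python (the numbers are arbitrary):

```python
import math

# Slide rules multiply by adding lengths proportional to logarithms,
# using the identity log(a) + log(b) = log(a * b).
a, b = 3.0, 7.0
product_via_logs = math.exp(math.log(a) + math.log(b))
print(product_via_logs)  # ~21.0, equal to a * b up to floating-point rounding
```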
The concept of zero seems to have had several independent sources. It was used in Babylon with base 60 mathematics roughly 2,500 years ago, but apparently as a placeholder rather than as a number used in actual calculations.
The Olmecs and the Maya both used zero as a true number in their quite complex calendar calculations. This use of zero seems to date to around 400 AD.
The use of zero in India dates to about 458 AD, when it appeared in a mathematical text. Whether this was an indigenous invention or inherited from Babylon is not certain. Later, in the 600s, the famous Indian mathematician Brahmagupta wrote a treatise on the uses of zero that moved past zero itself into rules for negative numbers.
Decimal numbers, the decimal point, and zero were all important precursors leading to computers and software calculations.
Digital Computers
Later chapters in this book will discuss the evolution of digital computers and associated software from the mid-1930s through 2010, with projections to 2019. Suffice it to say that software was created specifically to operate on digital computers. Without digital computers, there would be no software. Without software, digital computers would have no major purpose and would probably not have supplanted analog computers.
Higher-Level Programming Languages
I started as a young programmer in the 1960s, first programming IBM 1401 computers in both machine language (mainly for patches and bug repairs) and basic assembly language.
My firsthand experience was that machine language was very error prone and rapidly fatiguing because of the intense concentration it demanded. Assembly language was a step in the right direction, but not a very big step: needing dozens of assembly instructions just to handle calculations or format printed output was time consuming and frankly boring. Higher-level languages, starting with ALGOL, COBOL, FORTRAN, PL/I, APL, and others, reduced coding time, significantly reduced coding errors, and turned programming into a viable occupation.
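To make the contrast concrete, here is a hedged, modern illustration (Python, rather than anything available on the 1401): the kind of formatted print line that once took dozens of assembly move-and-edit instructions is a single high-level statement. The names and numbers are invented for the example.

```python
# One high-level statement replaces dozens of low-level instructions
# for moving, converting, and editing fields into a printed line.
name, hours, rate = "A. Smith", 42.5, 18.75  # hypothetical payroll fields
print(f"{name:<20} {hours:>6.1f} h  ${hours * rate:>9.2f}")
```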
Random-Access Storage
Sequential storage of data on paper tape, card decks, or magnetic tape had a fairly long and useful life, but it was very inefficient and required far too much physical tape movement to achieve high speeds. The invention of disk drives and random-access storage allowed faster processing, sophisticated search algorithms, and a path that would eventually lead to today’s “big data” world, with billions of records and millions of files being accessed for specific problems.
Without random access, modern computing and software could handle only a small fraction of important data analysis problems. Random access would also lead to the relational database concept, efficient sorting, and a variety of powerful query languages in the Structured Query Language (SQL) family.
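As a small sketch of the keyed, random-access style of retrieval that SQL-family languages made routine, the following example uses Python’s built-in sqlite3 module with an in-memory table; the table name and data are hypothetical.

```python
import sqlite3

# A tiny relational table queried by key; random-access storage is what makes
# this kind of direct lookup practical without reading every prior record.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, name TEXT, value REAL)")
conn.executemany(
    "INSERT INTO records (id, name, value) VALUES (?, ?, ?)",
    [(1, "alpha", 3.5), (2, "beta", 7.25), (3, "gamma", 1.0)],
)

row = conn.execute("SELECT name, value FROM records WHERE id = ?", (2,)).fetchone()
print(row)  # ('beta', 7.25)
conn.close()
```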