
A few years ago I was shocked to find that the transistor had been invented far earlier than the date given in most textbooks. I looked into this and discovered that many of the inventions we take for granted weren’t invented by the person usually associated with them. I wrote this paper for a Frontiers in Education conference.

Honesty in History

Abstract - We expect university-level textbooks to be accurate. Although mainstream material is accurate, supplementary material such as the history of computing can be inaccurate or misleading. This paper looks at some of these inaccuracies in computing and other branches of engineering. Although some may feel that such inaccuracies do no harm, we argue that academic integrity should be maintained throughout education. Moreover, tolerating inaccuracy in teaching computer history implies that computer history is of little importance to the profession. This paper has three threads. We first discuss the types of inaccuracy found in textbooks. Then we look at how the histories of several inventions across the engineering spectrum are misrepresented. Finally, we discuss ways of teaching the history of computing.


Index Terms – Computer science education, Curriculum development, Ethics, History of computing.

Introduction

This paper demonstrates that the history of computing and engineering in some undergraduate texts is inaccurate and misleading. We look at specific instances of inaccuracy and explain why it is necessary for the academic community to adopt a more rigorous approach. If I were to tell my students that the square root of nine is 4 or that microprocessors are made of plutonium, they would be outraged by such travesties of the truth. However, few would notice if I were to make equally erroneous and outlandish statements about the history of computing.

This paper looks at the way in which myths are perpetuated in engineering history, and how history is sometimes simplified to a point where statements in textbooks become grossly misleading. We appeal to academics to avoid superficial sources when preparing material on the history of computing and to examine their reference material as if they were researching any other topic. Honesty and integrity are the foundations of the academic world. So why do we make an exception for the teaching of the history of technology and tolerate breathtaking levels of inaccuracy?

We look at the origin of major inventions such as the telephone and wireless, and demonstrate how inventions are inevitably the result of a long process over time and of clusters of near-simultaneous discoveries. In particular, the myth of the sole inventor who has a sudden flash of inspiration is examined and challenged.

The history of computer science is important because it demonstrates how computing has grown and evolved. An appreciation of history is important in research; we need to understand how products and systems develop and what drives their evolution. A knowledge of history helps us to avoid repeating the mistakes of others.

Inaccuracy in Computer History

A paper that had a profound effect on how I view computer history was “Historical Content in Computer Science Texts: A Concern” by Kaila Katz [1]. She demonstrated that historical material in popular mainstream texts was sometimes wildly inaccurate. Some of the inaccuracies she identified are:


Anachronisms – the use of terminology that was not current at the time of the device being described (e.g., the use of software when referring to Babbage’s analytical engine or describing Ada King as a programmer).

Unfounded assumptions – assumptions that are stated without supporting evidence and go unchallenged.

Errors – statements that are demonstrably incorrect.


As well as containing outright errors and misconceptions, the history of computer science often qualifies for the pejorative label Whig history; that is, a form of history that presents the past as an unbroken trend of development leading to the present. Whig history deals in heroes and often neglects developments that were failures. Yet it is often from its failures that society learns most.

We now examine five inventions in computer science and electronic engineering. In each case we demonstrate that the inventor named in much popular literature is not necessarily the actual inventor.


What should be included in Computer History?


Table 1 is taken from the CC2001 computing curriculum [2], which was the first such work to include the history of computing as a formal topic. Even today few schools teach a course on computer history. CC2001 specifies a minimum of one hour of computer history and divides computing into a pre-1946 then and a post-1946 now, concentrating on pioneers in computing as much as on ideas.

The history of computing hardware (this paper does not look at software and networking) is processor-centric; that is, textbooks and courses stress the development of the processor from the mechanical calculator to the microprocessor. Other aspects of computer hardware, such as memory technology, are frequently relegated to little more than footnotes.

If I ask my students what the most important developments in computing over the last few years have been, they usually respond by listing milestones in chip design such as multithreading and Intel’s dual- and quad-core technologies. I disagree, suggesting instead that the rise of the simple but elegant USB interface (together with plug-and-play technology) and the introduction of high-density flash memory have had a greater impact on computing than enhancements to processor technology. USB and flash memory paved the way for the explosion in ubiquitous or pervasive computing. Without flash memory, the MP3 player, the PDA and the digital camera would not have been practical.

TABLE 1

The CC2001 History of Computing Unit

SP1. History of computing [core]

Minimum core coverage time: 1 hour

Topics:

Prehistory—the world before 1946

History of computer hardware, software, networking

Pioneers of computing

Learning objectives:

1. List the contributions of several pioneers in the computing field.

2. Compare daily life before and after the advent of personal computers and the Internet.

3. Identify significant continuing trends in the history of the computing field.


In the next five sections we take a brief look at inventions that are of prime importance in our society and demonstrate that, in each case, the invention was not entirely the sole work of the pioneer associated with the invention. These notes demonstrate that educators are not providing a sufficiently accurate record of the history of engineering.

Invention 1 – The Computer

The invention of the computer extends over more than 2,000 years. Some texts provide a poor account of the computer’s history because they are highly selective in the way in which they describe milestones in the history of the computer. In particular, they do not distinguish sufficiently between the stages of the computer’s development; all too often, they concentrate on personalities rather than ideas. We can briefly define the stages in the development of the computer as:


Mechanical Calculator


The development of the mechanical calculator is attributed to Pascal and Schickard who created machines that automated addition and subtraction and later multiplication (this explanation omits the abacus and Napier’s bones for reasons of space). These machines used decimal arithmetic and mechanical components such as cog wheels, gear chains and levers. They demonstrated that arithmetic calculations could be automated.

Mechanical Computer

The mechanical computer was proposed by Charles Babbage in the 1830s in the form of his analytical engine. The machine was never constructed, owing to financial and engineering limitations. However, it can be called the first computer in the sense that it was designed to read a program and to perform a sequence of operations. The program and data were to be stored on punched cards.

Electronic Analog Computer

The electronic analog computer performs calculations by modelling a system electronically; that is, a circuit behaves like the system under investigation. For example, you find the terminal velocity of a parachutist by modelling the parachutist’s acceleration due to gravity and the retarding effect due to wind resistance. Analog computing did not lead to digital computing, but it did accelerate the development of electronics.
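As a concrete illustration, the short sketch below does digitally what an analog computer would do continuously: it integrates the parachutist’s equation of motion until the velocity settles at its terminal value. The drag law and the constants m, g and k are illustrative assumptions, not values taken from this paper.

```python
# Minimal sketch: numerically integrating the system an analog computer
# would model continuously. Assumes quadratic air resistance,
# dv/dt = g - (k/m) * v^2, with illustrative constants.

g = 9.81    # gravitational acceleration (m/s^2)
m = 80.0    # mass of the parachutist (kg), assumed
k = 12.0    # drag coefficient (kg/m), assumed
dt = 0.01   # integration step (s)

v = 0.0
t = 0.0
while t < 30.0:
    v += (g - (k / m) * v * v) * dt   # Euler step standing in for the analog integrator
    t += dt

# After a few seconds v settles at the terminal velocity sqrt(m*g/k), about 8.1 m/s here.
print(f"velocity after {t:.0f} s: {v:.2f} m/s")
```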

Electro-mechanical Calculator

The electro-mechanical digital computer implements digital circuits by means of devices with moving parts controlled by electricity, such as relays. In one sense the electro-mechanical computer is a high-speed, mechanised version of Pascal’s calculator. Electro-mechanical computers carry out a predetermined sequence of operations that are wired into the fabric of the machine; that is, they cannot be programmed. Aiken’s Harvard Mark I was, in effect, a series of calculators controlled by punched tape. Although the Harvard Mark I was once regarded as the first machine of this type, that accolade now goes to Konrad Zuse, who developed a mechanical computer in Berlin as early as 1938.

Electronic Calculator

The early electronic calculators replaced the mechanical components of the electro-mechanical computer with vacuum tubes, which are very much faster. These machines carried out pre-determined actions and could be reprogrammed only by rewiring their circuits. In the USA the first two such machines were Atanasoff’s ABC and Mauchly and Eckert’s ENIAC.

Electronic Computer

The electronic computer is a version of Babbage’s analytical engine implemented electronically. That is, it can read the instructions of a program from memory, execute them, and store the results. Several machines can claim to be the world’s first computer as we know it today (EDVAC, EDSAC, the Manchester Mark 1).

Conclusion

These brief notes demonstrate that the development of the computer is a continuum from the simple mechanical calculator to the computer as we know it today. How you assign a name to the inventor of the computer depends on which milestone in the computer’s history you regard as most significant.

Invention 2 - the Transistor

No single invention is more responsible for the computer age than the transistor that lies at the heart of amplifiers and logic circuits. Today’s ability to put thousands of millions of transistors on a single chip makes it possible to construct tiny computers and large memory systems.

I used Google to search for the invention of the transistor and found the following:  “The scientists that were responsible for the 1947 invention of the transistor were: John Bardeen, Walter Brattain, and William Shockley. Bardeen, with a Ph.D. in mathematics and physics from Princeton University, was a specialist in the electron conducting properties of semiconductors. Brattain, Ph.D., was an expert in the nature of the atomic structure of solids at their surface level and solid-state physics. Shockley, Ph.D., was the director of transistor research for Bell Labs…
In 1956, the group was awarded the Nobel Prize in Physics for their invention of the transistor.”

No one would dispute that Bardeen, Brattain and Shockley at Bell Labs were responsible for the development of the transistor and today’s electronics industry. However, the story of the transistor’s origin is more complex. A strong claim to the invention of the transistor belongs to Julius Edgar Lilienfeld.

Lilienfeld received a PhD in physics in Berlin in 1905. He was an accomplished physicist who worked on separating gas mixtures, cryogenics, and x-ray systems. He moved to the USA in 1927, where he worked on electrolytic capacitors – a field that requires an understanding of the behaviour of electrons in solid materials.

In 1926 Lilienfeld made a US patent application for a device that “related to a method of and apparatus for controlling the flow of an electrical current between two terminals of an electrically conducting solid by establishing a third potential between said terminals.” [6].

Lilienfeld proposed the field effect transistor, a device that is closer to today’s transistors than those constructed at Bell Labs. A brief literature search reveals that Lilienfeld’s claim is now widely supported, although there is no direct evidence that he constructed a working transistor.

Given that Lilienfeld’s claim to the design of the transistor appears undisputed, it seems surprising that so few texts refer to his invention. Moreover, this omission fuels the suspicion that an invention created in the labs of a major corporation is more likely to be noticed.

Invention 3 – The Microprocessor

The microprocessor is a computer on a chip and was invented at Intel in 1970 by a team led by Ted Hoff. The microprocessor was an inevitable invention that would have been made even if Intel had not been the first company to pass the winning post. The notion of the microprocessor existed, the need for a microprocessor existed, and semiconductor manufacturing was advancing rapidly in the late 1960s to the point at which a microprocessor was possible (the enabling technology of the integrated circuit had been invented by Kilby and Noyce in 1958-59).

In 1969 Intel was a new company making semiconductor memories. Busicom, a Japanese company, asked Intel to design the chips for a desktop calculator. Initially, the calculator was to be constructed from about 12 custom chips. Ted Hoff decided to use Intel’s advanced technology to build the calculator around a single general-purpose, programmable 4-bit processor – the world’s first microprocessor (in practice, the 4004 was one of a set of four chips). The actual processor was realized by Federico Faggin. Because Busicom, which owned the rights to the new design, was experiencing financial problems, Intel was able to buy back the rights to the microprocessor. Intel’s 4-bit 4004 processor rapidly led to 8-bit and then 16- and 32-bit processors.

Intel also created an 8-bit microprocessor, the 8008, which started life as an 8-bit programmable chip for use in a Computer Terminal Corporation terminal. The 8008 was rather limited and was soon replaced by the 8080, the first really practical microprocessor. Note that the 8008 was produced almost concurrently with the 4004 (the 8008 appeared only five months after the 4004, which indicates that the microprocessor was inevitable).

Although much of the literature credits Hoff and Faggin with the construction of the world’s first microprocessor, there are competing claims. For example, Ray Holt claims to have invented the microprocessor in 1969 while working for the Garrett AiResearch Corp on a contract from Grumman Aircraft. Holt says that his device was used as a control processor in the F-14A jet fighter and maintains that military secrecy prevented him from receiving credit for the microprocessor. Furthermore, Holt claims that a 1971 paper he submitted to Computer Design magazine was withheld for reasons of security and not printed until 1998. However, Holt’s chip set does not implement a programmable microprocessor as we know it today – it is closer to a microprogrammed device that is dedicated to a specific task.

In 1990 Gilbert Hyatt was granted a patent for a Single Chip Integrated Circuit Computer Architecture (after a 20-year battle with the U.S. Patent Office) that recognized him as the inventor of the microprocessor rather than Hoff and Faggin. The patent office later reversed its decision and awarded the patent to Gary W. Boone of Texas Instruments. Boone’s ‘computer on a chip’ was intended as an industrial controller and was not a general-purpose user-programmable device. It appeared that Hyatt’s original patent application was opportunistic and he had not actually constructed a microprocessor.

Irrespective of the actual invention of the microprocessor, by concentrating on backward compatibility (allowing new processors to run old software) and cutting-edge design and implementation techniques, Intel has maintained its dominant position in the semiconductor world for four decades.

Conclusion

The invention of the microprocessor is claimed by several inventors. The inventor who won the patent is not the inventor who won the lion’s share of the market and fame in the textbooks. Does it matter? The history of the microprocessor demonstrates that it was inevitable; however, Intel’s domination of the market led to the emergence of the IBM PC and that led to the IBM-Intel-Microsoft partnership. In this case, Intel’s role in the development of the microprocessor and its early exploitation was to define the future of mainstream computing.

Invention 4 – Radio

Of all the major inventions based on electronics, the invention of the radio is probably the most contentious. Radio was, of course, discovered rather than invented. The existence of radio waves was first predicted by the Scottish mathematician James Clerk Maxwell, and radio waves themselves were first created and detected by Heinrich Hertz in 1887. Consequently, you could say that radio waves were predicted by Maxwell and that the radio was invented by Hertz.

However, Hertz’s work was not immediately of practical value. What was needed was a means of reliably generating radio waves, using them to carry information (modulation to carry either Morse code or speech/music), filtering waves of different frequency (to allow multiple transmitters), and amplification (to allow reliable long-distance communication).

Traditionally, the invention of wireless is attributed to Guglielmo Marconi, who was born in Italy and emigrated to England in 1896 at the age of 21. Marconi was familiar with Hertz’s work and used a telegraph key and spark transmitter to generate radio waves, together with a device called a coherer to detect them. Neither the spark transmitter nor the coherer was invented by Marconi. After gradually increasing the range of his transmitter, Marconi announced that he had managed to transmit across the Atlantic in 1901. This spanning of the Atlantic by radio made Marconi a household name. Although Marconi used the spark-gap transmitter to generate radio waves, it had no long-term future.

A brief glance at the literature on the development of wireless reveals that Marconi was not the only player in this field.


Conclusion

Wireless technology has shaped the modern world. However, it is in this area that inventors and their supporters make more claims and counter claims than almost anywhere else. Although Marconi was an experimenter and an entrepreneur, it seems to me that his claims to be the inventor of radio are not necessarily stronger than those of other pioneers such as Tesla. Even Marconi’s claim for spanning the Atlantic rests on the design of the high-power transmitter created by Fleming – Marconi’s original spark gap transmitter could not have generated the necessary power.

Does it matter who we call the inventor of radio? The problem is that the naming of one person above all others obscures the large number of radio pioneers who nearly simultaneously made major contributions to the development of wireless technology.

Invention 5 – The Light Bulb

Although the invention of the light bulb is not connected with the computer, it is indicative of the way in which the history of an invention is distorted. In this case, the man most closely associated with the light bulb in the public’s mind did not invent it. Consider the following quote [7]: “Thomas Alva Edison lit up the world with his invention of the electric light. Without him, the world might still be a dark place.”

This quote not only credits Edison as the inventor of the light bulb, but also implies that without Edison we would have no electric lighting; a preposterous claim. Even worse, the website of the Lemelson Center at http://invention.smithsonian.org/home/, with the banner ‘The Lemelson Center is dedicated to exploring invention in history and encouraging inventive creativity in young people’, presents the following text: ‘Thomas Alva Edison changed our world! His genius gave us electric lights in our home and an entire system that produced and delivered electrical power… Of course, Edison's most famous invention to come out of Menlo Park was the light bulb… ’ Coming from such an authoritative source, this statement is seriously misleading.

Electric lighting has a far longer history than many sources imply. The English chemist Humphry Davy demonstrated the carbon arc light as early as 1809. This produced an intense beam suitable for use in searchlights but of little use for domestic or street lighting.

Warren de la Rue constructed an incandescent light bulb around 1840, with a platinum filament in a glass bulb from which the air had been removed. However, it was impracticable because of the difficulty, at that time, of maintaining a good vacuum seal. Joseph Swan in England invented and demonstrated the carbon-thread light bulb in 1879. Sir William Armstrong’s home near Newcastle upon Tyne became, in 1880, the first building to be illuminated by Swan’s light bulbs, and in 1881 the Savoy Theatre in London was lit by over 1,000 Swan lamps.

Edison demonstrated his carbon filament light bulb in 1879 and produced bulbs using carbonized bamboo filaments from 1880 (Edison tested over 6,000 possible filament materials). Edison’s principal contribution was his methodical experimentation and testing, and his commercial control of all aspects of light bulb production.

Edison’s carbon filament bulb had a relatively short life, and in 1898 Karl Auer von Welsbach in Austria tried an osmium filament. It was not until 1906 that the tungsten filament was introduced by the General Electric Company. Tungsten has ideal properties for a filament because of its high melting point and ductility. The only significant difference between today’s incandescent bulbs and those of 1906 is that we now fill the bulb with an inert gas such as argon to reduce evaporation of the filament.

Conclusion

Of all the inventions discussed in this paper, it appears that the development of the incandescent light bulb is the most widely misreported in popular literature and web sites. Edison made very important contributions to the development of electric lighting. However, he neither invented the concept nor created the first light bulb and his light bulb was rapidly rendered obsolete by the tungsten filament. By solely concentrating on Edison, many articles on the history of the light bulb fail to give readers an insight into the long and complex process of invention. In particular, they omit the nature of multiple near-simultaneous inventions and the eventual refinement of the invention.

Ways Forward

A better way of teaching the history of computing and other engineering concepts might be to decouple the invention from the inventor; that is, instead of providing a list of major inventors, it may be better to provide a list of the intellectual steps and the various implementations between the earliest beginnings and the final products.

The following examples are indicative and are not intended to be complete – they demonstrate only the concepts involved. Table 2 illustrates a conventional presentation of computer history: a chronological list of dates and the corresponding inventions.

Table 3 demonstrates an alternative approach in which events are grouped according to their category. In this example, we have grouped together the development of mechanical computers and the development of memory technology.

Table 3 also includes intellectual development; for example, Claude Shannon’s development of switching theory (a practical extension of Boolean algebra) provided the foundation for the use of binary arithmetic and all digital circuit design.
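To make the point concrete, the short sketch below shows the link Shannon identified: a Boolean expression, evaluated by switches (here, ordinary Python operators), is enough to perform binary arithmetic. The half-adder function is an illustrative example of my own, not something taken from Table 3.

```python
# Minimal sketch of switching theory in action: a half adder built
# purely from Boolean operations (XOR gives the sum bit, AND the carry).

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two one-bit values and return (sum_bit, carry_bit)."""
    return a ^ b, a & b

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> carry {c}, sum {s}")
```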

Table 4 demonstrates another way of looking at the history of computing, this time from the point of view of the development of enabling technologies. Here the technologies that were required in order to implement computers are described. All too often some of these technologies are omitted from conventional computer histories. For example, the laying of the first transatlantic telegraph cable led to a surprising result – rapidly keyed signals at one end were received as slowly varying signals at the other. William Thomson (later Lord Kelvin) was asked to solve the problem and, in doing so, laid the foundations for the field of electronics and for the study of the behaviour of signals in circuits. Similarly, the development of the vacuum tube was needed before electronic computers could be constructed.


TABLE 2

Conventional Sequence of Events

TABLE 3

Computer Development by Concept

Table 5 looks at the history of computing from the point of view of dead-end projects and ideas that never directly affected the development of computing. Two famous cases are Konrad Zuse’s Z1 computer, which was built in Berlin in 1938 and remained unknown to the wider world until well after the end of the war, and the war-time code-breaking machines such as Colossus that remained secret until long after WW2.

Brian Randell describes two interesting computer pioneers who are not well known and whose inventions can be said to have led to dead ends [5]. Percy Ludgate designed an analytical engine using electro-mechanical devices in Ireland in 1914, and in 1920 Leonardo Torres y Quevedo demonstrated an electromechanical arithmometer in Madrid. This invention was almost 20 years ahead of its time and was ignored by the rest of the world.


TABLE 4

Computer Development by Enabling Technology

TABLE 5

Dead-ends

Table 6 looks at yet another way of organising computer history: by means of disputes within the computer industry. This brief example mentions the dispute between Atanasoff and Eckert and Mauchly, which ended with the ENIAC patent being invalidated and Atanasoff being recognized as the inventor of the first electronic digital computer, in part because Mauchly had visited Atanasoff and seen his machine. Atanasoff argued that the later success of Eckert and Mauchly was derived from his original ideas.

Another interesting dispute involved John von Neumann. Although he is often regarded as the father of the stored-program computer, some believe that the ideas expressed in the First Draft of a Report on the EDVAC, which carries his name alone, are those of Eckert and Mauchly, and that the actual author of the report was Herman Goldstine [8].


TABLE 6

Litigation and Disputes

One modern factor that does provide hope for an improvement in the teaching of computer (and engineering) history is the emergence of wikis, particularly Wikipedia, which allows errors of omission (or honest misunderstanding) to be rapidly corrected. While writing this paper I found, post hoc, that Wikipedia provided information with a high level of accuracy.

Conclusions

This paper is called Honesty in History because, in higher education today, undergraduates are not always given an accurate history of the origins of computer science. We point out the two areas that are most deficient: the first is the accuracy of some of what has been written; the second is the omission of much of the detail that is necessary to give students a fair view of the evolution of computing.

The final section of this paper has tentatively suggested alternative ways of looking at computer history; not in terms of heroes but in terms of the development of concepts or of the enabling technologies. We have also suggested that looking at areas such as disputes within the computer industry and inventions and discoveries that did not achieve success can also provide suitable themes for discussion in the classroom.

References

[1] K. Katz, “Historical Content in Computer Science Texts: A Concern”, IEEE Annals of the History of Computing, Vol. 19, No. 1, 1997.

[2]  Computing Curricula 2001 – Computer Science, The Joint Task Force on Computing Curricula, IEEE Computer Society and Association for Computing Machinery.

[3] “Those who forget the lessons of history are doomed to repeat it”, IEEE Annals of the History of Computing, Vol. 18, No. 2, 1996.

[4] B. Randell, “The history of digital computers”, The Institute of Mathematics and its Applications. Springer Verlag, Berlin, 1973.

[5] B. Randell, “From Analytical Engine to Electronic Digital Computer: The Contributions of Ludgate, Torres, and Bush”, IEEE Annals of the History of Computing, Vol. 4, No.4, October 1982.

[6] IEEE-USA Today’s Engineer Online, May 2003, http://www.todaysengineer.org/2003/May/history.asp.

[7] http://depts.gallaudet.edu/englishworks/exercises/exreading/topic1a.html

[8] “From the Editor’s Desk”, IEEE Annals of the History of Computing, Vol. ?, No. ?, July-September 2004.