A comment on teaching

See also my paper Honesty in History, which looks at how historical facts get distorted.

Why Do we Teach Computer Architecture?


Two generations ago, British school children had to learn Latin in order to enter a university. Clearly, at some point it was thought that Latin was a vital prerequisite for everyone going to university. When did they realize that students could still benefit from a university education without a prior knowledge of Latin? Three decades ago students taking a degree in electronics had to learn the dance of electrons in magnetic fields – a subject so frightening that older students passed on its horrors to the younger ones in hushed tones. To my relief, this required course on electron dynamics was dropped the year before I went to university because courses in semiconductor electronics were replacing courses in vacuum tube physics.


If courses in the past have fallen out of the curriculum with no obviously devastating effect on the education of students, what about today’s curriculum? This article examines one particular course because it is at the crossroads and may go either way – into oblivion or into a new future via metamorphosis.


Computer architecture is probably the oldest component of the computer curriculum. Computing as we know it emerged in the 1940s and 1950s with the first generation of computers constructed using vacuum tubes – the ENIAC, the EDVAC, the IAS, and the “Manchester Baby”. These computers were constructed by mathematicians such as Alan Turing and John von Neumann, and engineers such as Atanasoff and Eckert. Indeed, some of the initial impetus driving computer design came from the physicists who constructed high-speed vacuum tube circuits to analyze the pulses generated by high-energy cosmic rays in Geiger counters.


The very first courses on computer science were concerned with the design and construction of computers. At that time programming was in its infancy and compilers, operating systems, and databases did not exist. In the 1940s, working with computers meant building computers.


By the 1960s, computer science had emerged as a discipline in its own right. With the introduction of courses in programming, numerical methods, operating systems, compilers, and databases, the curriculum of the time reflected the world of the mainframe. By the end of the 1970s, the microprocessor had arrived and the computer was well on the road to commoditization.


In the 1970s computer architecture was still, to a considerable extent, an offshoot of electronics. Texts were more concerned with the circuits in a computer than with the fundamental principles of computer architecture as now encapsulated by the term “instruction set architecture” (ISA).


Computer architecture experienced a renaissance in the 1980s. The advent of low-cost microprocessor-based systems and the single-board computer meant that computer science students could study and even get hands-on experience of microprocessors. They could build simple systems, test them, interface them to peripherals such as LEDs and switches, and write programs in machine code. In the 1980s what was being taught in universities was in step with what was happening in the real world (at least at the level of board design and the applications of microprocessors). Bill Gates himself is a product of this era.


Since today’s students either have their own computer or have access to a computer laboratory, the role of the single-board computer in CS courses is now obsolete. Assembly language programming courses once aped high-level language programming courses—students were taught algorithms such as sorting and searching in assembly language, as if assembly language were no more than the poor person’s C. Such an approach to computer architecture is now untenable. If assembly language is taught at all today, it is used as a vehicle to illustrate instruction sets, addressing modes, and other aspects of a processor’s architecture.


In the late 1980s and early 1990s computer architecture underwent another change. The rise of the RISC microprocessor turned the focus of attention from the complex instruction set computers of the 80x86 and 68K families to the new high-performance, highly pipelined, 32-bit processors of the MIPS and SPARC families. Moreover, the increase in the performance of microprocessors made it harder and harder for classes to give students the hands-on experience they had had a few years earlier. In the 1970s a student could construct a computer with readily available components and simple electronic construction techniques. By the late 1990s clock rates had reached well over 100 MHz and buses were 32 bits wide. It had become virtually impossible for students to construct microprocessor-based systems as they did in the 1980s, because very high clock rates require special construction techniques and complex chips have hundreds of connections rather than the 40- or 64-pin packages of the 8086/68000 era.


By the late 1990s computer architecture had stabilized again, with Hennessy and Patterson’s two texts dominating the education world. Computer architecture was largely concerned with the instruction set architecture, pipelining, hazards, superscalar processors, and cache memories. Topics such as chip-level microprocessor systems design and microprocessor interfacing had largely vanished from the CS curriculum.


Computing in the 1990s


What was happening to the rest of computer science while computer architecture was reaching its maturity in the 1990s? In short, rather a lot. Computer science has expanded enormously since its early days, and the 1990s witnessed the introduction of several new subject areas such as object-oriented programming, communications and networks, and the Internet/WWW.


The growth of the computing job market, particularly for those versed in the new Internet-based skills, caused students to look at their computing curricula in a rather pragmatic way. Many CS students will join companies using the new technologies, but very few indeed will ever design chips or become involved with cutting-edge work in computer architecture. At my own university, the demand for courses in Internet-based computing has risen, and fewer students have elected to take computer architecture when it is offered as an elective.


Should computer architecture remain in the CS curriculum?


Recent developments in computer science have put pressure on course designers to remove old material to make room for the new. The fraction of students that will ever be directly involved in computer design is declining. Universities are beginning to provide programs in multimedia-based computing and visualization at both undergraduate and postgraduate levels. Students in such programs do not see the point of studying computer architecture.


In such a climate some have suggested that computer architecture is a prime candidate for pruning. It is easy to argue that computer architecture is as irrelevant to computer science as, say, Latin is to the study of contemporary English literature. If a student never writes an assembly language program or designs an instruction set, or interfaces a memory to a processor, why should we burden him or her with a course in computer architecture? Does the surgeon study metallurgy in order to understand how a scalpel operates? Does the pilot study thermodynamics to understand how a jet engine operates?


Such crude analogies can easily be dismissed by constructing counter-analogies. It’s easy to say that an automobile driver does not have to understand the internal combustion engine to drive an automobile. However, it is patently obvious that a driver who understands mechanics can drive in such a way as to enhance the life of the engine and to improve its performance. The same is equally true of computer architecture – a knowledge of the underlying computer system can improve the performance of software if the software is written to exploit the hardware.


The digital computer lies at the heart of computer science. Without it, computer science would be little more than a branch of theoretical mathematics. The very idea of a computer science program that did not provide students with an insight into the computer would be strange in a university that purports to educate students rather than to merely train them.


Those supporting the continued teaching of computer architecture employ several traditional arguments. First, education is not the same as training and CS students are not simply being shown how to use commercial computer packages. A course leading to a degree in computer science should also cover the history and the theoretical basis for the subject. Without an appreciation of computer architecture, the computer scientist cannot understand how computers have developed and what they are capable of.


However, there are concrete reasons why computer architecture is still relevant in today’s world. Indeed, I would maintain that computer architecture is as relevant to the needs of the average CS student today as it was in the past. Suppose a graduate enters the industry and is asked to select the most cost-effective computer for use throughout a large organization. Understanding how the elements of a computer contribute to its overall performance is vital – is it better to spend $50 on doubling the size of the cache or $100 on increasing the clock speed by 50 MHz?
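
As a rough illustration, the trade-off can be explored with the classic CPU-time relationship: time = instruction count x (base CPI + miss rate x miss penalty) / clock frequency. The C sketch below uses invented figures purely to show the style of reasoning involved; real numbers would come from benchmarking the organization's actual workload.

    #include <stdio.h>

    /* Back-of-the-envelope comparison of two upgrade options using the
     * classic CPU-time equation. Every figure below is an illustrative
     * assumption, not a measurement. */
    int main(void)
    {
        double instructions = 1e9;     /* size of the workload              */
        double base_cpi     = 1.5;     /* cycles per instruction, no misses */
        double miss_penalty = 100.0;   /* extra cycles per cache miss       */

        /* Option A: double the cache, assumed to halve the miss rate
         * from 2% to 1%, clock stays at 500 MHz. */
        double time_a = instructions * (base_cpi + 0.01 * miss_penalty) / 500e6;

        /* Option B: keep the cache (2% miss rate), raise the clock
         * from 500 MHz to 550 MHz. */
        double time_b = instructions * (base_cpi + 0.02 * miss_penalty) / 550e6;

        printf("Option A (bigger cache): %.3f s\n", time_a);
        printf("Option B (faster clock): %.3f s\n", time_b);
        return 0;
    }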


Computer architecture cannot be divorced entirely from software. The majority of processors are found not in PCs or workstations but in embedded applications. Those designing multiprocessors and real-time systems have to understand fundamental architectural concepts and limitations of commercially available processors. Someone developing an automobile electronic ignition system may write his or her code in C, but might have to debug the system using a logic analyzer that displays the relationship between interrupt requests from engine sensors and the machine-level code.


There are two other important reasons for teaching computer architecture. The first reason is that computer architecture incorporates a wealth of important concepts that appear in other areas of the computer science curriculum. This point is probably least appreciated by computer scientists who took a course in architecture a long time ago and did little more than learn about bytes, gates and assembly language. The second reason is that computer architecture covers more than the CPU; it is concerned with the entire computer system. Because so many computer users now have to work with the whole system (e.g., by configuring hard disks, by specifying graphics cards, by selecting a SCSI or FireWire interface), a course covering the architecture of computer systems is more a necessity than a luxury.


Computer Architecture – a Metamorphosis


One day, computer architecture courses may be dropped from the traditional CS curriculum, leaving only a few universities to teach architecture. Equally, computer architecture may come to be seen as a topic suitable only for EE departments.


One way of dealing with computer architecture is to downgrade it so that it ceases to be a mainstream core subject and is broken into pieces distributed throughout the curriculum. The basics of the von Neumann machine and the fetch/execute cycle could be sent off to the introductory programming course; material dealing with memory systems could be slotted into the operating systems course; and buses and interfaces could find a home in courses on computer communications. Such an approach is possible, but it would leave students without a coherent understanding of the computer system.


In order to continue to thrive, I suggest that computer architecture courses in computer science departments should evolve to reflect the real needs of today’s computer science subjects. In short, computer architecture should be less “CPU-centric”, should emphasize the entire computer system, and should underpin the other elements of the computer science curriculum.


Should Computer Architecture be Computer Systems Architecture?


Some computer architecture courses are, in one sense, a curiosity because they cover the architecture and organization of the processor but make relatively little reference to buses, memory systems, and high-performance peripherals such as graphics processors. Yet, if you scan the pages of journals devoted to personal/workstation computing, you will rapidly discover that much attention is focused on aspects of the computer system other than the CPU itself.


Computer technology was once driven by the paperless-office revolution with its demand for low-cost mass storage, sufficient processing power to rapidly recompose large documents, and low-cost printers. Today, computer technology is being driven by the multimedia revolution with its insatiable demand for pure processing power, high bandwidths, low latencies, and massive storage capacities.


These trends have led to important commercial developments in computer architecture; for example, Intel's MMX instruction set, which provides simple SIMD parallelism and saturating arithmetic. The demands of multimedia are being felt in areas other than computer architecture. Hard disks are designed to provide a continuous stream of data, and design techniques for low latency are becoming more important than those for high-speed retrieval: people can tolerate a degraded picture much better than a picture with even the shortest discontinuities. Such demands require efficient track-seeking algorithms, data buffering, and high-speed, real-time error detection and correction. Similarly, today’s high data densities require frequent recalibration of tracking mechanisms due to thermal effects. Disk drives now include SMART technology, borrowed from the AI world, that can predict disk failure before it occurs. These developments have as much right to be included in the architecture curriculum as developments in the CPU.
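
To give a flavour of the saturating arithmetic that MMX-style instructions provide, the C sketch below implements an 8-bit unsigned saturating add – the operation that a packed add applies to every byte of a register in parallel. This is a scalar illustration of the idea, not Intel's actual intrinsics.

    #include <stdint.h>
    #include <stdio.h>

    /* 8-bit unsigned saturating addition: the result clamps at 255 instead
     * of wrapping around, which is what you want when, say, brightening the
     * pixels of an image. */
    static uint8_t add_saturate_u8(uint8_t a, uint8_t b)
    {
        unsigned int sum = (unsigned int)a + (unsigned int)b;
        return (sum > 255u) ? 255u : (uint8_t)sum;
    }

    int main(void)
    {
        /* 250 + 20 wraps around to 14 with ordinary modulo arithmetic,
         * but saturates to 255 here. */
        printf("%u\n", add_saturate_u8(250, 20));
        return 0;
    }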


Some of the most impressive advances in recent years have been in computer graphics. More and more of the burden of image processing is being moved onto graphics chips. Similarly, digital signal processing, once the province of the electrical engineer, is edging into mainstream computing as multimedia systems provide support for sound processing.


Supporting the CS Curriculum


It is in the realm of software that you can most easily build a case for the teaching of assembly language. During a student’s career, he or she will encounter abstract concepts in areas ranging from programming languages, to operating systems, to real-time programming, to AI. The foundation of many of these concepts lies in assembly language programming and computer architecture. Computer architecture provides bottom-up support for the top-down methodology taught in high-level languages.


Consider some of the areas where computer architecture can add value to the CS curriculum.


The Operating System


Computer architecture provides a firm basis for students taking operating system courses. In computer architecture, students learn about the hardware that the operating system controls and about the interaction between hardware and software, for example in cache systems. Consider the following two examples of the way in which the underlying architecture provides support for operating system facilities.


Some processors operate in either a supervisor or a user mode. The operating system runs in the privileged (and protected) supervisor mode and all applications run in the user mode. This mechanism is used to construct a secure environment in which the effects of an error in an application program can be prevented from crashing the operating system or other applications. Covering these topics in an architecture course makes the student aware of the support the processor provides for the operating system and enables those teaching OS courses to concentrate more on operating system facilities than the mechanics of the hardware.
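
As an illustration of the user/supervisor boundary, the Linux-specific C fragment below asks the operating system to perform an output operation on the program's behalf. The trap instruction hidden inside the syscall() wrapper is what switches the processor into supervisor mode; control returns to user mode when the kernel has finished. The details vary from system to system, so treat this only as a sketch of the mechanism.

    #define _GNU_SOURCE
    #include <unistd.h>
    #include <sys/syscall.h>

    /* A user-mode program cannot touch the hardware directly; it asks the
     * operating system by executing a trap, which switches the processor
     * into the privileged supervisor (kernel) mode. */
    int main(void)
    {
        const char msg[] = "hello from user mode\n";

        /* The write runs as kernel code in supervisor mode; we resume
         * here in user mode when it returns. */
        syscall(SYS_write, 1, msg, sizeof msg - 1);
        return 0;
    }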


High-level languages often make it difficult to access I/O ports directly. By using an assembly language we can teach students how to write device drivers and how to control interfaces. Many real interfaces are still programmed at the machine level by accessing registers within them. Understanding computer architecture and assembly language can facilitate the design of high-performance interfaces.
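
The C fragment below shows the flavour of such register-level programming for a hypothetical memory-mapped UART. The addresses and bit layout are invented for illustration; in a real driver they would come from the device's data sheet, and the fragment is not meant to run on a desktop machine.

    #include <stdint.h>

    /* Hypothetical memory-mapped UART registers. 'volatile' tells the
     * compiler that every access really must reach the hardware. */
    #define UART_BASE   0x40001000u                 /* invented base address  */
    #define UART_STATUS (*(volatile uint32_t *)(UART_BASE + 0x0))
    #define UART_DATA   (*(volatile uint32_t *)(UART_BASE + 0x4))
    #define TX_READY    (1u << 0)                   /* invented status bit    */

    static void uart_putc(char c)
    {
        while ((UART_STATUS & TX_READY) == 0)
            ;                                       /* busy-wait until ready  */
        UART_DATA = (uint32_t)c;                    /* write the data register */
    }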


Programming and Data Structures


Students encounter the notion of data types and the effect of strong and weak data typing when they study high-level languages. Because computer architecture deals with information in its most primitive form, students rapidly become familiar with the advantages and disadvantages of weak typing. They learn the power that comes from being able to apply almost any operation to binary data. Equally, they learn the pitfalls of weak typing as they discover the dangers of inappropriate operations on data.
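
A small C example makes the point: the same 32-bit pattern can be read as a floating-point number, as raw bits, or as an integer, and nothing in the hardware stops the programmer from doing any of these.

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* At the machine level a 32-bit word is just 32 bits; it is the program
     * that decides how to interpret them. */
    int main(void)
    {
        float f = 1.5f;
        uint32_t bits;

        memcpy(&bits, &f, sizeof bits);   /* copy the raw bits, no conversion */

        printf("as a float:    %g\n", f);
        printf("as raw bits:   0x%08x\n", (unsigned)bits);
        printf("as an integer: %u\n", (unsigned)bits);  /* same bits, very
                                                           different meaning */
        return 0;
    }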


Computer architecture is concerned both with the operations that act on data and with the various ways in which the location of an operand in memory can be specified. Addressing modes and the various means of accessing data lead naturally to the notion of pointers. Students learn how pointers function at the machine level and what support various architectures offer for them. This aspect is particularly important if the student is to become a C programmer.
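
A brief C example shows the correspondence: array indexing and pointer arithmetic are the same base-plus-offset address calculation that most processors' addressing modes support directly.

    #include <stdio.h>

    /* a[i] and *(p + i) denote the same operation: the compiler turns both
     * into a base-plus-scaled-offset address calculation. */
    int main(void)
    {
        int a[4] = {10, 20, 30, 40};
        int *p = a;                      /* p holds the address of a[0] */

        printf("a[2]     = %d\n", a[2]);
        printf("*(p + 2) = %d\n", *(p + 2));   /* address = p + 2 * sizeof(int) */
        printf("address of a[2] is %p\n", (void *)&a[2]);
        return 0;
    }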


The efficiency of procedure call/return and parameter passing mechanisms is vital to understanding the performance of a processor. By means of an assembly language you can readily demonstrate the passing of parameters by value and by reference. The use of local variables and re-entrant programming can also be taught. This supports the teaching of task switching kernels in both operating systems and real-time programming.
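
The contrast between the two mechanisms is easy to demonstrate even in C, although tracing the equivalent assembly language is what makes the stack-frame mechanics explicit.

    #include <stdio.h>

    /* Passing by value copies the argument into the callee's frame (or a
     * register), so the caller's variable is untouched. Passing a pointer
     * passes the variable's address, so the callee can modify the original. */
    static void inc_by_value(int x)      { x = x + 1;   /* only the copy */ }
    static void inc_by_reference(int *x) { *x = *x + 1; /* the original  */ }

    int main(void)
    {
        int n = 5;

        inc_by_value(n);
        printf("after call by value:     %d\n", n);   /* still 5 */

        inc_by_reference(&n);
        printf("after call by reference: %d\n", n);   /* now 6 */
        return 0;
    }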


Students sometimes find the concept of recursion difficult. You can use an assembly language to demonstrate how recursion operates by tracing through the execution of a program. The student can actually observe how the stack grows as procedures are called.
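
Even from C, a simple way to make the stack visible is to print the address of a local variable at each level of recursion; on most machines the addresses step by one frame size per call.

    #include <stdio.h>

    /* Each recursive call pushes a new stack frame; the address of 'marker'
     * shows where that frame lives. */
    static unsigned long factorial(unsigned int n)
    {
        int marker;                                    /* lives in this frame */
        printf("depth %u: frame at %p\n", n, (void *)&marker);
        if (n <= 1)
            return 1;
        return n * factorial(n - 1);
    }

    int main(void)
    {
        printf("5! = %lu\n", factorial(5));
        return 0;
    }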


CS Fundamentals


Computer architecture is awash with concepts that are fundamental to computer science generally and that do not appear in other parts of the undergraduate curriculum. A course in computer architecture can provide a suitable forum for incorporating fundamental principles into the CS curriculum. For example, a first course in computer architecture introduces the student to bits and binary encoding techniques. A few years ago much time would have been spent on special-purpose codes for BCD arithmetic. Today, the professor is more likely to introduce error-correcting codes (important in data communications systems and secure storage mechanisms) and data compression codes (used by everyone who has ever zipped a file or used a JPEG-encoded image).
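
As a taste of the coding ideas involved, the C fragment below computes an even parity bit over one byte – the simplest form of error detection. Error-correcting codes such as Hamming codes extend the same idea by computing several parity bits over overlapping subsets of the data bits.

    #include <stdint.h>
    #include <stdio.h>

    /* Even parity over one byte: the parity bit is chosen so that the total
     * number of 1 bits (data plus parity) is even. A single flipped bit is
     * then detectable, though not correctable. */
    static unsigned even_parity(uint8_t byte)
    {
        unsigned parity = 0;
        while (byte) {
            parity ^= (byte & 1u);   /* toggle for every 1 bit */
            byte >>= 1;
        }
        return parity;               /* 0 if the count of 1s is already even */
    }

    int main(void)
    {
        uint8_t data = 0x5A;         /* 0101 1010: four 1 bits */
        printf("parity bit for 0x%02X is %u\n", data, even_parity(data));
        return 0;
    }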


Conclusion


Computer architecture is seen by some as an old-fashioned subject that should be replaced by more important material such as computer animation or web design.


I have argued that the computer architecture course has always been in a state of change because it reflects the prevailing view of what constitutes computer architecture.

Although the traditional computer architecture course could be broken up and redistributed throughout the curriculum, I have suggested that it should instead continue to evolve. In particular, I have suggested that modern computer architecture courses should more strongly reflect the needs of students in the new areas of computing such as multimedia, because these students require an overview of the entire system rather than an in-depth analysis of the CPU. I have also suggested that computer architecture should pay more attention to the wider curriculum and be used as a vehicle to introduce fundamental computer science concepts such as data compression.