
History of computing hardware

 This article is part of the
History of computing series.
 History of computing hardware (before 1960s)
 History of computing hardware (1960s-present)
 History of operating systems

This narrative presents the major developments in the history of computing hardware and attempts to put them into perspective. For a detailed timeline of events, see the computing timeline. The history of computing article offers an overview and treats methods intended for pen and paper, with or without the aid of tables.

Table of contents
1 Earliest devices
2 Punched card computing 1801-1940
3 Mechanical gear computing 1835-1900s
4 Analog computers, pre-1940
5 First generation of modern digital computers 1940s
6 Second generation 1947-1960
7 Third generation and beyond, post-1958
8 See also
9 External links

Earliest devices

Humanity has used devices to aid in computation for millennia. One basic example is a device for establishing equality by weight: the classic scales of justice. Another is simple enumeration: the checkered cloths of the counting houses served as simple data structures for enumerating stacks of coins. A more mathematically oriented machine is the abacus, a Chinese invention, which could be used for addition and subtraction. The first machines that could arrive at the answer to an arithmetical question more or less autonomously began to appear in the 1600s; they were limited to addition and subtraction at first, but later could also perform multiplication. In medieval Europe, one could attain a Master's degree by demonstrating the capability of performing long division, by whatever process was available.

In 1623 Wilhelm Schickard built the first mechanical calculator. Blaise Pascal (1640) and Gottfried Wilhelm von Leibniz (1670) followed. Leibniz also established the binary number system, a central ingredient of all modern approaches. Their devices used techniques such as cogs and gears first developed for clocks. The difference engines of the 1800s could carry out a long sequence of such calculations in order to construct mathematical tables, but were not widely used.

Punched card computing 1801-1940

The defining feature of a "universal computer" is programmability, which allows the computer to emulate any other calculating machine by changing a stored sequence of instructions. In 1801, Joseph-Marie Jacquard developed a loom in which the pattern being woven was controlled by punched cards. The series of cards could be changed without changing the mechanical design of the loom. This was a landmark point in programmability.

In 1890 the United States Census Bureau used punch cards and sorting machines designed by Herman Hollerith to handle the flood of data from the decennial census mandated by the Constitution. Hollerith's company eventually became the core of IBM.

In the twentieth century, electricity was first used for calculating and sorting machines. By 1940, W. J. Eckert's Thomas J. Watson Astronomical Computing Bureau at Columbia University had published Punched Card Methods in Scientific Computation, which described techniques sufficiently advanced to solve differential equations and to perform multiplication and division using floating point representations, all on punched cards and plugboards similar to those used by telephone operators. Astronomical calculations represented the state of the art in computing.

Mechanical gear computing 1835-1900s

In 1835 Charles Babbage described his analytical engine. It was a plan for a general-purpose programmable computer, employing punched cards for input and a steam engine for power. While the plans were probably correct, disputes with the artisan who built the parts, and the end of government funding, made it impossible to build. Ada Lovelace, Lord Byron's daughter, translated and added notes to the "Sketch of the Analytical Engine" by L. F. Menabrea, and she has become closely associated with Babbage. Some claim she was the world's first computer programmer; however, this claim and the value of her other contributions are disputed by many. The Difference Engine II has been built and is operational at the London Science Museum; it works as Babbage designed it, disproving the theory that Babbage's design failed because parts of the required precision could not be manufactured.

By the 1900s earlier mechanical calculators, cash registers, accounting machines, and so on were redesigned to use electric motors, with gear position serving as the representation for the state of a variable. "Computer" was then a job title: human computers used calculators to evaluate expressions. During the Manhattan Project, future Nobel laureate Richard Feynman supervised a roomful of human computers, many of them women mathematicians, who understood the differential equations being solved for the war effort. Even the renowned Stanislaw Marcin Ulam was pressed into service, after the war, to translate the mathematics into computable approximations for the hydrogen bomb.

During World War II, Curt Herzstark's plans for a mechanical pocket calculator (the Curta calculator) literally saved his life. See: Cliff Stoll, Scientific American 290, no. 1, pp. 92-99 (January 2004).

Analog computers, pre-1940

Before World War II, mechanical and electrical analog computers were the state of the art, and many thought they were the future of computing. Analog computers use continuously varying physical quantities, such as voltages, currents, or the rotational speed of shafts, to represent the quantities being processed. An ingenious example of such a machine was the water integrator built in 1936. Unlike modern digital computers, analog computers are not very flexible and must be reconfigured (i.e., reprogrammed) manually to switch them from working on one problem to another. Analog computers had an advantage over early digital computers in that they could be used to solve complex problems while the earliest attempts at digital computers were quite limited. But as digital computers became faster and gained larger memories (e.g., RAM or internal store), they almost entirely displaced analog computers.

First generation of modern digital computers 1940s

The era of modern computing began with a flurry of development before and during World War II, as electronic circuits, vacuum tubes, capacitors, and relays replaced mechanical equivalents and digital calculation replaced analog calculation. The computers designed and constructed then have been called 'first generation' computers. First generation computers were usually built by hand using circuits containing relays or vacuum valves (tubes), and often used punched cards or punched paper tape for input and as the main (non-volatile) storage medium. Temporary, or working, storage was provided by acoustic delay lines (which use the propagation time of sound in a medium such as wire to store data) or by Williams tubes (which use the ability of a television picture tube to store and retrieve data). By 1954, magnetic core memory was rapidly displacing most other forms of temporary storage, and it dominated the field through the mid-1970s. An example of a practical WWII-era machine was the Target Data Computer employed in American submarines, which allowed the operator to input a few pieces of data, such as the sub's speed and heading, and some observed variables about a target vessel; the TDC would then calculate and display the aiming point for firing torpedoes. The TDC was part of what led to American dominance in submarine warfare in the Pacific. The WWII computer of greatest significance, however, was the British Colossus, which was used to work on the German 'Fish' cyphers and was the world's first programmable (if only in a limited way) electronic digital computer.

This era saw numerous electromechanical calculating devices of various capabilities which had a limited impact on later designs. In 1938 Konrad Zuse started construction of the first of the Z-series, electromechanical calculators featuring memory and limited programmability. He was (inadequately) supported by the German Wehrmacht, which used his proto-computers for the production of guided missiles. In 1941 Zuse completed the world's first working programmable computer, and thus essentially started the modern era (all previous machines were either special-purpose, non-programmable calculators, or did not work). The Z-series pioneered many advances, such as the use of binary arithmetic and floating point numbers. Zuse's 1936 patent application already mentions what today is known as the von Neumann design. In 1945 Zuse also designed the first higher-level programming language, Plankalkül.

In 1940, the Complex Number Calculator, a relay-based calculator for complex arithmetic, was completed. It was the first machine ever used remotely over a phone line. In 1938 John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed the Atanasoff Berry Computer (ABC), a special-purpose computer for solving systems of linear equations, which employed capacitors, fixed in a mechanically rotating drum, for memory.

During World War II, the British made significant efforts at Bletchley Park to break German military communications. The main German cypher system (the Enigma, in several variants) was attacked with the help of purpose-built 'Bombes' (designed after the Polish electro-mechanical 'bomba'), which helped find possible Enigma keys after other techniques had narrowed down the possibilities. The Germans also developed a series of teleprinter cypher systems (called 'Fish' cyphers by the British) which were quite different from the Enigma. As part of an attack against these cyphers, Professor Max Newman and his colleagues (including Alan Turing) designed Colossus.

Colossus was the first programmable (to some extent) electronic computer. This distinction is drawn because Konrad Zuse's 1941 machine used electro-mechanical relays (although the difference seems somewhat irrelevant from a conceptual perspective: a switch is a switch no matter how it is implemented; today we use transistors, and in the future we might use optics). Since solid state electronics had yet to be invented, Colossus used vacuum tubes; it had a paper-tape input and allowed some programmability. It was built and used to decrypt German wartime cyphers. Ten Colossus machines were built (there were at least two variants), but details of their existence, design, and use were kept secret well into the 1970s. Winston Churchill is said to have personally issued an order for their destruction into pieces no larger than a man's hand. Due to this secrecy Colossus was not included in many histories of computing. There is an active project to build a copy of one of the Colossus machines.

Turing's pre-war work was a major influence on the design of the modern computer, and after the war he went on to design, build, and program some of the earliest computers at the National Physical Laboratory and at the University of Manchester. His 1936 paper included a reformulation of Kurt Gödel's 1931 results as well as a description of what is now called the Turing machine, a purely theoretical device invented to formalize the notion of algorithm execution, replacing Gödel's more cumbersome universal language based on arithmetic. Modern computers are Turing-complete (i.e., they have algorithm execution capability equivalent to a universal Turing machine), except for their finite memory. Turing completeness is a threshold capability separating general-purpose computers from their special-purpose predecessors. It is as good a criterion as any for defining "the first computer", but unfortunately even with this restriction there is no simple answer as to which computer was first. Babbage's Analytical Engine was the first design of a Turing-complete machine; Konrad Zuse's Z3 was the first Turing-complete working machine (though this was unknown to Zuse and was proved only in 1998, after his death); and the electronic ENIAC was the first working Turing-complete computer designed and used as such. The ABC machine was not programmable, though it was a complete computer in the modern sense in most other respects. George Stibitz and colleagues at Bell Labs in New York City produced several relay-based 'computers' in the late 1930s and early 1940s, but they were concerned mostly with problems of telephone system control, not computing. Their efforts were a clear antecedent for another electromechanical American machine, however.
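The Turing machine mentioned above is simple enough to sketch in a few lines of modern code. The following is an illustrative simulator only (the state names, tape encoding, and example machine are invented for this sketch, not drawn from Turing's paper): a table of transitions drives a read/write head over an unbounded tape, which is all the formalism requires.

```python
# Minimal Turing machine simulator (illustrative sketch; names and the
# example machine are our own, not from any historical source).
def run_turing_machine(transitions, tape, state="start", blank="_"):
    """transitions: {(state, symbol): (new_state, written_symbol, move)}
    where move is -1 (left), +1 (right), or 0. Returns the final tape."""
    cells = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        state, cells[head], move = transitions[(state, symbol)]
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# A three-rule machine that inverts each bit, moving right until the blank.
flip = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}

print(run_turing_machine(flip, "10110"))  # -> 01001
```

A machine is Turing-complete, in the sense used above, precisely when it can simulate any such transition table given enough storage; the bit-flipper here is of course a trivial, special-purpose example.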

The Harvard Mark I (officially, the Automatic Sequence Controlled Calculator) was a general-purpose electro-mechanical computer built with IBM financing, and with assistance from some IBM personnel, under the direction of Harvard mathematician Howard Aiken. Its design was influenced by the Analytical Engine. It used storage wheels and rotary switches in addition to electromagnetic relays, was programmable by punched paper tape, and contained several calculators working in parallel. Later models contained several paper tape readers, and the machine could switch between readers based on a condition. Nevertheless, this does not quite make the machine Turing-complete. Development began in 1939 at IBM's Endicott laboratories; the Mark I was moved to Harvard University to begin operation in May 1944. Unlike Konrad Zuse's 1941 programmable machine, it used the decimal system rather than binary.

The US-built ENIAC (Electronic Numerical Integrator and Computer), the first large-scale general-purpose electronic computer, publicly validated the use of electronics for large-scale computing. This was crucial for the development of modern computing, initially because of the enormous speed advantage, but ultimately because of the potential for miniaturization. Built under the direction of John Mauchly and J. Presper Eckert, it was 1,000 times faster than its contemporaries. Remarkably, even ENIAC was still decimal instead of binary. That is, modern machines in many ways are conceptually more similar to Konrad Zuse's 1941 binary programmable machine than to ENIAC.

ENIAC's development and construction lasted from 1941 to full operation at the end of 1945. When its design was proposed, many researchers believed that the thousands of delicate valves (i.e., vacuum tubes) would burn out often enough that the ENIAC would be so frequently down for repairs as to be useless. It was, however, capable of up to 100,000 simple calculations a second for hours at a time between valve failures. It was programmable not only by rewiring, as originally designed, but later also with fixed wiring executing stored programs set in function table memory, using a scheme named after John von Neumann.

By the time the ENIAC was successfully operational, the plans for the EDVAC were already in place. Insights from experience with ENIAC led to the EDVAC design, which had unrivalled influence in the initial stage of the computer revolution. The design team was led by von Neumann.

The essentials of the EDVAC design have come to be known as the von Neumann architecture: programs are stored in the same memory 'space' as the data (although this possibility was already mentioned in Konrad Zuse's 1936 patent application, Z23139/GMD Nr. 005/021). Unlike the ENIAC, which used parallel processing, it used a single processing unit. This design was simpler, increased reliability, and was the first to be implemented in each succeeding wave of miniaturization. Some view the EDVAC design as the "Eve" from which nearly all current computers derive their architecture.

The first working von Neumann machine was the Manchester "Baby", built at the University of Manchester in 1948; it was followed in 1949 by the Manchester Mark I, which functioned as a complete system using the Williams tube for memory and also introduced index registers. This university machine became the prototype for the Ferranti Mark I, the world's first commercially available computer, except for Konrad Zuse's Z4, which was leased to ETH Zürich in 1950 (some also point out that LEO I was the computer used for the world's first regular routine office computer job, in November 1951). The first Ferranti Mark I machine was delivered to the University in February 1951, and at least nine others were sold between 1951 and 1957.

Later in 1951, the UNIVAC I (Universal Automatic Computer), delivered to the U.S. Census Bureau, was the first commercial computer to attract U.S. public attention. Although manufactured by Remington Rand, the machine was often mistakenly referred to as the "IBM UNIVAC". Remington Rand eventually sold 46 machines at more than $1 million each. UNIVAC was the first 'mass produced' computer; all predecessors had been 'one-off' units. It used 5,200 vacuum tubes and consumed 125 kW of power. For memory it used a mercury delay line capable of storing 1,000 words of 11 decimal digits plus sign (72-bit words). Unlike earlier machines it did not use a punched card system but a metal tape input.
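The 72-bit word size quoted above follows from simple arithmetic, assuming six bits per character (a common character-code width of the era; the exact hardware coding is not given in the text):

```python
# Sanity check on the UNIVAC I word size quoted above.
# Assumption: six bits per character (typical for the period; the text
# itself does not state the character width).
chars_per_word = 11 + 1               # 11 decimal digits + 1 sign character
bits_per_char = 6                     # assumed six-bit character code
bits_per_word = chars_per_word * bits_per_char
print(bits_per_word)                  # -> 72, matching the 72-bit figure

# Total mercury delay line capacity: 1,000 such words.
total_bits = 1000 * bits_per_word
print(total_bits)                     # -> 72000 bits, i.e. 9,000 bytes
```

By today's measures the entire main memory therefore held under 9 KB, which makes the machine's commercial success all the more striking.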

Second generation 1947-1960

The next major step in the history of computing was the invention of the transistor in 1947. This replaced the fragile and power-hungry valves with a much smaller and more reliable component. Transistorised computers are normally referred to as 'Second Generation' and dominated the late 1950s and early 1960s. Despite using transistors and printed circuits, these computers were still large and primarily used by universities, governments, and large corporations. For comparison, the vacuum-tube-based IBM 650 of 1954 weighed over 900 kg, its attached power supply weighed around 1,350 kg, and both were housed in separate cabinets of roughly 1.5 by 0.9 by 1.8 meters. It cost $500,000, or could be leased for $3,500 a month. Its drum memory, however, originally held only 2,000 ten-digit words, a limitation which forced arcane programming to achieve responsive computing. Such hardware limitations were to dominate programming for decades afterward, until the evolution of a programming model more sympathetic to software development.

In 1955, Maurice Wilkes invented microprogramming, now almost universally used in the implementation of CPU designs. In a microprogrammed design, each instruction in the CPU's instruction set is implemented by a small internal program (the microcode) executed by a simpler fixed control unit, rather than by dedicated wired logic.
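The idea can be sketched in a toy form. In the sketch below (all instruction names, micro-operations, and the register set are invented for illustration and correspond to no real machine), each machine instruction expands into a sequence of primitive micro-operations, and one fixed interpreter loop executes them:

```python
# Toy microprogrammed CPU: every machine instruction is defined by a
# sequence of micro-operations run by a single fixed control loop.
# Instruction names, micro-ops, and registers are invented for illustration.
def make_cpu():
    state = {"acc": 0, "tmp": 0}      # accumulator and a scratch register

    # Primitive micro-operations the control unit can perform.
    micro_ops = {
        "load_tmp":  lambda s, arg: s.update(tmp=arg),
        "add_tmp":   lambda s, arg: s.update(acc=s["acc"] + s["tmp"]),
        "sub_tmp":   lambda s, arg: s.update(acc=s["acc"] - s["tmp"]),
        "clear_acc": lambda s, arg: s.update(acc=0),
    }

    # The microcode store: each machine instruction is a micro-op sequence.
    # Changing the instruction set means rewriting this table, not the logic.
    microcode = {
        "CLR": ["clear_acc"],
        "ADD": ["load_tmp", "add_tmp"],
        "SUB": ["load_tmp", "sub_tmp"],
    }

    def execute(program):
        for opcode, arg in program:
            for mop in microcode[opcode]:   # the fixed interpreter loop
                micro_ops[mop](state, arg)
        return state["acc"]

    return execute

run = make_cpu()
print(run([("CLR", 0), ("ADD", 7), ("ADD", 5), ("SUB", 3)]))  # -> 9
```

The design payoff, then as now, is that the visible instruction set becomes data in a table rather than hardwired circuitry, so it can be extended or corrected without rebuilding the control logic.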

In 1956, IBM sold its first magnetic disk system, RAMAC (Random Access Method of Accounting and Control). It used 50 24-inch metal disks, with 100 tracks per side. It could store 5 megabytes of data and cost $10,000 per megabyte.

The first high-level general purpose programming language, FORTRAN, was also being developed at IBM around this time.

In 1959 IBM shipped the transistor-based IBM 1401 mainframe, which used punched cards. It proved a popular general-purpose computer: some 12,000 were shipped, making it the most successful machine in computer history to that point. It used a magnetic core memory of 4,000 characters (later expanded to 16,000 characters). Many aspects of its design were based on the desire to replace the punched card machines which were in wide use from the 1920s through the early 1970s.

In 1960 IBM shipped the transistor-based IBM 1620 mainframe, which originally used only punched paper tape but was soon upgraded to punched cards. It proved a popular scientific computer and about 2,000 were shipped. It used a magnetic core memory of up to 60,000 decimal digits.

Also in 1960, DEC launched the PDP-1, its first machine, intended for use by technical staff in laboratories and for research.

In 1964 IBM announced the S/360 series, which was the first family of computers that could run the same software at different combinations of speed, capacity and price. It also pioneered the commercial use of microprograms, and an extended instruction set designed for processing many types of data, not just arithmetic. In addition, it unified IBM's product line, which prior to that time had included both a "commercial" product line and a separate "scientific" line. The software provided with System/360 also included major advances, including commercially available multi-programming, new programming languages, and independence of programs from input/output devices. Over 14,000 System/360 systems were shipped by 1968.

Also in 1964, DEC launched the PDP-8, a much smaller machine intended for use by technical staff in laboratories and for research.

Third generation and beyond, post-1958

Main article: History of computing hardware (1960s-present)

The explosion in the use of computers began with 'Third Generation' computers. These relied on Jack St. Clair Kilby's and Robert Noyce's independent invention of the integrated circuit (or microchip), which later led to Ted Hoff's invention of the microprocessor at Intel.

The history of computer hardware in communist countries followed a somewhat different path.

See also

External links