As a society, we've become so accustomed to the convenience and ubiquity of computers that it's easy to forget just how far this world-changing technology has come in a remarkably short period of time. The smartphone you likely have within arm's reach right now boasts more raw computing power than the state-of-the-art supercomputers of just a few decades ago. In fact, the Apollo Guidance Computer that took astronauts to the moon in the 1960s had about as much processing power as a modern toaster oven![^1]
To truly grasp the immensity of this technological revolution, we must peer back through the annals of history to its very origins – to the earliest, most primitive devices that first bore the name "computer". Join me now on a fascinating expedition through time as we unearth the relics and artifacts that laid the groundwork for the Information Age. These are the "ancient" machines that made our modern digital world possible.
When Brass Gears Did the Thinking: The Pre-Electronic Era
Long before the first electronic computers sparked to life in the 1940s, complex feats of calculation were already being mechanized with an intricate dance of moving parts. Possibly the earliest known ancestor of the computer was the Antikythera mechanism, a mysterious geared device used by the ancient Greeks to predict astronomical positions and eclipses as early as the 2nd century BCE.[^2] This stunningly sophisticated machine, described as the world's first analog computer, used a complex system of over 30 bronze gearwheels to automate calculations of the cycles of the solar system. The Antikythera mechanism demonstrated that the concepts of computational automation pre-date electricity by millennia.
Emerging from the mists of antiquity to the brink of the industrial age, we find one of the most brilliant and idiosyncratic figures of the 19th century – the English mathematician and inventor Charles Babbage. Widely hailed as the "father of the computer", Babbage designed two of the foundational machines in computing history – the Difference Engine and the Analytical Engine.[^3]
The Difference Engine, conceived in 1822, was a mechanical calculator designed to tabulate polynomial functions by the method of finite differences. It was a purely mechanical arithmetic machine, cranked by hand, and in its most developed design could handle numbers up to 31 digits long. While a small demonstration model was completed in 1832, the full machine was never realized in Babbage's lifetime due to funding problems and his own perfectionism.
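To get a feel for what Babbage's gears were actually doing, here is a minimal sketch of the method of finite differences in Python. It illustrates the principle only, not the engine's actual mechanical sequence of additions and carries: once a polynomial's starting value and its differences are loaded, every further value is produced by addition alone.

```python
# The method of finite differences: given a polynomial's starting value and
# its successive differences, every further value comes from addition alone,
# the one operation the Difference Engine mechanized in gears.

def tabulate(initial_differences, steps):
    """initial_differences = [f(0), Δf(0), Δ²f(0), ...] for a polynomial f."""
    diffs = list(initial_differences)
    values = [diffs[0]]
    for _ in range(steps):
        # Update in increasing order: f absorbs the old Δf before Δf itself
        # absorbs Δ²f, and so on up the chain of differences.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
        values.append(diffs[0])
    return values

# Example: f(x) = x**2 + x + 41 has f(0) = 41, Δf(0) = 2, Δ²f = 2 (constant).
print(tabulate([41, 2, 2], 5))  # -> [41, 43, 47, 53, 61, 71]
```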
But it was Babbage's plans for the Analytical Engine, begun in 1837, that truly foreshadowed the general-purpose programmable computers to come a century later. The Analytical Engine incorporated an arithmetic unit (the "mill") and integrated memory (the "store"), with input in the form of punched cards adapted from the Jacquard loom. Its instruction set supported conditional branching and loops, making the design, in principle, the first conception of a general-purpose, Turing-complete computer.
Ada Lovelace, an English mathematician and one of the world's first computer programmers, wrote eloquently of the Analytical Engine's potential as a general-purpose device: "The Analytical Engine weaves algebraic patterns, just as the Jacquard loom weaves flowers and leaves."[^4] Lovelace also wrote what is considered the first published algorithm intended for a computer: a program for the Analytical Engine to calculate Bernoulli numbers. Tragically, like the Difference Engine before it, the Analytical Engine was never fully constructed, but it planted the conceptual seeds for the programmable electronic computers that would rise a hundred years later.
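Her algorithm (the famous "Note G") builds the Bernoulli numbers step by step from a recurrence. As a rough flavor of that computation, here is a modern sketch in Python using the standard recurrence for Bernoulli numbers; the structure and variable names are mine, not a transcription of her table of operations.

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return [B_0, ..., B_n] via the classical recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0   (for m >= 1)."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        total = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-total / (m + 1))  # solve the recurrence for B_m
    return B

print([str(b) for b in bernoulli(8)])
# -> ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```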
World War Wires: The Birth of Electronic Computers
It took the urgent pressures of a world war to finally bring computing out of the realm of pure theory and into electromechanical and electronic reality. During the late 1930s and early 1940s, several nations independently developed computing machines to gain a military edge, particularly for ballistics calculations and cryptography.
The German Z-series machines built by Konrad Zuse are considered the first electromechanical, binary, programmable computers. In particular, the Z3 of 1941 was the first working machine to combine binary arithmetic, floating-point numbers, and a measure of programmability via punched film.[^5] Unfortunately for Zuse, the German government gave his work little support, and much of it was destroyed in the Allied bombing of Berlin in 1943.
On the Allied side, the electronic computers that arose were focused largely on codebreaking, and the most famous of these was Colossus. Designed by British telephone engineer Tommy Flowers in 1943, Colossus was used to break the encrypted messages generated by the German Lorenz SZ40 teleprinter cipher machine.[^6] Colossus incorporated about 1,500 thermionic valves (vacuum tubes) and could perform Boolean and counting operations, though it was not fully programmable and was optimized specifically for cryptanalysis. A total of ten Colossi were in use by the end of the war.
Across the Atlantic, the US Navy built its own high-speed versions of the electromechanical Bombe used to attack the German Enigma cipher. But it was ENIAC (the Electronic Numerical Integrator and Computer), completed in 1945, that became the first truly general-purpose programmable electronic computer. Designed by John Mauchly and J. Presper Eckert at the University of Pennsylvania, ENIAC used some 17,000 vacuum tubes, weighed 30 tons, occupied 1,800 square feet, and consumed 150 kW of power.[^7] It could perform about 385 multiplications per second, roughly a thousand times faster than the electromechanical machines that preceded it. ENIAC was built to calculate artillery firing tables for the US Army's Ballistics Research Laboratory.
The Cores of Modernity: RAM and the Von Neumann Architecture
In the immediate post-war period, computer science truly came into its own through the pioneering work of mathematicians and logicians. People like Claude Shannon, Alan Turing, and John von Neumann made key conceptual breakthroughs that would define the basic architecture of almost all computers to the present day.
The first key innovation was the Von Neumann architecture, a computer design approach proposed by John von Neumann in 1945. In contrast to earlier computing machines that had fixed, special-purpose hardware, the Von Neumann architecture featured a central processing unit that could be re-programmed via software to perform different tasks.[^8] Programs were stored electronically in memory, rather than being manually rewired. This stored-program concept, where instructions are data just like any other data, is the defining feature of a modern general-purpose computer. Today, the Von Neumann architecture is ubiquitous in computing devices, from laptops to smartphones.
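To make the stored-program idea concrete, here is a toy sketch in Python of a machine whose instructions live in the same memory as its data and are fetched, decoded, and executed one at a time. The tiny instruction set is invented purely for illustration and does not correspond to any historical machine.

```python
# A toy illustration of the stored-program concept: instructions sit in the
# same memory as data and are fetched and executed one by one.

MEMORY = [
    ("LOAD", 8),     # 0: acc <- mem[8]
    ("ADD", 9),      # 1: acc <- acc + mem[9]
    ("STORE", 10),   # 2: mem[10] <- acc
    ("HALT", None),  # 3: stop
    None, None, None, None,
    2, 3, 0,         # 8, 9: operands; 10: result
]

def run(mem):
    acc, pc = 0, 0
    while True:
        op, addr = mem[pc]      # fetch the next instruction *from memory*
        pc += 1
        if op == "LOAD":
            acc = mem[addr]
        elif op == "ADD":
            acc += mem[addr]
        elif op == "STORE":
            mem[addr] = acc
        elif op == "HALT":
            return mem

print(run(MEMORY)[10])  # -> 5
```

Because the program itself is just data in memory, a different program can be loaded without touching the hardware, which is exactly the leap the stored-program concept made possible.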
The second major breakthrough was the development of random access memory (RAM). Magnetic-core memory, developed in the late 1940s by An Wang and Way-Dong Woo, became the dominant form of random access memory for computers well into the 1970s.[^9] Unlike earlier memory media such as delay lines, magnetic drums, and Williams tubes, core memory allowed data to be accessed in any order, not just sequentially. Each core was a tiny ring of magnetizable material storing a single binary bit, magnetized one way for a "one" and the other way for a "zero". Core memory was non-volatile and reliable, though bulky and expensive by modern standards; the Apollo Guidance Computer used core rope memory that stored about 72 kilobytes per cubic foot.[^10]
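What mattered most was addressability: any bit could be reached directly by its grid coordinates rather than waiting for it to come around in sequence. A simplified model of a core plane might look like the sketch below; real planes used coincident-current selection and destructive read-out, details deliberately omitted here.

```python
# A toy model of a magnetic-core plane: each (x, y) address holds one bit,
# and any bit can be read or written directly, with no sequential scanning.

class CorePlane:
    def __init__(self, rows, cols):
        self.bits = [[0] * cols for _ in range(rows)]

    def write(self, x, y, bit):
        self.bits[x][y] = bit & 1   # magnetize one way or the other

    def read(self, x, y):
        return self.bits[x][y]      # direct access by address

plane = CorePlane(64, 64)           # 4,096 bits, a typical early plane size
plane.write(12, 40, 1)
print(plane.read(12, 40))           # -> 1
```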
These twin pillars, the stored-program Von Neumann architecture and random access memory, formed the foundation of computer design and enabled the flourishing of general-purpose programmable computers in the 1950s and beyond.
Lighting the Fuse: The Dawn of Commercial Computing
With the basic blueprints for modern computers now laid out, the 1950s saw an explosion of new machines and companies vying to commercialize these wartime technologies for business and government applications.
In the UK, the Manchester Baby (also known as the Small-Scale Experimental Machine) was the first stored-program computer, running its first program in 1948. It served as the prototype for the Manchester Mark 1 and later the Ferranti Mark 1, the world's first commercially available general-purpose computer.[^11]
But it was in the United States where some of the most influential early commercial computers arose. The Universal Automatic Computer (UNIVAC I) was the first American commercial computer; the first unit was delivered to the US Census Bureau in 1951. Remington Rand eventually sold 46 UNIVAC I systems at over $1 million each.[^12] UNIVAC I also gained fame for successfully predicting Eisenhower's landslide victory in the 1952 presidential election from just 1% of the vote.
No discussion of early commercial computing can neglect IBM, which dominated the field for decades. The first IBM mainframes were the IBM 701 in 1952 and the IBM 650 in 1953, the latter of which became the most popular computer of the 1950s. In 1964, the IBM System/360 introduced the concept of a compatible mainframe family with interchangeable peripherals and software.[^13] The success of the System/360 cemented IBM's market dominance and made it the most profitable company in the computer industry for many years.
The March of Progress: Moore's Law and Miniaturization
From the room-filling 30-ton behemoths of the 1940s to the pocket-sized marvels we carry today, the story of computing has been one of relentless progress in making electronic brains smaller, cheaper, faster, and more energy efficient.
This trend was first codified by Intel co-founder Gordon Moore in 1965, in what came to be known as Moore's Law: the observation that the number of transistors on an integrated circuit doubles at a steady cadence, originally every year and later revised to roughly every two years.[^14] For over five decades, Moore's Law has driven the astounding exponential growth in computing power that has unlocked new applications and capabilities with each passing generation.
To illustrate: one of the earliest transistorized computers, the TX-0 at MIT's Lincoln Laboratory in 1956, had 65,536 (64K) words of 18-bit memory, roughly 144 kilobytes.[^15] Today, a smartphone with 16 GB of memory holds more than 100,000 times as much. In terms of processing speed, the TX-0 could perform about 83,000 additions per second, while a modern iPhone can perform about 60 billion 32-bit additions per second[^16], a speedup of over 700,000 times. These almost incomprehensible leaps in performance are the tangible fruits of decades of tireless work by engineers and scientists pushing the boundaries of computing forward.
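For readers who like to check the arithmetic, a quick back-of-the-envelope script follows; the TX-0 and smartphone figures are simply the ones quoted above, not new measurements, and the Moore's Law line is only the textbook rule of thumb.

```python
# Back-of-the-envelope check of the comparisons in the text.

tx0_memory_bytes = 65_536 * 18 // 8        # 64K words of 18 bits, about 144 KB
phone_memory_bytes = 16 * 1024**3          # a device with 16 GB of memory

tx0_adds_per_sec = 83_000
phone_adds_per_sec = 60_000_000_000

print(phone_memory_bytes / tx0_memory_bytes)   # ~1.2e5  (over 100,000x)
print(phone_adds_per_sec / tx0_adds_per_sec)   # ~7.2e5  (over 700,000x)

# Moore's Law as a rule of thumb: doubling every ~2 years for 60 years gives
# roughly 2**(60 / 2), i.e. about a billionfold growth in transistor count.
print(2 ** (60 / 2))                           # ~1.07e9
```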
Weaving the Dream: A Tapestry of Ingenuity
From Charles Babbage's gear-driven visions, to Alan Turing's insightful abstractions, to the audacious circuit-wrangling of Moore's Law – the history of computing is a rich saga of human ingenuity, a story of dreamers and tinkerers, of visionaries and problem-solvers. Each generation has woven new threads into this great tapestry, expanding the boundaries of what's possible.
So as you go about your day, take a moment to marvel at the computational wonders woven into the fabric of the modern world. The smartphone in your pocket, the laptop on your desk, the car navigation system, the wearable fitness tracker – each of these digital threads can be traced back to the fingers of pioneers like Babbage, Lovelace, Turing, Mauchly, Eckert, von Neumann, and Moore.
We are all living in a world irrevocably shaped by their dreams – dreams of machines that could extend the reach of the human mind, that could augment our memory and multiply our capabilities. So let us pay tribute to these pioneers by continuing to dream big about what the future may hold. Our computing journey has only just begun.
[^5]: Rojas, R. (1997). "Konrad Zuse's Legacy: The Architecture of the Z1 and Z3." IEEE Annals of the History of Computing, vol. 19, no. 2, pp. 5-16.
[^7]: Fritz, W. Barkley (1994). "ENIAC – A Problem Solver." IEEE Annals of the History of Computing, vol. 16, no. 1, pp. 25-45.
[^8]: von Neumann, J. (1993). "First Draft of a Report on the EDVAC." IEEE Annals of the History of Computing, vol. 15, no. 4, pp. 27-75.
[^10]: Hall, Eldon C. (1996). Journey to the Moon: The History of the Apollo Guidance Computer.
[^12]: Gray, George (2001). "UNIVAC I: The First Mass-Produced Computer."
[^14]: Moore, Gordon E. (1965). "Cramming More Components onto Integrated Circuits."