The tube circuit resembled the ones shown in the photo linked below (although none of those in the photo are from ENIAC).
https://en.wikipedia.org/wiki/File:Women_holding_parts_of_th...
https://www.cs.drexel.edu/~bls96/eniac/simulator.html
And a programming manual:
https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=846...
It's really got a nice archaic character.
That is, they were not trying to follow the notion of a universal computing device that had already been defined by Turing and Church at the time. They were just trying to build something like a huge programmable calculator, but they ended up building a universal computation device anyway.
> Mauchly's teaching career truly began in 1933 at Ursinus College where he was appointed head of the physics department, where he was, in fact, the only staff member.
The article is very US-focused, though, keeping quiet about the fact that the German engineer Konrad Zuse completed the Z3 in May 1941, five years before ENIAC, effectively creating the world's first working programmable, fully automatic digital computer. While ENIAC required days of manual cable patching to reprogram, the Z3 was quickly programmed via punched tape ("Lochstreifen"), and Zuse also invented Plankalkül between 1942 and 1945, which is widely recognized as the world's first high-level programming language. The cooperation between Zuse and ETH Zurich eventually led to the first self-compiling compiler and, later, to ALGOL 60 (see "The European Side of the Last Phase of the Development of ALGOL 60" by Peter Naur in ACM SIGPLAN's "History of Programming Languages", 1978). And there was also the British Colossus, a programmable computer that successfully used vacuum tubes for code-breaking by early 1944.
What the article says is different: "the first large-scale, general-purpose, programmable electronic digital computer".
The claim of the article can be considered correct, and "electronic" is a part that cannot be deleted from it without falsifying the claim.
Before ENIAC, there had been digital computers that were much more general-purpose, because they ran programs written on punched tape instead of requiring rewiring like ENIAC.
ENIAC, which evolved from the analog computers known as differential analyzers, had a structure closer to an FPGA than to a modern digital computer.
In contrast, an earlier relay computer like the Harvard Mark I was intended as a successor to the mechanical digital computer designed by Charles Babbage, so it already had the same structure as a modern digital computer, except that it used different kinds of memory for data and for programs, hence the name "Harvard architecture". The same was true for the Zuse computers.
The earlier ABC digital computer was electronic, but it can be considered special-purpose, not general-purpose. The first relay computers at Bell Labs may also be considered special-purpose.
They had to program it by physically rewiring patch cables and flipping switches. There was no programming language, no stored program. The "software" was the hardware configuration itself.
It took another decade before FORTRAN (1957) gave programmers a way to write instructions in something resembling human language.
Surprisingly light though...
The Harvard Mark 1 ran its first program in 1944, but didn’t have branches as we understand them until 1946:
https://en.wikipedia.org/wiki/Harvard_Mark_I
Apparently, you could achieve loops by taping the input program into a physical loop, even in 1944.
The most obvious of the problems is that a computer isn't a singular technology that springs up de novo, but something that develops from antecedents over a long, messy transition that requires a judgement call as to when the proto-computer becomes an actual computer, a judgement call which is obviously going to be biased by the other considerations. Consider, for a more contemporary example, what you would argue as the "first smartphone" or the "first LLM." Personally, I think the ENIAC is still somewhat too proto-computer for my tastes: I'd prefer a "first" that uses binary arithmetic and has stored programs, neither of which is true of the ENIAC.
The second major issue is that it's also instructive to look at the candidates' influence on later development. Among the contenders for "first computer," it's unfortunately kinda clear that ENIAC has the most lasting influence. ENIAC's development produced the papers that directly inspired the next generation of machines. Colossus is screwed here because of the secrecy of the code-breaking effort. Meanwhile, Zuse and the Z3 suffer from being on the losing end of WW2. ABC has a claim here, but it's not clear whether the developers of ENIAC drew influence from it or not.
The final major issue isn't so much an issue by itself but rather something that colors the interpretation of the first two issues: national pride. An American is far more likely to weight the influence and ingenuity of the ENIAC and similar machines to label one of them the "first computer." A UK person would instead prefer to crown Colossus or the Manchester Baby. A German would prefer the Z3.
ENIAC is notable because it was the first intentionally general purpose computer to be built.
[0] https://www.inf.fu-berlin.de/inst/ag-ki/rojas_home/documents...
The problem is that anything that gets into Wikipedia becomes ingrained in the Internet's collective mind, which then can't be changed.
That's a pretty academic take. Neither Eckert, nor Mauchly, nor Zuse knew about Alan Turing’s 1936 paper when they designed their machines. The classification of ENIAC (and the Z3) as a "universal Turing machine" is entirely a retroactive reinterpretation by later computer scientists. John von Neumann knew the paper and was aware of its significance, but he only turned up in the ENIAC project when the design was complete. At this time, Eckert and Mauchly were already well aware of ENIAC's biggest flaw (the massive effort to reprogram the machine, and in fact they came up with the stored-program concept which von Neumann later formalized). ENIAC’s funding and primary justification were for the very specific purpose of calculating artillery firing tables for the military. The machine was built for this purpose, which included the feature which retroactively led to the mentioned classification.
The earlier relay computers were Turing complete.
For ENIAC it also does not make sense to claim that it was Turing complete. Such a claim can be made for a computer controlled by a program memory, where there is a defined instruction set, and that instruction set may or may not be complete. If you may arbitrarily rewire the execution units, any computer is Turing complete.
The earlier ABC electronic computer was built for a special purpose, the solution of systems of linear algebraic equations, just as ENIAC was built for a special purpose, the computation of artillery tables.
By rewiring the ABC electronic computer you could have also computed anything, so you can say that it was Turing complete, if rewiring is allowed.
The only difference is that rewiring was simpler in ENIAC, because it had been planned to be easy, so there were special panels where you could alter the connections.
Neither ABC nor ENIAC had enough memory to be truly general-purpose, and by the end of the war it was recognized that this was the main limitation for extending the domain of applications, so the ENIAC team proposed ultrasonic delay lines as the solution for a big memory (inspired by the use of delay lines as an analog memory in radars), while von Neumann proposed the use of a cathode ray tube of the kind used in video cameras (iconoscope; this was implemented first in the Manchester computers).
Because ENIAC was not really designed as general-purpose, its designers originally did not think about high-capacity memories. On the other hand, John Vincent Atanasoff, the main designer of the ABC computer, has written a very insightful document about the requirements for memories in digital computers, years before ENIAC, where he analyzed all the known possibilities and where he invented the concept of DRAM, but implemented with discrete capacitors. Later, the proposal of von Neumann was also to use a DRAM, but to use a cheaper and more compact iconoscope CRT, instead of discrete capacitors.
While the ABC computer was not general-purpose as built, the document written by Atanasoff in 1940, “Computing Machine for the Solution of Large Systems of Linear Algebraic Equations”, demonstrated a much better understanding of the concept of a general-purpose electronic digital computer than the designers of ENIAC would demonstrate before the end of 1944 / beginning of 1945, when they realized that a bigger memory was needed to make a computer suitable for other applications, i.e. to really make it "general-purpose".
Perhaps AI aided?
Vacuum tubes break too often. Once per year, say? But if you have thousands of them, you have to change one very often. So I guess they left a lot of space for humans to repair it.
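The arithmetic here is easy to sketch. Assuming (hypothetically) that each of ENIAC's roughly 18,000 tubes fails on average once per year, the expected interval between failures somewhere in the machine works out to about half an hour:

```python
# Back-of-the-envelope estimate with an assumed failure rate of one
# failure per tube per year (illustrative, not ENIAC's measured figure).
TUBES = 18_000
HOURS_PER_YEAR = 365 * 24  # 8760

# Mean time between failures for the machine as a whole, in hours.
mtbf_machine_hours = HOURS_PER_YEAR / TUBES
print(f"One failure roughly every {mtbf_machine_hours * 60:.0f} minutes")
# → One failure roughly every 29 minutes
```

In practice the ENIAC team reduced the rate dramatically by running the tubes well below their rated voltage and avoiding power cycling, but the raw numbers explain why easy physical access mattered.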
ENIAC was built for a special purpose, the computation of artillery tables.
It was a bespoke computer built for a single customer: the United States Army's Ballistic Research Laboratory.
This is why it was designed as the digital electronic equivalent of the analog mechanical computers previously used by the Army, and why it does not resemble at all what is now meant by "general-purpose computer".
The computers of Aiken and Zuse were really intentionally general-purpose, their designers did not have in mind any specific computation, which is why they were controlled by a program memory, not by a wiring diagram.
What you claim about the Z3 being general-purpose by accident refers not to the intention of its designer, but only to the fact that its instruction set happened to be powerful enough, because at that early time it was not understood which kinds of instructions are necessary for completeness.
All the claims made now about ENIAC being general-purpose are retroactive. Only after the war ended and the concept of a digital computer became well understood was ENIAC repurposed for tasks other than those originally planned.
The first truly general-purpose electronic digital computers that were intentionally designed to be so were those designed based on the von Neumann report.
Before the completion of the first of those, there were general-purpose hybrid electronic-electromechanical digital computers, IBM SSEC being the most important of them, which solved a lot of scientific and technical problems, before electronic computers became available.
If this is correct, it was not a von Neumann machine originally, but it eventually became one, and at approximately the same time as the Manchester Baby.
As for the objection that it wasn’t stored program, I was interested to learn that it was converted to stored program operation after only two years or so of operation, using the constant table switches as the program store. But the Manchester Baby, which used the same memory for code and data was more significant in the history of stored program machines.
On the general question of “first computer”, I think the answer is whatever machine you want it to be if you heap enough conditional adjectives on it.
True. Mauchly was a physics professor interested in meteorology, and he knew that predicting the weather and calculating an artillery shell's flight are mathematically the same type of problem, which was important for getting funding. In the fifties, ENIAC was even used to calculate weather forecasts (see https://ams.confex.com/ams/2020Annual/webprogram/Manuscript/...). So these were just two related special problems, and it would be a stretch to interpret this as an intention to build a general-purpose computer. The latter had to wait until the sixties.
Happy 80th anniversary, ENIAC! The Electronic Numerical Integrator and Computer, the first large-scale, general-purpose, programmable electronic digital computer, helped shape our world.
On 15 February 1946, ENIAC—developed in the Moore School of Electrical Engineering at the University of Pennsylvania, in Philadelphia—was publicly demonstrated for the first time. Although primitive by today’s standards, ENIAC’s purely electronic design and programmability were breakthroughs in computing at the time. ENIAC made high-speed, general-purpose computing practicable and laid the foundation for today’s machines.
On the eve of its unveiling, the U.S. Department of War issued a news release hailing it as a new machine “expected to revolutionize the mathematics of engineering and change many of our industrial design methods.” Without a doubt, electronic computers have transformed engineering and mathematics, as well as practically every other domain, including politics and spirituality.
ENIAC’s success ushered in the modern computing industry and laid the foundation for today’s digital economy. During the past eight decades, computing has grown from a niche scientific endeavor into an engine of economic growth, the backbone of billion-dollar enterprises, and a catalyst for global innovation. Computing has led to a chain of innovations and developments such as stored programs, semiconductor electronics, integrated circuits, networking, software, the Internet, and distributed large-scale systems.
The motivation for developing ENIAC was the need for faster computation during World War II. The U.S. military wanted to produce extensive artillery firing tables for field gunners to quickly determine settings for a specific weapon, a target, and conditions. Calculating the tables by hand took “human computers” several days, and the available mechanical machines were far too slow to meet the demand.
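The heart of a firing-table entry is a step-by-step numerical integration of the shell's trajectory, which is exactly the kind of repetitive arithmetic a "human computer" performed by hand and ENIAC (the Electronic Numerical Integrator) automated. The sketch below illustrates the idea with a simple Euler integration; the quadratic drag model and all constants are illustrative placeholders, not the Army's actual ballistic data:

```python
import math

def range_of_shot(v0, angle_deg, k=0.00005, g=9.81, dt=0.01):
    """Horizontal range (m) for muzzle speed v0 (m/s) at a given elevation.

    Integrates the equations of motion step by step with a simple
    quadratic air-drag term (coefficient k is a made-up placeholder).
    """
    a = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(a), v0 * math.sin(a)
    while y >= 0.0:
        v = math.hypot(vx, vy)
        # drag decelerates the shell along its velocity vector;
        # gravity pulls it down
        vx -= k * v * vx * dt
        vy -= (g + k * v * vy) * dt
        x += vx * dt
        y += vy * dt
    return x

# One table entry per elevation angle; a human computer redid this
# arithmetic by hand for every combination of gun, charge, and angle.
for angle in (15, 30, 45):
    print(f"{angle:2d} deg -> {range_of_shot(457.0, angle):8.0f} m")
```

A full table also varied air density, shell weight, and wind, which is why a single table took a team of human computers days to produce.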
In 1942 John Mauchly, an associate professor of electrical engineering at Penn’s Moore School, suggested using vacuum tubes to speed up computer calculations. Following up on his theory, the U.S. Army Ballistic Research Laboratory, which was responsible for providing artillery settings to soldiers in the field, commissioned Mauchly and his colleagues J. Presper Eckert and Adele Katz Goldstine to work on a new high-speed computer. Eckert was a lab instructor at Moore, and Goldstine became one of ENIAC’s programmers. It took them a year to design ENIAC and 18 months to build it.
The computer contained about 18,000 vacuum tubes, which were cooled by 80 air blowers. More than 30 meters long, it filled a 9 m by 15 m room and weighed about 30 metric tons. It consumed as much electricity as a small town.
Programming the machine was difficult. ENIAC did not have stored programs, so to reprogram the machine, operators manually reconfigured cables with switches and plugboards, a process that took several days.
By the 1950s, large universities either had acquired or built their own machines to rival ENIAC. The schools included Cambridge (EDSAC), MIT (Whirlwind), and Princeton (IAS). Researchers used the computers to model physical phenomena, solve mathematical problems, and perform simulations.
After almost nine years of operation, ENIAC officially was decommissioned on 2 October 1955.
ENIAC in Action: Making and Remaking the Modern Computer, a book by Thomas Haigh, Mark Priestley, and Crispin Rope, describes the design, construction, and testing processes and dives into its afterlife use. The book also outlines the complex relationship between ENIAC and its designers, as well as the revolutionary approaches to computer architecture.
In the early 1970s, there was a controversy over who invented the electronic computer and who would be assigned the patent. In 1973 Judge Earl Richard Larson of U.S. District Court in Minnesota ruled in the Honeywell v. Sperry Rand case that Eckert and Mauchly did not invent the automatic electronic digital computer but instead had derived their subject matter from a computer prototyped in 1939 by John Vincent Atanasoff and Clifford Berry at Iowa State College (now Iowa State University). The ruling granted Atanasoff legal recognition as the inventor of the first electronic digital computer.
In 1987 IEEE designated ENIAC as an IEEE Milestone, citing it as “a major advance in the history of computing” and saying the machine “established the practicality of large-scale electronic digital computers and strongly influenced the development of the modern, stored-program, general-purpose computer.”
The commemorative Milestone plaque is displayed at the Moore School, by the entrance to the classroom where ENIAC was built.
“The ENIAC legacy heralded the computer age, transforming not only science and industry but also education, research, and human communication and interaction.”
A paper on the machine, published in 1996 in IEEE Annals of the History of Computing and available in the IEEE Xplore Digital Library, is a valuable source of technical information.
“The Second Life of ENIAC,” an article published in the Annals in 2006, covers a lesser-known chapter in the machine’s history, about how it evolved from a static system—configured and reconfigured through laborious cable plugging—into a precursor of today’s stored-program computers.
A classic history paper on ENIAC was published in the December 1995 IEEE Technology and Society Magazine.
The IEEE Inspiring Technology: 34 Breakthroughs book, published in 2023, features an ENIAC chapter.
One of the most remarkable aspects of the ENIAC story is the pivotal role women played, according to the book Proving Ground: The Untold Story of the Six Women Who Programmed the World’s First Modern Computer, highlighted in an article in The Institute. There were no “programmers” at that time; only schematics existed for the computer. Six women, known as the ENIAC 6, became the machine’s first programmers.
The ENIAC 6 were Kathleen Antonelli, Jean Bartik, Betty Holberton, Marlyn Meltzer, Frances Spence, and Ruth Teitelbaum.
“These six women found out what it took to run this computer, and they really did incredible things,” a Penn professor, Mitch Marcus, said in a 2006 PhillyVoice article. Marcus teaches in Penn’s computer and information science department.
In 1997 all six female programmers were inducted into the Women in Technology International Hall of Fame, in Los Angeles.
Two other women contributed to the programming. Goldstine wrote ENIAC’s five-volume manual, and Klára Dán von Neumann, wife of John von Neumann, helped train the programmers and debug and verify their code.
To honor the women of ENIAC, the IEEE Computer Society established the annual Computer Pioneer Award in 1981. Eckert and Mauchly were among the award’s first recipients. In 2008 Bartik was honored with the award. Nominations are open to all professionals, regardless of gender.
Last year a group of 80 autistic students, ages 12 to 16, from PS Academy Arizona, in Gilbert, recreated the ENIAC using 22,000 custom parts. It took the students almost six months to assemble.
A ceremony was held in January to display their creation. The full-scale replica features actual-size panels made from layered cardboard and wood. All electronic components are simulated; none are electrically active. The machine, illuminated by hundreds of LEDs, is accompanied by a soundtrack that simulates the deep hum of ENIAC’s transformers and the rhythmic clicking of relays.

[Photo caption: This machine prints and tabulates the answers to the problems solved by the ENIAC. Credit: Bettmann/Getty Images]
“Every major unit, accumulators, function tables, initiator, and master programmer is present and placed exactly where it was on the original machine,” Tom Burick, the teacher who mentored the project, said at the ceremony.
The replica, still on display at the school, is expected to be moved to a more permanent spot in the near future.
ENIAC’s significance is both technical and symbolic. Technically, it marks the beginning of the chain of innovations that created today’s computational infrastructure. Symbolically, it made governments, militaries, universities, and industry view computation as a tool for improvement and for innovative applications that had previously been impossible. It marked a tectonic shift in the way humans approach problem-solving, modeling, and scientific reasoning.
The ENIAC legacy heralded the computer age, transforming not only science and industry but also education, research, and human communication and interaction.
As Eckert is reported to have said, “There are two epochs in computer history: Before ENIAC and After ENIAC.”
The remarkable evolution of computer hardware during the past 80 years has been sparked by advances in programming languages—the essential drivers of computing.
From the manual rewiring of ENIAC to the orchestration of intelligent, distributed systems, programming languages have steadily evolved to make computers more powerful, expressive, and accessible.
The evolution of computing will continue along multiple trajectories, with the emphasis moving from generalization to specialization (for AI, graphics, security, and networking), from monolithic system design to modular integration, and from performance-centric metrics alone to energy efficiency and sustainability as primary objectives.
Increasingly, security will be built into hardware by design. Computing paradigms will expand beyond traditional deterministic models to embrace probabilistic, approximate, and hybrid approaches for certain tasks.
Those developments will usher in a new era of computing and a new class of applications.