Dear Commons Community,
I have just finished reading Turing’s Cathedral by the technology historian George Dyson. It is an in-depth look at the beginnings of digital technology here in the United States. All of the luminaries of the early days of computing appear, including John von Neumann, Alan Turing, Nils Barricelli, Julian Bigelow, Stan Ulam, and others. Von Neumann and those affiliated with the Institute for Advanced Study (IAS) in Princeton, New Jersey, are the main focus.
I found the book a fascinating read, but it isn’t for everyone. Dyson at times gets deep into the weeds of computer science, mathematics, digital engineering, and physics. In between the science, however, he covers the biographies of the main characters in good depth. Dyson himself is the son of the physicist and mathematician Freeman Dyson, who was on the faculty of IAS. He also covers well the tension and antagonism at IAS between the theoretical mathematicians and those who wanted to apply mathematics to the building of the first computers, and he devotes significant time to the development of nuclear weapons and the cold war, in which von Neumann and others were deeply involved. There were those at IAS, including Albert Einstein, who were completely opposed to these efforts.
Below is a review that appeared in The Guardian when Turing’s Cathedral was first published in 2012.
Tony
——————————————————————————————————————————————
The Guardian
Turing’s Cathedral by George Dyson – review
This study of cold war academic John von Neumann and his early computer is engrossing and well-researched
Sat 24 Mar 2012
The foundation myth of the internet invariably involves an iconoclastic and romantic technology entrepreneur, who, free from government restraint, enlists free-floating venture capitalists in building the Next Great Thing. It’s a myth that borders on delusion, for some of the key technologies that led to the internet were underwritten by government subsidies and arose in the context of larger-than-life geopolitical battles.
Thus, cryptography, which powers much of today’s electronic commerce, advanced in the background of the second world war, while packet switching – a cold war-era technology that made the internet possible – was designed to guarantee resilient communications in the event of a nuclear attack. More recently, 9/11 and the wars it unleashed have magically transported biometric technologies such as automated facial recognition from the battlefields of Afghanistan and Iraq into our offices and living rooms.
In Turing’s Cathedral, George Dyson shows that the history of the modern computer belies the foundation myth as well. Dyson, who has previously written on the history of the Aleut kayak and a failed American attempt to send a mission to Mars, traces one particular effort to build and operate a computer – the unassumingly named Electronic Computer Project (ECP) based at the Institute for Advanced Study (IAS) at Princeton.
The ECP was underwritten by various parts of the American government shortly after the second world war. The idea was to use computers to forecast the consequences of a thermonuclear explosion; eventually, the IAS computer was also put to more peaceful uses in biology and meteorology.
The project’s godfather – the Hungarian émigré John von Neumann – was a polymath whose interest in computing had roots in both politics and academia. A superb mathematician who also made landmark contributions to economics and game theory, Von Neumann believed that computers might push mathematicians – who constituted the most powerful group at the institute – to appreciate the theoretical challenges posed by applied work. At the same time, his aversion to totalitarianism made him eager to help bolster the military might of his adopted homeland.
It took a genius of Von Neumann’s scale to overcome the immense opposition to the project at the institute, which was a fascinating microcosm of intellectual life at the time (Dyson’s book is worth reading for its treatment of the institute’s early history alone). Building and operating a computer on the institute’s premises meant opening its doors to engineers – a development that professional mathematicians, averse as they were to any work that didn’t require chalk, blackboard, paper and pencils, didn’t like at all. The institute’s humanists hated mathematicians and engineers alike and, now that the war was over, didn’t shy away from expressing their discontent.
It didn’t help that Einstein, who was then at the institute, opposed the idea of “secret war work” and feared that “the emphasis on such projects will further ideas of ‘preventive’ wars.” However, “preventive wars” were exactly what the hawkish Von Neumann wanted: in the immediate aftermath of the second world war, he briefly advocated the idea of a quick preventive war with the USSR to be followed by a Pax Americana. He also had no qualms about working with the government, eventually leaving the institute in 1955 to join the United States Atomic Energy Commission – the government agency that had, only the year before, humiliated his friend and colleague Robert Oppenheimer by stripping him of his security clearance.
Strictly speaking, Von Neumann’s was not the first computer. However, it played an extremely important role in getting the nascent computer industry off the ground. First of all, its origins in academia made it easier to get working scientists to pay close attention to what computing had to offer. Second, Von Neumann wanted to ensure that any work that the institute did on the ECP was put in the public domain and widely disseminated rather than patented by engineers (this noble effort was marred by Von Neumann’s consulting gig with IBM – not well-publicised at the time – which required him to grant all of his own subsequent inventions to the company). Third – and most important – Von Neumann chose not to optimise his computer to do only pressing or lucrative tasks; he knew that its most useful applications had not been anticipated yet. By arguing that “the projected device… is so radically new that many of its uses will become clear only after it has been put into operation”, Von Neumann helped to usher in the era of general-purpose computing which, alas, may now be finally coming to a close, as consumers embrace single-purpose apps and tightly controlled computing devices.
While Dyson doesn’t shy away from discussing obscure technical and theoretical aspects of Von Neumann’s computer, he also provides ample social and cultural context. Gottfried Leibniz, Francis Bacon, and Bishop Berkeley appear next to more contemporary luminaries such as Norbert Wiener (the originator of cybernetics), Vladimir Zworykin (a pioneer of television) as well as numerous members of the Huxley family (Aldous, Julian and Thomas). Dyson, who grew up at the institute, where his father Freeman Dyson was a fellow, also brings a charming personal touch to the narrative.
Alas, the book is not perfect. Dyson, who spent a decade writing and researching it, bombards the reader with a mind-boggling stream of distracting information that adds little to his tale. We get to learn of the discrepancy between the British and Canadian war records of Jens Frederick Larson, the architect of the institute’s main hall; the price of oysters served at lunch meetings of its building committee; the price of nappies in Los Alamos hospitals in the 1950s.
Dyson’s efforts to connect Von Neumann’s cold war computing to today’s Silicon Valley result in a slew of untenable generalisations. Is it really true that “Facebook defines who we are, Amazon defines what we want, and Google defines what we think”? Occasionally, Dyson makes mystical claims that no serious historian would endorse. What to make of his statement that “only the collective intelligence of computers could save us from the destructive powers of the weapons they had allowed us to invent”? This is a very odd way to tell the story of numerous disarmament campaigns, of the fervent antiwar activism of the 1960s, of the emergence of groups like Computer Professionals for Social Responsibility that sought to draw clear ethical boundaries between academia and the defence industry. Surely, all of that mattered more than the “collective intelligence of computers”?
Despite these shortcomings, Turing’s Cathedral is an engrossing and well-researched book that recounts an important chapter in the convoluted history of 20th-century computing. An equally rich history of Google and Amazon is long overdue.
Evgeny Morozov is the author of The Net Delusion (Penguin)