This set of Reviews covers Michael Mahoney's Histories of Computing (Harvard University Press, 2011), the 2013 movie Elysium, Paul E. Ceruzzi's Computing: A Concise History (MIT Press, 2012), and James Cortada's The Digital Flood (Oxford University Press, 2012).
Michael Sean Mahoney, Histories of Computing, ed. and with an introduction by Thomas Haigh, Harvard University Press, 2011, 250 pp.
Normally when a distinguished professor retires, his students organize a festschrift to recognize both his work and that of his students. In the case of Princeton's Michael Mahoney, a sudden heart attack cut short his life. Instead of a festschrift, Thomas Haigh, who took a graduate course from Mahoney, has organized a set of Mahoney's major papers on the history of computing.
Reflecting Mahoney's interests, Haigh divided the 13 papers evenly among shaping the history of computing, constructing a history of software, and the structures of computation. Haigh and Mahoney use histories in the plural to show the many ways to approach computing.
New graduate students in the history of technology in general, as well as in the history of computing, and computing practitioners constitute the obvious audience for this collection. They will learn how a master historian approaches a big subject. Particularly important is learning how to ask the right questions (especially those about paths not followed) and how to find appropriate sources. In this sense, the historian is a skilled sculptor, intensively studying the subject and raw material before beginning.
Described in these articles, but not overtly compared with doing history, is the creation of software. Software development projects often did not use similar skills, resulting, as Mahoney discusses several times, in expensive delays. Those creators spent too little time at the beginning thinking about what they actually wanted and needed to do and instead moved ahead quickly, at great cost.
The five articles on the structures of computation will present a challenge to anyone not steeped in the wide range of theoretical computing. The articles reflect Mahoney's deep expertise in a complex field and his ability to trace the many social, institutional, and other strands of influence that intertwined to provide the mathematical basis of computation, skillfully demonstrated by his history of 1950s computing and mathematics at Princeton.
Mahoney wrote his first article about the state of the history of computing in 1988. A quarter-century later, he would be thrilled to witness not just a flourishing academic discipline with conferences and other signs of a healthy field, but more importantly, the rich range of questions and techniques employed by its practitioners. Indeed, the latest recipient of the Computer History Museum Prize (awarded to the best book in computing history by the ACM Special Interest Group for Computers, Information, and Society [SIGCIS]), Joseph November, was a Mahoney student. And that is a living festschrift to a dynamic, dedicated teacher.
is an associate professor of history at Texas A&M University. Michael Mahoney was his undergraduate advisor back in the 20th century. Contact him at firstname.lastname@example.org.
Elysium, directed by Neill Blomkamp and produced by TriStar Pictures, 2013.
It is an open secret that Hollywood science fiction films are not true works of science fiction. More often than not, they are action-adventure exercises where rogue bands of scrappy humans set out to sabotage or destroy the apotheosis of—albeit evil—technological advancement. The various Terminator films represent this dire trend, where Skynet and its cyborgs have essentially taken over the planet and must be stopped by the last remnants of humanity. Such films deliver an antitechnological, if not essentially Luddite, message. With Elysium, South African director Neill Blomkamp chooses an entirely different tack.
Set 50 years in the future, Blomkamp's follow-up to his apartheid allegory District 9 examines another societal conflict: the clash between the 99 percent and the 1 percent. In the film, the wealthy have abandoned Earth and taken up residence in a habitat ring (resembling the space station in 2001) that orbits the planet. The remainder of humanity dwells in an environment overwhelmed by disease and pollution. As with District 9, the director has a great flair for creating places of squalor. Max (Matt Damon), a paroled criminal, has been exposed to lethal levels of radiation at his workplace in Los Angeles (the film was shot in Mexico City) and must get to an advanced medical bay (a highly sophisticated computing system that both diagnoses and cures patients of illness) on the Elysium habitat in order to be cured. He falls in with a band of revolutionaries who employ him to kidnap a corporate CEO in exchange for getting him to Elysium. They also outfit him with a military exoskeleton that they hardwire into his brain. This gives him a fighting chance against Kruger (Sharlto Copley, another veteran of District 9), an undercover operative of Elysium, sent by Secretary of Defense Delacourt (Jodie Foster) to neutralize any and all threats.
Although what transpires thereafter is a fairly standard David-versus-Goliath contest, readers of the Annals will be interested in how Blomkamp utilizes technology in general and computers in particular, which, like his political message, is fairly far afield from Hollywood convention. There is a great deal of technological forecasting on display here, including extremely sophisticated weaponry, advanced shuttlecraft, Max's exoskeleton, direct computer interface with the human brain (and consequent hacking thereof), medical bays, and the technological and aesthetic wonder of Elysium itself. Robots are frequently employed for service work as bodyguards, parole officers, medical orderlies, and so on. The director, however, does not assign any ethical attributes to this equipment. It is unclear whether Blomkamp has familiarized himself with Melvin Kranzberg's laws of technology,1 but the first one is very much in operation: the machinery is morally neutral. Furthermore, because neither the computers nor the robots are given much in the way of human attributes, the audience has no desire or inclination to see them as anything other than morally neutral. There is no HAL 9000 from 2001; the computer is not a character.
The film turns on how the larger society utilizes technology, with a particular focus on the computerized medical bays. In Elysium, agency for good or evil rests squarely with humanity. When computer access is highly restricted, then it is humans who have acted badly. Conversely, when Max has succeeded in opening access to all, then again, it is a human who has found the moral courage to act justly. Max and his childhood friend Frey (Alice Braga) stand on their own, while Delacourt and Kruger have no one to blame for their actions but themselves.
Blomkamp's use of computers, and hence his vision of the future, demonstrates a level of maturity above the Hollywood norm. Computers here are as they have ever been: a means to solve a particular problem or to lighten a particular burden. Elysium places the user under the moral microscope. Technology, then, is not just some monster, like a bug-eyed alien from space, that needs to be destroyed without any regret or afterthought. That kind of storytelling is overly simplistic and, frankly, too often lets humanity off the hook morally. In Elysium, humanity is the author of its dystopian future, and thus for historians of computing, the movie evokes Kranzberg's sixth law: technology is a very human activity. This makes for a much more realistic science fiction film, if also a more harrowing tale.
Anthony P. Pennino
is an assistant professor in the College of Arts & Letters at the Stevens Institute of Technology. Contact him at email@example.com.
Paul E. Ceruzzi, Computing: A Concise History, MIT Press, 2012, 199 pp.
Paul Ceruzzi's contribution to the MIT Press Essential Knowledge Series is a concise history of computing up to present-day social networks. The intended public of the MIT series consists of nonexperts interested in the fundamentals of a discipline. For this audience, Ceruzzi outlines the main aspects of computing, despite being aware that “new developments transform the field while one is writing, thus rendering obsolete any attempt to construct a coherent narrative” (p. ix).
The historical account is structured according to four main threads that constantly resurface in the six chapters of the book. The first thread, named the digital paradigm, emphasizes the long-lasting choice to code information, computation, and control using only two digits, 0 and 1. The second thread, convergence, instead points out that different technologies, ranging from the telephone to the calculating machine, have contributed to the making of the computing tools we have today. The third thread, referred to as solid-state electronics, concentrates on the role of electronics in the development of increasingly smaller and more powerful computing devices. The last thread, named the human-machine interface, reflects on the interaction of human beings with computing tools and addresses both the quest for a mechanical equivalent of human intelligence as well as the issues related to the most effective machine designs.
Ceruzzi begins his narrative by recounting the roots of the digital age and the several streams that contributed to computing over time and to the making of its master technology, the computer (Chapters 1 and 2). He concludes with the aspects of greatest impact on the general public today—that is, the Internet and the World Wide Web and the rise of social networks like Facebook and Twitter (Chapter 6). In between, he deals with the stored-program principle, the development of computer architecture and programming languages, and the relevance of World War II as a turning point in the history of computing (Chapter 3).
The role of solid-state electronics is examined throughout the book starting with the transistor, the electronic device that made possible the transition from mainframes in need of a dedicated space to minicomputers that could be “brought … out of the computer room and into the hands of the user” (p. 71) (Chapter 3). The chip (Chapter 4), the microprocessor (Chapter 5), and their role in the development of computing are investigated as well.
In the book's conclusion, Ceruzzi is unwilling to speculate about future trends, but rather invites the reader “not [to] lose sight of the general themes that have driven computing and computer technology from its origin” (p. 155) and comes back once again to the four threads behind his account and to the long-term perspective that they provide to his story, structuring it from beginning to end.
Ceruzzi's narrative choices are the chief asset of the book and make it a distinctive and welcome contribution to the popularization of the history of computing. In fact, Ceruzzi systematically challenges the simplistic approaches too often adopted by popularizers—that is, the emphasis on endless and linear progress and the focus on chronology, pioneers, or national traditions. His account instead makes clear that the lineage of the computer is complex and that this technology has many ancestors and more mothers and fathers than can be named in a concise history. Ceruzzi is equally resolute in remarking how unsatisfactory a history based on “firsts” is, and he insists on the complementarity of the computer projects that implemented the stored-program principle after World War II, rather than on their chronological order. He also engages the reader in the historiographical debate on the role of technology in shaping history and shows how deterministic accounts that describe solid-state technologies as a driving force in computing remain inadequate, while a mutual interaction of technologies and human agency offers, for instance, an interpretive key for understanding the development of the personal computer.
Despite its small format, Ceruzzi's book contains appendices helpful for the nonexpert reader, such as a glossary and a list of further readings. To increase the usefulness of the book for its intended public, more illustrations and some infographics might have served better than the quotations taken from the main text and reprinted in boldface on full pages, as they do not add to the information already presented by the author.
All in all, Ceruzzi's account makes for a brief but comprehensive and original introduction to the history of computing. It can be instructive for undergraduate and graduate students approaching the history of computing for the first time, and it is especially recommended to the nonexpert reader who may be skeptical about the media hype surrounding the latest developments in computers or social networks and is instead interested in a deeper understanding of what computing is and how computing technologies came to be an integral part of our daily lives.
is a postdoctoral fellow in the Department of Philosophy, Literature, History of Science and Technology at the Technische Universität Berlin. Contact her at firstname.lastname@example.org.
James W. Cortada, The Digital Flood: The Diffusion of Information Technology Across the U.S., Europe, and Asia, Oxford University Press, 2012, 789 pp.
After exploring the evolution of office equipment firms in Before the Computer and the transformations that data processing wrought upon the American private and public sectors in The Digital Hand, James Cortada has broadened his scope even further to publish the first truly global history of computing. The Digital Flood is an ambitious book that synthesizes several decades of country-specific case studies into a transnational narrative outlining the factors responsible for the diffusion of information technologies (IT) in the United States, Europe, and Asia.
To accomplish this task, Cortada must strike a delicate balance between acknowledging the uniqueness of each nation's engagement with computers and deriving generalizable lessons about the circulation and use of new technologies. His solution is to organize the book into geographically themed chapters, unified by a theoretical framework derived from economically inspired “wave theories” of history. Specifically, he suggests that the diffusion of IT follows a cyclical pattern, with more recent adopters replicating the behavior of their predecessors as best they can within the constraints of existing political, economic, and cultural conditions.
The majority of The Digital Flood is devoted to what Cortada terms Wave One, the period between World War II and the 1990s when information technologies became central to the ongoing operations of large government and business organizations. The first nation to undergo this transition was the United States, thanks largely to the massive influx of military funding for Cold War computing projects. Advances in hardware coincided with the expansion of academic computer science departments and a network of vendors led by IBM that encouraged the rapid adoption of mainframes across the country and the globe.
Essentially, The Digital Flood presents the history of information technology as a fugue, with the United States establishing a theme and other nations subsequently performing variations modeled after and simultaneously interacting with the original. In the midst of the postwar recovery, for example, Western European governments also sponsored computer research programs, but their efforts faced a significant competitive challenge from IBM, whose marketing infrastructure dwarfed those of its Old World rivals. Meanwhile, on the other side of the Iron Curtain, the various members of the Warsaw Pact struggled to reconcile their desire to limit contact with the West and the need to engage with a global economy increasingly reliant on IT.
As Cortada's narrative moves into the 1960s, Asian governments and corporations appear more frequently alongside their American and European counterparts. The Digital Flood highlights not only the rise of the Japanese electronics industry under the auspices of the government-sanctioned conglomerates known as keiretsu, but also the adoption of IT exports as an economic development strategy in South Korea, Singapore, and Taiwan. In every instance, Cortada emphasizes how each country benefitted from nationally specific social factors, ranging from the strength of South Korea's military traditions to Singapore's cultural pluralism. At the same time, he calls attention to the growing role of Asian firms in Western economies, such as competition with American consumer electronics firms or the respective acquisitions of Britain's International Computers Limited and France's Compagnie Machines Bull by Fujitsu and NEC.
By the 1990s, Cortada argues that all the previously referenced regions, except for portions of the former Soviet Bloc, had reached the end of Wave One and entered the early stages of Wave Two. Where the former phase focuses on the integration of computers into institutional operations, Wave Two is defined by networked information technologies capable of reshaping the work and leisure of private citizens. Once again, however, the timing of any country's transition into Wave Two varied sharply, and The Digital Flood's final two case studies—China and India—call attention to the proliferation of Wave Two technologies such as the cell phone and Internet within economies that are still in the midst of Wave One. Such devices, along with substantial government infrastructure investments, enabled both these countries to become major players in the 21st century IT market, despite their relatively late embrace of digital computing.
Indeed, the Chinese and Indian examples serve as reminders that the diffusion of information technologies remains a work in progress, one that does not lend itself readily to straightforward chronologies or unilateral explanations. The Digital Flood skillfully avoids both of these pitfalls. Although one might take issue with the ambiguous dividing line between Waves One and Two in his analysis—a theme that Cortada addresses at length in a separate appendix—it is difficult to fault him for imposing a structure that will facilitate comparative discussions of technological proliferation. Whether treated as a reference work or a model for future studies of information technology's spread in Latin America, sub-Saharan Africa, or the Middle East, The Digital Flood is certain to retain a prominent place on the shelves of computer historians for many years to come.
is a research fellow at the Chemical Heritage Foundation. Contact him at email@example.com.