David Alan Grier, When Computers Were Human, Princeton University Press, 2005, VIII + 411 pp., $35, ISBN 0-691-09157-9 (hardback).
This book covers the pre-automatic age of computing, when computation was done by humans (latterly with the help of calculators) rather than electronic computers. Indeed, the Shorter Oxford English Dictionary (1993) gives the first definition of computer as "a person who makes calculations," in use from the mid-17th century. The second definition, from the late 19th century, is "an apparatus for making calculations." So the term computer has been used to describe humans for far longer than its current general usage.
The book starts with a personal introduction, presenting the author's grandmother, who studied mathematics at the University of Michigan and gained a degree in 1921 (one of six female students out of a total of 12, a high ratio). She went on to become a "computer." In a later time, there might have been more opportunities for her, but at that point (and indeed for most of the period when human computers existed), this occupation was one of the few available for women with some mathematical ability. I suspect this evocative opening "scene" was a great inspiration for the book and has helped to give it passion.
The book's overall structure is formed around the visits of Halley's Comet to the sun every 76 years or so. This appropriately illustrates the advances in computing (mainly human, and more recently machine-based) over a regular time period since the first predicted return in 1758.
The position of the 1682 comet was measured and studied by Edmund Halley (1656–1742). He used the gravitational theory of Isaac Newton (1642–1727) in his prediction. However, because more than two bodies were involved, the calculations needed were laborious and beyond Halley's mathematical capability for detailed analysis.
Subsequently, only one mathematician attempted to predict the date accurately, namely the Frenchman Alexis-Claude Clairaut (1713–1765). He divided the computation with two friends, Joseph-Jérôme Lalande and Nicole-Reine Lepaute. This is perhaps the first example of a set of human computers working in unison, an important aspect in the development of human computing.
In fact, there came to be a number of different levels in the organization of early manual computing. A mathematician would work on the high-level mathematical aspects; the next one or more people would divide up the calculational tasks, to be undertaken in parallel; and finally the actual computers would carry out the calculations themselves. Of course, checking for errors was important. It was possible to repeat calculations and check results, but this was wasteful of labor, and other more efficient techniques were developed.
For the comet's 1835 return, another Frenchman, the astronomer Philippe Gustave Le Doulcet, Comte de Pontécoulant (1795–1874), provided what was probably the best set of calculations to predict the perihelion's date. This was gradually refined from 7 November to the evening of 12 November. The actual date was 16 November. Other calculations varied by up to 16 days from this date.
By the year before the 1910 return, there were only two predictive calculations available. One was Pontécoulant's calculation for this return. This was still generally accepted as adequate because he had included Neptune's gravitational pull and no new planets had been discovered since. However, Andrew Crommelin of the Royal Greenwich Observatory in England disagreed and led a team to recalculate the timing, which proved to be within two days, 16 hours, and 48 minutes of the actual time.
The book ends with an epilogue on the comet's 1986 return, by which time, of course, electronic computers were available. This time, Donald Yeomans, a young researcher at the Jet Propulsion Laboratory in California, used a Univac 1108 computer programmed in Fortran IV to undertake the calculation. He calculated past visits from 1682 onward and, in 1977, predicted the next perihelion would be on 9 February at 15:50 Universal Time, with an accuracy of 6 hours. It was actually at 10:48, 5 hours and 2 minutes before the predicted time.
The next return of Halley's Comet will be in 2061, although unfortunately many readers, including myself, are unlikely to witness the event. However, for those who are here, it will be interesting to see whether the perihelion can be predicted with yet more accuracy or whether there are unknown variations that make this impossible.
Interspersed with the accounts of the Halley's Comet calculations are more general accounts and vignettes concerning human computing. Much early computing was aimed at astronomical calculations. However, other applications included nautical tables for marine navigation, census statistics with data from Hollerith tabulating machines, military calculations such as shell trajectories, and telephone transmission calculations.
Of course, the development of mechanized calculators increasingly meant that human computers had additional aids. Gradually, these became more elaborate. During World War II, early automatic computing devices were developed secretly at Bletchley Park in the UK to enable code breaking at sufficient speed, although this aspect is a notable omission from the book.
A late and ultimately doomed effort involving human computing was the Mathematical Tables Project of the Work Projects Administration (WPA) in New York. The book documents the efforts to keep this project alive after World War II, but naturally, the relentless advance of the electronic computer eventually won and the project closed. All the complex calculational toil that was once done manually by people would become buried within fully automated and inanimate computers instead. However, as I mentioned earlier, the history of human computing is still longer than that of modern computing.
A book like this can never be comprehensive, and indeed, I suspect that much interesting historical information on human computing has been lost over time. However, the book successfully gives a historical flavor of important developments. There could be more technical mathematical information on the approaches used, for example, on how calculations were divided up for parallel execution. However, this would detract from the book's overall readability, so it is an understandable omission.
Overall, this book provides a wonderful survey of human computing from 1682 onward. It is written in an accessible style, yet is highly scholarly. There are full chapter notes, a bibliography, and a comprehensive index for those who wish to use this book as a starting point for serious research. It also contains a useful list of important people, organizations, and concepts, with brief information for those who wish to refresh their memory of a name.
However, all these resources can easily be ignored by those who want to read the book from beginning to end as a historical tale of scientific endeavor. The 76-year cycle of Halley's Comet keeps the book moving along and works well as an overall structure. I recommend the book to all historians of computing, both professional and amateur.
Jonathan P. Bowen, firstname.lastname@example.org
Jeremy M. Norman, ed., From Gutenberg to the Internet: A Sourcebook on the History of Information Technology, historyofscience.com, 2005, 899 pp., $89.50, ISBN 0-930405-87-0.
This is a monster of a book, but one that needs to be on the shelf of anyone seriously interested in computer history. Oversized (8.5 × 11 inches) and more than 900 pages long, it's an unbelievable bargain at the price.
We've all run across a reference in a paper and wished we had a copy. In most cases, the time to track down a copy exceeds the time to read it. And so, we put it on our "to do" list and usually forget about it. Norman's anthology attempts to fill this gap with excerpts from 63 seminal papers from the history of information technology. The book contains more than 100 illustrations, many of which will be new (and interesting) to most readers.
Chapter 1, "From Gutenberg's Press to the Foundations of the Internet," is a 60-page introduction in which Norman traces two informational shifts: from manuscripts to printed materials and from printed materials to the Internet. Although a bit tedious in places, it provides a robust perspective from which to view the excerpts. The book is written for a general audience and not just for the historian or information technologist.
Chapter 2, "An Annotated Chronology of Wide-Ranging Scientific, Social, and Commercial Developments in the History of Information Technology from the Years 100 to 2004," is a 40-page annotated time line. This time line is more comprehensive than any I have seen previously and is available at the publisher's Web site, http://historyofscience.com.
Norman provides an introductory note to each chapter. The excerpts are grouped under the following headings (starting with Chapter 3); the numbers in parentheses are the number of excerpts in each section:
3. Human Computers (2)
4. Mechanizing the Production of Tables (5)
5. The Earliest Data Networks (6)
6. Origins of the General Purpose Programmable Computer—Babbage's Analytical Engine (3)
7. The Theory of the Universal Machine (4)
8. Logical Design and Production of the First Electronic Digital Computers (12)
9. The Origins of Computer Programming (6)
10. Early Applications of Electronic Computers (7)
11. Computing and Intelligence (7)
12. Communication Theory (3)
13. Origins of the Internet (8)
The Web site also contains the book's front matter, three pages from the introduction, and an index to the names he uses in the book. Thankfully, the front matter includes a detailed table of contents in which Norman identifies the excerpts by author so that interested readers can check out the contents to decide if they need the book. There is also a list of the 36 illustrations he uses in the introduction and introductory notes.
Norman is a rare book and manuscript dealer specializing in the history of science, medicine, and technology. He has made a significant contribution to historical scholarship by publishing this book. Universities and organizations involved in information technology (almost all organizations) should make sure that From Gutenberg to the Internet is in the reference section of their libraries. Given the modest price, the same should be said for any serious historian of computing or information technologist interested in the discipline's origins.
Tim Bergin, email@example.com
Christophe Lécuyer, Making Silicon Valley: Innovation and the Growth of High Tech, 1930–1970, MIT Press, 2006, 393 pp., $40.00, ISBN 0-262-12281-2.
This fascinating study is based on the author's 1999 dissertation at Stanford University—fittingly, a central player in the region he writes about. Now a historian at the Chemical Heritage Foundation, Lécuyer has authored a valuable analysis—an example of industrial archeology research—combining changing technology and emerging business structures to demonstrate the various waves of change that have swept this region from the 1930s into the early 1970s.
Much of the four-decade period under discussion, of course, predates serious computer design and manufacturing work (only at the very end do we hear of Apple's rise), but the author provides an important contextual background for that development. Using a variety of archival resources, Lécuyer delves into the sometimes conflicting roles of personality and entrepreneurship as well as technology shifts. His approach is to center chapters on specific companies as illustrative of major themes and trends.
Although it is now barely remembered or even conceivable, San Mateo and Santa Clara counties were covered with fruit orchards through and after World War I. Truck farming was the chief business, with San Jose the region's primary shipping and business hub. It is only with the rise of radio (both broadcasting and short-wave point-to-point) that a paradigm shift began, combining the lovely living conditions of the central California coast with a thriving radio-hobby community (pooling expertise and demand) to create a market for vacuum tubes. Some manufacturing had begun on a small scale with military contracts during World War I, but most of the industry was then based in the east.
By the 1920s, General Electric and, especially, RCA dominated vacuum-tube patents and manufacturing. The emergence of the power vacuum-tube business (of which the chief player was the all-but-forgotten firm Eitel-McCullough) marked the real start of the area's technology shift. Diversification followed in the difficult Depression years (here the book focuses on Litton Industries), when government contracts were scarce and commercial work not much easier to find.
One of Lécuyer's central themes is the varied role of defense contracts in the region as he explores the economic and political forces that sustained the wartime and postwar emergence of the microwave-tube business (as the book shows through the changing role of Varian Associates). Those contracts sparked huge growth during the war, but the dumping of surplus vacuum tubes in 1945–1946 nearly bankrupted the companies that could not compete. This theme recurs with defense cutbacks under McNamara in the early 1960s and, again a decade later, when sharp cuts led to a series of personnel layoffs and company mergers. By the 1960s, these cycles began to force a marketing shift from government needs to a more commercial emphasis, laying further ground for the changes to come.
The postwar "revolution in silicon" in the region centers on new firms: first the mid-1950s formation of Shockley Semiconductor (itself a demonstration of the pull of California and Stanford University on those who had initially moved east) and then Fairchild Semiconductor, which were the "fathers" (albeit reluctantly) of firms such as Intel in the 1960s. Development of the new integrated-circuit business by these growing companies created the thriving base for the rise of the PC hardware and software businesses in the mid-1970s. But to get there, thinking had to shift from analog to digital technology, and new forms of management, product design, and technical competency had to be, and were, developed.
Lécuyer closes his readable narrative just as the PC revolution is about to transform and expand Silicon Valley even more. As such, then, he focuses on the early development of what is now a phenomenon, a place that has become a metaphor for the 21st century. This history is both insightful and important. I found it a compelling read.
Christopher H. Sterling, firstname.lastname@example.org