Issue No. 01 - January-March (2011 vol. 33)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/MAHC.2011.12
Paul N. Edwards, A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming, MIT Press, 2010, 528 pp.
Paul N. Edwards's first book, The Closed World, identified the profound role computers played in shaping Cold War discourse and reality. There, Edwards explored the intimate relationship between the military and computers, in which each helped make the other amidst global conflict. Edwards's newest book, A Vast Machine, examines a different reciprocal relationship with global implications: the relation between data and models, particularly computer models, in creating knowledge about Earth's climate.
Data and Models
Edwards's central argument in A Vast Machine is that data and models coproduce planetary climate knowledge, that all knowledge about global climate demands modeling, and that computers are essential to this science. Although Edwards never clearly defines his view of what a model is, his book characterizes different types of models that act as both creators and carriers of scientific knowledge. Because Edwards focuses his analysis on the co-construction of climate data and models, this book makes a significant contribution not only to the historical construction of climate science but also to our understanding of the role of models in science and the ways digital computers have changed scientific practice.
Edwards avoids both ideological debate about climate change and philosophical debate over the realism or reductionism of models to focus instead on the epistemology of climate science. A Vast Machine details how scientists created a global knowledge infrastructure to measure and model Earth's climate, seeking to explain "how we know what we know about climate" (p. xiv). As Edwards notes, if climate is the average state of the atmosphere over time, then understanding Earth's climate requires an historical approach. Climate science is thus an historical science, one with analyses of the past that are made relevant to the present through sophisticated computer models and simulations. To understand how climate science is made, Edwards chronicles how historical, empirical data from various scientific fields have been collected, conceptualized, constructed, and reconstructed. This scientifically complicated and politically controversial process creates an authoritative (to climate scientists) yet always-provisional picture of Earth's past, present, and future climates.
Individual actors appear throughout the narrative, but Edwards's focus remains the development of the global infrastructure for climate science. In this way, the book complements other climate-related histories, such as Spencer Weart's, 1 that focus more on individual scientists and discovery processes.
Edwards originally planned to write an international history of climate modeling, which early chapters reflect. The majority of A Vast Machine, however, offers a Western-centric analysis of the interdependence of data and models in modern climate science. Although Edwards uses a broad periodization, covering 17th-century trade-wind theories and 19th-century projects to standardize meteorological observation, he concentrates mostly on post-World War II developments. Chapters are generally, but not entirely, chronological, and most open and close with helpful, nontechnical summaries. The introduction offers readers a guide depending on their interests. (Although he suggests academics read the whole book, Annals readers might find the history of models and data in Chapters 6 through 13 most appealing.)
A Vast Machine castigates the skeptics of climate science who demand more "real data" in place of abstract computer models. Data and models, Edwards explains, require one another for any coherent understanding of worldwide climate. "Without models," he explains, "there are no data" (p. xiii). Global climate models brim with empirically derived data, but the data would be useless without models to help standardize and represent them globally. Edwards's argument recalls what Harry Collins once labeled the "experimenter's regress." Edwards asserts, "If we cannot trust models without evidence, neither can we trust evidence without models" (p. 412). Skeptics' demand for more raw data misunderstands the very meaning of data in climate science, a meaning that computers helped reshape.
Coproducing Global Knowledge
The book's core hinges on this model-data symbiosis—dual processes that Edwards calls "making global data" and "making data global." Climate knowledge first needed an infrastructure for creating and collecting data from points around the world, which it borrowed from weather-observation infrastructures. Unfortunately, the infrastructure for making global data spread unevenly across the globe with different standards of evidence, instruments, theories, data sets, and models used in different ways throughout history. It also included data from disparate scientific fields including, among others, meteorology, oceanography, and various forms of ecology. The resultant "metadata friction" makes the basis of planetary climate data inherently inconsistent, incomplete, and heterogeneous. Climate science requires a hierarchy of models to make this amalgamated data useful.
By making data global, Edwards refers to climate scientists' use of models to build their heterogeneous data into complete, standardized, and coherent global data sets and, ultimately, to simulate Earth's entire climate. Climate scientists must flip the historical data produced by weather infrastructures to obtain a more precise and definitive account of the atmosphere's history. This "infrastructural inversion," as Edwards calls it, requires scientists to unpack and scrutinize old records, figure out how they were made and what might be wrong with them, understand how they compare with each other, and determine how they need to be adjusted—all processes requiring mathematical modeling. As Edwards explains, "all knowledge about climate change depends fundamentally on modeling.… [P]utting together a trustworthy and detailed data image of the global climate—getting enough observations, over a long enough time span—requires you to model the data, to make them global" (p. 352).
Over the past 60 years, increasingly powerful computers provided the means to make climate data global. Because global systems are too huge and complex to study experimentally, computers calculate planetary climate simulations, which permit experimentation and data reanalysis. As early as 1950, computer pioneers, such as John von Neumann, used the ENIAC to numerically solve equations that modeled a small region of Earth's atmosphere as a simple grid. Computers soon permitted multidimensional models of the atmosphere's general circulation over increasingly larger areas and periods. Edwards acknowledges the contributions of Scandinavian theorists to this process and frequently cites histories of numerical weather prediction such as Kristine Harper's Weather by the Numbers. 2 Yet, Edwards reclaims von Neumann's position as a "principal architect" of this development (p. 111).
Edwards argues that all climate knowledge comes from three types of computer models: data, simulation, and reanalysis. Data models redress systematic errors, make readings from different instruments compatible, and interpolate readings to grids in simulation models. Climate simulation models calculate large-scale atmospheric dynamics, much like weather prediction models, but over longer periods. Reanalysis models, also derived from weather forecasting, blend actual weather observations with climate simulation outputs. As Edwards notes, climate knowledge is "models all the way down" (p. 263). Computers synthesize ever-growing data from satellites, expanded networks of atmospheric observation, and ongoing infrastructural inversion. Scientists have taken the vast machine of Earth's climate and placed it inside the vast machine of computers.
The final chapters of A Vast Machine shift from the parameters of computer modeling to the politics of global warming. Edwards's contribution to this expanding literature explains why global warming received relatively little policy attention before the 1970s: "computer models, the tool on which warming theories depended for their credibility, had yet to acquire full scientific legitimacy" (p. 359). Although these models still fight for full political legitimacy, scientific life today would be unimaginable without digital simulation modeling. From The Limits to Growth 3 to the Kyoto Protocol, Edwards briefly outlines the increasing importance of, and challenges for, computer models and data-model symbiosis within the political contexts of global warming. Whereas earlier topics garner sometimes-exhaustive coverage, these latter chapters lack similar detail. I would have traded earlier minutiae for greater elaboration on the relations between computer simulation and geopolitics.
Metadata friction and the impossibility of perfectly representing each molecule of the atmosphere in a global model mean that climate science can provide, at best, only a provisional understanding of Earth's ever-changing climate. Instead of a definitive picture, climate science produces a range of estimates and possible scenarios. Edwards notes how "something closer to the high end of that range—a climate catastrophe—looks all the more likely as time goes on" (p. 355). Controversy exists on that point, both among climate scientists and political actors. Yet, Edwards shows how the climate knowledge infrastructure—interlocking technical systems that in some ways mirror the interlocking systems of Earth's climate—brings "controversy within consensus" among the vast majority of scientists (p. 438).
The newness of Edwards's topics limited his use of traditional archives. During his 15-plus years of research, however, Edwards essentially created his own archive. A Vast Machine uses that array of primary evidence, especially peer-reviewed scientific journals, conference proceedings, and documents published by organizations that constitute the weather and climate science infrastructures. Additionally, Edwards interviewed most first-generation climate modelers. These interviews hold increasing historical importance as these scientists pass away, just as Edwards's own climate science mentor, Stephen Schneider, did this past summer.
Edwards's sharp and detailed analysis throughout A Vast Machine deserves wide readership, and his interdisciplinary topic makes the book applicable to various audiences, especially Annals readers. As might be expected, however, this complicated subject results in a complicated book. The inherent challenges of creating climate science defy simple explanation. A Vast Machine is theoretically rich, occasionally too rich, as jargon and neologisms sometimes constrain clear explanation. Terminology such as "data shimmering" and "infrastructural inversion" might uniquely describe challenges faced by climate scientists, but for some readers, it might add an interpretive layer of confusion to an already challenging subject. Nevertheless, Edwards offers motivated readers—from nonspecialists to policy makers and scientists—a better understanding of computers' necessary role in producing global climate knowledge.
A Vast Machine, like Edwards's first book, provides a metanarrative, this time about climate science epistemology. It offers a thorough, historical explanation of how raw data and computer models co-create an imperfect yet authoritative climate knowledge. Edwards's prevailing message calls for us to embrace uncertainty, both to understand climate science epistemology and to adapt to the changing world climate science predicts. This book is certainly worth reading for the details of that message.
University of California, Santa Barbara
Paul Wonnacott, The Last Good War, Booklocker.com, 2007, 318 pp.
The intertwined themes in this novel could be represented by the cover photograph of an Enigma coding machine with a red banner bearing the Nazi swastika in the background. The first and major theme is the Enigma, beginning with the Polish accomplishments of the 1930s in acquiring a machine and its manuals and reconstructing its internal wiring, and continuing with the code-breaking work at Bletchley Park during World War II. Other themes are the contributions of Polish fighter pilots to the Battle of Britain and the role of Polish tank squadrons on European soil after D-Day.
The main fictional characters are Anna Raczynska, an undergraduate student at Poznan University; Stanislaw Ryk, an officer in the Polish Air Force and Anna's childhood friend; and Kazimierz Jankowski, a Polish cavalry officer. In the mid-1930s, Anna joins the Special Meteorological Project, which is a cover for the Polish cryptographic work. Anna meets Kaz through a mutual friend, and they are married a few weeks before the outbreak of war in 1939 but are soon separated by the mobilization of the Polish army. Anna and Ryk escape to Denmark in Ryk's World War I German triplane and eventually arrive in England, where Anna begins to work at Bletchley Park and Ryk joins a British fighter squadron. Kaz, after being captured by the Soviets and surviving the Katyn Forest massacre, finds his way to England via Egypt and participates in the war in Europe as a tank commander. Wonnacott, an economist who has held academic positions at Middlebury College, the University of Maryland, and Columbia University, uses these fictional events as a framework for an account of code breaking during World War II and the war in Europe.
In the last chapter, "Epilogue: History, Fiction, and Lies," the author discusses the problems of having fictitious characters interact convincingly with real people in historical events. He introduces his task with these words: "But there is no denying it, much of this book is not exactly true… Nevertheless, the main story is historically accurate"—words that he justifies in an informative discussion of the persons and events appearing in the novel. (Annals readers might begin by reading this last chapter first and then again after reading the book from beginning to end.) In my opinion, this last chapter would have been enhanced by the inclusion of a short annotated bibliography referencing both code breaking and military history.
I enjoyed The Last Good War for its merits as a work of fiction and for its insights into actual historical events, and I recommend it highly to any person interested in World War II.
University of Alberta
Contact department editor Hunter Heyck at firstname.lastname@example.org.