pp. 3-4
Ted Nelson (1974)
This year marks the 50th anniversary of ENIAC. While ENIAC was not a stored-program computer, it has come to represent the beginning of the digital era. Before ENIAC, there were other machines and projects called computers and computation. Perhaps the most important (and most overlooked) of these were the early 20th-century analog machines. It is these machines that this issue of the Annals takes up.
Little attention has been given to the diffusion of computer technology (and to the innovations that often result from it). Starting in 1925 with a continuous (or product) integraph, Vannevar Bush and his students at the Massachusetts Institute of Technology (MIT) developed a series of analog machines. Puchta (this issue) takes up the product integraph, part of the analog tool set Bush and his students worked out between World Wars I and II. This series of machines, which Owens describes,1 ended with the differential analyzer, an analog machine designed to solve complex differential equations. After 1931, the technology spread: Other machines were built in the United States and abroad. This technology also informed other technological projects (for a discussion of Bush's Memex machines, see Burke2 and Nyce and Kahn3,4).
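To suggest what a differential analyzer did, here is a minimal numerical sketch (mine, not the article's, and not how the mechanical machine itself worked): the analyzer solved an equation by interconnecting wheel-and-disc integrators in a feedback loop, and the loop below mimics that wiring digitally for the simple oscillator y'' = -y.

```python
# Illustrative sketch only: a differential analyzer "programmed" an
# equation by physically connecting integrators. Here, two numerical
# integrators in a feedback loop solve y'' = -y, whose solution with
# these initial conditions is y(t) = cos(t).

dt = 0.001
t, y, dy = 0.0, 1.0, 0.0       # initial conditions: y(0) = 1, y'(0) = 0
while t < 3.14159:             # run the "machine" out to roughly t = pi
    ddy = -y                   # the feedback connection supplies y'' = -y
    dy += ddy * dt             # first integrator: y'' integrated to y'
    y += dy * dt               # second integrator: y' integrated to y
    t += dt

print(round(y, 2))             # near cos(pi) = -1
```

On the actual machine the integration was continuous and mechanical; the time-stepped loop only mimics the way the integrators were wired together to embody the equation.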
Engineers at the Moore School at the University of Pennsylvania were the first to duplicate Bush's differential analyzer. This machine was completed in 1935. General Electric built other machines in Schenectady, New York, in 1943 and at the University of California at Los Angeles in 1947. The latter machine was the last full-scale differential analyzer built in the United States.
British physicist D.R. Hartree introduced the differential analyzer to England. In 1933, he worked with Bush's analyzer group at MIT, and by 1935 he had the funds to build an eight-integrator machine. Bowles (this issue) surveys the 1930-1945 machines and discusses their reception on both sides of the Atlantic. Small5 has discussed the 1940-1975 analog machines.
Norway and Sweden also built differential analyzers. There were also French, German, and Russian differential analyzers. (There is little published in English on these machines, and not all of them can be traced to Bush.) Like Hartree, Rosseland, a professor at Oslo's Theoretical Astrophysics Institute, went to MIT in 1933. By 1939, Rosseland had put together a 12-integrator differential analyzer. Holst (this issue) describes Rosseland's career and the Oslo analyzer. Johansson (this issue) traces the history of Swedish analog machines. He also looks at how analog machines were used in industry—a topic on which little has been written.
Because digital computers and computation have been so successful, they have shaped how we think about both computers as machines and computation as a process, so much so that it is difficult today to reconstruct what analog computing was all about. Owens (this issue) describes how a computing center at MIT, and by extension computing itself, was redefined (from analog to digital). Owens argues that this shift succeeded because basic terms and categories such as "speed" and "efficiency" were redefined in the process. Tympas (this issue) makes much the same point.
These shifts and changes in rhetoric and vocabulary have led to a rewriting of history. It is a history in which digital machines can do things "better" and "faster" than other machines. In this history, formalism (the algorithmic) yields stronger, more robust descriptions than any other kind of representation. Project Whirlwind's shift from analog to digital has often been cited as proof of this.6 In short, digital machines have come to represent the next evolutionary step in machines. The result is that, for many, analog machines have disappeared.
However, what is at stake here is not a matter of speed or precision. Rather, it is an argument about what can be rendered and understood through a machine that does computation. Advocates of digital machines believed they had speed and precision on their side. Those who worked with analog machines worried about the reduction (from phenomenon to model to equation and computation) that digital computation required. It was this trade of realism (point-by-point correspondence between the thing itself and the computer) for the technical advantages of speed and precision that troubled those like Bush who believed in analog machines.
For Bush, analog computers could parallel, say, the process of thought, visualizing and exploring data directly without resorting to algorithms. With analog machines, there are few or no steps between natural objects and the work and structure of computation.7 This is why a small group of mathematicians and computer scientists still works with analog machines, and why their research has both pragmatic and theoretical yield.8,9 In short, analog machines continue to raise questions about the relationship computation should have to the objects it models. This challenges the digital paradigm, in which such questions tend to be treated as merely operational (how would we program that?) issues.
Today, computability (what is computable, and how it is computable) is largely settled. In fact, the digital paradigm is now so pervasive that challenges to it are framed (and won) in its terms. Analog machines do not start from the same point. As such, they allow us to ask fundamental questions about what one gains and loses with computation. It has been argued that technology is valuable because it is good to think with. Analog machines take this one step further: They offer us a way to reconsider what we have come to take for granted, that is, how we model and think about objects in the world.
I would like to thank Mark Bowles, Jonathan Mills, and Herb Stahlke for their comments on drafts of this introduction.