Issue No. 04 - October-December (2006 vol. 28)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/MAHC.2006.63
David Alan Grier , Editor in Chief
The fall of 1980 was a good time to arrive in Seattle. The city had recovered from the Boeing Recession of the 1970s, an economic downturn initiated by the cancellation of the Supersonic Transport project. The Pike Place Market, the old outdoor marketplace on the waterfront, had recently undergone a facelift and was starting to bring visitors downtown. The centerpiece of the market was a seafood stall where the clerks flung whole salmon back and forth across the counter. Almost unnoticed in the multistoried facility was a small coffee shop, Starbucks, named for the first mate of the Pequod, the ship in Herman Melville's Moby Dick.
On the north side of downtown, the University of Washington had just completed a new computer center to house the school's Control Data mainframe. The more environmentally conscious students had suggested that the excess heat from the computer could be used to warm the center's library, which held a nice collection of classic textbooks and a complete set of the first four volumes of the Annals of the History of Computing. A university vice president liked the idea and made a grand statement about how the computers of the future would have to be better integrated into buildings.
Near the entrance to the university was a floating bridge that spanned Lake Washington. The roadway briefly climbed from the water, giving drivers a clear view of Mt. Rainier, and then ran straight across the lake. At the far end of the bridge stood an office building, small and nondescript, that nearly blended into the landscape. If the traffic was moving freely, which was a rare event even in the simple times of the early 1980s, a driver might speed past the building and not see it. That building was the home of the fledgling Microsoft Corporation.
An industry emerges
At the time, the personal computer software industry was a small part of the computer business and seemed to be an unpromising way of telling the story of technological innovation. If you wanted to understand the technical contributions of the 20th century, you looked at the evolution of hardware or, for the more daring, the creative ideas of software. The business of software, especially software for the personal computer, was thought to be a simple and straightforward story.
In the course of the next decade, the story of computer innovation changed radically. The PC software industry replaced the hardware industry as the symbol of innovation.
Shifting to the PC
This issue, the first of two that look at the PC software industry, grew out of a pair of conferences, which were organized by the Software History Institute and Burt Grad—one of the guest editors (along with Paul Ceruzzi) of this special issue. The first was held in Needham, Massachusetts, in May 2004. The second was held the following fall at the Computer History Museum in Mountain View, California. Both of these events were opportunities to interview the founders of the software industry, to learn how these individuals built their businesses, and how they supported technological innovation.
The attendees of these conferences spoke of the tremendous opportunity of the early 1980s and also the tremendous unknowns. They did not know how any PC software industry should operate. They did not know how they were going to solve all the technical problems that they would face. They were not even certain that these small devices were the kinds of things on which they should risk their capital and energy. Yet they were intrigued by these little computers and believed that sometime in the future, they would be important.
In that fall of 1980, I discovered those issues of Annals that were stored in the library that was warmed by the excess heat of a mainframe. These issues told the story of early computing technology: the Binac; Fortran I, II, and III; Zuse's relay machines; and the CDC 6600. They were simple stories—interesting tales of how engineers and programmers contributed to the field.
These issues were edited by Bernard Galler, who was then a professor of computer science at the University of Michigan. Bernie had become interested in computing history over the prior decade and had decided that the early stories of electronic computation should be preserved. He told the story of the founding of Annals in these pages in "How the Annals Came to Be" (vol. 26, no. 1, pp. 4–7; http://doi.ieeecomputersociety.org/10.1109/MAHC.2004.1278846). It is with sadness, tempered with gratitude for all that he has done for this magazine, that I must acknowledge Bernie's passing on 4 September of this year.
Preservation and perseverance
Bernie Galler earned a bachelor's degree in mathematics at the University of Chicago and a master's degree in the same subject at UCLA. He then returned to the University of Chicago, where he completed a PhD in 1955. He joined the mathematics department at the University of Michigan the same year and was a founding member of the Computer Science Department 15 years later. He did important early work on the development of computer languages and created MAD, the Michigan Algorithm Decoder. He retired from the university in 1994. His reputation among the members of the Annals editorial board was one of grace, dedication, and vision. "Bernie was just the nicest person," recalled one long-serving member.
Bernie oversaw eight volumes of Annals. When he left the editorship, the personal computer was becoming an important tool for business. Microsoft was preparing to leave the little building by the bridge and move to its Redmond campus. Personal computer software was starting to be an important industry in its own right, and Starbucks was becoming a global brand. Perhaps the only thing that was not changing about Seattle was the salmon. They still throw them across the counters at the Pike Place Market.