It was a brief moment of heartache. I was reading the contents of a file I had taken from my father's papers. Most of the documents were unrevealing. The file held an employee identification badge from Burroughs Corporation, a copy of his retirement agreement, and a series of letters from colleagues whose time to depart had not yet come. Leafing through the letters, I found generous sentiments and pledges of lasting friendships but no hint of the events that produced such generosity or guaranteed the perpetual bonds.
Tucked among these items was a small piece of notepaper with the old corporate logo printed on the corner. In Dad's handwriting, neat and firm from the days when he had studied mechanical drawing, were a few sentences that described an exchange of favors. In the spring of 1964, Dad had agreed to visit the University of Michigan and speak to electrical engineering students about the development of the B-5000 computer, a machine that was radically different from the traditional design that John von Neumann had articulated nearly 15 years before.
In return, the professor of those students would attend the next meeting of CUBE, the Burroughs users group. At that meeting, he would give a talk on the future of electronic computers and discuss the computer research being done at the University of Michigan. I caught my breath when I saw the name of the professor. It was Bernie Galler, the founding editor of this magazine.
I uttered a brief sigh and brushed away the tear that slipped into the corner of my eye, for I realized at that instant that I had no sympathetic audience for this little discovery. No one would care that, early in 2007, I had found a connection between two important men in my life. Dad had been gone for three years. Bernie had left us in September 2006. I considered telling the Annals editorial board but suspected that it would draw a tepid reaction at best. My siblings would be no better, as neither of them followed our Dad into the world of high technology. For a second, I toyed with the idea of sending a note to my cousin, who played violin in the youth orchestra that Bernie used to support, but thought better of it. She is in her first year of college and already considers me the weird family member. I thought it best not to test my luck.
This discovery was not without its benefits, however, for it provided a useful example of the nature of history and historical work. The discovery of an old letter, even one that connects the activities of two dear friends, cannot be considered history. Such a discovery can provide a thrill that enlivens the often deadening task of examining archival files and deciphering manuscripts, but it is not history, nor the goal of history research. If it were, history would be nothing but a collection of dates and facts. The basic story of history would be a dull and boring narrative. It would start with some event in the distant past and then present a list of things connected by the words "and then." "First the world was formed," it would proclaim, "and then we had people, and then we had cars, and then we had computers."
If we are writing real history, we are proving the worth of our efforts by answering the fundamental question of existence: "So what?" History tells us why we should care about things, why certain ideas are important and others are not.
In the world of technology, history is often reduced to the task of who first conceived an idea or incorporated that idea into a machine. Such work is at least modestly important, as we live in a capitalist society that rewards inventors, and for some, it indeed answers the "So what?" question. However, for most people, such history falls short of the goal. Almost anyone who has read the assertions and counterassertions of those who claim to have invented the stored-program electronic computer can attest that such a story has certain intriguing features, but those features fail to sustain a general interest. Ultimately, most readers tire of the arguments about the origins of the stored program and ask, "Why should we care?" Perhaps some of the stored program's original contributors have not fully received their due from the popular audience, but they share this fate with many innovators: the inventor of the paved sidewalk, the original creator of the scissors, the hybridizer of the first orange carrot.
The question "So what?" demands that we find a context for our study, a collection of ideas to which we can compare the fruits of our endeavor. In pursuing computing history, we have a wide selection of contexts, almost the whole scope of human endeavor. Computers touch so many facets of life that they can be studied within technical, social, economic, political, artistic, commercial, educational, cultural, and even religious contexts. In studying the history of the computer, we have only begun to explore these different contextualizations.
In the early 1970s, which mark the start of serious computer history for all practical purposes, we generally looked at the invention of the computer within a technical context. Since that time, we have broadened our horizons to include business and political contexts. Most of the remaining contexts have been only lightly explored.
Few writers, for example, have looked at the development of the electronic computer within the context of labor. We know that the workers of the 1950s were concerned that the computer would put them out of work. One of the early appearances of the computer in popular literature, William Marchant's Broadway play Desk Set, expresses some of that anxiety about the relationship between a computer and the employees in a company. The basic tension in the play, beyond the usual romantic concerns, dwells on the fears of a library staff that believes they are being replaced by a computer named the Electron-Magnetic Memory and Research Arithmetical Calculator or "Emmarac" for short.
At the climax of the play, everyone is mistakenly fired by Emmarac but quickly rehired when the error is uncovered. Emmarac "was not meant to replace you," affirms the computer engineer in the story. "It was installed to free your time for research—to do the daily mechanical routine." 1
Marchant's play is not a great work of literature, even though it was eventually made into a movie (1957) with Katharine Hepburn and Spencer Tracy. "The Desk Set could not have been more mechanical," wrote a prominent reviewer, "if Mr. Marchant had fed his formula into the electronic brain and waited ten seconds for the script." 2 In particular, Marchant failed to foresee the cycles of innovation and replacement that would continually test the people who worked with these machines. He didn't see that clerical workers might be replaced by a computer two, four, or six times over the course of their careers.
The computer has had a tremendous impact on the division of labor, and the division of labor has shaped the development of computing machines. Once, computers were programmed by engineers or scientists. At the ElectroData Company of the 1950s, an early computer manufacturer that eventually became a division of Burroughs, all programming was done in a section of the company called the Mathematical Division. 3 By the early 1960s, programmers were divided into system programmers and applications programmers. Applications programmers were even starting to find themselves identified with more detailed labels, such as systems analysts or systems developers. These categories not only defined a labor force, they also shaped the nature of computing languages, operating systems, and computer applications.
In 1964, Dad and Bernie were both engineers who worked with computers. Dad did marketing. Bernie did research and teaching. Both belonged to the ACM, the major professional organization for those in the computer field. Dad was an early member and had a four-digit membership number. Bernie was about to become president of the organization. However, 1964 was one of the last years that the two of them would stand on common ground. Bernie would reform the ACM in a way that would make it more responsive to computer researchers and designers. Dad would push computer marketing away from the details of digital technology. He felt that it was his job to explain the benefits of computing services, not the details of computing architecture. In 1966 or 1967, he let his ACM membership lapse and never looked back.
We are poorly motivated if we study history only to understand the actions of fathers, no matter how important or obscure they may have been. Such a motivation easily constricts our vision, and we quickly find that our history is little different from the tales that we tell around a dining room table. Mom starts to reminisce, her brother lets his words flow, and before long, you are listening, once again, to the story of growing up during the Depression. When done well, such tales can be entertaining and help us understand our origins, but when done badly, they pull us into a tight, inward-looking circle and suggest the limitations that hedge our lives more than the opportunities that we would like to have.
Family tales are family tales, even if they connect your ancestors to the great themes of an age or suggest patterns of life that might have been common in hundreds or thousands of other communities. Family tales rise to the level of history only if they can always answer that question, "So what?" Ultimately this means that they cannot be focused on a distant time, or a faraway land, or on people we know not. History is the mirror in which we see ourselves. It gives us some perspective on our lives. It allows us to probe who we are, what issues are important to us, what forces are shaping our lives, and what direction we may be going. Only this can be the answer to the question "So what?"
Computing history gives us an insight that few other forms of history can give. It is deeply rooted in the themes of western society, yet it has successfully crossed cultural boundaries. It is the purview of a small group of technicians and technocrats, yet it is one of the most pervasive forces in modern life. It can provoke the whole scale of human emotions, ranging from fear, anger, and anxiety to love, devotion, and even worship.
Computing history can't be easily done without the kind of support that the Annals has given the field over the past 29 years. Though some cultural historians try to treat the technology as a black box with powerful effects and unknown causes, such work misses a key element of computers. Computers divide the world into two groups: those who understand the technology and those who do not; those who can use computers to shape their world and those who are shaped by others. To view this divide from one side of the fence means that you cannot fully understand the interaction between two very different types of people. You cannot get a good picture of our lives if you think that computing technology is truly universal and treats all people equally.
Dad used to tell a story about a visit to the University of Michigan, a visit that may have been the time that he addressed Bernie Galler's class of electrical engineers. In his talks of the time, he liked to call the Burroughs B-5000 a "non-von Neumann machine," even though it had already acquired the name of a "stack machine." In this talk, he drew some pointed questions from an older man in the audience.
The questioner, who seemed to be a professor, kept trying to push Dad to admit that all computers were universal machines and hence all owed a debt to John von Neumann, even if they were organized along novel lines. Dad tried to rebuff him gently and explain that the B-5000 was truly a different kind of machine. The questioner, who never volunteered his name, interrupted the talk three or four times that evening. At the end, his voice belied a little anger. He left the room before the presentation ended and never came forward to talk with Dad.
Afterward, a group of the graduate students took Dad out for dinner at the Little Brown Jug, a college dive on the edge of campus. It had existed when Dad was a student at the university and had changed little since. The dinner table conversation was lively and fun. The students were interested to know what it was like to work in the computer industry and what opportunities might exist for them. "For an hour, I was at the center of the world," Dad would recall. "I held the kind of job that they all wanted to have."
As the evening came to an end, one of the students asked, "Do you know the guy who raised the questions about the non-von Neumann machine?"
"No," was Dad's response. "I've never seen him before."
The group hushed for a moment. "He was Arthur Burks," the student said quietly. "He was one of the leaders of the ENIAC team and he wrote a rather important paper with von Neumann." 4
I doubt that Dad let his emotions show. He was good with an audience. I suspect that he made some gracious comment and moved the conversation forward on other lines. However, I know that the revelation rattled him. "One moment, I was the center of the circle and the next moment, I'm not," he would later say. "I don't think I offended anyone but I learned that my hold on the attention of those students was quite fragile."
In a field based so thoroughly on knowledge and reason, our place must always be vulnerable. We get shoved out of the circle when we miss a trend, disregard a useful new idea, forget an old lesson, or even fail to recognize a key individual. Further complicating our position is the relentless cycle of innovation that pushes new ideas at us and demands that we keep abreast of the times. This cycle even touches those who spend their time studying the events of the past. They, too, need to keep abreast of the times in order to avoid dismissing some past event that later proves to be important. History tries to help us understand what is important and what is not, but that process is not only about the past. It is about the present, and it is about us. That is the center of history.
This issue of the Annals contains articles on the history of computing that came from a conference sponsored by the Charles Babbage Institute at the University of Minnesota and organized in honor of retiring director Arthur Norberg. These articles, and the conference, are described in an essay by the conference organizer and the new director of the Institute, Tom Misa. In addition, we are pleased to feature Jack Minker's second article on the development of computing at the University of Maryland.