The Empty Box
September 2006 (Vol. 39, No. 9), pp. 9-11

Published by the IEEE Computer Society
David Alan Grier, George Washington University

Software is such a fixed presence in our lives that it's hard to imagine a day when it no longer commands center stage.

For many years, I've held the opinion that hardware is a myth. It doesn't exist. It's just a figment of our collective imaginations.
Everything is software. Word processors? Software. Databases? Software. Ringtones? Software. Of course, something has to make the software work, but that entity itself is merely another form of software. A laptop, for example, is merely a condensed form of software. A portable electronic game is software packed so tightly that, like a black hole, it sucks the attention from anyone who dares to come near it.
Of course, I will concede that hardware once existed. It roamed the land in vast herds that stretched from horizon to horizon. The earth shook when these herds began their annual migration from their breeding grounds in California to the great trade shows in Las Vegas.
Sadly, hardware is now completely extinct. It was a victim of overpopulation, the degradation of its prime desktop habitat, and the terrible glaciers of hyperbole that swept out of Madison Avenue in the early 1990s and scoured the earth clean.
Captivated by Programming
My father never shared my ideas about hardware. If he had had his way, I would have begun my career as a hardware engineer. "They are the people who really understand computers," he would say.
An early employee of Univac, Dad was well versed in the concepts of computer hardware. During the early years of my childhood, he filled my days with games and toys designed to teach me how digital circuits operated. He bought me electronic kits, taught me about flip-flops, and explained the operation of core memory. We even spent a wonderful Saturday afternoon together deriving the logic circuit to do binary addition. It was interesting at the time, but it didn't stick.
I was more captivated by the idea of programming. Dad had given me a copy of a program, roughly a dozen cards in length, that would print a slogan on a Univac printer. In the version he gave me, the slogan read "Hello David," but I quickly learned how to modify it. Using a key-punch machine, I would copy the key card up to a certain point. Then, I would type a new slogan. I produced "Happy Burthday Peter" for my brother and "I Lov you Amy Sue" for my sister. (Spelling was not my strongest skill at the time.)
On one occasion when I was modifying the program, I either started the phrase at the wrong point or I typed something too long. When we loaded the program into the computer in the machine room at my father's office, it caused the printer to make a terrible noise and shoot out page after page of paper. The operator saw the growing pile of green and white paper on the floor and panicked. Several minutes passed before he was able to bring the computer under control. Once things were again running normally, I was temporarily banned from the machine room. Although I was slightly embarrassed by the mistake, I was exhilarated as well. "This is the exciting part of computing," I thought. "This is what I want to learn."
So, rather than following my father's guidance, I shunned the study of circuit diagrams and voltage meters. Instead, I spent my time at Dad's office in front of the card punch with a programming manual on my lap. I tried to learn each statement of a programming language and determine what I might do with it. I had more failures than successes, but I slowly progressed through the book. I will confess, though, that none of the successes was quite as dramatic as my initial encounter with the printer.
The Birth of an Industry
On one trip to Dad's office, he told me that IBM had recently announced it was going to sell software. "They used to give it away for free," he said, "and now they want to sell it." I had not heard the term "software" before and didn't know what it was. For some reason, I pictured a bin of nails, the kind made of rubber that you can buy at a neighborhood hardware store. Dad explained that "software" was another word for "programs." "It will be interesting to see if they can make a real business out of it," he added.
On 23 June 1969, IBM announced that it would "unbundle" its software and sell it as a product. Although most observers acknowledge that a few software firms operated in the late 1950s and early 1960s, they mark that date as the start of the software industry. "The industry was born," commented an early software entrepreneur, "almost literally like a rib out of IBM."
Early efforts
During the late 1960s and early 1970s, IBM had a tremendous influence on the new software business, as IBM mainframe users were the largest market for the new industry. Nonetheless, it was no more able to shape that industry in its own likeness than my father was able to push my interests in the direction of hardware. The software business was shaped by a group of young entrepreneurs who had to define how the business operated, how it related to customers, and even what software was as a product.
IBM originally tried to identify the new software firms by the acronym ISV, which stood for independent software vendor. "The word 'independent' meant non-IBM," noted a leader of Software AG, one of the software firms to emerge in the early 1970s.
Clearly meant to distinguish the new companies from IBM, the term had a bit of a demeaning quality to it. It suggested that the new firms were not only independent of IBM but also independent of new technologies, new developments, and even of each other. From IBM's perspective, only one firm received the title "software vendor" without the modifier "independent." That firm was, of course, IBM itself.
The challenge for these firms was to erase the word "independent," to show that they represented a unified industry with common concepts, ideas, and ways of doing business. The first thing they had to do was define software's nature as an item of exchange. Some suggested that it was a service. Others considered it to be a manufactured good. A third cohort argued that it was an entirely new kind of product.
This debate was far from academic, for it determined how the companies could raise capital. If they could argue that software was an asset or that it created assets, they could use it to finance their companies. If they could not make that argument, then these new companies were nothing more than a collection of independent programmers with no way to assemble capital.
The young CEO of Fortex Data Corporation, a startup software company, recalled that a banker refused to give the company credit by claiming, "You don't have any assets here, nothing of value." Fortex had customers; it had contracts; it had a staff of able programmers. All these had value, but the bank was looking for something it could claim if the company failed. It took repeated efforts to convince the bank that the firm had assets. "Eventually, we had a $150,000 loan from the bank," reported the CEO, "and it was all secured by our accounts receivable. That was the only thing they would loan against."
Defining the endeavor
New businesses often define their field of endeavor by forming a trade association around their common interest. Examples include the National Readymix Concrete Association and the Machine Rulers, Bookbinders, Printers and Kindred Trades Overseers Association. Such organizations define standard practices, recommend accounting methods, and advocate supportive legislation. In 1969, no one had formed such an organization for software developers, so the developers turned to an organization with similar interests, the Association of Data Processing Services Organizations, commonly known as ADAPSO.
Formed in 1961, ADAPSO also had been a rib taken from the sleeping body of IBM. Through the early 1950s, IBM had operated a service bureau, an organization that provided data-processing services to small companies that did not want to lease computers or punched-card equipment. IBM divested itself of this bureau in 1956 as part of the resolution of an antitrust suit.
The newly independent company, now called the Service Bureau Corporation, found itself competing with a group of accounting firms, banks, and companies that did the payroll and performed bookkeeping services for small businesses. "It was such an entrepreneurial industry," recalled one observer. "All over the country, in every metropolitan area of any size, there was a little company springing up to provide this service."
Many of these service firms did not use computers. Some used punched-card tabulators. Some had nothing more than accounting machines. In 1961, they found a common interest in the services they provided. A decade later, they found a common interest with the new software industry. "Some of the software guys started showing up at ADAPSO meetings," remembered one participant, "because there were little bits of the program that were of interest to us."
Reshaping the information industry
The "little bits" of the ADAPSO program that were of interest to software vendors increased rapidly during the 1970s. By the middle of the decade, a substantial fraction of the group was engaged in creating and selling software. In 1977, the software vendors created a subcommittee on the image of their industry. The group's goal was "to completely flip around" their relationship with IBM. They wanted software to be at the center of the data-processing industry with IBM merely one of several independent suppliers of hardware.
One member of that subcommittee was Rick Crandall, the founder and CEO of Comshare, who had come not from the software industry but from the computer services business. However, as he came to know the software developers, Crandall decided that he wanted to help expand the software industry.
"We designed a game plan," Crandall recalled, "where we would set up regular and continuous appointments with the press." They started with Business Week and moved to Fortune, Forbes, Barron's, and The Wall Street Journal, then they started the cycle again. For three years, the committee members offered topics they thought were newsworthy and received rejections in return. Finally, on 1 Sept. 1980, their efforts were rewarded with a cover story in Business Week. "There was a picture of a computer box with the top opened up," Crandall said, "and the title of it was 'The Empty Computer,' meaning that the computer was empty without software inside."
Nothing happens in a moment, no matter how dramatic that moment may be. Crandall does not claim that the Business Week cover immediately changed the public perception of the software business, but it "opened the way for many more articles." Step by step, these articles helped reshape the information industry. Software moved toward the center of this industry, and hardware moved toward the periphery. Some of us believe that it has disappeared entirely.
Conclusion
Software is such a fixed presence in our lives that it's hard to imagine a day when it no longer commands center stage—when it, too, becomes a myth. Yet it seems to be traveling the same path toward mythology that hardware has already trod. As developers embrace standards, precompetitive agreements, and transparent functionality, the reality of software fades, and there may come a time when nothing is left of it.
Just last week, I saw a hint that the reality of software was slipping away. I was attending one of those events that occur with some regularity in Washington: A company hires an exclusive club and invites a collection of reporters, sympathetic academics, and supportive lobbyists to an expensive lunch. During the lunch, usually just as a tasty but fattening dessert is being served, a leader of the company announces a new product, a new policy, or a new way of dealing with the federal government.
At this event, the dessert was a mocha-flavored cake—not my favorite—and the announcement concerned how one of the larger software companies was going to behave toward its competition. The speech moved predictably from one idea to the next. It promised that the firm would be more transparent, more cooperative, more accountable, and nicer.
In the middle of the talk, the speaker characterized his company's largest product, an operating system, as nearly invisible. "In and of itself, it's not very exciting," he said. "It's like a dinner table without the food or a Broadway stage without the actors."
No one at my table seemed to notice this statement. While consuming their mocha cake, they were far more concerned with whether the company was still hoping to put its own cuisine on the bare table of its operating system or perhaps fill its empty Broadway stage with actors of its own choosing.
I was not too surprised by the idea that a company would like to minimize the importance of a program found on some 90 percent of all personal computers. The statement suggested that the operating system was "the empty box" of our age and that government antitrust regulators might want to look elsewhere to find the computer's real value.
Yet, such a statement was unsettling, as it questioned my basic faith in software. It suggested that someday software would be as invisible as hardware. It proposed, quietly and subtly, that at some point in the future, I would be forced to explain an operating system to some uncomprehending child by saying, "Once we had a thing called software. It was made by a large and wealthy industry and, most important, it was real."
David Alan Grier is the editor in chief of IEEE Annals of the History of Computing and the author of When Computers Were Human (Princeton University Press, 2005). Grier is an associate professor in the Center for International Science and Technology Policy at the George Washington University. Contact him at grier@gwu.edu.