MIT 150th Anniversary and MIT Museum Project Whirlwind Presentation
The Massachusetts Institute of Technology (MIT) celebrated its 150th anniversary in 2011 (http://mit150.mit.edu). The celebration, scheduled between 7 January and 5 June 2011, included hundreds of special events, symposia, publications, and a major exhibition at the MIT Museum.
The MIT 150 Exhibition (http://museum.mit.edu/150) opening at the MIT Museum was the inaugural (7 January) event of the institute's celebration. Running through 31 December 2011, the exhibition consisted of 150 objects representative of MIT's history and ambitions for the future. The exhibition curator, Deborah Douglas, noted that the list of 150 objects was determined by "crowd sourcing" from the MIT community rather than the typical curator-driven approach. Naturally, a number of the objects fell under the exhibition's Analog/Digital MIT theme.
Throughout the year, the MIT Museum also had many presentations on a variety of topics, including the 15 October public event that I attended, entitled "Project Whirlwind, Sage, and Pioneering MIT Computer Projects." A couple dozen participants from the Whirlwind development era in the late 1940s and 1950s were present for the presentation, making the event also something of a Project Whirlwind reunion.
The primary focus of the event was a panel session with Project Whirlwind leaders Jay Forrester and Robert Everett (see Figure 1), moderated by museum director John Durant.
Figure 1. Jay Forrester and Robert Everett at the MIT Museum Project Whirlwind event. (Courtesy of Nalin Springer.)
Durant's first question for Forrester was, "What was Project Whirlwind?" Forrester explained that Whirlwind had a varied history. At first it was to be an analog computer for predicting the controllability of future airplanes—unlike the Link Trainer, which was for training pilots of an existing airplane. After a year or so, they concluded the future-airplane task was not possible with an analog computer. At that point Perry Crawford, who was with the US Navy's Special Devices Division, which was in charge of project oversight, suggested they switch to digital computing. Bob Everett remembered that one day Forrester came by and said, " 'We are now working on a digital computer,' and I said, 'What's that?' "
Forrester talked about the risks inherent in the project. Whirlwind was transferred to the Office of Naval Research, which generally spent only enough money on a project to support one mathematician and an assistant. The Whirlwind project needed 100 times as much money, so funding involved an "annual inquisition." Some people in the Electrical Engineering Department thought computers needed to do decimal arithmetic, and the Whirlwind team had to argue for the efficiencies of binary arithmetic. The average life of a vacuum tube was 500 hours, and their machine would have tens of thousands of vacuum tubes, which, if you do the arithmetic, meant major reliability problems.
Some useful prior efforts existed, however. Coming out of World War II, the MIT Radiation Laboratory had knowledge of pulse circuits and vacuum tubes. The Whirlwind team knew of the early computer work at Harvard and of work with EDVAC at the University of Pennsylvania where John von Neumann pushed the idea that a computer could run on a program stored in its own memory—"a very big breakthrough," said Forrester. (The EDVAC was a serial system, but Whirlwind needed a parallel computer system to handle the speed required for the real-time work they had in mind.)
Mercury delay lines were one memory possibility, but it took a millisecond for a bit of data to cycle from one end of the delay line to the other. A Williams tube, using a 2D grid on a cathode-ray tube, was another possibility, but they were unreliable. Forrester said that he wanted a 3D storage system and first thought about using glow discharge tubes, but he gave that up as impractical. Forrester went through another idea or two before settling on ferrite cores (little doughnuts of ceramic magnetic material) arranged in a 2D array with interconnecting wires so any individual core was instantly accessible. (Multiple such core planes provided access to the bits of a word of computer memory.) Forrester explained that a guy in New Jersey could occasionally produce a ferrite core that had the square-wave on/off property needed for digital computing. He would run his hand through brown powder and say, "That feels square to me." Forrester's team at MIT spent lots of money on research before they understood how such cores could be produced reliably.
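In modern terms, the core-plane scheme is easy to model. The sketch below is illustrative only (the array sizes, class, and names are invented, not Whirlwind's actual circuitry); it shows why a 2D grid of cores with one plane per bit of the word makes any location equally, "instantly" accessible, unlike a serial delay line.

```python
# Toy model of a coincident-current core memory: one 2D plane of cores
# per bit position of the word; a core is selected directly by its X and
# Y drive lines. (Illustrative sketch, not Whirlwind's actual design.)

class CoreMemory:
    def __init__(self, rows, cols, word_bits):
        # One plane per bit position; each core holds 0 or 1.
        self.planes = [[[0] * cols for _ in range(rows)]
                       for _ in range(word_bits)]

    def write(self, x, y, word):
        # Energizing the X and Y lines of each plane affects only the
        # core at their intersection -- random access with no waiting.
        for bit, plane in enumerate(self.planes):
            plane[x][y] = (word >> bit) & 1

    def read(self, x, y):
        # Reassemble the word from the same (x, y) core in every plane.
        return sum(plane[x][y] << bit
                   for bit, plane in enumerate(self.planes))

mem = CoreMemory(rows=32, cols=32, word_bits=16)  # 16-bit words, like Whirlwind
mem.write(3, 7, 0b1010110011110001)
print(mem.read(3, 7) == 0b1010110011110001)  # True
```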
As the Whirlwind team and its government sponsors discovered numerous possible applications for a real-time, stored-program, digital computer, the original flight simulator task went away. Everett added that their original intention was to build a 32-bit machine, but instead they built half a machine—a 16-bit machine.
Their budget became too big for the Navy, and there was a danger that the project would die. However, Perry Crawford saw possibilities for the machine in the air battle emerging circa 1948. The Soviets had produced a nuclear bomb and related bombers. The existing US air defense system was ineffective. Crawford, Forrester, MIT's Jerome Wiesner (later MIT's president and President Eisenhower's science advisor), and George Valley (who was instrumental in the conception of the US Air Defense System) had various interpersonal connections among them that led to Valley becoming aware of the Whirlwind group's work. At the time, the Air Force was not very aware of the possibilities of digital computing for air defense: "We began to talk about a computer to run an air defense system when nobody in the military knew what a digital computer was," Forrester said. However, Valley could see a possible computer-based solution involving Whirlwind, so the Air Force adopted the project with substantial funding. The Cape Cod system was built as a demonstration and was followed by the SAGE system.
An operational system required solving the vacuum-tube reliability problem. They found and removed the cause of 500-hour tube life, extending tube life to 500,000 hours. They added a "marginal checking system" that detected when things were drifting out of spec, and they required that each SAGE center have two parallel systems. This led to 99.8 percent uptime.
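The duplexing arithmetic is worth making explicit. In this back-of-the-envelope sketch, the single-system availability figure is an assumption chosen for illustration; the article reports only the final 99.8 percent result.

```python
# Back-of-the-envelope availability arithmetic for a duplexed SAGE center.
# The single-system figure is invented for illustration.

single = 0.96                     # assumed availability of one system
duplex = 1 - (1 - single) ** 2    # center is down only if both systems fail
print(f"{duplex:.4f}")            # 0.9984 -- two modest systems give ~99.8% uptime
```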
At this point, museum director Durant began to take questions from the audience. The first question was, "Did you realize at the time that this was an amazing thing that was setting the stage for the future of computing?" Everett replied that they knew that lots of things could be done if they solved the technical problems of building such a fast, reliable computer. They didn't anticipate the spread of computers we see today, but they had a lot of ideas about what such a computer could do.
Forrester added that in 1948, before any such computer had functioned, they prepared a report on the future of computers in the military that culminated in a chart covering a 15-year period (1948–1963) across the top and 10 areas of military application down the side, with every square filled in with what could be done (including time for politics and administration) with the associated costs. This forecast was done before any reliable, high-speed, general-purpose, digital computer had yet operated. Forrester said that the air defense work was completed by the MIT Lincoln Laboratory a couple of years ahead of the schedule in the 1948 forecast. The total estimated cost for the efforts on the chart was $2 billion.
The next question was, "Did you imagine miniaturization such as we have today in our cell phones and wristwatches?" Everett said, "Yes, we didn't know how to do it, but it was clear that computers had the remarkable characteristic that they got better as they got smaller."
Forrester was asked about software and the overall system thinking, as opposed to the hardware work he and Everett had mostly been describing. Forrester explained that they had to develop a large team of programmers that hadn't existed before. (Eventually, RAND took over the programming.) Along the way, he observed that "young women who studied music at Wellesley" were good candidates for programmers—maybe something having to do with "logic and organization of symbols." Everett noted that these women were smart and had no preconceptions about computing. Forrester emphasized that it is "important to have a team that don't know you can't do it." Forrester also explained that people were coming back from the war under the GI bill and applying for graduate studies in electrical engineering at MIT. He would review their applications looking for people to recruit to the Whirlwind project.
From the audience, John Frankovich (an early software innovator on Whirlwind) said that as the computer became operational, the separately run Scientific and Engineering Computation Group supported by the Navy allowed students and others from around the campus to develop lots of software, including the first algebraic compiler. Forrester noted that a number of new fields got started based on Whirlwind, such as oil field exploration analysis. Frankovich added that pioneering work was done on Whirlwind in the fields of numerical milling machine control, studies of radio station radiation patterns, TV frequency assignment calculations, photographic lens design, and so forth. Many students used Whirlwind "for lots of non-SAGE stuff" and then went elsewhere to "spread the gospel."
Jack Gilmore, an early Whirlwind software designer, related that Charles Adams, a genius at imagining what software could do, had a team of men and women programmers. After hours they would go into the basement of the Barta Building (where Whirlwind was built), and Adams would hand out assignments for projects that needed to be done while the others would critique the work, optimizing the resulting programs.
Another question from the audience was about Whirlwind and priority interrupts to handle real-time events that happened while other parts of a program were running. Everett explained that Whirlwind did not have a hardware priority-interrupt system. Rather, the system design was based on putting tasks to be done in buffers, and the program frequently looked over the various buffers, choosing which task to do in appropriate priority order. Frankovich noted that the TX-2 at MIT's Lincoln Laboratory (where the SAGE work moved from MIT, before the project later moved to MITRE) implemented a 33-level hardware priority-interrupt system with state saving. Many visitors from computer vendors around Boston saw that, and it became a standard approach for commercial computer products.
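In modern terms, Whirlwind's approach was polled, prioritized task queues rather than hardware interrupts. A minimal sketch follows; the buffer names and priorities are invented for illustration, not taken from Whirlwind's actual design.

```python
# Sketch of real-time scheduling without hardware interrupts: devices
# deposit work into per-source buffers, and the main program polls the
# buffers in priority order. Names and priorities are illustrative.

from collections import deque

buffers = {                      # highest priority first
    "radar_input": deque(),
    "display_refresh": deque(),
    "logging": deque(),
}

def post(source, task):
    buffers[source].append(task)   # a device "signals" only by queuing work

def run_once():
    # The program frequently looks over the buffers and runs the most
    # urgent pending task -- priority by polling order, not by hardware.
    for source, queue in buffers.items():
        if queue:
            return queue.popleft()()
    return None

post("logging", lambda: "log written")
post("radar_input", lambda: "track updated")
print(run_once())  # track updated -- radar outranks logging
```

The TX-2's 33-level hardware scheme moved this arbitration into circuitry with automatic state saving, removing the need for the program to poll at all.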
Gilmore told another story of visiting a Canadian laboratory in those early days. When he explained to people about the possibility of single-person interactive use of Whirlwind, he was asked to "leave the room" by the lead person who was incredulous that Gilmore would say that individuals would ever have a whole computer to themselves.
Frankovich also noted that military sponsorship of Whirlwind and activities such as the TX-2 development had major benefits such as development of the first virtual memory, operating system innovations, and early experiments that led to the Internet.
Forrester noted that he left computing in 1956 because he felt computing's pioneering days were over—"more happened in the decade from 1946 to 1956 than in any decade since," he said. Everett concurred, "Like a child who learns more in the first two years of his life than in the rest of it." Everett concluded that "Jay left me with the secondary task of finishing the SAGE system."
The Q&A session ended with members of the audience putting their names in a bag for a drawing to give away a dozen copies of Tom Green's Bright Boys: 1938–1958, Two Decades That Changed Everything (AK Peters, 2010), a book that covers the history of the Whirlwind system. Tom Green was in the audience, and his publisher had provided the dozen copies of the book. He also provided a copy for the museum's collection that Director Durant asked all the Whirlwind pioneers present to sign. The book describes in fascinating detail what was only touched on during the session with Forrester and Everett.
Following the Q&A session, Deborah Douglas led a tour past some of the computer aspects of the MIT 150 Exhibition (http://museum.mit.edu/150/theme/analogdigital-mit), including Vannevar Bush's 1931 differential analyzer, a core memory unit from Whirlwind (see Figure 2), the TX-0 computer console, a Spacewar console, a SAGE system display, and the Multics system's shelves of documentation.
Figure 2. Whirlwind core memory unit on display at the MIT Museum. (Courtesy of Nalin Springer.)
While describing the Whirlwind core memory unit, Douglas mentioned that when the museum first acquired this unit, something looked wrong to her in terms of its being an original unit—it didn't have the patch wires and incremental fixes she expected to see on an original unit. After she finished her story on the complicated provenance of this particular object, Everett told her softly that the first Whirlwind units were also completely tidy. To this day, the Whirlwind pioneers are proud of the skill and craft with which they accomplished their breakthrough in computing.
Even though the MIT 150 Exhibition was only on display through the end of 2011, the MIT Museum remains a wonderful place to visit while touring Boston. There are always fascinating exhibits, the price is reasonable, and it can be seen within a couple of hours. In addition to the MIT history and technology, Arthur Ganson's kinetic art is a must-see exhibit.
retired from BBN in 1995, after working as a computer programmer, technical manager, and general manager. Contact him at email@example.com.
IPSJ 4th Information Processing Technology Heritage Certification Ceremony
The 4th Information Processing Technology Heritage Certification Ceremony of the Information Processing Society of Japan (IPSJ) was held on 6 March 2012 at the Nagoya Institute of Technology, Japan, during the 74th IPSJ National Convention. This time the IPSJ certified 12 additional artifacts:
• parts of a differential analyzer of the Institute of Industrial Science, the University of Tokyo (1952–1955),
• HIPAC MK-1 parametron computer (1957),
• NEAC-1101 parametron computer (1958),
• MELCOM-1101 transistor computer (1963),
• ASPET/71 optical character reader (1971),
• Busicom 141-PF electronic calculator (1971),
• NEAC System 100 office computer (1974),
• MCC board with LSI packages for FACOM M190/Amdahl 470V6 (1976),
• automated teller terminal AT-20P (1977),
• FAST LISP of Kobe University (1978–1979),
• EVLIS Machine of Osaka University (1978–1979), and
• T1100 laptop computer (1985).
(The manufacturing year of each artifact is listed in parentheses following the item.)
The IPSJ also certified the NTT History Center of Technologies (the exhibition area for NTT's historic computers) as an additional satellite museum of historical computers. So far the IPSJ has certified 55 artifacts and six satellite museums.
During World War II, the Tokyo Imperial University's Aeronautical Research Institute researched mechanical differential analyzers and built a prototype. After the war, Masaru Watanabe and his colleagues continued to research mechanical differential analyzers using the prototype. They developed a basic system with four integrators in 1953 and completed a large differential analyzer with eight integrators and automatic tracking capability in 1955. They achieved a high accuracy of 0.03 percent and used the machine for orbit calculations of small rockets. Existing parts of a torque amplifier, an adder, and an optical head for automatic curve tracking were certified (see Figure 3) and will be exhibited at the University of Tokyo's Institute of Industrial Science.
Figure 3. Parts of the differential analyzer of the Institute of Industrial Science, University of Tokyo, are preserved in the institute's archives: (a) torque amplifier, (b) adder, and (c) optical head.
Two parametron computers, the HIPAC MK-1 and NEAC-1101, were also certified. The former, Hitachi's first computer, was built in 1957 and is exhibited at the Hitachi Research Laboratory, Tokyo; it is the oldest existing parametron computer. The latter, NEC's first computer, was built in 1958 and is exhibited at the NEC Fuchu Plant in Tokyo. The parametron boards of both computers were displayed in the heritage ceremony room (see Figure 4). The parametron is a logic element invented by Eiichi Goto of the University of Tokyo in 1954.
Figure 4. Parametron boards of HIPAC MK-1 and NEAC-1101.
ASPET/71 is a super-high-performance optical character reader developed in 1971 through a joint project of the Electro-Technical Laboratory (ETL) and Toshiba, led by Taizo Iijima of ETL. The machine was designed based on a new visual pattern-recognition theory invented by Iijima. It could accurately read low-quality printed alphanumeric characters. The practical use of Chinese-character readers and automatic ZIP-code reading and sorting machines largely depended on this technology. ASPET/71 is preserved at the National Museum of Nature and Science in Tsukuba (see Figure 5).
Figure 5. ASPET/71 super-high-performance optical character reader built in 1971 and preserved at the National Museum of Nature and Science in Tsukuba, Japan. (Courtesy of National Museum of Nature and Science.)
The Busicom 141-PF was the first electronic calculator equipped with an Intel 4004 microprocessor. The Intel 4004, developed in 1971, was the first commercially available general-purpose microprocessor. It was originally developed for a calculator made by Busicom, a Japanese calculator company, and was jointly developed by Intel engineers and Masatoshi Shima, then of Busicom. The certified Busicom 141-PF was displayed in the heritage ceremony room.
Two Lisp machines, FAST LISP of Kobe University and the EVLIS machine of Osaka University, were also certified. Lisp machines were researched and developed at laboratories and universities in Japan from the late 1970s to the 1980s. FAST LISP was implemented in 1978–1979 by Kazuo Taki and his colleagues at Kobe University using a microprogrammed Lisp interpreter on bit-sliced processors. The EVLIS machine was a parallel-processing Lisp machine with multiple processors, implemented in 1979–1982 at Osaka University by Hiroshi Yasui and his colleagues. Taki and Yasui attended the ceremony and received certification plaques. FAST LISP will be exhibited at Kobe University, and the EVLIS machine is preserved at Osaka University.
Further information on the IPSJ Heritage Certification program is available at http://museum.ipsj.or.jp/en/heritage/index.html.
is a principal at the Computer Systems and Media Laboratory, Japan. Contact him at firstname.lastname@example.org.
2011 NEC C&C Foundation Prize Ceremony
Since 1985, the NEC C&C Foundation has presented its annual C&C Prizes to recognize distinguished persons who have made outstanding contributions to R&D activities in the integration of computers and communications technologies. The 2011 C&C Prize Ceremony was held on 28 November 2011 at the ANA Intercontinental Hotel Tokyo. The NEC C&C Foundation awarded the 2011 C&C Prize to Akira Yoshino and to Norman Abramson and Robert M. Metcalfe (see Figure 6). The citations read as follows:
• Akira Yoshino: For pioneering contribution to the development and commercialization of the lithium-ion battery.
• Norman Abramson and Robert M. Metcalfe: For outstanding leadership resulting in the invention, standardization, and commercialization of Internet packet access, beginning with ALOHAnet and then Ethernet.
Figure 6. 2011 C&C Prize recipients with President Hajime Sasaki: (a) Akira Yoshino, (b) Norman Abramson, and (c) Robert M. Metcalfe. (Courtesy of NEC C&C Foundation.)
Hajime Sasaki, the president of the NEC C&C Foundation, opened the ceremony with a welcome speech. Yasuharu Suematsu, the chairman of the award committee, recognized the 2011 C&C Prize recipients. Sasaki then presented the prize to the three recipients, each of whom delivered an acceptance speech.
Akira Yoshino is a fellow of the Asahi Kasei Corporation, Japan. He played an extremely important role in realizing the high-power, small rechargeable lithium-ion batteries (LIBs) widely used today in mobile and personal information devices. He started his research on LIBs in 1981, seeking suitable materials for electrodes, which was the most important development issue at that time. He succeeded in making an operational test model of this new secondary battery using electroconductive polyacetylene as the negative electrode and lithium cobalt oxide (LiCoO2) as the positive one. This was the world's first LIB using a non-aqueous electrolyte, showing a high electromotive force of approximately 4 V. Although this cell was functional, the polyacetylene limited the available capacity and stability.
Yoshino thus searched for new carbonaceous material to use as the negative electrode and found that carbonaceous materials with a certain crystalline structure provided greater capacity. In 1985, he successfully fabricated a secondary battery based on this new combination of component materials, enabling for the first time stable charging and discharging over many cycles for a long period. This was the birth of the current LIB. During his acceptance speech, he showed a video of an experiment verifying the safety of his prototype battery, showing that it would not explode.
Norman Abramson is a professor emeritus of the University of Hawaii, and Robert M. Metcalfe is a professor of innovation at the University of Texas at Austin. They made decisive contributions to the development of ALOHAnet, Ethernet, and related basic local area network (LAN) technologies. Ethernet is the most widely used LAN standard and has had an immense impact on information technology. The ALOHAnet protocol, adapted for Ethernet as carrier sense multiple access with collision detection (CSMA/CD), has had a significant impact on information technology since the 1970s.
In 1968, Norman Abramson moved from Stanford University to the University of Hawaii, where he directed the development of the ALOHAnet, a wireless data network connecting computer facilities on the Hawaiian islands using the ALOHA channel. The ALOHA channel used a shared-medium access method over an ultra-high-frequency (UHF) wireless network, designed with a simple but effective way of dealing with data-packet collisions. In addition, it led directly to CSMA, CSMA/CD, and CSMA/CA (collision avoidance), which were later incorporated into successive generations of standards for Ethernet and Wi-Fi.
At the end of 1970, the ALOHAnet was complete. It connected the Hawaiian islands and was the world's first wireless packet-data network. In 1972, the ALOHAnet was connected to the ARPANET in North America using a satellite channel. In 1973, the first network to use random-access packet transmission over a satellite channel was put into experimental operation using the NASA ATS-1 satellite. It included the University of Hawaii, the NASA Ames Research Center in California, the University of Alaska, Tohoku University in Sendai, the University of Electro-Communications in Tokyo, and the University of Sydney. This network, called PacNet, operated at 9,600 bits per second (bps) on an ALOHA channel using low-cost satellite earth stations.
The ALOHA protocol belongs to the data link layer (OSI Layer 2); it differs from point-to-point protocols and is today classified as a medium access control (MAC) protocol for a shared medium. It is based on the arbitration technique for connecting multiple network terminals first implemented in the ALOHAnet. Later, this protocol was optimized for wired systems and used for Ethernet by Metcalfe as CSMA/CD.
Ethernet was built on Abramson's ALOHAnet idea of wireless multiple access using randomized retransmissions and developed further as high-speed CSMA/CD for use in a LAN. Early Ethernets were able to run much faster than the ALOHAnet because they transmitted on copper cables instead of wirelessly. Decades later, Ethernet moved back to wireless (Wi-Fi) and today again looks much like the ALOHAnet. Ethernet in its many forms has now become the packet plumbing of the Internet.
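The randomized-retransmission idea at the heart of ALOHA (and, via CSMA/CD, of early Ethernet) can be illustrated with a toy simulation. This is pure ALOHA at its simplest; all parameters below are invented for illustration, not drawn from the ALOHAnet itself.

```python
# Toy simulation of the pure-ALOHA idea: stations transmit whenever they
# have data; packets that collide are re-sent after a randomized backoff.
# All parameters are illustrative.

import random

def aloha(num_stations=20, slots=10_000, p_new=0.02, seed=1):
    random.seed(seed)
    backlog = [0] * num_stations        # slots each station still waits
    delivered = attempts = 0
    for _ in range(slots):
        senders = []
        for s in range(num_stations):
            if backlog[s] > 0:
                backlog[s] -= 1
                if backlog[s] == 0:
                    senders.append(s)   # retransmission now due
            elif random.random() < p_new:
                senders.append(s)       # fresh packet to send
        attempts += len(senders)
        if len(senders) == 1:
            delivered += 1              # success: exactly one transmitter
        else:
            for s in senders:           # collision: randomized retry later
                backlog[s] = random.randint(1, 16)
    return delivered, attempts

ok, tries = aloha()
print(ok, tries)  # many attempts are lost to collisions on a shared channel
```

Carrier sensing (CSMA) and collision detection (CSMA/CD) improve on this by listening before and during transmission, which is why wired Ethernet achieved much higher throughput under load.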
While pursuing a doctorate in computer science at Harvard University, Robert Metcalfe worked on MIT's Project MAC and then, in 1972, at the Xerox Palo Alto Research Center (PARC), where he developed a coaxial-cable LAN system. He was a team leader for networking Altos to a laser printer and to the Internet. He invented the networking system initially called the Alto ALOHAnet. Ethernet used randomized retransmission as in the ALOHAnet, but it improved packet throughput under load with CSMA/CD. Together with David Boggs, Metcalfe published the 1976 article "Ethernet: Distributed Packet Switching for Local Computer Networks" in Communications of the ACM.
Metcalfe encouraged Xerox to freely license its Ethernet patents and to cooperate with Intel and DEC to create a standard LAN system through the IEEE. As a result, 10-Mbps Ethernet was submitted to the new IEEE Project 802 in 1980, and the IEEE subsequently standardized IEEE 802.3 CSMA/CD in 1982. Afterward, the 10-Mbps 10BASE-T Ethernet established its position as the global LAN standard. In his acceptance speech, Metcalfe expressed his thanks to the industries and people who contributed to putting Ethernet into practical use and disseminating it.
Each year, C&C Prizes are awarded to no more than two groups. Recipients are given a certificate, a plaque, and a cash award (¥10,000,000 for each group). The foundation has awarded the prize to 88 people since 1985.
See additional details about the 2011 C&C Prize Ceremony at http://www.candc.or.jp/en/2011/ceremony.html.
Contact Akihiko Yamada
IEEE Packet-Speech Milestone Celebrated at MIT Lincoln Laboratory
In 1971, Jim Forgie of MIT Lincoln Laboratory experimented with the two-year-old ARPANET to show the feasibility of sending speech over that first packet-switching network. At the time, in the context of traditional dial-up, full-duplex telephone communications, many people doubted packetized speech, in which packets flow over varying network paths with varying time delays. In 1974 the Advanced Research Projects Agency (ARPA) began a multi-institution packet-speech program, lasting through 1982, that firmly demonstrated the utility of packet speech. This was the initiating technology of what we know today as voice over IP (VoIP) and services such as Vonage and Skype.
The IEEE Milestone program is an IEEE History Committee activity administered through the IEEE History Center.1 Milestones recognize technological innovation and excellence; they are proposed, nominated, and sponsored by an IEEE organizational unit and go through a rigorous vetting process. On 8 December 2011, a Milestone in packetized speech was recognized at the MIT Lincoln Laboratory, which had been the central player in the ARPA packet-speech program. The plaque read:
First Real-Time Speech Communication on Packet Networks, 1974–1982
In August 1974, the first real-time speech communication over a packet-switched network was demonstrated via the ARPANET between MIT Lincoln Laboratory and USC Information Sciences Institute. By 1982, these technologies enabled Internet packet speech and conferencing linking terrestrial, packet radio, and satellite networks. This work in real-time network protocols and speech coding laid the foundation for voice over Internet Protocol (VoIP) communications and related applications including Internet video conferencing.
This IEEE Milestone was sponsored by the IEEE Signal Processing Society and the Boston Section of the IEEE. The public dedication was attended by more than 100 people, including many of the 1974–1982 participants, several of whom came from across the country (see Figure 7). Welcomes were given by Karen Panetta, chair of the IEEE Boston Section, Mostafa Kaveh, president of the IEEE Signal Processing Society, and Eric Evans, director of MIT Lincoln Laboratory.
Figure 7. MIT Lincoln Laboratory packetized-speech project participants who attended the IEEE Milestone celebration. From left to right, Gerald O'Leary, Eric Evans, Mostafa Kaveh, Harold Heggestad, Peter Blankenship, Cliff Weinstein, Steve Blumenthal, Stephen Casner, Randy Cole, Earl Craighill, John Makhoul, Don Johnson, Peter Staecker, Robert Kahn, Joe Tierney, William Kantrowitz, Duane Adams, Connie McElwain, Carma Forgie, Gil Falk, Karen Panetta, and Bruce Hecht.
Cliff Weinstein, leader of Lincoln Laboratory's Human Language Technology Group and a key member of the 1974–1982 research,2 sketched the history of the effort, from Jim Forgie's early feasibility study, through the milestone years, to Bob Gray's July 2005 paper in the IEEE Signal Processing Magazine, which brought the 1974–1982 work to the attention of a 21st-century audience. Weinstein emphasized the multi-institutional composition of the research effort (see Figure 8): they needed people at other locations to demonstrate long-distance, packetized, two-person telephone conversations and teleconferencing, and the various institutions had different computers and end-user equipment and helped develop the necessary network protocols enabling communication among varied devices. Weinstein also played an audio tape from May 1978 of an early demonstration of a voice conference among Lincoln Laboratory in Massachusetts and the USC Information Sciences Institute and Culler-Harrison, both in southern California.
Figure 8. 1982 experimental wideband voice/data Internet demonstration.
Bob Kahn, the ARPA program manager when the packet-speech project was initiated, was the keynote speaker. He too emphasized that numerous people from many places participated in the program. He also noted that part of his purpose in creating the speech program was to show the importance of packet technology. ARPANET had been created to demonstrate packet switching, and it could transmit data of many types. However, communications such as file transfers between computers are not particularly visible to a wider audience. Voice, video, and other such communications were readily audible or viewable, and Kahn noted that Steve Casner's packetized-video work took this idea beyond speech. Kahn also described changes that had to be made to the internal ARPANET algorithms and its external interface to allow high-bandwidth, low-delay speech to be fully demonstrated (what Jim Forgie had predicted could be done). Also, the Transmission Control Protocol (TCP) originally contained both the TCP and IP functions, but it had to be split to enable applications such as packet speech (and its protocols) to communicate directly with IP. A goal was 1-kilobit-per-second (Kbps) speech (at a time when 4.8-Kbps or 2.4-Kbps speech was regarded as the minimum), and this goal was reached, partly because packet speech did not require a full-duplex connection (saving 50 percent immediately). Packets also did not have to be sent during silences, which saved an additional significant percentage. Finally, Kahn explained that there are still possibilities for improving the way speech is transmitted over the Internet, and he envisions a time when speech may be the primary user interface to computers.
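Kahn's bandwidth arithmetic can be spelled out. The half-duplex 50 percent saving comes straight from his account; the voice-activity fraction below is an assumption for illustration only.

```python
# The savings Kahn described, as simple arithmetic. The talk_fraction
# value is an assumed figure for illustration; the article reports only
# the half-duplex 50 percent saving.

coder_rate = 2.4                 # Kbps vocoder output per active talker
circuit_total = 2 * coder_rate   # dial-up full duplex: both directions always on
packet_total = coder_rate        # packet speech: only the talker sends (-50%)
talk_fraction = 0.4              # assumed fraction of time actually speaking
effective = packet_total * talk_fraction   # no packets sent during silences
print(round(effective, 2))       # 0.96 -- silence suppression approaches the 1-Kbps goal
```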
Danny Cohen, another major participant in the 1974–1982 research, was scheduled to be the second keynote speaker, but he was unable to travel, and Steve Casner presented Danny's amusing (as is Danny's way) slides.
The formal session ended with Peter Staecker, 2013 IEEE president, and Eric Evans, director of Lincoln Laboratory, unveiling the plaque and handing out 20 or so miniature versions of the plaque to participants in the 1974–1982 research.
The late afternoon session ended with a reception for everyone in attendance, again hosted by Lincoln Laboratory. Over hors d'oeuvres, old friends and colleagues continued to converse (as they had prior to the start of the event), remembering old times and providing updates on current activities. Younger attendees met the pioneers who, 40 years ago, started the work that is now so commonplace in our smartphones and other contemporary communication devices.
Contact David Walden