Magnavox and Intel: An Odyssey

Stanley Mazor
Peter Salmon



Today we have high-resolution videogames connected to our television sets, but let us reflect on a pioneering system in this field from 30 years ago. As an Intel applications engineer in 1976, my job (Mazor) was to find new customer applications for microcomputers and to convey customer needs to chip designers like Peter Salmon, who used our technology to solve customer problems. Analog integrated circuits (ICs) were prominent in entertainment products, but digital circuits were just making their debut, particularly in digital readouts for time, station, and counters. I visited manufacturers of videogames, gambling machines, pinball machines, and consumer electronics to find new microcomputer applications. Although microcomputers are versatile, they were not fast enough to deliver a video stream in real time, so additional circuitry was needed between a microcomputer and a TV monitor. Arcade videogames and gaming machines used a large amount of video screen buffer memory and ICs, along with microcomputers and ROM-resident game-control programs.

Unlike arcade videogames, which are produced in the thousands, consumer products are sold in the millions. Accordingly, several special factors strongly differentiate consumer products. First, because these products are mostly bought at Christmas, retail stores make their choices and place orders at the June Consumer Electronics Show (CES) and stock them in September. At CES, retailers are particularly concerned with whether a demonstration is a "real" product that will be available in volume that year. Second, home videogames need Federal Communications Commission (FCC) approval to ensure they do not radiate energy, which, on top of normal design and manufacturing issues, adds an unpredictable delay. Missing any of these deadlines would delay a consumer product's release by an entire year. Third, consumer products are extremely cost sensitive, and there is a sweet spot, usually in the $100 to $150 retail price range, that severely constrains design choices.

Magnavox contracted with Intel for a custom videogame "stunt" chip (the 8244) 1,2 in 1977. As the Intel liaison, my job was to work with the Magnavox designers and project managers to stay within all these restrictions and produce the IC for less than $20 per chip.

Magnavox Odyssey game console

Magnavox enjoyed considerable success with its Odyssey analog home TV Pong game (1972), but it wanted to build a more versatile, programmable console. Based on the proliferation of arcade games by Midway, Bally, and Atari, Magnavox launched the Odyssey2 project ( http://odyssey2.classicgaming.gamespy.com/articles/timeline/index.php) to build a home game console featuring ROM game cartridges, because the company realized that more money could be made from selling cartridges than from the console itself. This home game console connected to the antenna input on a TV set and used a custom Intel videogame chip, controlled by a microcomputer, that ran a game program from a removable ROM cartridge (see Figure 1). The system had both keyboard and joystick inputs to allow truly interactive games. Magnavox developed approximately 30 different ROM game cartridges, mostly designed by a contractor. Many of these games were first popularized by arcade videogames (quarter eaters).

Graphic:

Figure 1   Magnavox Odyssey2 console with cartridge.

A standard TV screen has more than 300,000 pixels, so approximately 4 million RAM bits are needed to store a colored image, and that's using just 4 bits per color. (TV displays use an additive red, green, blue [RGB] color model.) Most arcade videogames had more colors and used more memory. But in 1976, RAM was expensive and not practical for low-cost home TV games. Intel and Magnavox overcame this problem by having just four one-inch squares of colored video (sprites) that could be placed anywhere on the screen. This made it possible to deploy colored objects on the screen with a minimum of RAM at a minimum cost. Additional chip facilities included a character generator for text and numbers. The example screen in Figure 2 shows that the graphics were pretty primitive by modern standards, but this was the state of the art in 1980.
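The arithmetic behind that trade-off is easy to sketch in C. The figures below are illustrative assumptions (a roughly 640 × 480 visible raster with 4 bits per color channel), not Odyssey2 specifications:

    #include <stdio.h>

    int main(void)
    {
        /* Illustrative assumptions: ~640 x 480 visible pixels and
           4 bits for each of the three color channels. */
        long pixels       = 640L * 480L;            /* 307,200 pixels */
        long bits_per_pix = 3 * 4;                  /* R, G, B at 4 bits each */
        long framebuffer  = pixels * bits_per_pix;  /* ~3.7 million bits */

        /* The sprite alternative: four 8 x 8, 1-bit patterns in on-chip RAM. */
        long sprite_bits  = 4 * 8 * 8;              /* 256 bits */

        printf("full frame buffer: %ld bits\n", framebuffer);
        printf("four sprites:      %ld bits\n", sprite_bits);
        return 0;
    }

Running this shows the full frame buffer needing on the order of 14,000 times the storage of the four sprites, which is why the sprite approach made a sub-$20 chip possible.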

Graphic:

Figure 2   Sample TV screen display showing one-inch squares of video, or sprites. The original was a color image, using just 4 bits per color.

Intel 8244 chip

As the Intel liaison to Magnavox on the Intel 8244 chip, I made monthly trips in 1977 to Fort Wayne, Indiana, to visit Magnavox's project leader, Gene Kale. This video IC contained counters synchronized with the TV's raster scan to determine what video information to send to the TV. It had four variable objects (sprites) defined in on-chip RAM plus eight grouped objects. These grouped outputs provided background, titles, and scores using predefined fixed characters from an on-chip ROM. The chip also had a programmable sound generator, enabling various sounds, and an interface for joysticks.
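As a rough sketch of the mechanism, the C fragment below models the comparison in software. The structure layout and field names are invented for illustration; the real 8244 did this in hardwired logic, not code:

    /* Hypothetical model of the 8244's core idea: counters locked to
       the raster scan are compared with each sprite's position register
       to decide, pixel by pixel, what to send to the TV. */
    typedef struct {
        unsigned char x, y;      /* position register */
        unsigned char shape[8];  /* 8 x 8 pattern, one byte per row */
        unsigned char color;     /* one color for the whole sprite */
    } Sprite;

    /* Conceptually evaluated once per pixel as the beam sweeps the screen. */
    unsigned char pixel_color(const Sprite s[4], int beam_x, int beam_y,
                              unsigned char background)
    {
        for (int i = 0; i < 4; i++) {
            int dx = beam_x - s[i].x;
            int dy = beam_y - s[i].y;
            if (dx >= 0 && dx < 8 && dy >= 0 && dy < 8 &&
                ((s[i].shape[dy] >> (7 - dx)) & 1))
                return s[i].color;  /* a set sprite bit wins over background */
        }
        return background;
    }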

The 8244 IC had four sprites; each sprite's 8 × 8 pixel matrix implemented a colored, moveable object in a game as shown in Figure 3. The game's program loaded each 64-bit sprite RAM with a particular game's object. Each of the sprites had its own position register that placed the object on the 2D TV screen in a single color defined in the color register. Dynamic motion occurred during a game when the program changed the sprite's position on the screen, usually during video vertical retrace, which was a dead period inserted to avoid screen jitter.
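Here is a minimal sketch of how a game program might reposition a sprite, assuming hypothetical memory-mapped register addresses (the actual 8244 register map is not reproduced here). The write waits for vertical retrace so the object never moves mid-frame:

    #define SPRITE0_X   (*(volatile unsigned char *)0xA0)  /* addresses assumed */
    #define SPRITE0_Y   (*(volatile unsigned char *)0xA1)
    #define VBLANK_FLAG (*(volatile unsigned char *)0xA7)  /* set during retrace */

    void move_sprite0(unsigned char new_x, unsigned char new_y)
    {
        while (!VBLANK_FLAG)
            ;                 /* wait for the dead period between frames */
        SPRITE0_X = new_x;    /* reposition the object for the next frame */
        SPRITE0_Y = new_y;
    }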

Graphic:

Figure 3   Example 8 × 8 pixel matrix sprites.

Object animation occurs when the program changes the sprite's pixel pattern, as shown circled in Figure 3, to simulate a hand movement. Objects could also be displayed in two colors by changing the sprite. Because the screen was redrawn 60 times per second, an object could be displayed in two successive passes under program control: part of the object was drawn on the first pass in the first color, and the remaining part was drawn in a second color on the next pass. Two sprites could also be combined to create a larger game object, such as the tall cowboy shown in Figure 4. Because the objects intersected and overlapped, the actual output was a composite of these four sprite objects, along with the grouped objects and a selected background color.
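A sketch of that two-color trick follows, with plain arrays standing in for the sprite's 64-bit RAM and color register (names and layout assumed): on even frames the program loads one part of the object in the first color, on odd frames the remainder in the second, and at 60 redraws per second the eye fuses the two passes.

    unsigned char sprite_ram[4][8];  /* stands in for the 64-bit on-chip RAMs */
    unsigned char sprite_color[4];   /* stands in for the color registers */

    void draw_two_color(int frame,
                        const unsigned char part_a[8], unsigned char color_a,
                        const unsigned char part_b[8], unsigned char color_b)
    {
        const unsigned char *part = (frame & 1) ? part_b : part_a;
        for (int row = 0; row < 8; row++)
            sprite_ram[0][row] = part[row];                 /* reload the pattern */
        sprite_color[0] = (frame & 1) ? color_b : color_a;  /* and the color */
    }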

Graphic:

Figure 4   The lower half of the sprite for the cowboy figure in Figure 2.

The background color of the screen was chosen under program control. Grouped objects were used for titles and scores. In Figure 2, the score is 6 to 5, and the remaining "bullets" are indicated by the dots. These characters were fixed within an internal ROM on the graphics chip, and their selection and position were under program control. The sound generators worked by placing appropriate bit patterns into recirculating shift registers. This gave maximum flexibility while minimizing the computer program's work. Sound is an important ingredient for videogame players.
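The following C fragment is a rough software model of a recirculating shift register, not the 8244's actual circuit: the program loads a bit pattern once, and the hardware then shifts it out at a fixed clock, feeding each output bit back to the input so the pattern repeats as a tone with no further CPU work.

    #include <stdio.h>

    int main(void)
    {
        unsigned int reg = 0xF0F0;  /* 16-bit pattern: a square-ish wave */
        for (int tick = 0; tick < 32; tick++) {
            int out = (reg >> 15) & 1;          /* bit leaving the register */
            reg = ((reg << 1) | out) & 0xFFFF;  /* recirculate it to the input */
            putchar(out ? '#' : '.');           /* crude view of the waveform */
        }
        putchar('\n');
        return 0;
    }

Each '#' or '.' printed represents one clock tick of the output bit; the 16-bit pattern repeats twice over the 32 ticks shown.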

Intel/Magnavox relationships

In addition to selling this video chip, Intel provided standard 8048 microcomputer IC chips. This microcomputer contained 1 Kbyte of program ROM, which in this case held the game console control and keyboard-input code. The 8048 permitted external program memory expansion, so each 2-Kbyte ROM cartridge extended the program memory with a specific game. The game cartridge's size was chosen to make it easy to handle, but it contained only a small printed circuit board with a single ROM chip. Intel also expected to get orders for the ROM chips used in cartridges, but that business mainly went to Intel's competitors. Electrically erasable programmable read-only memory (EEPROM) chips were also provided to Magnavox to help them prototype and debug game program cartridges.
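As an illustration of how a cartridge extends the 8048's program space, here is a hypothetical memory map. The 8048's on-chip ROM occupies the bottom 1 Kbyte of its program space; the cartridge addresses shown are assumed for the sketch, not taken from Odyssey2 schematics:

    /* Hypothetical program-memory map (addresses assumed). The 8048's
       12-bit program counter spans 4 Kbytes in total. */
    enum {
        CONSOLE_ROM_BASE = 0x000,  /* 1-Kbyte on-chip ROM: console control */
        CONSOLE_ROM_END  = 0x3FF,
        CART_ROM_BASE    = 0x400,  /* 2-Kbyte cartridge ROM: the game */
        CART_ROM_END     = 0xBFF
    };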

Before doing the detailed chip design, Intel provided a breadboard prototype of the 8244 chip to Magnavox for initial testing and evaluation. I awaited signoff of the prototype as certification of the design specification, but the Magnavox chief engineer was reluctant to give me the final acceptance approval, which caused considerable strife for both parties, especially in light of the tight deadlines.

Intel relied on the expertise of its customer, Magnavox, to define the chip's requirements, and the 8244 was developed exclusively for Magnavox. Peter Salmon had a few minor bugs in his chip design, and the last fix was made on the fourth mask revision in July 1977. He patiently hand-wired a few pieces of off-chip logic to a partially working chip to test his proposed bug fix. Subsequently, the Intel team made a triumphant trip to Magnavox in Fort Wayne to deliver the first production chips.

In the meantime, however, Magnavox had decided that the videogame business was "too rich" for its blood and was exiting the business. The company might have hoped that the Intel team would fail to deliver silicon chips on time, thereby avoiding a multimillion-dollar penalty payable to Intel. As Magnavox backed away from this business, it eliminated most of its game programmers, leaving an open field for former Intel employee Ed Averett, 3 who designed most of its games on a royalty basis.

Ultimate results

Reportedly, a prototype Magnavox Odyssey2 game created long lines of interested customers at CES in January 1979. Magnavox is a subsidiary of Philips, and there are rumors of disputes within the two companies over this project. We don't have first-hand information about these, but several websites provide interesting further reading. 3,4 Nevertheless, Magnavox sold nearly 1 million Odyssey2 systems and many game cartridges. Intel, which was also in the digital watch business, decided to pull the plug on its consumer products pursuit at the same time it exited the watch business. Accordingly, by 1979 I stopped visiting consumer electronics companies. Within a couple of years, the home videogame industry crashed. Even so, Intel produced millions of 8048 microcomputers and followed it with the even more popular single-chip controller, the 8051.

The 8244 videogame chip had 40,000 transistors. Its "random logic" was hand-drawn using colored pencils. At the time, there weren't many CAD tools to assist in getting an error-free chip, so Salmon offered the draftsmen cash bonuses for each design rule violation they found, starting at a nickel on the first day and rising to a quarter on the last.

Recently, Nvidia announced the GT200 graphics processing unit, which has 1.4 billion transistors, a 35,000-fold increase. Although the Nvidia chip's power is impressive, the 8244 was also impressive for its day and provides a nostalgic glimpse into the modern chip industry's history.

The Early Days of the Arpanet

Peter T. Kirstein

Arpanet's first access control

In September 1973, University College London (UCL) had had its Arpanet terminal interface processor (TIP) for only six weeks. There was a major networks meeting in Brighton that month, which featured the first public demonstration of international Arpanet usage.

Immediately after the meeting, I held an Arpanet project meeting at UCL and then invited all the attendees to my house for dinner. There were at least a dozen attendees, including most of the developers of the TIP software from BBN. Of course, all of them were suffering withdrawal symptoms because they had not been on the net for a whole week, so they lined up to get on my home system. Although they could dial in easily, they could not get any further because the TIP requested a password. The BBN developers were particularly astounded; their software did not, at the time, have any access control, so they could not understand this request.

At UCL, we were very concerned about security. I had a DEC PDP-9 host and had found a security hole in its software. For a fraction of a second after users connected, but before they had time to do anything else, it was possible to seize the session and force it to go through our host. We used this to require the user to enter a password before we released the connection and let the normal software continue. I believe this was the first network-level access control on the Arpanet!

The first use of the Arpanet by a head of state

Toward the end of 1975, DARPA was worried that international usage of the Arpanet would raise questions in the US Congress or Senate and asked me to keep a low profile. I agreed, of course, in principle. In January 1976, the Queen was opening a new building at the British Royal Signals and Radar Establishment (RSRE) at Malvern in England. The British were starting a collaboration between US and British defense contractors on the new Ada language. As part of that, a link between RSRE and UCL, and hence into the Arpanet, was being established, and it was arranged that the Queen would inaugurate the link.

During the preparation for this event, I was told that the provision of the link between RSRE and UCL was given the second-highest priority of any link by the then carrier, the British Post Office. The only activity with higher priority was repairing telephone exchanges in Northern Ireland that had been blown up by the IRA! We intended that the link would be inaugurated by the Queen logging in to an account at ISI in Los Angeles and sending a welcome e-mail. The only accounts I had at ISI were in my own name, so I requested an account for her called HM EII, because using another account would be lèse majesté (that is, the crime of violating majesty). (This is the only time I have used that phrase completely correctly!)

When the event happened, all considerations of a low profile were forgotten. All the relevant DARPA officials, from the director downward, wanted to participate in the event—even though it was at 6:00 a.m. EST.

Readers may contact Peter T. Kirstein at P.Kirstein@cs.ucl.ac.uk.

Acknowledgments

Other people involved in this project were Gene Kale (Magnavox project manager) and, from Intel, Sam Schwartz (designer), Gary Bastian (designer), Ed Averett (marketing), and Bill Lattin (program manager).

References and notes


