Auburn University
Pages: 173-174
In June 1985, I became the editor in chief of Design & Test after my predecessor, Roy Russo, became president of the IEEE Computer Society. It was a time of start-up fever in the VLSI CAD industry, and new companies were sprouting up daily. Small startups had no problem harnessing university research, but that wasn't an easy task for large companies suffering from NIH (not invented here) syndrome. In response, large companies organized two multicompany consortia, the Microelectronics and Computer Technology Corp. (MCC) and the Semiconductor Research Corp. (SRC). In an interview with D&T, MCC president and chief executive officer Bobby Inman commented, "Folks working on large problems don't really have concerns about revealing something to a competitor." 1
Both small and large organizations made lasting contributions. Gateway Design Automation, a small CAD company in Littleton, Massachusetts (which Cadence later acquired), announced the Verilog hardware description language. A large cooperative effort between the US government and industry produced the VHSIC (very high speed IC) Hardware Description Language (VHDL). 2
If I were to pick a theme that best represents the year 1985, it would be Applications of Artificial Intelligence (AI) to Electronic Design and Test. In the case of very large and complex systems, experienced humans could frequently outperform computer algorithms. Computer scientists had shown that the runtime of programs solving problems such as logic minimization, physical design, and testing could grow exponentially with circuit size and complexity. Even the use of hierarchy and partitioning would not yield acceptable results without incorporating some form of heuristics based on human intelligence. Many knowledge-based design automation systems of that time combined programmed algorithms with recorded databases of human experience. For the August 1985 issue of D&T, guest editor Donald Thomas of Carnegie Mellon University put together an issue focusing on AI techniques. Thomas was also a regular D&T editor for the areas of synthesis and verification.
The October 1985 issue of D&T is memorable. With the theme Design and Test in Japan, this issue described ongoing work in Japan on VLSI design automation systems, synthesis, simulation, and testing, some of which was appearing for the first time in English. The guest editor for that issue was the D&T editor for the Far East, Akihiko Yamada of NEC Corp.
In the April 1986 issue, a news item applauded Jack Kilby, the inventor of the IC, for receiving that year's IEEE Medal of Honor. 3 Several years later, Kilby would go on to win a Nobel Prize for the same invention. Those in the semiconductor industry today will find another news item from that issue to be amusing. This item, under the heading, "Analysts forecast end of semiconductor slump," 3 goes on to say, "The worst recession in the semiconductor industry's history will bottom out late this year, returning to substantial growth in 1987…13 analysts presented their forecasts for 1986 and beyond."
CAD platforms were hot in 1986. Workstations had arrived, and VLSI design engineers would soon have them on their desks. The June 1986 issue featured Design Workstations (guest editor Jerry Werner of MCC, who later joined the D&T editorial board as the editor for conferences, organized the issue). Today, as personal computers take over many workstation functions, there is a parallel with the D&T archives, which describe workstations overtaking mainframes.
The year 1986 is also notable because automatic synthesis tools and silicon compilers started to become commercially available. The October 1986 issue of D&T featured a roundtable discussion on logic synthesis in design 4 ( D&T initiated roundtable discussions in 1986; roundtable editor Charles Radke of IBM organized the events). Moderator Aart de Geus, then with General Electric, asked the panelists: "Imagine…you are the Ann Landers of logic design. What advice would you give designers?" Robert Brayton of IBM responded emphatically, "If you see your CAD support people not moving toward a logic synthesis system, get on them and say, 'This is the way of the future. Let's get with it.'" Incidentally, that issue's theme was Design for Testability; guest editor Ray Mercer, then with the University of Texas at Austin, organized the issue. A sentence from my editorial, 5 "Failure to Design for Testability May Be Chip-Wise But System-Foolish," became a frequently cited quotation.
Gordon Adshead, who was the D&T editor for Europe, put together the December 1986 issue, which carried the theme, Design and Test in Europe. In his editorial, Adshead described the challenges and accomplishments of the European Strategic Programme for Research and Development (Esprit) project. 6 The year 1986 also saw D&T forming stronger alliances with the IEEE's technical committees on design automation (DATC) and test technology (TTTC). The newsletters of those technical committees would regularly appear in D&T, and their respective editors, Jere Sanborn of IBM and Mark Harrison of AT&T, joined the editorial board.
A notable article from 1987 was "Aliasing Errors in Signature Analysis Registers" by Williams et al. 7 The authors provided an original mathematical analysis of aliasing probability, which is the probability of error masking, an important design parameter for built-in self-test. Since its publication, other authors have frequently cited this article in their works.
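The key property underlying such analyses is that a signature register is a linear compactor: aliasing occurs exactly when the error stream (the bitwise difference between the faulty and fault-free responses) compacts to the all-zero signature, which for random errors happens with probability approaching 2^-k for a k-bit register. The following sketch illustrates that behavior with a generic serial signature register; the polynomial, width, and stream length here are illustrative choices, not values taken from the article.

```python
import random

def signature(stream, poly=0x07, width=8):
    """Serial (single-input) signature register: shift left, compute
    feedback = MSB XOR input bit, XOR in the polynomial on feedback."""
    state, mask = 0, (1 << width) - 1
    for bit in stream:
        fb = ((state >> (width - 1)) & 1) ^ bit
        state = (state << 1) & mask
        if fb:
            state ^= poly
    return state

# Monte Carlo estimate of aliasing probability: the fraction of
# random nonzero error streams that compact to the zero signature
# should be close to 2**-width (about 0.0039 for an 8-bit register).
random.seed(1985)
trials, length, aliased = 100_000, 64, 0
for _ in range(trials):
    err = [random.getrandbits(1) for _ in range(length)]
    if any(err) and signature(err) == 0:
        aliased += 1
rate = aliased / trials
```

Because the register is linear, signature(a XOR b) equals signature(a) XOR signature(b), so a faulty response aliases precisely when its error stream has a zero signature; the estimated rate above converges to roughly 2^-8.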
An invention of Randal Bryant of the Massachusetts Institute of Technology, the switch-level simulation algorithm had withstood the test of time by 1987. Modeling transistors as switches with conduction strengths and signal nodes as capacitors, this method could accurately simulate bidirectional signals through pass transistors, buffers, and buses in digital circuits. Its speed made it a verification tool of choice for custom VLSI chips. D&T's August 1987 special issue on Switch-Level Techniques is among the most comprehensive documents on this subject even today. It includes a survey article by Bryant, 8 who had by then moved to Carnegie Mellon University.
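The essence of the switch-level model can be sketched in a few lines: transistors are strength-rated switches, nodes hold charge, and at each node the strongest signal wins. The toy sketch below illustrates that idea, assuming n-type switches only and three illustrative strength levels; it is not Bryant's algorithm, whose strength lattice and unknown-value handling are far more refined.

```python
# Illustrative strength levels: supply rails beat driven signals,
# which beat charge stored on a node capacitance.
SUPPLY, DRIVE, CHARGE = 3, 2, 1

class Node:
    def __init__(self, name, value='X', strength=CHARGE):
        self.name, self.value, self.strength = name, value, strength

class Switch:
    """An n-type transistor modeled as a switch: it conducts
    between terminals a and b when its gate node is 1."""
    def __init__(self, gate, a, b, strength=DRIVE):
        self.gate, self.a, self.b, self.strength = gate, a, b, strength

def step(switches):
    """One relaxation pass: each conducting switch passes the signal
    in both directions, attenuated to the switch's own strength; a
    stronger signal overwrites a weaker one, and an equal-strength
    conflict resolves to 'X'. Returns True if anything changed."""
    changed = False
    for sw in switches:
        if sw.gate.value != 1:
            continue
        for src, dst in ((sw.a, sw.b), (sw.b, sw.a)):
            s = min(src.strength, sw.strength)
            if s > dst.strength:
                dst.value, dst.strength = src.value, s
                changed = True
            elif s == dst.strength and src.value != dst.value != 'X':
                dst.value = 'X'
                changed = True
    return changed

# A pass transistor gating the supply onto a bus node that holds a
# weak stored 0: the stronger passed signal overwrites the charge.
vdd = Node('vdd', 1, SUPPLY)
en = Node('en', 1, SUPPLY)
bus = Node('bus', 0, CHARGE)    # capacitively stored 0
t1 = Switch(en, vdd, bus)
while step([t1]):               # iterate to a fixpoint
    pass
```

After relaxation, bus carries a driven 1: the bidirectional switch let the stronger supply signal overwrite the weak stored charge, which is the pass-transistor and bus behavior the text describes.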
In 1987, a race was on between two types of CAD platforms—the high-performance CAD workstations and the special-purpose accelerators (or parallel processors). The October 1987 issue carried the theme, Parallel Processing for VLSI CAD. It described several point accelerators (computers designed to speed up a specific task). Rob Rutenbar of Carnegie Mellon University said in a roundtable discussion, "Point accelerators…may not be viable from the viewpoint of someone who wants to build and sell them, but they may be viable if the consumer of the CAD application is also the producer of the point accelerator." 9 In retrospect, that statement seems correct. Large organizations, such as IBM and Lucent's Bell Labs, designed and successfully used special-purpose computers. Over time, however, it became more difficult for these machines to realize any speed advantage. By the time a company designed and built a special-purpose computer, the next-generation general-purpose computer would be on the market, matching the performance at a lower cost.
In 1988, I handed over D&T to the next editor in chief, Sumit Dasgupta. My last editorial, which appeared in February 1988, summarized the state of the magazine through my tenure. I have remained an avid reader and an occasional contributor. I thank the editors and managing staff for their efforts to continuously improve the magazine and serve the design and test community through it. I thank the present editor in chief, Rajesh Gupta, for celebrating the commemorative year and for the opportunity to reminisce.