Issue No. 4, July/August 2003 (vol. 20)
Through a happy confluence of technical, economic, and trade policy developments, the semiconductor industry has maintained its tremendous momentum for more than three decades as a key player in the global economy and a major employer of technical talent. The industry has been very successful at manufacturing ever-shrinking components such as transistors and interconnects, and at ensuring that they perform predictably and reliably. This shrinking has outpaced the most optimistic expectations. Current process technology advances are bringing manufacturing to 130-nm process nodes just as they are solving process challenges for the 90-nm node. In many ways, 90 nm defines a turning point for the industry: It provides 2× more transistors per square millimeter than a 130-nm process, and when combined with the move from 200-mm to 300-mm wafers, it provides a 5× increase in the number of dies per wafer. For the first time, 100-million-transistor chips are within the reach of system designers, bringing increased systems integration capabilities and the tremendous cost reductions associated with volume chip manufacturing.
There are, however, signs that 90 nm also marks another turning point. The industry may be ready to manufacture chips containing millions of gates, but it is far from ready to bring error-free designs to market any faster. As the Gigascale Silicon Research Center explains, industry challenges can be grouped into problems of the small, problems of the large, and problems of the diverse. Problems of the small—caused by shrinking device dimensions—have thus far dominated all other problems and received the concerted focus and investments of the industry. However, problems of the large—those concerning the enormity of the design-verification and manufacturing-test tasks—are now beginning to limit the industry's continuing progress. A recent report, Securing the Future: Regional and National Programs to Support the Semiconductor Industry, released in May 2003 by the National Research Council's Board on Science, Technology, and Economic Policy, casts serious doubt on whether positive trends in economic growth can continue even if the industry overcomes newly identified problems of the small in far-nanometer technologies.
This issue explores practical solutions to two of the most pressing problems of the large: verification and test. Along with large-scale integration for SoCs, these tasks must deal with designs modeled, composed, verified, and tested at higher abstraction levels. They also rely on powerful underlying tools that use the latest algorithmic advances. Ravi Hosabettu and his coauthors describe a formal-verification approach that employs modern algorithmic advances in theorem proving to verify pipelined microarchitectures. João Marques-Silva and Luís Guerra e Silva describe recent advances in solving Boolean satisfiability problems and how these solutions can provide fast verification of combinational circuits. Ozgur Sinanoglu and Alex Orailoglu describe practical techniques to improve test time by compacting output responses from deeply embedded cores on chip. Alfredo Benso and his coauthors outline a hierarchical BIST infrastructure for SoCs. Ian Harris presents an overview of the research in test generation and fault models for systems with hardware and software.
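To give a flavor of how satisfiability relates to combinational-circuit verification, the sketch below illustrates the standard miter construction: two circuits are equivalent exactly when the XOR of their outputs is unsatisfiable. The circuit functions and names here are illustrative inventions, and for simplicity the check enumerates all inputs exhaustively, whereas a SAT solver would search the space symbolically.

```python
# Minimal sketch of miter-based combinational equivalence checking.
# Equivalence holds iff no input makes the two outputs differ,
# i.e., the miter (XOR of outputs) is unsatisfiable.
# Hypothetical example circuits; a real flow would use a SAT solver
# on a CNF encoding rather than brute-force enumeration.
from itertools import product

def circuit_a(a, b, c):
    # Reference implementation: (a AND b) OR c
    return (a and b) or c

def circuit_b(a, b, c):
    # De Morgan-style rewrite of the same function
    return not ((not a or not b) and not c)

def miter_satisfiable(f, g, n_inputs):
    """Return a distinguishing input assignment if one exists, else None."""
    for bits in product([False, True], repeat=n_inputs):
        if f(*bits) != g(*bits):  # miter output is 1 for this input
            return bits
    return None

# None means the miter is unsatisfiable, so the circuits are equivalent.
print(miter_satisfiable(circuit_a, circuit_b, 3))  # → None
```

Because the miter reduces equivalence to unsatisfiability, advances in SAT solving translate directly into faster verification of such circuits.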
Also in this issue, Nicola Nicolici and Bashir Al-Hashimi explore how test synthesis and test scheduling affect power dissipation, and Dionisios Pnevmatikatos and his coauthors present a low-cost I/O subsystem for network processors.
It is my pleasure to present this issue to you. Please take a moment to fill out the enclosed reader response card. I look forward to your comments and suggestions!
Editor in Chief
IEEE Design & Test of Computers