Issue No. 11, November 1999 (vol. 32), pp. 42-45. Published by the IEEE Computer Society.
Abstract
Test engineers are already hard pressed to ensure the quality of ICs despite ever-shorter time to market and skyrocketing test costs. Nanometer technologies will only add to the challenge.
Play the word association game with electronics, and the first words that come to mind might be "inexpensive" and "reliable." Everyone expects electronic items to become continually less expensive and quickly outdated. We also expect them to work when we bring them home. Do any of us ever question how and why this happens?
A large part of the answer lies in significant improvements in IC manufacturing technology. Today, the semiconductor industry produces devices with feature sizes of 0.25 μm—250 nanometers—hence these devices are called both nanometer and deep-submicron devices. At this size, we can put millions of transistors on the same piece of silicon that only accommodated a few thousand transistors just a decade ago. Primarily, it's this increased transistor density that drives improvements in IC performance and functionality while reducing costs—all of which makes for better and cheaper consumer electronics.
Nanometer technology's characteristics differ fundamentally from those of its older counterparts—differences that will challenge designers and test engineers. The most significant change is that a design's delay is increasingly dominated by interconnect delay rather than gate delay. As a result, the design's timing is unpredictable until designers determine the layout and the interconnects between the cells. Design and test methodologies are changing to compensate for this shift and to keep design iterations to a minimum.
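To see why wire delay comes to dominate, consider the Elmore approximation, which estimates the delay of an RC wire as the sum, over each segment, of the segment's capacitance times the total resistance upstream of it: wire delay grows roughly with the square of wire length, while gate delay stays flat. The short Python sketch below illustrates the trend; the per-segment resistance, capacitance, and gate-delay figures are invented for illustration, not taken from this article.

    # Hedged sketch: Elmore delay of a uniform RC wire versus a fixed gate
    # delay.  All resistance, capacitance, and delay values are invented.
    def elmore_delay_ps(r_per_seg, c_per_seg, n_segments):
        """Sum over segments of (total upstream resistance) * (segment capacitance)."""
        return sum((i * r_per_seg) * c_per_seg for i in range(1, n_segments + 1))

    GATE_DELAY_PS = 100.0                          # assumed fixed gate delay
    for n in (10, 20, 40):                         # wire length in segments
        print(n, elmore_delay_ps(r_per_seg=5.0, c_per_seg=2.0, n_segments=n),
              GATE_DELAY_PS)
    # Doubling the wire length roughly quadruples its delay, while the gate
    # delay stays constant, so the wire quickly dominates.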
The more dense integration of nanometer technologies also amplifies crosstalk issues. In EE Times ("Compiled ASIC Libraries Ease Path to 0.25-Micron Technologies," 23 February 1998; http://www.eetimes.com/news/98/995news/compiled.html), David Lammers and Richard Goering write
But the industry has hit the wall in the quarter-micron generation. There are serious, serious, serious problems with yields at a quarter-micron. ... The power, signal-integrity, and metal-migration challenges are exceeding the capability of the tools. Instead of 80 percent yields, crosstalk is cutting yields for quarter-micron designs down to 30 or 40 percent at some companies.
The problems they mention are real. Manufacturers' resources will be stretched ever-thinner as they strive to bridge the widening gap between technical challenges and consumers' ever-growing expectations.
Bridging the Gap
Test bridges the gap between the imperfection of IC manufacturing and consumer expectations of flawless products. All IC manufacturers test their products for defects. They discard defective ICs and ensure that only defect-free chips make their way to consumers.
Test engineers test ICs on automatic test equipment (ATE), which runs tests typically created by test automation tools. These tools examine the design and intelligently create tests for potential manufacturing defects. Given today's very large designs, the test process relies heavily on automation.
Designs typically start at very high levels of abstraction, using some high-level hardware-description language such as Verilog or VHDL. Three important steps in transforming the design into an IC are

    • simulation, which verifies the design's functionality;

    • logic synthesis, which translates the design to lower levels of abstraction, such as circuit schematics; and

    • layout, which takes circuit schematics and creates masks for the fabrication process.

The translation process is iterative. During the various iterations, several tools assist in translating the design into an IC. For example, formal verification tools ensure that the translation process doesn't introduce errors. Timing analysis tools measure the design's performance at various stages in translation. Several analysis tools—such as those for power estimation and floor planning—aid the process of creating the IC.
Figure 1 shows some of the steps in converting a design into an IC. Nanometer technologies have changed the typical iteration between logic synthesis and timing and power analysis; it now includes placement and routing. Bringing placement and routing into the iteration gives logic synthesis more accurate timing information and narrows the gap with the place and route tools. Test technology has adapted to the new methodology, and the figure highlights the test-related steps in the flow: design for test (DFT), DFT planning, and test pin assignment. After a number of design iterations, designers perform a detailed place and route. To ensure that the flow converges, the place and route tool should honor the placement information used during design creation.


Figure 1. Test steps like design for test are moving into the early design iterations.

Finally, engineers run programs to create test patterns to apply on the ATE.
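The sketch below is one possible way to picture the iteration Figure 1 implies, reduced to a toy loop in Python: synthesis produces a netlist, a coarse placement refines the interconnect-delay estimate, and the loop tightens constraints until timing closes, after which DFT insertion, detailed place and route, and test generation follow. Every function, name, and number here is an invented placeholder, not a real tool interface.

    # Hedged sketch of the iteration Figure 1 implies, reduced to placeholder
    # functions.  Everything here is invented for illustration.
    def synthesize(constraints):
        return {"effort": constraints["effort"]}             # stand-in for a netlist

    def place_and_estimate_delay_ns(netlist):
        return 3.0 / netlist["effort"]                       # coarse interconnect delay

    constraints = {"effort": 1}
    for iteration in range(10):                              # synthesis/placement loop
        netlist = synthesize(constraints)
        if place_and_estimate_delay_ns(netlist) <= 2.0:      # timing budget met?
            break
        constraints["effort"] += 1                           # tighten and retry
    print("timing closed after", iteration + 1, "iterations")
    # Next: DFT insertion, detailed place and route, then test-pattern generation.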
New Challenges
New problems arising from nanometer technologies drive design automation to integrate all the tools needed to successfully take a design from concept to reality. Test, once thought of as simply a "back-end" process in the design flow, is moving closer to the front end: Today's design flows incorporate test-related structures at the beginning of the design cycle.
In this issue, we focus on test and design-for-test (DFT) technology and its automation. Test falls into three basic areas. The first area concerns defects and their associated analysis; this is the fundamental reason for IC testing. Engineers generate and apply tests for defects, and various issues arise in doing so. Thus, we classify the other two areas broadly as test application and test generation.
Analyzing defects
Defects occur in many ways during the manufacturing process. Some are catastrophic failures caused by, for example, cracks in the wafers, and others are spot defects caused by impurities in the fabrication process. Although the test engineer can easily identify catastrophic defects with randomly created tests, detecting spot defects requires targeted tests. Tests for ICs typically focus on these spot defects, with the goal of allowing only a few hundred defective ICs per million to slip past the testing process.
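A common way to quantify that parts-per-million goal is the Williams-Brown defect-level model, DL = 1 - Y^(1 - T), which relates the fraction of defective parts shipped (DL) to process yield (Y) and fault coverage (T). The sketch below evaluates it for a few coverage values; the 80 percent yield figure is an illustrative assumption.

    # Hedged sketch: Williams-Brown defect-level model, DL = 1 - Y**(1 - T).
    # The 80 percent yield and the coverage values are illustrative assumptions.
    def defect_level_ppm(yield_frac, fault_coverage):
        return (1.0 - yield_frac ** (1.0 - fault_coverage)) * 1e6

    for coverage in (0.90, 0.99, 0.999):
        print(coverage, round(defect_level_ppm(yield_frac=0.80, fault_coverage=coverage)))
    # Roughly 22,000 ppm at 90 percent coverage versus about 220 ppm at 99.9
    # percent coverage, which is why coverage targets keep climbing.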
Generating tests at the detailed design level at which defects occur is inefficient. This limitation has led to the concept of modeling defects at a higher level of design abstraction to increase test generation efficiency. Today, models exist for defects such as stuck-at and transition faults. In this issue, "Nanometer Technology Effects on Fault Models for IC Testing" by Robert C. Aitken describes these basic models and how nanometer technology affects them.
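As a concrete, deliberately tiny illustration of the stuck-at model, the sketch below injects a stuck-at-0 fault on one internal net of an invented two-gate circuit and enumerates the input patterns that expose it.

    # Hedged sketch: a single stuck-at-0 fault on net n1 of an invented
    # two-gate circuit (n1 = a AND b, out = n1 OR c).
    from itertools import product

    def circuit(a, b, c, stuck_n1=None):
        n1 = a & b if stuck_n1 is None else stuck_n1
        return n1 | c

    detecting = [(a, b, c) for a, b, c in product((0, 1), repeat=3)
                 if circuit(a, b, c) != circuit(a, b, c, stuck_n1=0)]
    print(detecting)   # [(1, 1, 0)]: the fault is visible only when a=b=1 and c=0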
Tests for different fault types vary and typically target a specific set of defect mechanisms. The abstraction that maps defects to fault models adds some fuzziness to the relationship between defects and their associated test methods. As a result, the sets of defects covered by different test methods overlap, and research is examining what an optimal mix of methods would be.1
Because nanometer technologies are potentially sensitive to a new set of failure mechanisms, they may require new fault models to detect failures that existing models do not cover.
Applying test
More faults mean more tests to apply, adding to the exploding number of tests that the ATE must run. Larger designs already mean more potential fault locations and hence more tests to detect the faults. Figure 2 gives an estimate of the rising test data volume for future designs.


Figure 2. Test time starts to rise sharply after 5 million gates.

Test data volume directly impacts the test application time, which adds to the manufacturing cost of individual ICs. Testers must change to support the high volume of test data for future ICs. In this issue, "Nanometer Technology Challenges for Test and Test Equipment" by Wayne M. Needham addresses the equipment aspects of testing ICs built using nanometer technologies.
Generating tests
To cope with the exploding design size and the corresponding amount of test data, test engineers are automating as much testing as possible. Using fault models, electronic design automation (EDA) tools generate test patterns—patterns of input signals—that the tester applies to a circuit. For the general case—testing for all types of faults at every circuit connection—no algorithm can generate test patterns in a reasonable amount of time. This is why EDA tools add heuristics that guide the search through the space of possible test patterns and keep run times reasonable. The automation tools then select the most effective patterns for a particular circuit.
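The sketch below shows the flavor of this process on an invented four-input circuit: it generates random patterns, fault-simulates each one against the remaining fault list, and keeps only patterns that detect something new (fault dropping). Production ATPG relies on structural algorithms such as the D-algorithm or PODEM plus many heuristics, so treat this purely as an illustration of selecting the effective patterns.

    # Hedged sketch: random-pattern test generation with fault dropping on an
    # invented four-input circuit (n1 = a AND b, n2 = c XOR d, out = n1 OR n2).
    # Production ATPG is structural and heuristic, not purely random.
    import random

    def simulate(pattern, fault=None):
        net, val = fault if fault else (None, None)
        v = dict(zip("abcd", pattern))
        if net in v:
            v[net] = val                                    # stuck input
        v["n1"] = val if net == "n1" else v["a"] & v["b"]
        v["n2"] = val if net == "n2" else v["c"] ^ v["d"]
        v["out"] = val if net == "out" else v["n1"] | v["n2"]
        return v["out"]

    faults = [(net, val) for net in ("a", "b", "c", "d", "n1", "n2", "out")
              for val in (0, 1)]                            # all single stuck-at faults
    random.seed(0)
    tests = []
    for _ in range(1000):                                   # bounded random search
        if not faults:
            break
        p = tuple(random.randint(0, 1) for _ in range(4))
        detected = [f for f in faults if simulate(p, f) != simulate(p)]
        if detected:                                        # keep only useful patterns
            tests.append(p)
            faults = [f for f in faults if f not in detected]
    print(len(tests), "patterns kept,", len(faults), "faults left undetected")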
To make test generation tractable, designers must insert test structures—actual circuits, such as scannable flip-flops, that support test—into IC designs as they are synthesized. This is one way test is moving earlier in the overall design process, and it is also an example of design for test.
With this underlying structure, test generation algorithms can generate a set of effective tests. In this issue, "Current Directions in Automatic Test-Pattern Generation" by Kwang-Ting (Tim) Cheng and Angela Krstic discusses test generation technology and how nanometer technology will affect it.
For nanometer technologies, the test community is focusing on methods that generate test patterns as compactly as possible. Compact test patterns shorten test application time, which reduces costs significantly. With a seven-year depreciation and the cost of operators and floor space figured in, tester time can cost roughly two to seven cents per second.
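A quick back-of-envelope calculation shows how fast that rate adds up; the five-second test time and one-million-unit volume below are illustrative assumptions, and only the two-to-seven-cents-per-second range comes from the text.

    # Hedged sketch: per-device and per-run test cost at the tester rate quoted
    # above.  The 5-second test time and one-million-unit volume are invented.
    TEST_SECONDS = 5.0
    UNITS = 1_000_000
    for cents_per_second in (2, 7):
        per_device = TEST_SECONDS * cents_per_second / 100.0          # dollars
        print(f"${per_device:.2f} per device, "
              f"${per_device * UNITS:,.0f} per {UNITS:,} units")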
Other efforts focus on compacting the test patterns' stimuli and responses and embedding them in the IC itself. Built-in self-test (BIST) does just that and is gaining attention as a way to reuse technology.2 BIST, however, limits the diagnosability of IC failures caught on the tester. In addition, BIST relies on additional DFT strategies that keep unknown values out of the compacted responses; otherwise the compacted response is not valid.
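Response compaction is typically done with a signature register, for example a multiple-input signature register (MISR) or, in the simplest serial case, a single-input LFSR signature analyzer. The sketch below uses an arbitrarily chosen 4-bit register and feedback taps to show why an unknown (X) bit is a problem: the final signature depends on how the X happens to resolve on silicon, so the expected signature cannot be predicted.

    # Hedged sketch: a serial signature analyzer (4-bit LFSR, arbitrarily
    # chosen feedback taps).  An unknown (X) response bit makes the final
    # signature depend on how the X resolves, so it cannot be predicted.
    def signature(bits, width=4, taps=(3, 0)):
        state = 0
        for b in bits:
            fb = b
            for t in taps:
                fb ^= (state >> t) & 1
            state = ((state << 1) | fb) & ((1 << width) - 1)
        return state

    response = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]    # invented response stream
    x_position = 4                                # suppose this bit is unknown (X)
    sigs = {signature(response[:x_position] + [x] + response[x_position + 1:])
            for x in (0, 1)}
    print(sigs)   # two different signatures: the X corrupts the compaction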
Test automation itself consists of

    • test generation and fault simulation (TGFS),

    • design rule checking (DRC), and

    • design for test.

Although test generation has the most visibility, creating a testable IC also requires a lot of work in design rule checking and design for test.
Test generation algorithms have limitations, and DRC checks the design against those limitations. Another process, DFT, modifies the design to make it easily testable. Because the two are interrelated, DRC should recognize DFT structures and provide the necessary information to the test generator.
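One example of such a rule is that the combinational logic must contain no feedback loops, because most test generators cannot handle them. The sketch below runs that check as a cycle detection over an invented gate-level netlist; real DRC tools check many more rules (clocking, tristate buses, uncontrollable asynchronous sets and resets, and so on).

    # Hedged sketch: one typical test design rule, "no combinational feedback
    # loops," checked by cycle detection on an invented netlist that maps each
    # gate to the gates driving its inputs.  All names are made up.
    netlist = {"g1": ["in1", "g3"], "g2": ["g1"], "g3": ["g2"], "g4": ["in2"]}

    def has_combinational_loop(netlist):
        visiting, done = set(), set()
        def visit(gate):
            if gate in done or gate not in netlist:   # primary inputs end the search
                return False
            if gate in visiting:
                return True                           # back edge: a loop
            visiting.add(gate)
            found = any(visit(driver) for driver in netlist[gate])
            visiting.discard(gate)
            done.add(gate)
            return found
        return any(visit(g) for g in netlist)

    print(has_combinational_loop(netlist))   # True: g1 -> g2 -> g3 -> g1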
These two concepts come together in methods such as scan design. Scan connects a design's state elements into a shift register. The shift register content is accessible in a special configuration, available only during test. Full scan and other DFT technologies are part of predictable methodologies for successfully developing ICs. "Robust Scan-Based Logic Test in VDSM Technologies" by Kenneth D. Wagner discusses the importance of rigid design methods to handle the large ICs that nanometer technology will enable.
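The sketch below is a minimal behavioral model of that idea, assuming a single scan chain: in scan mode the flip-flops form a shift register that serially loads a stimulus and unloads the previous contents, and in functional mode each flip-flop captures the response of some combinational logic. The chain length and the toy logic function are invented for illustration.

    # Hedged sketch: a behavioral model of a single scan chain.  The chain
    # length and the toy combinational function are invented.
    class ScanChain:
        def __init__(self, length):
            self.ffs = [0] * length

        def shift(self, scan_in_bits):
            """Scan mode: shift a stimulus in serially, unload old contents."""
            scan_out = []
            for bit in scan_in_bits:
                scan_out.append(self.ffs[-1])      # last flop drives scan-out
                self.ffs = [bit] + self.ffs[:-1]   # serial shift toward scan-out
            return scan_out

        def capture(self, comb_logic):
            """Functional mode: every flop captures the combinational response."""
            self.ffs = comb_logic(self.ffs)

    chain = ScanChain(4)
    chain.shift([1, 0, 1, 1])                                    # load stimulus
    chain.capture(lambda s: [s[i] ^ s[(i + 1) % 4] for i in range(4)])  # toy logic
    print(chain.shift([0, 0, 0, 0]))               # unload the captured response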
Other changes
Test must respond to design strategies that focus on reuse, lower voltages, and new materials.

Reuse. The industry is moving toward design flows that reuse preexisting designs called cores or IP (intellectual property). The integration of these parts poses a significant problem for automation tools. If the cores came with an equivalent gate-level design that was modifiable, most tools could easily deal with these reusable parts when incorporated in a larger design. However, these cores come with restrictions that make them difficult or impossible to modify. For example, they can be rigid (the layout is already determined), or vendors may not reveal the core's gate-level model.

From a test perspective, the story revolves around reusing the test patterns that come with a core and testing the logic outside the core, without having the core's gate-level model. To solve these problems, the IEEE P1500 standards group is developing standards for core test.3,4


Lower voltages. In nanometer technologies, the supply voltage and threshold voltages are also decreasing. As a result, the quiescent current—the current measured when all the switching activity of the IC has settled—is increasing with each new technology generation. IDDQ is a test technique that relies on monitoring this current for variations that can indicate defects.

IDDQ testing works well when the leakage current of a defect-free IC is very small. As leakage current increases, the small additional leakage caused by a defect becomes harder to distinguish.5 Despite this limitation, IDDQ testing has significant value, and current research seeks to extend the technique's effectiveness for nanometer technologies.
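The sketch below illustrates one simple screening style, sometimes called delta-IDDQ, which flags a die whose current spread across test vectors exceeds a limit; all the current values and the limit are invented. It also shows why the screen degrades as nominal leakage grows.

    # Hedged sketch: a delta-IDDQ style screen.  All currents are in microamps
    # and are invented; real limits come from characterization data.
    def iddq_outlier(readings_ua, max_delta_ua=5.0):
        """Flag a die whose IDDQ spread across test vectors exceeds the limit."""
        return max(readings_ua) - min(readings_ua) > max_delta_ua

    clean_die     = [2.0, 2.1, 1.9, 2.2, 2.0]            # low, stable leakage
    defective_die = [2.0, 2.1, 9.5, 2.2, 2.0]            # one vector activates a defect
    leaky_process = [410.0, 432.0, 405.0, 441.0, 420.0]  # high nominal leakage, no defect
    for die in (clean_die, defective_die, leaky_process):
        print(iddq_outlier(die))
    # Prints False, True, True: the clean die passes and the defect is caught,
    # but the defect-free die on the leaky process is flagged as well, because
    # normal vector-to-vector variation already exceeds the limit.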


New materials. Along with decreasing device size, the underlying technologies are also changing. For example, the use of copper interconnects and the introduction of silicon on insulator (SOI) will affect test. New technology has the potential to create new defect mechanisms that tests must detect.
Conclusion
In the past, designers looked on test as overhead: Design for test was an intrusion that increased the chip's area and affected the design's timing. The test solutions the industry eventually developed and adopted were shaped by these negative connotations.
Nanometer technology changes this equation. Silicon is relatively free, and gate delay is negligible. These changes permit new and innovative solutions that were not possible before. New pressures on the design process are developing in the industry. Time to market and the manufacturing cost of a chip will be the focus in the next few years. Test has to enable predictable flows for successful manufacturing of the designs, and future tests have to be sensitive to test data volume and test application time, which add to the cost of manufacturing the ICs. Nanometer technologies are having and will continue to have a profound effect on test. Test methods must change dramatically if the industry is to deliver these very complex chips at reasonable cost.
We hope that the articles in this special issue help you understand the challenges inherent in nanometer technologies.

References

Rohit Kapur is a senior staff R&D engineer at Synopsys. His research interests are in VLSI test. Kapur has a BSc in engineering from the Birla Institute of Technology, Mesra, India, and an MS and a PhD in computer engineering from the University of Texas at Austin. He is chair of the task force to create a core test language as part of the IEEE P1500 standard. Contact him at rkapur@synopsys.com.
Thomas W. Williams is Chief Scientist and Director of Research and Development of Test Methodology at Synopsys. Prior to this position, he was with IBM as manager of the VLSI Design for Testability group. His research interests are in testing, synthesis, and fault-tolerant computing. Williams has a BSEE from Clarkson University, an MA in pure mathematics from the State University of New York at Binghamton, and a PhD in electrical engineering from Colorado State University. He is a member of the IEEE, the IEEE Computer Society, Eta Kappa Nu, Tau Beta Pi, ACM, the Mathematical Association of America, Sigma Xi, and Phi Kappa Phi. He is an IEEE Fellow; in 1989, he and E.B. Eichelberger shared the IEEE Computer Society W. Wallace McDowell Award for outstanding contribution to the computer art. Contact him at tww@synopsys.com.