SEPTEMBER/OCTOBER 2002 (Vol. 19, No. 5) pp. 54-55
0740-7475/02/$31.00 © 2002 IEEE
Published by the IEEE Computer Society
Guest Editors' Introduction: Stressing the Fundamentals
The theme of this year's International Test Conference, "stressing the fundamentals," has two aspects. First, over the past 40 years, the test community has developed several fundamental concepts, including fault models, automatic test-pattern generation (ATPG), and design for testability (DFT). In addition, fundamental techniques such as functional test, scan, burn-in, and IDDQ test are key to modern test technology. Emphasizing these techniques is important as circuits become denser, faster, and more difficult to test. These same trends, though, also lead to a second type of stress: Advances in circuit technology are pushing these fundamental approaches toward their limits. The articles in this special ITC section explore this idea of stressing the fundamentals from various angles.
In the first article, Sheng and Hsiao advance the state of test generation for sequential circuits with a new genetics-based algorithm. As circuit complexity has increased, so too have the types of algorithms applied to test generation. Concepts such as fault sensitization and propagation remain central to any ATPG technique. But even though the underlying problem is NP-complete, the heuristics and algorithms used, and the order in which they're applied, can dramatically alter a technique's effective performance. Sheng and Hsiao obtain high fault coverage with low test generation time.
Despite improved pattern generation techniques, Moore's law has another consequence for fixed tester memories: increasing vector count. Whereas the number of circuit elements can double every two logic generations, the number of I/Os increases far more slowly, if at all. The amount of test data is proportional to the number of circuit elements, and the available bandwidth to apply this data depends on the number of I/Os. This is a serious problem for modern circuit testing and is taxing automatic test equipment (ATE) to the limit. Such equipment must keep up with the bandwidth challenge yet still meet economic requirements for test time. Several excellent articles at this year's ITC address this problem. For this special section, we've selected an article based on a talk originally given at last year's ITC: Barnhart et al. describe a method to improve scan efficiency by a factor of 10. How are such astounding results possible? It turns out that there are several approaches, including the one in this article. All these approaches rely on the fact that a considerable amount of test data is in some sense random and can be replaced, if necessary, by other random data.
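The scaling argument above can be made concrete with a back-of-the-envelope model (the numbers below are illustrative assumptions, not figures from any of the articles): if test-data volume tracks transistor count while the pin count and shift frequency stay fixed, scan test time doubles along with the data.

```python
# Illustrative model of the test-data bandwidth problem (assumed
# numbers, for illustration only): data volume grows with circuit
# size, but delivery bandwidth is limited by the I/O count, so scan
# test time grows generation over generation.

def scan_test_time(test_bits: float, io_pins: int, shift_mhz: float) -> float:
    """Seconds needed to shift test data through the available pins."""
    bandwidth_bits_per_s = io_pins * shift_mhz * 1e6
    return test_bits / bandwidth_bits_per_s

test_bits = 1e9   # assumed initial test-data volume, in bits
io_pins = 64      # scan-accessible pins, held constant
shift_mhz = 50.0  # scan shift frequency, held constant

for generation in range(4):
    t = scan_test_time(test_bits, io_pins, shift_mhz)
    print(f"generation {generation}: {t:.2f} s of scan time")
    test_bits *= 2  # data volume doubles; bandwidth does not
```

A 10x scan-efficiency gain, as in Barnhart et al.'s article, directly shrinks the numerator in this ratio, buying back several generations of data growth.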
Circuit technology stresses fundamentals as well. With every CMOS process generation, background leakage current rises. Besides the obvious problems for battery-operated and other low-power devices, rising leakage currents make another fundamental test technique, IDDQ testing, increasingly difficult. IDDQ test measures the leakage for various circuit states and rejects devices that draw abnormally high current. In early CMOS technologies, defective devices often consumed several orders of magnitude more current than good ones. In today's processes, test engineers must find defect currents that are an order of magnitude smaller than the background leakage current. Daasch's article describes new techniques that let you continue to apply IDDQ test even though conventional IDDQ testing is no longer feasible in many technologies.
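The conventional single-threshold screen described above can be sketched in a few lines (the data and pass limit here are hypothetical, for illustration only; Daasch's statistical refinements are beyond this sketch). The difficulty the paragraph describes is that as background leakage rises toward the limit, a defect's extra current disappears into the noise.

```python
# Minimal sketch of a conventional single-threshold IDDQ screen
# (hypothetical measurements and limit, for illustration only):
# measure quiescent supply current in several circuit states and
# reject any device whose current is abnormally high in any state.

def iddq_pass(measurements_ua: list[float], limit_ua: float) -> bool:
    """A device passes only if every state's quiescent current is under the limit."""
    return all(current < limit_ua for current in measurements_ua)

good_device = [12.0, 11.5, 12.3, 11.8]  # microamps, near background leakage
bad_device = [12.1, 11.9, 480.0, 12.2]  # one state activates a defect

print(iddq_pass(good_device, limit_ua=50.0))  # True
print(iddq_pass(bad_device, limit_ua=50.0))   # False
```

Once background leakage approaches the microamp levels a defect adds, no single limit separates the two populations, which is precisely why more sophisticated approaches are needed.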
Finally, the increasing heterogeneity of circuits also stresses fundamental techniques. In systems on chips (SoCs), designers routinely combine significant memory and analog functionality with extensive digital logic. As communication devices advance and proliferate, RF circuits are also making their way into SoCs. RF has its own set of fundamental test approaches, but these do not typically involve DFT, and some are incompatible with conventional ATE. The march of progress demands a resolution of this situation, and the article by Ozev, Olgaard, and Orailoglu addresses this challenge. They examine the specific problem of testing Bluetooth devices, and they propose various solutions.
Of course, these are not the only fundamentals stressed in test technology today. ITC has over 120 technical articles addressing these areas and others, including ATE architectures and software, defect-oriented test, memory test, and high-performance measurement techniques. We invite you to attend this year's conference, and we hope you enjoy the articles in this special ITC section.
Robert C. Aitken is a design and test methodology manager in Agilent's Semiconductor Products Group. His research interests include testing and diagnosis. Aitken has a PhD in electrical engineering from McGill University in Canada. He is a member of the IEEE and program chair of the 2002 International Test Conference.
Donald L. Wheater is a senior technical staff member at IBM Microelectronics. His research interests include DFT methods and their interaction with ATE, as well as overall test methodologies for complex SoC devices. Wheater has a BS and MS in electrical engineering from Rensselaer Polytechnic Institute. He is a senior member of the IEEE and a past program chair of the International Test Conference.