Issue No. 06 - November/December (2004 vol. 21)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/MDT.2004.88
Rajesh Gupta , Editor in Chief, <em>IEEE Design & Test</em>
Verification in chip design (and, to an extent, in board design) has many components: functional, timing, power, and manufacturability, to name just a few. In recent years, D&T has devoted special issues to examining each of these topics in depth. With advances in these domains, it is increasingly clear that there are synergistic relationships among the models, methods, and tools within and across them. For instance, constraints or assertions, written in a certain way, can also serve as random-simulation drivers for unit-level verification; these in turn can "flip around" to become assertion monitors when you combine units with other units in an "assume-guarantee" verification strategy. Similarly, simulation results can let formal verification tools jump directly into verification hot spots.
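The dual role of an assertion described above can be sketched in a toy Python example. The predicate, unit names, and signal names here are hypothetical illustrations, not any actual tool or standard API: the same Boolean condition first filters random stimulus at the unit level, then checks observed traffic at the system level.

```python
import random

# Hypothetical assertion for a toy bus unit: a grant must never be
# issued while the unit is busy. One predicate, two roles.
def assertion(req: bool, busy: bool, gnt: bool) -> bool:
    return not (gnt and busy)

# Role 1: constrained-random driver. Generate only stimulus that
# satisfies the assertion, i.e. legal inputs for unit-level tests.
def random_legal_stimulus(rng: random.Random):
    while True:
        req, busy, gnt = (rng.random() < 0.5 for _ in range(3))
        if assertion(req, busy, gnt):
            return req, busy, gnt

# Role 2: monitor. At the system level the predicate "flips around"
# and checks every observed transaction instead of generating them.
def monitor(trace) -> bool:
    return all(assertion(req, busy, gnt) for req, busy, gnt in trace)

rng = random.Random(0)
trace = [random_legal_stimulus(rng) for _ in range(100)]
print(monitor(trace))  # True: driver-generated traffic obeys the assertion
```

Real flows express the condition once in a language such as SVA or PSL and let the tools interpret it as constraint or checker; the point here is only that a single declarative property supports both directions.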
Beyond specification, there are also significant algorithmic synergies. For instance, researchers have combined binary decision diagrams (BDDs), Boolean satisfiability (SAT) algorithms, ATPG algorithms, uninterpreted functions, and word-level algorithms in various ways to create hybrid systems that are superior to any individual algorithm. These formal verification (FV) techniques ultimately help the designer prove something, whether it is the equivalence of two models or a theorem about a model. Although such static analysis can reduce the need for exhaustive simulation, the most practical work actually combines static and dynamic verification.
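The core idea behind equivalence checking can be illustrated with a minimal Python sketch. The two functions below are hypothetical stand-ins for a specification and an implementation; a real checker would hand the miter to a SAT solver or BDD package rather than enumerate inputs, which only works for tiny circuits.

```python
from itertools import product

# Two candidate implementations of the same 3-input function.
def spec(a, b, c):
    return (a and b) or (not a and c)   # a 2:1 mux: a ? b : c

def impl(a, b, c):
    return (a and b) or ((not a) and c)  # "optimized" version

# Miter construction: XOR the two outputs; any input driving the
# miter to 1 is a counterexample to equivalence. A SAT solver would
# search this space symbolically; here we simply enumerate.
def check_equivalence(f, g, n_inputs):
    for assignment in product([False, True], repeat=n_inputs):
        if f(*assignment) != g(*assignment):
            return assignment   # counterexample found
    return None                 # proved equivalent

print(check_equivalence(spec, impl, 3))  # None -> equivalent
```

The hybrid systems mentioned above differ mainly in how they search this miter: BDDs build a canonical form of it, SAT and ATPG engines hunt for a satisfying (distinguishing) input, and word-level reasoning abstracts groups of bits at once.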
Over the last couple of years, the verification tools industry has seen two major developments that point to its continued growth. First, some startup-developed verification technologies have matured and been incorporated into mainstream tool flows, as the acquisitions of two US startups (Verplex by Cadence Design Systems and 0-In Design Automation by Mentor Graphics) indicate. At the same time, recent venture capital investments in about a dozen verification startups in the US and Europe validate the industry's growth potential.
Second, standards such as the Open Verification Library (OVL), Property Specification Language (PSL), and SystemVerilog Assertions (SVA), all supported by Accellera, have removed barriers to the adoption of FV technologies. Using these standards, the same assertions work with tools from multiple vendors. Going beyond vendor independence, these standards also provide a degree of methodology independence: the same assertion can drive both static and dynamic verification methodologies.
Driven partly by these advances, the verification focus is expanding beyond functional verification to nonfunctional design characteristics. As future system chips contain increasing amounts of (hardware-dependent) software, it is natural to expect that synergies will expand to include embedded software design and validation methods.
Our guest editors, Carl Pixley and Sharad Malik, carefully selected the four articles in this special issue after a thorough review process. These articles cover synergies between functional verification and manufacturing-test generation through enhancements to equivalence checking; hardware-assisted, fast functional simulation through synthesizable testbenches; and synergies among simulation, test, and emulation, achieved both through advances in testbench interfaces and through an abstraction map that lets a model checker serve as a simulation monitor.
This issue of D&T also contains a diverse set of nontheme articles that should appeal to our broader readership. The covered topics include fault tolerance in FPGAs to cope with soft errors; a learning algorithm that speeds up pre-image computation for model checking; a hardware compressor/decompressor core that can sustain up to 10-Gbps throughput; power savings in circuits through self-calibration of circuit delays; a tool for system-level energy estimation in embedded systems; yield enhancement through controlled redundancy; jitter measurement in high-speed serial links; and self-correcting digital image sensors.
I very much hope that you will enjoy this issue!