Issue No. 4, July/August 2001 (vol. 18), pp. 4-5
Published by the IEEE Computer Society
Functional verification involves proving that an IC design works properly with respect to its specification, protocols, and intended functionality. It does not necessarily guarantee correct operation with respect to timing, power, noise, routability, and so forth; other tools and methodologies address those important concerns.
For example, functional verification techniques can typically certify that

    • multipliers and adders give the correct answer based on a particular standard,

    • a register-transfer-level (RTL) model and a gate-level model are equivalent,

    • a bus protocol (such as PCI, AMBA, or RapidIO) is implemented according to its standard,

    • CPUs properly implement their instruction set architectures (ISAs),

    • digital signal processors correctly calculate their intended functions, and

    • distributed-memory designs maintain cache coherency.

There are three main ways to functionally verify designs: simulation, emulation, and formal verification. This special issue focuses on formal verification. However, all three approaches work from a representation (in fact, many representations) of the design under verification (DUV). The DUV is usually expressed as a Verilog or VHDL (VHSIC hardware description language) design, or possibly as a program in a higher-level language such as C, C++, or a variant like SystemC. DUVs are represented at several different abstraction levels: register transfer, gate, transistor, and circuit. This variety of models suggests that there are two basic questions to verify about an IC (a small sketch of both follows the list below):

    • equivalence of models, such as whether the gate model implements the RTL correctly; and

    • properties of models, such as whether a design maintains cache coherency or whether multipliers give the correct result.
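
As a toy illustration of these two questions, the following Python sketch compares a behavioral (RTL-style) 2-bit adder against a gate-level-style ripple-carry implementation, and then checks a simple property of the behavioral model. The models, bit width, and names are invented for illustration; real checkers work on HDL netlists with symbolic techniques rather than exhaustive enumeration.

WIDTH = 2
MASK = (1 << WIDTH) - 1

def rtl_adder(a, b):
    # Behavioral, RTL-style model: add modulo 2**WIDTH.
    return (a + b) & MASK

def gate_adder(a, b):
    # Gate-level-style model: ripple-carry adder built from Boolean operations.
    carry, result = 0, 0
    for i in range(WIDTH):
        x, y = (a >> i) & 1, (b >> i) & 1
        result |= (x ^ y ^ carry) << i
        carry = (x & y) | (carry & (x | y))
    return result

# Question 1: equivalence of models (exhaustive here; real checkers use BDDs or SAT).
assert all(rtl_adder(a, b) == gate_adder(a, b)
           for a in range(1 << WIDTH) for b in range(1 << WIDTH))

# Question 2: a property of a model, for example commutativity of the adder.
assert all(rtl_adder(a, b) == rtl_adder(b, a)
           for a in range(1 << WIDTH) for b in range(1 << WIDTH))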

As chip designers implement ever more cores, memories, buses, and peripherals on a chip, the complexity of ICs is being pushed beyond 10 million transistors. Functionally verifying a chip consumes ever more design resources, both people and CPU cycles. This special issue addresses this problem from the perspective of formal verification. Of course, describing all the tools and methodologies for functionally verifying chips is impossible in one issue, so we highlight just a few important developments in the field.
Equivalence Checking
First, Hewlett-Packard's Harry Foster describes how to use a Boolean equivalence checker to verify the equivalence of RTL and gate-level models. Equivalence-checking tools have become standard in the industry, and all major CAD vendors offer them. Today's tools focus on comparing RTL, gate-level, and switch-level models. They are based on techniques such as binary decision diagrams (BDDs), Boolean satisfiability (SAT) algorithms, and automatic test-pattern generation (ATPG) algorithms.
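To give a flavor of what such a checker decides, the following Python sketch builds a miter, that is, the XOR of the two implementations' outputs, and shows that it can never evaluate to 1. The two multiplexer functions are hypothetical stand-ins for an RTL and a gate-level netlist, and the exhaustive enumeration stands in for the BDD, SAT, or ATPG engines that production tools use.

from itertools import product

def rtl_mux(a, b, sel):
    # RTL view: if/else selection.
    return a if sel else b

def gate_mux(a, b, sel):
    # Gate-level view: the same multiplexer built from AND, OR, and NOT.
    return (a & sel) | (b & (sel ^ 1))

def miter(a, b, sel):
    # The miter output is 1 exactly when the two implementations disagree.
    return rtl_mux(a, b, sel) ^ gate_mux(a, b, sel)

# Equivalence checking asks whether any input can make the miter output 1.
counterexamples = [bits for bits in product((0, 1), repeat=3) if miter(*bits)]
print("equivalent" if not counterexamples else f"mismatch at {counterexamples[0]}")
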
For the future, several university research projects aim to elevate the level of models that equivalence checkers can handle, such as the work by Randy Bryant and his students at Carnegie Mellon University on comparing ISA models to RTL designs. Unfortunately, there is little work on how to formally compare abstract models in C or C++ to RTL models.
Property Checking
Two articles in this issue address the problem of property checking. The first, by Intel's Robert Jones et al., describes a methodology that combines symbolic trajectory evaluation (STE), a form of symbolic simulation, with theorem proving to verify the correctness of arithmetic units. The authors have developed a highly sophisticated verification system and deployed it on several ICs. The second article, by Motorola's Narayanan Krishnamurthy et al., discusses how to make STE usable in a commercial design environment. STE has successfully uncovered bugs in the memory arrays of several PowerPC designs.
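To convey the flavor of symbolic simulation, which underlies STE, the following sketch drives a small combinational circuit with symbolic values, so that a single "run" yields an expression covering every concrete input at once. This is only a toy: real STE engines represent values with BDDs over a lattice that includes an unknown value and verify assertions about trajectories over time, none of which appears here.

from itertools import product

# Symbolic values are expression strings; each gate builds a larger expression.
def AND(x, y): return f"({x} & {y})"
def OR(x, y):  return f"({x} | {y})"
def NOT(x):    return f"(1 - {x})"

# One symbolic simulation of a 2-to-1 multiplexer: out = (sel AND a) OR (NOT sel AND b).
out = OR(AND("sel", "a"), AND(NOT("sel"), "b"))
print("symbolic output:", out)

# The single symbolic result covers all eight concrete input combinations;
# here we check it against the intended behavior (a if sel else b).
for sel, a, b in product((0, 1), repeat=3):
    assert eval(out, {"sel": sel, "a": a, "b": b}) == (a if sel else b)
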
Simulation Coverage
Because this issue focuses on formal verification, it does not concentrate on simulation as a verification technique, even though simulation is still by far the most prevalent method used in industry. However, Serdar Tasiran from Compaq and Kurt Keutzer from Berkeley present a survey of the various ways to measure simulation coverage. Interestingly, there is no single, generally accepted best coverage measure; in fact, there is no known mathematical theory of simulation coverage. Nevertheless, the article discusses the coverage measures in common use.
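One measure such a survey covers is toggle coverage: the fraction of signals that a given test set has driven to both 0 and 1. The minimal sketch below computes it for a hypothetical three-signal design; the DUV, the signal names, and the test vectors are all invented for illustration.

def duv(a, b, sel, observe):
    # Toy design under verification with three named internal signals.
    n1 = a & sel
    n2 = b & (sel ^ 1)
    out = n1 | n2
    for name, value in (("n1", n1), ("n2", n2), ("out", out)):
        observe(name, value)
    return out

def toggle_coverage(tests):
    # Fraction of observed signals that took both the value 0 and the value 1.
    seen = {}
    def observe(name, value):
        seen.setdefault(name, set()).add(value)
    for a, b, sel in tests:
        duv(a, b, sel, observe)
    return sum(1 for values in seen.values() if values == {0, 1}) / len(seen)

print(toggle_coverage([(0, 0, 0), (0, 0, 1)]))              # 0.0: nothing toggles
print(toggle_coverage([(0, 0, 0), (1, 1, 1), (1, 1, 0)]))   # 1.0: every signal toggles
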
Conclusion
Another important area this issue does not focus on is model-checking and semiformal verification tools. Vendors have released several products, such as Verplex's BlackTie, 0-In's Search, Real Intent's Verix, Averant's Solidify, and Cadence's FormalCheck, and Synopsys has announced an offering called Ketchum.
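For readers unfamiliar with these tools, the core idea of model checking is to explore every reachable state of a finite-state model and check a property in each one. The sketch below does this explicitly for a hypothetical two-requester arbiter, checking mutual exclusion of the grants; commercial tools work symbolically on real RTL and scale far beyond explicit enumeration.

from collections import deque

def next_states(state):
    # Toy arbiter: next-cycle grants follow current requests; requester 0 has priority.
    req0, req1, gnt0, gnt1 = state
    for new_req0 in (0, 1):
        for new_req1 in (0, 1):
            new_gnt0 = req0
            new_gnt1 = req1 & (1 - new_gnt0)
            yield (new_req0, new_req1, new_gnt0, new_gnt1)

def check_invariant(initial, invariant):
    # Breadth-first reachability: return a reachable state violating the invariant, if any.
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if not invariant(state):
            return state
        for succ in next_states(state):
            if succ not in seen:
                seen.add(succ)
                frontier.append(succ)
    return None

# Mutual exclusion: the arbiter never asserts both grants in the same cycle.
violation = check_invariant((0, 0, 0, 0), lambda s: not (s[2] and s[3]))
print("property holds" if violation is None else f"counterexample: {violation}")
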
Verification has become an exciting discipline, as more and more formal and semiformal tools and methodologies become available.

Carl Pixley is the manager of design verification at Motorola Semiconductor Products Sector's Architecture and Systems Platforms group. His research interests include formal and semiformal verification, sequential equivalence, and synthesis. Pixley has a BS in mathematics from the University of Omaha, an MS in mathematics from Rutgers, and a PhD in mathematics from the State University of New York at Binghamton. He is a member of the IEEE and the Mathematical Association of America.