Formal verification poised for rapid growth

By Dr Pranav Ashar  |  Posted: June 1, 2005
Topics/Categories: EDA - Verification

Design teams are increasingly concerned about the growing disparity between the capacity of silicon in the latest processes and the design and verification capabilities of simulation tools. A number of trends are now converging to enable a step-function increase in verification capability that complements and extends the debug and verification features of HDL simulators.

First, Moore’s Law continues to deliver ever more powerful computers and other integrated circuits. But today these ICs are so complex that designing one quickly and without errors is increasingly difficult. At the same time, computers built in the same technology can execute more complex algorithms and handle larger volumes of data than previous generations. These advances make sophisticated algorithms practical, enabling new design and verification technologies and, consequently, more robust and productive flows.

Second, the EDA industry is moving towards more open, broad-based standards. The emergence of dedicated verification languages has made testbench development and debug a less onerous task. These languages improve verification coverage and help convert ambiguous and incomplete specifications into testable statements.

Third is the emergence of static formal tools that take advantage of increased computational power and language standards, coupled with designers’ growing familiarity with basic formal capabilities such as equivalence checking. The latest generation of formal verification tools has established a working environment that is conducive to further inroads for formal methods. One advance of note here is the development of Boolean Satisfiability-based Bounded Model Checking (SAT-based BMC) methods. These enable an acceptable tradeoff between run time and memory requirements, as well as between the roles of formal tools as proof engines and as bug hunters.
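To make the bounded model checking idea concrete, here is a minimal sketch rather than the algorithm of any particular commercial tool: a toy two-bit counter is unrolled for a fixed number of cycles and a SAT/SMT solver is asked whether a simple property can be violated within that bound. The example assumes the open-source z3-solver Python bindings are installed; production BMC engines work on the design’s actual gate-level or RTL transition relation.

```python
# Bounded model checking sketch: unroll a toy 2-bit counter for K cycles and
# ask the solver whether the property "count never equals 3" can fail within
# that bound. Assumes the z3-solver package (pip install z3-solver); the
# counter and property are invented for illustration.
from z3 import Bool, Solver, And, Or, Not, Xor, sat, is_true

K = 8  # unrolling depth (number of clock cycles to explore)

def bit(name, t):
    # One Boolean variable per state bit per cycle.
    return Bool(f"{name}_{t}")

s = Solver()

# Initial state: the 2-bit counter (b1 b0) resets to 00 at cycle 0.
s.add(Not(bit("b0", 0)), Not(bit("b1", 0)))

# Transition relation, unrolled K times: the counter increments each cycle.
for t in range(K):
    s.add(bit("b0", t + 1) == Not(bit("b0", t)))                 # b0' = ~b0
    s.add(bit("b1", t + 1) == Xor(bit("b1", t), bit("b0", t)))   # b1' = b1 ^ b0

# Property: the counter never reaches the value 3 (b1=1, b0=1). BMC searches
# for a counterexample within the bound, so we assert the property's negation
# at some cycle <= K and ask the solver for a satisfying trace.
violation = Or(*[And(bit("b1", t), bit("b0", t)) for t in range(K + 1)])
s.add(violation)

if s.check() == sat:
    m = s.model()
    fail_cycle = next(t for t in range(K + 1)
                      if is_true(m.eval(And(bit("b1", t), bit("b0", t)))))
    print(f"Counterexample found: counter reaches 3 at cycle {fail_cycle}")
else:
    print(f"Property holds for all traces up to {K} cycles")
```

In a real flow the solver either returns a concrete failing trace, which serves directly as a debug vector, or proves the property unreachable within the bound, which is where the run-time/memory tradeoff mentioned above comes in.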

Figure 1. Pressure points

Standardization of design and verification languages allows portability between static and dynamic tools, as well as between different static tools. Assertion languages and libraries such as PSL (Property Specification Language), SVA (SystemVerilog Assertions) and OVL (the Open Verification Library) are recognized by both simulators and static tools. Common input formats make it easier to correlate results across tools, an important and necessary step towards acceptance among users. This portability means assertions generated at disparate sources can be aggregated into the verification methodology, reducing redundant test-case generation while maintaining and improving coverage across design transformations. These languages also have a growing cadre of designers who are comfortable creating and using properties and assertions as part of specification evaluation and design-goal inference.
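As a concrete illustration of what a “testable statement” looks like, the sketch below checks a handshake rule of the kind typically written in SVA or PSL (“every request must be granted within three cycles”) against a recorded simulation trace. The signal names and trace data are invented for illustration; in a real flow the property would be written directly in an assertion language and consumed natively by simulators and formal tools.

```python
# A requirement such as "every request is granted within 3 cycles" is vague in
# an English spec but becomes a precise, testable statement once written as an
# assertion. This is a Python stand-in for an SVA/PSL property, checked against
# a recorded simulation trace (invented data).

MAX_LATENCY = 3  # cycles allowed between req and gnt

def check_req_gnt(trace, max_latency=MAX_LATENCY):
    """Return the cycles at which a request was not granted in time."""
    failures = []
    for i, cycle in enumerate(trace):
        if cycle["req"]:
            window = trace[i + 1 : i + 1 + max_latency]
            if not any(c["gnt"] for c in window):
                failures.append(i)
    return failures

# A toy trace: one well-behaved request at cycle 0, one violation at cycle 4.
trace = [
    {"req": 1, "gnt": 0},
    {"req": 0, "gnt": 0},
    {"req": 0, "gnt": 1},   # granted 2 cycles after the request: OK
    {"req": 0, "gnt": 0},
    {"req": 1, "gnt": 0},
    {"req": 0, "gnt": 0},
    {"req": 0, "gnt": 0},
    {"req": 0, "gnt": 0},   # never granted: the assertion fails for cycle 4
]

print("violations at cycles:", check_req_gnt(trace))
```

The same statement, once captured in PSL or SVA, can be monitored during simulation or proved or refuted by a static tool, which is exactly the portability the standards provide.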

Starting from scratch

Historically, design efforts and tools have addressed the critical resources and design goals of the day, with new tools emerging as those constraints change. In the 1980s, transistors and very low-level block timing (clock-to-out) were the main design constraints, and the HDLs and synthesis tools focused on these issues. In the 1990s, gates, interconnect and register-level timing were key. While simulators and synthesis tools remained, the complex timing and verification of the more highly optimized transformations performed by synthesis led to the introduction of static timing analysis (STA) and equivalence checkers into the design flow. STA changed timing verification from vector-based dynamic simulation to static analysis, the first break in the dynamic-simulation hegemony. Equivalence checkers confirmed the functionality of the design after synthesis transformations without extensive simulation. Together, these tools not only cut verification time substantially, but also proved that static analysis tools could replace dynamic simulation in large and vital areas of verification.

At the beginning of the 21st century, the design goals are throughput, functionality, power, protocols and complex transactions. In many cases, RTL simulation tools are increasingly unable to address the verification needs of the design: they focus on the fine details of internal signal events, while the design needs to be checked as part of a data-flow engine. For example, simulators have difficulty checking signals that cross timing or hierarchy boundaries, and fault conditions that arise from interactions between multiple blocks, such as timing races, clock-data skew and the complex exceptions to protocols or transactions. The increasing use of standard interfaces such as PCI or USB, and the inclusion of large amounts of memory and cache, make pure HDL simulation difficult.

Design teams are developing alternative strategies and methodologies. For some designs, the answer to the exponentially growing number of test vectors has been to set up server farms and multiple simulator licenses so that as many verification jobs as possible run in parallel. But this approach faces some fundamental, designer-intensive roadblocks. Simulation cannot start until the design of a block is fairly complete, so sub-blocks tend to get minimal checking. Even once the blocks are entered, they must wait for the designer to create the testbench that will find the events that break the design. Testbench development and debug now take considerably more time than the design being checked.

Once the blocks and testbenches are sufficiently complete, the simulation runs can commence. The design size and the interactions between blocks mean that the verification team needs to run billions or even trillions of vectors to cover all possible operating conditions. Because directed testbenches cannot achieve full coverage, verification teams supplement them with random and directed-random data streams. These additional streams, however, exacerbate the data-volume problem, since designers need to run many corner cases to find bugs that the nominal vectors may not catch. The process then requires additional coverage tools to indicate whether a reasonable level of completion has been reached, because the random vectors are not in themselves observable. In addition, to get the most from a large batch of server jobs, someone must partition the design into parallel stages and monitor data exchanges across blocks, since some simulations need results from other runs before they can proceed.

By comparison, the latest generation of formal analysis tools is very comfortable with smaller blocks before the full design is complete. At any level, formal analysis can exhaustively check a design without the need for vectors and confirm the functionality of a logic block. The designer need only describe the correct functionality of the design by writing properties; the tools then check, unprompted, for desired or illegal behaviors. This eliminates the need to simulate exhaustively to find the operating conditions that break a design. These tools automatically check corner conditions, even those that are not easily considered in simulation.
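The following toy comparison, with an invented “design” and reference model, illustrates the corner-case argument: a bug that fires only for one specific input combination can easily slip past thousands of random vectors, while an exhaustive sweep of the input space, which is what formal analysis achieves symbolically, finds it immediately.

```python
# Toy illustration only: a "design" with a bug that fires solely when both
# 8-bit inputs are 0xFF. Random vectors have a good chance of never exercising
# that corner, while an exhaustive sweep of the 65,536-entry input space finds
# it at once. Formal tools achieve the exhaustive result symbolically rather
# than by enumeration.
import random

def buggy_adder(a, b):
    # Intended behavior: (a + b) mod 256 plus a carry-out flag.
    if a == 0xFF and b == 0xFF:
        return (0, 0)              # bug: carry-out dropped in this one corner
    s = a + b
    return (s & 0xFF, s >> 8)

def reference_adder(a, b):
    s = a + b
    return (s & 0xFF, s >> 8)

# Random simulation: 10,000 vectors may well miss the single failing corner.
random.seed(0)
random_hits = sum(
    buggy_adder(a, b) != reference_adder(a, b)
    for a, b in ((random.randrange(256), random.randrange(256))
                 for _ in range(10_000))
)

# Exhaustive check over all 2^16 input pairs always exposes the bug.
exhaustive_hits = sum(
    buggy_adder(a, b) != reference_adder(a, b)
    for a in range(256) for b in range(256)
)

print(f"random vectors found {random_hits} mismatch(es)")
print(f"exhaustive check found {exhaustive_hits} mismatch(es)")
```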

Figure 2. This generic RTL flow, with assertions, shows the insertion points for various formal techniques.

Within the limits of their capacity, such tools can check for faults across boundaries, domains and hierarchies. Moreover, the latest generation of formal analysis tools can automatically extract assertions for use with other tools and, in conjunction with verification intellectual property (IP), can check protocols and transactions. Altogether, these capabilities alleviate the data-volume and simulation-time problems, and minimize the need for extensive testbench creation and debug (see Figure 2).

Through the pioneering efforts of a handful of start-up companies such as Real Intent, static assertion-based verification (ABV) technology has crossed a significant threshold and is poised to provide significant value as a companion to simulation. The latest generation of formal tools, such as Real Intent’s Verix IIV and CIV, are fully automatic and do not require the writing of any assertions, so they can be used out of the box to replace a significant portion of the simulation effort. The assertions these tools identify are straightforward enough to be extracted by a structural analysis of the design RTL, yet can highlight complex design bugs and require the full power of formal analysis to prove or disprove.
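As a rough, hypothetical sketch of what structural assertion extraction can look like (it does not represent Real Intent’s actual algorithms), the snippet below scans an invented register description for state registers whose assigned encodings suggest one-hot operation and emits a corresponding SVA-style check as output.

```python
# Deliberately simplified sketch of "implied intent" extraction: look for
# registers whose assigned values are all one-hot and emit an assertion for
# them. The register data structure and heuristic are invented for
# illustration; real tools derive such checks from the RTL itself.

registers = [
    {"name": "fsm_state", "width": 4, "reset": 0b0001,
     "assigned_values": [0b0001, 0b0010, 0b0100, 0b1000]},
    {"name": "byte_count", "width": 8, "reset": 0,
     "assigned_values": None},   # free-running counter: no fixed encoding set
]

def is_one_hot(value):
    # Exactly one bit set.
    return value != 0 and (value & (value - 1)) == 0

def extract_assertions(regs):
    assertions = []
    for reg in regs:
        vals = reg["assigned_values"]
        if vals and all(is_one_hot(v) for v in vals):
            assertions.append(
                f"assert property (@(posedge clk) $onehot({reg['name']}));"
            )
    return assertions

for a in extract_assertions(registers):
    print(a)
```

The point is that the extraction itself is structurally simple, but proving the resulting assertion over all reachable states is exactly the kind of work that needs a formal engine.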

Even though simulation continues to be a major component of verification, standards enable IC companies to pick best-of-breed dynamic and static verification tools from different vendors. The hybrid approach of using common assertions across static and dynamic tools is, in fact, very similar to using both a simulator and STA for timing closure and for constraint verification and management.

Furthermore, designers are better educated today in PSL/SVA/OVL, and already familiar with the use of formal equivalence checkers such as Formality from Synopsys and Conformal from Cadence Design Systems. At the same time, the emergence of verification IP for standard function blocks frees up additional resources while ensuring compliance with standards. Real Intent has three formal tools to support ABV:

Verix-IIV: Using Implied Intent Verification, this product automatically extracts a comprehensive set of design assertions from an RTL design and formally verifies them.

Verix-CIV: Here, Clock Intent Verification automatically detects synchronization errors commonly associated with multiple-clock-domain designs. It catches the most common design risks that are an impediment to safe multi-clock designs (a simplified structural sketch of this kind of check appears after this list).

Verix-EIV: This is based on Expressed Intent Verification. By writing their own design assertions, users can verify complex static and temporal design behaviors and detect deep errors without the need for testbenches and test vectors. User assertions are written in standard assertion formats – such as SVA, PSL or OVL – that can be run natively by most standard simulators.
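The sketch below gives a highly simplified, hypothetical picture of the kind of structural clock-domain-crossing check referred to in the Verix-CIV item above: it walks flop-to-flop connections in an invented netlist and flags crossings between clock domains that do not land on a synchronizer flop. The netlist format and heuristic are assumptions made for illustration; commercial clock-intent tools are far more thorough.

```python
# Simplified structural CDC check over a toy netlist.
# Each flop: name -> clock it is driven by, the flops it fans out to, and
# whether it is recognized as part of a synchronizer cell.
flops = {
    "tx_data":    {"clock": "clk_tx", "fanout": ["rx_capture"], "sync": False},
    "rx_capture": {"clock": "clk_rx", "fanout": ["rx_data"],    "sync": False},
    "tx_flag":    {"clock": "clk_tx", "fanout": ["sync_ff1"],   "sync": False},
    "sync_ff1":   {"clock": "clk_rx", "fanout": ["sync_ff2"],   "sync": True},
    "sync_ff2":   {"clock": "clk_rx", "fanout": ["rx_flag"],    "sync": True},
    "rx_flag":    {"clock": "clk_rx", "fanout": [],             "sync": False},
    "rx_data":    {"clock": "clk_rx", "fanout": [],             "sync": False},
}

def find_unsynchronized_crossings(netlist):
    """Report source->destination flop pairs that cross clock domains without
    landing on a flop marked as part of a synchronizer."""
    violations = []
    for src, info in netlist.items():
        for dst in info["fanout"]:
            dst_info = netlist[dst]
            if info["clock"] != dst_info["clock"] and not dst_info["sync"]:
                violations.append((src, dst, info["clock"], dst_info["clock"]))
    return violations

for src, dst, c1, c2 in find_unsynchronized_crossings(flops):
    print(f"unsynchronized crossing: {src} ({c1}) -> {dst} ({c2})")
```

In this toy example the tx_flag path is left alone because it passes through a two-flop synchronizer, while the direct tx_data-to-rx_capture path is flagged, which is precisely the class of error that escapes ordinary RTL simulation.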

The growing number of static verification tools from both large and small EDA companies suggests that an increasing number of very visible IC companies are already adopting the technique as a sign-off component in some part of their verification methodology. Some have established policies requiring that all RTL be run through Verix IIV and CIV before being checked into the common code base. One verification engineer reported that a large proportion of their bugs are found with IIV. Anecdotal responses in forums such as ESNUG (www.deepchip.com) and the Verification Guild (verificationguild.com) indicate a growing acceptance of formal tools and their verification capabilities.

