Formal verification

Formal verification is the overarching term for a collection of techniques that use static analysis, based on mathematical transformations, to determine the correctness of hardware or software behavior, in contrast to dynamic verification techniques such as simulation.

As design sizes have increased and with them simulation times, verification teams have looked for ways to reduce the number of vectors needed to exercise the system to an acceptable degree of coverage. Formal verification is potentially very fast because it does not have to evaluate every possible state to demonstrate that a given piece of logic meets a set of properties under all conditions. However, its performance depends greatly on the type of logic on which it is deployed and the way it is applied.
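
As a concrete illustration of that difference, the sketch below uses an SMT solver to prove a property of a toy saturating adder for all 65,536 pairs of 8-bit operands without simulating a single vector. It is only a sketch, assuming the open-source z3-solver Python package; real formal tools work directly on RTL and combine many such engines.

    from z3 import BitVec, BitVecVal, If, Or, Solver, ULT, unsat

    WIDTH = 8
    MAX = BitVecVal(2 ** WIDTH - 1, WIDTH)

    a = BitVec('a', WIDTH)
    b = BitVec('b', WIDTH)

    # Saturating add: bit-vector addition wraps, so detect the wrap and clamp.
    total = a + b
    overflow = ULT(total, a)            # unsigned wrap-around means overflow
    result = If(overflow, MAX, total)

    # Property: the saturated result is never (unsigned) smaller than either
    # operand. Ask the solver for a counterexample; 'unsat' means none exists
    # for any operand pair, without evaluating them one by one.
    s = Solver()
    s.add(Or(ULT(result, a), ULT(result, b)))
    print("property proven" if s.check() == unsat else s.model())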

Security concerns and communications issues such as deadlock within on-chip networks have encouraged users to adopt formal verification in more focused ways. For example, formal verification tools such as the security app created by Jasper Design Automation can check, more efficiently than simulation, for sneak paths in logic that might compromise security. In the case of focused formal verification, a number of mathematical algorithms are packaged together by the vendor using scripts that target certain types of problem. These can be delivered as easier-to-use tools than the more flexible products aimed at block and SoC verification.

Equivalence checking

Some forms of formal verification are already widespread in design. Equivalence checking has been used for more than a decade to check that the RTL and gate-level descriptions of a design represent the same functionality. Equivalence checking was introduced in response to the problem of larger designs exceeding the effective capacity of gate-level simulation tools, and quickly took over from hardware-acceleration solutions as well as software-only gate-level simulators. For users, equivalence checking is relatively easy to use in the way it has been packaged by vendors, in tools such as Formality from Synopsys.
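
Conceptually, an equivalence checker shows that two views of the same logic can never disagree for any input. The hypothetical sketch below does this for a one-bit full adder by brute-force enumeration; commercial tools reach the same verdict symbolically, using BDDs or SAT, which is what lets them scale to full netlists.

    from itertools import product

    def rtl_adder(a, b, cin):
        """Behavioural (RTL-style) view: plain integer addition."""
        total = a + b + cin
        return total & 1, total >> 1          # (sum, carry)

    def gate_adder(a, b, cin):
        """Gate-level view: XOR/AND/OR netlist of a full adder."""
        s = a ^ b ^ cin
        carry = (a & b) | (cin & (a ^ b))
        return s, carry

    # Exhaustively compare the two views; a real equivalence checker reaches
    # the same verdict without enumerating the inputs.
    mismatches = [v for v in product((0, 1), repeat=3)
                  if rtl_adder(*v) != gate_adder(*v)]
    print("equivalent" if not mismatches else f"mismatch at {mismatches[0]}")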

Equivalence checking has moved beyond SoC RTL design, migrating into FPGA design because of the use of very large devices and the time it takes to compare simulation with hardware given the limited internal visibility that a programmed FPGA offers. Through tools such as SLEC from Calypto Design Systems, equivalence checking is also used to check the functional equivalence of ESL and RTL descriptions of a block.

Usage at the system level is steadily making its way across the SoC industry. Intel, for example, used a form of equivalence checking to verify the RTL implementation of a cache-coherency protocol against its definition; the definition itself had been tested for correctness using the more complex form of formal verification: model checking.

Model checking

Model checking is more complex to deploy, as it relies not only on user-generated assertions but also on the correct selection of algorithms for a given problem. Early formal verification tools ran into difficulties on logic such as multipliers because their solvers, then largely based on the same binary decision diagrams (BDDs) used in early equivalence-checking tools, quickly ran out of memory. It took a change in the way that multipliers could be represented to make the job of verifying them formally tractable.
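
At its core, a model checker explores every state the design or protocol can reach and checks each one against the properties of interest, producing a counterexample trace when a property fails. The sketch below is a minimal explicit-state checker applied to a toy strict-alternation mutual-exclusion protocol; the protocol, states and invariant are invented for illustration, and industrial tools represent the state space symbolically rather than enumerating it.

    from collections import deque

    # Toy strict-alternation mutual-exclusion protocol (illustrative only).
    # A state is (pc0, pc1, turn); pc is 'N' (non-critical) or 'C' (critical).
    INIT = ('N', 'N', 0)

    def successors(state):
        pc0, pc1, turn = state
        nxt = []
        if pc0 == 'N' and turn == 0:
            nxt.append(('C', pc1, turn))   # process 0 enters when it holds the turn
        if pc0 == 'C':
            nxt.append(('N', pc1, 1))      # process 0 leaves and hands the turn over
        if pc1 == 'N' and turn == 1:
            nxt.append((pc0, 'C', turn))   # process 1 is symmetric
        if pc1 == 'C':
            nxt.append((pc0, 'N', 0))
        return nxt

    def mutual_exclusion(state):
        pc0, pc1, _ = state
        return not (pc0 == 'C' and pc1 == 'C')

    def check(init, invariant):
        """Breadth-first reachability: visit every reachable state once and
        check the invariant, returning a counterexample trace on failure."""
        parent, queue = {init: None}, deque([init])
        while queue:
            state = queue.popleft()
            if not invariant(state):
                trace = []                 # rebuild the path back to the initial state
                while state is not None:
                    trace.append(state)
                    state = parent[state]
                return list(reversed(trace))
            for nxt in successors(state):
                if nxt not in parent:
                    parent[nxt] = state
                    queue.append(nxt)
        return None

    cex = check(INIT, mutual_exclusion)
    print("invariant holds on all reachable states" if cex is None
          else f"counterexample trace: {cex}")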

Other types of logic and datapath presented problems for BDDs, leading to a proliferation of formal verification methods, which slowed adoption. Users had to become experts not just in using the tools but in the theory of formal verification in order to pick the right tool for each job. The EDA companies responded by building better heuristics into their tools to recognise types of logic and then pick the best solver for the job. Capacity remains an issue, so formal techniques are still largely focused on block-level or protocol verification. However, techniques developed to help bring formal into the mainstream are now widely used in simulation-based verification.

Originally, formal verification tools needed mathematical descriptions to provide something to prove. These were difficult to put together even for people with experience of the underlying theory. The response from the computer and EDA industries was to develop languages that could describe the desired behaviour in terms familiar to a hardware engineer but which could be converted internally by a tool into a mathematical description.

The Property Specification Language (PSL) was developed – initially by IBM as the Sugar language – to provide the assertions used by formal tools, but the types of assertion supported by PSL are now a widely used accompaniment to simulation. Designers insert assertions into their code to tell users how a block should be used and to test for violations of those conditions. For example, an assertion may check that an acknowledgement signal follows a request after no more than ten clock cycles. Although the syntax of the assertions presents a learning curve, they are easier to handle than mathematical expressions.
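
A trace-level sketch of that example is shown below: it walks a list of per-cycle (req, ack) samples and flags any request that is not acknowledged within the window, which is roughly what a simulator does when evaluating a bounded-response assertion (in SystemVerilog assertion terms, something like req |-> ##[1:10] ack). A formal tool instead tries to prove that no trace violating the property exists. The function and trace here are purely illustrative.

    def check_req_ack(trace, window=10):
        """trace: list of (req, ack) samples, one per clock cycle.
        Returns the cycle of the first unacknowledged request, or None."""
        for cycle, (req, _) in enumerate(trace):
            if req:
                follow = trace[cycle + 1 : cycle + 1 + window]
                # A request too close to the end of the trace counts as a
                # violation here, since no acknowledgement was observed.
                if not any(ack for _, ack in follow):
                    return cycle
        return None

    # Toy trace: the request in cycle 0 is acknowledged in cycle 3.
    trace = [(1, 0), (0, 0), (0, 0), (0, 1), (0, 0)]
    print(check_req_ack(trace))        # None -> the assertion holds on this trace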

SoC-level formal verification

As formal verification has yet to arrive in a form that can test the entire behavior of an SoC, it needs to be used as part of a wider verification strategy that will include simulation and, most likely for large designs, emulation. Coverage management is an effective mechanism for integrating formal verification into a wider flow as, through the selection of appropriate coverage points, it is possible to steer simulation vectors away from parts of the logic that have already been effectively tested by formal techniques.

Conversely, the results of simulation and emulation driven by existing software and firmware can outline parts of the design that need to be exercised more thoroughly, potentially by formal techniques. Having incorporated multipurpose formal tools into their portfolios, the major vendors have developed environments and methodologies that have become increasingly adept at supporting these procedures. For example, Cadence Design Systems and Mentor Graphics have built formal technology into their respective verification environments Incisive and Questa.

In some cases, formal and dynamic technology are incorporated into one tool. An example is Magellan from Synopsys, which couples formal engines with the VCS simulator. The idea behind Magellan is to find bugs that are buried deep within a pipeline and thus would take many cycles to trigger using just simulation.

Formal verification is increasingly being used to support the acquisition of IP cores and during SoC integration for specific tasks. These applications are examples of modular formal verification, in which tools are built from core formal algorithms and coupled with scripts tuned for a specific purpose or integrated into software tools.

Focused formal verification

Clock domain crossing (CDC) is a good example of an area of growing importance. CDC has been with us for a long time, but it is the proliferation of clocks in today’s SoCs that has seen it rise from, say, a third-order to a first-order issue. Designs with a few clocks might have been addressed in simulation; those that now have hundreds are better verified in this respect beforehand.

X-propagation is another application of formal techniques, and a good example of something that presents problems for simulation. More complex designs are throwing out more unknown states, which may lead to bugs being misidentified or missed.
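
The illustrative sketch below shows the effect on a two-to-one multiplexer evaluated in three-valued (0/1/X) logic: the gate-level view propagates an unknown select to the output, while the RTL-style if/else, as event-driven simulators commonly interpret it, silently picks one branch and reports a known value. This X-optimism is exactly the kind of masking that formal X-propagation checks are designed to expose.

    X = 'X'   # unknown value in three-valued (0/1/X) logic

    def and3(a, b):
        if a == 0 or b == 0:
            return 0
        return 1 if a == 1 and b == 1 else X

    def or3(a, b):
        if a == 1 or b == 1:
            return 1
        return 0 if a == 0 and b == 0 else X

    def not3(a):
        return X if a == X else 1 - a

    def mux_gates(sel, a, b):
        """Gate-level mux: an X on the select propagates to the output."""
        return or3(and3(sel, a), and3(not3(sel), b))

    def mux_rtl(sel, a, b):
        """RTL-style if/else: an X select is simply treated as 'false',
        so the output looks like a known value."""
        return a if sel == 1 else b

    sel, a, b = X, 1, 0
    print(mux_gates(sel, a, b))   # X -> the unknown is visible
    print(mux_rtl(sel, a, b))     # 0 -> X-optimism: the unknown is masked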

Such targeted use of formal verification has important advantages:

  • It addresses areas that are high priority for verification and design engineers (many of the modules are designed for use by both).
  • As it targets more specific areas, there is greater confidence in its ability and capacity to catch related bugs — a confidence that has built rapidly following successful deployments of these modules.
  • It hides much of the technological complexity that underpins formal verification, so the user does not need to be even semi-expert in the technique itself.

These factors combine to increase confidence in formal verification’s ability to operate at the holistic level, spurring its broader adoption.

A major endorsement of this from a Big 3 vendor came in Spring 2014, when Cadence Design Systems struck a deal to acquire Jasper Design Automation. Jasper is the company that has most aggressively followed a strategy of focusing on specific problems, building a series of formal verification ‘apps’ under the JasperGold brand. As well as X-propagation checks, individual apps check, among other things: on-chip connectivity; issues that are typically encountered in low-power designs; and, most recently, potential logic sneak paths that may compromise the security properties of parts of an SoC that are meant to be protected.
