Coverage-driven verification for the analog domain

By Monia Chiavacci |  No Comments  |  Posted: September 1, 2006

The verification of digital sub-systems is based on advanced techniques such as constraints capture, randomized or pseudo-randomized stimuli generation and results collection with functional coverage evaluation. The use of manually verified hand-coded analog block models within a digital verification environment has so far been sufficient. However, the move to greater levels of integration, shrinking process nodes, and increasing market pressures now require an automated and metrics-driven verification methodology to provide confidence in the sign-off of a mixed-signal design, prior to fabrication.


Figure 1. Traditional design and verification flow

Traditional approaches (Figure 1) completely separate the digital and analog designs. Coverage-driven functional verification based on an object-oriented language ensures a high level of reliability for digital circuits but only the behavioral models of analog blocks can be included. Without an automatic flow that defines and checks these models, the confidence one can have in the verification results for the combined circuit is significantly limited.

Some mixed-signal simulation tools link and simulate analog and digital blocks in the same testbench, but they mainly address analog functionality and have difficulty supporting complex digital blocks at the same time. Only digital sub-circuits are included in the simulation to simplify the process, and this provides incomplete verification. For full analog and digital verification, we are still far from the effective use of automation, functional coverage evaluation, self-checking mechanisms and re-usability.

Consequently, the overall verification coverage of a mixed-signal system is often much less than it should be, so system bugs appear very late in the design process. This leads to expensive debugging, re-design and silicon re-spins, and time-to-market is inevitably hit.

The Analog Mixed-Signal Verification Kit


Figure 2. Block diagram of the AMS vKit

In response, we have defined the Analog Mixed-Signal Verification Kit (AMS vKit), which extends coverage-driven dynamic functional verification to mixed-signal circuits. AMS vKit enables the creation of automated verification environments and methodologies for top- and block-level mixed-signal designs. It encompasses an extensible set of configurable, plug-and-play, pre-verified components that unify state-of-the-art verification tools, such as Cadence Specman Elite, and mixed-signal simulators to create a common verification environment. It is based on the ‘e’ language, an object-oriented hardware verification language (IEEE P1647 standard) that allows engineers to capture rules from specifications as well as to automatically generate random and pseudo-random coverage-driven tests.

As shown in Figure 2, AMS vKit is composed of three libraries (vTerminals, vComponents and Sequences DB) and all the necessary infrastructure (simulator scripts, ‘e’ structures/units and classes, etc).

The core of the kit is a library of ‘verification terminals’ (vTerminals) that create an interface between the analog and digital domains. The vTerminals are divided into two types (Figure 3):

  • Verification sources (vSources – vS) are models of signal sources that provide time-continuous voltage and current signals, configured and controlled either by digital commands from the verification environment or by analog events. They include DC, pulse and sinusoidal signal (current and voltage) generators, noise injectors and parameter spread emulators.
  • Verification probes (vProbes – vP) transfer analog information from the mixed-signal simulator to the verification environment. They provide the values of voltage, current and timing parameters and include self-checking mechanisms (e.g., checking that a sampled voltage level lies within a pre-defined range). Examples of vProbes are voltage/current/time detectors, linear behavior and total harmonic distortion calculators, and AC gain extractors.

Figure 3. Block diagram for vSources and vProbes
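The division of labor between vSources and vProbes can be illustrated with a small sketch. This is not the AMS vKit's 'e' implementation; it is a hypothetical Python analogue in which a sinusoidal source drives samples into a probe whose self-checking mechanism flags any voltage outside a pre-defined range.

```python
import math

class VSource:
    """Hypothetical stand-in for a sinusoidal vSource: models a
    time-continuous voltage signal, sampled here at discrete points."""
    def __init__(self, amplitude, freq_hz, offset=0.0):
        self.amplitude = amplitude
        self.freq_hz = freq_hz
        self.offset = offset

    def value_at(self, t):
        # v(t) = offset + A * sin(2*pi*f*t)
        return self.offset + self.amplitude * math.sin(2 * math.pi * self.freq_hz * t)

class VProbe:
    """Hypothetical vProbe with a self-checking mechanism: records any
    sampled voltage that falls outside a pre-defined range."""
    def __init__(self, v_min, v_max):
        self.v_min, self.v_max = v_min, v_max
        self.violations = []

    def sample(self, t, voltage):
        in_range = self.v_min <= voltage <= self.v_max
        if not in_range:
            self.violations.append((t, voltage))
        return in_range

# Drive a 1 kHz, 1 V sine into a probe that checks a +/-1.2 V window.
src = VSource(amplitude=1.0, freq_hz=1e3)
probe = VProbe(v_min=-1.2, v_max=1.2)
for i in range(100):
    t = i * 1e-5                      # 10 us sample period
    probe.sample(t, src.value_at(t))
print(len(probe.violations))          # 0 - the signal stays in range
```

In the real kit the probe's verdicts feed back into the 'e' verification environment, where they drive the self-checking and coverage machinery described below.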

The verification components (vComponents) are ready-to-use building blocks for verification environments (e.g., testbenches) for the main analog blocks. They include self-checking mechanisms and coverage evaluation based on analog metrics, and they are easy to integrate into more complex mixed-signal scenarios. They are used to verify basic analog blocks such as band-gap cells, oscillators, voltage regulators, comparators, operational amplifiers and buffers.

For each cell, the definition of the verification plan includes the significant parameters, conditions and procedures required. Based on this plan, the verification component drives, monitors and processes current and voltage signals, generating correct stimuli for the device-under-test (DUT) and processing the resulting information in order to reach the target coverage.

A verification component includes:

  • a set of vTerminals to manage analog data;
  • monitors that receive outputs directly or through the vProbes;
  • drivers that handle and synchronize the whole verification environment including the vTerminals;
  • scoreboards that enable the comparison between simulation outputs and expected results;
  • coverage units that implement analog and digital metrics to evaluate the functional coverage of the verification process.
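Of the pieces listed above, the coverage unit is the one with no direct digital counterpart, because it must map a continuous analog quantity onto discrete functional-coverage bins. The sketch below is an illustrative Python analogue, not the kit's 'e' code: a measured gain range is split into bins, and coverage is the fraction of bins exercised by the tests.

```python
class CoverageUnit:
    """Hypothetical analog coverage unit: maps a continuous parameter
    (e.g. measured gain in dB) onto discrete bins and reports the
    fraction of bins hit, a simple analog functional-coverage metric."""
    def __init__(self, lo, hi, n_bins):
        self.lo, self.hi, self.n_bins = lo, hi, n_bins
        self.hit = set()

    def sample(self, value):
        if self.lo <= value <= self.hi:
            # Clamp the top edge of the range into the last bin.
            idx = min(int((value - self.lo) / (self.hi - self.lo) * self.n_bins),
                      self.n_bins - 1)
            self.hit.add(idx)

    def coverage(self):
        return len(self.hit) / self.n_bins

gain_cov = CoverageUnit(lo=0.0, hi=40.0, n_bins=8)   # 0..40 dB in 5 dB bins
for g in [3.1, 7.5, 12.0, 18.4, 22.9, 27.0, 33.3, 39.9]:
    gain_cov.sample(g)
print(gain_cov.coverage())   # 1.0 - every 5 dB bin was exercised
```

A coverage report built from such metrics, combined with the digital coverage the 'e' environment already collects, is what makes the overall verification status objectively measurable.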

In order to calculate a non-trivial analog parameter, one must properly configure, control and synchronize a number of vSources and vProbes. This is implemented using sequences: structures that represent streams of items describing a high-level stimulus scenario.

The database provided with the kit (Sequences DB) includes all the sequences needed in an analog context. For instance, to extract the total harmonic distortion of a buffer (one of the most important analog parameters), one needs to stimulate the circuit with a sinusoidal signal (vSource) for a defined time period depending on the frequency at which the measurement has to be done. The settling time and the sample period of the output signal (vProbe) depend on the frequency as well. Pre-defined and ready-to-use sequences, which create this and other kinds of test scenarios, are available in the Sequences DB library.
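The measurement at the end of such a sequence is standard signal processing. As a sketch of the kind of post-processing a distortion-calculating vProbe performs, the following Python uses the textbook THD formula (harmonic RMS over fundamental RMS via single-bin DFTs); it is an illustration, not the AMS vKit's own implementation.

```python
import math

def thd(samples, sample_rate, fundamental_hz, n_harmonics=5):
    """Total harmonic distortion of a sampled output signal: ratio of
    the RMS of harmonics 2..n+1 to the fundamental amplitude, each
    extracted with a single-frequency DFT correlation."""
    n = len(samples)
    def tone_amplitude(f):
        re = sum(s * math.cos(2 * math.pi * f * i / sample_rate)
                 for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * f * i / sample_rate)
                 for i, s in enumerate(samples))
        return 2.0 * math.hypot(re, im) / n
    fund = tone_amplitude(fundamental_hz)
    harm = math.sqrt(sum(tone_amplitude(fundamental_hz * k) ** 2
                         for k in range(2, n_harmonics + 2)))
    return harm / fund

# A 1 kHz sine with a 1% second harmonic, sampled at 100 kHz for
# exactly 10 cycles (integer periods keep the DFT bins exact).
fs, f0 = 100_000, 1_000
sig = [math.sin(2 * math.pi * f0 * i / fs)
       + 0.01 * math.sin(2 * math.pi * 2 * f0 * i / fs)
       for i in range(1000)]
print(round(thd(sig, fs, f0), 3))   # ~0.01, i.e. 1% THD
```

The sequence's job, as described above, is to guarantee the preconditions this calculation relies on: the stimulus has settled, and the sample period and observation window match the measurement frequency.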

To use the AMS vKit, select the necessary components from the libraries according to the verification plan; instantiate them in their environment; and configure them and create the test. A user-friendly graphical interface simplifies this process.

The GUI reduces the need to become an expert in ‘e’, allowing the user to perform the steps defined above without hand-writing code. Moreover, it can handle configuration files and generate scripts for setting up simulator parameters and regression suites (i.e., multiple simulation runs), if needed.

AMS vKit scenarios

For mixed-signal intellectual property (e.g., A/D and D/A converters), designers perform mixed-mode simulations to check the analog and digital blocks together. Here, one needs to manipulate different kinds of description in the same simulation environment: RTL, SPICE level, analog HDL (e.g., Verilog-AMS), etc. The latest mixed-language simulators supplied by major EDA vendors enable this approach by providing a single engine and a fully integrated platform (e.g., Cadence AMS Designer).

Where one needs to simulate analog and digital circuits together (Figure 4), the AMS vKit provides a link to a verification tool, enabling the use of all the features that bring automation and reliability to the whole process. This can be achieved with any kind of mixed-signal simulator.


Figure 4. Simulation domains

The ‘e’ language is used to capture analog specifications; create analog stimuli through vSources; and manage self-checking mechanisms for analog data through vProbes. The status of the verification process is then automatically calculated by objective metrics which include analog rules.


Figure 5. Block- and top-level mixed-signal verification

The verification environment built at one level of the design hierarchy can be re-used at another (Figure 5): stimuli, checking mechanisms, coverage items and monitoring strategies can be easily ported through the design hierarchy.

To avoid an explosion in simulation time because of the increase in circuit complexity, one needs to create behavioral models for different blocks to be used at higher levels of the hierarchy (e.g., top-level SoC verification). The reliability of these models has to be evaluated with respect to the validation procedure of the circuit itself. Referring to a well-implemented model in absolute terms makes no sense: the model is good if it behaves as the real circuit does under the conditions in which the circuit itself is validated. The AMS vKit allows the engineer to build an automatic flow to check the circuit (a transistor-level description) and its model (an analog high-level description) in the same verification environment. This is essential to gain enough confidence in the model itself and to be able to calibrate it appropriately.
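The key idea in the paragraph above is that circuit and model are judged by the same checks under the same validated conditions. The sketch below illustrates that principle in Python with invented numbers; the operating points, tolerances and gain values are hypothetical, not measurements from the kit.

```python
def run_checks(dut_response, expected, tolerance):
    """Apply one shared pass/fail check to any DUT response, so the
    transistor-level circuit and its behavioral model face identical
    criteria rather than being judged 'in absolute terms'."""
    return all(abs(got - exp) <= tolerance
               for got, exp in zip(dut_response, expected))

# Hypothetical measured gains (dB) at three validated operating points.
spec         = [20.0, 19.5, 18.0]
circuit_meas = [20.1, 19.4, 18.2]   # from the transistor-level netlist
model_meas   = [19.9, 19.6, 17.8]   # from the behavioral model

circuit_ok = run_checks(circuit_meas, spec, tolerance=0.5)
model_ok   = run_checks(model_meas, spec, tolerance=0.5)
# The model is trusted only where both pass the same checks.
print(circuit_ok and model_ok)   # True
```

Running both descriptions through one environment like this is what gives the confidence needed to substitute the behavioral model at higher levels of the hierarchy, and it exposes exactly which operating points the model must be recalibrated for when the two diverge.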


The described approach has been used in a flow running from physical-layer transceiver verification through to its integration into a RISC-based SoC. For the transceiver itself, results show a saving in simulation time of approximately 50%, attributable to the verification control and easy regression management the AMS vKit provides, and a further saving of close to 90% in engineer time for results collection and analysis, while achieving 99% functional coverage across digital, analog and cross metrics.

With the increase of the DUT complexity (from transceiver to SoC), it was possible to save between 40% and 60% in the verification environment set-up time due to the high level of re-use and, finally, to secure a performance increase of about 30% compared with traditional approaches to bug discovery and resolution.

Via Lenin 132/p
56017 San Martino Ulmiano

T: +39 (0)50 86351
F: +39 (0)50 861870


