Complexity to force shift to four-stage verification

By Chris Edwards | Posted: November 20, 2013

The complexity of on-chip interconnect and the relentless growth in software size will drive the move to a four-stage verification process as well as the increased use of formal techniques to speed up SoC-level testing, Mentor Graphics verification specialist Mark Olen claimed at the Verification Futures conference this week (19 November).

“Ten years ago, ASIC verification was partitioned into two main steps: block-level and integration, or full-chip verification. But the emergence of more complex interconnects as we move from AHB to AXI and on to CHI is creating more of a challenge for us. The bus is becoming one of the most important and difficult things to verify,” Olen explained.

Olen argued that, after block-level verification, teams would most likely perform interconnect verification, followed not just by integration verification but also by system-level testing using the target application software. At the same time, the sheer size of SoCs would drive a shift toward more efficient use of software for verification, leading to the separation of SoC-level and application-level testing.

Bare-metal software for tests

“You would perform bare-metal software testing maybe not using native software but a combination of RTL and C-level stimuli before you commit to full-out system testing,” Olen said, adding that the bare-metal software tests could be produced using formal-verification tools. “We are now merging techniques from formal and dynamic simulation, using formal to identify dead code, identify waveforms that can attack code and also to synthesise stimulus.”

Mentor is working on a technique based on non-deterministic finite automata (NFA). “The technique was around in the 1960s but used primarily for compiler testing. But you can use NFA to model the system. Using sentences from an NFA grammar, it is possible to generate tests that are orders of magnitude more efficient than constrained random. People are seeing one to two orders of magnitude throughput improvement at RTL,” Olen said.
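The idea can be sketched in miniature: model legal bus behaviour as an NFA, then walk it to emit “sentences” — sequences of operations that are valid by construction, so no simulation cycles are wasted on stimulus the interface would simply reject. The states, operations, and transition set below are invented for illustration and are not Mentor's actual model.

```python
import random

# Toy NFA of legal bus-transaction sequences (illustrative only).
# Each transition: state -> list of (bus_operation, next_state).
NFA = {
    "IDLE":   [("read_req", "WAIT_R"), ("write_req", "WAIT_W")],
    "WAIT_R": [("read_data", "IDLE"), ("error_resp", "IDLE")],
    "WAIT_W": [("write_ack", "IDLE"), ("error_resp", "IDLE")],
}

def generate_sentence(start="IDLE", max_ops=6, rng=random):
    """Walk the NFA, emitting one legal operation sequence (a 'sentence')."""
    state, ops = start, []
    for _ in range(max_ops):
        op, state = rng.choice(NFA[state])
        ops.append(op)
        # Back at IDLE means a complete transaction; sometimes stop there,
        # sometimes chain another transaction into the same test.
        if state == "IDLE" and rng.random() < 0.5:
            break
    return ops

for _ in range(5):
    print(" -> ".join(generate_sentence()))
```

Every emitted sequence is a well-formed request/response pattern, which is the source of the efficiency gain over constrained random: constrained-random stimulus must be filtered or constrained after the fact, whereas a grammar walk only ever produces legal traffic.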

Altera has used the NFA-based technique to exercise the bus fabric on its ARM-based FPGAAs. The test generator built C code that runs on the ARM cores, creating traffic on the fabric and checking for illegal conditions.

Olen said this type of automatically generated C would improve software-driven verification. “At the moment, system-level verification is an all-or-nothing type of situation where you go to full hardware-software coverification. But embedded C programs take a long time to write and they don’t scale.

“We are investing to take advantage of this formal technique where we synthesise embedded-C test programs that you can run natively on the processor in a bare-metal environment, so you don’t have to boot your OS first. Booting the OS is something you have to do but it’s not a great test. With a C program generated by formal techniques you get many more sequences run before booting the OS. And when you find bugs they are much easier to debug,” Olen claimed.
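A toy emitter shows what “synthesising an embedded-C test program” might look like: abstract operation sequences are turned into self-checking bare-metal C that needs no OS. The register addresses, data values, and sequence set here are invented for illustration; in the flow Olen describes, both would be derived from the design's formal model.

```python
# Hypothetical abstract sequences: each entry is a write followed by a
# read-back check at the same address (addresses/values are made up).
SEQUENCES = [
    [("write", 0x40000000, 0xDEADBEEF), ("read", 0x40000000)],
    [("write", 0x40000010, 0x00000001), ("read", 0x40000010)],
]

def emit_c_test(sequences):
    """Emit a bare-metal, self-checking C test program as a string."""
    lines = [
        "#include <stdint.h>",
        "#define REG(a) (*(volatile uint32_t *)(a))",
        "int main(void) {",
        "    int fails = 0;",
    ]
    for seq in sequences:
        last_written = {}
        for op in seq:
            if op[0] == "write":
                _, addr, val = op
                lines.append(f"    REG(0x{addr:08X}u) = 0x{val:08X}u;")
                last_written[addr] = val
            else:  # read: check against the last value written there
                _, addr = op
                exp = last_written.get(addr, 0)
                lines.append(
                    f"    if (REG(0x{addr:08X}u) != 0x{exp:08X}u) fails++;")
    lines += ["    return fails;  /* 0 = all checks passed */", "}"]
    return "\n".join(lines)

print(emit_c_test(SEQUENCES))
```

Because the generated program is plain C with memory-mapped accesses, it can run natively on the processor before any OS boot, and a non-zero return value points directly at the failing check.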

For interconnect verification, Mentor is looking at graphical techniques. Olen showed a graphic of aggregate traffic data from a set of verification runs. “We call it the ligament diagram: you can look at the width of the connections to see which masters are talking to which slaves, and whether they should be talking to each other.”
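The data behind such a diagram is just an aggregation over a transaction log: total bytes per master-slave pair, with line width proportional to the total. The log entries below are invented for illustration.

```python
from collections import Counter

# Hypothetical transaction log: (master, slave, bytes transferred).
LOG = [
    ("cpu0", "ddr",  4096), ("cpu0", "ddr", 4096),
    ("dma",  "sram", 1024), ("gpu",  "ddr", 8192),
    ("dma",  "ddr",   512),
]

# Aggregate traffic per master->slave pair; in a ligament diagram the
# width of each connection would scale with this total.
traffic = Counter()
for master, slave, nbytes in LOG:
    traffic[(master, slave)] += nbytes

# An unexpected pair -- say, dma -> ddr when the DMA engine should only
# touch SRAM -- shows up as a connection that should not exist at all.
for (master, slave), total in sorted(traffic.items()):
    print(f"{master:>5} -> {slave:<5} {total:>6} bytes")
```

The same table also answers the second question in the quote: any nonzero entry for a pair that the architecture says should never communicate is immediately suspicious.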
