Verification platform offers unified compile, debug environments

By Luke Collins |  No Comments  |  Posted: September 29, 2014

Synopsys is integrating its verification offerings to help SoC designers get chips to market more quickly.

It has launched what it dubs the Verification Continuum platform, which will offer a unified compilation process and unified debug tools for its virtual prototyping, static and formal verification, simulation, emulation, and FPGA-based prototyping tools. The platform is underpinned by common verification IP, planning and coverage technology.

According to Tom Borgstrom, director of the verification group at Synopsys, the new platform is meant to help design teams, especially those working on SoCs with a large software content, ‘shift left’, that is, move as much of their verification effort as close to the beginning of the project as possible.

“There’s a need for new advances in verification to deal with software-centric SoCs,” he said, “to find bugs faster and bring up software sooner.”

Recent reports suggest that hardware and software verification accounts for more than half of the development budget for a complex SoC, with software verification accounting for the majority of the expense.

Unified Compile

Synopsys argues that virtual prototyping, static and formal verification, simulation, emulation, and FPGA-based prototyping tools exist on a continuum, from most abstract to most hardware focused. As the design moves along this continuum it becomes closer to the final implementation, at the cost of reduced debug visibility.

Figure 1 Each form of verification relies on different files and front ends (Source: Synopsys)

Using each type of verification involves a different set-up process, with inconsistent language features, semantics, and compile flows, each creating work and introducing risk. Each type of verification tool executes the design in a different way, and inconsistent debug environments limit verification efficiency.

The net result is that there’s a cost to setting up a design for each type of verification, and a further cost of moving between the different types; for example, if it becomes necessary to go back to a more abstract form of verification to regain debug visibility.

“This verification discontinuity creates the risk of introducing more bugs, and the friction involved in moving between verification approaches can slow progress,” said Borgstrom.

Synopsys’ response has been to build a common verification front-end, based on the well-tested VCS flow. The compilation flow ingests common RTL, UPF, testbench and verification IP files, and then subjects them to common analysis, elaboration, debug preparation, optimization, code generation, synthesis, RTL and FPGA mapping processes. A separate back-end takes the output of this work and runs partitioning, instrumentation and FPGA place and route processes to target Synopsys ZeBu3 emulators and, from next year, a ZeBu-like flow targeting the company’s HAPS FPGA prototyping systems.

Figure 2 A common front end reduces the risk of introducing errors when moving between verification strategies (Source: Synopsys)

According to Borgstrom, users who are only using simulation won’t have to compile designs to an emulation or FPGA target, and so won’t find their work slowed down. On the other hand, since the emulation and FPGA-prototyping compilation process is now built on VCS, if emulation users want to move designs to simulation they will already have a VCS simulation kernel to hand.

“The goal is to enable you to move between all the platforms seamlessly,” said Borgstrom.

Synopsys claims that it has managed to make compilation for the ZeBu emulation target up to three times faster, mainly by beefing up the ZeBu development team since the acquisition of EVE and sharing optimization technologies used in VCS.

Unified Debug

The second major part of the Verification Continuum offering is the development of a common debug environment, based on Verdi, to address the whole hardware and software debug process in a consistent manner.

“It’s a challenging task,” said Borgstrom. “Hardware and software engineers come from very different points of view, with their own expectations of the performance, visibility and usage models of their debug tools.”

Figure 3 A common debug environment builds on unified databases (Source: Synopsys)

Synopsys has opted to provide two main interfaces: a Verdi-based interface through which hardware engineers can trace events at the signal and transaction level, and an Eclipse-based software debug environment. Synchronisation mechanisms mean that developers can, for example, step through the code of a device driver and watch waveforms change on key signals in the Verdi view.

Unified debug relies on unified databases for debug and coverage data, accessed by common debug and analysis engines.

Future of emulation

One of the main reasons for building the Verification Continuum platform, according to Borgstrom, is to make emulation more accessible. Synopsys argues that because its ZeBu3 emulator is based on commercial FPGAs (in this case, Xilinx’s latest 28nm parts, with 20nm UltraScale parts to follow), it can launch updates more quickly, and offer lower acquisition and operating costs, than emulators based on custom chips.

“We think this approach will make commercial FPGA-based emulation mainstream within two years,” said Borgstrom. In the meantime, Synopsys and Xilinx have been working to tune the output of the Unified Compile process so that it drives the FPGA maker’s Vivado implementation suite effectively, and to tune Vivado to handle the ASIC-like designs that the Unified Compile process creates.

“What Vivado sees coming out of the ZeBu flow is different from what it would see coming out of a conventional FPGA flow,” said Borgstrom.

Planning and measuring

How do you know whether you have done enough verification, or which type of verification will give you the best return at each phase of the design process? Borgstrom says that unified verification planning and coverage is part of the technology roadmap for the Verification Continuum. For the moment, there is a verification planner that spans technologies and with which you can define how a design should be verified.

Verification Continuum is in some customers’ hands at the moment, with early availability scheduled for December 2014. General availability is scheduled for 2015.
