Verification perspectives: the growth of emulation

By Paul Dempsey | Posted: April 16, 2014

In this new series, we sit down with verification experts at various vendors to get their views on how design’s thorniest task is evolving. We begin with a two-part interview featuring Mark Olen and Jim Kenney of Mentor Graphics, focusing on emulation. Mentor has just unveiled an integrated simulation-to-emulation flow, the Enterprise Verification Platform, and our discussion illustrates both market trends and the philosophy behind the new launch.

According to the EDA Consortium, emulation sales rose from $188m in 2010 to $363m in 2012. When any segment effectively doubles in value in such a short time, you have to ask why. Particularly when that segment is as mature as emulation and has historically been seen as hardware-led and expensive.

First and foremost, Jim Kenney, product marketing manager for the Emulation Division at Mentor Graphics, sees our old friend complexity at play.

“I’ve watched sales patterns change over the last five years,” he says. “Design teams are increasingly saying, ‘We’ve barely finished verifying our last design and the next one’s going to be twice as big. We’re not going to make it without something else.’

“So, where’s that in gate count? I’d say roughly 50 million, plus or minus. If the next design is over that, the team is starting to think, ‘We’re never going to get to simulate this; we need to buy an emulator.’

“Maybe the count’s even as low as 25, 30 million gates. But somewhere in there, design sizes have grown to a point where they need to move beyond simulation.”

But the growth trend is not being driven solely by traditional hardware gate counts. An increasing amount of software is being exercised on emulators, as well as on virtual prototypes and FPGA-based systems at the beginning and end of the pre-silicon development flow.

Boot ROM code, bare-metal software, RTOSes, device drivers – all of these need to be exercised not just to check the RTL but as much to make sure the code itself is evolving correctly through successive iterations of development.

“A further complication is that the software itself increasingly comes in various forms,” says Kenney. “It’s one thing to say, ‘Let’s boot the OS, load drivers and run some target applications,’ and then call that system-level simulation. In many cases today, though, you also want to run some embedded C test programs on your emulator – what do you call that? Then, there are models as a third group.”

Elvis and non-Elvis

At a recent DVCon panel, Cadence exec Frank Schirrmeister noted how his company, another key player in the emulation space, distinguishes between ‘Elvis and non-Elvis software’: that which does leave the building (drivers, etc) and that which does not (test programs, etc). However, even this useful definition may need breaking down further. For example, if companies move toward more post-silicon test to create a feedback loop for future designs, what was non-Elvis software becomes Elvis software.

Beyond that, there are also some ‘special cases’ promoting the use of emulation, driven by performance, standards and integration.

“We had a customer recently who wanted to run an industry-standard qualification suite against its design,” says Kenney. “Five minutes of real time was a month on the emulator.

“First you can ask, ‘What was that?’, because it wasn’t an application. But it was really important for them to be able to say as early as possible that their design met the industry standard and passed that test. Because if it hadn’t, they needed to fix it.”

Another scenario lay at the boundary between block- and system-level verification. A company was working on a new generation of an existing design, a very common task. The update went from four processors to eight, moved from a custom to a standard bus, and added faster cache memory.

Verification at the block level offered high coverage in the simulator. Then the team undertook system-level verification in the emulator. After that, they thought they had all the hardware integrated.

“And when they got first silicon back and characterized the performance of the new generation, it ran 10% slower and consumed more power,” notes Mark Olen, product manager in Mentor’s DVT division. “But they did everything right, didn’t they? Well, it turned out that the flow wasn’t verifying the interactivity between the blocks.”

“There used to be more of a gulf between simulation and emulation where that could happen. You’d do your block level on the simulator and then you’d go to the emulator. We’re finding now that there are some intermediate steps, so we’ve added technology that traverses back and forth between the two.”

“In this case, the customer’s problem turned out to be latency in a bus fabric. They were checking everything, but there were some traffic scenarios that they were not generating through the constrained-random testing.

“It was a massive volume of testing, but in subsequent work we’ve helped them to use graph-based stimulus and run it in both the simulation and emulation environments. We’re covering scenarios now, not just covering functions.”
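Neither Olen nor Kenney goes into implementation detail here, but the contrast between the two approaches can be illustrated in a few lines. The following is a minimal, hypothetical Python sketch – the scenario graph, node names and functions are invented for illustration and are not taken from Mentor’s tools. A constrained-random generator samples one legal path at a time and may rarely or never hit an unusual traffic scenario; a graph-based generator walks every path through the scenario graph, so coverage can be measured and closed on whole scenarios rather than on individual functions.

```python
# Hypothetical sketch of graph-based stimulus versus constrained-random
# selection. Node names and graph structure are invented for illustration.
import random

# Scenario graph: each node maps to its possible next steps; a complete
# path from "start" to "end" is one bus-traffic scenario to drive into
# simulation or emulation.
SCENARIO_GRAPH = {
    "start":          ["cpu_read", "cpu_write", "dma_burst"],
    "cpu_read":       ["cache_hit", "cache_miss"],
    "cpu_write":      ["cache_hit", "cache_miss"],
    "dma_burst":      ["bus_contention", "idle_bus"],
    "cache_hit":      ["end"],
    "cache_miss":     ["fabric_fetch"],
    "fabric_fetch":   ["bus_contention", "idle_bus"],
    "bus_contention": ["end"],
    "idle_bus":       ["end"],
}

def all_scenarios(node="start", path=None):
    """Graph-based: systematically enumerate every path, so scenario
    coverage can be tracked and closed."""
    path = (path or []) + [node]
    if node == "end":
        yield path
        return
    for nxt in SCENARIO_GRAPH[node]:
        yield from all_scenarios(nxt, path)

def random_scenario(node="start"):
    """Constrained-random style: pick one legal next step at a time.
    Rare paths (e.g. dma_burst -> bus_contention) may never be exercised."""
    path = [node]
    while node != "end":
        node = random.choice(SCENARIO_GRAPH[node])
        path.append(node)
    return path

if __name__ == "__main__":
    print("Graph-based: every scenario, exactly once")
    for p in all_scenarios():
        print("  " + " -> ".join(p))
    print("Constrained-random: one sampled scenario")
    print("  " + " -> ".join(random_scenario()))
```

Because each enumerated path is just a sequence of steps, the same scenario list can in principle be replayed as a testbench sequence in simulation or as traffic driven into an emulator, which is the cross-platform reuse the quote describes.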

Emulation, then, has moved from a discrete activity undertaken only by those with bigger budgets to one increasingly integrated into a more traditional view of the verification flow. And as this last example shows, the range of techniques being integrated extends still further.

In our next part, we’ll discuss both the future of traditional simulation and how formal and graph-based strategies are further informing the flow.

