Verification perspectives 2: formal for the masses and graph-based techniques

By Paul Dempsey | Posted: May 21, 2014

We continue this new series gathering the views of verification experts at various vendors on how design’s thorniest task is evolving, with the second part of our interview with Mark Olen and Jim Kenney from Mentor Graphics. This time we focus on the future of traditional simulation and on how formal and graph-based techniques are being brought into play.

“To misquote Mark Twain, ‘The rumor of its death is an exaggeration.’ Simulation is with us and will be for some time to come,” says Mark Olen, product manager in Mentor Graphics’ DVT division. “In fact, it’s still growing—not as fast as it used to, but the growth rate is still solid.

“There are still some things that simulation suits best, so we’re still doing everything we can to wring out all the speed that’s possible.”

At the same time, though, Olen adds that finding ways to do more with less, and to do it earlier, is a priority.

“We used to say that if you could do something in the simulator before you got to the emulator, then you should. Now you can add to that: if you can do something before you even go to the simulator, then you should too. It’s a ‘rule of 10’: the earlier you can do it and catch any issues, the more you save. The fix is a tenth of the cost,” he continues.

An important trend is growth in the use of formal verification techniques.

Formal verification without tears

“Formal used to be seen as expensive and requiring a lot of expertise,” says Olen. “It got pushed to the top of the pyramid. But there’s a massive rush now to automate that technology, to make it easier to use and deploy in the mainstream.

“We’re not alone in that, but we got in there early with Questa for clock domain crossing. We took a formal engine, the same engine that’s inside our equivalence checking, but the user didn’t need to know that. We insulated them from that. It was simply a product built around doing nothing but verifying clocks across 30, 50, whatever domains.

“We made it easy. All the metastability issues, reconvergence, fan-out, protocol performance, all those things. We took them and put them in a push-button environment.”

That Mentor is not alone in seeing the value of this approach was already shown by the growth achieved by specialists such as Real Intent. But a still bigger endorsement came in the last few weeks with the acquisition of Jasper Design Automation by Mentor’s ‘Big 3’ rival Cadence Design Systems. Jasper has also built a business around task-oriented formal ‘apps’.

The debate around graph-based verification

However, another emerging technique where Mentor and Cadence do not see eye to eye, at least not yet, is graph-based verification. At the heart of Mentor’s take on this technology is a portable stimulus description language that it recently donated to Accellera. On one level, the idea is about controlling and focusing stimulus.

“You have to go back to constrained random stimulus, in a way, to see what’s going on,” says Olen. “Constrained random was great in that it generated a huge quantity of high quality stimulus. But when you dug further into that, you could find that up to 90% was repetitive.

“So at the block level, you can look at graph-based as a better way to get the coverage. It’s consistent with SystemVerilog, but instead of using SV constraints to describe a way to make a random choice, you use a UVM-based SV testbench and put a graph layer on top of that.

“You’re still describing the same thing, but when you generate the tests, where constrained random might give you a huge bell curve of random things that you end up doing again and again, you get a much flatter, though still random, distribution. So it’s much more efficient.”
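Olen’s efficiency argument can be sketched in a few lines of Python. This is purely an illustration under assumed names: the stimulus fields, values and functions below are hypothetical and not part of any Mentor or Accellera tooling. A constrained-random generator draws each field independently and inevitably repeats combinations, whereas a graph-style walk over the same legal space covers every combination before any repeats.

```python
import random
from itertools import product

# Hypothetical block-level stimulus fields; names and values are illustrative only.
BURST_LEN = [1, 4, 8, 16]
ADDR_MODE = ["aligned", "unaligned"]
OPERATION = ["read", "write", "read_modify_write"]


def constrained_random(n_tests, seed=0):
    """Draw each field independently at random, the way SV constraints would.
    Duplicate combinations are likely; this is the repetition Olen describes."""
    rng = random.Random(seed)
    return [(rng.choice(BURST_LEN), rng.choice(ADDR_MODE), rng.choice(OPERATION))
            for _ in range(n_tests)]


def graph_based(n_tests, seed=0):
    """Treat the legal combinations as paths through a graph and walk them in a
    random order, so no combination repeats until all have been covered once."""
    rng = random.Random(seed)
    paths = list(product(BURST_LEN, ADDR_MODE, OPERATION))
    rng.shuffle(paths)
    return [paths[i % len(paths)] for i in range(n_tests)]


if __name__ == "__main__":
    n = len(BURST_LEN) * len(ADDR_MODE) * len(OPERATION)  # 24 legal combinations
    for name, gen in [("constrained-random", constrained_random),
                      ("graph-based", graph_based)]:
        tests = gen(n)
        print(f"{name:>18}: {len(set(tests))}/{len(tests)} unique combinations")
```

Running the sketch, the constrained-random list will almost always contain duplicate combinations, while the graph walk covers all 24 in 24 tests; that flatter, non-repeating coverage is the distribution Olen refers to.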

This may illustrate where Mentor and Cadence philosophically differ. The question from Cadence’s side is how much better, incrementally, graph-based really is than constrained random techniques. After all, Cadence has a lot invested in constrained random specifically: it acquired one of the technology’s pioneers, Verisity.

But Olen points to increasing interest in graph-based techniques at yet another level, moving from the block to the system.

“We also work with customers, though, who say, ‘We don’t really care about that incremental issue against constrained random. We look at this as a whole different way of modeling stimulus and behavior. And we don’t even care about support for SystemVerilog or UVM; we’re working in C or assembly at the system level, and we’re not targeting the blocks that our SV models describe,’” he says.

“We’re hearing that, and you’re also seeing the guys at Breker [Verification Systems] target this pretty specifically. They’re going hard after system-level verification and pretty much leaving the block level to the guys doing constrained random.”

Jim Kenney, product marketing manager for the Emulation Division at Mentor Graphics, adds, “And now you’re starting to see these system strategies become very portable as well. You can load them onto an emulator, maybe later onto an FPGA prototype, and beyond that use them for in-the-field diagnostics on end-product failures.”

The verification plan still comes first

What advances in formal, graph-based and other areas such as virtual prototyping show, though, is that a verification manager is now confronted by a broad range of options when segmenting a methodology. Indeed, is there a growing risk that these options become too broad? How do you choose?

“It’s still the same,” says Olen. “You gotta have a plan. You need specific goals for each stage and they shouldn’t overlap. You look at what you want to do and which tools are best. In some cases, it gets clearer.

“Take emulation: people want to do one or both of two things. They want to do some simulation acceleration, but a lot now seem to get more value out of emulation by running software against the chip.

“‘Hardware/software co-verification is complicating things.’ Well, that’s what people say, but I can go back 15 years and we were doing it, and it was a problem then. But it’s always about matching the best fit to the plan rather than saying, ‘Oh, I have all these techniques, now which should I use?’”
