DAC 2012: A look inside Accellera’s UCIS

By Chris Edwards | Posted: June 6, 2012
Topics/Categories: Standards, Verification

Accellera approved version 1.0 of the Unified Coverage Interoperability Standard (UCIS) on Monday and hosted an event at DAC on Wednesday to explain how it all works.

Introducing UCIS 1.0, Richard Ho, co-chair of the UCIS committee, nodded to the perennial PowerPoint slide that complains about verification overhead and the amount of project time it soaks up. He quipped: "If you've been on the show floor, you'll have seen that the companies have all got a slide that says 'verification is hard'. I've seen the numbers range from as low as 50 per cent to as high as 70 per cent."

Turning to the problem at hand, Ho explained: "There are a variety of verification techniques and methods. There are different metrics: code coverage, conditional coverage, as well as functional coverage with some languages such as SystemVerilog. These coverage metrics overlap, but not completely. Unfortunately, it's not clear where the holes are."

UCIS was proposed as a first step towards dealing with this problem. It is intended to span different tools, platforms and abstraction levels.

Background

"It started a long time ago," Ho recalled. "Companies got together in 2006 and said: 'we have got to work on this problem.' And then we got Accellera interested. Accellera really stepped up and gave us the resources to work on it. A lot of EDA vendors have contributed technology. Mentor Graphics's contribution forms the basis of the standard. There were also donations from Cadence, Jasper and Synopsys and some of the smaller companies. And we have a lot of companies who are involved."

Ho described the core of the new standard: "What we have in UCIS is a standard applications programming interface (API) that gives an interface to a coverage database. We have not specified a physical database: we rely on the vendors to provide a specialised database.

"The idea is that you have multiple types of producers that produce different types of coverage information. The API provides a uniform way to get that into the database and also for tools to use that information to provide engineers with a fuller view of what is going on. Using the API, tools can talk directly to one another without going through proprietary interfaces."

Use-cases

As a new standard, it's hard for users to road-test UCIS, but Ho described some use-cases that the working group envisaged while working on the core API. "Right now, it's more of a wishlist. One use-case is if you have a single design and you have a simulation running on it that you run in month one, month two and month three. You want to be able to merge data – a temporal merge – to get a view of how the coverage has progressed.
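The temporal merge Ho describes amounts to summing hit counts for the same cover-points across successive runs. A minimal sketch in Python, with names invented for illustration (UCIS itself defines a C API, not this interface):

```python
# Hypothetical sketch of a "temporal merge": combining coverage counts
# from repeated runs of the same design. Cover-point names and the
# dict-based database are illustrative, not part of the UCIS API.
from collections import Counter

def temporal_merge(*runs):
    """Sum hit counts for matching cover-points across runs."""
    merged = Counter()
    for run in runs:
        merged.update(run)  # Counter.update adds counts, not replaces
    return merged

month1 = {"top.fifo.write": 10, "top.fifo.read": 0}
month2 = {"top.fifo.write": 5,  "top.fifo.read": 3}
merged = temporal_merge(month1, month2)
# "top.fifo.read" was a hole in month one but is covered after the merge
```

The merge only works if both runs identify the same cover-point by the same key, which is exactly the "universal recognizability" problem Ho returns to later.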

"A user may also want to do a spatial merge on a design that has a number of subblocks. You may have a testbench that targets block A, another testbench for block B and another for block C. It's helpful to merge those simulations together to get a picture of the overall coverage and see the holes in the block-level simulation.

"A third use-case is the heterogeneous merge where you combine different abstraction levels or verification techniques. For example, if you have a block A that you can simulate and get coverage, there may be a block B where engineers say 'we need to do formal on it'. And maybe run C in an emulator. With a heterogeneous merge, you can see all those together and provide useful reports. We think that will be a benefit to the user."
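A heterogeneous merge differs from the other two in that the sources contribute different kinds of result: simulation yields hit counts, formal yields verdicts. A rough sketch of combining the two into one report, with all names invented for illustration:

```python
# Illustrative model of a heterogeneous merge: simulation contributes
# hit counts, formal contributes proven/unreachable verdicts. This is a
# sketch of the concept, not the UCIS data model itself.
def heterogeneous_merge(sim_counts, formal_results):
    """Build one unified view over points seen by either technique."""
    report = {}
    for point in set(sim_counts) | set(formal_results):
        report[point] = {
            "count": sim_counts.get(point, 0),
            "formal": formal_results.get(point, "untried"),
        }
    return report

sim = {"blockA.pkt_write": 12}
formal = {"blockB.overflow": "proven", "blockB.dead_branch": "unreachable"}
view = heterogeneous_merge(sim, formal)
# blockA comes from simulation, blockB entirely from formal
```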

Ho said it's feasible that a merge could show the formal and simulation results for the same block, although the tools might not be able to provide a clear view of how they overlap.

"Tools would be able to extract from the database reports on things such as how coverage has changed, and so on," Ho added.

Quick-hit benefits

The way that UCIS handles formal and dynamic tests, however, should provide some immediate benefits once tool support rolls out. Ho explained: "With formal, there is some additional information that is different from the simulation coverage information. You want to save the results of whether something has been proven or shown to be incorrect.

"We also have a way for those tools to mark unreachable code. If you have proved that code is unreachable, you want to mark it as such, so that you know it's been proven formally unreachable," said Ho.
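The practical payoff of marking unreachable code is that it drops out of the coverage denominator, so unhit-but-dead code stops masquerading as a hole. A small sketch of that idea, assuming a simple list-of-items representation that is not UCIS's actual data model:

```python
# Sketch: excluding formally-proven-unreachable items from a coverage
# score. The item structure here is invented for illustration.
def coverage_score(items):
    """Percentage of reachable cover-items that have been hit."""
    reachable = [i for i in items if not i["unreachable"]]
    if not reachable:
        return 100.0
    hit = sum(1 for i in reachable if i["count"] > 0)
    return 100.0 * hit / len(reachable)

items = [
    {"name": "line_5",  "count": 4, "unreachable": False},
    {"name": "line_9",  "count": 0, "unreachable": True},   # proven dead by formal
    {"name": "line_12", "count": 0, "unreachable": False},  # a genuine hole
]
score = coverage_score(items)
# line_9 leaves the denominator; line_12 remains visible as a real hole
```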

"How are we going to enable this?" Ho asked rhetorically. "One of the key concepts is enshrined in the mission statement: make data universally recognizable and accessible.

"Universally recognizable means we want to make sure that a cover-point in tool A is recognized as the same cover-point in tool B. In some cases it might be obvious that line five in two files is the same. But the names might be different in different vendors' tools. We spent a lot of time talking about how to make objects universally recognizable."

Everything must have a name

To make the most of UCIS, users will need to give things names and not just let the tools do it.

"Universal recognition is not canonical naming. When we started off we wanted that, but it's very difficult to get across the industry," Ho explained. "In SystemVerilog, covergroups aren't explicit in name. But we want objects to have user names – we want to get away from vendor-provided names. If we can get away from vendor-provided names, we can provide better interoperability."

An important distinction in UCIS is between the information model and the data model, said Ho. The core information model behind the coverage database is that each coverage point can be expressed using the form:

@event if {cond} counter++

For example, if the coverage point is showing how many packets of type 'write' a block receives, the database shows how many packets of type 'write' have been received up to that point.
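The @event if {cond} counter++ form can be modelled in a few lines. This is a sketch of the information model as described above, not of any UCIS API (the class and names are invented):

```python
# Minimal model of the UCIS information model: on each event, if the
# condition holds, increment a counter. Names are illustrative.
class CoverPoint:
    def __init__(self, cond):
        self.cond = cond    # the {cond} guard, as a predicate
        self.count = 0      # the counter that the database persists

    def on_event(self, sample):
        """Called on each @event; bumps the counter when cond holds."""
        if self.cond(sample):
            self.count += 1

# Ho's example: count how many 'write' packets a block receives
writes = CoverPoint(lambda pkt: pkt["type"] == "write")
for pkt in [{"type": "write"}, {"type": "read"}, {"type": "write"}]:
    writes.on_event(pkt)
# writes.count now records two 'write' packets
```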

"An important point is that the data model is not the same as the information model. Another important concept is that the data model uses a primary key to look up each individual object. It has to be unambiguous.

"UCIS represents the data model through a set of objects. There are four object types. You have scopes, coveritems, history nodes and attributes.

"The scopes give you a hierarchical view of the coverage model. Coveritems are where the counts are stored. And history nodes store information about the test used. The API gives you access to all these types of objects."

Ho explained that the scopes help provide the unique ID for each object: they are named according to which node on the tree of scopes the object occupies. "Scopes give you design hierarchy. It's not very readable to humans but it provides a unique name to the tool.
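Deriving that unique name from the scope tree is straightforward to sketch: an object's key is the dotted path from the root to its node. A minimal, illustrative model (not the UCIS scope API):

```python
# Sketch of hierarchical scope naming: each object's primary key is its
# path from the root scope, e.g. "top.cpu.alu". Names are invented.
class Scope:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.children = {}

    def child(self, name):
        """Find or create a child scope with the given name."""
        return self.children.setdefault(name, Scope(name, self))

    def path(self):
        """Walk up to the root to build the unique dotted name."""
        parts, node = [], self
        while node:
            parts.append(node.name)
            node = node.parent
        return ".".join(reversed(parts))

top = Scope("top")
alu = top.child("cpu").child("alu")
# alu.path() yields "top.cpu.alu" - unambiguous, if not very human-friendly
```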

"It was important to have agreement between vendors – so we provided advisory templates so that companies use the same kind of mapping to create the same kinds of objects."

The data interchange format is XML, Ho explained. "One of the things we've worked on is to agree on ways to interchange data. We interchange data through text, using XML as a basis. It follows the same model as used in the API.
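The shape of such a text interchange can be sketched as a round-trip through XML. The element and attribute names below are invented for illustration; the real interchange schema is defined by the UCIS standard itself:

```python
# Hypothetical round-trip of coverage counts through XML, mirroring the
# idea of a text-based interchange. Element names are not from the
# UCIS schema.
import xml.etree.ElementTree as ET

def to_xml(counts):
    root = ET.Element("coverageData")
    for name, count in sorted(counts.items()):
        ET.SubElement(root, "coverItem", name=name, count=str(count))
    return ET.tostring(root, encoding="unicode")

def from_xml(text):
    root = ET.fromstring(text)
    return {e.get("name"): int(e.get("count"))
            for e in root.iter("coverItem")}

counts = {"top.fifo.write": 15, "top.fifo.read": 3}
assert from_xml(to_xml(counts)) == counts  # lossless round-trip
```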

Know your limitations

"There are limitations, so caveat emptor. Information loss is inevitable. You can't save everything. The idea is that you save the most important things. An assumption is that the source code is available: we don't save it. If you are a design team that practices good hygiene, you can pull the source back from your code repository.

"We also don't provide automatic interoperability. Vendors use different semantics for metrics. Until there is agreement on metrics, we can't provide automatic interoperability. But we can provide an enabler," Ho claimed.

"Through 2012 the focus is really on adoption and feedback," said Ho. "We are asking users to provide applications, such as merging, filtering for reports, calculating coverage scores, ranking the best tests or doing a merge between formal and simulation results. We are hoping users will do that. So we created an area on the Accellera website where users can provide little applets. We want people to provide as many as they can to provide greater utility for UCIS."
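One of the applications Ho lists, ranking the best tests, is commonly done greedily: repeatedly pick the test that adds the most new coverage. A sketch of that approach, with the data shapes invented for illustration:

```python
# Illustrative greedy test ranking over a coverage database: each test
# maps to the set of cover-points it hits. Not part of the UCIS standard.
def rank_tests(tests):
    """Return test names in order of marginal coverage contribution."""
    covered, order = set(), []
    remaining = dict(tests)
    while remaining:
        best = max(remaining, key=lambda t: len(remaining[t] - covered))
        if not remaining[best] - covered:
            break  # no remaining test adds new coverage
        order.append(best)
        covered |= remaining.pop(best)
    return order

tests = {
    "t1": {"a", "b"},
    "t2": {"b"},          # fully subsumed by t1
    "t3": {"c", "d", "e"},
}
# t3 contributes the most points, then t1; t2 adds nothing new
```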
