Bringing true power analysis to hardware/software co-design

By Bill Neifert |  No Comments  |  Posted: May 15, 2014
Topics/Categories: Embedded - Architecture & Design, EDA - ESL, Verification

Bill Neifert is chief technology officer of Carbon Design Systems. Bill has designed high-performance verification and system integration solutions, and also developed an architecture and coding style for high-performance RTL simulation in C/C++.

The convergence of hardware and software is well under way. We’ve settled on some of the basics of hardware/software co-design and verification, but we need to take one other important consideration further: power analysis.

Progress so far

I started in hardware design 20 years ago and one of the first projects I worked on tackled system-level verification for a multiprocessor server. I had to assemble a virtual model of the system in RTL simulation, run some software on it and use that to find bugs that might have been missed by our verification team — back then just one other person on a five-member ASIC project. I was too busy doing the hardware verification and debug to write dedicated software for this task. So I borrowed code that the firmware team was writing to boot the server. By using that, we were able to find all sorts of issues in both the hardware and software.

Later I became an application engineer. I was impressed by the more sophisticated verification methodologies in place at other companies, but one thing surprised me — the lack of system software within them. Other than the folks using an emulator to bring up RTL code when it was complete, few people used real system software in a meaningful way.

Verification methodologies have evolved greatly since. At the block level, verification is done amazingly well. The convergence of multiple hardware blocks together with the software to drive them may still seem more ad hoc than we’d like. But EDA has seen this need and is bringing solutions to market, typically focused around virtual prototypes with a tie to accuracy either by using accurate virtual models or a legacy solution such as an emulator.

There are even products coming out that automate the creation of software tests to exercise the complete system. I sat on a panel at MTVCon recently that focused on system-level performance analysis. Everyone agreed that virtual prototypes with a tie to accuracy were the best way to accomplish that. The lack of controversy may have made for a less confrontational session than the audience wanted, but it showed how far we have come.

Power analysis needs more thought

For all that progress though, software-driven power analysis lags behind software-driven verification, at least at those points in the design cycle where it can have greater impact. Most engineers seem content to do power analysis by running block-level vectors extracted from RTL verification through power analysis tools. This has two fundamental drawbacks:

  1. RTL verification is typically designed to maximize toggle coverage in a block, not represent the behavior of the block in the actual system.
  2. The behavior of a block can be greatly affected by its interaction with the other components in the system, as well as the software that drives it. You need to take a broader view.

This approach generates power numbers that miss the critical system context and are typically geared toward the worst case, since it’s rare that a real block will toggle as much under a software load as it will in a modern RTL testbench.
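The gap between testbench-driven and software-driven activity can be made concrete with the standard dynamic-power relation P = α·C·V²·f, where α is the toggle (activity) factor. The sketch below is purely illustrative; the capacitance, voltage, frequency, and activity numbers are hypothetical, not taken from any real design.

```python
def dynamic_power_mw(alpha, cap_pf, vdd, freq_mhz):
    """Dynamic power in mW from P = alpha * C * V^2 * f.

    Units: cap_pf in picofarads, vdd in volts, freq_mhz in MHz.
    pF * V^2 * MHz gives microwatts, so divide by 1000 for mW.
    """
    return alpha * cap_pf * vdd ** 2 * freq_mhz / 1000.0

# A coverage-driven RTL testbench is built to toggle as many nets as
# possible each cycle (high alpha)...
testbench_power = dynamic_power_mw(alpha=0.5, cap_pf=2000, vdd=0.9, freq_mhz=1000)

# ...while real software exercising the same block touches far fewer
# nets per cycle (low alpha), so the block burns much less power.
software_power = dynamic_power_mw(alpha=0.1, cap_pf=2000, vdd=0.9, freq_mhz=1000)

print(f"testbench-vector estimate: {testbench_power:.0f} mW")
print(f"software-driven estimate:  {software_power:.0f} mW")
```

With these (invented) numbers the testbench vectors report several times the power of a realistic software load, which is why block-level vectors tend to land near the worst case.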

Elsewhere, some design teams are grabbing system-level power numbers by running software on a previous version of an SoC and judging the impact of routines that way. Since most designs are refinements of previous ones, this offers a fair approximation for only a portion of the whole. It does not generate power numbers for new features. Anyone working on a completely new design is pretty much out of luck.

Power analysis leveraging virtual prototypes

Leading-edge methodologies seem to be gravitating toward power analysis that leverages the same virtual prototype technology being adopted for software-driven, system-level verification. This means running software on the virtual prototype, extracting waveforms and running those through a power analysis tool. The activity represented by these waveforms is far more representative of actual system behavior. But while it may be accurate, it takes a lot of time to generate those waveforms and then run the results through a power analysis tool.
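The middle step of that flow, reducing a waveform dump from a software run into the per-signal activity factors a power-analysis tool consumes, can be sketched as follows. The trace format and signal names here are invented for illustration; a real flow would parse a VCD or similar dump.

```python
def activity_factors(trace, total_cycles):
    """Compute toggles-per-cycle for each signal in a change-event trace.

    trace: list of (cycle, signal, value) value-change events, in time order.
    Returns a dict mapping signal name -> activity factor (toggles / cycle).
    """
    toggles = {}
    last = {}
    for cycle, sig, val in trace:
        # Only a change relative to the signal's previous value is a toggle;
        # the first event just establishes the initial value.
        if sig in last and last[sig] != val:
            toggles[sig] = toggles.get(sig, 0) + 1
        last[sig] = val
    return {sig: n / total_cycles for sig, n in toggles.items()}

# Hypothetical trace from a short software-driven run on the prototype.
trace = [
    (0, "bus_req", 0), (3, "bus_req", 1), (4, "bus_req", 0),
    (0, "clk_en", 1), (9, "clk_en", 0),
]
print(activity_factors(trace, total_cycles=10))
# bus_req toggled twice in 10 cycles, clk_en once
```

Even this toy version shows where the time goes: every net's every value change must be recorded and post-processed, which is exactly the overhead the next refinement tries to remove.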

Better strategies take this convergence to another level. They use software running on a virtual prototype to create power vectors to run through an analysis tool, and then use this data to instrument the virtual prototype for future use. This means that late software runs do not suffer from the slowdown inherent in dumping waves for post-process analysis.
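One way to picture that instrumentation is an event-based power model: after a single slow, accurate characterization pass, each block state gets an average power figure, and later software runs simply look up those figures as blocks change state, with no wave dumping. The block names, states, and power numbers below are hypothetical, a sketch of the idea rather than any vendor's implementation.

```python
from collections import defaultdict

# Per-state power figures (mW) that would come from the one-time
# characterization run through the power-analysis tool. Invented values.
POWER_TABLE_MW = {
    ("gpu", "idle"): 5.0,
    ("gpu", "render"): 420.0,
    ("dma", "idle"): 1.0,
    ("dma", "burst"): 35.0,
}

class PowerMonitor:
    """Accumulates per-block energy as blocks report state residency."""

    def __init__(self, table):
        self.table = table
        self.energy_uj = defaultdict(float)  # per-block energy, microjoules

    def spend(self, block, state, duration_us):
        # energy (uJ) = power (mW) * time (us) / 1000, since mW * us = nJ.
        self.energy_uj[block] += self.table[(block, state)] * duration_us / 1000.0

mon = PowerMonitor(POWER_TABLE_MW)
# A software routine keeps the GPU rendering for 200 us, then idle for 800 us.
mon.spend("gpu", "render", 200)
mon.spend("gpu", "idle", 800)
print(f"gpu energy over the run: {mon.energy_uj['gpu']:.1f} uJ")
```

Because each state change is a single table lookup rather than a full waveform record, software teams can get power feedback at near the prototype's native simulation speed.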

This convergence is confined to a small but slowly growing number of design teams. As power needs grow greater and software content continues to increase, I expect it to continue and see many more design teams adopt this methodology.

Learn more about Carbon Design Systems' strategies for virtual prototyping at


