Next wave of innovation in verification technology must come from integration

By Warren Stapleton | Posted: February 27, 2014
Topics/Categories: EDA - Verification

Warren Stapleton is a Senior Fellow in AMD’s verification methodology team, responsible for the development of AMD’s long-term verification strategy.

SoC verification has never been simple. With each new chip generation, we’ve pushed the limits of our verification technology.

But now, as SoC gate counts skyrocket and our target applications call for constant connectivity, we’ve reached a tipping point. The variety of IP on a single chip, multiple protocols, timing complexity, power concerns, and the need to validate software while we’re still verifying the hardware are all pushing design teams to adopt new, best-in-class verification technologies to get the job done.

While we’ve taken advantage of new technologies in areas such as simulation, static and formal verification, low-power verification, and advanced VIP, there is still plenty of room for innovation in integrating these technologies.

At AMD, we went through our own integration process when we started developing our APU SoCs back in 2006. We integrated our leading-edge CPU and GPU into a unified SoC so we could combine functions, boost performance, lower costs, and optimize the way each core worked. Creating a tight integration between these previously separate products created value for our customers.

After the initial successful APU integration we were able to build on that foundation: it gave us the opportunity, on future revs, to exploit the synergy between the different processors as we sought further optimizations. Since then we’ve been able to meld the processors together to reduce redundancy and leverage their combined performance. With tighter integration we now deliver a better overall product experience to our customers.

The next wave of innovation in verification software technology needs to follow the same strategy. We need to integrate the best-in-class tools available to us, such as advanced formal verification, low-power verification, software validation, and high-performance simulation, with unified debug and unified coverage.

Verifying an SoC today requires a number of software technologies that are available only in loosely coupled environments. For example, if the team wants to validate software using a C-level model, it is difficult to correlate that debug information with the hardware environment. Not only do we spend time gluing together a lot of separate verification technologies, we sometimes get different results from each one and then spend engineering time and resources resolving the discrepancies. This glue work is low-RoI: the effort goes into simply making the different technologies talk to one another, yet it still limits our ability to fully exploit the best features of each tool.
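To make the gluing concrete: a common pattern is to bind a C-level reference model into a SystemVerilog testbench through the standard DPI-C interface. The sketch below is a minimal, hypothetical example of that bridge; the function and state names are invented for illustration, not drawn from any AMD environment.

```c
/*
 * Hypothetical DPI-C glue between a C-level reference model and a
 * SystemVerilog testbench. On the SV side, the import would look like:
 *
 *   import "DPI-C" function int cmodel_step(input  int opcode,
 *                                           input  int operand,
 *                                           output int result);
 */
#include <stdio.h>

/* Internal model state. Because it lives outside the simulator, it is
 * invisible to the HDL debug tools; this is the correlation gap
 * described above. */
static int cmodel_state = 0;

int cmodel_step(int opcode, int operand, int *result)
{
    switch (opcode) {
    case 0:                     /* ADD */
        cmodel_state += operand;
        break;
    case 1:                     /* RESET */
        cmodel_state = 0;
        break;
    default:
        return 1;               /* unknown opcode */
    }
    *result = cmodel_state;

    /* Debug output lands in a plain log, not the waveform database, so
     * lining it up with simulation time is manual work. */
    fprintf(stderr, "cmodel: opcode=%d operand=%d state=%d\n",
            opcode, operand, cmodel_state);
    return 0;
}
```

Each bridge like this is simple in isolation; the cost comes from writing, maintaining, and debugging many of them, with no shared debug or coverage view across the C/HDL boundary.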

If the collection of disparate technologies that we use could be integrated into a unified product with common debug and coverage environments, it would give our design teams a substantial productivity boost. Going beyond that initial integration to a more native merging of technologies could create further synergy between the pieces of the verification environment, adding significant value for SoC design teams in the long run.

However, high levels of integration have their downsides. It is important to keep in mind that there may still be a need to reach external capabilities, whether legacy tools or newly developed technology.

In the case of AMD’s APUs, as we integrated our processors more tightly, we also continued to improve their internal modularity and defined cleaner interfaces between them. Had we not adhered to our internal standards in developing these interfaces, it would have been much more difficult to swap out an IP and replace it with next-generation code, or to retarget it for a different application when necessary.

Similarly, an integrated verification system must remain open and interoperable, and adhere to industry standards. This includes SystemVerilog VIP and verification engines that work openly with debug and analysis tools. As an example of this growing focus on the integration problem, the Accellera standards body has been working in many areas that contribute to improved reusability across the entire verification and design space: examples include the work on UVM, the Multi-Language Working Group, and the IP-XACT standard.

In summary, I believe the next wave of innovation in verification technology must come from improved levels of integration. This will enable us, as design and verification teams, to focus on where we really add value: being excellent designers, rather than spending our time as tool integrators.

Author

Warren Stapleton is a Senior Fellow in AMD’s verification methodology team, where he is responsible for the development of AMD’s long-term verification-related strategy. Prior to AMD, he held engineering and management positions at Montalvo Systems, Azul Systems, Redback Networks, Siara Systems, Nexgen, and Austek Microsystems. He earned degrees in both Electrical Engineering and Mathematics from the University of Sydney.
