Cadence extends AI to verification data with unified database

By Chris Edwards |  No Comments  |  Posted: September 15, 2022
Topics/Categories: Blog - EDA

Already a keen proponent of machine learning for EDA, Cadence Design Systems is attempting to bring together as much data as possible from EDA tool runs to feed its models, with the introduction of a common data-analytics engine and database it calls the Joint Enterprise Data and AI (JedAI) platform.

“As chip design size and complexity has increased exponentially over the past decade, the volume of design and verification data has also increased with it,” said Venkat Thanvantri, vice president of AI R&D at Cadence. “Previously, we saw that once a chip design project was completed, the valuable data was deleted to make way for the next project. There are valuable learnings in the legacy data, and JedAI makes it easy for engineering teams to access these learnings and apply them to future designs to deliver optimal engineering productivity and PPA and ultimately more predictable, higher quality product outcomes.”

Supporting both structured and unstructured data, JedAI is designed to feed the Cerebrus implementation and Optimality system-optimization tools, as well as third-party silicon lifecycle-management systems and the Verisium verification platform, which launched at the same time and includes tools brought into the stable with the acquisition of Verifyter at the end of February.

The platform is intended to handle a variety of EDA data inputs, including design data such as waveforms, coverage reports, physical layout shapes, and timing and voltage analyses, as well as workflow data from the tools, such as runtime, memory usage, and which tools were used.

One of the aims is to make it easier to compare metrics across different versions of the same design or multiple designs and use the results to guide actions that improve PPA and increase verification coverage.

Another proposed use is to capture chip design methodologies and automatically transfer design data between projects through data connectors. Users will be able to build their own analytics using Python in Jupyter Notebook, or in other languages that can access the platform's REST APIs.
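As a sketch of what such user-built analytics might look like, the snippet below computes a coverage delta across two design revisions from a JSON payload. The payload shape, field names, and metrics are assumptions for illustration; a real JedAI REST endpoint and its schema would be defined by Cadence's documentation.

```python
import json

# Hypothetical payload, shaped like what a JedAI-style REST endpoint might
# return for one design across revisions. Field names are assumptions.
payload = json.loads("""
{
  "design": "soc_top",
  "runs": [
    {"revision": "v1.0", "coverage": {"functional": 78.2, "code": 91.5}},
    {"revision": "v1.1", "coverage": {"functional": 84.6, "code": 93.1}}
  ]
}
""")

def coverage_delta(runs, metric):
    """Change in a coverage metric between the first and last recorded run."""
    first, last = runs[0], runs[-1]
    return round(last["coverage"][metric] - first["coverage"][metric], 2)

print(coverage_delta(payload["runs"], "functional"))  # 6.4
```

The same comparison could be run across multiple designs to guide where verification effort is best spent, which is the cross-project use case the platform is aimed at.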

The new Verisium portfolio is a suite of applications designed to use big data and AI to optimize verification workloads, improve coverage, and accelerate the root-cause analysis of bugs.

Verisium brings together a range of verification data sources, including waveforms, coverage data, reports, and log files. The initial set of tools includes AutoTriage, which builds machine-learning models to help automate the repetitive task of regression failure triage by predicting and classifying test failures that share common root causes.
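To illustrate the grouping idea behind failure triage, the sketch below buckets regression failures by a normalized error signature, so failures that likely share a root cause land together. This rule-based stand-in is not Cadence's method; AutoTriage uses trained models, and the log messages here are invented.

```python
import re
from collections import defaultdict

# Invented regression-failure log lines for illustration only.
failures = [
    ("test_axi_burst", "UVM_ERROR @ 1520ns: scoreboard mismatch addr=0x4000"),
    ("test_axi_wrap",  "UVM_ERROR @ 2210ns: scoreboard mismatch addr=0x8000"),
    ("test_irq_storm", "UVM_FATAL @ 900ns: watchdog timeout"),
]

def signature(msg):
    # Strip run-specific values (timestamps, addresses) so failures with the
    # same underlying cause produce the same signature.
    return re.sub(r"0x[0-9a-fA-F]+|\d+ns", "<X>", msg)

buckets = defaultdict(list)
for test, msg in failures:
    buckets[signature(msg)].append(test)

for sig, tests in buckets.items():
    print(sig, "->", tests)
```

Here the two scoreboard mismatches collapse into one bucket while the watchdog timeout stays separate, which is the kind of classification that saves an engineer from triaging each failure individually.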

The WaveMiner tool analyzes waveforms from multiple runs to determine which signals, at which times, are most likely to represent the root cause of a test failure. One of the tools that came with the Verifyter acquisition is PinDown, which focuses on source code changes, test reports and log files to predict which source code check-ins are most likely to have introduced failures. These tools work with a new debug and visualisation tool and a verification-management environment. Though Verifyter had been working on triage based on log files, Moshik Rubin, Cadence senior product marketing group director, said the AutoTriage tool is the result of work within the parent company.
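A rough analogue of the pass/fail waveform comparison can be sketched as follows: compare each signal's trace from a passing and a failing run and report the earliest sample at which they diverge. The signal names and traces are invented, and WaveMiner's actual analysis is far more sophisticated than this element-wise diff.

```python
# Invented two-signal traces from a passing and a failing run.
passing = {"req": [0, 1, 1, 0], "ack": [0, 0, 1, 0]}
failing = {"req": [0, 1, 1, 0], "ack": [0, 0, 0, 0]}

def first_divergence(pass_trace, fail_trace):
    """Index of the first sample where the two traces differ, else None."""
    for t, (a, b) in enumerate(zip(pass_trace, fail_trace)):
        if a != b:
            return t
    return None

suspects = {sig: first_divergence(passing[sig], failing[sig])
            for sig in passing}
suspects = {sig: t for sig, t in suspects.items() if t is not None}
print(suspects)  # {'ack': 2}
```

Signals that diverge earliest are the most promising starting points for root-cause analysis, since everything downstream of the first divergence may simply be a symptom.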

The tools have been in early use at several customers. “We have already observed a significant boost to functional verification productivity, leveraging Verisium AutoTriage, SemanticDiff and WaveMiner,” claimed Mirella Negro Marcigaglia, STM32 digital verification manager at STMicroelectronics.
