In his keynote at DVCon held this week (March 1st), Synopsys R&D vice president Manish Pandey described the ways in which the tools supplier has harnessed machine learning so far to gain speedups and improvements in coverage.
Pandey explained why verification is the target for much of this work, drawing on statistics showing that this phase of SoC design dominates an ever-larger share of the overall project lifecycle. As well as driving an acceleration in cost as transistor counts have increased, verification now accounts for 80 per cent of project time, he said. The question is whether technologies such as machine learning could deliver hundred-fold speedups, 10x cost reductions and 10-percentage-point improvements in quality of results. “That doesn’t sound like a lot but this would take coverage from 80 per cent to 90 per cent. And the last 10 to 20 per cent is the hardest.”
The work at Synopsys on machine learning has drawn on many of the major techniques, Pandey said. Supervised, unsupervised and reinforcement learning can all play a part at different times, though reinforcement learning has proven widely applicable, particularly as it does not rely on the data labelling that underpins supervised learning. As a result, he noted, much of the use of any particular machine-learning approach is tactical and specific to the job it is asked to do.
AI’s place in EDA
Another observation is that machine learning is largely a supplement to conventional EDA algorithms such as those used in formal-verification engines. “One thing that’s very important to understand is that machine learning cannot produce the result itself. The results [from EDA algorithms] are deterministic, these results cannot be probabilistic. Probabilistic results can be used for decision-making but not for producing the actual result, which must be deterministic and always correct,” he explained.
There are problems with implementing machine learning in EDA, Pandey noted. In contrast to applications such as image classification and natural-language processing, “there is no standard labelled set we can refer to. The objects are not fixed-size.” Many of the data objects manipulated by EDA tools are best handled as graphs. “Working with graphs is harder than working with fixed-size images,” he said. Implementers also need to be mindful of the inference runtime: it only makes sense to draft in AI if the inferences the models run can execute in a fraction of the time of performing deterministic tests or simulations.
Pandey pointed to a number of projects at Synopsys where machine learning has sped up execution, improved quality of results, or both. In the VCS simulator, the VSO.ai project employed machine learning to guide constrained-random verification (CRV) test generation in the direction of more productive stimuli. According to company statistics, this led to greater coverage using fewer tests. Where traditional CRV would have taken more than a thousand passes to reach the coverage target for a mobile design, the AI-driven algorithm got there in fewer than ten.
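The mechanics of coverage-guided stimulus selection can be illustrated with a toy sketch. This is not Synopsys's VSO.ai algorithm, which is proprietary; it is a minimal epsilon-greedy bandit over hypothetical constraint "knobs", where `run_test` stands in for a simulation pass and the learner favours whichever knob has recently yielded the most new coverage bins.

```python
import random
from collections import defaultdict

def run_test(knob):
    # Stand-in for one simulation pass: returns the set of coverage
    # bins the stimulus hit. In this toy model, higher knob settings
    # make each of the ten bins more likely to be exercised.
    return {b for b in range(10) if random.random() < (knob + 1) / 12}

def bandit_crv(knobs, target, max_passes):
    """Epsilon-greedy selection over constraint knobs: mostly exploit
    the knob whose recent runs produced the most new coverage, but
    explore a random knob 20% of the time."""
    covered = set()
    gain = defaultdict(lambda: 1.0)  # optimistic initial estimate
    for i in range(max_passes):
        if random.random() < 0.2:
            knob = random.choice(knobs)            # explore
        else:
            knob = max(knobs, key=lambda k: gain[k])  # exploit
        new = run_test(knob) - covered
        covered |= new
        gain[knob] = 0.7 * gain[knob] + 0.3 * len(new)  # running average
        if len(covered) >= target:
            return i + 1, covered
    return max_passes, covered
```

The real problem is vastly harder (coverage spaces are enormous and stimuli are structured), but the loop shape, measuring novel coverage per run and steering generation toward productive regions, is the same idea.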
In common with several other EDA suppliers, Synopsys has applied machine learning to engine selection in formal verification, using in its case reinforcement learning to train the orchestration subsystem. Similarly, AI is being used to pick RTL tests for nightly regressions so that more valuable tests are prioritized.
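Test selection for nightly regressions can be reduced, at its simplest, to ranking by failure history. The sketch below is a plain frequency-based heuristic, not Synopsys's trained model: tests that failed most recently run first, then those with the highest historical failure rate, with never-run tests placed at middling priority.

```python
def prioritize(tests, history):
    """Order regression tests so the likeliest failures run first.
    `history` maps a test name to its list of past outcomes,
    oldest first (True = pass, False = fail)."""
    def score(name):
        runs = history.get(name, [])
        recent_fail = 1 if runs and runs[-1] is False else 0
        fail_rate = runs.count(False) / len(runs) if runs else 0.5
        return (recent_fail, fail_rate)  # sort key, highest first
    return sorted(tests, key=score, reverse=True)
```

A learned model would replace `score` with a predictor trained on richer features (which RTL files changed, test runtimes, coverage overlap), but the ranking-and-budgeting structure around it is unchanged.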
Clustering homes in on violations
In other areas of static verification, clustering has provided a means for making the results from tools used to check clock-domain crossings or reset trees quicker and easier to digest. Often a single core error will generate thousands of violations, which obscures the root cause. Clustering can reduce more than 100,000 violations to the order of a hundred clusters that make it more obvious which piece of logic is the primary cause. “The important thing is you minimise the number of human waivers. Very often they waive out important violations.”
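The fan-out effect described above is easy to demonstrate. The sketch below is an illustrative stand-in, not the tool's actual clustering: it groups hypothetical crossing violations by their driving signal, so a single mis-synchronised driver that generated thousands of reports surfaces as one large cluster at the top of the list.

```python
from collections import defaultdict

def cluster_violations(violations):
    """Group crossing violations by their source (driving) signal.
    One unsynchronised driver fanning out to thousands of destinations
    then appears as a single cluster instead of thousands of rows."""
    clusters = defaultdict(list)
    for v in violations:
        clusters[v["source"]].append(v)
    # Largest clusters first: the widest fan-out is the likeliest root cause.
    return sorted(clusters.values(), key=len, reverse=True)
```

Production tools cluster on richer signatures than a single field, but the payoff is the same: the engineer reviews a hundred clusters rather than waiving a hundred thousand individual violations.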
Using a combination of supervised and unsupervised learning, Synopsys implemented an assistant for its logic debug tools that helps indicate where the root cause is more likely to lie in terms of the time window and candidate gates in the schematic.
Pandey said the use of deep learning can go further in EDA, pointing to recent work, presented by Synopsys at the KDD conference last summer, on using the Transformer-based technology now applied to natural-language processing to convert specifications into assertions that can be applied to formal verification or tests in simulation. “With iterative refinement and user feedback, it has become a very useful tool,” he claimed, offering savings in both time and labor.
Agnisys is also presenting a paper on this kind of technique at the conference this week.