Keynote speakers at this year’s DVCon Europe conference called on the verification community and silicon designers overall to take a look at the software industry to find ideas that might speed up processes and improve the quality of the product.
Though software is hardly bug-free when it ships and generally suffers fewer consequences when those bugs emerge, the software world has warmly embraced some techniques that could improve the hardware design and verification process. Moshe Zalcberg, CEO of Veriest Solutions, put together a “menu of tasty ideas” that draws heavily on software concepts, though not exclusively, in his talk.
“The amount of time spent in verification is not getting better,” Zalcberg said. “Maybe we have fewer bugs as a result, but the number of required respins is pretty much the same if not worse than it was in the past. And a good part of the cause of those respins is functional errors. I know hardware is hard and it’s only getting harder, but can we do better?”
The first possibility is greater use of open-source software, and not necessarily for the core EDA tools, though that is the focus of DARPA’s Electronics Resurgence Initiative. Many teams have already embraced regression environments built around Jenkins and similar tools: in her keynote, Arm vice president of system engineering Vicki Mitchell described in detail how the IP supplier uses these tools to build a highly automated check-in and verification-management system.
Zalcberg pointed to other tools that might prove useful to individual designers and verification engineers. The main example he chose is the language Python. “It’s simple to use and maintain. College graduates know how to use it and there are lots of tools and libraries available for it in open-source form,” he said. “There are limitations: there is no good constrained-random generator for it and no built-in functional coverage tools.”
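As a rough illustration of the gap Zalcberg described, a team working in plain Python could approximate constrained-random stimulus with rejection sampling over the standard library’s `random` module. The packet fields and the boundary constraint below are hypothetical, chosen only to show the pattern:

```python
import random

def constrained_random_packet(seed=None):
    """Generate one bus transaction under simple constraints, loosely
    mimicking SystemVerilog-style constrained random in plain Python.
    Field names and constraints are illustrative only."""
    rng = random.Random(seed)
    while True:
        pkt = {
            "addr": rng.randrange(0, 2**16, 4),  # word-aligned address
            "length": rng.randint(1, 64),        # burst length in words
            "write": rng.random() < 0.5,
        }
        # Constraint: a burst must not cross a 256-byte boundary.
        # Rejection sampling: draw again until the constraint holds.
        if (pkt["addr"] % 256) + 4 * pkt["length"] <= 256:
            return pkt

pkt = constrained_random_packet(seed=42)
```

Rejection sampling is simple but scales poorly as constraints tighten, which is part of why purpose-built constraint solvers matter in commercial verification tools.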
Python is already a feature for some teams. Zalcberg cited Harry Foster’s latest survey, which indicated 30 per cent of users are applying Python to verification tasks now or plan to over the next 12 months. “This was the first time he asked about it, but the fact is that it has become part of the map. It’s clear the industry is looking for other solutions.”
Zalcberg pointed to agile methodologies as providing another source of inspiration for verification. This is already underway at Arm, Mitchell explained.
“The Agile Manifesto has 12 principles,” Zalcberg said. “But I want to touch on just three: continually deliver value; deliver working code quickly, emphasizing that the code has to be working; and maintain a constant pace. One of the benefits of this is in the transparency and predictability of the schedule.”
Though it is not entirely intuitive why agile should provide more predictable schedules, Zalcberg argued this could come from a subtle repositioning of the way milestones are allocated and scheduled compared with a traditional waterfall process.
“Say you have a feature list. Every four weeks, you allocate four features. But at the end of the sprint, you find Feature D doesn’t work. What you’ll do is, at the beginning of the next sprint, decide D is rolled over to that next sprint with three other new features. The impact on the overall schedule is limited because you have a hard stop on the sprint.
“With milestones, we plan from A to Z,” Zalcberg added. “I’m simplifying this, but if a task drags, we usually push the milestone out, and that drags out the other tasks that should have been done as well. Everything gets dragged out because the milestones move. And something that was meant to take a year now takes longer. That may also happen in agile because you have more sprints. But with the fixed cadence, you should be delivering something every sprint.”
Agile affects the check-in process, Zalcberg said, as it puts greater emphasis on unit testing, much of it carried out by designers rather than having blocks thrown over the wall and put in a queue for the verification team. Coding standards become more important, he said. And there is also a greater focus on continuous integration as the emphasis in agile is delivering a minimal viable product at each sprint. This helps avoid the “integration hell” that often faces waterfall-process projects where issues with interface and functional compatibility do not manifest until late in the day.
Mitchell said her experience in internet software delivery, where the emphasis is solidly on developer operations, or devops, helped shape how agile techniques are being exploited at Arm. “I had the opportunity to see how agile and devops practices improved the efficiency of the team and the overall product. We adopted the same practices so that hardware can realise some of the same benefits, such as early error detection and frequent releases.
“Our road to hardware devops was basically two steps. At a very high level, we had to do two things, in phases. During the first phase we made a change from a waterfall process to one that uses feature-based milestones. In the second phase we automated the integration. It’s more than automated: we made it intelligent to make sure quality levels are maintained for those feature-based milestones.”
In the Arm environment, gatekeeping functions perform sanity tests on checked-in code and only allow a module to join the main branch if it passes. This approach has let Arm avoid having to manage separate development branches that can become difficult to maintain, though it demands care and attention to keep the test framework from becoming bloated over time, Mitchell explained.
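The gatekeeping idea reduces to a simple rule: a change reaches the main branch only if every sanity test passes. A minimal sketch, assuming the tests are exposed as pass/fail callables (the structure is illustrative, not Arm’s actual flow):

```python
def gatekeeper(test_names, run_test):
    """Admit a check-in to the main branch only if every sanity test
    passes. `run_test(name)` returns True on pass; in a real flow it
    would launch a simulation or lint job for that test."""
    failures = [name for name in test_names if not run_test(name)]
    # Empty failure list means the change may merge; otherwise it is
    # rejected and the failing tests are reported back to the author.
    return (len(failures) == 0, failures)

# Usage: a change that fails one smoke test is kept off the main branch.
ok, failed = gatekeeper(["reset", "basic_rw"], lambda t: t != "basic_rw")
```

Keeping the gating suite small and fast is the care-and-attention part Mitchell mentioned: every test added to the gate is paid for on every check-in.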
Harnessing machine learning
Zalcberg said greater use of the cloud will help with resource management, something that some teams have already found, particularly with compute-intensive functions such as physical verification. “We are still not embracing the cloud the way we could,” he said, pointing to the potential for greater use of analytics of various forms.
“Last year I was sitting on a DVCon panel that was about imagining how AI could reshape the verification landscape. We were there in 2019 imagining how. If we look at our friends on the software side, they are not imagining, they are doing. They have AI-aware editors that catch bugs when they are committed. They suggest code snippets to developers. And it’s already part of GitHub, where there are functions to search for potential security vulnerabilities,” Zalcberg said. “But there is some work being done on our side of the fence as well.”
That is work that is taking place at Arm, Mitchell explained. One example is the use of machine learning in regression testing. “It’s a tool developed by the productivity engineering group at Arm and provides an improvement to our gatekeeper flows. It’s a statistical tool that optimizes all the tests into a 50-[component] smoke test that can be run periodically and optimized in different ways: for wall-clock time, statistical coverage or for pass/fail testing.”
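One common way to distil a large regression into a small smoke suite of the kind Mitchell described is greedy selection: repeatedly pick the test that adds the most not-yet-covered points per second of runtime. The sketch below shows that general technique under assumed inputs; it is not Arm’s actual tool or algorithm:

```python
def select_smoke_suite(tests, budget):
    """Pick up to `budget` tests from `tests`, a dict mapping a test
    name to (set of coverage points, runtime in seconds). Greedily
    maximises newly covered points per second of runtime."""
    covered, suite = set(), []
    while len(suite) < budget:
        best, best_gain = None, 0.0
        for name, (points, runtime) in tests.items():
            if name in suite:
                continue
            gain = len(points - covered) / runtime  # new points per second
            if gain > best_gain:
                best, best_gain = name, gain
        if best is None:       # nothing adds new coverage; stop early
            break
        suite.append(best)
        covered |= tests[best][0]
    return suite, covered

# Usage: two quick tests beat one slow test that covers slightly more.
tests = {
    "t1": ({1, 2, 3}, 1.0),
    "t2": ({3, 4}, 1.0),
    "t3": ({1, 2, 3, 4, 5}, 10.0),
}
suite, covered = select_smoke_suite(tests, budget=2)
```

Swapping the gain function is how such a tool could be “optimized in different ways”: dividing by runtime optimises for wall-clock time, while weighting points by historical failure rates would bias the suite toward pass/fail detection.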
Though agile could span a design project, Zalcberg pointed to one European user where it’s isolated to the verification team. The minimum viable product in this case is the suite of verification tests. “The hardware guys wanted to work waterfall and it’s working. Agile means different things to different people.”