Rising to the verification challenge of open source
The open-source silicon movement is democratizing chip design: more engineers are using freely available tools to create chips that do not require $10M in NRE or production runs of more than 10 million units at the most advanced nodes. Freely available, robust open-source verification tools are another matter, however, especially for medical electronics (MedTech) devices.
How sensitive chips will be verified in the open-source era was one key question raised during a panel at SEMICON West. The moderator was Lucio Lanza, managing partner of Lanza techVentures. He was joined by Lu Dai, Senior Director of Engineering at Qualcomm and Chairman of standards body Accellera; Mike Chin, Principal Software Engineer at Intel; Dave Kelf, CEO of Breker Verification Systems; and Jan Vardaman, President of TechSearch International.
Lanza began the conversation by noting that open-source technologies, such as the RISC-V processor, are leading to a democratization of chip design.
“This market expansion is increasing the size of potential designs. Open-source design tools are becoming available for devices that are less sophisticated than in the past and in many different applications,” he said.
Intel’s Chin is already using open-source tools for design and layout, but he quickly identified a gap. His group does not have open-source tools for verification. “If, at the end of the day, you don’t verify the design that you arbitrarily put together, you have some very significant problems,” he said.
Chin believes that standards will drive the arbitrary design of silicon, or of whatever product an engineer wants to create. If the design does not adhere to both a specification and a standard, then it cannot be used – the work reaches a dead end. Engineers must be able to work within a framework that will deliver something safe and with zero defects.
Qualcomm’s Dai said his team has seen some open-source verification tools, but most engineers are not willing to use them for commercial products.
“The way I look at open source is similar to going to CVS or Walgreens buying generic drugs. More than 50% of the population don’t buy generic drugs. They still buy the branded one, even though all the ingredients are the same. People are just not comfortable,” he said.
So it goes too, he said, for open-source verification. An open-source chip design conceptually should work. But can engineers be comfortable using open-source verification tools? In part, he argued that depends on what end-application the tools are being used for. It might be okay for a vacuum cleaner, he said, but continued, “I’m going to pay more money to go for a safer solution if my chip is intended for a heart surgery application.”
Chin said this also reflected the goal behind any design. “Some of that comes back to verification for the specialization that you put into the chip that you’re designing,” he said. At an open-source development level, an engineer cannot anticipate all the different usages or combinations of a design: “there is a necessity for that specialization that he or she will not find publicly. It will be done by an individual or a company in order to thoroughly verify the specialized component.”
Staying with the MedTech example, Dai also said that much depends on the definition of open source. A company still makes a generic drug, and it is clear who is responsible for it. “I would be uncomfortable,” he continued, “if a product doesn’t have a liability clause. We feel comfortable with the medical device because I know that if something goes wrong, I can sue. If you can sue, the engineer is going to be more careful.”
Breker’s Kelf believes there is consequently going to be a bifurcation between open-source tools for those kinds of applications and the companies building a basic open-source platform.
Kelf believes that will work for many applications, he continued, but companies creating end-products will also have to build or buy commercial-grade, hardened verification tools that engineers feel they can rely on. These tools are and will remain expensive, but they will be necessary to cover liability issues.
Kelf believes that a big question for MedTech specifically will be whether the final target cost of the device can bear the necessary commercial verification costs.
Lanza agreed that those medical devices will not come from an open-source platform, but his view is broader. He believes open source allows engineers to undertake designs that they could not otherwise afford. Even if open-source technologies remain hard to use in some segments, those who can use them easily will be numerous enough that the number of people working on designs explodes, as will the number of applications and the number of areas addressed by electronics – many of them never having used the technology before.
“Now, is it true that they’re not going to be perfect?” he asked rhetorically. “Absolutely. If I can make a prototype for nothing, I can make a prototype of something I can finally afford.” He went on to note that strong demand will incentivize people or companies to find a way to do it cheaply. If it is done professionally, more expensively and with guarantees, open source will thus grow the engineering universe and will enable engineers to undertake design without feeling that they must become geniuses.
Dai agreed that in this wider context, open source can be a good initial stimulus, growing both the technology and the community around it. He noted that, as an example, the RISC-V CPU has already built a tremendous following. But he was doubtful that anyone would be willing to buy a RISC-V processor or a chip based on RISC-V if it were completely free – effectively, full open-source hardware.
“When it’s completely free, I probably won’t want it and I’m going to wonder what’s behind it and why it is free. If the engineer uses open source and puts something on top of it and charges for it, I may feel more comfortable,” Dai said. “The way I look at it is like when a company has a free party, they still charge $10. If they don’t, people wouldn’t care to show up.”
Kelf agreed that because the RISC-V IP might be free, designers need to go further. “We’re seeing the advent of some hard-core processor companies building an application process [around RISC-V] and going after Arm. Arm puts $100 million per year into verification of its devices. These processor companies are going to have to do the same for their RISC-V processors and they’re trying to figure that out now.”
Already, he noted, these Arm competitors use commercial EDA simulators, and sometimes emulation, and invest substantial sums in verification platforms for their big application processors, because they understand that the market is not going to welcome a free RISC-V chip.
But open source may not be completely shut out, for now, from more sensitive areas. TechSearch International’s Vardaman gave an example around work in MedTech and prosthetics. This usually involves complicated and typically, but not always, expensive circuitry. But what if that puts the technology beyond the reach of a market that desperately needs it?
“I went to school as an undergraduate in Macon, Georgia, at Mercer University, one of the oldest institutions in the south,” she said. “A group there designs inexpensive prosthetics that they take to Vietnam because so many people have been injured and lost limbs by the mines that were left behind.”
An open-source design that allows someone to make something inexpensively and deliver it to people who must otherwise walk with a stump can have a substantial impact on society, she continued. It is why some designs need to be inexpensive because the end-market may not be able to support them.
Lanza noted that RISC stands for Reduced Instruction Set Computer, and that RISC-V specifically was developed at UC Berkeley when commercial instruction sets were more complicated. Arm originally stood for Advanced RISC Machine. It was supposed to be a RISC CPU, but now Arm is an environment, an infrastructure.
“When you go back to RISC, it’s going back to the origin,” he continued. “You are reducing it to the point where you can utilize it for different applications at a reasonable cost in an easy way to design. The real issue is how are we going to work to make these RISC-V based designs, the RISC fundamental designs, verified in a safe manner?”
Kelf responded that RISC-V verification requires a layered approach. “We have the need to ensure instruction compatibility with the standard, then to check that the processor behaves correctly under all conditions including custom instructions,” he said.
“The following layer is the SoC check to test the processor on the SoC platform, verifying such areas as system coherency. Now on top of this, we have security and safety testing using such techniques as fault analysis. Some of this can be accomplished with open source. In general, commercial-grade tools and techniques are required.”
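The first of Kelf’s layers, instruction compatibility, is commonly checked by driving the same instruction stream through a trusted reference model and the design under test, then comparing architectural state. A minimal, purely illustrative sketch of that co-simulation pattern follows; the two-operation “ISA” and both model functions are invented for the example (a real flow would pair a golden instruction-set simulator with the actual RTL):

```python
import random

def ref_execute(regs, op, rd, rs1, rs2):
    """Golden reference semantics for a toy two-instruction ISA
    over a 4-entry, 32-bit register file."""
    regs = list(regs)
    if op == "add":
        regs[rd] = (regs[rs1] + regs[rs2]) & 0xFFFFFFFF
    elif op == "sub":
        regs[rd] = (regs[rs1] - regs[rs2]) & 0xFFFFFFFF
    return regs

def dut_execute(regs, op, rd, rs1, rs2):
    """Stand-in for the design under test. In this sketch it simply
    reuses the reference semantics, so the comparison always passes."""
    return ref_execute(regs, op, rd, rs1, rs2)

def compare_models(num_tests=1000, seed=0):
    """Feed random instructions to both models and compare register
    state after each one; any divergence is a compatibility bug."""
    rng = random.Random(seed)
    regs = [0, 1, 2, 3]
    for _ in range(num_tests):
        op = rng.choice(["add", "sub"])
        rd, rs1, rs2 = (rng.randrange(4) for _ in range(3))
        ref = ref_execute(regs, op, rd, rs1, rs2)
        dut = dut_execute(regs, op, rd, rs1, rs2)
        if ref != dut:
            return False  # mismatch found
        regs = ref  # advance shared architectural state
    return True
```

The same lockstep-compare structure scales up to Kelf’s higher layers; only the models and the notion of “state” become richer.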
This looks like bottom-up verification, Lanza remarked. “Here is the chip and it needs to be verified. I’ll test it. Okay, done. This chip works.
“This was perfectly okay if the targeted applications are known.”
Instead of a specific set of applications, Lanza said that the range of applications is going to vary over time and the challenges will be different. Verification is changing irrespective of how the underlying technology is behaving. Chips are the same, but the application is different. The verification starts becoming its own world at the high level as it reflects the application specification, and the engineer does not know how that will be connected to the bottom-level verification.
Kelf agreed that there is a trend toward application-specific verification, where the chip remains the same but the specification changes. Verification becomes a top-down process and specific to the application. The verification parameters change, and the design will probably use software as well, making application verification even more complex.
If the design is used in an application different from that for which it was designed, the challenges to the chip underneath are different, Lanza interjected.
Dai noted that this is still in many ways a good thing. “You design a chip that did not target a certain market. However, you were able to reuse the device for that new market. It’s a plus. Sometimes, that plus is accidental,” he said. “In the future, you will try to design your chip in a way that ensures it will be ready for different markets.”
Dai referred to the idea of system engineering, a concept that tries to determine what design features a chip will require to target a particular application but has fallen out of fashion in some segments.
“Many engineers now will say for the computer or cell phone, we don’t need a system engineer. We know everything. Well maybe, but this is not true for medical.
“For a MedTech device, we know nothing. We need a system engineer to tell us what needs to be done. Chip complexity is high and it’s difficult to complete verification with random coverage alone; I will not be able to hit all possibilities. That’s impossible. I have to constrain it. The hard part is to constrain it properly.”
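Dai’s point about constraining the stimulus space is the essence of constrained-random verification: generate random transactions, restrict them to the legal (and interesting) subspace, and track functional coverage to know when the bins that matter have been hit. A small sketch of the idea, with field names, constraints, and coverage bins all invented for illustration:

```python
import random

LEGAL_OPS = ["read", "write", "flush"]
LEGAL_SIZES = [1, 2, 4]

def gen_transaction(rng):
    """Draw a random transaction, then apply constraints:
    a 'flush' carries no size, and 4-byte writes are word-aligned."""
    op = rng.choice(LEGAL_OPS)
    size = None if op == "flush" else rng.choice(LEGAL_SIZES)
    addr = rng.randrange(0, 256)
    if op == "write" and size == 4:
        addr &= ~0x3  # constraint: word writes land on aligned addresses
    return (op, size, addr)

def run(n=2000, seed=1):
    """Generate n transactions and check functional coverage:
    every (op, size) bin we care about must be hit at least once."""
    rng = random.Random(seed)
    coverage = set()
    for _ in range(n):
        op, size, addr = gen_transaction(rng)
        coverage.add((op, size))
    goal = {(o, s) for o in ["read", "write"] for s in LEGAL_SIZES}
    goal.add(("flush", None))
    return coverage >= goal  # True when all target bins are covered
```

Tightening the constraints shrinks the space the generator must explore, which is exactly the trade-off Dai describes: constrain too little and coverage closure is hopeless, constrain wrongly and real bugs sit outside the tested space.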
But then, Lanza concluded, you are going to get engineers on the packaging side who will end up taking your chip and pulling it in a direction you never thought about. As chip design democratizes and more engineers adopt an open-source platform as part of their design and verification flow, the industry will have to think differently.