The notion of what it means to be an EDA company is changing fast. The growing reliance on software for practically any design that has a high degree of logic complexity is forcing tool vendors to look closely at the interface between the hardware and that software. For Aart de Geus, chairman and co-CEO of Synopsys, managing the relationship between the diverse elements of a complete SoC is the key.
“The whole notion of having software was to keep it separate from hardware. I started by flipping switches. Then compilers came in so that the software guys didn’t have to touch hardware,” de Geus recalls.
Not all software engineers can remain aloof from all of the hardware, however. The focus on power consumption and system integrity brings with it a need for people who can straddle the two worlds, and many embedded systems rely on careful management of their interaction. This is an area where EDA can contribute. “We have a lot more value to add on the boundary,” de Geus claims.
The EDA world is increasingly about dealing with boundaries as manufacturing issues strip away abstractions that the hardware designers have become used to, beginning with the concept of standardized logic transistors and gates.
“We have exactly the same issue in silicon. We are interacting with all the advanced guys on yield matters because we know physics and also design. The two are no longer independent,” de Geus notes. “When there are yield issues, you need to determine: is the problem systemic, because of the design? Or is it because the fab was dirty?”
The use of double patterning in lithography for advanced process nodes raises the prospect of logic designers having to understand how individual circuits or blocks will be printed if tools cannot deliver the required level of abstraction. “The cell designer needs to know [about double patterning]. But for the next level up, the objective has to be that we automate this. It has to be automated. There is no room for people to fix things manually except where timing criticality is at stake,” says de Geus.
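The automation de Geus describes is tractable because the core of double-patterning decomposition reduces to a classic graph problem: features printed closer than the single-exposure pitch must go on different masks, which is a two-colouring of the conflict graph. A minimal Python sketch (the feature names and structure are illustrative, not taken from any real tool) shows why a layout with an odd cycle of conflicts cannot be split legally:

```python
from collections import deque

def assign_masks(features, conflicts):
    """Try to split layout features across two masks (double patterning).

    features: list of feature names.
    conflicts: pairs of features printed closer than the single-mask pitch,
               which therefore must land on different masks.
    Returns a {feature: mask} map, or None when no legal split exists
    (the conflict graph contains an odd cycle).
    """
    graph = {f: [] for f in features}
    for a, b in conflicts:
        graph[a].append(b)
        graph[b].append(a)

    mask = {}
    for start in features:
        if start in mask:
            continue
        mask[start] = 0
        queue = deque([start])
        while queue:
            f = queue.popleft()
            for g in graph[f]:
                if g not in mask:
                    mask[g] = 1 - mask[f]   # neighbours go on the other mask
                    queue.append(g)
                elif mask[g] == mask[f]:    # odd cycle: needs a layout change
                    return None
    return mask

# A chain of conflicts splits cleanly; a triangle of conflicts cannot:
print(assign_masks(["a", "b", "c"], [("a", "b"), ("b", "c")]))
print(assign_masks(["a", "b", "c"], [("a", "b"), ("b", "c"), ("a", "c")]))
```

The second case is exactly the situation where a tool has to push the problem back to the cell designer, which is why the coordination above the cell level has to be automated.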
He goes on to note that the shift to finFETs could also be made practically transparent to the designers working above the cell-design layer: “The first question is, ‘What does the finFET do to libraries?’ This touches us earlier than most because we are the lead provider of TCAD tools. But the [circuit] designer should not need to know what type of device they deal with other than being interested in the device characteristics.
“I have always argued from the technology side and pursued this since the early 2000s: it’s about the interaction between the tools within a flow. For example, a synthesis tool that takes into account routing – how to do that? All these tools have a degree of clairvoyance: you embed some technology from one into the other. But not too much or the result becomes too slow, too unwieldy.”
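The “clairvoyance” de Geus describes amounts to embedding a coarse model of a downstream tool inside an upstream one. A minimal Python sketch, with invented cell names and numbers rather than real library data, shows the idea for routing-aware synthesis: instead of running the router, synthesis estimates wire delay from fanout and picks a drive strength accordingly:

```python
def choose_drive_strength(fanout, candidates, wire_delay_per_load=0.05):
    """Pick a cell variant using a coarse 'clairvoyant' routing model.

    Rather than invoking the router, synthesis embeds a simple estimate:
    routing delay grows with fanout, and stronger cells drive that load
    faster but cost more area. candidates are tuples of
    (name, area, intrinsic_delay, drive) -- all values illustrative.
    """
    def estimated_delay(cell):
        name, area, intrinsic, drive = cell
        return intrinsic + wire_delay_per_load * fanout / drive

    # Prefer the fastest estimate; break ties on smaller area.
    best = min(candidates, key=lambda c: (estimated_delay(c), c[1]))
    return best[0]

library = [
    # (name, area, intrinsic delay, drive strength) -- made-up numbers
    ("INV_X1", 1.0, 0.08, 1.0),
    ("INV_X2", 1.8, 0.12, 2.0),
    ("INV_X4", 3.2, 0.20, 4.0),
]

print(choose_drive_strength(fanout=1, candidates=library))  # light load: small cell wins
print(choose_drive_strength(fanout=8, candidates=library))  # heavy load: strong cell wins
```

The trade-off de Geus names is visible even here: a richer routing model inside the cost function would predict better, but every extra term slows the synthesis inner loop.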
In principle, according to de Geus, the same core abstractions as those used today should prove robust enough to cope with further changes in device technology over a number of coming process generations: “The switch to carbon nanotubes does change the nature of design. It’s super-different but because it’s super-different it’s not going to happen quickly.
“We are 25 years backward compatible in an area that has seen changes in scaling on the order of 10¹⁴. That’s quite amazing.”
The rise of 3DIC
For de Geus, 3DIC adds another layer of complexity to design tools, but in the name of keeping design simple. “You also have the economic complexity of how to make that integration worthwhile. As a result, in most cases, it will really be 2.5D through the use of interposers. But one can see over time that stacking components will achieve 2x to 4x functionality [gain] at manageable design complexity.”
Another way in which teams are trying to keep timescales under control is by applying more and more automation and reuse to verification. “Our perspective is that verification has to be just as hierarchical and reusable as we have seen on the implementation side,” de Geus argues.
“The verification field is going through the next wave of change. The notion of verification IP is central to that. Earlier, this spring, we started the next generation of that: a verification IP strategy that is 100 per cent in SystemVerilog. Using different languages you can’t bring everything together.”
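The reuse de Geus describes means the same verification component monitors an interface wherever it appears, at block level and again at SoC level. A minimal sketch, written here in Python as an illustrative stand-in for a SystemVerilog VIP (the class and interface names are invented), shows the pattern with a simple FIFO-ordering checker:

```python
class FifoChecker:
    """Reusable verification component: checks FIFO ordering on one interface.

    The same checker class is instantiated once per interface, at block
    level or at the top level, mirroring how verification IP is reused
    hierarchically across a design.
    """
    def __init__(self, name):
        self.name = name
        self.expected = []   # data written but not yet read back
        self.errors = []     # protocol violations seen on this interface

    def on_write(self, data):
        self.expected.append(data)

    def on_read(self, data):
        want = self.expected.pop(0) if self.expected else None
        if data != want:
            self.errors.append(f"{self.name}: read {data}, expected {want}")

# The identical component watches two different interfaces in one SoC test:
cpu_fifo = FifoChecker("cpu_to_dma")
dsp_fifo = FifoChecker("dsp_to_mem")
for d in (1, 2, 3):
    cpu_fifo.on_write(d)
for d in (1, 2, 3):
    cpu_fifo.on_read(d)      # in-order reads: no errors recorded
dsp_fifo.on_write(7)
dsp_fifo.on_read(9)          # deliberate mismatch, flagged by the checker
print(cpu_fifo.errors)
print(dsp_fifo.errors)
```

Keeping every such component in one language, as the strategy de Geus outlines does with SystemVerilog, is what lets block-level checkers plug straight into the top-level environment.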
But the verification problem now goes way beyond hardware – as systems move into the billion-gate domain, it’s easier to manage them as a collection of computers rather than a massive number of logic gates. “You have a zillion little computers in this box. You have many compute cores and the cores are heterogeneous. Some of those cores may look very different from the traditional von Neumann machine,” says de Geus.
“When we are talking about SoC assembly, we need to talk about the software-and-hardware assembly and make sure that the two stay in sync. That is why we have invested so heavily in hardware-software prototyping – we are seeing so many customers heading there,” says de Geus. “Those two worlds need to be brought together systematically for verification.”
For Synopsys, the core of the verification strategy for these software-dominated systems has been the combination of FPGA prototyping and software virtual prototypes. In contrast to Cadence Design Systems and Mentor Graphics, the company has traditionally eschewed the move into emulation, although de Geus now proposes: “FPGA prototyping is a form of emulation. It’s highly optimized for a specific situation. It has the benefit of being extremely fast. The regular emulator is a little more versatile and we may, or may not, decide to go in that direction.
“Most emulators are not good for doing a lot of software development and verification – slow and way too expensive. Emulators get used more for hardware validation. But one doesn’t exclude the other. The FPGA technology is pushable. And the fact that we decided to connect it to the virtual parts extends applicability.
“We could go into emulation if we feel it’s sustainable. But there are only so many things we can do. We would certainly never say ‘no’ to something like that.”
Within Synopsys itself, one relationship that might appear to have undergone radical change is that between de Geus and Chi-Foon Chan, president and, historically, chief operating officer. De Geus and Chan now officially share the traditionally solo role of CEO, with de Geus maintaining his additional responsibilities as chairman and Chan those as president.
“We didn’t split the [CEO] role. We have been working with this configuration for over a decade but it’s nice to make it more explicit now. It means we can both speak for the company,” de Geus explains. “But there is also symbolism in having a well-functioning team in our own company. It has symbolism in what it means to be a great partner. It doesn’t matter who fixes the issue as long as the chip comes out. Rather than having the good cowboy who will find all the issues, the team wins. I really like the symbolism.
“Good leadership requires the handling of a continuum of opinions. I have been a very strong believer that diversity, as long as we can align, is a great advantage. You will find very great diversity in Synopsys. I think Synopsys is more modern and more diverse than any other company I know.”