Mentor’s Wally Rhines on tools as a cultural issue

By Paul Dempsey | Posted: June 2, 2014
Topics/Categories: Commentary, Conferences, Design to Silicon, Digital/analog implementation, Blog - EDA, Embedded, ESL/SystemC, RTL, Standards, Tested Component to System, Verification

Mentor Graphics chairman and CEO Wally Rhines knows a thing or two about the cultural aspects of what makes EDA tools successful or not – or not quite as successful as they should be. His company has been an early mover in embedded software, ESL and ‘mass market’ formal verification, to name but three. One question that has repeatedly arisen is less about whether the tool ‘does its job’ than about the user’s willingness to accept how it does it.

For a little while now, Wally has been giving a presentation on the topic, but in case you haven’t seen it or heard about it, DAC’s opening day is a good time to look at what it says. We recently got the chance to sit down with him to discuss it in the context of an electronic systems world where hardware and software are converging and even individual hardware disciplines – design and verification – are merging.

“Well, maybe it’s all verification now. You just spend 15 minutes stitching together the IP blocks and then the rest is verifying it,” jokes Rhines. “Well OK, that’s an exaggeration, but one of the interesting things we saw in the latest data out of [Mentor verification expert] Harry Foster’s research was that we are now at the point where more than 50% of those engaged in design say they are in verification.”

But let’s not get ahead of ourselves. Rhines has long used one particular example to illustrate the culture problem. Mentor had a tool that could look at trade-offs between hardware and software to optimize power and performance. It could, for example, take software and realize it instead in hardware. Maybe you can see already where this might end up.

“We had a client who had been struggling to get power down from around 8.47mW to a target of 4.5mW. We were able to go in with the tool and, where they’d already spent three man-years of work, get that to just over 4mW in a single day. Amazing.

“So we sat back and we waited for the order.

“And we waited.

“And we waited.

“And then the client came back and said, ‘We can’t buy the tool.’”

As you may well have guessed, the problem was political.

“They said, ‘We agree your product is fantastic and it does exactly what we want it to do. But for the last six months we’ve been having an argument between the software and the hardware people, because neither wants the other to use it. Each group is happy to use it themselves but will not allow the other to use it.’”

This was not an isolated case. In the early days of Catapult C [now part of Calypto], the RTL team would always blame the ‘black box’ code generated by the ESL synthesis tool if any bugs cropped up. For Seamless, a successful hardware/software co-verification product in the embedded space, Mentor could initially win users among software developers whose code was deeply embedded in the hardware, but got zip from those whose code was not. Cultures felt challenged, so the approach had to adapt.

A lot of this is a function of large organizations. They naturally tend to create a plethora of specialist jobs, each within a particular silo. Each silo may then have not just its own culture but also its own way of benchmarking success.

Culture is not monolithic

Start-ups, by contrast, get past that part of the problem because everybody typically has to do a bit of everything; they all muck in together (at least at the start).

Similarly, a lot has been done to make tools present themselves to each engineering discipline as appropriate. Today’s hardware/software debug tools generally offer each type of user an interface that matches what they would expect to see from their ‘traditional’ software. Or, in functional verification, the subtleties that once required a mathematics PhD to understand have been hidden away within ‘push button’ environments for challenges like clock domain crossing.

And as hardware and software converge – or very closely interweave – in the design process, moving up to higher levels of abstraction also removes many of the differences, Rhines notes.

Indeed, Rhines sees seven reasons for hope:

  1. Change the customer/supplier relationship (foundries being a good example)
  2. Create a start-up
  3. Assimilate the task of one group into another (e.g. the realization of analog functions in digital)
  4. Move to a higher level of abstraction
  5. Form an abstraction layer (AUTOSAR being a good example)
  6. Use multi-physics simulation/modeling
  7. Use multi-disciplinary data management

Similarly, there is the sense that design is on the verge of as radical a shift at the silicon level as the one introduced by RTL. That shift forced something of a split in the engineering community, with newer entrants and younger workers finding it progressively easier to adapt to a new world, and older ones moving toward the maintenance of existing designs. Will history repeat?

It probably will. But as it does, the real solution will lie in how well each flow component – here stretching out beyond electronics into mechanical and other branches – fits each discipline’s familiar way of working while still allowing the right kind of data to pass on to the others with appropriate visibility, and in whether it then delivers the right overall results.

It is an important yardstick to remember as you check out the tools at this week’s DAC – or elsewhere. Culture is a crucial component in productivity.

 
