New dimensions in performance

By Kerry Bernstein | Posted: June 1, 2006
Topics/Categories: EDA - DFM


Kerry Bernstein

When Kerry Bernstein, a 28-year IBM veteran, was first drafted to work on Big Blue’s development of 3D semiconductors, he admits he was a skeptic.

“At first, I think I felt as though I’d got dragged into this program. I thought it wasn’t going anywhere. But the more I’ve seen of it, the more my opinion has turned around,” he says. “Now, I think this technology is going to become pervasive.”

Indeed, Bernstein, a member of the senior technical staff at IBM’s Thomas J. Watson Research Center, has had such a change of view that he is now one of the men his company sends to events like IEDM and DAC to explain the what and why of 3D to the broader industry.

Bernstein’s own original wariness of 3D may, in some respects, have been down to a misunderstanding of what it principally offers – a misunderstanding that these days he seeks to correct.

“In varying ways, the delay in the interconnect has always been a problem in this industry,” he says. “The back-end-of-line wiring has never kept up. It’s constantly needed transfusions and, recently for example, we’ve revived it with copper and low-k dielectrics.

“Now, again, people see a value in 3D as the next trick – and it will help you to do that. But, and this is really important, when you do just a little analysis you see that the advantage 3D offers here is really very modest.”

Instead, Bernstein switches the focus to a combination of both latency and bandwidth.

“Microprocessor speed has been increasing so much quicker than memory bus speed that the processor has been starved of data. Now, we have come up with a Band-Aid for that in our move to multi-core processors. Multi-core allows you to hide more and more of the latency associated with the wiring. OK, 3D will indeed improve that,” he says.

“But the roadmap shows us moving to many, many cores. At that point, the interconnect, even at the chip boundary, is no longer sufficient. You can’t run a bus that wide and that far over that many processors. You’ve got this huge problem feeding 10 or however many processors with data. The bandwidth just is not there.

“We are approaching the pain threshold. Right now, you’re still seeing flat buses, but their time is basically over. 3D will give us a boost in bandwidth in the region of at least three orders of magnitude.”
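A rough back-of-envelope calculation shows where a three-orders-of-magnitude figure like this can come from. All of the numbers below (pin count, via pitch, die size, per-link rate) are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope comparison of a wide flat off-chip bus against a
# 3D vertical-via array. Every figure here is an assumption chosen
# only to illustrate the scaling argument.

def bandwidth_gbps(links, gbit_per_link_per_s):
    """Aggregate bandwidth in Gbit/s for a set of parallel links."""
    return links * gbit_per_link_per_s

# Flat bus: assume ~256 usable data pins at 1 Gbit/s each.
flat_bus = bandwidth_gbps(256, 1.0)

# Vertical via array: assume a 10 um via pitch across a
# 10 mm x 10 mm die face, i.e. (10_000 / 10)^2 potential links.
vias = (10_000 // 10) ** 2
stacked = bandwidth_gbps(vias, 1.0)

print(f"flat bus : {flat_bus:,.0f} Gbit/s")
print(f"3D vias  : {stacked:,.0f} Gbit/s")
print(f"ratio    : {stacked / flat_bus:,.0f}x")
```

Under these assumed numbers the via array offers roughly 3,900 times the flat bus's bandwidth, which is the flavor of gap Bernstein is pointing at: bandwidth scales with the whole die face rather than its perimeter.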

Putting this technical question in more purely economic terms, Bernstein also draws a direct analogy with the real estate business.

“I think this technology is going to become pervasive…We need to think in parallel to capture the performance that is available to us in 3D.”

“We have a real estate office in Manhattan. Now, when we started to study 3D, I came to think that there is a direct analog in that market. They began to build skyscrapers in New York when ground rents began to hit certain thresholds. They couldn’t go out anymore so they went up,” he says.

“And I’ve been bugging the guys in that office for the data, looking at the economic comparison. It’s hysterical. Whenever I call, I can hear the New York accent in the background. ‘Oh gawd, we’ve got that geek from research on the phone again.’ But then, it’s like, ‘Well talk to him, he pays our rent.’ And the bottom line is that the analog holds, and it helps because it’s a pretty easy one for people to grasp.”

There is, of course, the problem of implementation, but here too an analogy and the consequences of IBM’s research in other fields can be brought to bear.

“One of the things I like to say about 3D is that it’s a bit like one of those M.C. Escher drawings. It looks 3D, but it is in fact 2D,” Bernstein says. “Now what I mean by that is that it fits into the existing infrastructure with just some minor modifications, although we are going to require some heroic EDA work.

“But the key is that our proposed process uses existing technology. There are no new processes – you can do this in bulk CMOS, in SOI [silicon on insulator], or in both. There are no new materials. What we are looking at are minor perturbations in the process curve to make it work.”

He also sees 3D helping to address issues such as design for manufacturability.

“In many ways, 3D can enhance yield if it’s done correctly, because as you take the pressure off density, you can take silicon from one plane and move it to two or more – so you reduce the critical-dimension area,” he says. “There is a trade-off involving the vertical vias you insert, but basically you can come out ahead.”
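The yield argument can be made concrete with the classic Poisson yield model, Y = exp(-A·D): smaller dies yield better, and if each tier is tested before bonding (known-good-die stacking), the assembled yield is limited mainly by the bond/via step. The defect density, areas, and bond-yield figure below are assumptions for illustration, not IBM data:

```python
import math

def poisson_yield(area_cm2, defects_per_cm2):
    """Classic Poisson yield model: Y = exp(-A * D)."""
    return math.exp(-area_cm2 * defects_per_cm2)

D = 0.5   # assumed defect density, defects/cm^2
A = 2.0   # assumed logic area of the monolithic 2D die, cm^2

# Monolithic 2D die: Y = exp(-1.0), about 37%.
y_2d = poisson_yield(A, D)

# Split the same logic across two tiers of half the area each.
# Each tier yields exp(-0.5), about 61%. If tiers are tested before
# bonding, only known-good dies are stacked, so the per-stack yield
# is dominated by the bond/via step (assumed 99% here -- this is
# the trade-off Bernstein mentions).
y_tier = poisson_yield(A / 2, D)
bond_yield = 0.99
y_3d_kgd = bond_yield

print(f"2D die yield        : {y_2d:.2f}")
print(f"per-tier yield      : {y_tier:.2f}")
print(f"stacked (KGD) yield : {y_3d_kgd:.2f}")
```

Note that the advantage depends on testing tiers before assembly: stacking two untested half-area dies gives exp(-0.5)² = exp(-1.0), exactly the monolithic yield, minus the via loss.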

However, you may have noticed the devil in the detail, the EDA software. One of the reasons why the tools matter so much is that, as is so often the case, a move forward in the technology partly depends on that move being largely invisible to the engineer.

“Going back to M.C. Escher, it looks 3D, but it’s 2D. So how does that relate to tools?” Bernstein asks. “The answer is that you have this huge infrastructure already in place in that we have had EDA tools since the beginning of time. And these tools continue to get retrofitted and upgraded, but they are the same tools.

“So think about it, we have this enormous investment in infrastructure, including things you wouldn’t even think of. In 2D, we have Manhattan wire estimation, parasitic extraction, place and route, timing, test pattern generation and so on. A massive amount of tools and, with no disrespect, they are Mr Potato Head. They plug and chug. They require no thinking.

“So if we have a technology that offers something to the architect and something to the designer, but, to the chip’s physical designer, it looks 2D, you’re in business.”

So what are the challenges? Logic partitioning is an obvious candidate. How do you decide what goes up and what goes down – and how do you know and assess the trade-offs involved?

“There’s another very big issue with 3D. Once you settle on the architecture, it’s like dominoes. You can map what happens next right down to the size of the vias. From our work so far, we’ve found that it really is a straight line from the highest level of abstraction right down,” Bernstein notes.

“So here’s the deal. How do I optimize? How do I make the various trade-offs involved show up when I’m choosing the architecture? The key is EDA, and what it needs to provide is a common language that ties all this into place. It’s something that we’ve needed to do for a long time.”

And there is a third major challenge.

“One of the problems here is the same one that the dinosaurs had and that locomotives had and that popcorn has. It’s the problem of area versus volume when it comes to heat dissipation. EDA is very important here in trying to find a solution. And the answer – which is uniquely EDA centric – is quite complex.

“If you turn the voltage down you can mitigate much of the heat, but you take this big dent, perhaps 30X in terms of delay. But what you get back in 3D is parallelism. You have the ability to define vertical buses and you can run any number of processors in parallel. But what you’re looking at here is something that’s almost counterintuitive when you try to grasp it.
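The arithmetic behind this trade can be sketched briefly. The 30X delay figure is Bernstein's; the supply-voltage ratio, core count, and simple f·V² dynamic-power model are assumptions added here for illustration:

```python
# Illustrative throughput/power arithmetic for the low-voltage,
# many-core trade described above. Only the 30x slowdown comes
# from the quote; everything else is an assumed working number.

slowdown = 30.0     # per-core delay penalty at reduced voltage
vdd_ratio = 0.4     # assumed supply scaled to 40% of nominal

# Dynamic power scales roughly as f * V^2, so each slow core
# burns a tiny fraction of a nominal core's power.
per_core_power = (1 / slowdown) * vdd_ratio ** 2

# Run N slow cores in parallel over the wide vertical buses that
# 3D makes available (N is an assumption).
N = 64
throughput = N / slowdown      # relative to one nominal core
power = N * per_core_power     # relative to one nominal core

print(f"relative throughput: {throughput:.2f}x")
print(f"relative power     : {power:.2f}x")
```

With these assumed numbers, 64 slow cores deliver about twice the throughput of one fast core at roughly a third of the power: each core is individually much worse, and the win only appears when you think in parallel, which is exactly the mental shift Bernstein describes next.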

“We evolved from the caveman days in terms of uni-processors. ‘I put my hand in the fire, I pulled it out because it hurt me’. Only a few societies – ants, for example – have learned to process things in parallel. And, you know, look at The Borg in Star Trek – that’s something shown to us as being as alien as it gets. But the problem, the challenge is that we need to think in parallel to capture this performance that is available to us in 3D. And you can look at what EDA offers to take us there.”

IBM’s own work on 3D has so far gone as far as proof-of-concept devices and, Bernstein says, “They’ve made us very confident in the technology. We really see it going places.”

In blunt terms, expect to start seeing 3D at either the 45nm or 32nm node with an initial focus on one market.

“This is not an inexpensive technology, so I think that we are going to see the first uses in the high-performance market, and basically in servers,” says Bernstein.

“And you’ve got two elements there: the scientific and the commercial. What’s interesting is that the scientific workload is predictable – it’s flat out all the time. But the commercial one is more a case of ‘Who knows?’ If you are talking about those very high-end servers for e-commerce and so on, the whole issue is that if you keep filling up the bus, you’re in deep yoghurt. 3D gets you away from that – it’s going to be a good application for the technology.”

