Once upon a time, placing even the smallest question mark next to Moore’s Law was seen as something beyond heresy. Today, more and more people are voicing such thoughts in the mainstream. One of the most senior figures speaking out is Bernard Meyerson, vice president and chief technologist of IBM’s Systems & Technology Group. The fact that he also led the team that developed the silicon germanium (SiGe) heterojunction bipolar transistor lends further weight to his comments.
However, before exploring his views any further, a crucial point must be made. Many of those claiming ‘time’s up’ for Gordon Moore’s observation have a ‘silicon’s finished’ agenda. Their arguments are materials-led, frequently with a bias towards nanotechnology and materials such as carbon nanotubes.
This position may, over time, prove to have some merit. However, Meyerson’s views come from a different angle.
For a start, he does not necessarily believe that Moore’s Law, in itself, has run out of steam or will do so. Nor is he standing up to declare that silicon and its variants have reached their limits, or will do so imminently.
In this context, it is best to let the IBM fellow himself articulate what he sees as the problem and also how his company believes it has already lighted upon a strategic design solution.
“There’s been an assumption that changes in performance directly equate to what Moore’s Law says,” Meyerson explains. “But it is not actually a metric of performance; it is one of density – it says you can double the density every 18 months.”
“The rest of the performance gain that you have got in the past has come from Classical Scaling, and that’s not just about the density, but about modifying 15-20 dimensions – the chemistry, the gate oxide and so on – in step with each shrink.
The Cell microprocessor, developed by IBM, Sony and Toshiba, used the holistic design approach
“If you follow that connection between Moore’s Law and Classical Scaling, then a shrink by a power of two brings the power down by a power of two as well.
“The problem is that, about three years ago, this breakage occurred. Density was not the issue, so much as the fact that people could not follow Classical Scaling – you couldn’t scale the gate oxide, for example, so you had to hold on to more of the voltage to accommodate that.”
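The connection Meyerson describes is the textbook “classical” (Dennard) scaling model, in which every linear dimension and the supply voltage shrink by the same factor. The sketch below uses that idealized model – not any IBM-specific recipe – to show why one full shrink doubles density and halves per-transistor power:

```python
# Idealized classical (Dennard) scaling: shrink all linear dimensions,
# and the supply voltage, by 1/k (k > 1). Textbook model, for
# illustration only.

def classical_scaling(k):
    """Return relative figures of merit after scaling dimensions by 1/k."""
    density = k ** 2          # transistors per unit area rise as k^2
    capacitance = 1 / k       # gate capacitance shrinks with dimensions
    voltage = 1 / k           # supply voltage scales down with dimensions
    frequency = k             # gate delay falls, so clocks can rise
    # Dynamic power per transistor goes as C * V^2 * f
    power = capacitance * voltage ** 2 * frequency
    return {"density": density,
            "power_per_transistor": power,
            "power_density": density * power}

# One full shrink (k = sqrt(2)): density doubles, per-transistor power
# halves, and power density stays constant.
result = classical_scaling(2 ** 0.5)
print(result)  # density ~2.0, power_per_transistor ~0.5, power_density ~1.0
```

The breakage Meyerson points to shows up in the `voltage = 1 / k` line: once gate oxides and threshold voltages could no longer scale, voltage had to be held higher, and the neat cancellation that kept power density flat was lost.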
Later this year, Meyerson will celebrate 25 years at IBM. In that time, he has seen many discontinuities: the shift from bipolar to CMOS, and the moves towards heterostructures and strain, to name but three. “This, though, is one of the most significant, if not the most significant, because the glue that connected Moore’s Law with better performance has dissolved,” he says.
To some extent, Meyerson believes there has been some denial going on in certain parts of the industry. Designers and chip company leaders have known of the breakage but not really acknowledged it. The evidence for that comes in how they have continued to deliver improvements in performance during the time that Moore’s Law and Classical Scaling have been moving apart. “Where we haven’t been able to follow this connection, we’ve already seen companies moving to novel approaches. For example, there is the way strain is already being used at 90nm,” says Meyerson. “All these innovations are making up for what we can’t get from traditional methods.”
Indeed, it is his view that while end-users might still be seeing the delivery of annual improvements in IT performance of 60-90%, only 15-20% of that now comes from innovation in the underlying semiconductor technologies themselves.
So the shift in perceptions of where performance gains lie – in essence, a move away from the long-standing metric of pushing clock speed – must, in Meyerson’s view, go beyond tricks with physics and chemistry if the industry is to carry on meeting customer expectations.
“We could see in 1996 that we were heading towards a power cliff, one where trying to keep on driving the clock frequency no longer made sense. You were just going to paint yourself into a corner,” says Meyerson.
“Out of that, you got research on partitioning – going for a lower frequency but with multiple cores – and shifting the focus towards efficiency.”
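The partitioning trade-off Meyerson sketches can be illustrated with the standard dynamic-power relation (power scaling with C·V²·f, and supply voltage able to fall roughly in step with frequency). The figures below are illustrative assumptions, not IBM data:

```python
# Illustrative frequency/power trade-off behind multi-core partitioning.
# Dynamic power scales as C * V^2 * f; lowering the clock lets the
# supply voltage drop too, so power falls much faster than frequency.

def dynamic_power(voltage, frequency, capacitance=1.0):
    """Relative dynamic power for one core (all units normalized)."""
    return capacitance * voltage ** 2 * frequency

# One core at full frequency and full voltage:
single_core = dynamic_power(voltage=1.0, frequency=1.0)

# Two cores, each at 60% frequency and 60% voltage. If the workload
# parallelizes, aggregate throughput is 1.2x the single core's, yet:
dual_core = 2 * dynamic_power(voltage=0.6, frequency=0.6)

print(single_core, dual_core)  # 1.0 vs ~0.432: more work for less power
```

This is why chasing efficiency through multiple slower cores made more sense than driving the clock ever higher: the cubic relationship between frequency (with voltage tracking it) and power heavily rewards partitioning.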
And what followed from that was a move towards what IBM is calling Holistic Design.
“What is happening – and what end-users expect and demand – both tell you that you now need to look at a design on the broadest sweep, from the interplay between atoms and molecules right through to the software. You are trying to architect something that is solution-led rather than performance-led.
“We have gone from the extreme of a frequency-driven hardware business to an integrated solutions-driven business. It is a seminal shift,” says Meyerson.
As an example, he cites IBM’s work on the Blue Gene supercomputer and an emphasis on achieving a goal – massively parallel processing – rather than performance.
“Once you identified the solution, you found you could work with relatively modest, sub-gigahertz PowerPC cores – two on a single die, although there were tens of cards – but the challenge lay in the communications protocol,” says Meyerson.
“It had to have this incredible stretch, from being capable of zero bandwidth and virtually zero latency right through to very high bandwidth to move data to memory, where you don’t care if there’s enormous latency. The challenge was in the control and integration of 100,000 processors, not the clock speed at the individual processor level.”
Blue Gene competed against the Earth Simulator and, initially, ran 10% faster. But, for Meyerson, that was not the critical result. “One thing that got lost with the focus on which one was faster was that Blue Gene was one-hundredth the size of the computer it outran. When you make a 100X reduction, then it’s a solution. It’s not just about bragging rights,” he says.
“Also, Blue Gene was about 1/28th the power of its rival. That is a discontinuity. And the reason you could do all this was that we took a holistic design approach, not brute force. The whole project was much more about the solution as opposed to the gee-whiz technology.”
Similarly, IBM has done a lot of research on core multi-threading, subdividing a core’s resources down to a tenth of a single thread, so that an appropriate rather than an overblown resource can be dynamically assigned to, say, “a task as trivial as a compare”, rather than allowing any waste. The results were reflected in high-end server chips that first shipped two years ago, and also feature in the more recent, higher-profile launch of the Cell processor. “Again, to do that you have to optimize everything. It all has to be working in harmony,” says Meyerson.
As he intends to illustrate in a keynote speech at this month’s Design Automation Conference, these are not just theories; practical design implementations are already taking place. However, this is IBM we are talking about here. Other companies may feel that Big Blue can talk holistic because it has a technology profile that is holistic: internally, the company can call upon silicon and software experts, research at the atomic level and in broader materials science, and so on. What about everybody else? Meyerson rejects the notion that this is simply a big boys’ game: “At IBM, we have what I think of as a revolutionary business model. It’s ‘open vertical’, for want of a better phrase, where we’ve thrown open the assets of the organization at every level for whatever solution you are developing. And it needs to be open so the rest of the industry can take part and reap the benefits.
“An example of what we mean here is with the Open Standards Body for Software. We’ve turned 500 patents over to the group. The idea is that anybody can use them. What we are saying is that we will compete on our skill sets.”
Meyerson adds that the logic of highly optimized holistic development programs requires IBM to set its own example. “We need the IP [intellectual property] that is elsewhere. So, say you have a magic I/O interface that makes the world more efficient. I want that.”
The bottom line is that Meyerson views the holistic concept and the reasons for its necessity, not just in terms of IBM, but the wider technology business.
“It’s long been our approach to drive ‘innovations that matter’, ones that have a seminal effect on society.”