Issue No. 4, July-August 2008 (vol. 25), p. 344
Published by the IEEE Computer Society
Leon Stok, IBM
ABSTRACT
In the late- and post-silicon eras, variation in all nanometer processes will continue to increase significantly. The industry is gradually addressing this situation and exposing more variability information to the designer. According to the Gigascale Systems Research Center, however, the deterministic era will be over for most on-chip applications, and alternatives must be found. The author of this sidebar looks forward to the time when these two approaches will meet: when more revolutionary design techniques find their way into practical designs, and when one of the new computational paradigms is suddenly needed to cope with an unexpected surge in variability or reliability problems in a particular design or technology.
Variability and New Design Paradigms
Leon Stok, IBM
In the late- and post-silicon eras, variation in all nanometer processes will continue to increase significantly. All electrical parameters, such as timing, power, and noise, are already increasingly affected by these variations. In addition, several effects that can render circuits not only variable but outright unreliable are becoming more costly to avoid.
Up until the 65-nm technology generation, most variability had been hidden from designers and dealt with while characterizing the technology and generating the device models. Analog and memory designers would run statistical simulations of their designs, but most digital designers were shielded from this. This practice no longer holds for current technology nodes: the extraordinary amount of guard-banding required to sustain it renders new technologies ineffective.
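To make the guard-banding argument concrete, here is a minimal sketch (not from the original sidebar; the gate count, nominal delay, and sigma are illustrative assumptions) that contrasts a deterministic worst-case corner with a simple Monte Carlo view of the same logic path:

import random

# Illustrative only: a path of N gates, each with a nominal delay of 10 ps
# and an independent Gaussian variation of sigma = 1.5 ps (made-up numbers).
N, NOMINAL, SIGMA = 20, 10.0, 1.5

# Deterministic guard-banding: assume every gate sits at its 3-sigma worst case.
corner_delay = N * (NOMINAL + 3 * SIGMA)

# Statistical view: Monte Carlo over independent per-gate variations.
random.seed(0)
samples = [sum(random.gauss(NOMINAL, SIGMA) for _ in range(N))
           for _ in range(100_000)]
samples.sort()
p999 = samples[int(0.999 * len(samples))]  # 99.9th-percentile path delay

print(f"nominal path delay : {N * NOMINAL:.1f} ps")
print(f"3-sigma corner sum : {corner_delay:.1f} ps")
print(f"99.9th percentile  : {p999:.1f} ps")

Because independent variations partially cancel along the path, the statistical percentile sits well below the summed worst-case corner: the corner sum grows linearly with the number of gates, while the statistical spread grows only roughly with its square root. That gap is the pessimism that corner-based guard-banding pays for, and it widens as variability increases.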
The industry is gradually addressing this situation and exposing more variability information to the designer. Design tools will attempt to make this information as accurate and actionable as possible, and designers will respond with designs that are more robust and less sensitive to the variations the analysis tools report. The first statistical analysis tools are being deployed successfully to a growing number of designers, and semiconductor fabs are becoming increasingly sophisticated in providing statistical models for their technologies.
It is refreshing to see a university research program like the Gigascale Systems Research Center (GSRC) start at the other end of the spectrum, declaring that the deterministic era will be over for most on-chip applications and that alternatives must be found. The search for these alternative computational models cannot start early enough: even if they are found, the paradigm shift required to use them effectively could take a long time to implement.
A critical point in the search for these new paradigms is their effect on the power, performance, and cost of a design. If so much overhead must be added that a new design, even with the advantages of the new technology node, is no longer competitive along these dimensions, there will be no incentive to move to a new, and therefore risky, paradigm. To be viable, new paradigms will need to offer at least a 10× advantage over existing paradigms in the current technology node.
I am looking forward to the time when these two approaches will meet: when more revolutionary design techniques find their way into practical designs, and when one of the new computational paradigms is suddenly needed to cope with an unexpected surge in variability or reliability problems in a particular design or technology. This moment might be closer than we think.
Leon Stok is director of electronic design automation for IBM. Contact him at leonstok@us.ibm.com.