DAC 2012: 20(nm) questions
Although there has been a lot of angst expressed about the future of IC scaling, the folks in the trenches are hard at work making sure 20nm is ready for design starts. One of the big stories at DAC from the EDA IC sector will be that the tools needed for 20nm are largely in place and all the major foundries are moving forward to production per their committed schedules—ergo, no slip in the 20nm process node introduction. Yes, there are continuing discussions about different transistor architecture options at 20nm (and its 22nm or 18nm variants), but there is a pretty clear vision of what the IC designer needs to care about at the next node.
The most obvious ‘new’ is double patterning (DP), a requirement necessitated by imaging with 193nm light, which has difficulty resolving features below 80nm pitch. Breaking the layout into two masks, viewed by the designer as ‘different colors’, is conceptually easy, but dealing with things like manufacturing variability, mask alignment and other details has proven quite challenging. Fortunately for the designer, most of this is addressed by improvements in the design, verification, and optical proximity correction (OPC) tools. One such improvement is an interactive debugging environment for resolving DP violations, because it is not intuitively obvious how one DP violation can be fixed without introducing a new one. The new 20nm DP debug environments will be immensely helpful in this area, regardless of which approach to double patterning the designer or foundry may take.
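At its core, assigning layout shapes to the two masks is a graph 2-coloring problem: shapes spaced below the single-mask pitch limit must land on different masks, and an odd cycle of such conflicts is exactly the kind of violation that cannot be fixed locally without creating another. The following is a toy sketch of that idea, not any vendor's actual algorithm; the function name and inputs are invented for illustration.

```python
from collections import deque

def two_color(num_shapes, conflicts):
    """Assign each shape to mask 0 or mask 1 so no two conflicting
    (too-closely-spaced) shapes share a mask. Returns a dict of
    shape -> mask, or None if an odd cycle of conflicts makes the
    layout uncolorable (a DP violation)."""
    graph = {i: [] for i in range(num_shapes)}
    for a, b in conflicts:
        graph[a].append(b)
        graph[b].append(a)
    color = {}
    for start in range(num_shapes):
        if start in color:
            continue
        color[start] = 0
        queue = deque([start])
        while queue:
            node = queue.popleft()
            for nbr in graph[node]:
                if nbr not in color:
                    color[nbr] = 1 - color[node]
                    queue.append(nbr)
                elif color[nbr] == color[node]:
                    return None  # odd conflict cycle: uncolorable
    return color
```

Three mutually conflicting shapes, for instance, form a triangle that no two-mask assignment can satisfy: `two_color(3, [(0, 1), (1, 2), (0, 2)])` returns `None`, which is why a good debug environment must show the whole conflict cycle, not just one edge of it.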
There are also new fill requirements at 20nm. For several nodes now, fill has affected more than just physical planarity, but at 20nm there are new ramifications. The absolute density and rate of density change of both poly and metal across multiple layers can have undesirable effects on stress, etch rates, rapid thermal annealing (RTA) and other parameters. GDSII file sizes are exploding due to the sheer number of fill shapes, and the need to run OPC on fill shapes at 20nm creates new capacity and runtime challenges. Moreover, fill is also a component of DP that needs to be considered when balancing masks to ensure proper light emissivity and etch processes. To address these emerging issues, new ‘intelligent’ fill techniques have been implemented in the latest EDA tools and adopted by the foundries.
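To make the "absolute density and rate of density change" requirement concrete, here is a simplified sketch of the kind of windowed check such rules imply: the layer is divided into windows, each window's fill density must sit within bounds, and adjacent windows must not differ by more than a maximum step. This is an illustrative assumption of how such a rule could be checked, not an actual foundry deck.

```python
def density_violations(density_grid, min_d, max_d, max_step):
    """Flag windows whose fill density is out of [min_d, max_d],
    or whose density differs from a right/down neighbor by more
    than max_step (the 'rate of density change' check).
    density_grid: 2D list of per-window density fractions."""
    rows, cols = len(density_grid), len(density_grid[0])
    violations = []
    for r in range(rows):
        for c in range(cols):
            d = density_grid[r][c]
            if not (min_d <= d <= max_d):
                violations.append((r, c, "bounds"))
            for dr, dc in ((0, 1), (1, 0)):  # right and down neighbors
                nr, nc = r + dr, c + dc
                if nr < rows and nc < cols:
                    if abs(d - density_grid[nr][nc]) > max_step:
                        violations.append((r, c, "gradient"))
    return violations
```

A window at 90% density next to one at 35% would trip the gradient rule even if both satisfied the absolute bounds; 'intelligent' fill works by inserting shapes so that both checks pass across every layer and mask.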
Another change is the need for more accurate yet faster extraction tools—in fact, 20nm designs need reference-level accuracy while maintaining traditional turnaround times. This is a challenge because smaller dimensions exacerbate the impact of manufacturing variability on performance, so designers will need to run extraction at more process corners. Likewise, the impact of DP on extraction has also increased the complexity of the analysis. More gates, more corners, more complexity, higher accuracy—it’s a challenging scenario for EDA vendors striving to maintain acceptable runtimes. However, new algorithms, computational techniques, and the ability to make efficient use of CPU clusters are allowing the tools to address the complexity while still meeting designers’ run-time requirements.
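The "more corners" problem is multiplicative: each independent source of variation multiplies the number of extraction runs. A small sketch makes the arithmetic visible; the parameter names here (process split, temperature, DP mask shift) are illustrative assumptions, not a specific foundry's corner set.

```python
from itertools import product

def corner_list(params):
    """Enumerate extraction corners as the cross product of
    per-parameter settings; the corner count is the product of
    the individual option counts, so each new variation source
    (e.g. a DP mask-alignment axis) multiplies total runs."""
    names = sorted(params)
    return [dict(zip(names, combo))
            for combo in product(*(params[n] for n in names))]

corners = corner_list({
    "process": ["ss", "tt", "ff"],   # slow / typical / fast split
    "temp": [-40, 125],              # temperature extremes
    "dp_shift": ["min", "max"],      # DP mask-misalignment extremes
})
# 3 x 2 x 2 = 12 extraction runs instead of the 6 a single-mask
# flow would have needed -- hence the pressure on runtimes.
```

Adding the DP axis alone doubles the workload, which is why vendors lean on smarter algorithms and CPU clusters rather than asking designers to wait twice as long.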
Finally, an emerging trend is a greater emphasis on circuit checking associated with design for reliability (DFR). Effects such as electromigration (EM) and electrical overstress (EOS) can degrade circuits, causing delayed failures in the field. These issues affect a wide range of IC designs, and are particularly troublesome and expensive where high reliability is required, such as communications, automotive, aerospace and medical. Multi-voltage designs that reduce power consumption introduce yet another dimension of complexity: determining whether signal paths crossing voltage domains have set up the potential for short- or long-term failure due to dielectric degradation. At advanced nodes, all of these become more difficult to address as the margins of the process technology are reduced while the complexity of the designs increases significantly.
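The classic first-order EM screen is a current-density check: a wire segment fails if J = I / (width × thickness) exceeds the limit the foundry specifies for that layer. The sketch below shows that check in its simplest form; the function, data layout, and the numeric limit in the test are assumptions for illustration, and real EM sign-off also accounts for temperature, via arrays, and AC versus DC current.

```python
def em_check(wires, j_limit):
    """Flag wire segments whose average current density
    J = I / (width * thickness) exceeds the EM limit j_limit
    (A/m^2). wires: list of (name, current_A, width_m, thick_m).
    Returns the failing segments with their computed density."""
    failures = []
    for name, current, width, thickness in wires:
        j = current / (width * thickness)  # current density, A/m^2
        if j > j_limit:
            failures.append((name, j))
    return failures
```

For example, 1mA through a 50nm-wide, 100nm-thick wire gives J = 2×10^11 A/m², orders of magnitude above typical copper limits, so a DFR tool would flag that net for widening or current reduction.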
New tool technology is now available to accurately and efficiently locate and correct the aspects of a design that lead to these types of failures. While new reliability checking disciplines are often put in place first at the most advanced nodes, we are also observing a trend to ‘back-fit’ them onto older nodes as well—so this year’s DAC is not only driven by the latest process, but by what’s new for designers at all technologies.
There will be a lot to see at DAC if you’re gearing up for the next node, regardless of which node that is. I suggest you spend time with the entire design ecosystem to get some ‘on the ground’ insight into the challenges that lie ahead and what is available to help you. See you in San Francisco!