Giga-scale challenges will dominate 2015

By Dr. Zhihong Liu, Executive Chairman, ProPlus Design Solutions, Inc.

It wasn’t all that long ago that nano-scale was the term the semiconductor industry used to describe shrinking transistor sizes as a mark of technological advancement. Today, with Moore’s Law slowing down at sub-28nm, the term heard more often is giga-scale, reflecting a leap in complexity challenges caused in large measure by the massive amounts of big data now part of all chip design.

Nano-scale technological advancement has enabled giga-sized applications across a wider variety of technology platforms, including the most popular mobile, IoT and wearable devices. EDA tools must respond to this trend. On one side, accurately modeling nano-scale devices, including the complex physical effects introduced by small geometries and complicated device structures, has grown both more important and more difficult. Designers now demand more from foundries and hold PDKs and models to higher accuracy standards. They need a deep understanding of the process platform to make their chips or IP competitive.

On the other side, giga-scale designs require accurate tools that can handle ever-increasing design sizes. The low supply voltages that accompany technology advancement and low-power applications, together with the impact of various process variation effects, have reduced the available design margins. Furthermore, large circuit sizes make designs sensitive to small leakage currents and narrow noise margins. Accuracy will soon become the bottleneck for giga-scale designs.
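
To make that margin squeeze concrete, here is a toy Monte Carlo sketch in Python; every number in it (nominal threshold voltage, its spread, the required overdrive, the supply voltages) is an illustrative assumption rather than foundry data, and it is only meant to show how the same device variation consumes far more margin as the supply voltage drops.

```python
# A toy Monte Carlo sketch of the margin squeeze described above. All device
# numbers here (nominal Vth, Vth sigma, required overdrive, supply voltages)
# are illustrative assumptions, not foundry data.
import random

def fraction_below_margin(vdd, vth_nominal=0.35, vth_sigma=0.05,
                          min_overdrive=0.15, samples=100_000):
    """Estimate the fraction of devices whose gate overdrive (VDD - Vth)
    falls below an assumed minimum needed for reliable switching."""
    failures = sum(
        1 for _ in range(samples)
        if vdd - random.gauss(vth_nominal, vth_sigma) < min_overdrive
    )
    return failures / samples

# The same threshold-voltage spread costs far more margin as VDD shrinks.
for vdd in (0.8, 0.7, 0.6, 0.5):
    print(f"VDD = {vdd:.1f} V -> devices below margin: "
          f"{fraction_below_margin(vdd):.3%}")
```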

However, traditional design tools for big designs, such as FastSPICE for simulation and verification, mostly trade off accuracy for capacity and performance. One particular example is the need for accurate memory design, e.g., large-instance memory characterization or full-chip timing and power verification. Because embedded memory may occupy more than 50 percent of a chip's die area, it has a significant impact on chip performance and power. For advanced designs, power and timing characterization and verification require much higher accuracy than FastSPICE can offer: errors of 5 percent or less compared to golden SPICE.
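
As a rough illustration of that accuracy bar, the short Python sketch below compares hypothetical FastSPICE-style measurements against golden SPICE values and flags anything outside a 5 percent relative-error budget; the measurement names and numbers are placeholders, not output from any real simulator.

```python
# A minimal sketch of the accuracy criterion above: compare FastSPICE-style
# numbers against golden SPICE and flag anything with more than 5 percent
# relative error. The measurement names and values are hypothetical
# placeholders, not real simulator output.
golden_spice = {"read_delay_ns": 1.20, "write_delay_ns": 1.05, "leakage_uW": 42.0}
fastspice    = {"read_delay_ns": 1.31, "write_delay_ns": 1.08, "leakage_uW": 47.5}

MAX_REL_ERROR = 0.05  # the 5 percent bar cited in the article

for name, golden in golden_spice.items():
    rel_err = abs(fastspice[name] - golden) / abs(golden)
    status = "within budget" if rel_err <= MAX_REL_ERROR else "exceeds budget"
    print(f"{name:15s} golden={golden:8.3f} fast={fastspice[name]:8.3f} "
          f"error={rel_err:6.2%} ({status})")
```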

To meet the giga-scale challenges outlined above, the next-generation circuit simulator must offer the high accuracy of a traditional SPICE simulator along with capacity and performance advantages comparable to a FastSPICE simulator. New entrants into the giga-scale SPICE simulation market readily handle the latest process technologies, such as 16/14nm FinFET, which add further challenges to capacity and accuracy.

A single giga-scale SPICE simulator can cover small- and large-block simulation, characterization, and full-chip verification with a pure SPICE engine that guarantees accuracy and eliminates inconsistencies in the traditional design flow. It can be used as the golden reference for FastSPICE applications, or directly replace FastSPICE for memory designs.

The giga-scale era in chip design is here and giga-scale SPICE simulators are commercially available to meet the need.

One thought on “Giga-scale challenges will dominate 2015”

  1. Harsh

    I completely agree with Dr. Zhihong Liu on the accuracy and performance requirements for memory characterization. Paripath's product guna works with ProPlus nanospice to generate sign-off accurate models. Paripath's context-aware models (http://www.paripath.com/Products/collateral) for timing and charge-sharing analysis (http://www.paripath.com/Products/incharge) for SI go one step further to ensure sign-off accuracy.

    FinFET, smaller nodes and lower voltages are making these effects worse. Memory compilers are no longer trustworthy because of their interpolation/extrapolation of tiles.
