Call for improved EDA tools at the Common Platform Tech Forum

by Debra Vogler, Senior Technical Editor, Solid State Technology

Advanced technology design was a major topic at this year’s Common Platform Tech Forum (Nov. 6, Santa Clara, CA). Mark Johnstone, chief technologist for tools and methodology development at Freescale Semiconductor, had a good news/bad news message for attendees. There are no major new design concerns at 45nm, he said, but the hurdles the industry faced at 65nm become even more challenging: signal integrity, manufacturing variability (parametric yield), non-ideal scaling of device parasitics and supply/threshold voltages, global interconnect scaling, multi-objective system optimization, and sheer design complexity (the number of transistors in a design has been doubling every 18-24 months).

In particular, Johnstone observed that threshold voltage variability rises dramatically at 45nm and beyond, so the industry has to develop design styles in EDA tooling to help manage it. He also pointed to the growing interconnect delay problem: local delay decreases at advanced nodes, while global interconnect delay grows dramatically. Designers have handled this with repeater insertion, but EDA tools need to be updated to accommodate that approach, he said.
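The benefit of repeater insertion can be seen with a back-of-the-envelope delay model. The sketch below is illustrative only (it was not presented at the forum, and all parameter values are assumptions, not process data): a distributed RC wire's Elmore delay grows quadratically with length, so splitting a long global wire into buffered segments makes total delay roughly linear in length.

```python
# Illustrative sketch: why repeater insertion tames global wire delay.
# Distributed RC wire delay grows quadratically with length (Elmore model);
# splitting the wire into N buffered segments makes total delay near-linear.
# All parameter values below are assumed for illustration, not process data.

R_PER_MM = 1000.0   # wire resistance, ohms/mm (assumed)
C_PER_MM = 0.2e-12  # wire capacitance, F/mm (assumed)
T_BUF = 20e-12      # intrinsic delay of one repeater, seconds (assumed)

def unbuffered_delay(length_mm):
    """Elmore delay of a distributed RC line: 0.5 * R_total * C_total."""
    return 0.5 * (R_PER_MM * length_mm) * (C_PER_MM * length_mm)

def buffered_delay(length_mm, n_segments):
    """Wire split into n equal segments, each driven by a repeater."""
    seg = length_mm / n_segments
    return n_segments * (T_BUF + unbuffered_delay(seg))

if __name__ == "__main__":
    L = 10.0  # a 10 mm global wire
    print(f"unbuffered:  {unbuffered_delay(L) * 1e12:.0f} ps")
    for n in (2, 4, 8):
        print(f"{n} repeaters: {buffered_delay(L, n) * 1e12:.0f} ps")
```

With these assumed numbers, the unbuffered 10 mm wire is an order of magnitude slower than the buffered versions, which is why timing tools must model and place repeaters rather than treat the wire as a single lumped delay.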

In general, “taming” variability will be even more challenging at advanced nodes (down to 32nm) because of the number of variability sources in the manufacturing processes themselves; not all of them are accurately modeled, and some are random in nature. Johnstone cited several manufacturing processes that need better models: depth-of-focus and scanner x-y field differences, ILD thickness variation, RC temperature effects, cross-die voltage gradients, dishing and over-polishing, and CD control.

To address the variability challenge, Johnstone said the question the industry should be asking itself is whether it can make a profit without understanding the processes listed above, and the answer is not to throw ever-larger design margins at the problem. He believes a key factor in whether the industry can sell parts at a profit going forward is improved modeling, both deterministic and statistical, through improved EDA tools that drive better designs.

In a candid observation, Johnstone noted that given the tremendous time pressures under which designers labor, they will not spend time fixing problems for which they are not credited. He outlined several examples of how EDA tools can be improved to give them that credit: hold time margins, common clock path pessimism removal (i.e., designing circuits to be less sensitive to variability), and balanced clock tree design. By addressing these and other issues in the EDA tools, Johnstone believes designers will not be tempted to solve variability problems with the broad brush of added design margin, and instead will fix the specific causes of variability and get credit for doing so.

Tracing where design has been and where it needs to go, Johnstone emphasized that as nodes advance, increasing variability is a fact of life and reliance on design margin is not sustainable; the next steps are modeling variability and developing the tools to measure it. Statistical timing analysis, lithography effects modeling, and variability-aware power analysis are some of the tools that will be needed. Once the variability can be understood and measured, he said, the next step is to eliminate its impact altogether. “We need design styles that are insensitive to that variability, so this will require investment in EDA tools,” he concluded. — D.V.
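The gap between margin-based design and statistical timing analysis can be made concrete with a toy model. The sketch below is not any vendor's STA engine; it assumes a path of independent Gaussian stage delays with assumed mean and sigma. A worst-case corner puts every stage simultaneously at +3 sigma, while a statistical analysis lets independent variations partially cancel, so the 3-sigma path delay grows only with the square root of the stage count; the difference is exactly the margin Johnstone argues designers are wasting.

```python
# Illustrative sketch (not a real STA engine): corner-based vs. statistical
# timing for a path of N independent Gaussian stage delays.
# Corner: every stage at +3 sigma at once  -> N * (mu + 3*sigma)
# Statistical 3-sigma path delay           -> N*mu + 3*sigma*sqrt(N)
import math
import random

def corner_delay(n_stages, mu, sigma):
    """Worst-case corner: all stages simultaneously at +3 sigma."""
    return n_stages * (mu + 3.0 * sigma)

def statistical_delay(n_stages, mu, sigma):
    """3-sigma path delay for independent Gaussian stage delays."""
    return n_stages * mu + 3.0 * sigma * math.sqrt(n_stages)

def monte_carlo_delay(n_stages, mu, sigma, trials=20000, seed=1):
    """Empirical check: 99.87th percentile (+3 sigma) of simulated paths."""
    rng = random.Random(seed)
    samples = sorted(
        sum(rng.gauss(mu, sigma) for _ in range(n_stages))
        for _ in range(trials)
    )
    return samples[int(0.9987 * trials)]

if __name__ == "__main__":
    n, mu, sigma = 20, 100.0, 10.0  # 20 stages, 100 ps +/- 10 ps (assumed)
    print(f"corner:      {corner_delay(n, mu, sigma):.0f} ps")
    print(f"statistical: {statistical_delay(n, mu, sigma):.0f} ps")
    print(f"monte carlo: {monte_carlo_delay(n, mu, sigma):.0f} ps")
```

With these assumed numbers the corner analysis demands roughly 20% more timing slack than the statistical 3-sigma bound, and the Monte Carlo percentile lands close to the statistical estimate, which is the kind of recoverable margin the talk points to.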
