Reality, FUD and vision

by Gary Smith, Gary Smith EDA

We are in year five of a major inflection point in the semiconductor market, one that affects not only all supporting industries but the entire electronics ecosystem. As is to be expected, it is being accompanied by FUD (fear, uncertainty and doubt) and a new reality, and it is being driven by vision. Unfortunately, now that we are a little over halfway through the transition, FUD seems to be winning out over vision.

As with all of these inflection points, there have been warnings and, thankfully, some good research reaching back into the 1990s. The first ESL (electronic system level) designs were actually done back in 1994. The Golden Age, or as I prefer to call it, the Cream Puff Era, of semiconductor process development ended with the 130nm node in 2001. By 2002, a delineation between system-level IC vendors and component IC vendors was evident. Throw in the emergence of the Asian IC vendors, and the signs of change were clear. Still, the majority of us seem to prefer the ostrich approach to the problem, or else the “Chicken Little” approach.

This is the most exciting, challenging, and rewarding time the semiconductor market has ever seen! Everything is exploding around us. Some of the explosions are the destruction of companies that refuse to change, but most are explosions of new opportunities, new technologies, and yet another new adventure for those of us in the electronics ecosystem.

For those who don’t quite get it yet, let’s visit some of the FUD that has been thrown our way:

– We are in a significant semiconductor recession. No, it’s a memory pricing problem. We are making more ICs than ever. The struggles of some old-line semiconductor vendors are fueling this FUD.

– There will only be a handful of fabs in the near future. There will only be a handful of mega-fabs, but mini-fabs will make a comeback. Keep in mind that a mini-fab can now produce as much silicon as a full-size fab could not that many years ago. Not all markets need the volume or are that price sensitive. Only memory, processors for PCs, and some consumer SoCs need mega-fabs. The number of fabs in production today will probably be the same ten years from now.

– There will be a significant consolidation in the semiconductor market. Actually, there will be an expansion. Many of the older semiconductor vendors will not be with us, but they will be replaced primarily by Asian semiconductor vendors. In particular, watch India: a lot of highly skilled designers are becoming available for start-ups as verification is moved back to the US and European design teams.

– EDA is a maturing industry. Actually, it is the RTL design methodology that is maturing. We added 62 new EDA companies to our Wallchart this year, bringing the total to 502 vendors. The emerging ESL design methodology grew 50% in 2006 and is projected to have a 47.4% compound annual growth rate over the next five years. Take Cadence out of the EDA numbers and 2008 will be a pretty good year.

– The cost of design is skyrocketing. This is true, but you need to understand what is happening. The cost of designing the average high-end SoC has remained relatively constant since the late 1990s, ranging from $10M to $20M depending on the year. The cost fluctuations depend on the introduction of new methodologies and design tools (see the ITRS cost chart, below).


Impact of design technology on SoC consumer portable implementation cost (US $M).

What is driving up the cost is the cost of developing the embedded software that is now required before a semiconductor vendor can market an SoC. Is there a problem? Yes. Is it a big problem? Yes. Will we solve it? Yes. Someone will look at it as a major opportunity, and they will be right. If we can convince embedded software developers to spend $30k more on their tools (the average is around $10k now), the TAM would be $16B. Are we going to reach $16B? Probably not. Can we get $8B? Possibly. How about $4B? You can bet on it.
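
To make the arithmetic behind that $16B figure explicit, here is a back-of-the-envelope sketch; the implied population of roughly 530,000 embedded software developers is an assumption of mine, not a figure stated above:

\[
\text{TAM} \approx N_{\text{developers}} \times \Delta_{\text{tool spend}} \approx 530{,}000 \times \$30\text{k} \approx \$16\text{B}
\]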

My first experience with an inflection point was in 1971. We were moving from transistor-based design to gate level design. We had been used to SSI (small scale integration), MSI (medium scale integration), and LSI (large scale integration), and were entering the realm of VLSI (very large scale integration). Our problem was eloquently expressed by one of our top applications engineers: “What in the %#&$ are we going to do with all these *@#! transistors?!” As it turned out, the answer was to let the customers figure it out — or in other words, gate arrays. Our second inflection point came with the move to RTL design.

All of these inflection points were driven by the ability of the semiconductor industry to give us more transistors than we could design within a one-year design cycle. The answer for the second inflection point was cell-based design. At the 130nm node, we once more had more transistors than we could design in a year. This drove the move to SoC (system on a chip); however, we were still using the RTL methodology. By 2004, some designs were exceeding the 20,000,000-gate range, and the move to the ESL methodology began in earnest.

Today we are approaching the 32nm node, which gives us 1,106,000,000 gates. That’s right, over a billion gates will be at the disposal of today’s design engineer, and again you hear the cries of, “What in the %#&$ are we going to do with all these *@#! transistors?” If there is one thing I’ve learned about design engineers, it’s never sell them short. Give them the silicon and the tools, and they will figure out how to fill up the real estate.

So if you want to be a winner in this new challenging world, you’ll need to do a few things:

– Take the words “never” and “impossible” out of your vocabulary.

– Have faith in the design engineers, and do your utmost to give them the tools and support they need to get their jobs done.

– If you are still thinking inside the box, you are in the process of going under. The box blew apart four years ago while you were sleeping.

– Do not attack the software problem; attack the system problem. There is a graph that has been around since the late 1990s showing the design gap for software, with alarming results. It fell out of favor after the introduction of the 3G phones from Europe. It turned out that, in order to meet the power and performance specifications, designers had to use hardware accelerators to run nine of the 11 algorithms. That meant the software content of a 3G phone was about the same as that of a 2.2G phone. The predicted rapid growth of software content didn’t happen.

So, take nothing for granted. The only constant we can count on is the constant change driven by Moore’s Law, and that may end in 2020. Though probably not. Although CMOS will not be the fabric we will use in the next decade, that doesn’t mean Moore’s Law won’t still be the engine driving us to greater and greater accomplishments.

Gary Smith is founder and chief analyst for Gary Smith EDA (Santa Clara, CA), a provider of market intelligence and advisory services for the global electronic design automation (EDA), electronic system level (ESL) design, and related technology markets. Email: [email protected], http://www.garysmithEDA.com.
