
200mm fabs on the rise


October 11, 2016

One year after the debut of the industry’s first 200mm Fab Outlook report, SEMI has issued an October 2016 update, with the improved and expanded report forecasting 200mm fab trends out to 2020.  This extensive report features trends from 2009 to 2020, showing how 200mm fab activities and capacity have changed worldwide.  SEMI’s analysts updated information on almost 200 facilities, including new facilities and closures of existing facilities.

Examining 200mm capacity over the years, the highest level of 200mm capacity was recorded in 2007 and the lowest following this peak in 2009 (see figure). The capacity decline from 2007 to 2009 was driven by the 2008/2009 global financial crisis, which caused the closure of many facilities, and the transition of memory and MPU fabrication to 300mm fabs from 200mm.

[Figure: Global 200mm fab capacity trend]

Since 2009, installed 200mm fab capacity has increased, and by 2020 it is expected to reach 5.5 million wafers per month (wpm), still below the 2007 peak. According to SEMI’s data, installed capacity will reach close to 5.38 million wpm by 2019, almost as high as capacity in 2006. From 2015 to 2020, 200mm facilities are forecast to add 618,000 wpm of net capacity, the combined result of some fabs adding capacity and others losing it.

Two applications account for the growing demand for 200mm: mobile devices and IoT. Rising fab capacity from 2015 to 2020 will be driven by MEMS, power, foundry, and analog devices. By region, the greatest capacity increases are expected in China, Southeast Asia, the Americas, and Taiwan. Another trend: 200mm fabs are adding capacity with process capability below 120nm. Higher capacity does not mean more fabs, but fewer, larger fabs; in fact, the fab count in 2020 is almost the same as in 2009. So 2020 capacity heads toward industry highs, whereas 2009 marked the low point following the 2007 peak.

The Global 200mm Fab Outlook to 2020, published by SEMI in October 2016, includes two files: a 92-page PDF featuring trend charts, tables, and summaries, and an Excel file covering 2009 to 2020 that details fab-by-fab developments on a quarterly basis.

2015 and 2016 have been exciting years for the GaN power business: 600V GaN is now commercially available after many ups and downs, and the GaN power IC has debuted, opening new market perspectives for GaN companies. According to Yole Développement (Yole), the “More than Moore” market research and strategy consulting company, the GaN power business is expected to reach US$280 million in 2021, an 86% CAGR between 2015 and 2021. The market is driven by emerging applications including power supplies for datacenters and telecom, AC fast chargers, lidar, ET, and wireless power.

[Figure: GaN hype cycle]

“Numerous powerful developments and key collaborations have been announced during this period and confirmed a promising and fast-growing industry,” commented Dr. Hong Lin, Technology & Market Analyst at Yole. Collaborations announced in just two years, between 2015 and 2016, include Integrated Device Technology (IDT) and Efficient Power Conversion (EPC); Infineon Technologies and Panasonic; Exagan and X-FAB; TSMC and GaN Systems for volume production; and more. In parallel, Texas Instruments announced an 80V power stage in 2015 and a 600V power stage in 2016, and VisIC announced its first GaN product in 2015.

Yole’s analysts present the status of the power GaN industry in a new technology & market analysis titled Power GaN 2016: Epitaxy and Devices, Applications and Technology Trends. The report gives a deep understanding of GaN penetration in different applications (power supply, PV, EV/HEV, UPS, lidar, and more) and of state-of-the-art GaN power devices. It also reviews the industrial landscape, market dynamics, and market projections.

Up until late 2014, the commercial availability of 600V/650V GaN HEMTs was still questionable, despite announcements from various players. Fast-forward to 2016, and end users can buy not only low-voltage (<200V) GaN devices from EPC, but also high-voltage (600V/650V) components from several players, including Transphorm, GaN Systems, and Panasonic.

In parallel, a new start-up, Navitas Semiconductor, announced its GaN power IC in March 2016, followed by Dialog Semiconductor, which revealed its GaN power IC in August 2016. The idea of bringing GaN from the power semiconductor market to the much bigger analog IC market is of interest to several other players too. For example, EPC and GaN Systems are both working on more integrated solutions, and Texas Instruments, a well-established analog IC player, has also engaged in GaN activities, releasing an 80V power stage and a 600V power stage in 2015 and 2016, respectively.

Despite these exciting developments, the GaN power market remains small compared to the gigantic US$335 billion silicon semiconductor market. In fact, according to Yole’s investigation, the GaN power business was less than US$10 million in 2015.

“But before you think twice about GaN, remember that a small market size is not unusual for products just appearing on the market,” commented Dr. Hong Lin. Indeed, the first GaN devices were not commercially available until 2010. According to Yole’s analysts, the most important point is the potential of GaN power: they expect the GaN power business to grow to around US$300 million in 2021, a 2016-2021 CAGR of 86%. “Low-voltage (<200V) devices dominate the current GaN power market, but over the forecast period the 600V devices should take off,” commented Zhen Zong, Technology & Market Analyst at Yole.
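Yole’s figures are internally consistent under simple compound-growth arithmetic. The sketch below (the helper function is ours; the dollar figures and CAGR come from the text) backs out the 2015 starting point implied by the US$280 million 2021 forecast:

```python
def implied_start(end_value, cagr, years):
    """Starting market size implied by an end value and a CAGR:
    start = end / (1 + CAGR)^years."""
    return end_value / (1 + cagr) ** years

# US$280M in 2021 at an 86% CAGR over the 6 years from 2015:
start_2015 = implied_start(280e6, 0.86, 6)
print(f"implied 2015 market: US${start_2015 / 1e6:.1f}M")  # ~US$6.8M
```

The result, roughly US$6.8 million, squares with Yole’s separate estimate that the GaN power business was below US$10 million in 2015.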

“More than 200 patent applicants are involved in the power GaN industry,” explained KnowMade in its GaN for Power Electronics: Patent Investigation report (KnowMade, August 2015). This figure shows the strong interest of power players in the GaN business. Patenting activity took off in the 2000s, with a first wave of patent publications over the 2005-2009 period driven mainly by American and Japanese companies. A second wave started in 2010, as the first commercial GaN products, collaborations, and mergers and acquisitions emerged.

“In today’s power GaN market, it is crucial to understand the global patent landscape through in-depth analyses,” commented Nicolas Baron, CEO and co-founder of KnowMade. “This approach helps companies anticipate changes, identify and evaluate business opportunities, mitigate risks, and make strategic choices.”

New research, led by the University of Southampton, has demonstrated that a nanoscale device, called a memristor, could be used to power artificial systems that can mimic the human brain.

First demonstration of brain-inspired device to power artificial systems. Credit: University of Southampton


Artificial neural networks (ANNs) exhibit learning abilities and can perform tasks which are difficult for conventional computing systems, such as pattern recognition, on-line learning and classification. Practical ANN implementations are currently hampered by the lack of efficient hardware synapses; a key component that every ANN requires in large numbers.

In the study, published in Nature Communications, the Southampton research team experimentally demonstrated an ANN that used memristor synapses supporting sophisticated learning rules in order to carry out reversible learning of noisy input data.

Memristors are electrical components that limit or regulate the flow of current in a circuit, and that can remember how much charge has flowed through them, retaining that data even when the power is turned off.

Lead author Dr Alex Serb, from Electronics and Computer Science at the University of Southampton, said: “If we want to build artificial systems that can mimic the brain in function and power we need to use hundreds of billions, perhaps even trillions of artificial synapses, many of which must be able to implement learning rules of varying degrees of complexity. Whilst currently available electronic components can certainly be pieced together to create such synapses, the required power and area efficiency benchmarks will be extremely difficult to meet -if even possible at all- without designing new and bespoke ‘synapse components’.

“Memristors offer a possible route towards that end by supporting many fundamental features of learning synapses (memory storage, on-line learning, computationally powerful learning rule implementation, two-terminal structure) in extremely compact volumes and at exceptionally low energy costs. If artificial brains are ever going to become reality, therefore, memristive synapses have to succeed.”

Acting like synapses in the brain, the metal-oxide memristor array was capable of learning and re-learning input patterns in an unsupervised manner within a probabilistic winner-take-all (WTA) network. This is extremely useful for enabling low-power embedded processors (needed for the Internet of Things) that can process big data in real time without any prior knowledge of the data.

Co-author Dr Themis Prodromakis, Reader in Nanoelectronics and EPSRC Fellow in Electronics and Computer Science at the University of Southampton, said: “The uptake of any new technology is typically hampered by the lack of practical demonstrators that showcase the technology’s benefits in practical applications. Our work establishes such a technological paradigm shift, proving that nanoscale memristors can indeed be used to formulate in-silico neural circuits for processing big-data in real-time; a key challenge of modern society.

“We have shown that such hardware platforms can independently adapt to their environment without any human intervention and are very resilient, reliably processing even noisy data in real time. This new type of hardware could find a diverse range of applications in pervasive sensing technologies to fuel real-time monitoring in harsh or inaccessible environments; a highly desirable capability for enabling the Internet of Things vision.”

The Semiconductor Industry Association (SIA), representing U.S. leadership in semiconductor manufacturing, design, and research, today announced worldwide sales of semiconductors reached $28.0 billion for the month of August 2016, an increase of 3.5 percent compared to the previous month’s total of $27.1 billion and an uptick of 0.5 percent over the August 2015 total of $27.9 billion. August marked the market’s largest month-to-month growth since May 2013 and its first year-to-year growth since June 2015. All monthly sales numbers are compiled by the World Semiconductor Trade Statistics (WSTS) organization and represent a three-month moving average.

“Following months of sluggish global semiconductor sales, the global market recently has shown signs of a rebound, punctuated by solid growth in August,” said John Neuffer, president and CEO, Semiconductor Industry Association. “The Americas market was particularly encouraging, topping 6 percent month-to-month growth for the first time in nearly three years to lead all regional markets. China also stood out, posting by far the strongest year-to-year growth of all regions in August. All told, global sales are still behind last year’s pace, but appear to be on the right track as 2017 draws closer.”

Month-to-month sales increased across all regions: the Americas (6.3 percent), Japan (4.8 percent), China (3.1 percent), Asia Pacific/All Other (2.7 percent), and Europe (0.7 percent). Year-to-year sales increased in China (7.1 percent) and Japan (2.2 percent), but fell in Asia Pacific/All Other (-2.7 percent), the Americas (-3.1 percent), and Europe (-3.3 percent).

 

August 2016 (US$ billions)

Month-to-Month Sales
Market                   Last Month   Current Month   % Change
Americas                       5.10            5.43       6.3%
Europe                         2.70            2.71       0.7%
Japan                          2.60            2.73       4.8%
China                          8.56            8.82       3.1%
Asia Pacific/All Other         8.12            8.34       2.7%
Total                         27.08           28.03       3.5%

Year-to-Year Sales
Market                   Last Year    Current Month   % Change
Americas                       5.60            5.43      -3.1%
Europe                         2.81            2.71      -3.3%
Japan                          2.67            2.73       2.2%
China                          8.23            8.82       7.1%
Asia Pacific/All Other         8.57            8.34      -2.7%
Total                         27.88           28.03       0.5%

Three-Month-Moving Average Sales
Market                   Mar/Apr/May   Jun/Jul/Aug   % Change
Americas                        4.79          5.43      13.2%
Europe                          2.63          2.71       3.3%
Japan                           2.55          2.73       6.9%
China                           8.09          8.82       9.0%
Asia Pacific/All Other          8.00          8.34       4.2%
Total                          26.07         28.03       7.5%
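The percent changes reported above follow directly from the dollar figures; a short sketch (values transcribed from the tables, in US$ billions, worldwide totals) reproduces the arithmetic:

```python
def pct_change(prev, curr):
    """Percent change from prev to curr, rounded to one decimal place,
    matching the convention used in the SIA tables."""
    return round((curr - prev) / prev * 100, 1)

print(pct_change(27.08, 28.03))  # month-to-month worldwide: 3.5
print(pct_change(27.88, 28.03))  # year-to-year worldwide: 0.5
print(pct_change(26.07, 28.03))  # three-month moving average: 7.5
```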

By Paula Doe, SEMI

As the rate of traditional scaling slows, the chip sector looks increasingly to materials and design to move forward on multiple paths for multiple applications. Figuring out more effective ways to collaborate across silos will be crucial.

Source: IBM (slide 6, Strategic Materials Conference presentation)

  1. Paradigm shift requires co-optimization


“Scaling has hit a wall, and there is no longer any single path forward,” noted Larry Clevenger, BEOL Architect and Technology Definition, IBM Research, at the SEMI Strategic Materials Conference 2016 (September 20-21). “The materials set we use in the middle and back end of line is running out of steam. We need new materials and design co-optimization.”  He noted EUV would markedly improve the critical tight-pitch areas for memory and for the BEOL at 7nm-5nm logic. But reducing the parasitics in the metal interconnect at middle of line and BEOL will also be critical; good results have been demonstrated with new materials such as Si:P and Ge:Ga meta-stable alloys, cobalt instead of tungsten, self-forming encapsulation of copper by cobalt, and airgaps, all of which require an ecosystem of appropriate cleaning, deposition, and wet process technologies to be optimized for integration. Changing the design to route critical paths directly up to higher wiring levels, where the wires are larger, would also help reduce resistance.

“It’s a paradigm shift that what was once a process deviation is now an excursion,” said Archita Sengupta, Intel senior technologist, noting the need for new specialized tools to measure, monitor and control the process to detect ever tinier defects sooner. “We need more proactive cooperation across the supply chain for bottom up control of quality from suppliers.”

Nvidia’s director of Advanced Technology, John Hu, showed impressive examples of imaging and computation, using Nvidia GPUs and artificial intelligence, that enable doctors to reduce errors in breast cancer detection by 85 percent and even to operate on a beating heart. “We are at a real inflection point for demand for more compute power, and we can’t get there by just process scaling any more,” he noted. “We are going to have to rely on new architectures to rescue us from the increasingly imperfect reality of materials and processes.”

While almost every speaker stressed the growing need for the different segments of the supply chain, from materials to design, to work more closely together to move technology forward along many new paths, the materials suppliers in the audience felt that progress toward making this happen could be better. Some said they are now invited into the fabs more often to discuss material development, but are still told little about the key target parameters. Others raised the time and expense needed to qualify second sources for raw materials and precursors, to obtain the required environmental certifications, and to gain access to the expensive, exotic multi-technology metrology tools capable of finding contaminants too small to see with conventional methods, all before any potential material can even be brought in for evaluation for use several years in the future.

Although speakers kept referring to the past Golden Age of Moore’s Law of regular two-year dimensional scaling, before the proliferation of alternatives, Tim Hendry, retiring Intel VP, Fab Materials, pointed out that it hadn’t really seemed like a Golden Age at the time. “As I remember, we thought it was pretty hard back then too.”

  2. Look to self-aligned and selective processes as scaling boosters

As lithography scaling slows, new approaches will make creative use of deposition and etch to keep improving pattern resolution. “14nm is a real sweet spot technically for lithography that will be with us for a long time,” noted Anton DeVilliers, director of Patterning Technology at Tokyo Electron America, pointing to a toolkit of assorted self-alignment and selective deposition and etch processes likely to see increasing use as resolution boosters as an alternative to pushing the lithography, such as collars at key points to protect the pattern, or self-aligned patterning by selective etching.

Adding a protective ALD collar holds a key region open during etch to widen the process window and prevent shorts from process variation in tight pattern areas: an ALD snap collar holds the critical part of the M1 pattern open to widen the window in an LELELE process, so that overlay variation that would typically create a short instead creates the desired pattern. (Source: TEL)

Using materials with different etch selectivity for different parts of a pattern, such as for alternate lines, enables the creation of a self-aligned pattern at higher resolution than the lithography alone. Different etch selectivity in alternate metal tracks could also reduce the number of exposure passes and improve overlay tolerance. “For 5nm nanowires, we’ll have to use selective ALD and ALE, controlled by self-assembling monolayers,” noted DeVilliers. “We’ve done each of these steps on a tool, but now the challenge is to put them all together.”

  3. Progress on 3D alternatives

“To maintain the pace of progress we’ll have to change everything—we can’t do it with Moore’s Law,” said Bill Bottoms, chairman and CEO, Third Millennium Test Solutions, giving an update on the international effort to create a Heterogeneous Integration Roadmap. “Future progress will come from bringing active elements closer together through integration at the system level, with interconnects using photonics and plasmonics.” The aim is to map future needs to better enable precompetitive collaboration. The first edition of the roadmap is now slated for March.


CEA-Leti researchers, meanwhile, are reporting good progress on lowering the temperatures of the various processes needed to build a second chip directly on top of a first, for monolithic 3D CMOS-on-CMOS integration. Performance of the bottom chip degrades if process temperatures for the top chip exceed 500°C, mainly because the NiPt silicide deteriorates, but replacing the NiPt with a more stable NiCo and adding a Si cap looks promising for increasing stability. The 8nm active layer for the top device is bonded atop the bottom device at room temperature and annealed at 300°C. Nanosecond laser thermal annealing and low-temperature solid phase epitaxy regrowth help bring down temperatures for dopant activation. Cycles of deposition and etch replace selective epitaxy for the source and drain, while different precursors reduce process temperatures to 500-550°C. “Later this year at IEDM we’ll demonstrate top CMOS made at 500°C with these developments,” said Philippe Rodriguez, CEA-Leti research engineer.

  4. Get used to the slow-growth world

The semiconductor industry will see silicon demand (MSI) pick up from this year’s 0.6 percent increase to ~3.8 percent growth in 2017 and ~6.3 percent in 2018, as some uncertainty about interest rates and government policy in major countries resolves, according to the econometric semiconductor forecast from Hilltop Economics and LINX Consulting. “We got comfortable with 3 percent GDP growth in the world that we sell chips into, but since the 2009 recession we are only seeing about 2.4 percent growth,” said Duncan Meldrum, chief economist, Hilltop Economics. He noted that economists keep saying the world will get back to its regular 3 percent growth next quarter or next year, but it hasn’t happened, probably because high government debt levels in most major economies tend to suppress growth. Silicon demand grows a little faster than GDP, but as the electronics industry matures, its trends track that global growth number more closely than in the past.

  5. Wafer-level fan-out will shake up the package materials sector

Now that it appears the 40 to 50 percent performance improvement in the newest Apple A10 processor comes largely from its wafer-level fan-out packaging from TSMC, demand for the packaging approach is ramping fast. “This is one of the fastest ramps we’ve seen for a package in a long time,” said TechSearch International president Jan Vardaman. “It’s a very disruptive technology that will have a big impact on the industry.” The thinner, lower-cost packaging approach is also showing up in RF and audio codec chips in mobile phones, with ~2 billion units in Samsung and Apple phones alone, potentially bringing big changes to the packaging materials market. Laminate substrate suppliers will see demand plunge, copper post suppliers will see little change, and makers of wafer-level dielectrics could see 3X growth in volume. “But don’t think you’ll see that in revenue, since customers will really beat the prices down.”

And in a final note, the gathered materials sector paused in a moment of silence for Dan Rose, who passed away on September 19.  Dan was a well-known market researcher and founder of Rose Associates with a focus on materials market data.

Originally published on the SEMI blog.

By David W. Price and Douglas G. Sutherland

Author’s Note: The Process Watch series explores key concepts about process control—defect inspection and metrology—for the semiconductor industry. Following the previous installments, which examined the 10 fundamental truths of process control, this new series of articles highlights additional trends in process control, including successful implementation strategies and the benefits for IC manufacturing. 

Introduction

In a previous Process Watch article [1], we showed that big excursions are usually easy to detect but finding small excursions requires a combination of high capture rate and low noise. We also made the point that, in our experience, it’s usually the smaller excursions which end up costing the fab more in lost product. Catastrophic excursions have a large initial impact but are almost always detected quickly. By contrast, smaller “micro-excursions” sometimes last for weeks, exposing hundreds or thousands of lots to suppressed yield.

Figure 1 shows an example of a micro-excursion. For reference, the top chart depicts what is actually happening in the fab with an excursion occurring at lot number 300. The middle chart shows the same excursion through the eyes of an effective inspection strategy; while there is some noise due to sampling and imperfect capture rate, it is generally possible to identify the excursion within a few lots. The bottom chart shows how this excursion would look if the fab employed a compromised inspection strategy—low capture rate, high capture rate variability, or a large number of defects that are not of interest; in this case, dozens of lots are exposed before the fab engineer can identify the excursion with enough confidence to take corrective action.

Figure 1. Illustration of a micro-excursion. Top: what is actually happening in the fab. Middle: the excursion through the lens of an effective control strategy (average 2.5 exposed lots). Bottom: the excursion from the perspective of a compromised inspection strategy (~40 exposed lots).


Unfortunately, the scenario depicted in the bottom of Figure 1 is all too common. Seemingly innocuous cost-saving tactics, such as reduced sampling or using a less sensitive inspector, can quickly render a control strategy ineffective [2]. Moreover, the fab may gain a false sense of security that the layer is being effectively monitored by virtue of its ability to find the larger excursions.
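The difference between an effective and a compromised inspection strategy can be mimicked with a toy model. In the sketch below (every number is an illustrative assumption, not fab data), the inspector reports the true defect count scaled by its capture rate plus Gaussian nuisance noise, and the excursion is flagged once a reported count crosses a fixed control limit; a low capture rate and noisy counts stretch out the detection delay:

```python
import random

def lots_to_detect(capture_rate, noise_sd, limit,
                   excursion_defects=14.0, seed=0, max_lots=200):
    """Lots elapsed before an excursion is flagged.

    The true defect count per lot during the excursion is
    `excursion_defects`; the inspector reports
    capture_rate * true + Gaussian nuisance noise, and the alarm trips
    when a reported count exceeds `limit`. All numbers are illustrative
    assumptions, not fab data.
    """
    rng = random.Random(seed)
    for lot in range(1, max_lots + 1):
        reported = capture_rate * excursion_defects + rng.gauss(0.0, noise_sd)
        if reported > limit:
            return lot
    return max_lots

# Effective strategy: high capture rate, low noise.
good = lots_to_detect(capture_rate=0.9, noise_sd=1.0, limit=11.0)
# Compromised strategy: low capture rate, noisy counts.
poor = lots_to_detect(capture_rate=0.5, noise_sd=3.0, limit=11.0)
print(good, poor)
```

With the same control limit, the effective strategy trips the alarm within a few lots, while the compromised one typically takes far longer, exposing many more lots of product.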

Micro-Excursions 

Table 1 illustrates the difference between catastrophic and micro-excursions. As the name implies, micro-excursions are subtle shifts away from the baseline. Of course, excursions may also take the form of anything in between these two.

Table 1: Catastrophic vs. Micro-Excursions


Such baseline shifts happen to most, if not all, process tools; after all, that’s why fabs employ rigorous preventive maintenance (PM) schedules. But PMs are expensive (parts, labor, lost production time), so fabs tend to put them off as long as possible.

Because the individual micro-excursions are so small, they are difficult to observe in end-of-line (EOL) yield data. They are frequently seen in EOL yield data only through the cumulative impact of dozens of micro-excursions occurring simultaneously, and even then the loss more often appears to be baseline yield loss. As a result, fab engineers sometimes use the terms “salami slicing” or “penny shaving,” since these phrases describe how a series of many small actions can, as an accumulated whole, produce a large result [3].

Micro-excursions are typically brought to an end because: (a) a fab detects them and puts the tool responsible for the excursion down; or, (b) the fab gets lucky and a regular PM resolves the problem and restores the tool to its baseline. In the latter case, the fab may never know there was a problem.

The Superposition of Multiple Simultaneous Micro-Excursions

To understand the combined impact of these multiple micro-excursions, it is important to recognize:

  1. Micro-excursions on different layers (different process tools) will come and go at different times
  2. Micro-excursions have different magnitudes in defectivity or baseline shift
  3. Micro-excursions have different durations

In other words, each micro-excursion has a characteristic phase, amplitude and wavelength. Indeed, it is helpful to imagine individual micro-excursions as wave forms which combine to create a cumulative wave form. Mathematically, we can apply the Principle of Superposition [4] to model the resulting impact on yield from the contributing micro-excursions.

Figure 2 illustrates the cumulative effect of one, five, and 10 micro-excursions happening simultaneously in a 1,000 step semiconductor process. In this case, we are assuming a baseline yield of 90 percent, that each micro-excursion has a magnitude of 2 percent baseline yield loss, and that they are detected on the 10th lot after it starts. As expected, the impact of a single micro-excursion is negligible but the combined impact is large.

Figure 2. The cumulative impact of one, five, and 10 simultaneous micro-excursions happening in a 1,000 step process: increased yield loss and yield variation.


It is interesting to note that the bottom curve in Figure 2 would seem to suggest that the fab is suffering from a baseline yield problem. However, what appears to be 80 percent baseline yield is actually 90 percent baseline yield with multiple simultaneous micro-excursions, which brings the average yield down to 80 percent. This distinction is important since it points to different approaches in how the fab might go about improving the average yield. A true baseline yield problem would suggest that the fab devote resources to run experiments to evaluate potential process improvements (design of experiments (DOEs), split lot experiments, failure analysis, etc.). These activities would ultimately prove frustrating as the engineers would be trying to pinpoint a dozen constantly-changing sources of yield loss.
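This superposition is easy to simulate. The sketch below uses the scenario’s own numbers (90 percent baseline yield, 2 percent loss per active micro-excursion, detection on the 10th lot); the rate at which excursions start is our own illustrative assumption:

```python
import random

def simulate_yield(n_sources, n_lots=1000, baseline=0.90,
                   loss_per_excursion=0.02, detect_after=10,
                   start_prob=0.01, seed=1):
    """Per-lot yield under several intermittent micro-excursions.

    Each source (process tool/layer) starts an excursion with
    probability `start_prob` per lot while healthy; once started, it
    stays active for `detect_after` lots (detection on the 10th lot).
    Losses from simultaneously active sources add (superposition).
    `start_prob` is an illustrative assumption.
    """
    rng = random.Random(seed)
    clears_at = [0] * n_sources  # lot index at which each source clears
    yields = []
    for lot in range(n_lots):
        for i in range(n_sources):
            if lot >= clears_at[i] and rng.random() < start_prob:
                clears_at[i] = lot + detect_after
        active = sum(lot < c for c in clears_at)
        yields.append(baseline - loss_per_excursion * active)
    return yields

for n in (1, 5, 10):
    y = simulate_yield(n)
    print(n, round(sum(y) / len(y), 3), round(min(y), 2))
```

As in Figure 2, a single source barely moves the average, while ten simultaneous sources both depress the mean yield and add visible lot-to-lot variation, which an engineer could easily mistake for a baseline problem.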

The fab engineer who correctly surmises that this yield loss is, in fact, driven by micro-excursions would instead focus on implementing tighter process tool monitoring strategies. Specifically, they would examine the sensitivity and frequency of process tool monitor inspections; depending on the process tool, these monitors could be bare wafer inspectors on blanket wafers and/or laser scanning inspectors on product wafers. The goal is to ensure these inspections provide timely detection of small micro-excursions, not just the big excursions.

The impact of an improved process tool monitoring strategy can be seen in Figure 3. By improving the capture rate (sensitivity), reducing the number of non-critical defects (by doing pre/post inspections or using an effective binning routine), and reducing other sources of noise, the fab can bring the exposed product down from 40 lots to 2.5 lots. This, in turn, significantly reduces the yield loss and yield variation.

Figure 3. The impact of 10 simultaneous micro-excursions for the fab with a compromised inspection strategy (brown curve, ~40 lots at risk), and a fab with an effective process tool monitoring strategy (blue curve, ~2.5 lots at risk).


Summary

Most fabs do a good job of finding the catastrophic defect excursions. Micro-excursions are much more common and much harder to detect. There are usually very small excursions happening simultaneously at many different layers that go completely undetected. The superposition of these micro-excursions leads to unexplained yield loss and unexplained yield variation.

As a yield engineer, you must be wary of this. An inspection strategy that guards only against catastrophic excursions can create the false sense of security that the layer is being effectively monitored—when in reality you are missing many of these smaller events that chip away or “salami slice” your yield.

References:

About the Author: 

Dr. David W. Price is a Senior Director at KLA-Tencor Corp. Dr. Douglas Sutherland is a Principal Scientist at KLA-Tencor Corp. Over the last 10 years, Dr. Price and Dr. Sutherland have worked directly with more than 50 semiconductor IC manufacturers to help them optimize their overall inspection strategy to achieve the lowest total cost. This series of articles attempts to summarize some of the universal lessons they have observed through these engagements.

IC Insights recently released its September Update to the 2016 McClean Report. This Update included Part 2 of an extensive analysis of the IC foundry industry and a look at the current state of the merger and acquisition surge in the semiconductor industry. An excerpt from the M&A portion of this Update is shown below.

After an historic surge in semiconductor merger and acquisition agreements in 2015, the torrid pace of transactions has eased (until recently), but 2016 is already the second-largest year ever for chip industry M&A announcements, thanks to three major deals struck in 3Q16 that have a combined total value of $51.0 billion. As of the middle of September, announced semiconductor acquisition agreements this year have a combined value of $55.3 billion compared to the all-time high of $103.8 billion reached in all of 2015 (Figure 1). Through the first three quarters of 2015, semiconductor acquisition pacts had a combined value of about $79.1 billion, which is 43% higher than the total of the purchasing agreements reached in the same period of 2016, based on M&A data compiled by IC Insights.

In many ways, 2016 has become a sequel to the M&A mania that erupted in 2015, when semiconductor acquisitions accelerated because a growing number of suppliers turned to purchase agreements to offset slower growth in major existing end-use equipment applications (such as smartphones, PCs, and tablets) and to broaden their businesses to serve huge new market potentials, including the Internet of Things (IoT), wearable electronics, and strong segments in embedded electronics, like highly-automated automotive systems. China's goal of boosting its domestic IC industry is also driving M&A. In the first half of 2016, it appeared the enormous wave of semiconductor acquisitions in 2015 had subsided substantially, with the value of transactions announced between January and June being just $4.3 billion compared to $72.6 billion in 1H15. However, three large acquisition agreements announced in 3Q16, including SoftBank's purchase of ARM, Analog Devices' intended purchase of Linear Technology, and Renesas' potential acquisition of Intersil, have ensured that 2016 will be second only to 2015 in terms of the total value of announced semiconductor M&A transactions.

Figure 1


A major difference between the huge wave of semiconductor acquisitions in 2015 and the nearly 20 deals struck in 2016 is that a significant number of this year's transactions are for parts of businesses: divisions, product lines, technologies, or certain assets of companies.  This year has seen a surge in agreements in which semiconductor companies divest, or fill out, product lines and technologies to support newly honed strategies for the second half of this decade.

By Ted Shafer, Business Manager, Mature Product Sales, ASML

Ted Shafer of ASML reports on the highlights from the ≤200mm manufacturing session at SEMICON West, organized by the SEMI Secondary Equipment and Applications Special Interest Group. Your next opportunity to catch up on the latest ≤200mm manufacturing trends and their impact on the secondary equipment and applications market is SEMICON Europa 2016 and its Secondary Equipment Tech Arena session.

On Wednesday, July 13, at SEMICON West, a seminar and panel discussion were held on the longevity and growth of the 200mm equipment market, and on the responses from IDMs, OEMs, and third parties to the challenges this growth presents.

Tim Tobin of Entrepix was the first speaker.  Entrepix is a premier third-party refurbisher of CMP and other process equipment.  Tim was the first to remark on a phenomenon that the other speakers and panelists also noted: a huge portion of the die in the devices we use daily do not require state-of-the-art 300mm manufacturing.  For example, 60%–80% of the chips in your smartphone or tablet are manufactured on 200mm or smaller wafers.  These wafers are produced using mature equipment, which is frequently purchased on the secondary market, often from refurbishers such as Entrepix.

SEMI’s Christian Dieseldorff next provided an excellent overview of 200mm market trends, titled “200mm Fab: Trends, Status, and Forecast”.  Driven by the growth of the Internet of Things (IoT), new 200mm fabs are being built and additional capacity is being added at existing fabs.  The key takeaway is that after peaking in 2006, then declining for several years, 200mm wafer starts per month are now forecast to exceed 2006’s level of 5.4M by 2019.  The question on everyone’s mind is, once that level is exceeded, where will the tools come from to manufacture those wafers?

200mm-image1

Pierric Gueguen of Yole spoke of the increased adoption of exotic substrates such as GaN, sapphire, and silicon carbide.  These substrates offer many performance advantages, such as lower power consumption, faster switching speed, and high-temperature tolerance.  Yet they cannot scale to 12”, and sometimes not even to 8”, so their increased adoption is driving additional demand for 150mm/200mm tools.

As a counterpoint to the 200mm discussions, Karen Erz of Texas Instruments gave a very well-received presentation on TI’s pivot to 300mm for analog, which has traditionally been manufactured on 200mm wafers.  A key to TI’s success is embracing buying opportunities for used equipment whenever they present themselves.  TI does not compete at the leading edge (their minimum feature size is 130nm), so mature, pre-owned, cost-effective equipment is always their first choice.  In fact, surplus 300mm equipment is often more available, and less expensive, than comparable 200mm tools.  TI capitalized on the bankruptcies of the 300mm fabs of Qimonda Dresden, Qimonda Richmond, and PROMOS, as well as surplus tools at Powerchip, to scoop up large batches of inexpensive 300mm tools.  They continue to buy surplus 300mm tools when they come on the market, even in advance of actually needing them.  As a result, 92% of RFAB’s analog production is done with pre-owned 300mm equipment.

Emerald Greig of Surplus Global, in addition to organizing the seminar, also provided a well-researched presentation on surplus equipment trends, titled “The Indispensable Secondary Market”.  Surplus Global is one of the largest surplus equipment traders, and they track the used equipment market very closely.  Emerald discussed how the number of tools coming on the market each year is trending dramatically downward: 6,000 tools in 2009, under 1,000 last year, and just 600 so far this year.

200mm-image2

AMAT’s John Cummings provided the first OEM perspective on the 200mm market.  John showed how over 70% of the chips in the automotive, wearables, and mobile segments are produced on ≤200mm wafers.  These segments are growing: a BMW i3, for example, contains an astonishing 545 total die, 484 of which are manufactured on ≤200mm wafers.   AMAT reports that there are not enough used 200mm tools on the market to meet demand, so AMAT supplies its customers with new 200mm tools to augment the upgrades and refurbishments it performs on pre-owned tools.  AMAT also adds new functionality to its mature 200mm products, increasing their usefulness and extending their lifetime.

Finally there was the OEM panel discussion, consisting of Kevin Chasey of TEL, David Sachse of LAM, Hans Peters of Ebara, and Ted Shafer of ASML.  Emerald Greig of Surplus Global provided some initial questions and solicited additional ones from the audience.   The OEMs echoed one common theme of the presentations: 200mm demand is robust, and core tools are increasingly hard to find.  TEL additionally noted that China is a growing player in this market, and that OEMs must now support their 200mm product lines much longer than initially planned.  LAM said that 200mm core supply is so tight that prices are rising above even those of comparable 300mm cores; in response, LAM augments the supply of used tools by building new 200mm tools.  Ebara added that the core tools coming on the market are often undesirable first-generation tools or tools in very bad condition; on the other hand, this creates a role for the OEM, who has the expertise to make these tools production-worthy.  ASML noted that many of its larger 200mm customers are considering a migration from the PAS 5500 platform to ASML’s TWINSCAN platform for 200mm production.  Although TWINSCAN was developed for 300mm, and is in general larger and more expensive than the 200mm PAS 5500 series, ASML has spent the last 15 years making TWINSCANs increasingly productive and reliable, to the point where they often offer a lower cost of ownership at 200mm than the 5500 platform.  Furthermore, customers buying TWINSCAN for 200mm production have an easy upgrade path to 300mm when and if their plans call for it.

200mm-image3

In summary, the seminar showcased a robust exchange of ideas, where the presenters and panelists examined the resurgent 200mm market, and described many solutions to the common challenge of limited and expensive 200mm cores.

Attend SEMICON Europa and the Secondary Equipment & Applications session on October 26 to learn about the latest trends and to discuss where OEMs, IDMs, and secondary-market operators can cooperate more closely to improve sustainable access to legacy manufacturing equipment.

Find out more about SEMI’s Secondary Equipment and Applications Special Interest Group and the Secondary Equipment Legacy Management Program that is currently under development. For more information and to get involved, contact [email protected] (Ms. Rania Georgoutsakou, Director Public Policy for Europe, SEMI).

Asia-Pacific’s position as the dominant market for IC sales is forecast to strengthen in 2016, with the region expected to account for 61.0% of the $282.0 billion IC market this year, based on analysis published in IC Insights’ mid-year Update to the 2016 IC Market Drivers report.  The forecast calls for another small gain in total IC marketshare in 2016, after Asia-Pacific held 57.7% share in 2013, 58.4% in 2014, and 60.5% in 2015. The Asia-Pacific region is particularly dominant with regard to IC marketshare in the communications and computer categories, and to a lesser extent in the consumer and industrial categories (Figure 1).  In 2016, IC Insights expects the Asia-Pacific region to surpass Europe and become the largest region for automotive ICs for the first time, as China continues to account for a large and growing portion of new car shipments.  That will leave only the Government/Military end-use segment where Asia-Pacific does not have top IC marketshare—a condition that is forecast to hold through 2019.

Figure 1


IC Insights’ Update to the IC Market Drivers 2016 report forecasts total IC usage by system type through the year 2019. Highlights from the forecast include the following items.

– The Asia-Pacific region is forecast to increase its share of the IC market to 62.3% in 2019, from 61.0% forecast for 2016. Over the same time, North America is also forecast to increase its marketshare, to 23.8%. Conversely, Europe and Japan are expected to lose IC marketshare through 2019. Japan’s IC marketshare is forecast to slip to 5.5% and Europe’s is forecast to slide to 8.3% in 2019.

– The two fastest growing end-use markets for ICs through 2019 are forecast to be the automotive and industrial/medical segments, with 2015-2019 CAGRs of 8.0% and 7.1%, respectively.  Though it has the greatest CAGR through 2019, the automotive IC market is not expected to account for more than 8.0% of total IC sales at any point in the forecast period.

– After slumping to only $10.6 billion in 2009, the automotive IC market is forecast to reach nearly 3x that amount ($28.0 billion) in 2019.

– The two largest end-use markets (computer and communications) are forecast to account for 73.7% of the total IC market in 2019, almost the same as the 73.9% share they are forecast to hold in 2016.

– In 2016, analog ICs are forecast to account for the greatest share of IC sales within the automotive (45%) and industrial (50%) segments; logic devices are expected to account for the greatest share of IC sales in communications (41%), consumer (41%), and government/military (32%) applications, and microprocessors are forecast to account for the greatest share (42%) of IC sales in the computer segment.

Global growth in the number of “things” connected to the Internet continues to significantly outpace the addition of human users to the World Wide Web. New connections to the “Internet of Things” are now increasing by more than 6x the number of people being added to the “Internet of Humans” each year. Despite the increasing number of connections, IC Insights has trimmed back its semiconductor forecast for Internet of Things system functions over the next four years by about $1.9 billion, mostly because of lower sales projections for connected cities applications (such as smart electric meters and infrastructure). Total IoT semiconductor sales are still expected to rise 19% in 2016 to $18.4 billion, as shown in Figure 1, but the updated forecast first presented in the Update to the 2016 IC Market Drivers Report reduces the market’s compound annual growth rate between 2014 and 2019 to 19.9% compared to the original CAGR of 21.1%. Semiconductor sales for IoT system functions are now expected to reach $29.6 billion in 2019 versus the previous projection of $31.1 billion in the final year of the forecast.

Figure 1


The most significant changes in the new outlook are that semiconductor revenues for connected cities applications are projected to grow by a CAGR of 12.9% between 2014 and 2019 (down from 15.5% in the original forecast) while the connected vehicles segment is expected to rise by a CAGR of 36.7% (up from 31.2% in the previous projection). IoT semiconductor sales for connected cities are now forecast to reach $15.7 billion in 2019 while the chip market for connected vehicle functions is expected to be $1.7 billion in 2019, up from the previous forecast of $1.4 billion.

For 2016, revenues of IoT semiconductors used in connected cities applications are expected to rise 15% to about $11.4 billion while the connected vehicle category is projected to climb 66% to $787 million this year.

Sales of IoT semiconductors for wearable systems have also increased slightly in the forecast period compared to the original projection.  Sales of semiconductors for wearable IoT systems are now expected to grow 22% to about $2.2 billion in 2016 after surging 421% in 2015 to nearly $1.8 billion following Apple’s entry into the smartwatch market in 2Q15.  The semiconductor market for wearable IoT applications is expected to be nearly $3.9 billion in 2019.  Meanwhile, the forecast for IoT semiconductors in connected homes and the Industrial Internet categories remains unchanged.  The connected homes segment is still expected to grow 26% in 2016 to about $545 million, and the Industrial Internet chip market is forecast to increase 22% to nearly $3.5 billion.  The semiconductor forecast for IoT connections in the Industrial Internet is still expected to grow by a CAGR of 25.7% to nearly $7.3 billion in 2019 from $2.3 billion in 2014.
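The growth rates quoted throughout these forecasts follow the standard compound-annual-growth-rate formula, CAGR = (end/start)^(1/years) − 1. As a quick sketch, the Industrial Internet figures cited above ($2.3 billion in 2014 growing to nearly $7.3 billion in 2019) are consistent with the reported ~25.7% CAGR; the small gap in the check below comes from the endpoints being rounded:

```python
# Sanity check of the CAGR arithmetic cited in the text.
def cagr(start, end, years):
    """Compound annual growth rate: (end/start)**(1/years) - 1."""
    return (end / start) ** (1 / years) - 1

# Industrial Internet IoT semiconductors: $2.3B (2014) -> ~$7.3B (2019), 5 years
print(f"{cagr(2.3, 7.3, 5):.1%}")  # ~26.0%, matching the reported 25.7% after rounding
```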