Category Archives: Semiconductors

At the International Reliability Physics Symposium (IRPS), being held April 14-18, 2013 at the Hyatt Regency Monterey Resort & Spa in Monterey, CA, imec will present new research focused on the stress induced breakdown between the tungsten trench local interconnects (M1, M2) and metal gate in a 28nm CMOS technology. Imec’s Thomas Kauerauf will present a paper titled “Reliability of MOL local interconnects.”

The researchers found that the breakdown voltage shows strong polarity dependence (see figure).

The breakdown voltage reveals significant polarity dependence. Shown here are the VBD data with bias applied at M1 or at the gate.

“This has profound implications for estimating the end-of-life, specifically in bipolar applications,” explained Giuseppe Larosa, IRPS Technical Program Chair. “Here, bipolar means the voltage between the gate and the drain can basically change polarity. That really depends on the situation where the gate can be high and the drain can be ground, or the gate can be ground and the drain can be high. When you’re switching, you are in that situation. The total end of life can be a mix between these two situations.”

The imec authors note that, due to overlay errors, the spacing between the gate and the contact varies, resulting in large VBD and tBD variability across the wafer. They have developed a methodology for an intrinsic TDDB lifetime extrapolation with uniformity correction.
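Intrinsic TDDB lifetime extrapolation of this kind typically involves fitting the measured breakdown times to a Weibull distribution and projecting down to a low failure fraction. The sketch below illustrates the general idea with synthetic data; the numbers and the simple median-rank fit are illustrative assumptions, not imec's methodology or its uniformity correction.

```python
import math

# Synthetic breakdown times in hours -- illustrative data only.
t_bd = sorted([12.0, 18.0, 25.0, 31.0, 40.0, 55.0, 63.0, 80.0, 95.0, 130.0])
n = len(t_bd)

# Linearize the Weibull CDF: ln(-ln(1-F)) = beta*ln(t) - beta*ln(eta),
# estimating the failure fraction F of each ordered sample by median ranks.
xs, ys = [], []
for i, t in enumerate(t_bd, start=1):
    f = (i - 0.3) / (n + 0.4)          # Bernard's median-rank approximation
    xs.append(math.log(t))
    ys.append(math.log(-math.log(1.0 - f)))

# Ordinary least squares for the slope (Weibull shape beta) and scale eta.
mx = sum(xs) / n
my = sum(ys) / n
beta = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
       sum((x - mx) ** 2 for x in xs)
eta = math.exp(mx - my / beta)          # characteristic life (63.2% point)

# Extrapolate to a low failure fraction, e.g. 100 ppm.
t_100ppm = eta * (-math.log(1.0 - 1e-4)) ** (1.0 / beta)
print(f"beta={beta:.2f}, eta={eta:.1f} h, t(100 ppm)={t_100ppm:.3f} h")
```

A real extrapolation would also scale the result from stress voltage to use voltage and, per the imec paper, correct for the gate-to-contact spacing variation before fitting.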

New finFETs feature high-k dielectrics, which are better than conventional silicon nitride dielectrics in that they can be thinner, yet still enable good control of the transistor’s channel region from the gate. Although high-k materials have been in use for five years or more, new reliability concerns associated with their use in finFETs have arisen, particularly bias temperature instability (BTI) and time dependent dielectric breakdown (TDDB).   

"There are two aspects of high k dielectrics that people have to face," said Giuseppe Larosa, IRPS Technical Program Chair. "BTI is again a concern with continued scaling. The second new effect of using high-k oxides is the TDDB physics is really completely different than nitride oxides. There’s been a lot of controversy on how to describe the TDDB for high-k in recent years.”

At the International Reliability Physics Symposium (IRPS), being held April 14-18, 2013 at the Hyatt Regency Monterey Resort & Spa in Monterey, CA, GLOBALFOUNDRIES will present the first large-scale stochastic BTI (particularly PBTI) study in metal gate/high-k technology, confirming fundamental BTI area scaling trends derived from conventional SiO2 technologies, and IBM will report on TDDB in high-k dielectrics and on new findings that will lead to more accurate models.

“Contrary to nitride oxides, high-k brings a higher sensitivity of the NFET devices to PBTI. This is mostly due to the fact that the high-k material can be sensitive to electron trap activation or generation, thus producing PBTI effects that you will not see in standard nitride oxide technologies,” Larosa said. “While in nitride oxide, NBTI is the main BTI mechanism that drives PFET aging, in high-k materials it’s both NBTI for PMOS and PBTI for NMOS that is actually producing some BTI aging.”

In a paper titled “Challenges in the characterization and modeling of BTI induced variability in Metal Gate / High-k CMOS technologies,” GLOBALFOUNDRIES researchers show that PBTI in the NFET is similar to NBTI in the pFET, but with a different type of distribution (see figure).

“The actual distribution shows a slightly different trend than the NBTI distribution,” Larosa said. “But everything can be normalized and scaled exactly the same way.”

In another paper, titled “A New Formulation of Breakdown Model for High-k/SiO2 Bilayer Dielectrics,” researchers from IBM show that breakdown can happen gradually.

“Gate leakage is actually starting to progressively increase until you’re going to get a hard breakdown,” Larosa said. “The distribution in this case is really bimodal. You have completely different behavior between when the first breakdown takes place versus when the oxide breakdown is actually evolving into the hard breakdown. The challenge here has been how to simulate this. This work suggests that it can be done with a Monte Carlo simulation based on a dual-layer percolation statistical model. Why is this important? Because without this model you cannot be confident in predicting end of life, and having this type of simulation can help in making projections that are relevant for product-level or circuit-level reliability.”
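The dual-layer percolation idea can be illustrated with a toy Monte Carlo sketch. All parameters and the simple cell geometry here are assumptions for illustration, not the IBM model: defects accumulate randomly in an interfacial-layer/high-k stack, a first (soft) breakdown occurs when a defect path crosses the thin interfacial layer, and a hard breakdown when a path crosses the full stack. The gap between the two event times produces the bimodal behavior Larosa describes.

```python
import random
import statistics

def simulate_tddb(n_cols=10, n_il=2, n_hk=4, p_defect=0.002,
                  t_max=20000, rng=None):
    """Toy dual-layer percolation Monte Carlo (illustrative only).

    The gate stack is modeled as n_cols independent columns, each with
    n_il interfacial-layer cells and n_hk high-k cells.  At every time
    step each intact cell becomes defective with probability p_defect.
    A first (soft) breakdown occurs when any column's interfacial-layer
    cells are all defective; a hard breakdown occurs when some column
    is defective through the full stack.
    """
    rng = rng or random.Random()
    il = [[False] * n_il for _ in range(n_cols)]
    hk = [[False] * n_hk for _ in range(n_cols)]
    t_sbd = None
    for t in range(1, t_max + 1):
        for col in range(n_cols):
            for layer in (il[col], hk[col]):
                for i, defective in enumerate(layer):
                    if not defective and rng.random() < p_defect:
                        layer[i] = True
            if t_sbd is None and all(il[col]):
                t_sbd = t                  # soft breakdown
            if all(il[col]) and all(hk[col]):
                return t_sbd, t            # hard breakdown ends the run
    return t_sbd, None

rng = random.Random(42)
runs = [simulate_tddb(rng=rng) for _ in range(60)]
sbd = [s for s, h in runs if s is not None]
hbd = [h for s, h in runs if h is not None]
print("median t_SBD:", statistics.median(sbd),
      "median t_HBD:", statistics.median(hbd))
```

Collected over many trials, the soft-breakdown times cluster well before the hard-breakdown times, which is the two-population behavior a single-layer percolation model cannot reproduce.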

The first large-scale stochastic BTI study in metal-gate/high-k transistors shows that PBTI in the NFET is similar to NBTI in the pFET, but with a different type of distribution.

It’s well known that transistors generate heat when they operate, and that heat can have a significant impact on a chip’s long-term reliability. A small increase of 10°C–15°C in the junction temperature may result in a ~2× reduction in the lifespan of the device.
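That rule of thumb follows from Arrhenius-type temperature acceleration. A minimal sketch is below; the 0.7 eV activation energy is a commonly assumed value for illustration, not a figure from the article.

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_af(t_low_c, t_high_c, ea_ev=0.7):
    """Lifetime acceleration factor between two junction temperatures.

    ea_ev is an assumed activation energy; 0.7 eV is a common
    rule-of-thumb value for silicon failure mechanisms.
    """
    t1 = t_low_c + 273.15   # convert degC to kelvin
    t2 = t_high_c + 273.15
    return math.exp((ea_ev / K_B_EV) * (1.0 / t1 - 1.0 / t2))

# A 15 degC rise at typical junction temperatures roughly halves lifetime.
af = arrhenius_af(85, 100)
print(f"acceleration factor: {af:.2f}x")
```

With these assumed numbers the factor comes out near 2, consistent with the ~2× lifespan reduction per 10°C–15°C cited above.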

In conventional bulk transistors, self-heating is manageable because the heat can dissipate down into the bulk of the device. In newer finFETs, however, it could pose a serious problem because there’s nowhere for the heat to go.

“In finFETs, because it’s a three-dimensional structure, this self-heating is a bottleneck in scaling down,” said Giuseppe Larosa, Technical Program Chair of the IRPS. “Self-heating becomes a key issue.” The International Reliability Physics Symposium (IRPS) is being held April 14-18, 2013 at the Hyatt Regency Monterey Resort & Spa in Monterey, CA.

At IRPS, Intel will present new research in a paper titled “Self-heat Reliability Considerations on Intel’s 22nm Tri-Gate Technology.” This work elaborates on various measurements to observe self-heating, as well as the associated reliability implications, not only in the transistors but also in the overlying metal lines. “The self-heating of the finFET can locally increase the temperature in the metal wires above, enhancing electromigration effects,” Larosa said.

Self-heating effects are investigated at various locations in the tri-gate architecture and in long metal lines with multiple finFETs underneath. FinFET local temperature rise was linear with power and independent of gate stack, as predicted by thermal modeling. A linear trend in the local temperature rise of metal lines is observed as a function of the number of powered finFET segments.

Intel emphasized the importance of well-calibrated self-heat models for process optimization for cutting-edge performance and reliability. The company showed that aging during switching events is affected by local self-heat and shows sensitivity to the number of fins or gate lines (see figure).

“To calibrate the self-heating, you have to make sure that you have a good understanding of the local temperature in the structure that is under investigation,” Larosa said. “The figure shows self-heating at the device level is affecting aging of a given FET. It’s a function of the number of fins and the number of active lines per transistor.”

Self-heat manifests as a sensitivity to the fin or gate count in switching aging degradation. Here, switching conditions are accelerated to enhance the sensitivity.


FinFETs offer several advantages compared to traditional planar transistors, but it’s not yet clear what kind of new reliability problems might arise as FinFETs are scaled to smaller dimensions. One concern is what impact bias temperature instability (BTI), particularly negative BTI (NBTI), might have on FinFETs.

In p-channel transistors, NBTI affects small feature size devices and is quite difficult to reduce or eliminate. As feature sizes become smaller, the effects of NBTI become more pronounced, resulting in increases in threshold voltage and decreases in drain current and transconductance in p-channel devices. NBTI results from a positive charge buildup in p-channel transistors. It occurs at low negative gate-to-source voltages and does not result in an increase in gate leakage current. Rather, it affects off-state drain-to-source leakage and reduces the drive current. Generally, this problem is worse than standard hot carrier degradation because it results in permanent interface traps being generated, reducing device lifetime.
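The threshold-voltage drift caused by NBTI is commonly fitted with an empirical power law in stress time. A minimal sketch follows; the prefactor and the time exponent are assumed placeholder values for illustration, not measured data from any of the papers discussed here.

```python
def nbti_vth_shift(t_seconds, a=5e-3, n=0.16):
    """Empirical power-law NBTI degradation: delta_Vth = A * t^n.

    a is the shift in volts at t = 1 s and n is the time exponent;
    both are illustrative assumptions, not measured values.
    """
    return a * t_seconds ** n

# Threshold shift grows slowly but steadily with stress time.
one_hour = nbti_vth_shift(3600.0)
ten_years = nbti_vth_shift(10 * 365 * 24 * 3600.0)
print(f"dVth after 1 h: {one_hour:.3f} V, after 10 y: {ten_years:.3f} V")
```

The small exponent is why NBTI is an end-of-life concern: the shift keeps accumulating over a product's whole service life rather than saturating early.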

At the International Reliability Physics Symposium (IRPS), being held April 14-18, 2013 at the Hyatt Regency Monterey Resort & Spa in Monterey, CA, Intel will present a paper titled “Intrinsic Transistor Reliability Improvements from 22nm Tri-Gate Technology” that shows that FinFETs have a similar total BTI budget at application condition but, while NMOS PBTI can be reduced to near-negligible levels, NBTI sensitivity seems to increase with scaling.

Giuseppe Larosa, IRPS Technical Program Chair, said that “Intel’s data suggest that when you look at the total BTI budget, even in the finFET world, going from 32nm seems to be pretty much okay, but NBTI seems to be an issue because it’s increasing with finFET scaling.” What Intel did was compare a 32nm planar technology to its 22nm finFET technology. The figure shows the 32nm technology in red and the 22nm finFET technology in blue. “You can see that they can manage to really reduce the PBTI, but the NBTI is actually getting worse with scaling,” Larosa said. “NBTI is becoming one of the key challenging issues for finFETs.”

22nm BTI is comparable to 32nm. NMOS PBTI is significantly improved due to gate optimization and work function (WF) scaling.

Anapass, Inc., a display SoC solution provider listed on the KOSDAQ, today announced that it has entered into strategic collaboration and investment agreements with GCT Semiconductor, Inc., a designer and supplier of advanced 4G mobile semiconductor solutions, to develop and commercialize mobile application processors (APs) for use in smartphones. Anapass will collaborate with GCT to develop a next-generation mobile application processor that is paired with GCT’s leading-edge 4G RF/modem SoC solution. As a result, Anapass and GCT will provide a total solution platform incorporating 4G LTE, RF, modem and AP functionality to the explosively growing smartphone market.

Through this strategic collaboration and investment agreement, Anapass said it is securing technology and engineering resources related to the 4G LTE RF/modem platform from GCT, which is necessary for Anapass to develop a competitive mobile application processor for 4G smartphones. Anapass is also getting access to GCT’s broad 4G ecosystem including the world leading wireless operators and OEM/ODMs with which GCT has been establishing close relationships for years. As part of the agreement, Anapass is making a $30M strategic investment while seeking technology, business and strategic benefit to aid and support its mobile application processor strategy.

Anapass is a display SoC solution provider that developed and commercialized its proprietary intra-panel interface technology for flat panel TV displays known as AiPi (Advanced Intra Panel Interface). Anapass has also been a panel controller supplier for Samsung’s flat panel TV display business and has been listed on the KOSDAQ since 2010. The company has been seeking new opportunities offering the best products and business models, so Anapass said in its official press release that it intends to enter the fast-growing smartphone market in order to achieve diversification and expansion of products, customers and markets.

GCT Semiconductor is a 4G RF/modem SoC solution provider that commercialized the world’s first single-chip LTE solution based upon collaboration with LG Electronics. The LTE solution has been adopted by wireless operators including Verizon, Sprint, Metro PCS, Vodafone, Yota, YTL, SK Telecom and LG Uplus. According to a recent market research report by Forward Concepts, GCT has been ranked third in the 2012 market share of FDD-LTE baseband shipments, behind Qualcomm and Samsung. GCT recently announced that it entered into a 3G/2G IP licensing agreement with LG Electronics to support backward compatibility for 4G smartphones, and is currently developing a multi-mode 4G/3G/2G RF/modem SoC solution.

In addition, Anapass says it will leverage its technical know-how and experience from the successful commercialization of its panel controller products for flat panel TV displays in developing a competitive mobile application processor. Anapass says it is expecting that this agreement will allow the companies to introduce a competitive total 4G solution including 4G, RF, modem and application processor to the explosively growing worldwide mid-range smartphone market and to succeed in its plans for diversification and expansion of products and customers.

The same material that formed the first primitive transistors more than 60 years ago can be modified in a new way to advance future electronics, according to a new study.

Chemists at Ohio State University have developed the technology for making a one-atom-thick sheet of germanium, and found that it conducts electrons more than ten times faster than silicon and five times faster than conventional germanium.

The material’s structure is closely related to that of graphene—a much-touted two-dimensional material comprised of single layers of carbon atoms. As such, graphene shows unique properties compared to its more common multilayered counterpart, graphite.  Graphene has yet to be used commercially, but experts have suggested that it could one day form faster computer chips, and maybe even function as a superconductor, so many labs are working to develop it.

“Most people think of graphene as the electronic material of the future,” said Ohio State chemistry researcher Joshua Goldberger, who led the study. “But silicon and germanium are still the materials of the present. Sixty years’ worth of brainpower has gone into developing techniques to make chips out of them. So we’ve been searching for unique forms of silicon and germanium with advantageous properties, to get the benefits of a new material but with less cost and using existing technology.”

In a paper published online in the journal ACS Nano, he and his colleagues describe how they were able to create a stable, single layer of germanium atoms. In this form, the crystalline material is called germanane.

Researchers have tried to create germanane before. This is the first time anyone has succeeded at growing sufficient quantities of it to measure the material’s properties in detail, and demonstrate that it is stable when exposed to air and water.

In nature, germanium tends to form multilayered crystals in which each atomic layer is bonded together; the single-atom layer is normally unstable. To get around this problem, Goldberger’s team created multi-layered germanium crystals with calcium atoms wedged between the layers. Then they dissolved away the calcium with water, and plugged the empty chemical bonds that were left behind with hydrogen. The result: they were able to peel off individual layers of germanane.

Studded with hydrogen atoms, germanane is even more chemically stable than traditional silicon. It won’t oxidize in air and water, as silicon does. That makes germanane easy to work with using conventional chip manufacturing techniques.

The primary thing that makes germanane desirable for optoelectronics is that it has what scientists call a “direct band gap,” meaning that light is easily absorbed or emitted. Materials such as conventional silicon and germanium have indirect band gaps, meaning that it is much more difficult for the material to absorb or emit light.

“When you try to use a material with an indirect band gap on a solar cell, you have to make it pretty thick if you want enough energy to pass through it to be useful. A material with a direct band gap can do the same job with a piece of material 100 times thinner,” Goldberger said.

The first-ever transistors were crafted from germanium in the late 1940s, and they were about the size of a thumbnail. Though transistors have grown microscopic since then—with millions of them packed into every computer chip—germanium still holds potential to advance electronics, the study showed.

According to the researchers’ calculations, electrons can move through germanane ten times faster than through silicon, and five times faster than through conventional germanium. The speed measurement is called electron mobility.

With its high mobility, germanane could thus carry the increased load in future high-powered computer chips.

“Mobility is important, because faster computer chips can only be made with faster mobility materials,” Goldberger said. “When you shrink transistors down to small scales, you need to use higher mobility materials or the transistors will just not work.”

Next, the team is going to explore how to tune the properties of germanane by changing the configuration of the atoms in the single layer.

Lead author of the paper was Ohio State undergraduate chemistry student Elizabeth Bianco, who recently won the first place award for this research at the nationwide nanotechnology competition NDConnect, hosted by the University of Notre Dame. Other co-authors included Sheneve Butler and Shishi Jiang of the Department of Chemistry and Biochemistry, and Oscar Restrepo and Wolfgang Windl of the Department of Materials Science and Engineering.

The research was supported in part by an allocation of computing time from the Ohio Supercomputing Center, with instrumentation provided by the Analytical Surface Facility in the Department of Chemistry and Biochemistry and the Ohio State University Undergraduate Instrumental Analysis Program. Funding was provided by the National Science Foundation, the Army Research Office, the Center for Emergent Materials at Ohio State, and the university’s Materials Research Seed Grant Program.

SPIE leaders said they were encouraged to see proposed increases in funds for scientific research and development and a greater emphasis on STEM education in President Obama’s 2014 budget proposal released last Wednesday. At the same time, they stressed the importance of making applied research high priority, and expressed concerns about some funding levels.

The White House proposal includes an 8.4 percent increase over the 2012 enacted level for the National Science Foundation (NSF), which would raise its funding to $7.6 billion annually. The budget for the Department of Energy’s Office of Science would increase by 5.7 percent, to $5 billion.

All told, the President’s 2014 budget proposes $143 billion for federal research and development, providing a 1 percent increase over 2012 levels for all R&D, and an increase of 9 percent for non-defense R&D.

“While the budget continues this Administration’s unflinching support for science and recognition of the importance of photonics to our future economy and health, I have some concerns,” said Eugene Arthurs, CEO of SPIE, the international society for optics and photonics. “In these times of constraint, it is very encouraging to see proposed increases for NSF, DOE science, and NIST (National Institute of Standards and Technology), and the investment in the NOAA (National Oceanic and Atmospheric Administration) earth observations program is overdue. But it is disturbing to see both NASA and NIH R&D budgets reduced, in real terms.”

Arthurs said that the decrease for NIH is particularly troubling because health issues are changing with demographics and risks are expanding with global disease mobility. He cited recognition by NIH director Francis Collins of the potential for imaging coupled with the power and possible economies from more use of data tools as ways to address those challenges.

A strong proposal, Arthurs said, is the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) initiative announced by the President. The initiative would be launched with approximately $100 million in funding for research supported by the NIH, Defense Advanced Research Projects Agency (DARPA), and NSF.

“The decrease in real terms, compared with 2012 budgets, for defense basic and applied research and advanced technology development is worrying,” Arthurs said. “We need to better understand the deep cuts in defense development when this is where our security has come from and also where for decades there has been much spillover into our tech industry.”

To remain competitive in the global economy, the nation would benefit from even stronger support of applied research, Arthurs said.

“Canada and the European Union are among regions that have established policies focusing priority on applied research, and for good reason,” he said. “Applied research is concerned with creating real value through solving specific problems ― creating new energy sources, finding new cures for disease, and strengthening the security and stability of communication systems. Its metrics are improvements in the functioning of society as a whole and in the quality of individual human lives, not those of laboratory animals, and in patents and new inventions that spark economic growth, not just journal citations.”

That focus on applications is reflected in work being done by the National Photonics Initiative (NPI) committee to raise awareness of the positive force of photonics on the economy and encourage policy that promotes its development. Born out of the National Academies report issued last year on “Optics and Photonics, Essential Technologies for Our Nation,” the NPI is being driven by five scientific societies: SPIE, the international society for optics and photonics; OSA; LIA; IEEE Photonics Society; and APS.

The President’s budget proposal also moves 90 STEM programs across 11 different agencies under the jurisdiction of the Department of Education. This "reorganization" aims to "improve the delivery, impact, and visibility of STEM efforts," the budget document said.

IBM announced plans on Thursday to invest $1 billion in flash memory research and development and launch a series of systems that will use solid state drives.


At an event in New York, Steve Mills, head of IBM’s software and systems division, said flash is at a key tipping point and IT will see all-solid-state data centers sooner rather than later.

Corporate servers have struggled to keep up with the substantial growth in data use from smartphones and tablets. IBM believes there is a solution in flash memory, which is faster, more reliable, and uses less power than a traditional hard disk drive. The $1 billion investment will be put to use in research and development to design, create and integrate new flash-based products in its expanding portfolio of servers, storage systems and middleware.

"The economics and performance of flash are at a point where the technology can have a revolutionary impact on enterprises, especially for transaction-intensive applications," said Ambuj Goyal, IBM’s general manager of systems storage. "The confluence of Big Data, social, mobile and cloud technologies is creating an environment in the enterprise that demands faster, more efficient access to business insights, and flash can provide that access quickly."

IBM also announced the availability of the FlashSystem line of all-flash storage appliances. Sprint Nextel will be installing nine of these storage systems at its data center, becoming one of the first companies to adopt IBM’s flash-based model.

As part of its commitment to flash development, IBM said it plans to open 12 Centers of Competency around the globe, which will allow customers to run proof-of-concept scenarios with real-world data to measure the projected performance gains that can be achieved with IBM flash products.

"Clients will see first-hand how IBM flash solutions can provide real-time decision support for operational information, and help improve the performance of mission-critical workloads, such as credit card processing, stock exchange transactions, manufacturing and order processing systems," IBM said in a news release.

Once a white-hot PC product that sold in the tens of millions of units annually, netbook computers are now marking their final days, with the rise of tablets causing their shipments to wind down to virtually zero after next year, according to an IHS iSuppli Compute Electronics Market Tracker Report from information and analytics provider IHS.

Shipments of netbooks this year are forecast to amount to just 3.97 million units, a 72 percent plunge from the 14.13 million units shipped in 2012. The market for the small, inexpensive laptops had steadily climbed for three years from the time the devices were first introduced in 2007, peaking in 2010 when shipments hit a high of 32.14 million units. Since then, however, the netbook space has imploded and gone into decline—fast.

Next year will be the last hurrah for netbooks on the market, with shipments amounting to a mere 264,000 units. By 2015, netbook shipments will be down to zero, as shown in the attached figure.

“Netbooks shot to popularity immediately after launch because they were optimized for low cost, delivering what many consumers believed as acceptable computer performance,” said Craig Stice, senior principal analyst for compute platforms at IHS. “Initially intended for light productivity tasks such as web browsing and email, netbooks eventually became more powerful, taking advantage of a mature PC technology that allowed cost-effective implementation of various functionalities. And though never equaling the performance of full-fledged notebooks and lacking full laptop features like an optical drive, netbooks at one point began taking market share away from their more powerful cousins. However, netbooks began their descent to oblivion with the introduction in 2010 of Apple’s iPad.”

The following year, netbook shipments dived 34 percent on what would become a trend of irreversible decline.

“The iPad and other tablets came in a new form factor that excited consumers while also offering improved computing capabilities, leading to a massive loss of interest in netbooks,” Stice said.

At the other end of the spectrum, high-end laptops were also making their appearance. Although much more costly than netbooks, they offered premium performance. Squeezed in between, netbooks could claim only pricing as their strong point, losing out in other benchmarks that consumers deemed important, including computing power, ease of use such as touch-screen capability, and overall appeal.

On the supply side, the major original equipment manufacturers of notebooks will have already terminated netbook production at this point. Whatever production is left is expected to be limited, or manufacturers will simply be shipping last-time builds to satisfy contractual obligations to customers.

Mobile PCs also get hit by media tablets

Mobile PCs retained the largest share of the overall PC market in the fourth quarter last year—the latest time for which full figures are available—compared to desktop PCs and entry-level servers. Mobile PCs had about 63 percent share, compared to 34 percent for desktops and 3 percent for entry-level servers.

Nonetheless, mobile PCs continued to be sideswiped by the ongoing popularity of tablets, and new Ultrabooks and similar ultrathin PCs have yet to take off to the extent hoped for by manufacturers.

Among the computer brands, Hewlett-Packard was No. 1 during the fourth quarter with a nearly 18 percent share of total PC shipments. China’s Lenovo was second, followed by Dell in third place, Acer in fourth, and Asus—which introduced the first netbook in 2007—in fifth.

Landing in sixth place was Toshiba, which climbed one spot from the third quarter, sending Apple one rung down to seventh. Apple struggled during the last quarter of 2012 because of constraints related to panel supply for the company’s new iMac desktop system, which kept Apple PC shipments down.

In eighth place was Samsung, trailing Apple by a tenth of a percentage point, followed by Sony and Fujitsu rounding out the Top 10.

Imagine if you could track and trace connected goods, assets and people in real-time, anywhere, at any time with high accuracy. BlinkSight and imec recently made this future a reality, with the launch of the first ever single-chip indoor GPS solution.

BlinkSight, a fabless semiconductor company in real-time location systems (RTLS), released the first single-chip “indoor GPS” solution for RTLS and wireless sensor network (WSN) applications. Based on ultra-low power impulse radio technology by imec and Holst Centre, the new chip delivers real-time information to track and trace people and objects in indoor environments. Its unique combination of high accuracy, long range and low power consumption is ideal for both business and consumer applications.

“The real-time location business is emerging and the global market for connected devices is growing at tremendous speed,” said BlinkSight CMO Guus Frericks. “Adding highly accurate indoor capabilities to connected devices such as smartphones paves the way for a broad range of game changing consumer applications across ‘Internet of Things’ segments like smart homes, offices and retail.”

BlinkSight’s innovative system solution uses the ultra-low power “impulse-radio” (IR) technology developed by imec and Holst Centre, which enables real-time 3D location information that is accurate to within 10cm. The device combines digital processing elements and sophisticated analog radio functionality in a single chip, enabling superior performance at a low cost of ownership. It can operate in both the 3.1-4.8GHz and 6-10GHz bands for use around the world and seamless co-existence with other wireless technologies.

Moreover, with an operating voltage range of 1.5 to 3.6V, the new device is ideal for battery-powered applications. Together with its small form factor and low power consumption, this makes it suitable for integration into tags, wireless sensors, base stations and mobile devices. A base station equipped with BlinkSight’s technology could track and trace thousands of fast moving tags in real time. In addition the tags are interactive and capable of sending dynamic data (e.g. temperature).

“Our solution is easy to install and thanks to the collaboration with imec we’ve been able to bring a working solution to market very fast,” said BlinkSight CEO and founder Stéphane Mutz. “A lot of effort went into minimizing power consumption, and we expect to have tags powered by energy harvesting available soon. We aim to bring a complete turnkey system to market and want to work with industry leaders to bring accurate indoor GPS capabilities to connected devices.”

“Imec was a pioneer of impulse radio and the first to demonstrate an integrated impulse radio prototype. We are thrilled that BlinkSight is now successfully bringing the technology to market,” said Harmke de Groot, Program Director Ultra-Low Power Wireless and DSP at Holst Centre/imec.

Fabricated in standard 90nm RF-CMOS, the chip is manufactured at TSMC in Taiwan. It features a single-chip impulse-radio transceiver optimized for indoor GPS applications; a single 1.5 to 3.6V power supply; an embedded software-programmable ultra-low-power 128-bit vector DSP; a range greater than 60m line of sight and greater than 20m non-line of sight; 3D positioning accuracy better than 10cm; and more than five years of operation from a standard coin battery.

BlinkSight is a fabless semiconductor company specializing in the design of integrated circuits and turnkey solutions for Real Time Location Systems. Founded in 2011, this privately held company is based in Caen, France and has offices in Eindhoven, the Netherlands.