Tokyo Electron Limited (TSE: 8035) and FSI International, Inc. (NASDAQ: FSII) have entered into a definitive agreement under which TEL will acquire FSI for $6.20 per share in cash, or an aggregate equity purchase price of approximately $252.5 million. This marks further consolidation in the wafer cleaning/surface preparation equipment market, following Applied Materials’ acquisition of Semitool in 2009 and Lam Research’s acquisition of SEZ in 2007.

FSI, a provider of cleaning and surface preparation equipment, has process capabilities complementary to TEL’s. Surface preparation has increasingly become a critical technology in semiconductor manufacturing, and TEL is focused on improving its market position.

Hiroshi Takenaka, President and CEO of TEL, commented: “FSI has a long history as a technology innovator in surface preparation. They have repeatedly developed creative solutions to key challenges in semiconductor manufacturing. I’m convinced that the acquisition will expand TEL’s business by strengthening our ability to provide effective solutions for the full range of current and future customer applications, thereby increasing value to our shareholders.”

Donald Mitchell, Chairman and CEO of FSI, added: "This transaction represents a compelling opportunity for FSI shareholders, employees and customers. By combining the market position, scale and operational excellence of Tokyo Electron with the leading edge surface preparation solutions from FSI, we can ensure that semiconductor manufacturers have access to the advanced technology they need for success at 28 nanometers and below. We are pleased to become part of Tokyo Electron, a premier company in the semiconductor production equipment industry.”

The purchase price represents a premium of 53.5% to the closing price of FSI’s common shares on August 10, 2012. The acquisition, which will be completed pursuant to a cash tender offer followed by a second step merger, has been unanimously approved by the boards of directors of TEL and FSI. The board of directors of FSI unanimously recommends that FSI’s shareholders tender their shares into the tender offer. The transaction is expected to close in calendar year 2012.

Under the terms of the definitive merger agreement between TEL and FSI, TEL, through an indirect wholly-owned subsidiary, will commence a cash tender offer to purchase all of the outstanding shares of FSI’s common stock for $6.20 per share. The closing of the tender offer is subject to customary terms and conditions, including the tender of a number of shares that constitutes at least a majority of FSI’s outstanding shares of common stock, on a fully diluted basis, and receipt of required regulatory approvals, including expiration or termination of the waiting period under the Hart-Scott-Rodino Antitrust Improvements Act. The agreement also provides for the parties to effect, subject to customary conditions, a merger following the completion of the tender offer that would result in all shares not tendered in the tender offer being converted into the right to receive $6.20 per share in cash. TEL will finance the acquisition from its existing cash resources.

Goldman Sachs is serving as exclusive financial adviser to Tokyo Electron in connection with the acquisition, and Jones Day is its legal adviser. Barclays is serving as exclusive financial adviser to FSI in connection with the acquisition, and Faegre Baker Daniels LLP is its legal adviser.

August 13, 2012 — Researchers from IBM (NYSE:IBM) and scientists at ETH Zurich, a leading European university, revealed the first-ever direct mapping of the formation of a persistent spin helix in a semiconductor. This discovery answers the question of whether electron spins can preserve encoded information long enough before rotating.

Scientists from IBM Research and the Solid State Physics Laboratory at ETH Zurich report in Nature Physics that synchronizing electrons extends the spin lifetime of the electron by 30 times to 1.1 nanoseconds — the same time it takes for an existing 1 GHz processor to cycle.
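
As a quick arithmetic check on that comparison (ours, not the paper's): the clock period of a processor is the reciprocal of its frequency,

$$T = \frac{1}{f} = \frac{1}{1\ \mathrm{GHz}} = 1\ \mathrm{ns},$$

so a spin lifetime of 1.1 ns just covers one full cycle of a 1 GHz processor.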

Also read: IBM, ETH Zurich nanotechnology research center opens — custom-built for nanoscience research

Today’s computing technology encodes and processes data by the electrical charge of electrons. However, this technique is limited as the semiconductor dimensions continue to shrink to the point where the flow of electrons can no longer be controlled. Spintronics could surmount this approaching impasse by harnessing the spin of electrons instead of their charge.

This new understanding in spintronics not only gives scientists unprecedented control over the magnetic movements inside devices but also opens new possibilities for creating more energy-efficient electronics.

In this photo: IBM scientists Matthias Walser (left) and Gian Salis, who published the finding with C. Reichl and W. Wegscheider of ETH Zurich in the 12 August 2012 online edition of Nature Physics.

The scientists observed a previously unknown aspect of physics — how electron spins move tens of micrometers in a semiconductor with their orientations synchronously rotating along the path, similar to a couple dancing the waltz.

Dr. Gian Salis of the Physics of Nanoscale Systems research group at IBM Research – Zurich explains the metaphor: "If all couples start with the women facing north, after a while the rotating pairs are oriented in different directions. We can now lock the rotation speed of the dancers to the direction they move. This results in a perfect choreography where all the women in a certain area face the same direction. This control and ability to manipulate and observe the spin is an important step in the development of spin-based transistors that are electrically programmable."

IBM scientists used ultrashort laser pulses to monitor the evolution of thousands of electron spins that were created simultaneously in a very small spot. Typically, such spins would rotate randomly and quickly lose their orientation; here, for the first time, the scientists could observe how the spins arrange neatly into a regular stripe-like pattern, the so-called persistent spin helix.

The concept of locking the spin rotation was proposed in theory back in 2003, and some experiments have since found indications of such locking, but until now it had never been directly observed.

IBM scientists imaged the synchronous ‘waltz’ of the electron spins using a time-resolved scanning microscope technique. The synchronization of the electron spin rotation made it possible to observe the spins travel for more than 10 micrometers, or one-hundredth of a millimeter, raising the possibility of using spin to process logical operations both quickly and energy-efficiently.

The reason for the synchronous spin motion is a carefully engineered spin-orbit interaction, a physical mechanism that couples the spin with the motion of the electron. The semiconductor material, gallium arsenide (GaAs), was produced by scientists at ETH Zurich, who are world experts in growing ultra-clean and atomically precise semiconductor structures. GaAs is a III/V semiconductor commonly used in the manufacture of devices such as integrated circuits, infrared light-emitting diodes and highly efficient solar cells.

Transferring spin electronics from the laboratory to the market remains a major challenge. Spintronics research takes place at very low temperatures, at which electron spins interact minimally with the environment. In this particular research, IBM scientists worked at 40 Kelvin (-233°C, -387°F).

This work was financially supported by the Swiss National Science Foundation through National Center of Competence in Research (NCCR) Nanoscale Sciences and NCCR Quantum Science and Technology.

The scientific paper entitled "Direct mapping of the formation of a persistent spin helix" by M.P. Walser, C. Reichl, W. Wegscheider and G. Salis was published online in Nature Physics, DOI 10.1038/NPHYS2383 (12 August 2012).

August 10, 2012 — Propelled by strong sales to tablet and cellphone manufacturers, Asahi Kasei Microsystems (AKM) led the semiconductor magnetic sensor market for the third year in a row in 2011, claiming almost one-quarter of the industry’s total revenue of $1.5 billion, reports IHS.

IHS also compiled a report on magnetic sensor growth by application space. Learn about the markets and trends in the report, Wireless consumer devices reenergize magnetic sensor IC sector.

Together the top 10 suppliers of magnetic sensor ICs enjoyed combined revenue amounting to $1.3 billion, equivalent to 90% of the total magnetic sensor space.

Top 10 Suppliers of Magnetic Sensors by Revenue ($M)

Supplier                          2011     2010     Y/Y Growth
Asahi Kasei Microsystems           372      300         24%
Allegro MicroSystems               302      264         14%
Infineon                           188      142         33%
Micronas                           150      143          5%
Melexis                            112      107          5%
NXP                                 96       92          4%
Aichi Steel                         40       18        122%
AMS                                 29       21         36%
MEMSIC                              29        2      1,340%
Diodes                              24       22          9%
Top 10 Total                     1,342    1,111         21%
Magnetic Sensor Industry Total   1,499    1,225         22%

Source: IHS iSuppli

AKM posted an estimated $372 million in revenue last year, up a solid 24% from $300 million in 2010, according to an IHS iSuppli Magnetic Sensor report. AKM’s share last year equated to approximately 25% of the total magnetic sensor market, allowing the firm to hold on to the top position it first reached in 2009.

“AKM is the undisputed leading provider of Hall elements and ICs for magnetic sensors, which track position, contact, rotational speed and linear angles in machines and devices, or detect and process magnetic fields to establish location,” said Richard Dixon, Ph.D., senior analyst for MEMS & sensors at IHS. “A large part of the company’s revenue growth last year was from Hall-based 3-D magnetic compasses used in cellphones and tablets.”

Rivalry in the compass space may be heating up, however, in the form of 6-axis compasses. In the past, such compasses were bulky and represented no serious competition to 3-axis discrete devices. However, Bosch and Freescale Semiconductor Inc. have recently come up with very compact versions measuring 3 x 3 mm. Bosch and Freescale also have in-house technology for both the accelerometer and the compass that make up the 6-axis device, which could increase the competitive pressure on AKM as original equipment manufacturers adopt this multiaxis format, IHS believes.

No. 2 Allegro MicroSystems posted revenue last year of $302 million, up 14% from $264 million in 2010. The company last year made a conscious move away from the low-cost consumer sensor market served by the likes of AKM and toward higher-value products like sensors for the automotive space. Allegro is strong in relatively high-priced Hall-based speed sensors for camshafts and switches in vehicles, and is also a major supplier of Hall current sensors for use in applications like battery monitoring systems.

Third place went to Infineon Technologies AG, which achieved 33% growth in revenue to $188 million, up from $142 million. This strong growth allowed Infineon to vault past Micronas, which dropped to fourth place. Major growth areas for Infineon include wheel-speed sensing and torque sensors for automotive steering applications. Next year, Infineon will introduce a new current-sensing palette and more highly integrated packages with magnets included for speed sensing, in addition to anisotropic magnetoresistive (AMR) technology for angle sensing.

Micronas suffered a weak 2011 after losing business from Denso, an important customer for Micronas Hall sensors, following the March 2011 earthquake-tsunami disaster in Japan, but still managed a 5% rise in magnetic sensor revenue to $150 million, up from $143 million.

At No. 5, Melexis NV saw revenue in 2011 of $112 million, up 5% from $107 million in 2010. The company continues to lead the market — by a long shot — in acceleration-pedal-position sensing, and is also a major presence in sensors for measuring currents. Approximately 80% of its focus remained on automotive, even though the firm also supplies switches for mobile phone displays.

The rest of the Top 10 included NXP Semiconductors, in sixth place with $96 million in revenue; Aichi Steel Corp., in seventh place with $40 million; a tie in eighth place between AMS AG (formerly austriamicrosystems) and Memsic Inc., each with $29 million in revenue; and Diodes Inc in 10th place with revenue of $24 million.

Memsic had the biggest jump in revenue among the Top 10, growing a stupendous 1,340%, derived mainly from a surge in shipments of electronic compasses to Samsung from very low levels the previous year. Aichi Steel had the second-best growth rate, up 122%, with Sharp mobile phones making up the key part of its business.

IHS (NYSE: IHS) is the leading source of information, insight and analytics in critical areas that shape today’s business landscape, including energy and power; design and supply chain; defense, risk and security; environmental, health and safety (EHS) and sustainability; country and industry forecasting; and commodities, pricing and cost. For more information, visit www.ihs.com.

August 10, 2012 — The polyvinyl alcohol (PVA) protection film industry will change significantly in 2012, with many display makers actively pursuing PVA-free and triacetate cellulose (TAC)-free designs. Display manufacturers are looking for cost savings, designs that suit tablet PCs and smartphones, and alternatives to Fuji Film, Displaybank reports.

Also read: Polarizer film trends

PVA protection films will grow to more than 800 million square meters (Msqm), based on area, in 2012, up from 740 Msqm in 2011. TAC film dominates the space, with 91% market share (about 750 Msqm) in 2012. Cyclo-olefin polymer (COP) film will take 5% market share (40 Msqm), and acryl film is forecast to have 3% of the market (24 Msqm).

By revenue, PVA protection film brought in 331 billion yen in 2011, but will decline to 326 billion yen this year, due to weakness in TN-use compensation film, the shift to the wide-view market, and competition between films. TAC film will bring in about 300 billion yen.

Development efforts on new films are more active than ever in the history of LCD manufacturing. LCD makers are using polarizers with various combinations, looking for reduced film thickness.

Figure. Apple’s polarizer structures in different products. SOURCE: Displaybank.

Apple’s tablet PCs and smartphones have led the market with innovative structure and thickness. Companies are developing diverse films to supply polarizers to Apple, which commands a large market with a single product, and technologies are rapidly moving from research to commercialization. For example, companies are using acryl film to make PVA films thinner. Acryl film has been applied as a compensation film for IPS-use polarizers, and could steal market share from conventional TAC. Processes to apply surface treatments to acryl are emerging as early as 2013, Displaybank reports.

Japan dominates the polarizer film industry. Giants of the polarizer industry Nitto Denko, Sumitomo Chemical, and LG Chem have their own acryl film production technology. Because they have secured acryl resin production technology, Sumitomo Chemical and LG Chem can achieve higher price competitiveness than with conventional TAC films. 2012 will likely be the most important year for acryl’s offensive on TAC market share. TAC is clearly advantageous in production capacity, price, and know-how, Displaybank says, but acryl offers the display designs that smartphone and tablet makers want.

As these diverse technologies and films developed for mobile devices are applied to high-end TVs, future TV displays are expected to change greatly. If acryl films are used in TVs, the material will see rapid adoption from 2014.

Displaybank analyzed the polarizer film industry and various compensation films from 2010 to 2016 for the report “2012 Compensation Film and TAC/Acryl Film Analysis.” Access the report at http://www.displaybank.com/_eng/research/report_view.html?id=752&cate=4

by Tony Christian, Director, Cambashi, Ltd., Cambridge, UK

The field of power electronics, the application of electronics for the control and conversion of electric power, is underpinned by basic electrical principles that were established in the distant past by the pioneers of electrical science. But today, the need to supply, modify and control the voltage, current or frequency of electric power arises in a vast number of applications and products spanning a huge range in terms of power handling capability. The industry has generated numerous technological advances to address the ever growing spectrum of requirements; in its 30th anniversary edition, Power Electronics Technology described some of the most important developments of the past three decades.

At the extremes, the requirements for power electronics systems range from those handling a few milliwatts, such as the DC/DC converters that maintain constant voltage as battery power declines in mobile phones and portable handheld devices, to those handling many megawatts in the large power converters used in the electrical generation and distribution industry. Naturally, the challenges for power electronics designers vary considerably according to application and scale. Those challenges now cover not only electrical function (particularly the drive to maximize efficiency for the power and frequency range in question), but a host of practical and, especially in consumer products, even aesthetic design constraints. For example, the developers of devices like mobile phones or PCs seek to pack ever more functionality into smaller spaces, and their power supplies must not consume a disproportionate amount of that space. At the same time, the ever-closer proximity of the components imposes increasing constraints on electromagnetic radiation and limits the ability to dissipate heat. But that kind of challenge is not limited to what is usually regarded as the high-tech sector; it seems that even purchasers of auto battery chargers want them to be small and attractive!

For applications of larger scale involving supplying power to electromechanical devices, the design of the power electronics must take into account and adapt to the behavior of the load (usually a motor or drive system) across its operating range. Considerations will include factors such as power factor optimization and minimizing losses, dealing with harmonic currents and eliminating any electromagnetic torque oscillation that might give rise to electromechanical vibrations.  In many applications, including at the high end the power generation and distribution industry, achieving the electrical functionality often requires the design effort to extend to customizing the characteristics of individual system components. There is then the challenge of balancing optimization of the performance of individual elements with the behavior of the overall system.

Power electronics is therefore one of the most multi-disciplinary design problems in the modern industrial landscape. The sheer range of considerations and the differing emphasis between them according to application has led to the development of numerous design tools, each targeted at specific aspects of the design problem.  The design of a power electronics solution will involve using some combination of:

  • ‘Whole system’ simulation technologies based on setting up the circuit logic using component manufacturers’ data for off-the-shelf components and sub-systems. Here the goal is to understand the system current and voltage levels and frequency components under the range of operating conditions. The analysis is often rendered more complex by the need to incorporate the behaviors of any electromagnetic and/or electromechanical elements. The goal will be to optimize the characteristics of the power electronic system (in terms of the balance of efficiency and cost, size and/or weight) for the application.
  • Systems to analyze individual component or sub-system performance for device types in given applications. For example, since inductors and transformers have a significant impact on power losses and on the volume and weight of the system, in many applications optimizing their individual characteristics and performance will be a focus of the design effort.
  • Systems that analyze electromagnetic emissions. In the majority of applications, the power electronics system is in close proximity to other electronic equipment, or there are stringent EMC limits, as in defense applications. It is necessary to understand the electromagnetic signature of the system in order to develop mitigation strategies in terms of electrical filters and/or physical shielding.
  • Thermal analysis technologies. Since the electrical performance of power system components varies with temperature, and the function of the system by its nature involves dissipating waste power as heat, temperature control is a central aspect of power electronics design (a minimal numerical sketch of the basic calculation follows this list). Indeed, studies have shown that temperature issues are a major source of failure in electronic systems, and avoiding excessive temperatures is therefore a vital aspect of achieving high reliability and extending operating life. Accurate simulation of not only the heat generated by the system but also the heat dissipation performance of the cooling options requires both heat transfer and fluid dynamics capabilities to support development of the optimum approach to temperature control.
  • Structural analysis technologies. Many power electronics systems operate in hostile environments; for example, in aerospace and automotive applications they may be subjected to extremely high levels of vibration, requiring that the structural strength of the system be assured under the target operating conditions.
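
To make the thermal bullet concrete, here is a minimal steady-state junction-temperature estimate in Python. It is an illustrative sketch only: every figure below is an assumption invented for the example, not data from any real device or tool.

```python
# Minimal steady-state junction-temperature estimate for a power switch.
# Thermal resistances in series add like electrical resistances, so the
# temperature rise equals dissipated power times total thermal resistance.
# All values are invented for illustration.

P_LOSS = 15.0      # W, power dissipated in the device (assumed)
R_TH_JC = 0.8      # K/W, junction to case (assumed)
R_TH_CS = 0.3      # K/W, case to heatsink interface (assumed)
R_TH_SA = 2.0      # K/W, heatsink to ambient (assumed)
T_AMBIENT = 40.0   # degrees C

t_junction = T_AMBIENT + P_LOSS * (R_TH_JC + R_TH_CS + R_TH_SA)
print(f"Estimated junction temperature: {t_junction:.1f} C")  # 86.5 C
```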

Caption: Combining Analysis of Circuit Behavior and Thermal Simulation in Power Electronics. Source: www.gecko-research.com.

While adequate technologies for each of these aspects of power electronics design have been around for ten years and in some areas longer, there remain a number of significant challenges in terms of achieving maximum exploitation of the capabilities. The first is ensuring that the design technologies can accurately model the characteristics of the more recent (and continuing) semiconductor and ancillary component advances – we have already noted that recent decades have seen substantial progress. This requires a combination of high flexibility in the system modelling logic and continuous software development.

The second issue has been how to integrate the different design technologies into a coherent and efficient design workflow. Traditionally, the electrical performance was the dominant aspect, and once the circuit was confirmed as being able to do the job, the ‘packaging’ – not only size and weight but also the means of controlling temperature and EMC emissions – could be developed to accommodate the electronics. In the last decade, the emphasis has been on how to integrate the various areas of simulation and analysis technology into a unified design environment. For example, data should be exchanged seamlessly between the thermal simulation of a power circuit and the electrical circuit simulator, to enable transient junction temperatures to be calculated directly. Not only would such comprehensive integration enable a better balance between all of the system characteristics, but it would also provide the opportunity to reduce the design timescale dramatically.
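
As a toy illustration of that electrical/thermal data exchange, the sketch below couples a temperature-dependent conduction-loss model to a one-resistor thermal model and iterates them to a self-consistent junction temperature. All parameter values are assumptions invented for the example; real tools solve this transiently over full mission profiles.

```python
# Toy electro-thermal co-simulation loop: the electrical model's conduction
# loss depends on junction temperature, and the thermal model's junction
# temperature depends on loss, so the two are iterated to convergence.
# All parameter values are illustrative assumptions, not real device data.

I_RMS = 10.0          # A, RMS load current (assumed)
RDS_ON_25C = 0.050    # ohm, on-resistance at 25 C (assumed)
ALPHA = 0.007         # per K, on-resistance temperature coefficient (assumed)
R_TH_JA = 3.0         # K/W, junction-to-ambient thermal resistance (assumed)
T_AMBIENT = 40.0      # degrees C

t_j = T_AMBIENT                                          # initial guess
for _ in range(50):
    rds_on = RDS_ON_25C * (1.0 + ALPHA * (t_j - 25.0))   # electrical model
    p_loss = I_RMS**2 * rds_on                           # conduction loss
    t_j_new = T_AMBIENT + R_TH_JA * p_loss               # thermal model
    if abs(t_j_new - t_j) < 1e-6:                        # converged?
        break
    t_j = t_j_new

print(f"Converged: Rds_on={rds_on*1000:.1f} mohm, "
      f"P={p_loss:.1f} W, Tj={t_j:.1f} C")
```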

The third challenge arises from the fact that the recent history of using the sophisticated design tools available for power electronics design has enabled the capture of all kinds of knowledge regarding power electronics performance – electrical, thermal, EMC, reliability and so on. As a result, adequate designs for the majority of applications can be developed relatively easily based on the knowledge and rules encapsulated in the modern design tools. The drive for improved designs is therefore focused on the 1% efficiency improvement or the small reduction in size or weight, making the use of leading edge simulation and analysis tools essential. For the developers of those tools, the challenge then becomes how to make them accessible to a much broader community of users than the traditional market of specialist analysts in companies with large R&D budgets. 

Caption: 3D Electromagnetic Filter Simulation and the Calculated Frequency-Dependent Filter Insertion Losses. Source: www.gecko-research.com.

Developments in recent years in both the electrical and mechanical design software tools, assisted by continued advances in IT hardware and infrastructures, have made substantial progress in addressing these issues. The ability to exchange data between different solutions for different disciplines from different vendors has improved enormously. Companies like Ansoft, Zuken and Gecko Research now offer integrated suites with the ability to perform complete multi-domain simulations and analyses of power electronics systems, including the full time and frequency performance of the circuit, the thermal behavior, the electromagnetic and electromechanical behavior and mechanical stress analysis. The fourth challenge for both multi-discipline analyses and greater accessibility has been the computing power required – even a single-discipline analysis for a moderately complex power electronics design is a highly computing-intensive problem. Typically, a simulation is run for each step in a time/frequency series, so that many thousands, if not hundreds of thousands, of simulation runs can be required to fully validate all aspects of system performance. The computing requirement is heightened by the fact that many systems now involve software control, and the additional variable of the software setup introduces yet further dimensions to the analysis problem. The promise of ‘infinite’ computing resources in the cloud and the growth in power of dedicated processors that can be added locally mean that there is now sufficient processing power available at the engineer’s desktop to run even quite complex simulations for thousands of time steps.

Given that good progress has been made in addressing the issues of multi-discipline integration and sufficient desktop computing power, the remaining challenge for broadening the user constituency for power electronics design technologies is being able to address a suitably wide range of applications while achieving ease of use. The spectrum of applications covers a huge range of power handling requirements (milliwatts to megawatts), frequencies (DC to GHz), temperatures (-55°C to 275°C) and physical scale (µm to m). In the past, as with almost all analysis and simulation systems, the design engineer needed strong expertise in modelling and simulation techniques as well as in the technology being modelled in order to ensure that the results represented reality with a reasonable degree of accuracy. Now, to fully exploit the workflow benefits of an integrated suite of design tools, the design engineer must be confident in not only the electrical simulation at both the component and system level, but also the thermal and electromagnetic simulations, and possibly structural aspects too. The tools therefore need to be sufficiently easy to use to be handled by an engineer who is not a specialist in a particular discipline, while still providing confidence in the results. There is no doubt that the leading vendors have invested substantial effort in this aspect of integrated power electronics design, and it continues to be a priority area.

For many applications, power electronics forms part of the final product and a likely significant further step in the development of power electronics design technology will be full integration with the product lifecycle management (PLM) environments in use by most large manufacturing companies. The data volumes resulting from larger numbers of design simulations by greater numbers of engineers will put pressure on the data management capabilities of existing PLM deployments, but, in parallel with the increased accessibility of multi-discipline power electronics design tools, we are seeing a similar effort to exploit cloud and web technologies to extend the reach of PLM solutions to smaller companies.  As a result there is an opportunity to move the two forward to support even better management of the power electronics design workflow.

About the author

Tony Christian has a wide-ranging experience in engineering, manufacturing, energy and IT. His early career was in technical R&D roles, after which he moved into computer-aided engineering. His subsequent roles included divisional head of the IT subsidiary of a major international engineering and construction company and leadership of teams developing and implementing state of the art manufacturing control systems at British Aerospace. More recently, Tony was a director of the UK Consulting and Systems Integration Division of Computer Sciences Corporation (CSC), leading a consulting and systems practice for manufacturing industries, and then Services and Technology Director at AVEVA Group plc where he was responsible for all product development and the company’s worldwide consulting and managed services business. Tony has a BSc degree (Mechanical Engineering) and MSc degree (Engineering Acoustics, Noise and Vibration) from the University of Nottingham.

August 9, 2012 — Active-matrix organic light-emitting diode (AMOLED) displays are growing rapidly and offer many performance benefits over liquid crystal displays (LCDs). However, 55” AMOLED TV displays cost 8-10x as much as a comparable LCD to manufacture.

Also read: AMOLED manufacturing improvements to enable TV market share grab

According to the NPD DisplaySearch AMOLED Process Roadmap Report, the manufacturing cost of a 55” oxide TFT-based AMOLED using white OLED (WOLED) with color filters is 8x that of a high-end TFT LCD display of equal size. The cost multiplier of a 55” AMOLED module using red, green, and blue (RGB) OLED is 10x. These higher costs are mainly a result of low yields and high materials costs.

LCD manufacturing is a mature process with slower, more incremental cost reduction, while AMOLED cost reduction efforts are in their infancy, said Jae-Hak Choi, senior analyst, FPD Manufacturing, for NPD DisplaySearch. Those efforts could include new and improved processes, printing technology, and higher-performance materials that will take AMOLED prices to parity with LCD in the long term.

Figure. Relative manufacturing costs of technologies for 55” TV panels. Based on current yield and material cost assumptions. Source: NPD DisplaySearch AMOLED Process Roadmap Report.

In order to scale up to large sizes, advancements in several aspects of AMOLED manufacturing are needed, including the active matrix backplane, organic material deposition, and encapsulation. Because oxide thin-film transistors (oxide TFTs) require lower capital costs and are similar to existing amorphous silicon TFTs (a-Si TFTs), the technology offers a strong alternative to the low-temperature polysilicon (LTPS) TFTs currently used for AMOLED. However, there are many hurdles for mass production of oxide TFTs, particularly threshold voltage shifts, which continue to prove problematic for AMOLED production.

While indium gallium zinc oxide (IGZO) and other forms of oxide TFT show great promise for backplanes, progress in scaling up LTPS production is also being made by increasing the excimer laser beam width to 1300 mm. In addition, the current method of depositing red, green, and blue materials by evaporation through a fine metal mask is being continuously improved. Pixel densities of 250 ppi are now possible, and over 280 ppi is feasible.

“High resolution patterning such as laser induced thermal imaging (LITI) and material improvements are still required for AMOLED to be highly competitive for super-high-resolution flat panel displays,” Choi said.

Manufacturing processes for small, 4” AMOLED displays are more mature, creating a much smaller cost premium over LCDs (<1.3x). Most AMOLED capacity is currently dedicated to small/medium production for smart phones, but much of the future capacity increase will be driven by fabs dedicated to TV production. Uncertainties abound, as AMOLED technology has not yet been proven in large-size TVs.

Based on planned investments, NPD DisplaySearch forecasts that the AMOLED market will grow nearly tenfold from 2.3M square meters in 2012 to more than 22M in 2016.

Samsung Display has been highly successful in its small/medium AMOLED production because it has been able to raise yields to near-LCD levels. This implies that manufacturers can potentially lower large-size AMOLED TV costs to be competitive with LCD TVs in the future.

The NPD DisplaySearch AMOLED Process Roadmap Report provides in-depth data and analysis on OLED manufacturing technologies including materials, backplanes, OLED, and encapsulation. It also includes an analysis of benefits, opportunities, negatives, and challenges for each technology. Unique to the industry, the report shows specification roadmaps for OLED manufacturing through 2016 and indicates which manufacturing technologies will be required to achieve stability and performance. Also, the report provides a unique equipment investment simulation and module cost modeling analysis. NPD DisplaySearch provides market research and consulting, specializing in the display supply chain, as well as the emerging photovoltaic/solar cell industries. For more information on DisplaySearch analysts, reports and industry events, visit http://www.displaysearch.com/.

August 8, 2012 — Lab-on-a-chip (LOC) devices are micro-chip-sized systems that prepare and analyze very small volumes of fluids — from a few milliliters (ml) down to sub-nanoliter (nl) volumes. They hold promise for disease diagnostics and forensic evidence investigation. These devices are fabricated by microfluidics makers, a segment of the micro electro mechanical systems (MEMS) industry.

The National Institute of Standards and Technology (NIST) believes that before LOC technology can be fully commercialized, testing standards need to be developed and implemented. These will define the procedures used to determine if a lab on a chip device, and the materials from which it is made, conform to specifications, said Samuel Stavis, NIST physical scientist.

Standardized testing and measurement methods, Stavis said, will enable MEMS LOC manufacturers to accurately determine important physical characteristics of LOC devices such as dimensions, electrical surface properties, and fluid flow rates and temperatures. These must be calculable at all stages of production, from processing of raw materials to final rollout of products.

Figure. A microfluidic lab on a chip device sitting on a polystyrene dish. Stainless steel needles inserted into the device serve as access points for fluids into small channels within the device, which are about the size of a human hair. Credit: Cooksey/NIST.

Stavis focuses on autofluorescence, the background fluorescent glow of an LOC device that can interfere with sample analysis. Multiple factors must be considered in the development of a testing standard for autofluorescence, including: the materials used in the device, the measurement methods used to test the device, and how the measurements are interpreted. For meaningful sample analysis, all autofluorescence factors must be controlled for or excluded from the measurements.

Quality control during LOC device manufacturing, Stavis says, may require different tests of autofluorescence throughout the process. The raw block of plastic may be measured for autofluorescence, then the substrate the block has been fabricated into, then the final device with functional microfluidics and substrate, Stavis said.

Stavis also emphasizes that it is important not to confuse testing standards with product standards, and to understand how the former facilitates the latter. "A product standard specifies the technical requirements for a lab on a chip device to be rated as top quality," he says. "A testing standard is needed to measure those specifications, as well as to make fair comparisons between competing products."

The argument for testing standards is proposed in a paper in Lab on a Chip: S.M. Stavis, “A glowing future for lab on a chip testing standards,” Lab on a Chip (2012), DOI: 10.1039/c2lc40511c.

Learn more at www.nist.gov.

August 8, 2012 — The light-emitting diode (LED) industry is entering its third growth cycle, general lighting, according to Yole Développement and EPIC’s report, “Status of the LED Industry.” However, the cost of a packaged LED still needs to be reduced by a factor of 10 to enable massive adoption, and new business models are mandatory to capture the added value of LED lighting.

Growth of the LED industry initially came from small display applications and was then driven forward by LCD applications. LED TV was expected to be the LED industry driver for 2011, but the reality was quite different. Lower adoption of LEDs in the TV market and the entry of several new players, mostly from Asia, created a climate of overcapacity, price pressure and strong competition. As a consequence, packaged LED volume was about 30% lower than expected, and revenue shrank due to strong ASP pressure.

Figure. Packaged LED revenue, by application. SOURCE: Yole, Status of the LED Industry, August 2012.

Yole and EPIC estimate packaged LED revenue will reach a market size of $11.4 billion in 2012 and will peak at $17.1 billion by 2018. Growth will be driven by both the display (LCD TV) and general lighting applications until massive adoption of LEDs in lighting.

From 2014, the third growth cycle of the LED business will accelerate, with the general lighting application representing more than 50% of the overall packaged LED business. In terms of volume, LED die surface will increase from 22.5 billion mm² (2012) to 80 billion mm² (2018). This will prompt substrate volume growth from 8 million 2” wafer equivalents (TIE) in 2011 to 39.5 million TIE in 2018, a CAGR of 26%.
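
As an arithmetic check on that growth rate (our calculation from the report's endpoints, not Yole's):

```python
# Verify the quoted 26% CAGR for substrate volume, 2011 -> 2018 (7 years).
start_tie, end_tie, years = 8.0, 39.5, 7   # million TIE, per the report
cagr = (end_tie / start_tie) ** (1.0 / years) - 1.0
print(f"CAGR: {cagr:.1%}")   # -> 25.6%, consistent with the quoted 26%
```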

The adoption of LEDs for general lighting applications strongly depends on technology and manufacturing improvements that improve performance and cost enough to hit an LED adoption trigger point. Industry consensus points to a needed cost reduction per lumen of packaged LEDs by a factor of 10. This can be achieved through a combination of manufacturing efficiency and performance improvements, such as access to larger wafers, improvements in LED epitaxy cost of ownership through yield and throughput, and improved packaging technologies (phosphors, optics, etc.).

Additionally, improved package and luminaire design will also enable significant cost reduction.

Ultimately, the long life of solid state lighting (SSL) technology will totally change the lighting market by dramatically increasing the length of the replacement cycles. The replacement market (aftermarket) will be strongly impacted, pushing traditional players of the lighting industry to define new strategies to capture profit (intelligent lighting, lighting solutions, etc).

“In addition, as value is moving to the top of the value chain (module and luminaire levels), several players that were originally involved only at LED device levels will develop strategies of vertical integration in order to capture more value,” added Tom Pearsall, general secretary, EPIC. But accessing distribution channels represents a big challenge for those players who develop new approaches to sell their lighting products (e-commerce, new distributors). The rise of LED lighting will therefore depend on the right merger of the emerging LED industry with the traditional lighting industry.

The researchers also found that capacity for GaN LED epitaxy increased dramatically in 2010 and 2011 across all regions, most dramatically in China, where GaN MOCVD reactor capacity grew by a factor of 20 between Q4 2009 and Q1 2012.

“Most emerging Chinese LED epiwafer and die manufacturers are still lagging significantly behind their competitors in terms of technology maturity and LED performance,” says Dr. Eric Virey, senior analyst, LED, at Yole Développement.

The bulk of those new companies are not yet capable of manufacturing LEDs that address the large display and general lighting applications currently driving the market. In the mid-term, consolidation of the Chinese LED industry will occur (a scenario in the central government’s new five-year plan), and China should become a major actor in the LED industry.

The report presents all applications of LEDs and associated market metrics, LED cost reduction opportunities, the entire LED value chain, a deep analysis of the general lighting application, and an analysis of geographical trends. Authors include Pars Mukish, market and technology analyst, and Dr. Eric Virey, senior analyst, at Yole Développement, and Tom Pearsall, general secretary, EPIC.

Companies cited in the report: A-Bright, Advanced Photonics, American Bright, American Opto Plus, AOT, ApexScience & Engineering, APT Eelctronics, Aqualite Co, Arima, AUO, Avago, Bridgelux, Bright LED, Brightview electronic, CDT, Century Epitech, Chi Mei Lighting Technology, Citizen Electronics, CREE, CS Bright, Daina, Dominant Semiconductors, Edison, Elec-tech, Enfis, Epiled, Epilight Technology, Epistar, EpiValley, Everlight, Excellence Opto, Fangda group, Formosa epitaxy (Forepi), Galaxia Photonic, GE, Genesis Photonics, Golden Valley Optoelectronics, Hangzhou Silan Azure, Harvatech, HC SemiTek, Heesung, High Power Opto, Hi-Light, Hueyjann Huga, Huiyuan Optoelectronic, Hunan HuaLei Optoelectronic, Hunin Electronic, Idemitsu Kosan, Illumitex, Invenlux, Itswell, KingBright, Kodenshi, Konica Minolta, Korea Photonics Technology Institute (KOPTI), Kwality group, Lattice Power Corporation, LedEngin, LEDTech, Lemnis, Lextar/Lighthouse, LG Display, LG Innotek, Lighting Science, Ligitek, Lite-On, LongDeXin (LDX), Lumei Optoelectronics, Lumenmax, Lumex, Lumileds, LumiMicro, Lumination, Luminus, Lumitek, Lustrous Technology, Luxpia, LuxtalTek, MokSan Electronics, Moser Baer, Nanosys, Nanya, Nationstar, Neo-Neon, Nichia, NiNEX, Oasis, Optek Technology, Opto Tech, Osram, ParaLight, Philips, Power Opto, Powerlightec, Rainbow Optoelectronics, Rohm, Samsung SEMCO, Sanan Optoelectronics, Sanken Electric, Seiwa Electric, SemiLEDs, Seoul semi / Optodevice, Shandong Huaguang Optoelectronics, Sharp, Shenzen Mason Technology, Shenzen Mimgxue, Shenzen Yiliu Electronic, Shenzhen Refond, Showa Denko, Stanley Electric, Sunpu Opto, Supernova, Sylvania, Tekcore, TESS, Tonghui Electronic Corporation, Toshiba, Toyoda Gosei, TSMC, Tyntek, UDC, Unity Opto, Visera Tech, Vishay, VPEC, Walsin Lihwa, Wellipower, Wenrun Optoelectronic, Wooree LED, Xiamen Changelight, Xiamen Hualian, Ya Hsin, Yangzhou Huaxia Integrated Photoelectric (DarewinChip), Yangzhou Zhongke Semiconductor, YoungTeck, Yuti Lighting Shanghai, Zoomview (Xi An Zoomlight), and more.

Yole Développement is a group of companies providing market research, technology analysis, strategy consulting, media, and finance services. For more information, please visit www.yole.fr.

The European Photonics Industry Consortium, EPIC, has three important activities: dialogue with the European Commission, ownership of the European roadmap for photonic technologies, and developing the critical human resource of trained scientists and engineers in the European economic area. EPIC is composed of 80 member organizations and over 400 associate members. For more information: www.epic-assoc.com.

Mobile devices use sensors to measure more than µT of magnetic field and m/s² of acceleration. Sensor data also reveal user activities, postures, environments, and even attention. Sensors are thus not merely metrological instruments; the value lies in linking sensor algorithms and hardware. To enable user context awareness, advanced sensor algorithms must be well-matched with mobile system architectures on the one hand, and simultaneously understand how users behave on the other.

August 7, 2012 — In 2007, Apple rolled out the iPhone and started a revolution in smart mobile devices. The original iPhone included an accelerometer to sense how a user is holding the device, and orient the image on the display in landscape or portrait accordingly. Today, smartphones from all makers include one or more inertial motion sensors (accelerometers, magnetometers, and gyroscopes). However, application developers and system designers are just beginning to take advantage of their sensing capabilities, including combining sensor inputs — sensor fusion — with advanced algorithms.
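
As a rough sketch of how that accelerometer-driven orientation decision can work (a simplified illustration under our own assumptions, not Apple's actual implementation): when the device is held still, the accelerometer reading is dominated by gravity, and the tilt of that vector in the screen plane selects portrait or landscape.

```python
import math

def screen_orientation(ax: float, ay: float) -> str:
    """Pick a display orientation from the gravity components measured
    in the screen plane (x: right, y: up), in m/s^2.

    Simplified sketch: real devices add hysteresis around the 45-degree
    boundaries and suspend the decision while the device is moving or
    lying flat.
    """
    angle = math.degrees(math.atan2(ax, ay))  # 0 = upright portrait
    if -45 <= angle < 45:
        return "portrait"
    if 45 <= angle < 135:
        return "landscape-left"
    if -135 <= angle < -45:
        return "landscape-right"
    return "portrait-upside-down"

# Device held upright: the measured reaction to gravity is +9.8 m/s^2
# along the screen's y axis; rotated 90 degrees, it moves to the x axis.
print(screen_orientation(0.0, 9.8))   # portrait
print(screen_orientation(9.8, 0.0))   # landscape-left
```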

Early sensor applications: Motion interfaces

To date, many sensor applications track user gestures and use the results as another input to the user interface. These allow users to change screen orientation by rotating the device, erase an email by shaking the phone, or double-tap to send an incoming call to voicemail, for example.

The most significant advancement enabled by a gesture-based user interface so far allows users to navigate available applications by tilting their phones to step through menu selections. This ability, coupled with advances in image processing and speech synthesis, now allows vision-impaired users to browse supermarket aisles using their smartphones [1]. For the average smartphone user, besides controlling screen orientation, motion interfaces have largely gone unappreciated and unnoticed.

Figure. A model for sensor algorithms. SOURCE: Sensor Platforms.

User context

Introducing any new user interface requires the user to learn a new set of behaviors. For the vision-impaired, learning and adopting motion interfaces for their smartphones opens new possibilities [2]. For average users, however, using gestures to control their devices is at best a passing novelty, since they do not see enough benefits to justify learning something new. To be truly successful with the general public, a new generation of smart devices must adapt to their users and not demand that users adapt to them. This takes a combination of sensors, intelligent algorithms, and mobile computing resources.

Sensors in mobile devices capture a lot more than gross user movements like gestures. Accelerometers and gyroscopes in smartphones record muscle tremors and biomechanical resonances from their users. Magnetometers detect magnetic field emissions from nearby power lines and engines. Such information is generally discarded in motion interfaces, but it contains user contexts; that is, information about the user that can improve interaction.

For example, muscle tension and resonance can identify when and how a user is holding the device. If the algorithms determine that the user is holding the phone at his side, the smartphone can turn off the display backlight, sensing that the display is currently unused. Conversely, sensor signals can indicate when the user is reading the display, and so keep the backlight on.

Likewise, motion dynamics can identify whether a user is standing, sitting, walking or running [3], and so control functions like refreshing GPS or WiFi fixes. Unless, that is, subtle signatures in the magnetic field suggest the device is in a vehicle, which may start to move. Detecting these characteristics requires more than just clean metrological measurements.
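
As an illustration of the kind of algorithm involved, here is a deliberately crude activity detector based on the spread of accelerometer magnitudes over a short window. The thresholds are invented for the example; production algorithms use richer features and models trained on collected data, as described next.

```python
import math
import statistics

def classify_activity(accel_mags: list[float]) -> str:
    """Crude activity guess from a window of accelerometer magnitudes
    (m/s^2). Thresholds are invented for illustration only."""
    sd = statistics.pstdev(accel_mags)
    if sd < 0.5:
        return "still (sitting or standing)"
    if sd < 3.0:
        return "walking"
    return "running"

# Simulated 2 s windows at 50 Hz: near-constant ~9.8 m/s^2 while still,
# a ~2 Hz oscillation while walking.
still = [9.8 for _ in range(100)]
walking = [9.8 + 2.0 * math.sin(2 * math.pi * 2 * i / 50) for i in range(100)]
print(classify_activity(still))     # still (sitting or standing)
print(classify_activity(walking))   # walking
```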

To derive user contexts, algorithm developers first collect data containing the specific context, and then create a set of algorithms to recognize it reliably. The data are best collected from subjects who are acting naturally and unaware of the context of interest. Some algorithms can develop an understanding of a user on a personal level, and thus improve reliability by catering to the user’s unique characteristics.

In this article, I use the term “anthropology” as a broad umbrella to include studies of the characteristics of human physical traits, human behavior, and the variations among humans. These inputs are critical today: in designing the appropriate settings to collect the algorithm training information; for determining if the data collected are sufficiently diverse for the algorithm to work for an average smartphone user; and in understanding which part of the algorithm could benefit from user-specific adaptation.

Low-power system architecture

Besides sensors and intelligent algorithms, designers must consider mobile computing resources, which are always limited by battery life. Context-detection algorithms monitor user activities by running continuously in the background, creating a nonstop demand for power whether the user is interacting with the phone or not.

Of course, cell phone designers are familiar with circuits that must remain continuously active. A phone has to be in constant connection with the cellular network to receive calls and text messages. Over many phone generations, designers have focused on minimizing the standby current, the electricity consumed by cellular connectivity when the phone is otherwise completely idle. To do this, the standby mode of a cell phone consists of repeated cycles of sleep and wakeup. The phone wakes up to check for the presence of a call or text message. In the absence of either, the phone re-enters sleep mode. The design increases the efficiency of any hardware needed to check for calls.
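
The power arithmetic behind that sleep/wake cycling is simple duty-cycle averaging. A sketch with invented numbers (the currents and cycle times below are assumptions for illustration, not measurements from any real handset):

```python
# Average standby current of a duty-cycled radio: the phone sleeps, wakes
# briefly to check for a call or text, then sleeps again. All values are
# invented for illustration.
I_SLEEP = 0.001    # A while asleep (assumed)
I_AWAKE = 0.060    # A while checking for a call/text (assumed)
T_AWAKE = 0.01     # s awake per cycle (assumed)
T_CYCLE = 1.28     # s per cycle (assumed order of magnitude)

# Time-weighted average over one full cycle.
i_avg = (I_AWAKE * T_AWAKE + I_SLEEP * (T_CYCLE - T_AWAKE)) / T_CYCLE
print(f"Average standby current: {i_avg * 1000:.2f} mA")  # ~1.46 mA
```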

The same considerations apply to sensor algorithm design. Sensor algorithms should be power-aware, adjusting their processing requirements based on the amount of meaningful information contained in each sample. For example, the gyroscope used to track the angular rate of device motion requires significantly more power than the accelerometer and the magnetometer combined. The magnetometer and the accelerometer in combination form an electronic compass, which measures the angular position of the device. Because the first derivative of angular position is angular rate, an intelligent algorithm can decide that, when the device is turning slowly in a uniform magnetic field, it can derive angular rate from a high-bandwidth electronic compass acting as a virtual gyroscope. Doing so avoids the higher power use of the gyroscope, as well as the computation needed to process gyroscope samples. As the rotation rate approaches the limit of the electronic compass’s tracking ability, the algorithm can switch on the gyroscope and transition to its angular rate measurement seamlessly.
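
Here is a minimal sketch of that hand-off: estimate angular rate by differentiating compass headings, and switch to the hardware gyroscope when the estimated rate approaches the compass's tracking limit. The threshold and sample rate are our own illustrative assumptions, not figures from any real product.

```python
# Sketch of a "virtual gyroscope": differentiate electronic-compass heading
# to estimate angular rate, powering the real gyro only when the rotation
# is too fast for the compass to track. Threshold and rates are assumptions.

RATE_LIMIT_DPS = 90.0   # deg/s; beyond this, trust only the real gyro (assumed)
DT = 0.02               # s, 50 Hz compass sample interval (assumed)

def angular_rate(heading_prev: float, heading_now: float, dt: float = DT) -> float:
    """Angular rate (deg/s) from two compass headings, unwrapping +/-180."""
    delta = (heading_now - heading_prev + 180.0) % 360.0 - 180.0
    return delta / dt

def pick_source(rate_estimate_dps: float) -> str:
    if abs(rate_estimate_dps) < RATE_LIMIT_DPS:
        return "compass-derived rate (gyro powered down)"
    return "hardware gyroscope (switched on)"

print(pick_source(angular_rate(10.0, 11.2)))  # 60 deg/s -> compass-derived
print(pick_source(angular_rate(10.0, 14.0)))  # 200 deg/s -> hardware gyro
```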

Sensor hardware agnosticism

Sensor component manufacturers have argued that the best-performing sensor algorithms need to be customized to the proprietary characteristics of each sensor component [4]. Such arguments treat mobile sensing applications as mere measurement instruments, and thus ignore the impact that system design, target use cases, and user variances can have on the performance and usefulness of sensor algorithms.

While targeted optimization is possible with any algorithm, its impact falls far short of the higher-level architectural concerns discussed here. Given the nature of sensor physics, no single sensor manufacturer can offer the breadth of products that satisfies every price/performance objective for every mobile device in a manufacturer’s product portfolio. Rather than catering to specific component configurations, good sensor algorithms must be derived from sound usage data, architected for low power, and able to work with a wide selection of sensor components to meet a device manufacturer’s requirements.

Conclusion

Applications for sensors in mobile devices are still evolving. Instead of treating sensors like a set of measuring instruments, new context-aware devices are using sensor information to learn about their users and adapt to improve interactions. Sensor algorithms for these devices must be founded on power-conscious architecture, and a sound understanding of the behavior of target users.

References

1. Vladimir Kulyukin, “Toward Comprehensive Smartphone Shopping Solutions for Blind and Visually Impaired Individuals,” Computer Science Assistive Technology Laboratory, Department of Computer Science, Utah State University, Logan, UT, Rehab and Community Care Magazine, 2010.

2. H. Shen and J. Coughlan, “Towards A Real-Time System for Finding and Reading Signs for Visually Impaired Users,” 13th International Conference on Computers Helping People with Special Needs (ICCHP ’12), Linz, Austria, July 2012.

3. James Steele, “Understanding Virtual Sensors: From Sensor Fusion to Context-Aware Applications,” Electronic Design Magazine, July 10, 2012, http://electronicdesign.com/article/embedded/understanding-virtual-sensors-sensor-fusion-contextaware-applications-74157.

4. As discussed in “You make MEMS. Should you make sensor fusion software?” Meredith Courtemanche, blog entry, Solid State Technology Magazine, May 25, 2012, www.electroiq.com/blogs/electroiq_blog/2012/05/you-make-mems-should-you-make-sensor-fusion-software.html.

Ian Chen is executive vice president at Sensor Platforms Inc. Contact him at [email protected].

August 6, 2012 – PRNewswire — A “leading semiconductor technology innovator” ordered Qcept Technologies Inc.’s ChemetriQ 5000 non-visual defect (NVD) inspection system for unit process development and process integration activities for advanced nodes, including 2Xnm and 1Xnm logic nodes for both front-end-of-line and back-end-of-line processes.

Applying the ChemetriQ 5000’s metrology capabilities to unit process development and process integration provides new insight into the surface characteristics of the wafer after a single process step, as well as how those surface characteristics evolve through an integrated process flow. NVD inspection enables leading-edge fabs to identify yield-loss-inducing issues that do not match any physical defect data.

Also read: Impact of charge during gate oxide patterning on yield, by Jungtae Park, Samsung Electronics Co.; Sungjin Cho and Jeff Hawthorne, Qcept Technologies Inc.

Qcept’s ChemetriQ platform is being adopted in critical processes for inline, non-contact, full-wafer detection of such NVDs as sub-monolayer organic and metallic residues, process-induced charging, and other undesired surface non-uniformities that cannot be detected by conventional optical inspection equipment.

"The ability of the ChemetriQ 5000 to inspect any wafer at any layer at any time without requiring a change in recipe makes it uniquely suited for the type of advanced process development and integration work that this customer is doing," stated Robert Newcomb, executive vice president of Qcept Technologies. The customer, headquartered in North America, was not named in the release.

Qcept Technologies delivers wafer inspection solutions for non-visual defect (NVD) detection in advanced semiconductor manufacturing. More information can be found at www.qceptech.com.
