
A WaferNews Staff Report

It’s widely acknowledged that 2002 will be essentially flat for the equipment industry, with real recovery not coming until the second half. But those in the field are already seeing some early signs of life.

“Some customers are asking for tools quickly. That’s a good sign. Business may be turning,” said Steven Lindsay, VP of corporate marketing, Lam Research, Fremont, CA.

Varian Semiconductor Equipment Associates Inc. (VSEA), Gloucester, MA, has had similar queries on how quickly orders could be filled (should they be made), according to President and COO Ernest Godshalk, who’s seen other early signs, as well.

“Some of the tangible things like demo activity have been picking up. Our training backlog has been pushed out as customers have started training more people, and we’ve seen a few pop-up orders,” said Godshalk. “This is hardly a recovery in terms of orders, but there’s definitely been a reawakening.”

There have been increased discussions with customers – some initiated by VSEA, some by the customers – on ways to get guaranteed lead times, including the idea that a deposit of sorts would be put down on equipment to ensure a quicker delivery after an order is placed.

Lindsay expects this to be an up quarter for Lam, but it is too early to know what will happen the rest of the year, although his sense is that there will be a continued pickup in tool orders. He sees demand rising at logic foundries and flash memory makers, with cell phone inventories depleted. There is also action in China, although it is still a small part of the world market. The market is still being led by Asia – Taiwan, Korea, Singapore, and now China – but Japan’s outlook remains bleak, Lindsay reported.

Godshalk agreed that DRAM and flash are both showing initial signs of recovery, and VSEA is hearing from manufacturers of both. Geographically, however, most of the activity VSEA is seeing is in the US and Korea. Japan, again, is showing the least signs of life, while Taiwan and Europe are somewhere in the middle, according to Godshalk.

Lindsay said the majority of buys that Lam is seeing are for upgrades to 0.13-micron and 0.10-micron, except in China, but the Chinese fabs are buying new equipment, rather than used, at present.

Still, Lindsay believes 2002 will be a down year for equipment, compared to 2001. There are only seven fabs buying 300mm equipment, Lindsay added, and he expects this to be 35 to 38% of Lam’s business this year, compared to 25 to 27% in 2001.

Godshalk said VSEA’s seen some increase in capacity buys – particularly for 200mm.

As far as the rest of the year, “That’s the $64,000 question,” said Godshalk.

“If I go out on a long limb, I think what we’re going to see is an increase in capacity buys and technology buys. Clearly customers are interested in advanced technology for ultra-shallow junctions. Customers are also interested in capacity buys at 200mm for newer tools and 200mm for older tools, as well.”

In addition to signs of life from equipment makers on the front lines, some hints of activity from Asia may signal the recovery slated for late 2002.

Air shipments from Japan to Asia increased 2.1% in January compared to January of 2001, the first year-on-year increase in 14 months, according to Japan’s Air Freight Association, apparently largely driven by electronics plants in Taiwan and China stepping up production for the US PC market. Some 70% of these air shipments are semiconductors and related electronic goods. Shipments to China were up 37%, to Taiwan 16%. A source from the trade group told the Nihon Keizai Shimbun that he was even starting to see some contracts to ship semiconductor production equipment, which has not moved at all for some time.

There’s action at Taiwan’s assembly and test houses as well, where sales were up in January, not just over December, but over January of 2001 – the first year-on-year increase since last March. Sales at Siliconware Precision Industries (SPIL) increased 6% in January year-on-year. Sales at the parent ASE were up 1% from a year ago, though total ASE group sales, including its testing business, were still lower than last year’s levels.

The companies saw increased demand from foundry customers TSMC and UMC, especially for higher-priced fine-pitch packages. SPIL’s packaging plants are currently running at 70 to 75% capacity, its test facilities at 60 to 65%, but its capacity for high-end, high-pin-count BGAs is close to full use, and the company has started to buy wire bonders again. This quarter it is raising $180 million from a convertible bond issue and is likely to continue investing more aggressively, according to Deutsche Securities analyst Yasuo Nakane at Nikkei Microdevices Online.
(By WaferNews editors Bob Haavind, Matt Wickenheiser and Paula Doe)

by Robert P. Donovan


No one would consider manufacturing many of today's precision products without a cleanroom. Difficult as it may be to believe, there was a time when cleanrooms were not used in fabricating semiconductor chips. Jack Kilby did not build the first integrated circuit in a cleanroom.

The first design of a filtered, well-controlled airflow to achieve improved air quality in a workspace was led by Willis Whitfield at Sandia National Laboratories in the early 1960s.

This development was in direct response to the need to produce components with greatly improved long-term reliability for use in the nation's weapons program. Sandia built a prototype cleanroom in 1961 that featured what was called laminar airflow between the inlet and outlet ducts of the protected space. The idea was that the makeup air entering through filters establishes a flow of clean air that sweeps any contamination created within the workspace into the exit ducts and away from product surfaces.

While Whitfield is now universally credited with developing the first cleanroom, no patent application was ever filed. I've heard many sighs around Sandia concerning this oversight.

The next step in developing the cleanroom concept grew through the actions of a Federal committee initially organized by Sandia workers. With the approval of the General Services Administration (GSA), this group prepared the first Federal Standard 209 (Fed-Std-209), dated December 1963.

It was during the early 1960s that awareness was building concerning the sensitivity of semiconductor structures to contaminants. This was forcefully illustrated by the experience of Bell Labs' Telstar I satellite.

After a period of successful operation in space, this satellite went dead for no apparent reason. Shutting off the electrical bias on the transistors and then repowering restored operation.

The explanation was that electrically charged surface impurities (ions) had drifted along the transistor surfaces under the influence of their operating voltages, creating accumulations of surface charges adjacent to the electrodes. This charge buildup, in turn, induced a surface inversion layer in the transistor collector region, much like the action of a gate voltage in controlling the source-drain current of today's MOS transistors.

In the Telstar transistor, however, the inversion layer acted as a shunting path, shorting or severely degrading the collector-base junction. Removing the transistor voltages eliminated the surface electric fields and allowed the ions to redistribute themselves more uniformly, so that the inversion layer disappeared, as did the junction-shorting path it created.

This experience foretold the importance of controlling surface alkali ions in MOS technology, as detailed in seminal publications by the research team of Grove, Deal, Snow and Sah at Fairchild Semiconductor Inc. in the mid-1960s. These publications and others made it clear that successful semiconductor manufacturing required new cleanliness standards and practices and that the existing specifications of what constituted an acceptable environment for semiconductor manufacturing needed significant upgrading.

The nascent cleanroom technology under development within the federal establishment thus proved most timely.

In subsequent years, GSA assigned the task of updating Fed-Std-209 to the Institute of Environmental Science and Technology (IEST), an organization representative of a broad spectrum of industrial activities concerned with clean workspaces. Five updated versions followed, the latest being Fed-Std-209E, September 1992. Since then, the IEST has led an international group in preparing international contamination standards, including those specifying cleanroom classification and verification procedures. With the issuance of the international standards, ISO 14644-1 and -2, in the late 1990s, the GSA, earlier this year, officially recognized them as the rightful successors to Fed-Std-209E, signaling that the 209 saga has ended.


Robert P. Donovan is a process engineer assigned to the Sandia National Laboratories as a contract employee by L & M Technologies Inc., Albuquerque, NM.

Food bacteria are here to stay, but that doesn't mean they can't be minimized

Mark A. DeSorbo

CHICAGO, IL-Zero risk is not a reality.

That's the gist of “Emerging Microbiological Food Safety Issues: Implications for Control in the 21st Century,” a report compiled by 21 scientists at the Institute of Food Technologists (IFT).

The authors of the report say the bacteria that cause food poisonings are not going away, despite gallant efforts to eliminate them. Consumers, they say, should be aware that new germs arrive in imported foods and bacteria already here morph into new and more virulent forms.

“Foodborne illness in the United States is a major and complex problem that is likely to become a greater problem as we become a more global society,” according to the report.

That means there will always be a risk, says Frank Busta, a professor emeritus at the University of Minnesota and one of the authors who penned the IFT report. “And that's very true of every risk in our lives,” says Busta, a past president of the food technologists' organization. “It could be a high risk or it could be a low risk. There is some risk, but no risk is not a reality. Our aim in food-safety programs is to reduce that risk.”

Measurable minimization

Reducing risk, he says, means setting measurable public health goals for allowable contamination.

“Let's say, hypothetically, that there are 1,000 Listeria (microbes) per gram of food, but we decide we should have no more than 100 Listeria per gram of food at consumption,” Busta says. “That, to the best of our knowledge, is a non-infectious dose for someone whose immune system is not compromised. Therefore, we are setting a measurable goal of reducing the risk tenfold, but not eliminating it.”
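Food microbiologists usually express such reductions on a log scale, where a tenfold cut is a "1-log" reduction. The arithmetic behind Busta's hypothetical can be sketched in a few lines of Python (the function name is illustrative, and the counts are his hypothetical figures, not regulatory limits):

```python
import math

def log_reduction(initial_count: float, final_count: float) -> float:
    """Log10 reduction achieved: 1,000 -> 100 microbes/gram is a 1-log cut."""
    return math.log10(initial_count / final_count)

# Busta's hypothetical: 1,000 Listeria per gram in the food, a target of
# no more than 100 per gram at consumption.
print(log_reduction(1000, 100))  # 1.0 -- a tenfold (1-log) reduction, not elimination
```

The same function shows why "zero risk is not a reality": driving the final count to zero would require an infinite log reduction, so safety programs instead target a finite, measurable cut.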


Scientists say humans are both the hosts and the target of food pathogens, accounting for two of the three factors in foodborne illnesses. The third factor is the actual pathogen. By minimizing potential risks that exist with all three factors, incidents of sickness and even death can be minimized.

The Listeria monocytogenes bacterium is a pertinent example because the report indicates that it is so common in the environment, it is “practically impossible” to keep food entirely free of it.

Listeria contamination, Busta says, is frequently an environmental problem. “It grows on the ceilings, the walls, the floors, and that's why contamination control is extremely important,” he says. “Many foodborne illnesses are the direct result of incomplete or inadequate cleaning.”

Cleaning programs in a food-processing environment may be comprehensive, but they don't mean anything if they aren't executed correctly. “Sometimes organizations don't give enough emphasis to that aspect or pay their personnel enough to do it,” Busta adds. “They need to elevate the cleaning crew to the elite, and pay them the same as their production people. It's that kind of emphasis that gives you a good program.”

Fellow report author Douglas Archer, a professor of food science and nutrition at the University of Florida, Gainesville, agrees.

“There is no such thing as too clean,” he says. “The role of cleaning personnel is critical to the health and the success of the company. Cleanliness is so important in preventing cross-contamination.”

Bantering bacteria colonization

Archer, who served as deputy director for the U.S. Food and Drug Administration's (FDA) Center for Food Safety and Applied Nutrition (CFSAN) for 20 years, noted that the IFT report also discusses the importance of learning more about biofilms: stable, hard-to-kill colonies of microorganisms that can live and thrive on any surface.

“It's amazing how little we know about biofilms,” he says. “Many of these biofilms can live on stainless steel surfaces and cannot be killed by antibacterial agents. They are impregnable to chlorine and disinfectants.”

Bacteria can not only become resistant to some cleaning agents, but to antibiotics as well, and IFT scientists warn against overusing drugs in livestock, saying that it is causing microorganisms to become impervious and resilient.

“The widespread use of antibiotics in animal production and in the treatment of human illness facilitates the emergence of antibiotic resistance,” the report says. “The selective pressure caused by antibiotic administration causes the microbial populations that harbor the appropriate resistance determinant(s) to flourish. These antibiotic-resistant microbes can make their way into humans through contaminated food or animal-to-human transmission.”

Whether it's within an animal or a clean environment, Archer says “organisms are under stress and can change, and there's no telling if the outcome will be a good thing or a bad thing.”

Scientists also say the increasing use of manure as a fertilizer poses the risk of spreading harmful bacteria to food, either by contaminating irrigation water or by coming into contact with crops.

Manure, which harbors bacteria such as E. coli O157:H7, campylobacter and salmonella, substitutes for chemical fertilizer on organic and conventional crops. In some foreign countries, chicken manure is fed to farm-raised shrimp.

The report also raises concerns about federal regulations on imported fruits and vegetables and the potential for new pathogens getting into the country. According to the report, Cyclospora cayetanensis came to the United States through imported produce and rare forms of salmonella also have been appearing in the country.

“Certainly, you can grow produce that is free of pathogens in developing countries. It's just a matter of sanitary practices and the quality of water that is used for irrigation,” Michael Doyle, a University of Georgia microbiologist who assisted in the study, told The Associated Press.

The FDA inspects less than two percent of imported fruits and vegetables, while major supermarket chains, worried about new outbreaks of salmonella and other bacteria, have started requiring domestic and foreign produce suppliers to be inspected by private firms.

From field to, er, digestion

According to the report, better monitoring of foodborne illnesses is needed to spot trends and identify causes. For example, doctors too often treat patients for food poisonings without reporting the illnesses to public health authorities or ordering tests to determine the exact causes. This lack of reporting could mean that government agencies and food companies may not be aware of new pathogens or dangerous products.

To fully combat that problem, Busta recommends another approach other than the industry slogan of providing safe food “from farm to fork.” Although he admits the humor in his recommendation, Busta says that viewpoint must be broadened to provide safe food from “field to flatulence.”

“If we have new nasty organisms coming up, we have to consider how we clean, how we eat, how we change what we eat and is the population susceptible,” he says. “We may be ill for some other reason that makes us susceptible. A person receiving chemotherapy treatments has to be careful about how their food is prepared. Various things can knock out our immune system, and as a result of that, we are far more susceptible. You have to involve the whole process, including the human being and the organisms within their digestive tract.”


*Early times - Heating/cooking: Cooking foods kills many foodborne pathogens.
*1770s-1800s - Canning/thermal processing: Significant discoveries in response to industrialization and Napoleon’s armies’ need for less dependence on local provisions.
*1890s - Pasteurization: Thermal treatment of raw milk to prevent it from transmitting pathogens.
*1920s-1930s - Safe canning/processing parameters: Calculation of the product heat penetration curve and initial microbial contamination level to determine the minimum time-temperature combination for commercial sterility.
*1940s - Freezing: Mechanical quick-freezing methods to preserve food while maintaining quality.
*1950s - Controlled atmosphere packaging: Reduced oxygen levels, increased concentrations of carbon dioxide or selective mixtures of atmospheric gases to limit respiration and ethylene production, delay ripening and decay, and increase refrigerated product shelf life.
*1960s - Freeze drying: Rapid deep freezing followed by sublimation of water by heating the frozen product in a vacuum chamber. Best known for its application to coffee, preserving delicate aroma compounds and maintaining flavor and color.
*1940s-1990s - Aseptic processing and packaging: High-temperature, short-time sterilization of the food product independent of the container, with container sterilization and filling of the product in a sterile atmosphere, resulting in increased food quality and nutrient retention.
*1940s-1990s - Irradiation: Non-thermal process to kill pathogens, insects and larvae, inhibit sprouting, and delay ripening and food spoilage.
*1990s - Carcass spray treatments (e.g., water, acid), steam vacuuming, steam pasteurization: Carcass decontamination interventions to meet biological performance criteria.
*1990s - High-pressure processing: Food subjected to specified pressures and temperatures to preserve it while maintaining quality.

The evolution of food processing

Changes in how foods are processed, such as leaving out salt, can inadvertently lead to new safety problems by making food more hospitable to bacteria, or by causing the bacteria to evolve into hardier forms.

The report cites how yogurt manufacturers started replacing sugar with an artificial sweetener, which led to the growth of the bacteria that causes botulism. The sugar had been binding water in the yogurt, making it difficult for the bacteria to grow. The yogurt was then reformulated to eliminate the problem.

“There are a lot of complicated factors resulting in foodborne illness,” says Jenny Scott, senior director of food safety programs for the National Food Processors Association (NFPA; Washington, DC). “You can focus in on one aspect, but things change. You think you are licking them, but something else pops up.”

To Busta and Archer, however, the new things that pop up are the direct result of human beings.

“We, the human being, are two of the three factors in foodborne illnesses,” Busta says.

Archer says outbreaks and illness are preventable and are most likely caused by “a lapse in human performance.”

“It comes down to someone simply made a mistake and did not observe basic food-safety principles,” he adds. “Just getting people to wash their hands would reduce foodborne illness by a huge percentage.”

The IFT's report, “Emerging Microbiological Food Safety Issues: Implications for Control in the 21st Century,” is available on the organization's Web site at www.ift.org.

March 28, 2002 — The merger of Acer Inc. with Acer Sertek took place yesterday, completing another step forward in Acer’s ongoing restructuring effort.

The English name remains Acer Inc., with 3,800 employees and capital of NT$19.1B. Acer Inc.’s core businesses are sales and marketing of IT products, e-enabling services, and holding and investment businesses; the company aims to become a world-class services company. Stan Shih is the chairman, while J.T. Wang is the president of Acer Inc.

Since the restructuring announcement in December 2000, Acer has undergone many changes, starting with the OEM business separating from the branded business to become Wistron Corp., and Acer Inc. transforming into a marketing and services company. Following that, the former Acer Communications & Multimedia changed its corporate identity to BenQ, and now Acer Inc. has merged with Acer Sertek to form a new Acer.

The new Acer Inc. consists of five divisions: IT product business headed by Jim Wong, e-Enabling services headed by James Chiang, global business management headed by T.Y. Lay, Greater China operation headed by Scott Lin, and holding and investment business led by Stan Shih.

By Richard Acello
Small Times Correspondent

SAN DIEGO, March 12, 2002 — The telecom slump is leading many companies into a vicious spiral: Carriers are reluctant to purchase equipment from suppliers. But those suppliers, who are also financially strapped, are selling equipment that can reduce costs and increase competitiveness for the carriers.

Even so, startups with venture capital continue to innovate and come out with new products. A case in point is Newark, Calif.-based LightConnect Inc. The manufacturer of MEMS-based, fiber optic components will bring its new Dynamic Channel Equalizer (DCE) to the Optical Fiber Communications Conference and Exhibit, opening March 17 at the Anaheim Convention Center in California.

The DCE is a module roughly 4 inches wide and 8 inches long, powered by a MEMS chip of about 3 millimeters. Using the company’s core diffractive MEMS technology, company execs say the DCE has the ability to control the power of more than 100 wavelengths in one module, eliminating the need for a multitude of components to control each individual wavelength.

“Typically, in one fiber you have between 80 and 100 wavelengths for long haul voice or data transmission and the first thing you do before you process the wavelength is to separate them and direct the traffic and you need attenuators,” said Yves LeMaitre, vice president of marketing for LightConnect.

The DCE can control and attenuate the wavelengths at a fraction of the cost compared to existing optical network architecture, said LeMaitre, and with a fourfold reduction in size and power. It can be used in optical multiplexers, wavelength modules and in optical switching applications. The DCE is also software configured, LeMaitre said, so that wavelengths can be selected for allocation to output ports.

LightConnect’s current product line includes a fast variable optical attenuator and a dynamic gain equalizer. LeMaitre said the company has a customer base of more than 40 vendors for these products, which are manufactured using standard CMOS technology.

INVESTORS SPRING FOR SECOND ROUND

Founder David Bloom developed the diffractive MEMS technology while he was a professor at Stanford University.

Using the MEMS component, said LeMaitre, “we can select individual wavelengths and either attenuate them or block them. The MEMS component creates a diffractive effect in the light path to eliminate certain channels or to attenuate (them).”

The DCE development has been funded by an undisclosed “large system vendor” and “we already have orders for systems,” LeMaitre added. Sample quantities of the DCE will ship in the second quarter.

LightConnect may be bucking the market with its product introduction, but so were Incubic LLC, Sevin Rosen Funds and Morgenthaler Ventures, among others, who beamed $15.8 million into LightConnect in a July 2001 second round of financing. The company’s $8.4 million first round, in June 2000, was led by Sevin Rosen and Incubic.

LightConnect has 58 employees, and though LeMaitre concedes that “we are feeling the slowdown of the economy,” he said “our burn rate is low so we are in good shape.”

LightConnect’s revenue target for 2002 is between $5 and $10 million.

Though some analysts maintain that MEMS companies like LightConnect were born too late to prosper in the telecom spending spree of the late ’90s, Marlene Bourne, senior analyst for Cahners In-Stat, disputes that view.

“I don’t believe that,” she said. Though the market for telecom vendors is showing “ghastly sales” overall, Bourne said, “there are companies that are buying in Europe and Japan.”

Bourne said the fact that investors were willing to stake LightConnect to a larger second round in a bear market “bodes well” for the startup.

Lawrence Gasman, president of Communications Industry Researchers, based in Charlottesville, Va., said companies like LightConnect are always ahead of the technology curve.

“It’s important to understand the value chain,” he said. “Say you bring out a product at the (trade) show. Someone says, ‘good product’ and later they say, ‘we’d like to design it into our system.’ First, you get the design win, then the business.”

On the wisdom of bringing out product now, during a telecom slump, Gasman said: “You could unexpectedly get revenue from bringing out new product now, versus zero revenue if you don’t.” Even with the slump, “it’s not like (telecom supply) is a tiny market.”

In any event, Gasman said he expects signs of growth in the telecom sector as early as the third quarter this year: “So if (LightConnect) is going to catch the next tech wave, their timing is about right.”

Related News
Dispatch from the telecom shakeout: Auction sells remnants of MEMS firm

By Rachel Robinson
WaferNews Associate Editor

Semiconductor technology is a key driver not only for the US economy but for the world’s as well. Yet the future of the industry may be in peril if the US government doesn’t step up to the financial plate, according to the Semiconductor Industry Association (SIA).

If the US is going to keep on pace in terms of technological growth, the federal government has to provide additional funding for university research for the physical sciences and engineering, SIA believes.

According to SIA, from 1993 to 1998, federal funding for key disciplines declined considerably. Funding for math and physics research declined some 20% during that time, funding for chemistry dropped about 10%, and funding for some fields of engineering dropped between 20 and 40%.

The danger here, Juri Matisoo, VP of technology at SIA, told WaferNews, is that a decrease in funding brings a decrease in professors and students. That decrease leads to a smaller pool of qualified professionals, which in turn translates into a lack of the university-based research that is often the basis for the future of information technology.

Staying in line with Moore’s Law has led to faster US economic growth, greater productivity, higher federal budget surpluses, and the creation of higher-paying high-tech jobs. But as technology advances and new technical challenges emerge, SIA insists that current levels of government funding are not adequate.

The group believes that within the next six years, the semiconductor industry will be facing technical challenges for which there are no known solutions. With that knowledge comes the conviction that federal funding for research is imperative.

“I don’t think that the funding has ever been sufficient,” Matisoo said. “SIA has really put its efforts behind getting Congress to increase funding for universities. If you look at federal funding, the budgets for institutes of health have gone up. On the other hand, the physical sciences have been left behind, with budgets decreasing over five or more years.”

Matisoo told WaferNews that the Clinton administration increased the National Science Foundation (NSF) budget by 10% during its last two years in office. During the first year of the Bush administration, an increase of only 1% was proposed, but, after time spent lobbying Congress, it went up by 7.5%.

“Is the 7.5% increase enough?” Matisoo asked. “I doubt it.”

With the physical limits of semiconductor performance approaching, the latter stages of the ITRS roadmap would be affected by the lack of funding to university research. “That’s where the revolutionary ideas come from,” Matisoo said.

According to SIA, a loss of international leadership in semiconductor technology would be economically damaging, and would hurt the ability of the US to provide for national security. With that in mind, SIA believes that with cooperation between the US government and the chip industry, it can be ensured that the basic physical science and engineering needed to drive progress, and the economy, will be in place.

SIA supports the following initiatives:

*Multiyear funding for university-based research in advanced microelectronics by the defense department;

*A tripling of funding for the physical sciences: chemistry, materials science, physics, mathematics, electrical and communications systems engineering, computer and information science and engineering programs, and related engineering research centers;

*$5 million for work at NIST for measurement technology at the nanometer level.

Additionally, a top priority for SIA is multiyear funding for the Office of the Secretary of Defense’s Government-Industry Co-sponsorship of University Research (GICUR) program, which it hopes will receive $9.2 million in FY02, up from FY01’s $6.7 million. Currently, four semiconductor-related centers are under the GICUR program, with activities at 22 universities nationwide. The centers are the Gigascale Silicon Research Center/Design and Test at UC-Berkeley; the Interconnect Focus Center at Georgia Institute of Technology; the Materials, Structures and Devices Focus Center at MIT; and the Circuits, Systems, and Software Focus Center at Carnegie Mellon University.

According to SIA, the funding will allow MIT and Carnegie Mellon to reach planned levels and experience growth in their programs.

SIA also recommends that the Future Years Defense Program include sufficient funds to allow the focus center program to grow as originally planned in 1999. The budget calls for $10 million in 2002, $13 million in 2003, and $15 million in each of 2004, 2005, and 2006.

The long-term increase in the Future Years Defense Program should be within the context of the larger increases in defense R&D, according to SIA. In FY02, the Department of Defense will receive some $49.2 billion in funding, with a proposed increase to $54.5 billion in FY03.

SIA also supports the National Nanotechnology Initiative, and believes that funding should be increased from the FY01 level of $446 million to $579 million in FY02 and $679 million in FY03. According to published reports, President Bush’s budget, which has been sent to Congress, requests 17% more, or $679 million, for the federal National Nanotechnology Initiative.

Additionally, SIA supports a doubling of the NSF budget, from $4.4 billion in FY01 to $8.8 billion in FY06.

WaferNews

By Avi Machlis
Small Times Correspondent

JERUSALEM, March 1, 2002 –Israeli scientists have found a way to use nanocrystals to turn polymers, or plastics, into devices capable of conducting optical telecommunications, a breakthrough that could pave the way to a new breed of optical components.

The discovery, reported in the latest issue of Science magazine, was made by Professor Uri Banin, a director of the nanotechnology center at the Hebrew University of Jerusalem, and Nir Tessler, from the electrical engineering department at the Technion-Israel Institute of Technology. Although the researchers admit practical applications are far off, optics industry experts say the discovery is important because materials for optical communications have been limited to a small pool of specific inorganic semiconductors.

Illustration courtesy of Nir Tessler and Uri Banin: A breakthrough in nanomaterials for optical components could enable information to stream directly to end users at unprecedented rates via fast fiber-optic connections.

In recent years, plastics have been used to create light-emitting diodes (LEDs) that emit visible light for a variety of products. But materials scientists were not able to get them to work in the range of 1.3 microns to 1.5 microns, the near-infrared wavelengths used for optical telecommunications.

“We found a way to extend the optical activity to that range by combining nanocrystals with conjugated polymers,” said Banin. “What we have made is a prototype for a very rudimentary device. It provides alternative means of making electronics.”

Banin said that commercial applications for the patent-pending prototype are not around the corner. However, as the optics industry moves toward delivering high-speed data links to households, he believes applications will emerge for cheap devices needed for achieving optimal data transmission speeds in home electronics.

The technology, a device about 100 nanometers thick that uses nanocrystals of indium arsenide, still needs to be improved. Its efficiency (the number of photons emitted divided by the number of electrons entering) reaches only 2 to 3 percent. This is still better than previous polymer-optics attempts, which yielded efficiencies of 0.01 percent, but not enough for commercial applications. The scientists hope to ratchet the efficiency of their device up to as much as 30 percent in forthcoming research.
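The efficiency figure is a simple ratio of photons out to electrons in. As a rough illustration (the photon and electron counts below are hypothetical; only the percentages come from the article), the comparison can be sketched as:

```python
# External quantum efficiency = photons emitted / electrons injected.
# The counts are illustrative only; the percentages are the ones quoted
# in the article (0.01%, 2-3%, and a 30% target).

def quantum_efficiency(photons_out: float, electrons_in: float) -> float:
    """Return efficiency as a percentage."""
    return 100.0 * photons_out / electrons_in

# Earlier polymer-optics attempts: ~1 photon per 10,000 electrons
early = quantum_efficiency(1, 10_000)        # 0.01 (percent)

# The Banin/Tessler prototype: e.g. 250 photons per 10,000 electrons
prototype = quantum_efficiency(250, 10_000)  # 2.5 (percent)

# The researchers' stated goal
target = quantum_efficiency(3_000, 10_000)   # 30.0 (percent)
```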

In the future, producing devices based on the nano procedure could prove extremely cost-efficient because it may be possible to use standard ink-jet printing technology to construct plastic-based optical devices.

Before the discovery, there were only a limited number of materials that could be used for optical communications. John Prohaska, director of technology and research at the Center for Advanced Fiberoptic Applications, a Massachusetts-based industry consortium, said the discovery is an “important development” for optical communications. “It could conceivably give a price advantage or a performance advantage,” he said.

Moti Margalit, chief technology officer of Lambda Crossing, an Israeli optical component startup, agreed that the development is a breakthrough for the optics industry but said the benefits of future applications may not necessarily be price-related.

“When we look forward we are seeking a platform that allows us to integrate both optics and plastics in a cheap and reliable way,” he said. “Since these materials can be easily applied over a variety of substances and substrates they open up new possibilities for integration.”

Feb. 11, 2002 – Cambridge, MA – Nanosys Inc. announced the publication of a landmark paper in the Feb. 7 issue of Nature, entitled “Growth of Nanowire Superlattice Structures for Nanoscale Photonics and Electronics.”

Dr. Charles Lieber, Nanosys co-founder and the Mark Hyman Professor of Chemistry at Harvard University, authored the paper along with graduate students Mark Gudiksen, Lincoln Lauhon, Jianfang Wang, and David Smith. The paper demonstrates general methods for controlling the composition and charge of multiple materials on a single nanostructure.

The researchers take advantage of this flexible control of multiple variables to build novel devices in electronics, photonics, and opto-electronics. Lieber’s work is purported to be the first example of a generic method for creating heterojunctions of arbitrary materials in a single nanostructure.

In addition, Lieber’s technique allows the modulation of doping levels along the length of a single nanowire. This ability to control charge and composition on the nanoscale enabled the Lieber group to make a variety of unique prototype devices, such as GaAs emitters embedded in GaP wires, novel n-Si/p-Si nanowire diodes for high-density nanoscale logic, and nanometer-sized LEDs derived from alternating segments of n-InP/p-InP on a single nanowire.

This robust synthetic method allows for the creation of superlattice heterostructures of materials that cannot be interfaced in macroscopic structures. This could open the door to the discovery of new materials and devices for a semiconductor industry that faces the impending limit to Moore’s law.

The new process developed in the Lieber laboratory allows two different materials to be placed next to each other on the same nanometer-sized semiconductor crystal in a manner that creates a perfect electrical junction (defect-free). The Nature paper showcases the power of this technology by developing several novel prototype devices enabled by these new materials.

In the first set of experiments, the researchers controlled the growth of segments of light emitting GaAs spaced by non-emitting GaP segments on a single nanowire. These nanowires functioned as photonic devices with emission of light from the GaAs segments only when excitation light was directed along the nanowire. In addition to controlling the incorporation of different materials, the group of researchers also demonstrated that the length of each segment could be controlled precisely.

By varying the lengths of the GaP and GaAs segments, a nano-barcode was created that could be read by measuring the pattern of the fluorescence along the length of the wire. Just like a grocery-store barcode, in which a pattern of different-thickness lines is used to uniquely identify an item, these unique structures could potentially form the basis for a nano-encoding system. Other applications of this type of precise control fall in areas of photonics, electronics and quantum computing.
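The barcode scheme described above can be sketched in a few lines. The idea of bright GaAs segments spaced by dark GaP segments is from the article; the one-character-per-unit-length rendering and the particular segment lengths are illustrative assumptions:

```python
# A sketch of the nano-barcode idea: emitting GaAs segments fluoresce, GaP
# spacers do not, so the run-lengths of the bright segments along the wire
# act as an identifier, like the line widths of a grocery-store barcode.
from typing import List, Tuple

def fluorescence_profile(segments: List[Tuple[str, int]]) -> str:
    """Render a wire as a 1-D intensity map: '#' where GaAs emits, '.' where GaP is dark."""
    return "".join(("#" if material == "GaAs" else ".") * length
                   for material, length in segments)

def read_barcode(profile: str) -> List[int]:
    """Decode the code as the run-lengths of the bright (emitting) segments."""
    runs, count = [], 0
    for pixel in profile:
        if pixel == "#":
            count += 1
        elif count:
            runs.append(count)
            count = 0
    if count:
        runs.append(count)
    return runs

# Hypothetical wire: GaAs/GaP segments with controlled lengths
wire = [("GaAs", 2), ("GaP", 3), ("GaAs", 4), ("GaP", 1), ("GaAs", 2)]
profile = fluorescence_profile(wire)
print(profile)                 # ##...####.##
print(read_barcode(profile))   # [2, 4, 2]
```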

The general methodology of placing materials with different bandgaps in precise contact offers additional benefits, such as the ability to create individual wires with specifically engineered bandgaps. Other potential applications include the creation of both one-dimensional waveguides that are formed from the emitting material, and intrinsic cavities for nanowire lasers.

Finally, the researchers were able to use their new technology to create the most essential element in modern electronics within a single nanowire: a p-n junction.

In this demonstration, the researchers fabricated a batch of nanowires that were composed of half p-type silicon, half n-type silicon. The resulting interface produced a functional p-n junction on the nanoscale. In addition to applications in nanoelectronics, such as bipolar transistors and highly integrated nano-logic arrays, such devices could play a role in the creation of ultra-sensitive biological and chemical sensors for the detection of biowarfare agents among other sensing targets. P-n junctions in InP nanowires were also formed to create a single nanowire LED device.
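For context, the rectifying behavior that makes a p-n junction the essential building block of electronics is captured by the textbook ideal-diode (Shockley) relation. The sketch below uses generic textbook parameters, not values measured for these nanowire devices:

```python
import math

# Ideal (Shockley) diode equation: I = I_s * (exp(V / (n * V_T)) - 1).
# Parameters here are generic textbook values for illustration only.

def diode_current(v: float, i_s: float = 1e-12, n: float = 1.0,
                  v_t: float = 0.02585) -> float:
    """Ideal diode current (A) at bias v (V); v_t is the thermal voltage at ~300 K."""
    return i_s * (math.exp(v / (n * v_t)) - 1.0)

# Forward bias passes substantial current; reverse bias blocks almost all of it,
# leaking only about the saturation current i_s.
forward = diode_current(0.6)    # milliamp-scale current
reverse = diode_current(-0.6)   # ~ -1e-12 A, essentially zero
```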

Potential applications for a single nanowire LED include quantum cryptography and information processing. The ability to control charge in a binary format may lead to creative methods of designing assemblies of different nanowires to arrive at complex architectures.

“This work really sets our technology apart in the creation of superlattice structures that are varied by materials, doping modulation, and segment size control in heterostructures. In addition the excellent graduate students in my lab extended this basic materials development advance into the proof-of-concept stage by building three completely novel device applications in photonics, electronics, and opto-electronics. At the moment our laboratory is in the enviable position of deciding which type of new device to make, as the possibilities for materials development and device applications seem almost infinite at this point in time,” said Lieber.

“I am awestruck by how rapidly the development of nanowire technology is occurring. The Lieber technology and materials are truly a once in a lifetime advance that will only be completely appreciated once Nanosys translates this excellent scientific work into equally impressive products in the electronics, opto-electronics, and molecular sensing markets. The range of applications will force us to make some important decisions in terms of what technologies to develop internally and what applications to develop with partner companies,” said Nanosys President and CEO, Larry Bock.

By Richard Acello
Small Times Correspondent

Feb. 8, 2002 — University of California, Davis nanoscientist Alexandra Navrotsky recalls watching the light under her grandfather’s door. He was a civil engineer and young Navrotsky wondered about the “curiosities that keep lights burning after midnight.”

Her own curiosities about the nature of the earth recently led the Philadelphia-based Franklin Institute to award Navrotsky the prestigious Benjamin Franklin Medal in Earth Science for her work on thermochemistry, high-pressure materials and nanomaterials. The Franklin medals have been described as “American Nobels.” In fact, 98 Franklin laureates have gone on to capture Nobels, including Marie Curie and Albert Einstein. Orville Wright, Thomas Edison and Stephen Hawking have also won Franklins.

[Photo caption: Alexandra Navrotsky is studying nanoparticles in the search for a “geoengineering solution” to global climate change. It’s theoretically possible, but “the stakes are very high,” says a fellow scientist.]

Navrotsky’s research focuses on nanomaterials composed of small particles a few atoms in diameter. Compared to so-called bulk materials, nanomaterials have a high surface area for their volume so that nearly all of the atoms are on the surface. Navrotsky studies how the chemical and electrical properties in nanomaterials differ from larger structures and is a pioneer in the developing field of nanogeoscience.
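The surface-to-volume point can be made concrete with a back-of-the-envelope estimate. The spherical-particle model and the 0.25 nm atomic-shell thickness below are illustrative assumptions, not figures from Navrotsky's work:

```python
# Rough estimate of the fraction of atoms sitting in the outermost atomic
# layer of a spherical particle. Assumes a ~0.25 nm atomic shell; a purely
# geometric illustration of why nanoparticles are "nearly all surface".

def surface_fraction(particle_nm: float, atom_nm: float = 0.25) -> float:
    """Fraction of a sphere's atoms within one atomic layer of its surface."""
    r = particle_nm / 2.0
    if r <= atom_nm:
        return 1.0  # particle is so small that every atom is at the surface
    core = ((r - atom_nm) / r) ** 3  # volume fraction occupied by interior atoms
    return 1.0 - core

# A 2 nm particle is mostly surface; a 200 nm "bulk-like" particle is not.
print(round(surface_fraction(2.0), 2))    # ~0.58
print(round(surface_fraction(200.0), 3))  # ~0.007
```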

“In the geological realm, nanomaterials are important to understanding where all reaction happens in the earth,” she says.

For example, she explains, the behavior of nanosize dust may hold clues in understanding global climate change, the movement of pollutants in the atmosphere and the weathering of minerals.

At UC Davis, Navrotsky holds a chair in mathematical and physical science but divides her time among four departments: chemistry; chemical engineering and materials science; geology; and land, air, and water resources. She heads the university’s initiative on Nanophases in the Environment, Agriculture and Technology (NEAT), which includes 20 UC Davis faculty from a variety of departments. NEAT is funded in part by a grant from the National Science Foundation (NSF).

“Alex is amazing. She’s a powerhouse. There’s a lot of uncharted area of discovery in nanoscience,” says Professor Tony Wexler, a NEAT colleague whose specialties include mechanical and aerospace engineering, civil and environmental engineering and land, air and water resources.

Wexler’s NEAT research centers on understanding the role of nanoparticles in global climate change and their effect on everything from agriculture to the quality of air in pharmaceutical factories.

Nanoparticles, also called aerosol particles, are suspended in the air in varying quantities around the globe and are generally thought to be a force for global cooling. Carbon dioxide, on the other hand, is generally thought to be a force toward global warming, and is distributed evenly over the globe. “The spatial distribution of the heating and cooling effects is different than they used to be,” Wexler says. “And our understanding of these effects is very crude.”

A so-called “geoengineering solution” to global climate change is theoretically possible, but “the stakes are very high,” Wexler says.

Devices that measure aerosol particles have been commercially developed by companies such as St. Paul, Minn.-based TSI Inc., and Rupprecht & Patashnick Co. Inc., based in Albany, N.Y.

But, Wexler says, “We (UC Davis) are the only people in the world who can give you the size and composition of nanoparticles down to 10 to 20 nanometers.” Wexler says he has been in contact with parties that want to commercialize the research, but can’t disclose them due to patent considerations.

NEAT’s studies into nanoparticles and their effect on global climate change have obvious implications for agriculture, but also for clean rooms, for pharmaceutical factories producing aerosol powders (where workers could be exposed to the drugs they are producing) and for other occupational safety and health issues. NEAT’s work could also resonate in areas such as energy pricing and conservation.

The NSF is also funding Navrotsky’s work as a principal investigator for the Center for High Pressure Research, charged with studying the deep interior of the Earth and other planets.

Navrotsky has written more than 200 papers centered around the question, “Why does a given structure form for a specific composition, pressure and temperature?” The answer lies at the intersection of thermodynamic properties, material structure and chemical bonding.

Of her methodology, she says, “You start reading things, and your idea of what’s exciting keeps changing. It’s always evolutionary.”

Her laboratory is a 5,000-square-foot space equipped with high-temperature calorimeters that measure thermodynamic reactions, or the heat given off or absorbed, when a material is dissolved in a solvent. This is done by heating the material to its melting point and, when it dissolves, or changes from solid to liquid, measuring the difference in temperature.

The lab is populated by a research group of 25 with an annual budget of about $1 million. Though many of her friends are scientists, in explaining her work to lay people she might say, “I study how atoms love each other and how it affects the universe,” or simply “I count calories for a living.”

“She’s driven, has a great sense of humor, and her enthusiasm is contagious,” says Susan Kauzlarich, a professor of chemistry at UC Davis and a colleague in the NEAT program. Because of the interdisciplinary nature of NEAT with about 20 faculty, meetings of the entire group are rare, Kauzlarich says, but the group stays in touch through seminars, and of course, by e-mail. “We’re very proud of her,” says Kauzlarich.

Navrotsky traces her passion for geology to a childhood fascination with nature. “If the weather was nice, we’d go to our summer place outside of New York City,” she recalls, “and in the winter, we’d go to museums. I don’t know when I knew I wanted to be a scientist, but by the time I formed the question, I already had the answer.”

While her research is necessarily abstract, it’s not unusual for a colleague to contact her looking for advice on how to build a better light bulb. In fact, she says, her thermodynamic calculations apply to thermal technologies used by NASA. “Materials compatibility is critical to the performance of just about anything,” she says. “You don’t want John Q. Public yelling at you because your device failed in a humid environment.”

Ultimately, she hopes the emerging field of nanogeoscience will shed light on the movement of air and water pollution. Beyond that, she believes that understanding of reactions at the nanoscale could unlock mysteries about the origin of life.

“From an atom to a galaxy,” she reflects. “That’s what makes it fun. I want to understand the universe: don’t we all?”


Cool Chips 2002 Conference Lineup
Intertech has set the agenda for its power and thermal management development conference, “Cool Chips 2002 — Developing New Market, Material and Design Strategies for Thermal and Power Management,” which will be held April 3-4 in Monterey, California, at the Monterey Marriott Hotel. With packaging being a critical part of thermal management in electronic products, Advanced Packaging magazine is acting as the media partner for the event. The Conference Chair is John Baliga, an industry consultant formerly with Semiconductor International and Tru-Si Technologies.
The pre-conference seminar, “Future Directions of IC Packaging,” will be held on the morning of April 3. Led by Joseph Fjelstad (Pacific Consultants), it will cover the design, construction and key performance features of chip scale packages; their impact on future interconnection structures and on design standards now in development; and innovative future options in IC package design, such as wafer-level and folded and stacked chip packaging.
Conference speakers will address the industry’s “low power road map” and the changes in microarchitecture that need to be addressed to meet burgeoning market demand for computing power with reduced heat build-up and power loss. Presentations, question and answer sessions, and the panel discussion will also emphasize IC market trends, as well as the emergence of new semiconductor materials, innovative packaging technologies, system level architecture analysis, and novel thermal management techniques.
Jeffrey Demmin, Advanced Packaging‘s editor-in-chief, will present a talk entitled “Recent Advances in Thermoelectric Cooling.” He will review principles of thermoelectric cooling, present new molecular scale advances in the field and discuss implications of the breakthroughs.
For more information, contact Susan Brahms, Intertech Conferences, 19 Northbrook Drive, Portland, Maine 04105; 207-781-9800; Fax: 207-781-2150; E-mail: [email protected]; http://www.intertechusa.com/Cool_Chips/cool.html