Pushing the limits of light


07/01/2006







By David Forman, James R. Dukart and Elizabeth Gardner

From semiconductor lithography to the imaging of living cells, optical techniques are being challenged by alternative approaches. New technologies for stamping and writing nanoscale features are emerging for manufacturing. Researchers have long been using non-optical scanning techniques for peering into the nanoscale. But light, apparently, has a bright future.


Photolithographic masks like the one held here by a manufacturing technician won't be displaced by non-optical technologies anytime soon. Photo courtesy of SEMATECH


Immersion lithography poised to process future chips

It was just a few years ago that some pretty exotic forms of chip making were in contention to be the semiconductor industry’s next best bet. But now, analysts say, a modification of conventional optical lithography is going to be sufficient for at least a few more generations of chips, pushing off the semiconductor industry’s day of reckoning.

That has broad implications for nanotech, especially for the purveyors of technologies like nanoimprint lithography that are looking to cement their future on the semiconductor industry’s product development roadmap. It also affects toolmakers that provide the machines currently used for processing chips. When will Moore’s Law have to be amended? Experts say not for at least another five to seven years.

“The whole thrust is to be able to get photolithography at smaller and smaller dimensions,” explained Fred Zieber, a long-time tracker of the semiconductor industry who is the founder and president of Pathfinder Research, a market research firm in San Jose, Calif. “The point is to get features smaller than the wavelength of light that you’re working with. There is a phased progression down to 45, 32 and 22 nanometers, if possible.” But, he said, “To get there they have to solve a whole lot of problems.”


Pictured at left is an array of 29.9 nm wide lines and equally sized spaces created by IBM scientists using immersion lithography. These lines are less than one-third the size of the 90 nm features at right (same magnification) now in mass production by the microchip industry. They are also smaller than the 32 nm size that industry consensus held was the limit for optical lithography techniques. Image courtesy of IBM

The comparatively exotic routes all have major roadblocks. For extreme ultraviolet lithography, in which the lenses ordinarily used to focus ultraviolet light are replaced by mirrors, some of the challenges include creating those perfect mirrors and operating the process in a vacuum. E-beam lithography uses an electron beam rather than photons for processing. Nanoimprint is a stamping, rather than lithographic, process. And engineers across the semiconductor industry do not know any of these technologies nearly as deeply as they know optical lithography with 193 nm light - today's cutting edge.

Those difficulties explain why immersion lithography is poised to be the next leading semiconductor process. It has fewer problems. “This (immersion) is the one the industry is committing to,” said Klaus Rinnen, a managing vice president at market research firm Gartner Inc. who tracks semiconductor manufacturing. “In the near future, 193 nanometer immersion will allow an extension of the current infrastructure.” And, he added, he expects it to dominate.

The proof is in the numbers. By Rinnen’s count, two immersion devices were shipped in 2004 and 12 in 2005, and he expects between 20 and 25 more to ship in 2006.

“The first four to five were for R&D,” he said. But now the manufacturers - ASML and Nikon - are shipping second generation systems. By 2009 Rinnen said he expects the industry to ship more than 100 immersion lithography systems. That number would exceed the 97 non-immersion lithography machines, today’s standard, that shipped in 2005.


CAD renderings show ASML’s Twinscan XT:1700i, left, and Nikon’s NSR-S609B. Both lithography systems use immersion technology to produce smaller features than the current manufacturing standards. Image courtesy of ASML Image courtesy of Nikon

Immersion lithography uses the same 193 nm wavelength light as non-immersion lithography. However, with immersion lithography, water or another liquid fills the gap between the final lens and the wafer. Because the liquid's refractive index is higher than air's, the optics can achieve a larger numerical aperture, resolving smaller, more densely packed circuits with the same light source.
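
The arithmetic behind that gain is straightforward. The smallest printable half-pitch scales roughly as the Rayleigh expression k1 x wavelength / NA, and filling the lens-to-wafer gap with water (refractive index about 1.44 at 193 nm) lets toolmakers push the numerical aperture above the practical limit in air. Below is a rough sketch of that scaling; the k1 and NA values are illustrative assumptions, not figures from any particular tool.

# Illustrative back-of-the-envelope resolution estimate for 193 nm lithography.
# The Rayleigh criterion puts the smallest printable half-pitch at
#     R = k1 * wavelength / NA
# where k1 is a process-dependent factor and NA is the numerical aperture
# of the projection optics. The k1 and NA values below are illustrative
# assumptions, not figures from the article.

def half_pitch_nm(wavelength_nm: float, numerical_aperture: float, k1: float = 0.3) -> float:
    """Smallest printable half-pitch, per the Rayleigh scaling R = k1 * lambda / NA."""
    return k1 * wavelength_nm / numerical_aperture

WAVELENGTH_NM = 193.0   # ArF excimer laser, used by both dry and immersion tools
DRY_NA = 0.93           # roughly the practical limit with air between lens and wafer
IMMERSION_NA = 1.35     # achievable once water (n ~ 1.44 at 193 nm) fills the gap

print(f"dry 193 nm:       ~{half_pitch_nm(WAVELENGTH_NM, DRY_NA):.0f} nm half-pitch")
print(f"immersion 193 nm: ~{half_pitch_nm(WAVELENGTH_NM, IMMERSION_NA):.0f} nm half-pitch")

With these assumed values the dry case lands near the 65 nm node and the immersion case near 45 nm, which is consistent with the generational step the analysts describe.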

A big part of immersion lithography’s momentum, says Lawrence Gasman, principal of NanoMarkets LLC, a Glen Allen, Va., market analysis firm focused on emerging nanotechnologies, comes from the fact that it is an evolutionary, rather than revolutionary step. Although manufacturers will have to buy new equipment to stay ahead, all of the investment they have made in training and the development of institutional knowledge will continue to be valuable. The processes for making and cleaning masks, for example, as well as other common tasks, remain pretty much the same with immersion lithography, whereas more revolutionary alternatives will require wide-scale industrial retooling and retraining.

“Most of the companies would like to keep on doing what they are doing,” Gasman said. “If they can keep conventional optical lithography going for a few more years they will do it....Once the whole paradigm changes, all that experience goes out the door.”

In a sense nanoscale processing is following a path previously paved by nanoscale imaging. While optical microscopy is still widely used for looking at the micro world, peering and probing into the nano world is done with tools like atomic force microscopes and atom probes that take advantage of phenomena other than light. Non-optical techniques for nanoscale processing have likewise been around for decades and have been used to make individual samples of devices and prototypes. But, by definition, to be used for manufacturing, the process must become repeatable, cheap and reliable - a set of challenges that research tools for imaging and manipulation don’t have to meet at the same level.

It is still unclear to what scale immersion lithography will work as a production technique. The most advanced chips being produced today are made on a 65 nm scale while 90 nm processing is mainstream. According to the semiconductor industry roadmap, the next standard scales would be 45 nm, followed by 32 nm and then 22 nm.

Already companies are announcing new techniques. In February IBM announced that its scientists had created small, high quality line patterns only 29.9 nm wide using immersion lithography, comfortably under the 32 nm mark that many had previously considered the limit for optical lithography techniques.

Gasman says there is general agreement that optical lithography won’t get past 18 nm. Rinnen thinks immersion will be sufficient to get close - 22 nm. He and other experts say innovations will become more commonplace as more immersion lithography machines come online and more researchers and engineers have access to the technology and gain proficiency with it. The IBM research, by contrast, was done on a test apparatus designed and built at IBM’s research facilities.

Also in February Taiwan Semiconductor Manufacturing Co. (TSMC), the largest semiconductor foundry in the world, announced that it had produced semiconductor wafers within “acceptable parameters” for volume manufacturing using immersion lithography to create 45 nm features on 12-inch wafers.

The company characterized the test as a milestone toward production immersion lithography. Recently TSMC produced multiple test wafers with defect rates as low as three per wafer - better than other immersion results to date, and comparable to dry (that is, non-immersion) lithography results, according to statements by Burn Lin, senior director of TSMC’s micropatterning division. He said that now that the company understands the root causes of the defects, it can focus its attention on improving throughput for high-volume manufacturing.


Intel Corp. announced in January that it had produced fully functional test chips using 45 nm process technology. The company said it will eventually use the technology to make chips in Oregon, Arizona and Israel. Photo courtesy of Intel

A byproduct of immersion lithography's ramp-up will be that the developers of other technologies gain a reprieve. Since immersion will push back the demise of Moore's Law, the alternative approaches have more time to mature in parallel.

That could be a real boon to developers of the more revolutionary techniques. For example, says Pathfinder’s Zieber, right now extreme ultraviolet lithography is “different enough that the cost would be prohibitive.” But nobody knows what can happen with five to seven years of development. The same goes for e-beam, nanoimprint and other processing technologies.

Extreme ultraviolet lithography remains positioned as the most likely follow-on technology. For starters, it has the backing of Intel Corp., which has integrated the technology in its roadmap and which was one of the first companies to join an industry coalition promoting the technology. Intel has been active in developing the technology itself and has invested in other companies developing solutions for some of EUV’s problems.

In late January, for example, Intel announced an investment in Xtreme Technologies GmbH of Göttingen and Jena, Germany, along with a strategic development agreement. The company, which is a joint venture between a subsidiary of Jenoptik AG and Ushio Inc., is working on developing an extreme ultraviolet light source for photolithography. The development of such a source has been one of the roadblocks in the way of commercializing EUV lithography.

However, other companies have been slower to invest in the technology. And industry coalitions promoting new technologies have fallen apart before. A similar coalition devoted to promoting e-beam lithography as a next-generation mainstream production technology stalled out in 2001.

The lack of industry-wide support shouldn’t necessarily derail EUV, according to Gasman. “Intel, after all, is pretty influential in these things.” But, he acknowledged, there’s a flipside to that argument. “The business problem is if Intel wakes up one morning and decides it wants to do something else.”

The adoption of immersion technology will give the industry some time to decide. Of course immersion has its technical challenges too, the analysts say. Among the potential problems are bubbles and watermarks caused by the use of the liquid, residues left behind by the liquid, and damage from particles present in the liquid.

But TSMC claims to have developed techniques that mitigate these problems, and Gartner’s Rinnen says others will too. “I don’t view them as showstoppers,” he said. “I view them as nuisances.”

- David Forman


Next generation manufacturing: the contenders

The race to keep semiconductor manufacturing ahead of Moore’s Law for the next decade or so boils down to a few competing technologies, each of which has its own strengths and weaknesses. The goal of each technology, of course, is to obtain that elusive Triple Crown of micro-manufacturing: low cost, high throughput and ever-smaller feature sizes. Herein we handicap some of the main contenders:

Immersion lithography

Immersion lithography has the strongest odds to take an early lead, and, in terms of next-generation lithography technology, is certainly fastest out of the gate. In simple terms, immersion lithography is standard optical lithography with wafers, masks and lenses but using water or some other liquid to increase resolution. Companies such as ASML and Nikon are already shipping immersion systems for 45 nm half pitch production, with analysts predicting 20 or more such systems shipped in 2006.

Immersion lithography uses the same wavelength light (193 nm) as non-immersion photolithography, and thus benefits from the installed base of companies and technical staffs already familiar with the process. Primary drawbacks to immersion lithography include costs higher than standard photolithography and defects - primarily watermarks or bubbles - due to the liquid being used for immersion.

That said, Mike Lercel, director of lithography for SEMATECH, said water-based immersion lithography is definitely the horse to back for commercial semiconductor lithography in about 2009 and beyond. “People have actually seen results demonstrated and the results have been very good,” Lercel noted. “The defect levels were a bigger issue two years ago, but it seems the companies have gone off and solved them and we are now down into the single-digits.”

Extreme ultraviolet lithography (EUV)

EUV shines new light - literally - on chip manufacturing. Heavily backed by Intel, EUV has been around since at least the late 1990s, with Intel promising high-volume production by about 2009.

EUV is essentially an extension of optical lithography, using 13.5 nm wavelength light from the extreme ultraviolet region of the spectrum. Because light at a 13.5 nm wavelength is absorbed by virtually all materials, including the glass of traditional lenses, EUV systems must use reflective surfaces - mirrors - to focus the light. Lithographic masks must also be reflective, and the entire system must be enclosed in a vacuum.

Therein lie the primary challenges for EUV as a production technology - increased costs of materials and tooling, as well as the costs associated with maintaining vacuum conditions in the lab or production facility.
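
A rough illustration of why the optics change so drastically at 13.5 nm: photon energy scales inversely with wavelength, and at EUV wavelengths each photon carries roughly 92 electron volts, enough to be absorbed by essentially any lens material and by air itself. The sketch below works out the comparison with 193 nm light; the only physics assumed is the standard E = hc/wavelength relation.

# Illustrative: why EUV optics must be reflective and kept in vacuum.
# Photon energy in electron-volts is E = h*c / wavelength, which works out
# to roughly 1239.84 / wavelength_nm. At 13.5 nm that is ~92 eV, far above
# the bond and band-gap energies of lens glasses (and of air), so the light
# is absorbed rather than transmitted; mirrors and vacuum are the workaround.

def photon_energy_ev(wavelength_nm: float) -> float:
    """Photon energy in eV from wavelength in nm (E = hc / lambda)."""
    return 1239.84 / wavelength_nm

for label, wl in [("193 nm ArF (deep UV)", 193.0), ("13.5 nm EUV", 13.5)]:
    print(f"{label}: {photon_energy_ev(wl):.1f} eV per photon")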


NanoInk’s Nscriptor dip pen nanolithography system allows users to build patterns, layers and structures at resolutions less than 15 nm. Photo courtesy of NanoInk

According to Stefan Wurm, EUV strategy program manager at SEMATECH, EUV has a good chance to supplant immersion lithography by about 2012 and beyond. Attendees of the Litho Forum, a three-day gathering in May of global lithography experts, agreed. They rated EUV’s prospects for 2015 as highly as they rated immersion lithography’s for 2009. A key will be ongoing technical development. “We have seen great extendibility,” Wurm said. “You can increase throughput by 50 percent just by adding two more mirrors.”

E-beam lithography

E-beam lithography uses the same principles as photolithography, except that the pattern is written with a focused beam of electrons instead of light. Electrons have a much shorter wavelength than light, giving e-beam the promise of being able to write much smaller features than photolithography.
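
For a sense of scale, the sketch below works out the de Broglie wavelength of an electron at a few accelerating voltages (the voltages are assumptions chosen for illustration, not figures from the article). Even at modest beam energies the wavelength is measured in picometers, which is why wavelength is not the limiting factor for e-beam the way it is for photolithography.

import math

# Illustrative: why an electron beam can, in principle, write far smaller
# features than 193 nm light. The de Broglie wavelength of an electron
# accelerated through V volts is lambda = h / sqrt(2*m*e*V) (non-relativistic
# approximation; good to a few percent at these voltages). The voltages below
# are illustrative assumptions.

H = 6.626e-34      # Planck constant, J*s
M_E = 9.109e-31    # electron mass, kg
Q_E = 1.602e-19    # elementary charge, C

def electron_wavelength_nm(accel_volts: float) -> float:
    """Non-relativistic de Broglie wavelength of an electron, in nm."""
    return H / math.sqrt(2 * M_E * Q_E * accel_volts) * 1e9

for volts in (10_000, 50_000, 100_000):
    print(f"{volts/1000:.0f} kV beam: ~{electron_wavelength_nm(volts)*1000:.1f} pm")

# In practice e-beam resolution is set by the electron optics, beam spot size
# and scattering in the resist rather than by the wavelength itself, but the
# wavelength is no longer the bottleneck as it is in photolithography.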

The drawbacks to e-beam have always been relatively low throughput and high complexity - as well as high cost - of the exposure tools. That said, the cost of traditional masks continues to rise, making e-beam as a direct-write technology more attractive to chip makers. Some are looking at relatively slow e-beam technology to create critical chip layers with very small patterns while using traditional optical lithography for non-critical layers.

Another development to watch in the e-beam space is the use of multiple beams to increase throughput. For now, SEMATECH’s Lercel commented, e-beam appears to be more applicable to prototyping or very low volume production for research or development. Using multiple beams for volume production, he said, will be at least five years into the future, “if you can prove that it works.”

Nanoimprint lithography

Nanoimprint lithography takes a completely different approach than optical, EUV or e-beam. Michael Falcon, strategic marketing manager for nanoimprint toolmaker Molecular Imprints, called it “almost like stamping DVDs.” The process uses a mold - or master - that carries the circuit or other pattern, which is then stamped directly onto a wafer.

Falcon claims nanoimprinting can and will be able to go well beyond 45 nm processing at fractions of the cost of any type of optical or e-beam lithography (including immersion or EUV). That said, he doesn’t foresee nanoimprint supplanting photolithography so much as being chosen in place of other types of lithography for certain applications. Key among these are high-brightness LEDs (solid state lighting) and pattern imprinting of disks in microdrives used in iPods, cell phones, MP3 players and the like.


Molecular Imprints’ Imprio 250 nanoimprint lithography system offers sub-50 nm half pitch resolution, sub-10 nm alignment, integrated magnification control and fully automated wafer and template loading capability. It is intended for device and process prototyping and pre-production, as well as for alignment sensitive lithography applications such as thin film heads and molecular electronics. Photo courtesy of Molecular Imprints

Larry Koecher, chief operations officer of nanoimprint toolmaker Nanonex, concurred. “We’re on the verge of moving nanoimprint into manufacturing,” he said. “It is being accepted quite nicely as an R&D technology in research labs, but it is starting to catch the eye of those who want to move into mass production mode.”

The Litho Forum found nanoimprint generating increased interest in the 2012 to 2015 time frame. “Solving template defects is the real issue,” Lercel said of nanoimprint lithography. “There are those who are not going to use it for the semiconductor space, but for other nanotechnology applications.”

Dip pen nanolithography (DPN)

Dip pen nanolithography uses the tip of an atomic force microscope (its “pen”) to write patterns directly on substrates using molecules (its “ink”).

“It is fundamentally the same thing as dipping a pen in ink and writing on paper,” said Tom Levesque, senior director of DPN global sales for NanoInk, which makes DPN tools. “The material you can deposit can be from small molecules to biological components such as proteins or polymers.”

Examples Levesque gave for the use of DPN include attaching viruses to substrates to see how they attack cells or molecules, producing DNA arrays and other medical diagnostic tools, and functionalizing and aligning carbon nanotubes on a substrate. The technology, he said, allows for “bottom-up” manufacturing.

DPN may be unlikely to replace or displace much optical lithography in the semiconductor industry, at least in the short term. Primary applications promise to be fast turns of prototype material, since the technology is direct-write but doesn’t require the high cost of materials or vacuum conditions of other approaches. “Immersion lithography using a $20 million tool for mass production will continue to work,” Levesque predicted. “E-beam will continue to be a specialty. We (DPN) will be a niche in that marketplace for people that have more research functionality in their application.”

- James R. Dukart


Imaging: behold the optical nanoscope

Can microscopy morph into “nanoscopy”? Magnification using visible light and a series of lenses has been around since Galileo’s time. But because of the physics of light, there’s a natural lower limit - about 240 nm - to the size of things that can be viewed with a traditional optical microscope.
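
That floor comes from diffraction. Abbe's criterion puts the smallest resolvable separation at roughly the wavelength divided by twice the objective's numerical aperture, so with visible light and real-world objectives the answer lands in the neighborhood of 200 to 290 nm, depending on the wavelength and objective chosen. A quick illustration follows; the wavelengths and numerical apertures are illustrative assumptions, not figures from the article.

# Illustrative: the diffraction limit behind the ~240 nm floor mentioned above.
# Abbe's criterion puts the smallest resolvable separation at
#     d = wavelength / (2 * NA)
# where NA is the numerical aperture of the objective. The wavelengths and
# NA values below are illustrative assumptions.

def abbe_limit_nm(wavelength_nm: float, numerical_aperture: float) -> float:
    """Smallest resolvable feature separation per the Abbe criterion, in nm."""
    return wavelength_nm / (2.0 * numerical_aperture)

cases = [
    ("green light, dry 0.95 NA objective", 550.0, 0.95),
    ("green light, oil-immersion 1.4 NA objective", 550.0, 1.4),
    ("blue light, oil-immersion 1.4 NA objective", 450.0, 1.4),
]
for label, wl, na in cases:
    print(f"{label}: ~{abbe_limit_nm(wl, na):.0f} nm")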

That’s small, but it’s not small enough to see the tiniest features on new generations of semiconductors, to check for uniformity of nanoparticles or to see viruses or many parts of a live cell. The alternatives - scanning probe technologies such as the scanning tunneling microscope or the atomic force microscope - can “see” things at the atomic level using nanoscale probes that trace surfaces and send back a signal. But they cost six figures and can take several minutes or longer to complete the scan for a single image. And their requirements for sample preparation can preclude making certain kinds of observations - for example, imaging live cells.

As a result many researchers are on a quest to harness the economy, efficiency and versatility of optical microscopy to see things that are supposedly too small to be seen - down to 100, 60 or even 10 nm. And they’re pushing the physical limits of light in various ways.

Aetos Technologies of Auburn, Ala., markets a device called CytoViva, which was developed by a researcher at Auburn University. Using a patented optical system that replaces the condenser on most standard lab microscopes, and a special light source, CytoViva tightly controls how a sample is illuminated. It can image objects in the 100 nm range and can detect objects as small as 10 nm.

“The unit has a fixed geometry that creates a perfect alignment that’s not achievable in traditional microscopes,” said Tom Hasling, director of technology development. “It gives us an extremely good signal-to-noise ratio because there’s not a lot of stray optical noise.” The device produced the first video of Lyme disease bacteria infecting a cell and has also imaged 20 nm polystyrene particles.


These two images show a view of a microscope calibration slide. The image at top was taken with a CytoViva-equipped optical microscope. The image at bottom was captured with a field emission scanning electron microscope. Image courtesy of Aetos Technologies

CytoViva has been on the market since late 2004. Its first installation was at a U.S. Department of Agriculture facility in Ames, Iowa, which sponsored the development of the tool as part of its Animal and Plant Health Inspection Service. The company hopes to have 150 units in the field by the end of the year. It recently introduced a fluorescence module that allows viewers to see both labeled and unlabeled entities at once. Prices range from $10,000 for a basic unit to about $45,000 for a full system, including the microscope, the fluorescence device and a camera.

Because CytoViva operates by shining light through the sample, it can’t be used for solid objects such as computer chips. To address the needs of the semiconductor industry, scientists at the National Institute of Standards and Technology are experimenting with a way to use scanning probe microscopes and optical microscopes together with computers to see features as small as 10 nm.


This image taken with a CytoViva-equipped optical microscope shows a slice of skin tissue with red quantum dot labeled nanoparticles embedded in hair follicles. Image courtesy of Aetos Technologies

Currently the industry relies on scanning probe technologies to do quality control, but they can damage the samples that they’re supposed to be measuring, said Rick Silver, a physicist in NIST’s precision engineering division. “Optical tools are low-cost, high-throughput and nondestructive.”

Silver and his team are developing a technique called phase-sensitive scatter-field imaging. It uses illumination whose wavelength and angle are tailored to the particular target. The target’s general shape is determined through imaging by a scanning electron microscope and an atomic force microscope. Using that information, along with the patterns produced when the light scatters off the target, a computer algorithm can create a precise image of even the tiniest details. Silver said the technique can detect differences of as little as 1 nm between two similar objects.
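
In outline, that kind of model-based measurement amounts to fitting: simulate the scatter signature expected for each candidate feature size, then report the candidate whose simulation best matches what was measured. The sketch below is a deliberately simplified illustration of that fitting loop, not NIST's actual algorithm; the forward model simulate_signature is a hypothetical stand-in for a rigorous optical simulation.

import numpy as np

# A minimal sketch of the model-based idea behind scatterfield-style metrology,
# not NIST's actual algorithm: simulate the scattered-intensity signature
# expected for each candidate feature size, then report the candidate whose
# simulated signature best matches the measured one (least-squares fit).
# simulate_signature() is a hypothetical stand-in for a real optical model.

def simulate_signature(feature_nm: float, angles_deg: np.ndarray) -> np.ndarray:
    """Hypothetical forward model: scattered intensity vs. illumination angle."""
    # Placeholder physics: a smooth angular response whose shape shifts with size.
    return np.cos(np.deg2rad(angles_deg) * (feature_nm / 50.0)) ** 2

def best_fit_size(measured: np.ndarray, angles_deg: np.ndarray,
                  candidates_nm: np.ndarray) -> float:
    """Return the candidate size whose simulated signature best matches 'measured'."""
    residuals = [np.sum((simulate_signature(c, angles_deg) - measured) ** 2)
                 for c in candidates_nm]
    return float(candidates_nm[int(np.argmin(residuals))])

angles = np.linspace(0, 60, 61)              # illumination angles, degrees
candidates = np.arange(20.0, 80.0, 1.0)      # candidate linewidths, nm
measured = simulate_signature(42.0, angles)  # stand-in for a real measurement
print(f"best-fit linewidth: {best_fit_size(measured, angles, candidates):.0f} nm")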

“This technology is likely to evolve into complex sensors to keep close control on the manufacturing process,” Silver said.

- Elizabeth Gardner