06/01/1998
The physics of metrology instruments
Mark Davidson, Spectel Company, Mountain View, California
Andras E. Vladar, Hewlett-Packard ULSI Research Laboratory, Palo Alto, California
Metrology is growing in both complexity and breadth as more technologies are brought to bear on the increasingly difficult problems faced by the process engineer. As IC features shrink, metrologists are being pushed into the quantum domain. This article reviews the tools available for CD and overlay metrology.
The horizon for process technology these days is the 16-Gbit DRAM. These devices will probably go into production around the year 2007. The smallest lines and spaces in the resist patterns on the chip will be as small as 100 nm, but it is unclear how such fine lithography will be achieved. The principal candidates are proximity x-ray, extreme-UV, and direct-write electron beam lithography, although some form of nearfield optical lithography or tunneling probe technology could surprise the industry.
It will be necessary to control these lines to about 10 nm. The metrology tool will have to measure with a precision and accuracy of better than 2 nm, about the width of four silicon atoms placed side by side. Resist aspect ratios as high as 5:1 may be required, and the metrology instrument must be able to measure the base of the photoresist line. Currently, it is unclear how these goals will be met. The table summarizes the current status of the three established technologies.
[Table: Current status of the three established CD metrology technologies]
CD measurement is important, because the length of CMOS gates affects their speed. Gate length for high-performance microprocessors and memory devices must be controlled carefully, and it must be very uniform over the total chip area. Moreover, CD variation often indicates instability in some critical part of the semiconductor process, and CD monitoring is thus an important safeguard against process drift.
Mask metrology is also growing in importance. Phase-shifting and proximity-corrected masks are tightening the requirements for mask dimension control, and it is getting more difficult to measure mask CDs with optical technology. Scanning electron microscopes (SEMs) and scanning probe microscopes (SPMs) are being used, but optics is likely to be pushed as far as it can go. Ultimately, users want to know how the mask looks to light that passes through. SEMs and SPMs give only a surface measurement. Considerable sophistication, including computer simulations, will be required to keep optical metrology competitive with other technologies.
By the year 2007, most television sets and computer monitors will probably be flat panel displays, and their manufacture will require much the same type of metrology. Dimensional metrology is also required in the manufacture of thin-film heads for the magnetic recording industry, where linewidths are already below 0.5 µm in some cases.
There are two different philosophies concerning the use of metrology in wafer manufacturing. The first approach uses metrology extensively in research and development, as well as in pilot lines to perfect a high-yield process. Metrology is not done extensively during production, however. The second approach monitors processes in-line and continuously during production as well. Continuous monitoring can extend to in situ sensors for real-time metrology, where relatively modest yield improvements can have tremendous financial payback.
This article concentrates on two types of metrology: linewidth and overlay measurement. Linewidth control is required for developed resist and etched lines, and is desirable for latent images in resist. Overlay control is critical to ensure the alignment of the 30 or so independent mask layers in modern ICs.
SEMs
The SEM uses a highly focused beam of electrons to scan an object while detectors record the scattered electrons. It dominates submicrometer linewidth measurement, despite major complications and accuracy issues. Figure 1 shows the essential elements of a CD metrology SEM.
Figure 1. CD SEM components.
The venerable history of electron microscopy is worth recalling [1, 2]. The first evidence for electrons, and indeed the first electron gun, was the demonstration of a cathode ray in a low-pressure gas discharge tube by Professor Julius Plücker at the University of Bonn around 1858. Subsequently, Sir William Crookes greatly developed the understanding and technology of cathode ray tubes. This work laid the foundation for the discovery of the electron in 1897, a scant 100 years ago, by Joseph John Thomson, Cavendish Professor of Experimental Physics at Cambridge University. In the 1930s, the first transmission and scanning electron microscopes were developed by Ruska, Knoll, von Ardenne, and others in Germany. At RCA in the US, Zworykin and his coworkers made significant advances in the 1940s. The first commercial SEMs were produced by Cambridge Instruments in Great Britain and JEOL in Japan in the mid-1960s. In the 1980s, the realization that optical techniques were likely to "run out of gas" led to the development of SEMs dedicated specifically to linewidth measurement.
The SEM consists of an electron gun for creating free electrons, electrostatic focusing elements for creating a relatively collimated beam, and a final stage of magnetic focusing or combined electrostatic-magnetic focusing to force the electrons to hit the sample within a small, 2-6 nm spot.
Electron guns. Good electron guns produce high primary electron current, have narrow energy spread, and are stable over time. All CD-SEMs today use some form of field emission gun, in which quantum tunneling - the ability of electrons to penetrate classically forbidden energy barriers - leads to electron emission at negatively charged needle tips. There are three basic types: cold-field emitters, which rely exclusively on quantum tunneling to get electrons out of a sharp, tungsten needle; thermal-field emitters, which use a combination of quantum tunneling and thermal kinetics; and Schottky-field emitters, which rely on a combination of quantum tunneling, thermal kinetics, and the Schottky effect. Cold-field emitters have small energy variation and are very bright, but unstable. Thermal-field emitters are less bright and have larger energy variation, but are more stable. Schottky-field emitters, perhaps the best compromise, combine relatively high brightness, narrow energy spread, and high stability.
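As a rough guide to the physics behind all three emitter types, the cold-field (pure tunneling) current density is commonly modeled by the Fowler-Nordheim relation; a simplified textbook form, not tied to any particular gun design, is:

```latex
% Fowler-Nordheim field emission, simplified textbook form
% J: current density [A/m^2], F: field at the tip [V/m], \phi: work function [eV]
J(F) \;=\; \frac{A\,F^{2}}{\phi}\,
      \exp\!\left(-\,\frac{B\,\phi^{3/2}}{F}\right),
\qquad A \approx 1.54\times10^{-6}\ \mathrm{A\,eV\,V^{-2}},
\quad  B \approx 6.83\times10^{9}\ \mathrm{eV^{-3/2}\,V\,m^{-1}}
```

Because the current depends exponentially on the field at the tip, nanometer-scale tip radii, which concentrate the field, matter far more than raw extraction voltage; thermal and Schottky emitters modify this barrier picture with temperature and barrier-lowering terms.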
Objective lens. All CD-SEMs use cylindrically symmetrical magnetic focusing lenses. Most lens studies suggest that immersion lenses, in which the sample sits in a strong magnetic field, have the best resolution, and, consequently, most of the newest CD-SEMs use immersion lenses. Many newer CD-SEMs also combine some electrostatic focusing elements with the magnetic objective lens (Fig. 1). The idea is to apply a brake to the electrons as late as possible to gain the advantages of higher beam energy through part of the magnetic focusing process. Pure electrostatic focusing elements with the same resolution as magnetic lenses have not been satisfactory, and so, since the 1930s, virtually all SEMs have used magnetic lenses for final focusing. However, recent developments may change this in the future.
Electron beam scanning. The x-y deflection needed to make a raster scan can be achieved with either magnetic (yoke) or electrostatic (plate) deflectors. The electrostatic deflectors are faster and are immune to magnetic hysteresis.
Detectors. Everhart-Thornley detectors are photomultiplier tubes fed by a light pipe from a scintillation disk that is bombarded by secondary and backscattered electrons. The light impinges on a photocathode, which creates an electron inside the photomultiplier tube. This electron then triggers a cascade of electrons in the tube, amplifying the signal. Microchannel plates can also be used to detect electrons. Analysis suggests that microchannel plate detectors are noisier than photomultiplier types due to higher variability of the gain, but they detect the electrons directly and can track the spatial distribution of the electrons, not just the intensity signal. The electrons arriving at the detector are, at best, distributed by Poisson statistics. The detectors and electronics add noise to the Poisson noise. The optimal detector can count electrons by discriminating between discrete time events. Such a system requires very fast electronics, but is likely to be used only in more advanced CD-SEMs.
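One consequence of Poisson statistics is worth making concrete: the best possible signal-to-noise ratio grows only as the square root of the number of electrons collected. A minimal sketch, with illustrative counts not taken from any instrument:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative mean electron counts per pixel at two doses (100x apart)
for n_mean in (25, 2500):
    counts = rng.poisson(n_mean, size=100_000)
    snr = counts.mean() / counts.std()          # Poisson limit: SNR ~ sqrt(n)
    print(f"mean={n_mean:5d}  SNR={snr:6.1f}  sqrt(n)={np.sqrt(n_mean):6.1f}")

# Detector gain variation and electronics add noise on top of this floor;
# a true electron-counting detector would approach the Poisson limit itself.
```

A 100× increase in dose buys only a 10× improvement in SNR, which is why dose-limited measurements, on resist for example, are so hard to improve by simply averaging longer.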
Modern CD-SEMs are highly automated machines, with rapidly evacuated vacuum systems, which can automatically position a wafer, and then measure selected sites. The throughput rate of these instruments is critically important, as is the repeatability and accuracy of the measurements. Throughputs for SEMs are now comparable to optical machines (up to 70 wafers/hr at 5 sites/wafer).
Figure 2. The energy-charging curve, where the total electron yield is shown as a function of landing beam energy.
Figure 3. Beam deflection due to charging.
Figure 4. Top-down vs. cleaved edge-on view of line.
Inherent instability. Since the SEM's optics are under electronic control, they are always subject to drift, and the astigmatism and resolution, in particular, tend to vary with time. Automatic astigmatism adjustment and monitoring have been included in SEMs, but so far, no vendor has totally solved the problem of basic electron optical stability.
Specimen charging. Since electrons are charged particles and charge is conserved, specimens in a SEM will generally charge up. If the specimens are insulating, there is nowhere for the charge to go, and it will inevitably cause some distortion in the image. For a horizontal surface, there are special electron energies (E1 and E2) at which the number of incoming (primary) electrons equals the number of emitted (backscattered and secondary) electrons (Fig. 2). Unfortunately, these energies vary slightly for real objects due to geometry and local electromagnetic fields; no single energy works everywhere. To lessen the effects of charging, one can detect only backscattered electrons, but these have a total yield of only 1/3 to 1/10 that of secondary electrons. A larger dose, with 10-50× more beam electrons, is needed to obtain a reasonable signal-to-noise ratio. This higher current causes more contamination and can damage the resist or thin gate oxides. It even causes enough charging to deflect the primary electron beam appreciably (Fig. 3). To get a strong signal and avoid damage, one must use secondary electrons, which are very sensitive to local electric fields, i.e., to sample charging.
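The E1 and E2 energies are simply the unity crossings of the total-yield curve in Fig. 2. A toy illustration, using a common empirical "universal yield" shape with invented parameters (real curves must be measured per material and geometry):

```python
import numpy as np

def total_yield(E_keV, delta_max=2.0, E_max_keV=0.4):
    """Empirical universal total-yield curve (illustrative parameters only).
    Yield > 1 between E1 and E2: the surface charges positive; outside
    that window it charges negative."""
    x = E_keV / E_max_keV
    return 1.11 * delta_max * x**-0.35 * (1.0 - np.exp(-2.3 * x**1.35))

E = np.linspace(0.05, 5.0, 20_000)
y = total_yield(E)
E1_E2 = E[np.nonzero(np.diff(np.sign(y - 1.0)))]   # unity crossings
print("E1, E2 (keV):", np.round(E1_E2, 2))
```

Because geometry and local fields shift this curve from point to point on a patterned wafer, the computed E1 and E2 hold only for one idealized surface, which is the quantitative version of "no energy works everywhere."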
The interaction volume of the beam is fixed by the beam energy and the material being viewed. The way to minimize the effective interaction volume is to detect backscattered electrons that have lost only a small fraction of their incoming energy (low-loss backscattered), but these are only a small fraction of the total. A very large dose, perhaps two orders of magnitude more than for secondary electrons, is required. The problems with SEMs are most severe when looking at the base of dense resist lines or contact holes on insulators, as it can be difficult to get a satisfactory signal of secondary electrons from such locations. Large extraction fields may pull the secondary electrons out of holes and trenches, but these large extraction voltages can lead to unusual and little-understood charging effects. The basic problems stem from fundamental principles, such as charge conservation, diffusion of electrons in materials, etc.
Mysterious phenomena. Some mysterious problems plague SEMs, too. The resolution of CD-SEMs seems to be much lower for some specimens than all the engineering and simulation estimates say that it should be. This phenomenon plagues virtually all SEMs and a wide variety of specimens, and is not due to anything simple like vibrations, scan jitter, or signal bandwidth limitations. The cause is unknown at present. Another unusual phenomenon is that a line measured with backscattered electrons tends to measure smaller than the same line measured with secondary electrons. This behavior has been observed for insulators as well as conductors, so charging models are ruled out. Comparisons with SPM and cleaved images suggest that the backscattered electron signal is more accurate than the secondary one (Fig. 4).
The power density deposited by a 5-pA electron beam with a mean energy of 1 keV, for an assumed interaction volume 30 nm in diameter, is on the order of 100 kW/mm³, a rather high value. There is clearly the potential for damage or ablation of material, particularly photoresists, and also substantial heating, depending on dwell times and scan timing. Damage and ablation have been ruled out as the cause of the two mysterious phenomena above. Uneven transient heating of the substrate might possibly be the culprit, as secondary electron yields increase with specimen temperature, whereas backscattered yields are much less sensitive.
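A back-of-the-envelope check of that power density, assuming the beam power is deposited uniformly in a 30-nm-diameter sphere:

```latex
P = IV = (5\,\mathrm{pA})(1\,\mathrm{kV}) = 5\,\mathrm{nW},
\qquad
V = \frac{\pi}{6}\,d^{3} = \frac{\pi}{6}\,(30\,\mathrm{nm})^{3}
  \approx 1.4\times10^{-14}\,\mathrm{mm^{3}},
\qquad
\frac{P}{V} \approx 3.5\times10^{5}\,\mathrm{W/mm^{3}}
```

This lands within a small factor of the quoted value; the exact number depends on the assumed interaction volume and on what fraction of the beam energy actually stays in it.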
If backscattered electrons are detected, then good resist-line accuracy is achieved with a detector that primarily collects those backscattered electrons travelling inside a narrow cone centered on the optical axis, i.e., backscattered electrons with nearly 180° scattering. This type of detector minimizes the effects of sidewall absorption. Only a fraction of the backscattered electrons are of this type, however, and the signal-to-noise ratio is, thus, very low.
At present, most CD-SEMs use mainly secondary electrons. Some systems are adding backscattered electron detection capability for mixed-signal detection, and this trend can be expected to continue in the future.
Damage to the object being measured sets a fundamental limit to the total dose that can be tolerated. Damage can include actual ablation of the sample or deposition of carbonaceous material. If active circuitry is measured, then possible damage mechanisms also include charge trapping of electrons in thin gate oxide layers. Such trapped electrons will affect the electrical characteristics of the finished device, although subsequent annealing may reduce or eliminate them.
Optical farfield techniques. Optical farfield microscopes and sensors will continue to play a significant role in several metrology areas: overlay registration, linewidths on photomasks and reticles, larger structures, micromachined devices, thin-film heads, process monitoring, defect characterization, long-distance registration accuracy, defect and process inspection, and advanced flat panel display metrology. Moreover, optical alignment systems will be important in achieving high throughput with SEM, SPM, or scanning nearfield optical microscope (NFOM) systems. Optical alignment will continue to be used in wafer steppers, one of the most critical areas of wafer fabrication. This alignment task will become more demanding as geometries shrink. Wafer planarization has further complicated the task of aligning wafers and measuring overlay, since, in some situations, the buried alignment marks are difficult to image. Optical innovations will continue to be required to meet these challenges. Figure 5 shows a typical optical microscope metrology tool. The following technologies are being used: classical brightfield and darkfield microscopes, laser confocal microscope, white light real-time confocal microscope, coherence probe microscope, and scatterometry.
Figure 5. Typical optical system.
Classical brightfield microscope. The brightfield technique is still used for CD metrology on photomasks, general inspection of wafers, pattern alignment in all sorts of machines (including steppers), and overlay metrology. The compound microscope has evolved through the efforts of many innovators in various countries from the 17th century to the present. One of the main limitations of optical systems arises because no two lenses are exactly alike: polishing leaves surfaces that are rough on the atomic scale. A typical high-resolution objective may contain 12 or more elements, and thus 24 polished surfaces. A particular ray picks up a random phase that depends on its path through these 24 random surfaces. The result is a kind of granular noise in the image at very high magnification, leading to measurement errors that are difficult or impossible to calibrate out. For transparent lines, a severe problem called the resonant waveguide effect also makes CD measurement very difficult.
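The granular-noise argument can be made semi-quantitative: independent surface errors add in quadrature, and the Maréchal approximation relates the accumulated wavefront error to the loss of peak intensity (Strehl ratio). A crude sketch with invented numbers:

```python
import numpy as np

n_surfaces = 24                      # polished surfaces in the objective
sigma_surface_nm = 2.0               # illustrative rms wavefront error each
wavelength_nm = 500.0

# Independent errors add in quadrature across the 24 surfaces
sigma_total_nm = np.sqrt(n_surfaces) * sigma_surface_nm
sigma_rad = 2 * np.pi * sigma_total_nm / wavelength_nm

strehl = np.exp(-sigma_rad**2)       # Marechal approximation, small errors
print(f"accumulated rms error: {sigma_total_nm:.1f} nm, Strehl: {strehl:.3f}")

# The light missing from the peak is scattered into a grainy halo that is
# different for every lens, and so cannot be calibrated out in general.
```

Even a Strehl ratio near unity leaves a lens-specific speckle floor at high magnification, which is the granular noise described above.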
Classical darkfield microscope. In the darkfield microscope, used for alignment and defect detection, no illuminating rays are collected by the objective lens unless there is nonspecular scattering of light from the object being viewed. Darkfield illumination tends to eliminate many interference effects when aligning to targets that are buried beneath a transparent film layer. Further, a defect on a clear field appears as a bright spot in a black field. Such a spot is easy to find in an electronic system, and so larger pixel sizes with fewer gray levels can be used, and higher throughputs can be achieved.
Laser confocal microscope. The laser confocal microscope images a diffraction-limited spot on the object and re-images this spot on a pinhole before detecting the scattered signal. The re-imaging produces a sectioning microscope: the signal from out-of-focus parts of the object goes to zero. The original idea and prototype for this technology appear to be due to Marvin Minsky (Confocal Scanning Microscope: US Patent 3,013,467, filed 1957), although he chose not to publish his results at that time. Davidovits and Egger are more commonly credited with the invention [3]. Laser confocal microscopy is used for CD measurements on photomasks and reticles and for defect review.
White light real-time confocal microscope. This technology projects a field of diffraction-limited spots onto the object by imaging a rotating disk (called a Nipkow disk) with clear holes in an opaque field. The principle is the same as the laser confocal microscope, except that instead of scanning a laser beam, the moving holes cover the entire field of view evenly. The image can be collected by a time-averaging area sensor, such as a video camera or a human eye. This technique is used for photomask metrology, defect review, and larger structure CD metrology. The pinholes in commercial microscopes are oversized to allow more light, which reduces the confocal effect significantly.
Coherence probe microscope. Most microscopes make an image proportional to a photon count. The coherence probe microscope (CPM) (Fig. 5) [4], also referred to as the correlation microscope [5], collects raw images from a phase-shifting interference microscope. An image is formed by calculating, for every pixel in the field, the mutual coherence between light scattered from the object and light scattered from a reference (typically a good mirror). This technology, used for overlay measurement and larger-structure CD metrology, can adapt to different imaging problems by analyzing phase as well as amplitude. CPM can also measure z information. Resolution in the simple CPM mode is about 100 nm, but analysis of the full-phase information can improve the accuracy considerably.
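A minimal sketch of the per-pixel computation, using a standard four-step phase-shifting recipe (actual CPM algorithms are more elaborate and proprietary):

```python
import numpy as np

def coherence_image(i0, i1, i2, i3):
    """Mutual-coherence amplitude and phase from four interferograms
    recorded with reference phase steps of 0, 90, 180, and 270 degrees."""
    amplitude = 0.5 * np.sqrt((i0 - i2)**2 + (i1 - i3)**2)
    phase = np.arctan2(i1 - i3, i0 - i2)      # carries z (height) information
    return amplitude, phase

# Synthetic single pixel: object amplitude 0.7, phase (height) 0.9 rad
a, phi, bg = 0.7, 0.9, 1.0
i0, i1, i2, i3 = (bg + a * np.cos(phi - s)
                  for s in (0.0, np.pi/2, np.pi, 3*np.pi/2))
amp, ph = coherence_image(i0, i1, i2, i3)
print(f"amplitude={amp:.2f} (true 0.7), phase={ph:.2f} rad (true 0.9)")
```

The amplitude image rejects stray light that does not interfere with the reference, which is why buried, low-contrast targets fare so much better than in an ordinary intensity image.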
Figure 6. Overlay signals underneath CMP layers.
Figure 6 shows the improved imaging of the coherence probe method for looking at a buried overlay target underneath a chemical mechanical polishing (CMP) layer. The ordinary microscope image has poor contrast and signal, even after significant image enhancement. The coherence probe image has a much better signal-to-noise ratio.
Scatterometry. The idea in scatterometry is to eliminate objective lenses and, with them, the lens-repeatability problems mentioned above. When a laser beam scatters off an object, its angle, intensity, phase, and polarization change. Detectors collect the scattered light as a function of angle and polarization, and a data analysis package then infers a measurement value. This technology may be used for measuring the latent image in resist before development, or for measuring linewidths in general. Scatterometry has the potential to provide in situ feedback to production tools such as etchers and even wafer steppers. The industry does not expect scatterometry to displace SEMs for CD measurement, however, due to the widespread belief that optics is not capable of measuring linewidths below 0.8 µm.
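In its simplest form the data analysis is a library search: precompute signatures for a grid of candidate CDs with a rigorous diffraction model, then report the candidate closest to the measured signature. A schematic sketch in which a stand-in function replaces the real diffraction solver:

```python
import numpy as np

rng = np.random.default_rng(1)
angles_deg = np.linspace(10, 70, 32)           # detection angles

def signature(cd_nm):
    """Stand-in for a rigorous model (RCWA, waveguide method, etc.) that
    maps a candidate CD to scattered intensity versus angle."""
    return np.cos(np.radians(angles_deg) * cd_nm / 40.0) ** 2

library_cds = np.arange(150.0, 251.0, 1.0)     # candidate linewidths, nm
library = np.array([signature(cd) for cd in library_cds])

measured = signature(203.0) + rng.normal(0.0, 0.01, angles_deg.size)
best = library_cds[np.argmin(((library - measured)**2).sum(axis=1))]
print(f"estimated CD = {best:.0f} nm")         # expect about 203 nm
```

The measurement quality therefore rests entirely on the forward model and the conditioning of the inverse problem, not on any imaging optics.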
Scanning force microscope
The 1981 invention of the scanning tunneling microscope is credited to G. Binnig and H. Rohrer [6]. The instrument senses the quantum tunneling current between an atomically sharp conducting tip and the sample, a current so sensitive to the tip-sample gap that surface topography can be mapped with atomic resolution. The first SPM thus had atomic resolution, but was limited to conductive tips and samples. The atomic force microscope (AFM), another member of the SPM family, was demonstrated in 1986 [7]; it replaced the tunneling tip with a flexible cantilever having a spring constant small enough that the forces between the tip and the surface do not displace the individual atoms of the sample. There are several types of SPMs, and their number is growing as new contrast mechanisms are discovered and employed. Scanning tunneling, atomic force, and magnetic or electric force microscopy are used to study a wide variety of materials and devices.
The AFM can produce a high-resolution, 3-D image of sample surfaces. In its simplest form, it brings a tip into direct contact with the sample and uses a reflected laser beam to sense the position of the tip relative to the sample.
Figure 7. Typical scanning probe microscope.
Figure 7 shows an AFM system capable of linewidth and sidewall measurement. The tip protrudes from a small flexible cantilever.
In the simplest AFM, a laser beam reflects off the top surface of the tip and is directed to a split photodiode. The topography of the sample surface deflects the cantilever, changing the position of the laser on the photodiode. The difference in voltage from the photodiode elements provides a "picture" of the surface as the probe tip scans over the sample. This type of SPM is said to be operating in "repulsive mode."
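A sketch of that readout, with made-up photocurrents (real systems close a feedback loop around this signal rather than using it directly):

```python
def deflection_signal(top_nA: float, bottom_nA: float) -> float:
    """Normalized split-photodiode signal: independent of total laser power
    and, for small angles, proportional to cantilever bending."""
    return (top_nA - bottom_nA) / (top_nA + bottom_nA)

# A bump under the tip bends the cantilever and shifts light to the top half
print(deflection_signal(52.0, 48.0))    # 0.04: small upward deflection
```

In practice the z piezo is servoed to hold this signal constant, and the topography is read from the piezo drive voltage rather than from the photodiode itself.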
A more sophisticated AFM oscillates a needle close to the object and senses phase shifts induced by van der Waals forces on the needle. This type of AFM, pioneered by IBM, can measure vertical resist profiles, and is, consequently, much more interesting for the semiconductor industry than the simple AFM mentioned above. It can operate in either the "attractive" or "repulsive" mode.
The attractive mode AFM is by far the more difficult to achieve and the most interesting for CD measurements. In repulsive mode, the AFM is essentially a low-force stylus profilometer. To measure 16-Gbit DRAMs, a tip width of <70 nm would be required. This might be difficult, but electron beam deposition methods seem capable of making very thin tips.
One of the main problems with the AFM is throughput: current machines are exceedingly slow. Another problem is tip characterization. The measured linewidth depends on the tip geometry as well as on the actual geometry being measured, and the tip geometry may not be known accurately or may change with time. One way to deal with this problem is to use a built-in calibration structure to determine the tip geometry as needed. Despite these difficulties, the AFM is considered one of the most promising alternatives to SEMs.
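The tip problem has a clean mathematical core: to first order, the measured trace is the morphological dilation of the true surface by the tip shape, and eroding the trace with an estimated tip shape bounds the true surface. A 1-D sketch with an invented line and a parabolic tip:

```python
import numpy as np

def dilate(surface, tip):
    """Simulated AFM trace: the lowest tip height that still touches the
    surface somewhere under the tip footprint, at each position."""
    k = len(tip) // 2
    out = np.full_like(surface, -np.inf)
    for j, t in enumerate(tip):
        out = np.maximum(out, np.roll(surface, j - k) + t)
    return out

def erode(trace, tip):
    """Best surface estimate recoverable from the trace and tip shape."""
    k = len(tip) // 2
    out = np.full_like(trace, np.inf)
    for j, t in enumerate(tip):
        out = np.minimum(out, np.roll(trace, k - j) - t)
    return out

surface = np.zeros(200); surface[80:120] = 50.0      # 40-sample, 50-nm line
u = np.arange(-10, 11); tip = -u**2 / 8.0            # parabolic tip apex
trace = dilate(surface, tip)
recovered = erode(trace, tip)
print("apparent widening:", int((trace > 25).sum()) - 40, "samples")
print("width after erosion:", int((recovered > 25).sum()), "samples")
```

Erosion with the correct tip recovers the linewidth here, but a worn or mischaracterized tip reintroduces the error silently, which is why built-in calibration structures are so attractive.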
Nearfield optical microscope
E.H. Synge described the scanning NFOM in a remarkable paper in 1928 [8], although the instrument was not actually fabricated until the 1970s. In farfield optical microscopes, the highest spatial frequency that can be present in an image has a wavelength of λ/2, where λ is the wavelength of the light. NFOMs avoid this diffraction limit, and can have much higher resolution for the same wavelength of light. If a small aperture is held close to the object, the resolution is determined by the aperture size and not the wavelength of the light. The evanescent or nearfield waves that surround the aperture provide the extra resolution. A number of configurations can achieve this effect, including dielectric needles and small holes in a metal needle.
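The contrast between the two regimes can be stated compactly (standard diffraction-theory estimates, not specific to any instrument):

```latex
% Farfield: Abbe limit with numerical aperture NA (<= 1 in air)
d_{\min}^{\mathrm{far}} \;\approx\; \frac{\lambda}{2\,\mathrm{NA}} \;\ge\; \frac{\lambda}{2}
\qquad\text{vs.}\qquad
% Nearfield: aperture of diameter a held at gap z
d_{\min}^{\mathrm{near}} \;\approx\; a \quad (z \lesssim a \ll \lambda)
```

The price of nearfield resolution is the tiny working gap z, which drives the mechanical and throughput problems discussed below.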
Nearfield optical microscopes have not yet been used on thick resist films in production. A simple needle aperture would be inappropriate for this measurement, just as it was inadequate for the SPM. A forked-tip double aperture would have to be developed. The NFOM's working distance can be tens of nanometers, whereas the SPM working distance is closer to one nanometer.
Another interesting nearfield approach places a special multilayer optical resonator near an object. Unraveling the resonant response in the presence of the object may lead to metrology methods for some structures, but is unlikely to lead to a general metrology method.
Electrical measurement
Techniques for electrical measurement are likely to continue to improve, and they provide a means of measuring conducting lines. Since most CD materials of interest are insulators, the technique will remain ancillary. Electrical metrology results are not hampered by distortions from beam or probe interactions with the specimen, or from image-forming optics. On the other hand, electrical metrology requires conductive materials, such as metal, poly, and silicides, and any inhomogeneity or geometrical irregularity causes measurement error. The process engineer cannot "see" the image from which the CD measurement was made, which raises the anxiety level about the technique. Special test structures, which take up real estate, must be used, and it is not possible to perform CD measurements on electrically active parts of the circuit. Both overlay and CD measurements can be done with electrical metrology.
The role of simulation and modeling
Computer simulation and inverse scattering techniques can be expected to play an increasing role as technology evolves. The hardware for optical and electron microscopes is reaching physical limits that make further improvements difficult or irrelevant. Interpretation of the image based on the fundamental interaction between the beam and the object is still in its infancy, however. It is likely to play a larger role in the future of these technologies. Scatterometry depends on inverse scattering and simulation to obtain a measurement. Atomic force microscopy will need a model for tip shape and probe deformation and deflection in order to achieve ultimate accuracy. In scanning electron microscopy, simulation and modeling of signal formation, sample geometry, a local electromagnetic field in the specimen chamber, and signal detection can help in overcoming serious accuracy problems.
Sources for optical simulation range from antenna theory and diffraction gratings to miscellaneous semiconductor equipment models. The techniques may be summarized as follows:
Finite element methods are general but slow, and can have boundary problems and possible convergence problems in the time domain for conducting materials.
Rigorous coupled wave methods are mainly restricted to line structures. They are fast and easy to implement, but tend to have convergence difficulties for the transverse magnetic mode for conductors.
The analytic waveguide method is an improvement over the rigorous coupled wave method in that it solves exactly for the eigenfunctions. It is fast and accurate, but limited to line structures.
Boundary integral methods are general solutions, and are slow but accurate. They are difficult to implement when more than one material is involved.
There is also a vast literature for electron beam simulation [9, 10]. The preferred approaches, for the most part, use Monte Carlo methods. The Monte Carlo codes describe an idealized electron scattering process consisting of instantaneous elastic scattering events governed by Rutherford or Mott scattering formulae, and separated by periods of rectilinear motion during which kinetic energy is lost at a rate derived from an empirical formula similar to the Bethe energy-loss formula. No fully satisfactory charging models are available at the present time, and exact agreement between simulations and real scans of real lines has not been achieved. There appears to be some source of signal degradation in real images, which the simulations cannot yet account for. This mystery must be overcome in order for Monte Carlo codes to become dependable in SEM calibration.
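A toy version of such a code, collapsed to two dimensions, is sketched below. The screening, cross-section, and energy-loss expressions are standard single-scattering textbook forms (after Joy); treat the output as qualitative only, since real codes are 3-D and far more careful about boundaries and detection:

```python
import numpy as np

rng = np.random.default_rng(2)
Z, A, RHO = 14, 28.09, 2.33        # silicon: atomic number, weight, g/cm^3
N_AVOGADRO = 6.022e23

def screening(E_keV):              # empirical screening parameter
    return 3.4e-3 * Z**0.67 / E_keV

def elastic_mfp_cm(E_keV):
    """Mean free path from the screened Rutherford total cross section."""
    a = screening(E_keV)
    rel = ((E_keV + 511.0) / (E_keV + 1022.0)) ** 2
    sigma = 5.21e-21 * (Z / E_keV)**2 * (4*np.pi / (a*(1+a))) * rel  # cm^2
    return A / (N_AVOGADRO * RHO * sigma)

def bethe_keV_per_cm(E_keV):
    """Joy-Luo modified Bethe stopping power."""
    J = (9.76*Z + 58.5 * Z**-0.19) * 1e-3      # mean ionization energy, keV
    return 78500.0 * RHO * Z / (A * E_keV) * np.log(1.166*(E_keV + 0.85*J)/J)

def backscatters(E_keV=1.0):
    """One 2-D trajectory; True if the electron re-emerges (z < 0)."""
    E, z, theta = E_keV, 0.0, 0.0              # theta from the inward normal
    while True:
        step = -elastic_mfp_cm(E) * np.log(rng.random())
        z += step * np.cos(theta)
        if z < 0.0:
            return True
        E -= step * bethe_keV_per_cm(E)        # continuous slowing down
        if E < 0.05:
            return False                       # absorbed in the solid
        a, R = screening(E), rng.random()
        cos_t = 1.0 - 2.0*a*R / (1.0 + a - R)  # screened Rutherford sampling
        theta += np.sign(rng.random() - 0.5) * np.arccos(cos_t)

n = 2000
eta = sum(backscatters() for _ in range(n)) / n
print(f"backscattered fraction (2-D toy model): {eta:.2f}")
```

Even this caricature reproduces the qualitative trends of yield versus beam energy; what no current code reproduces is the extra signal degradation seen in real images.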
X-ray microscopes
It is now possible to make highly reflective x-ray mirrors, and a two-element reflecting x-ray microscope lens of the Schwarzschild design was demonstrated at Stanford University a few years ago. With all the work going into EUV lithography, it is likely that a new metrology tool based on mirror x-ray optics will emerge too. Such a tool could have resolution equal to that of a SEM, without the charging problems and without as much damage as SEMs may cause. It could also be designed to see deeper below the surface than the SEM does, and could thus be used for overlay measurement as well.
References
1. D. McMullan, "Scanning Electron Microscopy 1928-1965," Scanning, Vol. 17, pp. 175-185, 1995.
2. E. Ruska, "The Development of the Electron Microscope and of Electron Microscopy," Reviews of Modern Physics, Vol. 59, pp. 627-638, 1987.
3. T. Wilson, C. Sheppard, Theory and Practice of Scanning Optical Microscopy, Academic Press, 1984.
4. M. Davidson, et al., "An Application of Interference Microscopy to Integrated Circuit Inspection and Metrology," Proceedings SPIE Microlithography Conference, Santa Clara, Vol. 775, pp. 233-241, March 1987.
5. G.S. Kino, T.R. Corle, G.Q. Xiao, "New types of Scanning Optical Microscopes," Integrated Circuit Metrology, Inspection, and Process Control II, Vol. 921, SPIE, 1988.
6. G. Binnig, H. Rohrer, C. Gerber, E. Weibel, "Surface Studies by Scanning Tunneling Microscopy," Phys. Rev. Lett., Vol. 49, pp. 57-61, 1982.
7. G. Binnig, C.F. Quate, C. Gerber, "Atomic Force Microscope," Phys. Rev. Lett., Vol. 56, pp. 930-933, 1986.
8. E.H. Synge, "A Suggested Method for Extending Microscopic Resolution into the Ultra-Microscopic Region," Phil. Mag., Vol. 6, pp. 356-362, 1928.
9. L. Reimer, Scanning Electron Microscopy, Springer-Verlag, Berlin, 1985.
10. J.I. Goldstein et al., Scanning Electron Microscopy and Electron X-Ray Microanalysis, Plenum, New York, 1992.
MARK DAVIDSON has a background in theoretical physics and experience in semiconductor equipment design and analysis. He has developed physical simulation software for a variety of optical and electron beam instruments, including simulation products and SEM image sharpness improvement utilities. Davidson is the founder and president of Spectel Co. 958 San Leandro Avenue, Suite 600, Mountain View, CA 94043; ph 650/254-0532, fax 650/254-0437.
ANDRAS E. VLADAR received his PhD degree in electronic engineering from the Technical University of Budapest, Hungary, in 1984, and was a research fellow at the Research Institute for Technical Physics of the Hungarian Academy of Sciences. From 1991-1995, he was a guest scientist at the National Institute of Standards and Technology (NIST). Vladar is a member of the technical staff at the ULSI Research Laboratory of Hewlett-Packard Co.