Challenges in the deep-submicron
01/01/1997
Within two or three generations beyond quarter micron, many current mainstream semiconductor manufacturing processes will not be extendable. They will fall short for several reasons: the smaller physical geometries, the greater control required across larger dimensions, and the tighter particle and molecular contamination specs. Controlling, modeling, and practicing complex interactive processes on larger than 200-mm wafers that contain larger than one-inch dice (in the die-size-limited-yield regime) will become our time-consuming preoccupation. In this essay, I offer my personal view on seven major requirements for this "long march" into the deep-submicron arena.
Lithography
Laser-based lithography in the EUV (soft x-ray) and x-ray regimes will challenge the developers of radiation sources to evaluate synchrotron and ion-beam sources for production equipment. Lithographic equipment manufacturers may be called on to offer "complete" solutions that take into account the interaction of the light source, the lens, and the resist. The process of tuning these components to optimize performance may pass from end-users in fabs to the equipment manufacturers. Image prediction systems will be needed in conjunction with physical measurements to match and optimize wavelengths to tools in a manufacturing environment.
Deposition/etch integration
Chips with at least five or six interconnect layers will become a mainstream manufacturing reality in the coming 15 years. Etch equipment will have to evolve into "mini"-CVD factories as the control of surfaces becomes more critical. Tools may have to clean a surface, deposit the required film, and then cap the layer for protection while the wafer is in a single "etch" chamber. This requirement will be superimposed on the need for higher-density plasmas, increased uniformity within the die and across the wafer, and reduced device charging.
CD modeling
The combined litho-etch CD budget will be modeled phenomenologically, based on measured data with algorithmic correlations developed as transfer functions (rather like the lumped circuit models well known to designers). At a recent modeling conference, there were indications that Japanese manufacturers were embarking on "die-level" models. Modeling will probably evolve at several levels. First is the atomistically accurate level, where electrical and physical plasma models incorporating proximity and charging effects may be used to understand the effects of increasing wafer size. At the second level, data-based phenomenological models will, with sufficient investment of time and energy, become truly "predictive." Lastly, and at the other end of the modeling spectrum, the increased use of advanced statistical tools, including neural networks and stochastic processes, will be needed to analyze and predict "once-in-a-while" excursions.
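To make the lumped-model analogy concrete, here is a minimal sketch of a data-based "transfer function" for CD: a least-squares fit of printed CD against exposure dose, cascaded with a fixed etch bias. All numbers (doses, CDs, the bias) are hypothetical illustration values, not process data, and a real phenomenological model would carry many more terms (focus, pitch, loading).

```python
# Hedged sketch: a phenomenological CD "transfer function" built from
# measured data, cascaded like a lumped circuit model. All values are
# hypothetical.

def fit_linear(xs, ys):
    """Closed-form least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Hypothetical measurements: exposure dose (mJ/cm^2) vs. printed CD (nm).
dose = [20.0, 22.0, 24.0, 26.0, 28.0]
cd = [265.0, 258.0, 251.0, 244.0, 237.0]

a, b = fit_linear(dose, cd)   # lumped litho model
ETCH_BIAS_NM = -12.0          # hypothetical fixed etch bias

def predict_final_cd(d):
    """Cascade the litho transfer function with the etch bias."""
    return a + b * d + ETCH_BIAS_NM

print(round(predict_final_cd(25.0), 1))
```

The appeal of such models is exactly the one the lumped-circuit analogy suggests: each module (litho, etch) is characterized once from data, then composed, without simulating the underlying physics.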
Metrology
Metrology will face the challenge of measuring films only 5 to 10 atoms thick with accuracy, speed, and reproducibility. The variation introduced by the measuring instruments themselves may become a larger part of the total allowable process variation. Direct measurements by scanning electron microscope or image recognition may be prone to errors. Instrument-to-instrument matching errors will compound accuracy issues and therefore complicate process control. In-situ endpoints, though challenging, will come as welcome relief to the various "timed" processes that now pervade the industry.
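The budget argument can be put in numbers. Because independent variances add, the observed spread is the root-sum-square of the true process spread and the gauge spread, and the standard precision-to-tolerance ratio shows how quickly a once-negligible gauge error dominates at atomic film thicknesses. The film, sigma, and tolerance values below are hypothetical, chosen only to illustrate the squeeze.

```python
import math

# Hedged sketch: measurement variation eating into the process-control
# budget. All sigma and tolerance values are hypothetical.

def observed_sigma(process_sigma, metrology_sigma):
    """Independent variances add, so spreads combine root-sum-square."""
    return math.sqrt(process_sigma ** 2 + metrology_sigma ** 2)

def precision_to_tolerance(metrology_sigma, tolerance):
    """Gauge-capability metric: 6*sigma_gauge over the full spec window.
    Ratios above roughly 0.3 are generally considered inadequate."""
    return 6.0 * metrology_sigma / tolerance

# Hypothetical 2-nm film: 0.10-nm process sigma, 0.08-nm gauge sigma,
# and a +/-0.2-nm spec (0.4-nm total tolerance).
print(round(observed_sigma(0.10, 0.08), 3))
print(round(precision_to_tolerance(0.08, 0.4), 2))
```

With these illustrative numbers the gauge alone consumes the entire tolerance window, which is the practical meaning of "instrument variation becomes part of the process variation."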
Wet cleans
About 20 years ago, many of us in the industry predicted that, "We will remove all wet-bench processing by the end of the century," and we put in place ambitious programs to make this a reality. It is painfully obvious to those of us still working that this is not likely to happen in our lifetime (and hopefully some of us will live well into the next century). In fact, the trend seems to be just the opposite, with an increased use of wet steps to achieve ever-more-exacting levels of cleaning.
Low-k dielectrics
The deposition and planarization of low-k interlayer dielectrics will be driven by the ballooning back-end delay budget, set by the product of resistance and capacitance (RC). On the face of it, this seems like a simple challenge: decrease both R and C. However, the realization of this goal is complicated by many manufacturing and reliability issues. There are as many divergent solutions in the current literature as there are manufacturers. Process integration issues dominate the current debate.
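A first-order calculation shows why both R and C are targets. The sketch below uses the simplest possible wire model, R = rho*L/(W*T) and a crude parallel-plate capacitance to one neighbor with no fringing, and compares an aluminum/oxide stack against a lower-resistivity metal with a lower-k dielectric. The dimensions and material values are illustrative assumptions, not any manufacturer's numbers.

```python
# Hedged sketch of the back-end RC product. Crude one-neighbor
# parallel-plate model; all dimensions and materials are illustrative.

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def wire_rc(rho, k, length, width, thickness, spacing):
    """R = rho*L/(W*T); C ~ k*eps0*L*T/spacing (no fringing fields)."""
    r = rho * length / (width * thickness)
    c = k * EPS0 * length * thickness / spacing
    return r * c

# Hypothetical 1-mm line, 0.25-um wide and thick, 0.25-um spacing.
al_oxide = wire_rc(2.7e-8, 3.9, 1e-3, 0.25e-6, 0.25e-6, 0.25e-6)
low_rho_low_k = wire_rc(1.7e-8, 2.7, 1e-3, 0.25e-6, 0.25e-6, 0.25e-6)

print(f"{al_oxide / low_rho_low_k:.2f}x")  # geometry cancels in the ratio
```

Because the geometry cancels in the ratio, the improvement is simply (rho1*k1)/(rho2*k2); the hard part, as the essay notes, is not the arithmetic but integrating such materials reliably.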
Operation voltage
Lastly, in the arena of general device characteristics, it seems that the goal of microprocessors operating on a 1.5-V "A" cell battery will be achieved. However, the power consumed by the millions of transistors will be large - very large. Reduced power consumption will become the major goal of both the circuit designers and the software engineers controlling the microprocessors. The interactions between package and silicon will increasingly expose our inability to predict complex time/temperature-dependent reliability. Lower operating voltage chips that operate at tens of gigahertz are within the foreseeable future.
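The pull toward lower supply voltage follows directly from the standard CMOS dynamic-power relation, P = alpha*C*V^2*f: voltage enters squared, so it is the single biggest lever. The chip parameters below (switched capacitance, activity factor, frequency) are hypothetical round numbers used only to show the scaling.

```python
# Hedged sketch: CMOS dynamic (switching) power, P = alpha*C*Vdd^2*f.
# All chip parameters are hypothetical round numbers.

def dynamic_power(activity, cap_farads, vdd, freq_hz):
    """Switching power of CMOS logic: alpha * C * Vdd^2 * f."""
    return activity * cap_farads * vdd ** 2 * freq_hz

# Hypothetical chip: 20 nF total switched capacitance, 10% activity,
# 200 MHz clock, compared at two supply voltages.
p_33 = dynamic_power(0.1, 20e-9, 3.3, 200e6)  # 3.3-V supply
p_15 = dynamic_power(0.1, 20e-9, 1.5, 200e6)  # same chip at 1.5 V

print(round(p_33 / p_15, 2))  # ratio is just (3.3/1.5)**2
```

Everything except voltage cancels in the ratio, which is why dropping from 3.3 V to 1.5 V buys nearly a 5x power reduction before any circuit or software cleverness is applied.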
Conclusion
These are just some of the challenging and achievable goals on this not-so-long march ahead of us. More cooperation between companies will be needed to realize them. User groups, such as those sponsored by the Northern California Chapter of the American Vacuum Society, are a successful example of how people from different companies can come together to discuss details and develop new technical strategies.
In addition to the obvious need for specialized engineers, this tremendous growth will require a great number of well-trained technicians. Trade schools and community colleges need to develop curricula that can produce qualified process techs. The combination of technical vision and training will enable us to cross the bridge into the next century.
Krishna Seshan is a member of the technical staff at Intel Corporation; ph 408/765-0194, e-mail [email protected].