Contamination control evolves into a science

Over the last 20 years, contamination control has improved steadily, driven by accumulated experience and advancing technology.

By HAROLD D. FITCH

Over the course of the last 20 years, a new scientific discipline and career path have evolved in response to a widespread lack of understanding of the destructive force of contamination.

My official career in contamination control began in 1977-78. However, my introduction to semiconductors came a decade earlier, in 1968, when IBM was manufacturing a 64-bit memory device in Burlington, VT. I joined the product engineering group to improve yields on this device and to work with our Fishkill, NY, plant to introduce a 128-bit memory into production in Burlington.

My first observation was that our mask fabrication facility, which made the photo masters for our photolithographic processes, operated in a much cleaner environment than our semiconductor line. This led us to process experimental lots of semiconductor product through photolithographic steps set up in the glove box line within the mask fabrication area. The glove box line was really the early precursor of the modern minienvironment, with clean hoods completely sealed off from the surrounding environment. The wafers processed in the cleaner environment showed fewer defects and higher yield, and the experiment kicked off a major effort to improve the semiconductor line.

The glove box concept was not adopted in the semiconductor line, however, because working in the glove box environment was slow and difficult.

In 1977, I was asked by my management to review the contamination control activities in Burlington and to make recommendations for improvements. This involved a thorough audit of the semiconductor and mask manufacturing lines in Burlington, then a review of other clean manufacturing activities at IBM plants worldwide. What became apparent was that good contamination control was both complex and essential to future high yields.

Divide and conquer

The Fishkill plant adopted an approach to contamination control that involved dividing its efforts into four categories: people, environments, processes, and tools.

This was the first step in what eventually was dubbed the “Systems approach to contamination control.” It was a significant move because contamination control is a very large and complex task. Dividing it into these categories made the overall problem far easier to evaluate.

By the 1980s, quite a bit was known about controlling contamination from people and the environment. The lessons learned were not necessarily being followed on the manufacturing floor, however! Gathering all available information, putting it in a logical order, and instructing facilities and manufacturing on proper implementation was our first step.

The years spent uncovering problems rarely seen today added to our body of knowledge about contamination control. For example, downflow rooms were combined with front-intake clean stations, defeating the advantages of the downflow designs. Clean hood blowers, after being shut down and restarted, often ran backwards. The materials of construction were often major sources of contamination themselves. People were poorly garmented. Cleanroom gloves were too short, leaving open the critical area between the garment sleeve and the glove and funneling contamination toward the product.

Measuring defects

It became apparent that the major focus of contamination control should be the product, and that the characteristics of the product should drive the contamination control requirements. This led to the concept of defect density, a measurement of defects or faults per unit area of the product being manufactured. In our case, defect density is measured in defects per square centimeter of chip surface area.
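To make the arithmetic concrete, here is a minimal sketch in Python of how a defect density figure is derived from an inspection count. The counts and areas are purely illustrative and are not figures from the article:

```python
# Minimal sketch of the defect density calculation described above.
# The counts and areas below are illustrative, not figures from the article.

def defect_density(defect_count: int, area_cm2: float) -> float:
    """Return defects per square centimeter of inspected chip surface."""
    return defect_count / area_cm2

# Example: 12 defects found across 40 chips of 1.5 cm^2 each.
total_area = 40 * 1.5          # 60 cm^2 of inspected chip surface
d0 = defect_density(12, total_area)
print(f"Defect density: {d0:.3f} defects/cm^2")   # -> 0.200 defects/cm^2
```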

The need to measure and quantify contamination drove a major effort in measurement and instrumentation. The resulting equipment improvements provided the basis for tightening control in our contamination control system.

The first efforts were directed toward measuring defects on product. At first this was a very tedious process — primarily microscopic examination and counting of defects by operators. Bright light inspection stations provided relief for a period of time. A bright light source, such as a Kodak slide projector, was directed toward the product at an oblique angle, and the reflecting spots were counted. This technique was much faster than physically counting, but it only provided information on the number of defects. Microscopes still were needed to gather additional information such as size and possible cause of the defects.

This led to the development of complex surface inspection equipment by companies like Hamamatsu (who later became Inspex) and Tencor. These systems varied in sensitivity, area of inspection, and type of product being examined. Semiconductors required very high sensitivity — less than two microns — on very reflective surfaces, with complex patterns and small surface area. Photomasks required high sensitivity on transparent substrates. Printed circuits required less sensitivity — larger than ten microns — on much larger surface areas, often several square feet.

By the mid-1980s, improved inspection equipment provided the means to start focusing on the tool element of the four categories mentioned above. It was possible to run product through a specific tool and to measure the defect level before and after that sector. The difference between before and after was the adder for that sector: the amount of contamination that a tool or operation added to the product. It was now possible to measure how much contamination was contributed by a wafer handler, a load or unload operation, an etch process, storage, or most other operations. Sometimes a process couldn't be measured directly because the operation, such as a material deposition step, covered the product surface that existed before it. In such cases, the process was simulated: all steps except the actual deposition were performed, and the result was considered representative of what the deposition process added.
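The bookkeeping behind this before-and-after technique is simple enough to sketch. The following Python fragment is a hypothetical illustration; the sector names and defect figures are invented for the example and are not IBM's actual data:

```python
# Hypothetical illustration of "adder" bookkeeping: inspect the product
# before and after each sector and charge the difference to that sector.
# Sector names and defect densities are invented for this example.

inspections = [
    # (sector, defects/cm^2 before, defects/cm^2 after)
    ("wafer handler", 0.20, 0.22),
    ("load/unload",   0.22, 0.25),
    ("etch",          0.25, 0.31),
    ("storage",       0.31, 0.32),
]

for sector, before, after in inspections:
    adder = after - before    # contamination contributed by this sector
    print(f"{sector:>13}: adder = {adder:+.2f} defects/cm^2")
```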

The tool targets established in this process were at first used internally at IBM, but it soon became apparent that to accelerate tool improvement, the tool manufacturers should know and understand the contamination characteristics of their tools. This way, they could be involved in improving the tools and meeting defect density requirements.

Tightening control

About the same time that progress was being made in tool control, process engineers were looking at tightening the control of the various process steps. Here again, the improved measurement equipment was useful in tracking the contamination of actual process steps such as rinses, strips, etches, and various cleaning operations. While product inspection provided measurements that improved understanding and control, it became apparent that better measurements of the process solutions themselves, and of how they became contaminated, would be important as well. This led to the need for improved air and liquid particle counters. The first emphasis was primarily on control of particles. Commodity targets were generated with particle limits for commodities including air, other process gases, and liquids such as water, cleaning solutions, and reactive chemicals like the silicon etch solutions.

By 1986, defect density targets, tool targets, and commodity targets were combined, and roadmaps of current and future product requirements were generated. In the case of semiconductors, these roadmaps covered four generations of product, a time frame of about 12 years. The first generation represented the product being manufactured at the time, and its targets corresponded closely with product actuals. The next three generations were based on engineering judgment and extrapolation of what would be required to manufacture each subsequent product generation.

Defect density, tool, and commodity targets got tighter with each generation, and, in time, new controls were added. The first chemical commodity targets primarily specified particle contamination, but subsequent generations also specified dissolved impurities, starting at many parts per million and reaching current levels of a few parts per billion.
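As an illustration of how such a roadmap might be tabulated, the Python sketch below tightens each target by an assumed factor of two per generation. The starting values and the tightening factor are assumptions made for the example, not the article's actual targets:

```python
# Illustrative four-generation roadmap in which every target tightens by
# an assumed factor of two per generation. Starting values and the
# tightening factor are assumptions for this sketch, not actual targets.

GENERATIONS = 4
TIGHTEN = 0.5   # assumed: each generation halves the allowed level

targets = {
    "defect density (defects/cm^2)": 0.20,
    "dissolved impurities (ppm)":    10.0,
}

for gen in range(1, GENERATIONS + 1):
    row = ", ".join(
        f"{name}: {value * TIGHTEN ** (gen - 1):.3g}"
        for name, value in targets.items()
    )
    print(f"Generation {gen}: {row}")
```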

The extent of the progress on leading edge products is apparent when you consider that 64 and 128 bits per chip were the state of the art in semiconductors when I started. Now, 64 and 128 million bits per chip are state of the art in manufacturing. When I first became involved in contamination control, one-micron geometry was state of the art; now 0.25 micron is reality.

Today, new industries are adopting contamination control practices all the time, and many of the new applications are still concerned with relatively large contamination.

With this diverse and rapidly growing need for contamination control, it is important to tie it all together. The lessons of the past 20 years have underscored the importance of defect density, tool and commodity targets, and well-thought-out product roadmaps in getting a handle on contamination control.

Harold Fitch is president of Future Resource Development, a consulting firm in Burlington, VT, which specializes in cleanroom education and problem solving.
