


Automated contamination monitoring: Faster is better


04/01/2005







Across all cleanroom industries, manufacturers are looking for quicker ways to gather more high-quality data from their contamination monitoring devices, but each industry is taking a different approach

By Sarah Fister Gale


Contamination monitoring is the last element of the cleanroom production process to become automated, but as manufacturers reach the limits of their manual monitoring capabilities, demand for new technologies is growing. Production processes continue to grow in complexity, contamination limits are tightening across industries, and the pressure for higher yields and less downtime keeps mounting.


“Continuous contamination monitoring is critical across all cleanroom industries,” says Morgan Polen, vice president of application technology for Lighthouse Worldwide Solutions (San Jose, Calif.). “The cleanroom is a chaotic environment where things can change instantly between routine monitoring intervals,” he says.

As a result, more monitors are gathering more data more frequently, but that data is not being managed or interpreted effectively. There is simply too much of it, in too many places, to get an accurate read on what’s going on with regard to contamination.

Adding to the challenge of keeping watch over all of the possible contamination issues is the fact that the workers doing the monitoring represent the biggest contamination risk. From bacteria transfer in food and pharmaceutical environments, to increased particle counts in wafer fabs, people create problems in the cleanroom and the fewer of them there are, the less likely a facility is to have contamination issues.

For all of these industries, there is a common need for faster and more effective control of data and monitoring with less human involvement (see Fig. 1). However, for each industry, unique sets of guidelines, standards, shelf-life issues, economic drivers and manufacturing techniques have led them to varying tools and technologies and different levels of urgency.

The semiconductor and pharmaceutical industries are at the forefront of implementing revolutionary automated systems because they have the most to lose. For semiconductors, yield determines success, and an undetected burst in particle counts or airborne molecular contamination can ruin a batch and devastate profits. For pharmaceuticals, the stakes are higher: Contamination can endanger patients and threaten the viability of a company and its brands.

In the food and beverage industries, the trend toward automated contamination monitoring is slower due to cost and manufacturing challenges. For these industries, bacteria are the big issue and finding ways to speed the testing process is key, but the return on investment has to be quick and obvious.

Semi industry leads the way with Interface A

In order to remain competitive, semiconductor wafer manufacturers seek to continuously improve overall equipment and manufacturing effectiveness. To facilitate these improvements, they are increasingly implementing computer-based applications for equipment health monitoring, fault detection and classification, run-to-run control, predictive and preventive maintenance, collection and analysis of data from manufacturing equipment, equipment productivity monitoring, in-line defect monitoring, integrated metrology, the reduction or elimination of nonproduct wafers, and equipment matching.


All of these techniques depend on the availability of information, flowing from the manufacturing equipment to software that can either make decisions automatically or present the data to human decision-makers. Easy, centralized access to relevant and accurate data results in better decisions, but in many current fabs that data is decentralized, cumbersome, and difficult to access.


“In the semiconductor industry, you want all of your environmental data to be fed into your Manufacturing Execution System (MES) because real-time monitoring of data is tied directly to the ability to execute,” Polen says. “There may be more than 200 tools in the cleanroom, and if you have high particle counts on one of them, you can have a problem.”

These days most wafer fabs have continuous contamination monitors as part of their overall mission to automate production processes. However, access to the data they generate is complicated. Many of the tools are individually hard-wired to deliver data to Advanced Process Control (APC) applications, creating a complex web of networks over which data is sent in dozens of different directions. For example, particle counts from a specific tool may be fed to a fault detection and classification (FDC) system, but through some proprietary data-gathering scheme.

Complicating matters further is the overwhelming flood of data that results from this vigorous automation process and the multitude of variables that affect it. “The ability to collect data is easy,” says Bill Fosnight, senior vice president of engineering for Brooks Automation (Chelmsford, Mass.), maker of manufacturing efficiency tools for the semiconductor and other manufacturing industries. “The challenge is knowing what to do with it.”

Because the monitors are generally located in the cleanroom itself, they pick up data that could be the result of any number of issues. “A jump in particle counts is meaningless unless you know where it’s coming from,” he says. Fosnight has seen companies invest millions in automated contamination monitoring systems that were useless because nobody knew what to do with the data.

That is all about to change, however, with the recent introduction of Interface A, says Dave Faulkner, executive vice president of sales and marketing for Cimetrix (Salt Lake City, Utah), a provider of connectivity software for the semiconductor industry. Cimetrix makes a family of software products, called CIMPortal, that enables equipment suppliers and IC makers to meet the requirements and intent of high-speed data access through Interface A, a new standard recently released by SEMI and supported by International Sematech.

Since the 1980s, semiconductor manufacturers have relied on SECS/GEM technology to control their automated tools. “SECS/GEM has a lot of history, and it is required for connecting tools to today’s 300-mm MES systems. But the technology is old,” Faulkner says. “There are a lot of things we want to do today with equipment data collection, such as querying the tool to see what data is available, that we can’t because the SECS/GEM technology isn’t capable. SECS/GEM is great for controlling tools and handling recipes, but it wasn’t designed for advanced process control and e-diagnostics.”

In the past, to get around these limitations, manufacturers were forced to hard-wire equipment to APC systems using individual, proprietary networks, which is costly and creates a complicated and cumbersome data delivery system. As processes and control issues become more complex, the need for higher-level access to detailed process control data increases, and hard-wiring every individual piece of equipment is not a feasible option.


Recognizing the problem, SEMI and the International Sematech Manufacturing Initiative (ISMI) created and approved the Interface A standard (also known as Equipment Data Acquisition [EDA]) in late 2004, with publication of the standard in March 2005 (see Fig. 2).

The Interface A standard, based on XML Web services, gives chipmakers access to more complete and useful data for applications such as equipment diagnostics and advanced process control, connecting fab equipment with factory information and control systems. It doesn’t replace SECS/GEM; rather, it creates a new connection, alongside SECS/GEM, over which complex data can be collected from cleanroom equipment.

“The only way an Advanced Process Control (APC) system can work to its potential is if it has excellent data to base decisions on. Interface A will provide that data,” Faulkner says. “It will give fabs the ability to fine-tune their production systems through access to high-quality, high-volume data.”

The standard includes four parts. “E120 Common Equipment Model” defines a common vocabulary for describing the physical structure of every piece of equipment, so that the host and the tool use the same terms when the host queries a tool. “E125 Equipment Self-Description” requires the equipment to use the E120 vocabulary when communicating with the host, so that the host understands the unique characteristics of each tool, how it is built, and what its capabilities are.

“This is a huge stride for the semiconductor industry,” Faulkner says. “Before Interface A, the host had little idea what the equipment had inside it or any context for the data it was retrieving.”

“E132 Equipment Client Authentication/Authorization” defines the security rules for who has access to which data; and “E134 Data Collection Management” defines how the host will request tool-specific data and how the equipment will respond.
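To make E134’s data collection management concrete, here is a minimal sketch, in Python, of how a host-side application might express a data collection plan for a single tool as XML. The element and attribute names are illustrative placeholders, not the actual SEMI E134 schema; the sketch shows only the shape of the exchange: a trigger interval plus a list of parameters named with the shared E120 vocabulary.

```python
# Hypothetical E134-style data collection plan. Element and attribute
# names are placeholders for illustration, not the real SEMI schema.
import xml.etree.ElementTree as ET

def build_collection_plan(tool_id, parameters, interval_s):
    """Build a data-collection-plan request for one tool."""
    plan = ET.Element("DataCollectionPlan", attrib={"toolId": tool_id})
    trigger = ET.SubElement(plan, "Trigger",
                            attrib={"intervalSeconds": str(interval_s)})
    for name in parameters:
        # Each parameter uses the equipment-model name (E120/E125), so
        # host and tool agree on what, say, "ChamberPressure" refers to.
        ET.SubElement(trigger, "Parameter", attrib={"name": name})
    return ET.tostring(plan, encoding="unicode")

print(build_collection_plan(
    "ETCH-07", ["ChamberPressure", "ParticleCount", "RFForwardPower"], 1.0))
```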

A key requirement of the implementation of Interface A is that the technology not simply be layered on top of the existing tool software. Equipment internals behind the interface must be designed to provide dedicated high-throughput data acquisition while maintaining equipment run rates. Layering the technology on top of existing systems may not meet performance and reliability needs, which could cause data quality issues, increased sampling-to-reporting latency, time-stamp faults, and integration problems.

Interface A will eliminate that complicated web of back-door hard-wiring to access information from individual tools, and give manufacturers access to data and control through a single portal. “It will be up to them to determine how they use that data to maximize efficiencies and find their competitive edge,” Faulkner says.

For toolmakers, it could mean completely redesigning their products to meet the data access requirements of the new standard. Some toolmakers already have the architecture in place and will only need to format it correctly to meet the Interface A protocols, whereas others will have to create brand-new avenues for data delivery. “Interface A is going to give semiconductor manufacturers and toolmakers a lot to think about in the coming year,” Faulkner predicts.

The industry is already eager to start incorporating Interface A. According to an ISMI/SEMI poll of ISMI members and equipment suppliers, Interface A is now the key focus of ISMI member companies. Industry predictions suggest that some fabs will begin to implement Interface A as early as the third quarter of 2005, with toolmakers trailing slightly behind, and that by 2006 many more facilities will have begun implementation. Faulkner anticipates that “Interface A is going to change how fabs collect and monitor data to a point that most people in the industry don’t even realize.”

Pharmaceuticals go paperless

There is a growing trend in pharmaceutical facilities, as in the semiconductor industry, to automate contamination monitoring to meet regulatory requirements for validation and data security, as well as to satisfy demands for more cost-effective manufacturing techniques. Automated contamination monitoring in a pharmaceutical environment delivers real-time data throughout batch runs instead of offering checks before and after each run, Polen says. “With constant monitoring, operators can see that nothing changes during aseptic processing steps.”

Many large-scale production facilities are installing automated monitors that can be controlled remotely, reducing the need for human interaction to collect data and allowing constant or more frequent testing.

Most remote contamination monitors are inconspicuous enough to integrate seamlessly into the cleanroom, close to the tools that present the greatest risk.

Remote monitors are typically installed in fixtures, with wiring run through the walls to a client-server application, a Programmable Logic Controller (PLC), or a stand-alone PC outside the cleanroom, where data is constantly delivered. This remote connection allows signals to travel back and forth between the equipment and the PLC, creating two-way communication and allowing operators to gather relevant test data without entering the cleanroom. Specific test parameters to measure pressure, run-time or pass criteria can be sent from the control system to the equipment, and test results can be returned for integration with other manufacturing data, explains Emma Bartin, product manager for Pall Corporation (Portsmouth, UK). Test results can be as simple as a pass/fail notation or as rich as complex details about the facility environment.

If baseline levels of contamination are exceeded, an automated alarm can signal the staff that a problem has occurred, enabling them to respond in real time, increasing the likelihood that they can identify the source before it has significant impact on a product batch.
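As a rough illustration of that loop, the sketch below polls a hypothetical remote monitor, pushes a test parameter out to the device, reads the count back, and raises an alarm when an assumed baseline is exceeded. The RemoteMonitor class, its methods, and both limits are invented for illustration; a real installation would use the instrument vendor’s own interface and validated alert limits.

```python
# Illustrative polling loop for a remote particle monitor. The class,
# method names, and limits below are assumptions, not a vendor API.
import random
import time

BASELINE_COUNT = 100   # assumed action limit (particles per sample)
SAMPLE_SECONDS = 60    # assumed sample duration per reading

class RemoteMonitor:
    """Stand-in for a PLC or instrument connection; real systems would
    talk over serial, Ethernet, or a fieldbus."""
    def start_sample(self, duration_s):
        pass                              # test parameters go out to the device

    def read_count(self):
        return random.randint(0, 150)     # simulated result comes back

def poll_forever(monitor, alarm):
    while True:
        monitor.start_sample(SAMPLE_SECONDS)
        time.sleep(SAMPLE_SECONDS)
        count = monitor.read_count()
        if count > BASELINE_COUNT:        # baseline exceeded: alert the staff
            alarm(f"Count {count} exceeds baseline {BASELINE_COUNT}")

# poll_forever(RemoteMonitor(), alarm=print)
```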

“Historically, the pharmaceutical industry has been slow to change,” says Bartin, noting that it has been only in the last two to three years that demand for automation of contamination monitoring in this industry has increased. She attributes the recent interest to the industry’s growing comfort level with the security of automated software systems in combination with increasing economic pressures to achieve more with less. “They see that the automated processes have been validated and they know that they have less room for operator error.”

There are several benefits to this style of automation: It creates a paperless system with easy-to-retrieve data, sharply reducing the chance of operator error, Bartin explains. Each facility writes custom software to translate and apply the data to its own specifications, creating a customized system and increasing the security of facility data.

Automated, paperless monitoring also falls under the FDA’s 21CFR Part 11 standard for the storage of electronic data. This standard applies to all GxP Information Technology (IT) systems that create, modify, maintain, archive or retrieve electronic records.

The scope of 21CFR Part 11 for pharmaceutical manufacturers is significant. Its intent is to accept and promote the use of new technology while maintaining the ability to protect public health. In other words, it sets ground rules for paperless processes applied to the pharmaceutical and related industries.

This FDA regulation on the use of electronic records and signatures, which went into effect in August 1997, forced many pharmaceutical companies to come to terms with the way they collect and store data and with their use of computer systems. It has been a long time coming, but many now see the standard as an enabling regulation designed to govern paperless processes within the industry.
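Part 11’s core demand on record keeping, that entries be attributable, time-stamped, and protected against silent alteration, can be pictured with a small sketch. This is only an illustration of the idea of a tamper-evident audit trail; a validated system would add access control, electronic signatures, and retention management, none of which is shown here.

```python
# Illustrative tamper-evident audit trail in the spirit of 21CFR Part 11.
# A validated system would do far more; this shows only hash chaining,
# which makes any after-the-fact edit to an entry detectable.
import hashlib
import json
from datetime import datetime, timezone

def append_record(log, user, action, data):
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,                            # attributable to one operator
        "action": action,
        "data": data,
        "prev": log[-1]["hash"] if log else "",  # chain to the previous entry
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    log.append(entry)

audit_log = []
append_record(audit_log, "operator01", "particle_sample",
              {"room": "Fill Suite 2", "count": 12})
```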

“21CFR Part 11 and the FDA’s Guidance for Industry document, Sterile Drug Products Produced by Aseptic Processing, have had a big impact on automation because they spell out the details for monitoring,” Polen says. The FDA’s Guidance includes directives on things such as the distance a monitor can be from a piece of equipment and the frequency of data collection to gain meaningful samples. “It made things a little bit clearer for manufacturers because they can refer to the guidance document.”

The biggest challenge in implementing automated systems, Polen says, is customizing the software to meet the needs of the client’s existing enterprise-wide system. It may be a high-speed network with multiple subnetworks, group access, and firewalls that need to be overcome. “The larger the company, the more red tape there is to navigate,” he says. “The biggest hurdles can often be convincing the IT department to let you implement something on their bandwidth.”

Food and beverage look for faster results

In the food and beverage industries, manufacturers are less concerned with creating completely automated systems to gather contamination data and more interested in speeding the process of receiving test results. Rapid microbiology is the hot new trend in contamination control and environmental monitoring for food and beverage processors, who are constantly looking for faster ways to test for microbes and are frustrated that current standard tests are woefully slow. “Traditional culture media tests use technology that dates back a hundred years and can take several days to incubate,” says Peter Ball, director of business development for Pall Corporation. The majority of testing performed today is based on the recovery and growth of microorganisms using solid or liquid microbiological growth media. This is true in part because these methods can be very effective and have a long history of application in both industrial and clinical settings. However, they are often limited by slow microbial growth rates, the unintended selectivity of microbiological culture, and the inherent variability of microorganisms in their response to culture methods.

In spite of the limitations of current culture methods, acceptance of new and potentially superior methods is often slow, partly because of a lack of clear guidance regarding the demonstration of their equivalence to existing methods. Processors want new methods to get faster results, but the quality and consistency provided by current methods must be maintained.

The solution gaining popularity is Adenosine Triphosphate (ATP) bioluminescence testing. ATP monitoring is changing the way food safety programs operate by delivering sanitation data on the spot, enabling cleaning teams to make informed decisions in real time about the cleanliness of equipment. ATP tests deliver results in 11 seconds, alerting sanitation staff to the presence of bacteria and organic materials before they sanitize. This saves facilities money, because they aren’t wasting expensive sanitizing chemicals on equipment that hasn’t been properly cleaned, and it’s a great training tool, because the cleaning staff gets immediate feedback on whether equipment has been cleaned correctly.

The ATP monitor gives an instant positive or negative result for the presence of bacteria or fungi. It doesn’t identify what the contamination is, but it alerts the staff that there is a problem.
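The decision logic behind that instant result can be sketched in a few lines. The relative light unit (RLU) thresholds below are invented for illustration; in practice each facility validates its own pass, caution, and fail limits for each surface and product.

```python
# Illustrative ATP swab judgment. RLU thresholds are assumptions;
# facilities validate their own limits per surface and per product.
PASS_LIMIT = 10      # assumed RLU at or below which the surface is clean
CAUTION_LIMIT = 30   # assumed RLU above which the surface clearly failed

def judge_swab(rlu):
    if rlu <= PASS_LIMIT:
        return "PASS: proceed to sanitizing"
    if rlu <= CAUTION_LIMIT:
        return "CAUTION: reclean and retest"
    return "FAIL: reclean before sanitizing"

print(judge_swab(4))    # PASS
print(judge_swab(55))   # FAIL
```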

“You get much more information about your processes and you can demonstrate in real time that they are under control,” says Lisa Madsen, technical manager for the food and beverage group at Pall Corporation. If there’s a positive test result, it means the equipment is contaminated, requiring that it be recleaned before it is sanitized and before launching a product run. With culture tests it could take two days to see results, which means several product batches may have been run using unsanitized equipment.

Inadequate sanitation is a factor in many outbreaks of foodborne illness, and ATP monitoring can help prevent it. It also increases the speed to market of products, because they aren’t held up in storage while manufacturers wait for test results. And if there is a contamination incident, it can be resolved before it damages multiple batches of product.

Admittedly, food and beverage makers are slow to adopt new technologies, and there is a significant investment in gearing up for new testing, Madsen explains, noting that ATP testing has been embraced initially by larger corporations and is slowly making its way into smaller facilities as tool prices come down.


Figure 2: The Interface A standard allows chipmakers access to more complete and useful data for applications. Source: ISMI

The tools can be simple, low-cost, handheld devices or complex, sophisticated systems that cost hundreds of thousands of dollars. The complexity of the tool determines its interface and data storage possibilities. For example, the Pallchek Rapid Microbiology System from Pall Corporation measures a count in photons per second, displays the values on an LCD and simultaneously prints them. In normal operating mode there is no interface with a computer or LAN, so 21CFR Part 11, which would otherwise be applicable, does not apply. At the other end of the spectrum, some very sophisticated systems use complex algorithms to determine results and report data. These products require extensive software validation, which makes close collaboration with software validation experts critical for users in the microbiology environment who may have limited experience with it.

Because the decision to perform ATP monitoring is often based on cost, it had been out of reach for smaller facilities. However, Hygiena, designer and manufacturer of the System Sure II ATP monitor, now offers a low-priced model with the same quality of results, making it feasible for smaller companies to invest in the technology and use it more regularly at routine checkpoints in the process.

“Making it more affordable would have a huge impact on smaller food companies and food service operations,” says Joellen Feirtag, a food science specialist and associate professor of food science and nutrition at the University of Minnesota. “At that price, ATP monitoring becomes affordable for everyone.”