


Implementing a strategy for effective fab data management


06/01/2007







IC manufacturing is performed with hundreds of sequential steps, each of which could experience problems leading to yield loss. Maintaining quality in these production lines requires the tight control of hundreds, or even thousands, of process variables. To maintain high yield, high quality, and low cycle times, IC manufacturers are focusing on the development of several critical capabilities of manufacturing automation systems, including: advanced process control (APC); fault detection and classification (FDC); predictive and preventive maintenance (PPM); parametric yield modeling; and scheduling and dispatch optimization [1].

In order to mine the increasingly large volumes of production data effectively, the fab’s manufacturing information technology (IT) systems need to be able to handle enormous amounts of information from diverse sources (equipment, fab applications, databases, human operators, flat files, etc.) (Fig. 1). A robust equipment data acquisition (EDA) framework is a key component in increasing manufacturing speed, accuracy, and agility.

Data management strategy

The goal of the data management system is to provide the infrastructure to transform raw data into consistent, accurate, and reliable business information that can be used to drive manufacturing process changes. A carefully planned and implemented manufacturing data management strategy consisting of people, policies, processes, and tools can ensure that manufacturing data is managed effectively and efficiently.

For effective implementation of automated control systems such as APC and PPM, it is essential to gather a sufficient sample of representative data to develop proper models. Results from early implementations using Cimetrix products in 300mm production facilities show that data collection rates will be on the order of hundreds or thousands of data points/sec. for each process tool [2], potentially generating tens or hundreds of gigabytes of data/day in a single fab.


Figure 1. Flow of manufacturing information.

Effective data management solutions will need to abstract the data collection interface to facilitate integration and correlation of fab data from multiple, heterogeneous data sources. Correlating data from multiple sources and providing context for the equipment data yield a more accurate picture of equipment or process “health” and enable more effective automated decision systems.
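As an illustration of what such an abstraction layer might look like, the Python sketch below normalizes records from heterogeneous sources behind a common interface. It is a minimal sketch only; the DataRecord, DataSource, and FlatFileSource names and the simple CSV-like file format are hypothetical, not part of any SEMI standard or vendor API.

```python
# Illustrative sketch: a minimal abstraction layer for heterogeneous fab data
# sources. All class and field names are hypothetical, not from any standard.
from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any, Dict, Iterable


@dataclass
class DataRecord:
    source_id: str                  # e.g., "ETCH-07" or "MES"
    parameter: str                  # e.g., "ChamberPressure"
    value: Any
    timestamp: datetime
    context: Dict[str, str] = field(default_factory=dict)  # lot, wafer, recipe, step


class DataSource(ABC):
    """Common interface so equipment, databases, and flat files look alike to consumers."""

    @abstractmethod
    def read(self) -> Iterable[DataRecord]:
        """Yield records in a normalized form, regardless of the underlying source."""


class FlatFileSource(DataSource):
    """Example adapter: wraps a simple 'parameter,value' log file (hypothetical format)."""

    def __init__(self, path: str, source_id: str):
        self.path, self.source_id = path, source_id

    def read(self) -> Iterable[DataRecord]:
        with open(self.path) as fh:
            for line in fh:
                parameter, value = line.strip().split(",")[:2]
                yield DataRecord(
                    source_id=self.source_id,
                    parameter=parameter,
                    value=float(value),
                    timestamp=datetime.now(timezone.utc),
                )
```

Because every adapter produces the same record shape, correlation and context resolution can be written once against DataRecord rather than once per source.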

Data quality, metadata, and delivery

The ROI of new data management applications depends directly on the quality of the data the solution produces. Data quality can be defined as the state of completeness, validity, consistency, timeliness, and accuracy that makes data appropriate for a specific use. Data quality is multidimensional and includes many factors beyond accuracy, such as resolution and precision, data completeness, data timeliness (currency), data synchronization, relevance, and appropriate data context.

Data quality management, however, goes beyond the data source because it is not sufficient to rely on the source alone to provide the quality necessary for the consuming application. The data management system must be extensible to allow for the insertion of data quality improvement “rules” between the data source and the data consumer. Proper context resolution, correlation, and synchronization of multiple data sources, data cleansing, and other operations can be performed to improve data quality as it is manufactured into an information product for manufacturing or business analysis applications.
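The sketch below illustrates the idea of inserting quality “rules” between a data source and a data consumer. It is illustrative only: records are plain dictionaries, and the rule names, thresholds, and context fields are assumptions rather than features of any particular product.

```python
# Minimal sketch of quality "rules" applied between a data source and a consumer.
# Rule names, thresholds, and context fields are invented for illustration.
from typing import Callable, Iterable, List, Optional

Record = dict
Rule = Callable[[Record], Optional[Record]]  # a rule returns None to drop a record


def drop_out_of_range(low: float, high: float) -> Rule:
    """Cleansing rule: discard obviously invalid readings."""
    def rule(rec: Record) -> Optional[Record]:
        return rec if low <= rec["value"] <= high else None
    return rule


def attach_context(lot_id: str, recipe: str) -> Rule:
    """Context-resolution rule: tag each record with the material it belongs to."""
    def rule(rec: Record) -> Optional[Record]:
        rec.setdefault("context", {}).update({"lot": lot_id, "recipe": recipe})
        return rec
    return rule


def apply_rules(records: Iterable[Record], rules: List[Rule]) -> Iterable[Record]:
    """Run each record through the rule chain, dropping it if any rule rejects it."""
    for rec in records:
        for rule in rules:
            rec = rule(rec)
            if rec is None:
                break
        if rec is not None:
            yield rec


raw = [{"parameter": "ChamberPressure", "value": 42.0},
       {"parameter": "ChamberPressure", "value": -999.0}]   # sensor glitch
clean = list(apply_rules(raw, [drop_out_of_range(0, 200),
                               attach_context("LOT123", "ETCH_A")]))
```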

Effective data management is often hampered by a lack of knowledge about the data itself. Metadata, or “data about data,” provides a more accurate data profile and promotes understanding and management of how data are derived, the fundamental relationships between them, and how they are used. Understanding these relationships is critical for effective analysis and decision operations across the enterprise.

One key objective of a fab data management strategy is to enable fab engineers to easily create and modify data collection plans using corporate-wide standards and to have these changes automatically update data collection activity at the equipment and other data sources. Descriptive information about the data (what it is, where it is located, value ranges, update rates, etc.) allows fab engineers to visualize the data and its context and to quickly define the correct collection plans.
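A hypothetical sketch of how such parameter metadata might drive the definition and validation of a collection plan is shown below; the parameter names, ranges, and rates are invented for illustration and are not taken from any particular tool or standard.

```python
# Hypothetical representation of a data collection plan built from metadata.
from dataclasses import dataclass
from typing import List


@dataclass
class ParameterMetadata:
    name: str           # what it is
    source: str         # where it lives (tool/module)
    units: str
    valid_min: float    # expected value range, useful for sanity checks
    valid_max: float
    max_rate_hz: float  # fastest rate the source can report it


@dataclass
class CollectionItem:
    parameter: ParameterMetadata
    rate_hz: float


def build_plan(items: List[CollectionItem]) -> List[CollectionItem]:
    """Reject requests that exceed what the metadata says the source can deliver."""
    for item in items:
        if item.rate_hz > item.parameter.max_rate_hz:
            raise ValueError(
                f"{item.parameter.name}: requested {item.rate_hz} Hz exceeds "
                f"maximum {item.parameter.max_rate_hz} Hz"
            )
    return items


pressure = ParameterMetadata("ChamberPressure", "ETCH-07/PM1", "mTorr", 0.0, 200.0, 10.0)
plan = build_plan([CollectionItem(pressure, rate_hz=5.0)])
```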

After collecting the data and ensuring its quality, the data management system must deliver it quickly and accurately to the manufacturing and business applications through data routing services. This requires the system to support integration with multiple, heterogeneous consuming applications.
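One possible shape for such a routing service is sketched below as a simple publish/subscribe router. The subscriber applications and parameter names are illustrative assumptions; a production system would add persistence, security, and delivery guarantees.

```python
# Sketch of a routing service that fans collected data out to multiple consuming
# applications (e.g., an SPC chart and an FDC model). Topic names are illustrative.
from collections import defaultdict
from typing import Callable, Dict, List

Consumer = Callable[[dict], None]


class DataRouter:
    """Route each record to every application subscribed to its parameter."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Consumer]] = defaultdict(list)

    def subscribe(self, parameter: str, consumer: Consumer) -> None:
        self._subscribers[parameter].append(consumer)

    def publish(self, record: dict) -> None:
        for consumer in self._subscribers.get(record["parameter"], []):
            consumer(record)


router = DataRouter()
router.subscribe("ChamberPressure", lambda rec: print("SPC chart update:", rec["value"]))
router.subscribe("ChamberPressure", lambda rec: print("FDC model input:", rec["value"]))
router.publish({"parameter": "ChamberPressure", "value": 42.0})
```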

Any data management solution must integrate with the existing fab IT infrastructure so that the solution leverages the existing manufacturing IT investment and so that those applications can leverage the improvements in data availability, flexibility, and quality.

Adoption of Interface A

The new SEMI Interface A (a communications standard for EDA) provides high-speed, high-quality, flexible data collection from manufacturing equipment. This new technology at the data source enables implementation of a robust manufacturing data acquisition framework.

Although Interface A does not address all of the data management problems, it eliminates a tremendous amount of variability among the equipment data sources; some fabs have more than 60 unique equipment types. It also provides a number of services for easy, remote, dynamic data collection configuration through a generic web service interface.
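The fragment below sketches the flavor of that remote, dynamic configuration: a client posts a plan definition to a tool’s web service endpoint. The endpoint URL, operation name, and XML payload are invented placeholders and do not reproduce the actual SEMI EDA (Interface A) message schemas.

```python
# Hypothetical client sketch: configuring data collection on a tool through a
# generic web service call. The URL, operation, and XML below are placeholders,
# not the real Interface A message definitions.
import requests  # third-party HTTP library

DEFINE_PLAN_REQUEST = """<?xml version="1.0"?>
<DefinePlan>
  <PlanName>EtchChamberMonitor</PlanName>
  <Parameter name="ChamberPressure" rateHz="10"/>
  <Parameter name="RFForwardPower" rateHz="10"/>
</DefinePlan>"""

response = requests.post(
    "https://etch-07.fab.example/dataCollection",  # placeholder tool endpoint
    data=DEFINE_PLAN_REQUEST,
    headers={"Content-Type": "text/xml"},
    timeout=10,
)
response.raise_for_status()
print("Plan accepted:", response.status_code)
```

The point of the sketch is the workflow, not the message format: collection plans are defined and activated remotely, without tool-specific custom interfaces.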

Interface A provides easier access to more data. Early implementations using Cimetrix products on 300mm production equipment have demonstrated that equipment can easily produce data at rates of more than 400 parameters at 10Hz, generating almost 1GB of data/day for each process tool [3].
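A back-of-the-envelope calculation shows how these figures scale. The bytes-per-value figure below is an assumption on our part, since the actual footprint depends on encoding, timestamps, and compression, so treat the result as an order-of-magnitude estimate only.

```python
# Order-of-magnitude check of the data volumes quoted above.
SECONDS_PER_DAY = 24 * 60 * 60


def daily_volume_gb(parameters: int, rate_hz: float, bytes_per_value: float) -> float:
    """Estimate raw data volume per tool per day, in gigabytes."""
    values_per_day = parameters * rate_hz * SECONDS_PER_DAY
    return values_per_day * bytes_per_value / 1e9


# 400 parameters sampled at 10 Hz:
print(daily_volume_gb(400, 10, 3))  # ~1.0 GB/day with a compact ~3-byte encoding
print(daily_volume_gb(400, 10, 8))  # ~2.8 GB/day if every value is an 8-byte double
```

Multiplied across hundreds of tools, per-tool volumes of this size quickly reach the tens or hundreds of gigabytes per day cited earlier for a single fab.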

Solution survey

Although there are various data management methods used in fab-level APC implementations, the successful adoption of Interface A (Fig. 2) will change the nature of data management in fabs. The data consumer-centric approach is based on a key premise used in the implementation of the Interface A standard, which allows multiple clients to independently connect to a single piece of equipment and manage data collection according to the needs of each particular client. Raw data is transferred from the equipment directly to the fab client applications. Different clients can have different views of the data, based on the security and permission configuration.


Figure 2. Introducing Interface A into a typical CIM architecture.

This is a relatively simple architecture with a flexible and extensible framework (Fig. 3). Still, it has some significant limitations. Existing client applications can benefit only if they are modified to provide Interface-A-compliant web services. This framework does not facilitate integration of new data sources other than those accessed through Interface A.


Figure 3. Direct consumer-to-equipment Interface A connections.

Other industry initiatives, such as EEQA (enhanced equipment quality assurance) and EEQM (enhanced equipment quality management), have produced some solutions that are focused primarily on an equipment-centric, “OEM-owned” approach. These types of solutions require the equipment to store all the raw data and to provide data modeling and analysis tools. Although all the data is technically available with this architecture, it is expected that only summary results or exception data will be regularly reported to the fab, which minimizes the impact on the fab networks and the changes to the existing IT infrastructure while still providing additional value to manufacturing operations. However, this approach also introduces an additional database server and analysis application for each tool type. Regardless of whether the application lives inside the equipment or outside, these OEM-provided solutions are typically not well suited for integration with data sources or applications from other manufacturers. This approach increases the number of data “silos” and is not particularly extensible or flexible.

Despite their limitations, legacy solutions have important benefits. They can be relatively inexpensive and quickly developed and deployed. Because they are often tailored to a small set of requirements, they perform those functions very well. However, these applications may not be easily extended or integrated with other systems to provide the additional features necessary for advanced manufacturing automation and long-term continuous improvement.

As none of the individual, independent components can satisfy all requirements, it is necessary to address the gaps in the capabilities of the existing data collection systems and to develop a long-term solution roadmap for providing advanced capabilities to address future requirements. Each of the approaches discussed above provides clear value and allows for incremental investment. It’s not clear, however, that the previous solutions sufficiently cover all of the gaps or provide a suitable framework for future improvements and changing requirements.

The data management system must be viewed as a critical and valuable tool for achieving business objectives, not just as a tool to capture data. Its purpose is to provide common, consistent methods for managing manufacturing data collected from multiple data sources (primarily equipment data provided through Interface A) and to deliver it to various applications within the manufacturing execution environment (MES, SPC, RMS, R2R, YMS, etc.). In other words, the data management system must ensure delivery of the right information to the right place at the right time.

Figure 4 is a conceptual model of a centralized manufacturing data management system consisting of services to support data collection, data quality and transformation, data delivery, management of data collection plans, and configuration and management of metadata.


Figure 4. Manufacturing data management system concept.

Ultimately, such a centralized data management system would eliminate data silos, simplify configuration and integration, ensure end-to-end data integrity, and improve data quality as both the volume of data and the fab’s reliance on that data to run the business increase. Centralization also supports data quality requirements such as data cleansing, data context, proper time stamping, and management of reference data, and it can feed APC systems with quality information products for more accurate models.
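To make the division of responsibilities sketched in Fig. 4 concrete, the outline below names the major service boundaries as abstract interfaces. It is a conceptual sketch under our own naming assumptions, not a description of any specific product; a real system would add plan management, metadata configuration, persistence, and security.

```python
# Conceptual sketch of the service boundaries in a centralized data management
# system (collection, quality/transformation, delivery). Names are illustrative.
from abc import ABC, abstractmethod
from typing import Iterable, List


class CollectionService(ABC):
    @abstractmethod
    def collect(self, plan_name: str) -> Iterable[dict]:
        """Pull raw records from the configured sources for a named collection plan."""


class QualityService(ABC):
    @abstractmethod
    def improve(self, records: Iterable[dict]) -> Iterable[dict]:
        """Apply cleansing, context resolution, and synchronization rules."""


class DeliveryService(ABC):
    @abstractmethod
    def deliver(self, records: Iterable[dict], consumers: List[str]) -> None:
        """Route finished information products to subscribed applications."""


def run_pipeline(collection: CollectionService,
                 quality: QualityService,
                 delivery: DeliveryService,
                 plan_name: str,
                 consumers: List[str]) -> None:
    """End-to-end flow: collect, improve, deliver."""
    delivery.deliver(quality.improve(collection.collect(plan_name)), consumers)
```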

Conclusion

Business demands are requiring more prolific deployment of high-quality automated manufacturing control applications. These analysis and decision tools need better information, and that information is only as good as the data coming in from the manufacturing systems. It is necessary to address the gaps in the capabilities of the existing data collection systems and develop a long-term solution roadmap for providing advanced capabilities to address future manufacturing requirements.

A carefully planned manufacturing data management strategy consists of people, policies, processes, and tools. A critical component of that strategy is the development and deployment of a manufacturing data management system: a computer system that provides the enabling technology for automatic data acquisition, storage, access, analysis, transformation, delivery, and presentation.

References

  1. S. Fulton, H. Wohlwend, “Striving to Realize Productivity in Truly Optimized Fabs,” MICRO, April 2005.
  2. International Sematech, “EEC High-Level Requirements for Advanced Process Control (APC),” June 2004; available at http://ismi.sematech.org/emanufacturing/docs/EECReqs.pdf.
  3. A. Weber, “Preparing the Way for Interface A: Evolution Strategies for APC/EES Applications,” presented at the 17th AEC/APC Symposium, Indian Wells, CA, Sept. 24-29, 2005.

Doug Rust is the co-chairman of the SEMI GEM300 Task Force and is director of customer support at Cimetrix Inc., 6979 South High Tech Drive, Salt Lake City, UT 84047, United States; ph 801/256-6500; e-mail [email protected].

Bill Reid is director of the Data Management Solution Center at Cimetrix Inc.