A user's guide to accurate gas flow calibration

11/01/1996

Dan LeMay, David Sheriff, Unit Instruments Inc., Yorba Linda, California

In 1993, a NIST round-robin survey of 22 flow calibration labs around the US found that a surprising number of labs measured gas flow with errors as large as 8%. Many engineers and technicians apparently rely on consistency rather than accuracy, thinking that if their flow calibrations are the same as last month's, they are accurate enough. This article shows how to make accurate flow standards available throughout the semiconductor industry, and how to transfer these standards from one location to another.

As wafers grow larger and circuit architecture becomes smaller, gas flow calibration accuracy becomes more important for several reasons: to optimize process control, increase yield, and save money. Uniform calibration standards reduce the cost of process tool qualification. When equipment is well calibrated, processes can be exported from lab to production without redevelopment. Yields are maintained in production when flow calibrations can be reproduced accurately over time.

The gauge problem

The best flow metrology laboratories in the world have combined accuracy and reproducibility errors on the order of ±0.2% of flow. Fab engineers expect their mass flow controllers (MFCs) to be accurate to ±1.0% on the process tool. There is a fundamental gauge problem here: the fluid flow laboratory at NIST is only five times better than field requirements. If flow measurement "traceability" is perceived as a chain where devices are calibrated by other devices extending back to a national standards laboratory, then there can be five such generations from NIST to MFC. Consequently, a strategy of exporting flow accuracy from a national laboratory cannot meet the needs of the industry because errors grow 3:1 to 10:1 for each calibration generation.

To ensure that process gas flows are equivalent around the world, every MFC must be no more than one or two generations away from the world's best flow metrology laboratories. In practice, this means every flow metrology lab has to be in that class. Working with NIST, Unit Instruments has developed a cross-check flow calibration reference system that achieves those results. The strategy rests on two fundamental concepts: that flow must be determined by several different primary methods that do not share the same error sources, and that a suitably reproducible comparison artifact must be used to compare the different methods and to cross-check remote sites.

Molecule count

Flow can be expressed in terms of either volume or mass. Mass flow rate is the same from point to point as a gas flows steadily through a leakless system. Volumetric flow changes with density, which changes with pressure and temperature, and so is not the same from point to point. Velocity measurements are a further step removed, since they must be averaged over the flow stream to give bulk volumetric flow. Mass units ensure the exact proportion of reactants. Stoichiometry cannot be maintained using volume measurement alone.

By convention, we use what appear to be volumetric units for mass flow: standard cubic centimeters per minute (sccm) or standard liters per minute (slm). Describing 1 slm of steam as the product of 1 slm of H2 and 0.5 slm of O2 is more intuitive than the equivalent in grams: 0.80 gm/min of steam from 0.09 gm/min of H2 and 0.71 gm/min of O2. The "standard" in sccm makes it a mass unit, tying the cubic centimeters to a defined, standard pressure and temperature. There is a subtlety implicit in this concept that is frequently misunderstood. A standard cubic centimeter is not just a cc of real gas at standard conditions; it is the cc the gas would occupy if it were perfect at standard conditions. A number of real gases occupy less volume than near-perfect gases such as N2 at the same pressure and temperature. Since we know that one gram molecular weight (mole) of a perfect gas has 6.02 × 10^23 molecules and occupies 2.24 × 10^4 cc at standard conditions, one standard cc has 6.02 × 10^23/2.24 × 10^4, or 2.68 × 10^19, molecules. SEMI Standard E12-96 defines "standard density" as the molecular weight divided by 22,413.6, the equivalent perfect gas density, rather than as the real gas density at standard conditions [1, 2].
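The standard-density arithmetic above can be sketched in a few lines of Python; the helper names are hypothetical, but the 22,413.6 scc/mol divisor is the SEMI E12-96 value quoted in the text:

```python
AVOGADRO = 6.022e23         # molecules per mole
STD_MOLAR_VOLUME = 22413.6  # cc occupied by a mole of perfect gas at standard conditions (SEMI E12-96)

def standard_density(molecular_weight):
    """Standard density in gm/scc per SEMI E12-96: M / 22,413.6."""
    return molecular_weight / STD_MOLAR_VOLUME

# One standard cc of any gas contains the same number of molecules:
molecules_per_scc = AVOGADRO / STD_MOLAR_VOLUME  # ~2.68e19

print(round(standard_density(28.0), 6))  # N2: 0.001249 gm/scc
```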

Ideally, we would just put a molecule counter on the pipe, run up a digital total per minute, and divide by a constant to get sccm. Lacking that, we might turn to an authority such as NIST for the definitive reference sccm. Unfortunately, unlike the kilogram, the sccm is a rate, and as such, cannot be preserved on a shelf, let alone shipped by common carrier to other laboratories. NIST has to create the sccm each time it is needed. Others can do this, too, and get results not only traceable to NIST flow standards, but also of equivalent accuracy. For the first time, direct flow comparison and traceability can be achieved using special critical flow nozzles (CFNs) and laminar flow elements (LFEs). Traceability has previously been practical only for the fundamental units from which flow is derived - mass, time, length, temperature, and the force of gravity. Such limited traceability cannot detect systematic errors in the machines that use these units to measure flow rates. CFNs and LFEs are not new, but their use as precision flow comparison artifacts is new. These applications have recently been pioneered by NIST. Small, rugged, and very reproducible, these devices provide methods that can essentially preserve and transport a flow rate. They are capable of comparing different calibrators at the accuracy levels the industry requires. Working with NIST, Unit Instruments has developed the first comprehensive industry calibration traceability program using NIST-calibrated CFNs (Fig. 1).


Figure 1. a) Calibration chain of traceability in conventional setups; b) calibration chain of traceability at Unit Instruments.

Flow calibrators

All flow calibrators can be classified as either primaries or secondaries. Primary calibrators are devices that can derive flow rate from base measurements of mass and time; or volume, pressure, temperature, and time [3]. Secondary calibrators are devices that must be calibrated from some other flow device with higher authority. There are many examples of secondaries, but there are only three known gas flow primaries. Their subtle details are not widely appreciated, even by professional metrologists. The primary gas calibrators are:

 Gravimetric - measures the weight of gas that flows into a tank per unit time.

 Volumetric - measures the volume of gas that flows into a variable-volume container per unit time.

 Rate-of-rise (ROR) - uses the same gas equation with a different twist: instead of variable volume at constant pressure, it uses variable pressure at constant volume.

Each flow calibrator has unique advantages, disadvantages, sources of error, and associated instrumentation requirements. What works best in a given situation depends on the experimental details. A summary of the main characteristics of each type of calibrator follows.

Gravimetric (primary). (See Fig. 2.) The flow to be calibrated is regulated to a constant value by means that are not part of the calibrator. Gas at this constant flow rate is introduced into a container with known tare weight for a measured period of time. The container is then disconnected and weighed again. The average mass flow rate, ṁ, for this timed collection is

ṁ = (m2 - m1)/Δt    (Eqn. 1)

where m1 and m2 are the initial and final container masses and Δt is the collection time. To convert to sccm, this mass flow is divided by the standard density of the gas, which is the molecular weight divided by 2.24 × 10^4 scc/mol [1]. For example, the standard density of nitrogen is simply 28.0 gm/mol divided by 2.24 × 10^4 scc/mol, or 1.25 × 10^-3 gm/scc.
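Eqn. 1 and the sccm conversion can be combined in a short sketch; the function name and example masses are hypothetical:

```python
def gravimetric_sccm(m1, m2, dt_min, molecular_weight):
    """Average flow in sccm from a timed gravimetric collection (Eqn. 1).
    Mass flow (gm/min) divided by standard density (gm/scc) gives sccm."""
    mass_flow = (m2 - m1) / dt_min             # gm/min
    std_density = molecular_weight / 22413.6   # gm/scc, per SEMI E12-96
    return mass_flow / std_density

# N2: 2.50 gm collected in 20 min -> 0.125 gm/min / 1.25e-3 gm/scc
print(round(gravimetric_sccm(0.0, 2.50, 20.0, 28.0), 1))  # ≈ 100 sccm
```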


Figure 2. The gravimetric calibrator weighs gas added to the receiver as a function of time.

There are major advantages to using the gravimetric calibrator. Contributing errors are limited to mass and time measurements and no assumptions about gas compressibility have to be made. Vapors can be handled at low pressure by cryopumping into a receiver immersed in liquid nitrogen and the equipment can be compatible with corrosives. Even though good scales are expensive and delicate, the mass standards used to check them are cheap and rugged.

Gravimetric calibrators cannot usually compute flow during the run; an average rate is determined after the fact. Collection time can be very long in order to get good accuracy at low flows. For example, a typical test for BCl3 at 100 sccm can be done by cryopumping into a bottle having a 1600 gm tare mass. A very good commercial scale for this range has an accuracy of ±0.01 gm and a resolution of ±0.01 gm, so the fractional mass uncertainty is ±0.02/m, where m is the net mass of gas for the test. To keep this error down to ±0.2%, the mass of gas must be at least 10 gm. With a density of 5.23 × 10^-3 gm/scc, collecting 10 gm of BCl3 will take 19 minutes at 100 sccm.
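The collection-time arithmetic in the BCl3 example can be checked with a brief sketch (the helper is hypothetical; the factor of 2 reflects the two weighings, each uncertain by the scale error):

```python
def collection_plan(target_frac_err, scale_err_gm, std_density_gm_per_scc, flow_sccm):
    """Minimum gas mass and collection time for a gravimetric run.
    Net-mass uncertainty is 2 * scale_err, so fractional error = 2 * scale_err / m."""
    min_mass = 2.0 * scale_err_gm / target_frac_err          # gm
    minutes = min_mass / (std_density_gm_per_scc * flow_sccm)  # scc needed / sccm
    return min_mass, minutes

# BCl3 at 100 sccm, ±0.01 gm scale, 0.2% target:
mass, t = collection_plan(0.002, 0.01, 5.23e-3, 100.0)
print(mass, round(t, 1))  # 10.0 gm, ~19.1 min
```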

The problems with measurement accuracy and collection time are illustrated in the example above. If precision-traceable test weights are used to calibrate the scale at values near those measured, then scale errors such as linearity and temperature coefficient can be made negligible. The remaining errors then come from the short-term resolution and repeatability of the scale. The buoyancy of the collection bottle will essentially cancel if the tare and final weights are taken the same day in an air-conditioned lab. Short-term buoyancy effects may be more complex if the container floats in a liquid to reduce tare. By reducing tare in this way, the extremely sophisticated gravimetric calibrator at Oak Ridge National Laboratories [4] achieves the measurement resolution to compute a continuous outlet reading.

As with any timed collection technique, start-stop anomalies must be kept small. These errors include upsets to the regulated flow caused by the sudden opening and closing of valves into the receiver. They can also include variations of gas inventory in the plumbing. If the receiver remains attached to the system during weighing, then there can be relatively large errors due to the forces added by the "flexible" gas connection. If it is detached, errors can result from improper accounting of the unweighed gas left in the plumbing or from carelessly substituting a different safety cap during one of the weighings.

Volumetric (primary). (See Fig. 3.) A constant flow to be calibrated is introduced into an expandable volume of known geometry at constant pressure for a measured period of time. The change of volume is measured and the average flow rate is:

ṁ = M·P·ΔV/(82.056·Z·T·Δt)    (Eqn. 2)

where

ΔV = the change of receiver volume in cubic centimeters

Δt = the run time in minutes

M = the molecular weight of the gas in gm/mol

P = the gas pressure in atmospheres

T = the absolute temperature of the gas in Kelvins

Z = the compressibility factor for the gas at that temperature and pressure


Figure 3. The volumetric calibrator measures the change of volume per unit time at steady pressure and temperature and converts the rate to mass flow.

The 82.056 factor is the gas constant, R, in compatible units. The mass flow, gm/min, can be converted to sccm as detailed for the gravimetric calibrator above.
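Eqn. 2 and the conversion to sccm can be sketched as follows (hypothetical numbers; note that at 1 atm and 0°C a cc of perfect gas is one scc, since 82.056 × 273.15 = 22,413.6):

```python
R = 82.056  # gas constant in cc·atm/(mol·K)

def volumetric_mass_flow(dV_cc, dt_min, M, P_atm, T_K, Z=1.0):
    """Eqn. 2: average mass flow in gm/min from a volumetric collection."""
    return M * P_atm * dV_cc / (R * Z * T_K * dt_min)

def to_sccm(mass_flow_gm_min, M):
    """Divide mass flow by standard density (M / 22,413.6 gm/scc)."""
    return mass_flow_gm_min / (M / 22413.6)

# 2241.36 cc of N2 collected in 1 min at 1 atm, 0 C, Z = 1:
mdot = volumetric_mass_flow(2241.36, 1.0, 28.0, 1.0, 273.15)
print(round(to_sccm(mdot, 28.0), 1))  # ≈ 2241.4 sccm (at 0 C and 1 atm, cc = scc)
```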

From small bubble meters to large bell provers, the range of flow rates covered by volumetric calibrators is sufficiently wide for all but the lowest flow rates. The equipment is simple and generally stable. Since virtually all volumetric devices operate at atmospheric pressure, however, they cannot be used for measuring vapors that are liquid under that condition. The seals between moving parts, usually oil or mercury, can constitute a contamination hazard to both personnel and equipment, and preclude calibration on toxics and corrosives. Gas can diffuse through the film of very simple bubble meters or be absorbed in the liquid reservoir. Manually timed calibrators suffer from inconsistencies in operator technique on the order of 0.25 sec. Data reduction complexity also invites mistakes. Automatically timed calibrators have their own uncertainties, which depend on how the piston's position is detected, particularly if the run is too short. Size and fragility make most volumetric calibrators a poor choice if portability is required.

Gas temperature and pressure must be held constant during the collection period. The lab ambient pressure is usually sufficiently stable, but the pressure inside the receiver may be different because of friction or inaccurate counterbalancing of the piston or bell weight. An uncompensated difference of just one inch of water gives a flow error of 0.3%. Barometric accuracy is also a large error source. Temperature can vary, especially during rapid collections, due to Joule-Thomson expansion cooling of the input gas. An error of just 0.3°C gives a 0.1% error in flow rate. Further, estimating Z for heavy gases can produce significant errors.

Common operational errors include failure to read the pressure inside the volume or failure to read a barometer in the same room with the calibrator. Air handling equipment typically changes the ambient pressure enough to matter. Gas temperature inside the volume cannot be measured accurately, and the assumption that the equipment temperature is indicative of the gas temperature may be in error because of the incoming gas temperature or the placement of heat-generating equipment near the calibrator. In piston or bubble calibrators, failure to keep the tube or the sealing fluid clean can introduce timing errors.

ROR (primary). (See Fig. 4.) Calibration gas flows at a constant rate into a receiver of known volume and the pressure and temperature in the volume are measured as a function of time during the collection. At any time the mass of gas in the volume obeys the relationship

m = M·P·V/(82.056·Z·T)    (Eqn. 3)

where the variables M, V, P, Z, and T are the same as those defined for Eqn. 2 and the 82.056 factor is the gas constant, R, in compatible units. The flow rate is inferred from this equation by measuring the change of P while all other factors remain constant. This can be done in either of two ways.


Figure 4. The ROR calibrator measures the rate-of-rise of pressure in a receiver of known volume and converts it to mass flow.

1) Ramp mode. The slope of the pressure-time ramp is computed in real time and the flow is displayed during the collection. This mode is more convenient for making adjustments to the device under test, but the accuracy suffers from dynamic flow and sensing problems.

2) Batch mode. The tank pressure is measured before the test run. A constant flow bypassing the receiver is switched into it for a measured period and then to the bypass again. The final pressure is measured a few seconds later, after conditions have stabilized. This less convenient mode gives only the average flow rate for the run, but with greater precision.
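The batch-mode calculation follows directly from Eqn. 3: the pressure rise in a fixed, isothermal volume gives the moles collected. A sketch, with a hypothetical tank size and pressures, and Z taken as 1 for a near-perfect gas:

```python
R = 82.056  # gas constant in cc·atm/(mol·K)

def ror_batch_sccm(V_cc, P1_atm, P2_atm, T_K, dt_min, Z=1.0):
    """Batch-mode ROR per Eqn. 3: average flow from the pressure rise
    in a known volume; 22,413.6 scc/mol converts moles to standard cc."""
    moles = (P2_atm - P1_atm) * V_cc / (R * Z * T_K)
    return moles * 22413.6 / dt_min

# 10-liter receiver, 0.10 atm rise over a 2-minute run at 295 K:
print(round(ror_batch_sccm(10000.0, 0.50, 0.60, 295.0, 2.0), 1))  # ≈ 463 sccm
```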

If all of the hardware in the ROR is made from compatible materials such as stainless steel, it can be used with toxics and corrosives. An ROR can calibrate heavy vapors at low absolute pressures, where their compressibility factors are negligible. On the downside, the ROR presents a changing backpressure to the upstream equipment under test unless isolated by a CFN or backpressure regulator. While the ROR is the most rugged and portable of the primaries, it is typically too bulky to qualify as carry-on luggage when used in field work (see "Rate of rise Gas Stick technology" on page 90).

Rate of rise Gas Stick technology

The Technical Staff, MKS Instruments Inc., Andover, Massachusetts

Process engineers can now calibrate and check mass flow controllers, without system shutdowns or MFC disconnection, using a new rate-of-rise (ROR) Gas Stick installed in the gas manifold line between all of the MFCs and the process chamber. The ROR Gas Stick is compact enough to fit within existing gas box designs.

The accuracy of mass flow controllers can be critical to the performance of CVD, PVD, and - particularly - etch processes. The industry is used to operating with MFCs well outside manufacturers' stated accuracies because of the difficulty in performing accurate calibrations on a routine basis (see accompanying article). Unfortunately, this kind of variation can have a major impact on processing. In a titanium nitride deposition process, for example, a 5% change in the argon:nitrogen ratio could affect the color, resistivity, and stress of the film. Such variations make it difficult to reproduce process performance from tool to tool.

ROR measurements for MFC calibration are based on the basic gas law PV = nRT, where P = pressure, V = volume, n = number of moles of gas, R is the universal gas constant, and T = temperature.

Pressure is measured at two points in time while gas flows into a given closed volume at a known temperature. Currently, MFCs are checked and calibrated either by using the ROR method with the process chamber and accompanying gas lines as the active volume or by removing the MFC from the tool and using a dedicated calibration cart with an appropriate calibrated volume and set of gauges, valves, and electronics. Each of these methods has limitations and drawbacks.

With 4-10 MFCs per gas box, it is time-consuming and costly to remove and check each one using the calibration cart method. When the MFCs are connected to gases such as BCl3, there is also the potential for corrosion damage to the MFC if moisture comes in contact with the gas residue. The calibration measurements themselves can only be carried out with surrogate or purge gases such as N2, rather than the actual process gas.

For existing process chamber ROR methods, the system must be shut down before measurements can be taken. Shutdowns are very costly and result in lost production time. The readings may vary, depending on environmental conditions or the condition of chamber walls, or modifications to the system configuration may change the gas volume. Even in stable conditions, gas temperatures are often estimated and volumes can change. Also, the reading results provide only an average MFC flow over the entire rate of rise period.

Because current methods are cumbersome and interrupt processing, MFCs are most often calibrated on a casual basis or checked only after a problem has occurred. Too often, when a processing problem occurs, the MFC is suspected, removed, checked, and found to be all right. This unnecessary replacement is a costly burden during an already stressful troubleshooting process.

To simplify the MFC calibration and checking process, we have developed a specialized ROR Gas Stick that uses in-line ROR measurements for accurate determination of MFC flow rates and performance on an ongoing basis, without shutting the process system down or removing the MFC from the tool. With this ROR Gas Stick, process and equipment engineers can monitor MFC performance and detect problems before they degrade process performance or system production time.

The ROR Gas Stick avoids costly shutdowns and time-consuming MFC test procedures and results in more consistent MFC performance. ROR Gas Sticks can be connected to the gas manifold, in-line between all the MFCs and the process chamber within the gas box (see figure). With the ROR bypass feature, process gases can flow directly to the process chamber or to the ROR Gas Stick to provide quick and accurate flow measurements. Calibrations can be performed in 6-60 sec for flows from 1 sccm to 1 slm. Gas Stick monitoring provides MFC references of better than 1% of reading, and precision of better than 0.2% of reading. MFCs do not have to be removed during the process, thus extending usable lifetime, enhancing performance, ensuring stable conditions, and maintaining system purity.


Schematic diagram of Gas Stick configuration.

Each Gas Stick consists of integrated valves, a calibrated volume, a specially modified Baratron Pressure Transducer, and electronics. The ROR electronics operate the valves, monitor continuous pressure changes, compensate for gas temperature, record data, and analyze stability. All components use metal-sealed, ultraclean technology to avoid contamination. The Gas Stick also allows operators to alternate between purge gas and actual process gases for calibration.

This new technology allows MFC calibration to be incorporated as part of the regular process tool operation routine. Calibration routines and stability checks can be performed daily prior to processing, or between process runs, to check for any indications that controllers may be drifting or failing. By tracking several small changes in calibration or noting a nonstable rate of rise, system operators can be alerted to a potential MFC problem before misprocessing occurs or valuable product is lost.

Proactive calibration techniques help ensure that MFC performance is consistent and accurate. Gas ratios can be maintained at much more constant levels than those to which the industry is presently accustomed. If process errors do occur, equipment engineers can now quickly eliminate MFCs as a problem source before pulling the system apart, and focus their diagnostic efforts on other system components.

Equation 3 presumes isothermal operation, so any change of temperature during the run, such as that due to adiabatic compression or Joule-Thomson expansion, causes an error. It is not possible to compensate for dynamic temperature changes because they cannot be measured with adequate accuracy. Since there is not much energy in the adiabatic compression heating, it usually transfers to the tank walls within seconds. Therefore, compression heating error is virtually eliminated in the batch mode operation, but limits the pressure ramp rates that can be accommodated in the ramp mode.

The volume of the ROR must be correctly identified as the total volume whose inventory of gas changes with the measured pressure. This total includes all plumbing volumes and the exit volume in the upstream device under test or the CFN or regulator used to isolate that device from changing pressure. Alterations in these volumes due to valve position changes or switching of test units must be taken into account. During pressure ramps, the gas density in the flow lines can be vastly different from that in the tank because of pipe friction drops and the Bernoulli effect at the pipe flow velocities. In-situ RORs that check MFCs by the rate-of-rise in the process chamber of a semiconductor tool suffer from these plumbing velocity errors, and from errors caused by temperature gradients in the process chamber. Timing errors due to the change of flow conditions in the plumbing at start and stop can be significant, and become more so as flow increases and the length of the run decreases.

One feature is common to all three primaries: they are batch devices. Since the measured flow is directed to the calibrators instead of through them, they are "dead ends" during calibration. The primaries fill up (or empty) and thus cannot be run steadily for long periods or connected in series for simultaneous comparison. To cross-check two of these primaries directly, the upstream calibrator must run in the batch-empty mode into the downstream one as batch-fill. This procedure is cumbersome, but still a good validation test.

The size, weight, cost, delicacy, and limited portability of most primaries make it clear that a stable, portable, flow-through secondary is needed as a transfer standard. Such a standard is necessary to validate primaries periodically, to compare them with NIST in a single step, and to carry main-plant metrology values to remote sites.

In search of a portable flow artifact

NIST addressed this problem when contracted by SEMATECH to explore the gas flow calibration discrepancies in the semiconductor industry. MFCs had been used as transfer devices with some success, but they were not sufficiently reproducible when transported frequently and used under differing conditions. George E. Mattingly, leader of the Fluid Flow Group at NIST, investigated the use of LFEs and CFNs as transfer standards. The following is a short description of each.

Critical flow nozzle (secondary). A critical flow nozzle looks like a tiny quartz venturi tube with a throat diameter on the order of 0.1 mm. When the pressure ratio across the CFN is greater than ~2, the flow reaches the speed of sound in the gas (sonic throat velocity) and is described as "choked." When choked, the flow rate is a function only of the upstream conditions:

ṁ = Cd·C*·At·Po/√(R·T)    (Eqn. 4)

where

Cd = the discharge coefficient

C* = a gas function dependent upon composition and state [5]

At = the nozzle throat area

Po = the upstream gas pressure

R = the gas constant for the test gas

T = the gas temperature

The calibration of the CFN against a higher standard establishes Cd as a function of the throat Reynolds number (Re). This function can be determined by fitting Cd vs. Re/Cd using a least-squares third-order polynomial regression. Once completed, the flow rate can be calculated from Eqn. 4, given Po and T for the inlet gas.
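A least-squares fit of the form the article describes might be sketched as follows; the calibration points are hypothetical, and the x-values are scaled before fitting for numerical conditioning:

```python
import numpy as np

# Hypothetical CFN calibration points: Re/Cd vs. measured discharge coefficient.
re_over_cd = np.array([2.0e3, 5.0e3, 1.0e4, 2.0e4, 5.0e4])
cd = np.array([0.962, 0.972, 0.979, 0.984, 0.989])

# Third-order least-squares polynomial Cd = f(Re/Cd), per the article.
x = re_over_cd / 1.0e4  # rescale to avoid an ill-conditioned Vandermonde matrix
coeffs = np.polyfit(x, cd, 3)

def discharge_coeff(re_over_cd_value):
    """Evaluate the fitted Cd at a given Re/Cd."""
    return float(np.polyval(coeffs, re_over_cd_value / 1.0e4))

print(round(discharge_coeff(1.0e4), 3))  # close to the 0.979 calibration point
```

With Cd in hand, Eqn. 4 gives the flow rate from the measured Po and T.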

The benefit of using CFNs is that they are small, rugged, portable, and stable. The auxiliary instrumentation needed to measure absolute pressure and temperature is simple and can generally be certified locally to adequate accuracy. Unlike primaries, CFNs are flow-through devices that can be connected in series with other flow meters. They must, however, be calibrated against a standard having better accuracy than that needed for the CFN. Data reduction is simple once it is set up, but getting it right in the first place takes some work. The accurate work done by NIST applies only to nitrogen, and it considers flow rates below 100 sccm still experimental. CFNs should work for other well-behaved gases, but not with vapors. NIST quotes ±0.25% for a 650 sccm size, and ±0.31% for a 200 sccm size CFN. These are conservative figures based on three standard deviations plus an additional ±0.1% estimated systematic error. At Unit Instruments, CFNs were found to agree consistently with the primaries within ±0.25%.

The small quartz nozzles must be mounted in metal fittings for practical use. Care must be taken to avoid leakage past the CFN, partial blockage, or stress that could crack the quartz. Small bypass leaks cannot be detected because they cannot be isolated from the main flow through the nozzle. Operation below the choking pressure ratio must be avoided; it is prudent to monitor downstream pressure and flag the test if the ratio drops below the value recorded during CFN calibration.

Laminar flow element (secondary). Laminar flow through a capillary tube follows the equation

Q = C·P·ΔP·D^4/(μ·L·T·Z)    (Eqn. 5)

where

C = a constant

P = the inlet absolute pressure

ΔP = the pressure drop

D = the inside diameter of the round capillary

μ = the gas viscosity

L = the length

T = the absolute temperature

Z = the gas compressibility at the operating state point.

For LFEs that are not circular in shape, the D^4 term will change to some other function of the geometry. Converting to the terms of the previous equations,

ṁ = C·M·P·ΔP·D^4/(μ·L·T·Z)    (Eqn. 6)

where

M = molecular weight in gm/mol.

An advantage of the LFE is that, for other conditions held constant, the flow is linearly proportional to ΔP. Extremely small flow rates can be accommodated. The pressure drop can be quite small to accommodate situations where limited pressure is available. However, gas viscosity is a complicated function of temperature, pressure, and composition, and so is not always known to the accuracy desired. The calibration of a differential pressure transducer for very small ΔP values can also pose accuracy limitations. The small passages in the LFE are sensitive to contamination. Some designs, such as annular passages, can be extremely sensitive to temperature gradients or transients because differential temperatures between the core and the shell affect the gap directly. Flow varies with the third power of the gap distance between the core and the shell. As with any secondary, LFEs must be calibrated against systems with higher accuracy. Theoretically, the LFE can also be a primary calibrator per Eqn. 5. However, the dependency on the fourth power of a very small dimension makes it difficult to know the numbers with an accuracy sufficient for primary purposes.
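The linearity in ΔP and the strong geometry dependence can be illustrated directly from Eqn. 5; all values below are arbitrary placeholders, since C is unknown and only ratios matter:

```python
def lfe_flow(C, P, dP, D, mu, L, T, Z):
    """Eqn. 5 sketch: laminar capillary flow; C lumps the geometric constants."""
    return C * P * dP * D**4 / (mu * L * T * Z)

base = lfe_flow(1.0, 1.0, 0.01, 1.0, 1.0, 1.0, 1.0, 1.0)

# Linearity: doubling the pressure drop doubles the flow.
print(lfe_flow(1.0, 1.0, 0.02, 1.0, 1.0, 1.0, 1.0, 1.0) / base)  # 2.0

# D^4 dependence: a 1% diameter error is a ~4% flow error,
# which is why the LFE is impractical as a primary.
print(round(lfe_flow(1.0, 1.0, 0.01, 1.01, 1.0, 1.0, 1.0, 1.0) / base, 3))  # 1.041
```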

Commercial LFEs designed for precision laboratory use are generally comparable to CFNs, but their more delicate nature may increase uncertainty when they have been exposed to unfiltered gas or shipped as cargo. Sensitivity to shock and contamination means that LFEs should not be used for long periods without calibration spot checks. LFEs' sensitivity to temperature change requires monitoring of initial runs for several minutes to ensure stabilization has been reached. When switching to a different gas, the system must be flushed thoroughly.

Measurement uncertainties

Much has been written about measurement uncertainty [6, 7]. In a nutshell, the total uncertainty of a measurement having a number of independent individual uncertainties is the square root of the sum of the squares (usually called the RSS value) of all the error contributions. The RSS procedure has the effect of making the trivial uncertainties even more negligible, and so focuses on the big errors. For example, the RSS uncertainty for a calibrator with one 0.200% uncertainty and four 0.020% uncertainties is just 0.204%. The big-ticket items in the total uncertainty of all of the above methods except gravimetric are pressure and temperature.
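The RSS combination is a one-liner; the example below reproduces the arithmetic for one dominant 0.200% term and four 0.020% terms:

```python
import math

def rss(*fractional_errors):
    """Root-sum-square combination of independent uncertainties."""
    return math.sqrt(sum(e * e for e in fractional_errors))

# The dominant term swamps the small ones:
print(round(rss(0.200, 0.020, 0.020, 0.020, 0.020), 3))  # 0.204
```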

Uncertainties in absolute pressure are the most difficult to resolve. Primary standards for absolute pressure include deadweight testers and mercury barometers, both of which depend on gravity. The mercury barometer makes its own vacuum reference at the top of its mercury column, but the deadweight tester must be the type that can be pumped down to a measured vacuum.

The local acceleration of gravity, g, varies with latitude, altitude, and local effects. Over the continental United States, it varies about ±0.1%, and that is too much to ignore. Standard gravity is decreed by SI to be 980,665 mgals (milligalileos: a galileo is one cm/sec/sec), and the "ideal" variation as a function of latitude is shown in Fig. 5. Local surface gravity can be additionally corrected for the distance the barometer is above the ground at a lapse rate of 0.3 mgals/m.
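The latitude dependence shown in Fig. 5 can be approximated with the 1980 international gravity formula; the formula coefficients and the 0.3086 mgal/m free-air lapse rate are standard values not given in the article, so treat this as a sketch:

```python
import math

def surface_gravity_mgal(latitude_deg, height_m=0.0):
    """Ideal sea-level gravity from the 1980 international gravity formula,
    less a free-air correction (~0.3 mgal/m, as the article notes)."""
    s = math.sin(math.radians(latitude_deg)) ** 2
    s2 = math.sin(math.radians(2.0 * latitude_deg)) ** 2
    g = 978032.7 * (1.0 + 0.0053024 * s - 0.0000058 * s2)  # mgals
    return g - 0.3086 * height_m

# Mid-latitude gravity is close to, but not equal to, standard gravity (980,665 mgals):
print(round(surface_gravity_mgal(45.0)))  # ≈ 980620 mgals
```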


Figure 5. The local surface gravity varies up to ±0.1% within the continental U.S.

Mercury barometers must also be corrected for temperature and the mercury must be clean because both affect mercury density. Using these barometers also requires a bit of technique, so it is worthwhile to assess that component of a lab's uncertainty by getting readings from two or more technicians.

Unit Instruments recently conducted its own round-robin tests of barometer standards in the Los Angeles area using a precision absolute pressure instrument as a transfer standard. In an effort to find why the errors exceeded 0.2%, each lab was asked to explain how the calibration was determined on its own barometric standard. One eminent metrologist said that his standard didn't need calibration because it was digital. Another said that he calls the local airport, which is at about the same altitude; however, he was unaware that airports correct their readings to sea level, and that air conditioning can pressurize the lab.

Temperature calibration is relatively easy, and most labs have good standards and techniques. However, a deviation as small as 0.3°C constitutes a 0.1% fractional error in absolute temperature. Actual gas temperature is very difficult to measure because the heat capacity of even a very small temperature sensor is much greater than that of the gas that surrounds it; the sensor tends to change the local temperature of the gas more than the gas changes the temperature of the sensor. Unstable temperature environments can be created by setting warm power supplies near any part of the gas containment, by turning the air conditioning off at night and not allowing adequate stabilization in the morning, or by running at high flow rates without allowing time for heat transfer to attenuate the effects of adiabatic compression or Joule-Thomson cooling.
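
The 0.3°C figure follows directly from the absolute (kelvin) scale, and the same fractional error propagates straight into any ideal-gas flow calculation:

```python
# A 0.3 degC temperature error, expressed on the absolute scale at
# room temperature (20 degC assumed for illustration):
t_kelvin = 273.15 + 20.0
fractional = 0.3 / t_kelvin
print(f"{fractional:.4%}")  # on the order of 0.1% of the flow reading
```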

Unsteady ambient temperature is not a serious problem with any equipment types except LFEs, as long as the temperature is measured on the container wall and remains steady during the run. The gas assumes the temperature of the container walls rather quickly. Some LFEs, however, are extremely sensitive to ambient temperature transients because the size of the tiny laminar flow passages is directly affected by differences in the thermal expansion of the internal elements.

The portable flow artifact payoff

In his SEMATECH-sponsored study of flow metrology capability, Dr. Mattingly selected CFNs as portable artifacts, and proved them capable of carrying the NIST flow uncertainty of ±0.22% around the country as checked luggage. Over a period of four months (Jan. through April, 1993) he visited 22 lab sites, including commercial metrologists, wafer fabs, and MFC manufacturers. Dr. Mattingly ran 55 tests at these sites to check the accuracy of the local "standards." The results were shocking, if not surprising. Labs quoting accuracies in the range of ±0.1% to ±0.5% were found to have errors as high as 8% [8].

The 22 metrology labs were (in alphabetical order): Advanced Micro Devices, AT&T, Brooks Instruments, Coastal Instruments, Digital Equipment Corp., Drytek, DXL, Foss, Intel, MKS, Motorola, National Cash Register, National Semiconductor, National Semiconductor Maryland, NIST, Precision Flow Devices, Rockwell International, Sandia National Laboratories, SEMATECH, Texas Instruments, Tylan General, and Unit Instruments.

What was going wrong? In most cases the problems were minor, but accurate calibration is intolerant of even small errors. An old salt with a wet finger in the wind can calibrate air flow to ±10%, but ±1% requires good equipment and procedures, and ±0.25% requires sophistication and attention to detail. Sources of error include:

 Generational decay. Every time a secondary is calibrated from a higher-authority calibrator, the uncertainty grows.

 Equipment maintenance. Sources of error include leaks, friction in devices with movable parts, contamination, and sensor drift.

 Equipment design. Poor design can give both systematic offsets and erratic results. Issues include poor placement of temperature or pressure sensors, slow transducers that fail to indicate unsteady operation, and sensitivity to temperature gradients and transients. Laminar flow devices are notorious for this.

 Basic measurement accuracies. With each type of calibrator, the results depend upon the measurement of some combination of weight, time, pressure, and temperature. Most use relatively stable sensors that are easily checked; if electronics are involved, however, there can be additional problems with warm-up time, zero drift, temperature coefficient, etc. As shown above, barometer readings and absolute pressure are deceptively difficult to validate.

 Neglected compensations. Was the internal pressure measured when using a volumetric standard, or was it assumed to be one atmosphere? Were corrections made for real gas compressibility under the actual test conditions? Was the mercury barometer reading corrected for the temperature of the mercury and for local gravity? Was the buoyancy correction updated for any change in atmospheric density since the tare weight of the collection bottle was measured?

 Gas purity. When various gases are tested and equipment is sensitive to gas properties, it is important to flush well after changes, to use a tailpipe to prevent back diffusion, and to protect the purity of the source bottles by using vacuum-purged connections.
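
The first item, generational decay, can be illustrated with the RSS rule discussed earlier: each time a device is calibrated against a parent standard, its uncertainty is the RSS of the parent's uncertainty and the transfer error. The 0.2% parent and 0.5% per-generation transfer error below are hypothetical figures chosen for illustration:

```python
import math

def next_generation(parent_uncertainty, transfer_uncertainty):
    """Uncertainty (% of flow) of a device calibrated against a parent
    standard: the parent's uncertainty RSS-combined with the transfer
    error introduced by the calibration itself."""
    return math.hypot(parent_uncertainty, transfer_uncertainty)

# Illustrative chain: a 0.2% NIST-class lab, then a 0.5% transfer
# error at every hand-off down the traceability chain.
u = 0.2
for gen in range(1, 6):
    u = next_generation(u, 0.5)
    print(f"generation {gen}: +/-{u:.2f}%")
```

With these example numbers, the ±1.0% budget a fab expects of its MFCs is already consumed by the fourth generation, which is why the article insists on staying within one or two generations of a first-class lab.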

Having shown the magnitude of the calibration problem in the 300 to 800 sccm flow range, NIST's initial round-robin report called for many labs to "re-assess their measurement processes and contact NIST for retesting." NIST also proposed similar tests for the full flow range of interest, 10 to 50,000 sccm, where larger errors are expected, but that project still awaits funding. In the meantime, the rest of the flow range goes untested, and labs are left to diagnose their own problems and return to NIST for fee-based retesting.

In further support of semiconductor industry interest in low flows, some leak rate measurement tests were conducted by Stuart Tison, who leads the Pressure and Vacuum Group at NIST. Tison tested thermal mass flowmeters from five manufacturers; ranges were two to five sccm nitrogen full scale. All five met the manufacturers' specifications for stability, but only three met the accuracy specs [9, 10]. The other two were off by 9% and 17%. This implied that products in this range are good, but the flow metrology to calibrate them is inadequate.

Current practices

In an interview last March, three years after completion of the round-robin, Dr. Mattingly said, "I have seen no evidence that would lead me to believe that calibration is being done any better now than it was when we did the round-robin study in 1993. The only companies which called us back to recheck their calibration were companies that did not have the problem, companies like MKS and Unit Instruments." He doubted that the other labs are calibrating to NIST flow standards, since they have not asked NIST for rechecks.

The multiple-standards approach

To achieve calibration uncertainties in the 0.2% range, metrology laboratories must adopt a comprehensive approach. Multiple flow standards covering overlapping ranges are needed so that one can be checked against another. The standards should be of different types so that they do not share the same systematic errors. For example, all volumetric primaries depend upon a barometer reading, so a bell prover and a mercury-sealed piston could agree with each other yet both be wrong by the same amount if the barometer reading is in error.

A multiple-standards approach can be very reassuring when all calibrators agree within the target uncertainty, and very revealing when they don't. Agreement between a NIST-calibrated CFN and a volumetric standard within 0.2% is good, but could be pure coincidence. If a check point is also run laboriously with a gravimetric calibrator and it falls within the combined uncertainties of the two calibrators, one can be confident of the absolute accuracy of the measurement, not just of the agreement of several devices. This is the benefit of having several different types of standards.
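
The acceptance test implied here is a simple one: two calibrators agree if their difference is within the RSS of their individual uncertainties. The helper below and its example readings are hypothetical, sketched from the logic in the text:

```python
import math

def agree_within_combined(reading_a, u_a_pct, reading_b, u_b_pct, nominal_flow):
    """Return True if two calibrator readings agree within the RSS
    of their quoted uncertainties (both in percent of flow).
    Hypothetical helper illustrating the cross-check criterion."""
    combined_pct = math.hypot(u_a_pct, u_b_pct)
    difference_pct = abs(reading_a - reading_b) / nominal_flow * 100.0
    return difference_pct <= combined_pct

# A CFN reading 500.4 sccm vs. a gravimetric check reading 499.8 sccm
# at a nominal 500 sccm, each quoted at +/-0.2% (illustrative numbers):
print(agree_within_combined(500.4, 0.2, 499.8, 0.2, 500.0))
```

When such a check fails, the disagreement does not say which calibrator is wrong, only that at least one error budget is understated; that is precisely what makes the cross-check revealing.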

As an example of the multiple-standards approach, Unit Instruments maintains all three types of primaries. The gravimetric calibrator uses NIST-certified weights as standards. The RORs are calibrated by pressure transfer from a dimensionally certified standard volume. The volumetrics (bell provers) are certified by the state, using a volume transfer method. The pressure and temperature instruments used in conjunction with the primaries are certified by laboratories whose standards are traceable to NIST counterparts, and whose capabilities have been verified.
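
The rate-of-rise (ROR) computation mentioned above reduces, in its simplest form, to ideal-gas accounting: the pressure slope in a known volume gives a molar flow, which is then expressed at the 0°C, 101.325 kPa standard conditions of SEMI E12 [1]. This sketch omits the real-gas compressibility correction flagged earlier, and the volume and slope values are purely illustrative:

```python
R = 8.314462      # J/(mol*K), universal gas constant
P_STD = 101_325.0 # Pa, standard pressure per SEMI E12
T_STD = 273.15    # K (0 degC), standard temperature per SEMI E12

def ror_flow_sccm(dp_dt_pa_per_s, volume_cm3, gas_temp_k):
    """Ideal-gas rate-of-rise flow: molar accumulation in a known
    volume, converted to standard cm^3/min. Sketch only; a working
    standard would add a real-gas (compressibility) correction."""
    molar_flow = dp_dt_pa_per_s * (volume_cm3 * 1e-6) / (R * gas_temp_k)  # mol/s
    std_cc_per_s = molar_flow * R * T_STD / P_STD * 1e6                   # scc/s
    return std_cc_per_s * 60.0

# A 1-liter volume rising at 10 Pa/s with the gas at 22 degC:
print(f"{ror_flow_sccm(10.0, 1000.0, 295.15):.2f} sccm")
```

Note how the result depends on exactly the quantities the article identifies as the big-ticket uncertainties: absolute pressure (through dP/dt and P_STD), volume, and absolute gas temperature.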

Unit Instruments uses both CFNs and LFEs as secondaries. Some CFNs are certified by NIST, and others are certified by a lab whose standards are traceable to NIST. These CFNs are used within the main plant to cross-check between the primaries, and agreement is required within ±0.25%. Since all three types of primaries can be operated in the batch-empty mode as well as the more common batch-fill mode, pairs of different primaries are occasionally operated one into the other as a primary-primary check. The CFNs are taken to Unit's service centers in Europe and the Far East on a regular basis to cross-check the standards used there.

This multiple-standards approach gives high confidence in absolute flow measurement. The sources of traceability include standards from NIST, the state, and certified laboratories. They include independent standards of weight, volume, temperature, and pressure. There is no single parameter common to all, and thus no possible systematic error that could make all the flow measurements equally wrong. This agreement within the prescribed uncertainty among four independently derived standards gives confidence not just of their relative agreement, but of the absolute flow measurement.

References

1. SEMI Standard E12-96, "Standard for standard pressure, temperature, density, and flow units used in mass flow meters and mass flow controllers," Semiconductor Equipment and Materials Institute, 1996.

2. D. B. LeMay, "Redefining the Standard Cubic Centimeter," Semiconductor International, June 1994.

3. G. E. Mattingly, "Flow Measurement Proficiency Testing for International Intercomparisons and Traceability," presented at Flomeko `96, Beijing, Oct. 1996.

4. C. J. Remenyik, J. O. Hylton, "An Instrument for Gravimetric Calibration of Flow Devices with Corrosive Gases," Proc. 41st Int. Instrumentation Symposium, Aurora, CO, May 1995.

5. R. C. Johnson, "Real-Gas Effects in Critical-Flow-Through Nozzles and Tabulated Thermodynamic Properties," NASA Tech. Note D-2565, Jan. 1965.

6. "Guide to the Expression of Uncertainty in Measurement," ISO/TAG 4/WG3: Second Edition, 1993.

7. B. N. Taylor, C. E. Kuyatt, "Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results," NIST Technical Note 1297, 1994 edition.

8. G. E. Mattingly, "Round-Robin Mass Flow Controller Gas Flow Calibration Test Results Using the NIST-SEMATECH-Mark 1 Gas Flow Measurement Artifact," Final Report #93051631A-ENG, May 30, 1993.

9. S. A. Tison, "A Critical Evaluation of Thermal Mass Flowmeters," J. Vac. Sci. Technol., July/Aug. 1996.

10. S. A. Tison, "Accurate Flow Measurement in Vacuum Processing Using Mass Flow Controllers," Solid State Technology, p. 73, Oct. 1996.

DAN LEMAY has 30 years of experience with gas flow measurement and control. He was educated at Caltech and MIT, and was the co-founder of Tylan Corp. (now Tylan General). He was awarded a SEMMY by the Semiconductor Equipment and Materials Institute for his contribution to the development of the thermal gas mass flow controllers now common in all wafer fabs. He holds numerous patents, mostly relating to gas mass flow control, and he is currently a senior scientist for Unit Instruments in Yorba Linda, California.

DAVID SHERIFF is the director of marketing for Unit Instruments. He is a former co-chairman of both the subsystems and MFC standards committees for SEMI, and also organized and chaired a SEMI task force to develop a standard for digital MFC communication. He was part of a SEMATECH working group that developed a new set of MFC test methods and oversaw the creation of the MFC test lab at the Oak Ridge National Laboratory. Unit Instruments, Yorba Linda, CA; ph 714/921-2640, e-mail [email protected]