Calibration


An example of a device whose calibration is off: a weighing scale that reads ½ ounce without any load.

Calibration is the process of establishing the relationship between the indication of a measuring device and the value of a known quantity. A quantity of known value is set or produced with one device, and the same quantity is then measured, in as similar a way as possible, with a second device; the quantities handled by the two devices need not be expressed identically so long as they are equivalent. The device with the known or assigned correctness is called the standard. The second device is the unit under test, test instrument, or any of several other names for the device being calibrated.

The formal definition of calibration by the International Bureau of Weights and Measures is the following: "Operation that, under specified conditions, in a first step, establishes a relation between the quantity values with measurement uncertainties provided by measurement standards and corresponding indications with associated measurement uncertainties (of the calibrated instrument or secondary standard) and, in a second step, uses this information to establish a relation for obtaining a measurement result from an indication."[1]

History

Origins

The words "calibrate" and "calibration" entered the English language as recent as the American Civil War,[2] in descriptions of artillery. Some of the earliest known systems of measurement and calibration seem to have been created between the ancient civilizations of Egypt, Mesopotamia and the Indus Valley, with excavations revealing the use of angular gradations for construction.[3] The term "calibration" was likely first associated with the precise division of linear distance and angles using a dividing engine and the measurement of gravitational mass using a weighing scale. These two forms of measurement alone and their direct derivatives supported nearly all commerce and technology development from the earliest civilizations until about AD 1800.[4]

Calibration of weights and distances (c. 1100 CE–)


Early measurement devices were direct, i.e. they had the same units as the quantity being measured. Examples include length using a yardstick and mass using a weighing scale. At the beginning of the twelfth century, during the reign of Henry I (1100-1135), it was decreed that a yard be "the distance from the tip of the King's nose to the end of his outstretched thumb."[5] However, it was not until the reign of Richard I (1197) that documented evidence of standardization appears.[6]

Assize of Measures
"Throughout the realm there shall be the same yard of the same size and it should be of iron."

Other standardization attempts followed, such as the Magna Carta (1225) for liquid measures, until the Mètre des Archives from France and the establishment of the Metric system.

The Industrial Revolution and the calibration of pressure (c. 1600 CE–)

One of the earliest pressure measurement devices was the mercury barometer, credited to Torricelli (1643),[7] which read atmospheric pressure using mercury. Soon after, hydrostatic manometers were designed, with a linear calibration for measuring lower pressure ranges. The Industrial Revolution (c. 1760 CE – c. 1840 CE) saw widespread use of indirect measuring devices, in which the quantity being measured was derived functionally based on direct measurements of dependent quantities.[8] During this time, scientists discovered the energy stored in compressed steam and other gases, leading to the development of gauges more practical than hydrostatic manometers for measuring higher pressures.[9] One such invention was Eugène Bourdon's Bourdon tube, an indirect-reading design.

Indirect reading design showing a Bourdon tube from the front (left) and the rear (right).
Direct reading design of a U-tube manometer
Gas pump with rotary flow indicator (yellow) and nozzle (red)

In the direct reading hydrostatic manometer design on the left, an unknown applied pressure Pa pushes the liquid down the right side of the manometer U-tube, while a length scale next to the tube measures the pressure, referenced to the other, open end of the manometer on the left side of the U-tube (P0). The resulting height difference "H" is a direct measurement of the pressure or vacuum with respect to atmospheric pressure. The absence of pressure or vacuum would make H=0. The self-applied calibration would only require the length scale to be set to zero at that same point.

This direct measurement of pressure as a height difference depends on both the density of the manometer fluid, and a calibrated means of measuring the height difference.
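As a rough illustration of that dependence, the sketch below converts a manometer height difference to a gauge pressure using P = ρgH; the mercury density, the standard gravity value, and the function name are assumptions for illustration only.

```python
# Minimal sketch: converting a U-tube manometer height difference to a
# gauge pressure. The fluid density and gravity values are assumptions
# for illustration (mercury at roughly 13,595 kg/m^3, standard gravity).

MERCURY_DENSITY = 13_595.0   # kg/m^3, approximate density of mercury
STANDARD_GRAVITY = 9.80665   # m/s^2

def manometer_pressure(height_difference_m: float,
                       fluid_density: float = MERCURY_DENSITY,
                       g: float = STANDARD_GRAVITY) -> float:
    """Gauge pressure (Pa) implied by a height difference H in metres.

    P = rho * g * H; H = 0 corresponds to zero gauge pressure, which is
    the self-calibration point mentioned in the text.
    """
    return fluid_density * g * height_difference_m

# Example: 760 mm of mercury is roughly one standard atmosphere.
print(manometer_pressure(0.760))  # ~101,325 Pa
```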

In a Bourdon tube (shown in the two views on the right), applied pressure entering from the bottom through the silver barbed pipe tries to straighten a curved tube (or a vacuum tries to curl the tube to a greater extent), moving the free end of the tube, which is mechanically connected to the pointer. This is an indirect measurement that depends on calibration to read pressure or vacuum correctly. No self-calibration is possible, but generally the zero pressure state is correctable by the user, as shown below.

Even in recent times, direct measurement is used to increase confidence in the validity of the measurements.

The age of the automobile (c. 1900 CE–)

In the early days of US automobile use, people wanted to see the gasoline they were about to buy in a big glass pitcher, a direct measure of volume and quality via appearance. By 1930, rotary flowmeters were accepted as indirect substitutes. A hemispheric viewing window allowed consumers to see the blade of the flowmeter turn as the gasoline was pumped (see image on the right). By 1970, the windows were gone and the measurement was totally indirect.

Indirect measurement always involves linkages or conversions of some kind. It is seldom possible to intuitively monitor the measurement. These facts intensify the need for calibration.

Most measurement techniques used today are indirect.

Modern calibration

A U.S. Navy Airman performing a calibration procedure on a temperature test gauge

Modern metrology calibration targets different types of industrial instruments that can be categorized based on the physical quantities they are designed to measure. Exact categorizations vary internationally, e.g., NIST 150-2G in the U.S.[10] and NABL-141 in India.[11] Together, these standards cover instruments that measure various physical quantities such as electromagnetic radiation (RF probes), time and frequency (intervalometer), ionizing radiation (Geiger counter), light (light meter), mechanical quantities (limit switch, pressure gauge, pressure switch), and thermodynamic or thermal properties (thermometer, temperature controller). The standard instrument for each test device varies accordingly, e.g., a dead weight tester for pressure gauge calibration and a dry block temperature tester for temperature gauge calibration.

A U.S. Navy Machinist Mate using a 3666C auto pressure calibrator

Calibration methods for modern devices can be both manual and automatic, depending on what kind of device is being calibrated. The picture on the left shows a U.S. Navy Airman performing a manual calibration procedure on a pressure test gauge. The procedure is complex,[12] but overall it involves the following: (i) depressurizing the system, and turning the screw, if necessary, to ensure that the needle reads zero, (ii) fully pressurizing the system and ensuring that the needle reads maximum, within acceptable tolerances, (iii) replacing the gauge if the error in the calibration process is beyond tolerance, as this may indicate signs of failure such as corrosion or material fatigue.
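A minimal sketch of the pass/fail logic behind steps (ii) and (iii) might look like the following; the 1% tolerance, the check points and the function name are illustrative assumptions, not values from the referenced Navy procedure.

```python
# Illustrative sketch of the pass/fail logic of a manual gauge check:
# compare readings at the zero and full-scale points against the applied
# standard and a tolerance. The 1% tolerance and the point list are
# assumptions for illustration only.

def gauge_within_tolerance(readings, applied, full_scale, tol_pct=1.0):
    """True if every reading is within tol_pct of full scale of the
    corresponding applied (standard) value."""
    limit = full_scale * tol_pct / 100.0
    return all(abs(r - a) <= limit for r, a in zip(readings, applied))

# Zero and full-scale check on a 0-100 psi gauge.
applied_points = [0.0, 100.0]   # pressures set by the standard
as_found = [0.3, 99.6]          # gauge indications
print(gauge_within_tolerance(as_found, applied_points, full_scale=100.0))  # True
```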

In contrast, the picture on the right shows the use of a 3666C automatic pressure calibrator,[13] which is a device that consists of a control unit housing the electronics that drive the system, a pressure intensifier used to compress a gas such as nitrogen, a pressure transducer used to detect desired levels in a hydraulic accumulator, and accessories such as liquid traps and gauge fittings.

Basic calibration process

Calibration Target of the "Mars Hand Lens Imager (MAHLI)" (September 9, 2012) (3-D image).

Purpose and scope

The calibration process begins with the design of the measuring instrument that needs to be calibrated. The design has to be able to "hold a calibration" through its calibration interval. In other words, the design has to be capable of measurements that are "within engineering tolerance" when used within the stated environmental conditions over some reasonable period of time.[14] Having a design with these characteristics increases the likelihood of the actual measuring instruments performing as expected. Basically, the purpose of calibration is to maintain the quality of measurement as well as to ensure the proper working of a particular instrument.

Frequency

The exact mechanism for assigning tolerance values varies by country and industry type. The measuring equipment manufacturer generally assigns the measurement tolerance, suggests a calibration interval (CI) and specifies the environmental range of use and storage. The using organization generally assigns the actual calibration interval, which is dependent on this specific measuring equipment's likely usage level. The assignment of calibration intervals can be a formal process based on the results of previous calibrations; a minimal interval-adjustment sketch follows the excerpts below. The standards themselves are not clear on recommended CI values:[15]

ISO 17025[16]
"A calibration certificate (or calibration label) shall not contain any recommendation on the calibration interval except where this has been agreed with the customer. This requirement may be superseded by legal regulations.”
ANSI/NCSL Z540[17]
"...shall be calibrated or verified at periodic intervals established and maintained to assure acceptable reliability..."
ISO-9001[18]
"Where necessary to ensure valid results, measuring equipment shall...be calibrated or verified at specified intervals, or prior to use...”
MIL-STD-45662A[19]
"... shall be calibrated at periodic intervals established and maintained to assure acceptable accuracy and reliability...Intervals shall be shortened or may be lengthened, by the contractor, when the results of previous calibrations indicate that such action is appropriate to maintain acceptable reliability."

Standards required and accuracy

The next step is defining the calibration process. The selection of a standard or standards is the most visible part of the calibration process. Ideally, the standard has less than 1/4 of the measurement uncertainty of the device being calibrated. When this goal is met, the accumulated measurement uncertainty of all of the standards involved is considered to be insignificant when the final measurement is also made with the 4:1 ratio.[20] This ratio was probably first formalized in Handbook 52 that accompanied MIL-STD-45662A, an early US Department of Defense metrology program specification. It was 10:1 from its inception in the 1950s until the 1970s, when advancing technology made 10:1 impossible for most electronic measurements.[21]

Maintaining a 4:1 accuracy ratio with modern equipment is difficult. The test equipment being calibrated can be just as accurate as the working standard.[20] If the accuracy ratio is less than 4:1, the calibration tolerance can be reduced to compensate. At 1:1, only an exact match between the standard and the device being calibrated produces a completely correct calibration. Another common method for dealing with this capability mismatch is to reduce the accuracy of the device being calibrated.

For example, a gage with 3% manufacturer-stated accuracy can be changed to 4% so that a 1% accuracy standard can be used at 4:1. If the gage is used in an application requiring 16% accuracy, having the gage accuracy reduced to 4% will not affect the accuracy of the final measurements. This is called a limited calibration. But if the final measurement requires 10% accuracy, then the 3% gage can never be better than 3.3:1. Then perhaps adjusting the calibration tolerance for the gage would be a better solution. If the calibration is performed at 100 units, the 1% standard would actually be anywhere between 99 and 101 units. The acceptable values of calibrations where the test equipment is at the 4:1 ratio would be 96 to 104 units, inclusive. Changing the acceptable range to 97 to 103 units would remove the potential contribution of all of the standards and preserve a 3.3:1 ratio. Continuing, a further change of the acceptable range to 98 to 102 restores more than a 4:1 final ratio.
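The arithmetic of this example can be restated as a minimal sketch; the figures are the ones used in the text, and the guard-banding step simply subtracts the standard's possible error from the acceptance limits.

```python
# Worked version of the arithmetic above, using the figures from the text.

nominal = 100.0           # calibration point, in units
standard_accuracy = 1.0   # % : the standard may read anywhere in 99..101
gage_tolerance = 4.0      # % : gage accuracy after being reduced from 3%
final_requirement = 10.0  # % : accuracy required of the final measurement

# Acceptance band at the stated gage tolerance.
lo, hi = nominal - gage_tolerance, nominal + gage_tolerance       # 96.0 .. 104.0

# Guard-banding: shrink the band by the standard's possible error so a
# passing result cannot be caused by standard error alone.
lo_gb, hi_gb = lo + standard_accuracy, hi - standard_accuracy     # 97.0 .. 103.0

print(f"acceptance band: {lo}..{hi}, guard-banded: {lo_gb}..{hi_gb}")
print(final_requirement / 3.0)   # ~3.33 : the 3% gage against a 10% requirement
print(final_requirement / 2.0)   # 5.0   : ratio if acceptance is tightened to 98..102
```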

This is a simplified example. The mathematics of the example can be challenged. It is important that whatever thinking guided this process in an actual calibration be recorded and accessible. Informality contributes to tolerance stacks and other difficult-to-diagnose post-calibration problems.

Also in the example above, ideally the calibration value of 100 units would be the best point in the gage's range to perform a single-point calibration. It may be the manufacturer's recommendation or it may be the way similar devices are already being calibrated. Multiple point calibrations are also used. Depending on the device, a zero unit state, the absence of the phenomenon being measured, may also be a calibration point. Or zero may be resettable by the user; there are several variations possible. Again, the points to use during calibration should be recorded.

There may be specific connection techniques between the standard and the device being calibrated that may influence the calibration. For example, in electronic calibrations involving analog phenomena, the impedance of the cable connections can directly influence the result.

Process description and documentation

All of the information above is collected in a calibration procedure, which is a specific test method. These procedures capture all of the steps needed to perform a successful calibration. The manufacturer may provide one or the organization may prepare one that also captures all of the organization's other requirements. There are clearinghouses for calibration procedures such as the Government-Industry Data Exchange Program (GIDEP) in the United States.

This exact process is repeated for each of the standards used until transfer standards, certified reference materials and/or natural physical constants, the measurement standards with the least uncertainty in the laboratory, are reached. This establishes the traceability of the calibration.
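As a rough sketch of how uncertainty accumulates along such a chain, independent contributions are commonly combined by root-sum-square; the example values below are illustrative only.

```python
# Minimal sketch: combining the uncertainties accumulated along a
# traceability chain by root-sum-square, a common treatment when the
# contributions are independent. The example values are illustrative only.

from math import sqrt

def combined_uncertainty(contributions):
    """Root-sum-square of independent standard uncertainties."""
    return sqrt(sum(u * u for u in contributions))

# Reference standard -> transfer standard -> working standard -> instrument
chain = [0.01, 0.03, 0.10]   # standard uncertainties, in instrument units
print(combined_uncertainty(chain))  # ~0.105: dominated by the last link
```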

See Metrology for other factors that are considered during calibration process development.

After all of this, individual instruments of the specific type discussed above can finally be calibrated. The process generally begins with a basic damage check. Some organizations such as nuclear power plants collect "as-found" calibration data before any routine maintenance is performed. After routine maintenance and deficiencies detected during calibration are addressed, an "as-left" calibration is performed.

More commonly, a calibration technician is entrusted with the entire process and signs the calibration certificate, which documents the completion of a successful calibration.
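A minimal sketch of the data an as-found/as-left record and the resulting certificate might capture is shown below; the field names and values are illustrative, not taken from any particular standard or certificate format.

```python
# Illustrative sketch of an as-found / as-left calibration record.
# Field names and values are assumptions for illustration only.

from dataclasses import dataclass
from datetime import date

@dataclass
class CalibrationRecord:
    instrument_id: str
    standard_id: str              # standard used, for traceability
    as_found: dict[float, float]  # applied value -> indication, before adjustment
    as_left: dict[float, float]   # applied value -> indication, after adjustment
    calibrated_on: date
    due_on: date
    technician: str

record = CalibrationRecord(
    instrument_id="PG-0042", standard_id="DWT-7",
    as_found={0.0: 0.4, 100.0: 99.2},
    as_left={0.0: 0.0, 100.0: 100.1},
    calibrated_on=date(2015, 6, 1), due_on=date(2016, 6, 1),
    technician="J. Smith",
)
```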

Success factors

The basic process outlined above is a difficult and expensive challenge. The cost for ordinary equipment support is generally about 10% of the original purchase price on a yearly basis, as a commonly accepted rule-of-thumb. Exotic devices such as scanning electron microscopes, gas chromatograph systems and laser interferometer devices can be even more costly to maintain.

The extent of the calibration program exposes the core beliefs of the organization involved. The integrity of organization-wide calibration is easily compromised. Once this happens, the links between scientific theory, engineering practice and mass production that measurement provides can be missing from the start on new work or eventually lost on old work.

The 'single measurement' device used in the basic calibration process description above does exist. But, depending on the organization, the majority of the devices that need calibration can have several ranges and many functionalities in a single instrument. A good example is a common modern oscilloscope. There could easily be 200,000 combinations of settings to calibrate completely, and there are limitations on how much of an all-inclusive calibration can be automated.

Every organization using oscilloscopes has a wide variety of calibration approaches open to them. If a quality assurance program is in force, customers and program compliance efforts can also directly influence the calibration approach. Most oscilloscopes are capital assets that increase the value of the organization, in addition to the value of the measurements they make. The individual oscilloscopes are subject to depreciation for tax purposes over 3, 5, 10 years or some other period in countries with complex tax codes. The tax treatment of maintenance activity on those assets can bias calibration decisions.

New oscilloscopes are supported by their manufacturers for at least five years, in general. The manufacturers can provide calibration services directly or through agents entrusted with the details of the calibration and adjustment processes.

Very few organizations have only one oscilloscope. Generally, they are either absent or present in large groups. Older devices can be reserved for less demanding uses and get a limited calibration or no calibration at all. In production applications, oscilloscopes can be put in racks used only for one specific purpose. The calibration of that specific scope only has to address that purpose.

This whole process is repeated for each of the basic instrument types present in the organization, such as the digital multimeter pictured below.

A digital multimeter (top), a rack-mounted oscilloscope (center) and control panel

The picture above also shows the extent of the integration between quality assurance and calibration. The small horizontal unbroken paper seals connecting each instrument to the rack prove that the instrument has not been removed since it was last calibrated. These seals are also used to prevent undetected access to the adjustments of the instrument. There are also labels showing the date of the last calibration and, as dictated by the calibration interval, when the next one is needed. Some organizations also assign a unique identification to each instrument to standardize the record keeping and keep track of accessories that are integral to a specific calibration condition.

When the instruments being calibrated are integrated with computers, the integrated computer programs and any calibration corrections are also under control.

Quality

To improve the quality of the calibration and have the results accepted by outside organizations, it is desirable for the calibration and subsequent measurements to be "traceable" to the internationally defined measurement units. Establishing traceability is accomplished by a formal comparison to a standard which is directly or indirectly related to national standards (such as NIST in the USA), international standards, or certified reference materials. This may be done by national standards laboratories operated by the government or by private firms offering metrology services.

Quality management systems call for an effective metrology system which includes formal, periodic, and documented calibration of all measuring instruments. The ISO 9000[18] and ISO 17025[16] standards require that these traceable actions be carried out to a high level and set out how they can be quantified.

Instrument calibration

Calibration may be called for (a short due-check sketch follows this list):

  • a new instrument
  • after an instrument has been repaired or modified
  • when a specified time period has elapsed
  • when a specified usage (operating hours) has elapsed
  • before and/or after a critical measurement
  • after an event, for example
    • after an instrument has had a shock, vibration, or has been exposed to an adverse condition which may potentially have put it out of calibration or damaged it
    • sudden changes in weather
  • whenever observations appear questionable or instrument indications do not match the output of surrogate instruments
  • as specified by a requirement, e.g., customer specification, instrument manufacturer recommendation.
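A minimal due-check mirroring a few of the triggers above (elapsed calendar time, operating hours, adverse events) might look like the following sketch; the thresholds and function name are assumptions for illustration.

```python
# Minimal due-check sketch for a few of the listed triggers. The
# thresholds (365 days, 2000 operating hours) are assumptions only.

from datetime import date

def calibration_due(last_calibrated: date, today: date,
                    operating_hours: float, adverse_event: bool,
                    max_days: int = 365, max_hours: float = 2000.0) -> bool:
    if adverse_event:                            # shock, vibration, exposure, etc.
        return True
    if (today - last_calibrated).days >= max_days:
        return True
    return operating_hours >= max_hours

print(calibration_due(date(2015, 1, 1), date(2015, 9, 1), 500.0, False))  # False
```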

In general use, calibration is often regarded as including the process of adjusting the output or indication on a measurement instrument to agree with the value of the applied standard, within a specified accuracy. For example, a thermometer could be calibrated so that the error of indication or the correction is determined, and adjusted (e.g. via calibration constants) so that it shows the true temperature in Celsius at specific points on the scale. This is the perception of the instrument's end-user. However, very few instruments can be adjusted to exactly match the standards they are compared to. For the vast majority of calibrations, the calibration process is actually the comparison of an unknown to a known and recording the results.
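As a rough sketch of the adjustment step described above, calibration constants for a linear correction can be derived from two comparison points so that the corrected indication agrees with the standard; the thermometer readings below are illustrative values only.

```python
# Minimal sketch: derive linear calibration constants (slope and offset)
# from two comparison points so that the corrected indication agrees
# with the standard. The readings are illustrative values only.

def linear_constants(ind_lo, ref_lo, ind_hi, ref_hi):
    """Return (slope, offset) mapping indicated values onto reference values."""
    slope = (ref_hi - ref_lo) / (ind_hi - ind_lo)
    offset = ref_lo - slope * ind_lo
    return slope, offset

def correct(indicated, slope, offset):
    return slope * indicated + offset

# Thermometer compared to a standard at the ice point and near boiling.
slope, offset = linear_constants(ind_lo=0.4, ref_lo=0.0, ind_hi=99.2, ref_hi=100.0)
print(correct(50.0, slope, offset))  # ~50.2 degC after correction
```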

International

In many countries a National Metrology Institute (NMI) will exist which will maintain primary standards of measurement (the main SI units plus a number of derived units) which will be used to provide traceability to customers' instruments by calibration. The NMI supports the metrological infrastructure in that country (and often others) by establishing an unbroken chain, from the top level of standards to an instrument used for measurement. Examples of National Metrology Institutes are NPL in the UK, NIST in the United States, PTB in Germany and many others. Since the Mutual Recognition Agreement was signed, it is now straightforward to take traceability from any participating NMI, and it is no longer necessary for a company to obtain traceability for measurements from the NMI of the country in which it is situated.

To communicate the quality of a calibration, the calibration value is often accompanied by a traceable uncertainty statement to a stated confidence level. This is evaluated through careful uncertainty analysis. Sometimes a DFS (Departure From Spec) is required to operate machinery in a degraded state. Whenever this happens, it must be in writing and authorized by a manager with the technical assistance of a calibration technician.
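Such an uncertainty statement is commonly an expanded uncertainty: the combined standard uncertainty multiplied by a coverage factor (k = 2 corresponds to roughly 95% confidence for a normal distribution). A minimal sketch, with illustrative contribution values, follows.

```python
# Minimal sketch of a traceable uncertainty statement: combine the
# standard uncertainty contributions by root-sum-square and expand by a
# coverage factor. The contribution values are illustrative only.

from math import sqrt

contributions = [0.02, 0.05, 0.01]            # standard uncertainties
u_combined = sqrt(sum(u * u for u in contributions))
k = 2.0                                       # coverage factor, ~95% confidence
U_expanded = k * u_combined

print(f"Result: 100.00 units +/- {U_expanded:.2f} (k=2, ~95% confidence)")
```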


References

Crouch, Stanley & Skoog, Douglas A. (2007). Principles of Instrumental Analysis. Pacific Grove: Brooks Cole. ISBN 0-495-01201-7.

  1. JCGM 200:2008 International vocabulary of metrology — Basic and general concepts and associated terms (VIM)
  2. http://dictionary.reference.com/browse/calibrate
  3. Lua error in package.lua at line 80: module 'strict' not found.
  4. Lua error in package.lua at line 80: module 'strict' not found.
  5. Lua error in package.lua at line 80: module 'strict' not found.
  6. Lua error in package.lua at line 80: module 'strict' not found.
  7. Lua error in package.lua at line 80: module 'strict' not found.
  8. Lua error in package.lua at line 80: module 'strict' not found.
  9. Lua error in package.lua at line 80: module 'strict' not found.
  10. Lua error in package.lua at line 80: module 'strict' not found.
  11. Lua error in package.lua at line 80: module 'strict' not found.
  12. Lua error in package.lua at line 80: module 'strict' not found.
  13. Lua error in package.lua at line 80: module 'strict' not found.
  14. Lua error in package.lua at line 80: module 'strict' not found.
  15. Lua error in package.lua at line 80: module 'strict' not found.
  16. ISO 17025: "General requirements for the competence of testing and calibration laboratories" (2005), section 5.
  17. Lua error in package.lua at line 80: module 'strict' not found.
  18. ISO 9001: "Quality management systems — Requirements" (2008), section 7.6.
  19. Lua error in package.lua at line 80: module 'strict' not found.
  20. Lua error in package.lua at line 80: module 'strict' not found.
  21. Lua error in package.lua at line 80: module 'strict' not found.

