Temperature is the most commonly measured parameter in commercial and industrial settings. Industries as diverse as food processing, pharmaceuticals, cold storage, and paper manufacturing rely on process temperatures staying within a specified range. Maintaining that accuracy requires regular calibration of temperature monitoring equipment.
Measuring temperature involves a wide range of specialty sensors such as thermocouples, thermistors, resistive temperature detectors (RTDs), infrared (IR) sensors, bimetal thermometers, and
others. These sensors produce an electrical output, such as a resistance, millivolt, or milliamp signal, that corresponds to the temperature. These output signals are sent to a readout or controller
where they are displayed or used to control a process function.
When discussing temperature calibration, it's important to note that the output of the temperature sensors themselves cannot be adjusted. Instead, the controller or readout is adjusted to account
for the inaccuracy of the sensor.
Temperature calibrations are done in accordance with the International Temperature Scale of 1990 (ITS-90). ITS-90 is the legal temperature scale; it establishes a number of fixed-point temperatures
that can be used as reference values. Its purpose is to define procedures for calibrating temperature equipment in such a way that the values obtained are precise and reproducible, while
at the same time approximating the corresponding thermodynamic values as closely as possible.
Calibration is a comparison between two devices. The first device is the unit to be calibrated, often called the unit under test. The second device is the standard, which has a known accuracy. Using
the standard as a guide, the unit under test is adjusted until both units display the same results while exposed to the same temperature. Typically, an instrument's calibration is checked at several
points throughout its range.
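The multi-point comparison described above can be sketched in a few lines of Python. The setpoints, readings, and ±0.5°C tolerance below are hypothetical, purely to illustrate what such a check produces:

```python
# Hypothetical sketch of a multi-point calibration check: compare the unit
# under test (UUT) against reference setpoints and flag out-of-tolerance
# points. All numbers are illustrative, not from a real procedure.

def calibration_errors(setpoints_c, uut_readings_c, tolerance_c):
    """Compare UUT readings to reference setpoints at each check point."""
    results = []
    for ref, uut in zip(setpoints_c, uut_readings_c):
        error = uut - ref
        results.append({"ref": ref, "uut": uut,
                        "error": round(error, 3),
                        "in_tolerance": abs(error) <= tolerance_c})
    return results

# A five-point check across the instrument's range with a ±0.5 °C tolerance.
report = calibration_errors(
    setpoints_c=[0.0, 50.0, 100.0, 150.0, 200.0],
    uut_readings_c=[0.2, 50.1, 100.4, 150.7, 200.3],
    tolerance_c=0.5)
for row in report:
    print(row)
```

A technician would adjust the unit under test and repeat the check until every point is within tolerance.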
Not all standards are created equal. While all standards have a known accuracy, some, known as primary standards, represent the highest level of accuracy for a specific parameter. Primary
standards achieve their high accuracy by relying upon measurement technologies using fundamental physical constants that do not drift such as the triple point of water. These fixed values minimize
uncertainty, making primary standards the most accurate calibration tools.
The hierarchy of temperature calibration standards from lowest to highest:
- Field standards, also known as industrial standards, typically have accuracies ranging from 5°C to 0.5°C. These are useful for spot checking sensors at
the point of use rather than in a laboratory environment.
- Secondary standards, also known as laboratory standards, can provide calibration accuracies from 0.5°C to 0.02°C. They can be used to calibrate field standards.
- Primary standards can be as accurate as 0.001°C.
To improve the quality of a calibration to levels acceptable to outside organizations, it is generally desirable for the calibration and subsequent measurements to be traceable to internationally
recognized standards. Establishing traceability is accomplished by a formal comparison to a standard that is directly or indirectly related to national standards (such as NIST in the USA),
international standards, or certified reference materials.
Types of Temperature Calibrators
Temperature calibrators are designed according to the needs of the technicians using them. Each application has certain demands regarding sensor type, location, budget, need for
accuracy/stability/uniformity, and temperature range. As a result, there are differences, some pronounced and others subtle, between types of temperature calibrators.
There are three specifications of extreme importance when selecting a temperature calibrator. Understanding these specifications and their implications will go a long way toward helping
you select the best calibrator for your needs.
Accuracy: An expression of how closely a measured value agrees with the true or expected value of the quantity of interest (NCSL glossary). For temperature calibrators, accuracy
is the relationship between the instrument's display temperature and the actual temperature of the calibration well. Accuracy is improved by regular calibrations against a traceable standard.
Stability: The tendency of an attribute to remain within tolerance (NCSL glossary). When a calibrator reaches its set-point, there is some fluctuation in temperature as the unit
tries to maintain that temperature. That fluctuation can influence calibrations. Stability adds to accuracy in determining overall system uncertainty.
Uniformity: Temperature homogeneity of the heat source throughout the test zone. All calibrators have slight temperature differences from the bottom of the test well to
the top as well as from the middle of the test well outward. A few simple strategies allow users to place probes in such a way as to minimize uniformity errors.
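Because accuracy, stability, and uniformity all contribute to the overall system uncertainty, a common simplified way to see how they interact is a root-sum-of-squares combination, assuming the contributions are independent. The spec values below are made up for illustration:

```python
import math

# Illustrative sketch: combine a calibrator's accuracy, stability, and
# uniformity specifications into one system uncertainty figure using a
# root-sum-of-squares, assuming independent contributions. The numbers
# are invented for the example; real uncertainty budgets include more terms.

def combined_uncertainty(accuracy_c, stability_c, uniformity_c):
    """Root-sum-of-squares combination of three uncertainty contributions."""
    return math.sqrt(accuracy_c**2 + stability_c**2 + uniformity_c**2)

u = combined_uncertainty(accuracy_c=0.2, stability_c=0.05, uniformity_c=0.05)
print(f"combined system uncertainty ≈ ±{u:.3f} °C")
```

Note that the largest term dominates: improving stability and uniformity helps little until accuracy itself improves.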
Dry Block Calibrators
Dry blocks are versatile temperature calibrators that work by heating, and in many cases cooling, a metal block to a specific temperature and maintaining that temperature. Most dry blocks utilize
one or more interchangeable inserts into which holes are drilled. These holes accommodate a range of temperature sensors to be calibrated. The size of the holes corresponds to the diameter of the
temperature sensors under test.
Dry block calibrators can be designed in portable or benchtop configurations. Though specifications can vary considerably between models, dry blocks typically offer an accuracy better than ±0.5°C and
ranges from about –25 to 650°C. Hole-to-hole temperature uniformity is typically ±0.05°C.
Dry block calibrators provide a solid combination of accuracy, portability, stability, and price. They excel in performing field- or industrial-level calibrations on nearly any type of temperature
sensor, including RTDs, thermocouples, thermistors, PRTs, and bi-metal thermometers.
Good heat transfer between insert and sensor is critical for accurate calibrations when using a dry block temperature calibrator. This transfer depends on a very close fit between the sensor and the
insert. Ideally, there should be no more than a couple of thousandths of an inch of clearance between the two. Selecting the proper insert to match your sensor is critical.
Liquid Baths
Liquid baths are a temperature measurement and calibration tool in which a liquid, or in some cases a material that acts as a liquid, is heated or cooled to a specific temperature and maintained.
In many ways, liquid baths are similar to dry block calibrators except that they use a liquid as the calibration medium rather than a metallic insert, which permits easier calibration of oddly
shaped or sized probes. Since liquid baths do not rely on drilled inserts, they are also capable of calibrating many more probes at a time. Because the liquid is stirred, these baths do not
suffer from the vertical temperature differences experienced in dry blocks and consequently usually provide much better overall uncertainty.
Depending upon the model, liquid baths using an oil medium can achieve temperatures from about –30°C to 200°C. Specialized baths using sand, salt, or aluminum oxide particles instead of oil can reach
temperatures up to 700°C. These "fluidized" baths act as high temperature liquid baths.
Liquid baths offer greater precision and larger calibration volumes than dry block calibrators. They also offer excellent stability over the entire temperature range. Accuracy with liquid baths can
be ±0.2°C or better, exceeding what most dry blocks can achieve. Liquid baths offer secondary-level calibrations on nearly any type of temperature sensor, including RTDs, thermocouples,
thermistors, PRTs, and bi-metal thermometers.
Blackbody Sources
Blackbody sources are used for calibrating infrared thermometers. Generally consisting of a target plate that can be heated or cooled to very specific temperatures, blackbody sources take their name
from their very high emissivity.
Emissivity is the relative power of a surface to emit heat by radiation. The higher the emissivity, the more radiation a surface emits and the less ambient radiation it reflects. High-emissivity
"blackbodies" reflect very little stray radiation and are, therefore, not prone to errors from reflected radiation interfering with the infrared sensor. For best results, the emissivity setting of
the infrared thermometer under test should match the emissivity of the blackbody source.
Once emissivity issues are accounted for, calibrating infrared thermometers using a blackbody source is straightforward. The infrared thermometer takes a measurement of the target plate. The
temperature of the plate is compared with the reading. The infrared thermometer is then adjusted until the results match.
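To see why the emissivity setting matters, a simplified total-radiation (Stefan-Boltzmann) model shows how a mismatched setting skews the indicated temperature. This is an illustrative sketch, not how any particular instrument computes temperature, and the emissivities and temperatures below are assumptions:

```python
# Simplified radiometric model: the signal an IR thermometer receives is
# emitted radiation plus reflected ambient radiation. If the instrument's
# emissivity setting does not match the target, the indicated temperature
# is wrong. Total-radiation (Stefan-Boltzmann) approximation; illustrative only.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def signal(t_surface_k, emissivity, t_ambient_k):
    """Radiance-proportional signal: emitted plus reflected ambient radiation."""
    return (emissivity * SIGMA * t_surface_k**4
            + (1 - emissivity) * SIGMA * t_ambient_k**4)

def indicated_temp_k(sig, emissivity_setting, t_ambient_k):
    """Temperature the instrument reports for a given signal and setting."""
    t4 = ((sig - (1 - emissivity_setting) * SIGMA * t_ambient_k**4)
          / (emissivity_setting * SIGMA))
    return t4 ** 0.25

true_t = 373.15  # a 100 °C target, 20 °C surroundings
s = signal(true_t, emissivity=0.95, t_ambient_k=293.15)
print(indicated_temp_k(s, emissivity_setting=0.95, t_ambient_k=293.15) - 273.15)  # matched setting
print(indicated_temp_k(s, emissivity_setting=0.80, t_ambient_k=293.15) - 273.15)  # mismatched setting
```

With the matched setting the model recovers the true 100°C; with the setting too low, the instrument over-reports by several degrees.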
Depending upon the model, blackbody sources have temperature ranges from about –30°C to 500°C with an accuracy of ±0.5°C, making them excellent tools for field- or industrial-level calibrations.
Some models incorporate fixed-point cells making them suitable as primary standards.
Multifunction Calibrators
Multifunction calibrators are the do-everything calibration instrument. Capable of accepting input from a wide range of sensors, many have functions related to temperature calibration. Multifunction
calibrators are not temperature calibrators in the truest sense as they do not provide a temperature reference as a point of comparison. What they do is simulate and source thermocouples, RTDs, and
other temperature sensors. Using sophisticated electronics, multifunction calibrators can compare the temperature measured by the sensor with the voltage or milliamp signal produced. This gives a
good indication of the sensor's accuracy and allows technicians to easily field test sensors.
Multifunction calibrators can also be used to check the accuracy of temperature controllers, often a crucial component in temperature calibrators (as well as a very wide range of other applications).
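As a sketch of the kind of controller check involved, here is how a linear 4-20 mA transmitter signal maps back to the temperature it represents. The 0-200°C range is an assumption chosen purely for illustration:

```python
# Illustrative sketch of checking a temperature transmitter/controller loop:
# map a 4-20 mA signal back to temperature, assuming a linear transmitter
# ranged 0-200 °C. The range is hypothetical.

def temp_from_ma(ma, t_low_c=0.0, t_high_c=200.0):
    """Convert a 4-20 mA loop signal to the temperature it represents."""
    return t_low_c + (ma - 4.0) / 16.0 * (t_high_c - t_low_c)

print(temp_from_ma(4.0))    # 0.0 °C (bottom of range)
print(temp_from_ma(12.0))   # 100.0 °C (midscale)
print(temp_from_ma(20.0))   # 200.0 °C (full scale)
```

A multifunction calibrator sources a precise milliamp signal and the technician verifies that the controller displays the corresponding temperature.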
Thermocouple / RTD Calibrators
Thermocouple / RTD calibrators are much like multifunction calibrators without the multifunction part. Generally single-purpose instruments, thermocouple / RTD calibrators test the accuracy of temperature
sensors through sophisticated electronic circuitry rather than comparison to a reference. In this way they are not true calibrators, though they provide technicians a simple, cost-effective
way to field test thermocouples and RTDs by comparing the temperature measured by the sensor with the voltage or milliamp signal produced.
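The mV-to-temperature conversion at the heart of such a test can be sketched with a linear type-K approximation (roughly 0.041 mV/°C). Real instruments use the standard NIST reference functions; this simplified linear model is only illustrative:

```python
# Hypothetical sketch of what a thermocouple calibrator does electronically:
# convert between temperature and millivolts. Real instruments use the NIST
# ITS-90 reference polynomials; a linear type-K approximation keeps this short.

SEEBECK_MV_PER_C = 0.041  # approximate type-K sensitivity, mV/°C

def simulate_mv(temp_c):
    """Source the millivolt signal a type-K thermocouple would produce (approx.)."""
    return temp_c * SEEBECK_MV_PER_C

def measure_temp_c(mv):
    """Read a millivolt signal back as a temperature (approx.)."""
    return mv / SEEBECK_MV_PER_C

mv = simulate_mv(100.0)  # signal injected into the readout under test
print(f"sourcing {mv:.3f} mV ≈ {measure_temp_c(mv):.1f} °C")
```

In simulate mode the calibrator injects this signal into the readout under test; in measure mode it reads the sensor's output and displays the equivalent temperature.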
Fixed Point Cells
Fixed point cells are primary standards and offer the greatest accuracy and stability of any temperature calibration method. Fixed cells work by heating or cooling a highly pure substance to the
temperature at which a phase change occurs. Phase changes are the transitions between solid, liquid and gaseous states of matter. These transitions occur at very specific temperatures and, at the
point of phase change, temperatures become very stable, often plateauing for several hours or even days providing a highly accurate and stable reference temperature.
Fixed point cells are usually incorporated into dry blocks or liquid baths to provide the necessary heating/cooling and temperature control, and they can achieve extremely low uncertainties.
The most accurate of fixed point cells are triple point cells. Triple point cells are based on the principle that certain substances in a highly pure (99.9999%) state have a
triple point, or a temperature at which all three phases (gas, liquid, and solid) of that substance coexist in thermodynamic equilibrium. The most common of these is water which has a triple
point of 0.01°C. Triple point cells have an uncertainty better than ±0.0001°C, so accurate that ITS-90 uses triple point cells of hydrogen, neon, oxygen, argon, mercury, and water for delineating
six of its defined temperature points.
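The six triple points just mentioned can be tabulated directly from the published ITS-90 values (in kelvin), with water's triple point landing at the familiar 0.01°C:

```python
# The six ITS-90 defining triple points named above, in kelvin, as published
# in the ITS-90 scale definition.
ITS90_TRIPLE_POINTS_K = {
    "hydrogen": 13.8033,   # equilibrium hydrogen
    "neon":     24.5561,
    "oxygen":   54.3584,
    "argon":    83.8058,
    "mercury": 234.3156,
    "water":   273.16,
}

def kelvin_to_celsius(t_k):
    """Convert kelvin to degrees Celsius."""
    return t_k - 273.15

for substance, t_k in ITS90_TRIPLE_POINTS_K.items():
    print(f"{substance:8s} triple point: {t_k:9.4f} K = {kelvin_to_celsius(t_k):9.4f} °C")
```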
Thermocouple Reference Equipment
Thermocouple reference equipment provides high accuracy monitoring by eliminating the need for cold junction compensation in thermocouples, a major cause of measurement errors. Thermocouples are
temperature-measuring devices formed by the junction of two dissimilar metals. A thermocouple produces a voltage proportional to the difference in temperature between the hot junction and the
lead wire (cold) junction. Since the measurement is of the temperature difference between the two junctions, it is a relative reading. To make a more useful absolute reading, a reference sensor
is placed next to the cold junction to measure its temperature. This is referred to as cold junction compensation.
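The arithmetic of cold junction compensation can be sketched with a linear thermocouple model. The type-K sensitivity used here is an approximation, and real readouts use the standard reference functions; this is only an illustration of the principle:

```python
# Illustrative sketch of cold junction compensation using a linear
# thermocouple model (signal proportional to the junction temperature
# difference). The sensitivity is an approximate type-K value.

SEEBECK_MV_PER_C = 0.041  # approx. type-K sensitivity, mV/°C

def thermocouple_mv(t_hot_c, t_cold_c):
    """A thermocouple reads the *difference* between its two junctions."""
    return (t_hot_c - t_cold_c) * SEEBECK_MV_PER_C

def compensated_temp_c(mv, t_cold_junction_c):
    """Cold junction compensation: add back the measured cold-junction temperature."""
    return mv / SEEBECK_MV_PER_C + t_cold_junction_c

mv = thermocouple_mv(t_hot_c=150.0, t_cold_c=23.0)    # terminals sitting at 23 °C
print(compensated_temp_c(mv, t_cold_junction_c=23.0))  # recovers the 150 °C hot-junction reading
```

Any error in measuring the cold-junction temperature passes straight through to the result, which is exactly the error source that reference equipment eliminates.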
Though cold junction compensation makes thermocouple readings more useful, it can distort readings by 1°C or more. In thermocouple reference equipment, a controlled reference temperature,
usually 0°C, replaces the cold junction compensation. The accuracy when using reference equipment can be as high as ±0.05°C, much better than standard thermocouple setups, and there is no
compensation error to account for.
Thermocouple reference equipment is integrated into existing monitoring systems and, depending upon the model, can monitor dozens (or more) of thermocouples.
Things to consider when selecting a temperature calibrator:
- What type of electrical signal does it output?
- Do I need to calibrate sensors or just temperature displays?
- Which is more important: the speed of changing temperatures, or calibrating multiple probes at a time?
- How deep does the dry block or liquid bath need to be?
- What are the physical characteristics (size, shape) of the sensor?
- Over what temperature range is it used?
- What accuracy is required over that temperature range?
- Must the calibrations occur in a laboratory or can they be done in the field or even in-situ?
If you have any questions regarding temperature calibrators please don't hesitate to speak with one of our engineers by e-mailing us at email@example.com or calling 1-800-884-4967.