Force Calibration

Force calibration is a necessary process for testing the materials used to manufacture equipment, machines, and other devices. All metals and other materials can expand and contract during use, so the instruments that measure force must themselves be verified.
This article provides a detailed look at force calibration and calibration services.
A calibration service is a service aimed at detecting the inaccuracy and uncertainty of a measuring instrument or piece of equipment. In calibration, the device under test (DUT) is compared to a reference of known value to determine the deviation of the measurement from the true value. Deviation of the actual value from the measured value is called an error. After the error has been detected, the next move is to adjust the DUT to obtain more accurate measurements, but this is a separate process called adjusting or trimming.
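As a minimal illustration of this comparison (using hypothetical readings, not values from any real calibration), the error is simply the deviation of the DUT's reading from the reference's known value:

```python
# Minimal sketch with hypothetical values: the calibration error is the
# deviation of the DUT's reading from the reference's known value.
def calibration_error(dut_reading: float, reference_value: float) -> float:
    """Error = measured value minus reference (true) value."""
    return dut_reading - reference_value

# A pressure gauge (DUT) reads 101.80 kPa against a 101.33 kPa reference:
error = calibration_error(101.80, 101.33)
print(f"error = {error:+.2f} kPa")  # error = +0.47 kPa
```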
Calibration is a documented activity performed in a designated and controlled calibration laboratory by an accredited calibration service provider. Some providers perform the service on site, where the measuring equipment is located, to prevent operational downtime.
The calibration certificate is issued by the provider to the requesting organization; the certificate is specific to the calibrated instrument for traceability and documentation purposes. It reports the information and results of the calibration performed on the DUT. A calibration sticker may be seen on some measuring equipment to readily distinguish a calibrated piece of equipment.
The International System of Units (SI system) is a standardized system of measurement. SI is an abbreviation of the system's French name, "Système International d'unités"; it is commonly known as the metric system. The goal of the SI system is to communicate measurements precisely through a coherent and consistent expression of units describing the magnitudes of physical quantities. The SI system serves as the universal language for measurement and is adopted by organizations, businesses, and research facilities worldwide.
The SI system was established in 1960 by the 11th General Conference on Weights and Measures (CGPM, Conférence Générale des Poids et Mesures). Under the authority of the CGPM is the Bureau International des Poids et Mesures (BIPM), an intergovernmental agency based in Paris, France that ensures the worldwide unification of physical measurements.
The SI system is made up of seven base units, from which the 22 named derived units are formed. The seven base units describe the seven fundamental quantities and originate from the most stable references available, ones that do not deteriorate over time. For example, the distance traveled by light in a vacuum in 1/299,792,458 of a second is designated as one meter; it is an immutable definition of the base unit of length, and all measurements of distance are based on it. A definition is amended only when the latest international agreements and scientific advances lead to a clearer and more widely acceptable definition of the base unit. The derived units are obtained by combining the base units through multiplication and division. The table below presents the seven fundamental quantities:
Name of Physical Quantity | SI Unit |
---|---|
Length (L) | meter (m) |
Mass (M) | kilogram (kg) |
Time (T) | second (s) |
Electric Current (I) | ampere (A) |
Thermodynamic Temperature (Θ) | kelvin (K) |
Amount of Substance (N) | mole (mol) |
Luminous Intensity (J) | candela (cd) |
Prefixes are attached to a base unit to indicate its magnitude, with the scale increasing in powers of ten. This makes the SI system a convenient way of conveying and comparing quantities, because converting from one unit to another is straightforward.
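Because the prefixes map to powers of ten, conversion between prefixed forms of the same unit reduces to one multiplication. A small sketch (using only a subset of the official SI prefixes):

```python
# SI prefixes map to powers of ten, so converting between prefixed
# forms of the same unit is a single multiplication.
# (Only a subset of the official prefixes is listed here.)
PREFIX_EXPONENT = {"k": 3, "": 0, "c": -2, "m": -3, "u": -6, "n": -9}

def convert(value: float, from_prefix: str, to_prefix: str) -> float:
    """Convert a value between two prefixed forms of the same unit."""
    return value * 10.0 ** (PREFIX_EXPONENT[from_prefix] - PREFIX_EXPONENT[to_prefix])

print(convert(2.5, "k", ""))    # 2.5 km -> 2500.0 m
print(convert(750.0, "", "m"))  # 750 m  -> 750000.0 mm
```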
Metrological traceability, or measurement traceability, as defined by the International Vocabulary of Metrology, is the "property of a measurement result whereby the result can be related to a reference through a documented unbroken chain of calibrations, each contributing to the measurement uncertainty." It is one of the most important aspects of measurement and calibration as it authenticates the conformity to the international standards. To visualize the chain of calibrations, take a look at the measurement traceability pyramid:
The calibration chain starts with the working standards or process calibrators of higher accuracy that are used to calibrate the DUTs. These working standards and process calibrators have the highest accuracy in a plant or site. Before they are used on DUTs, they are sent to an accredited calibration laboratory to be calibrated against a standard or calibrator of still higher accuracy. The standards and calibrators of the accredited calibration laboratory are in turn sent to the national metrology institute (NMI) of the governing state. Finally, the NMIs coordinate with international metrological agencies to maintain the realization of the SI units and conformance with international definitions, through comparisons of the calibrations and measurements performed in their home countries. The uncertainties must be declared at each level of calibration.
The SI units are the foundation of all measurement standards and sit at the top of the measurement traceability pyramid. The SI units serve as the measurement standard with perfect accuracy and "true value" for all measurements.
The BIPM and the NMIs of the participating countries help preserve the accuracy of the SI units as the calibration level moves down the lineage, from the primary and secondary standards to the process DUTs at the base of the traceability pyramid. From the NMI-level standards, the "true value" is communicated down the chain. Moving up the pyramid, accuracy gets closer to the true value dictated by the SI system, but the cost of the standards also rises; moving down, the measurement error and level of uncertainty are magnified.
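To illustrate how uncertainty grows down the chain, here is a sketch with made-up uncertainty values for each level, combining independent contributions by root-sum-square:

```python
import math

# Hypothetical standard uncertainties contributed at each level (same unit):
levels = [
    ("NMI primary standard", 0.001),
    ("accredited lab standard", 0.005),
    ("plant working standard", 0.02),
    ("process DUT", 0.08),
]

# Independent contributions combine by root-sum-square (RSS), so the
# cumulative uncertainty grows at every step down the pyramid.
cumulative = 0.0
for name, u in levels:
    cumulative = math.sqrt(cumulative**2 + u**2)
    print(f"{name}: cumulative uncertainty = {cumulative:.4f}")
```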
To summarize, measurements must be related to the next reference of higher accuracy in the hierarchy. A measurement is traceable when there is an unbroken, documented chain of calibrations back to a recognized reference, with the measurement uncertainty stated at each step.
Note: The NMI of the United States of America is the National Institute of Standards and Technology (NIST), which is part of the US Department of Commerce.
There is a phrase: "If you can't measure it, you can't improve it." Measurement is the foundation of quality, safety, efficiency, and overall development. Most industries and sectors rely heavily on the accuracy of measuring instruments to improve the quality of life. The goal of a calibration service is to minimize the measurement error and increase the assurance of making precise measurements.
Factors such as environment, usage frequency, and handling can increase measurement uncertainty and error. Hence, measuring devices must undergo timely calibration. Calibration contributes to the repeatability and reproducibility of the data they produce.
The accuracy of some measuring instruments remains stable over time. Instruments that are very stable and have been in use for many years make the best standards: their uncertainty and stability can be well documented thanks to their long history. Even if their value is not the true value, that documentation makes them valuable.
Calibration is necessary if your measuring instrument significantly affects the accuracy and validity of your testing procedure, which is crucial to the credibility of the data produced by laboratories and testing facilities. This aspect is immediately apparent in medical and legal decisions.
ISO/IEC 17025, the General Requirements for the Competence of Testing and Calibration Laboratories, defines the scope of competencies for facilities seeking accreditation. Measurement uncertainty analysis and measurement traceability are part of the ISO/IEC 17025 scope, and many laboratories are cited for deficiencies in these areas. In addition, compliance with this standard brings many intangible benefits and eases operation for testing facilities that have been granted accreditation.
Another reason to calibrate your measuring instrument is if it is vital in the detection of variations in your process and such variations can have detrimental effects on product quality, health, and safety. Reliable measurements help engineers to minimize the assignable causes of variation by prevention and early detection.
Minimizing variation is critical in meeting the specifications of manufactured products or parts. In some instances, large variations can even put lives at risk. When process parameters and operating conditions are left uncontrolled in a factory or site, they can endanger the people and the environment in the vicinity. This is especially true in the manufacture of aircraft and automobile parts and of medications, where any deviation can adversely affect the health and safety of users.
Lastly, calibration upholds the geographical consistency of measurements and keeps measuring instruments in line with international agreements and standards. Parties in different regions must agree on a measured quantity when presented with it. This is essential in international and domestic trade, where quantities of goods equate to revenues. For example, one cubic meter of gasoline exported by a country must be exactly one cubic meter of gasoline when it is measured in the importing country.
BIPM formally defines a calibrator as a "measurement standard used in calibration". A calibrator may be an instrument of higher accuracy used to calibrate a DUT, a source, or a Certified Reference Material (CRM).
A source is an instrument that produces a known and precise output. The output is measured by the DUT. The setting of the source equipment is considered the exact value.
A Certified Reference Material (CRM) is a form of a standard that is characterized by a metrologically valid procedure. Its exact measurement value is known. Individual CRM samples must be stable and must be authenticated by a certificate. CRMs are typically used in analytical and clinical chemistry.
The general procedure for calibrating any process parameter is to apply a known reference value to the DUT and compare the DUT's reading against the reference. The comparison of the measurements obtained from the DUT and the calibrator has one of two possible outcomes: the DUT reads within its tolerance and passes, or it reads outside its tolerance and fails, requiring adjustment or repair.
The metrics commonly used in evaluating a calibration result are the Test Accuracy Ratio (TAR) and the Test Uncertainty Ratio (TUR). Most calibration laboratories today require a TAR or TUR of 4:1, meaning the reference standard is at least four times more accurate than the DUT.
Test Accuracy Ratio (TAR). TAR is the ratio of the DUT tolerance to the reference standard tolerance. It is a simplified pass-or-fail indicator in calibration practice, but it does not consider the measurement uncertainties associated with the process.
Test Uncertainty Ratio (TUR). TUR is the ratio of the DUT tolerance to the estimated calibration uncertainty. It considers the influences in the calibration process, which can affect the accuracy of the measurements such as environmental factors, process variations, technician errors, and instruments used in the procedure. The estimated calibration uncertainty is typically expressed at a 95% or a 99% confidence level.
The following devices are source calibrators that produce a known and precise output of the parameter to be calibrated. These devices are commonly found in calibration and metrology laboratories that support the calibration personnel in verifying the accuracy of a measuring instrument.
Electrical calibrators are a group of calibration devices that provide or measure electrical responses such as current, voltage, frequency, pulses, and resistance signals. Electrical calibrators include multifunction process calibrators, oscilloscope calibrators, and power calibrators.
Dry block calibrators are used to calibrate temperature measuring devices. They consist of a metal block contained in an insulated vessel that is precisely heated or cooled to a specific temperature. After the temperature has stabilized, it is maintained and readings are obtained by inserting the probes in the vessel of the dry block calibrator.
Calibration baths are also used to calibrate measuring devices. Dry block calibrators and calibration baths work using the same concept, except that the latter consists of an insulated vessel containing a liquid heated or cooled to a specific temperature. Once the temperature has homogenized, the readings are obtained by inserting probes into the vessel.
Calibration baths offer higher temperature stability and precision. They can be used to calibrate DUTs requiring high sensitivity.
Pressure calibrators are devices that measure, apply, and control the pressure to a DUT. The pressure generated is then measured by the DUT. Pressure calibrators include digital pressure controllers and pressure comparators.
A deadweight tester is a unique type of pressure calibrator that utilizes calibrated and traceable weights and a piston and cylinder assembly to apply known pressures to a DUT. The DUT then measures and records the pressure generated.
Humidity calibrators consist of a chamber that is set and maintained at a known and precise relative humidity or dew point. The relative humidity and dew point inside the chamber are then measured by the DUT.
Flow calibrators regulate the flow to a known and precise flow rate for a DUT to measure. They are used in calibrating flow meters and flow controllers used in liquid and gas distribution systems.
Laser interferometry uses a laser and electronic controls to inspect machine parts for straightness, parallelism, and flatness. The process can measure very small diameters and dimensions and is widely used to calibrate machine tables, slides, and axis movements. It is a measurement method based on the interference characteristics of light waves. An interferometer splits one light beam into two; when the beams recombine, they form an interference pattern. Since the wavelengths of light are very short, small differences in the beams' paths can be easily detected. Although the technique has existed for over one hundred years, the introduction of laser interferometers has greatly enhanced the accuracy of the interferometer calibration method.
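In a displacement interferometer, each counted fringe corresponds to the moving mirror traveling half a wavelength, since the reflected beam's path changes by twice the mirror movement. A sketch assuming a helium-neon laser source:

```python
# Sketch: in a displacement interferometer the reflected beam's path
# changes by twice the mirror movement, so each counted fringe
# corresponds to half a wavelength of travel.
HENE_WAVELENGTH_NM = 632.8  # common helium-neon laser source

def displacement_nm(fringes: int, wavelength_nm: float = HENE_WAVELENGTH_NM) -> float:
    return fringes * wavelength_nm / 2.0

# 1000 fringes counted while a machine slide moves:
print(f"{displacement_nm(1000) / 1e6:.4f} mm")  # 0.3164 mm
```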
The type of calibration equipment used varies depending on the type of service being performed. There are many different types of calibration services available; some of which are:
Pressure calibration service aims to calibrate pressure-measuring devices such as pressure switches, pressure transmitters, relief valves, and barometers for gas and liquid systems operating above or below the atmospheric pressures.
Temperature calibration service aims to calibrate temperature-measuring devices such as thermocouples, RTDs, thermistors, PRTs, bi-metal thermometers, thermal cameras, and infrared meters. This calibration service is performed in a controlled environment.
Humidity calibration service is performed to calibrate humidity-measuring devices such as humidity recorders, humidity probes, humidity sensors, psychrometers, and thermohygrographs. Parameters such as relative humidity and dew point are measured during humidity calibration. Like temperature calibration, it is performed in a controlled environment.
Flow calibration service aims to calibrate volumetric and mass flow meters and flow controllers installed in gas and liquid distribution systems. Flow calibration must be done periodically, as it regulates the flow of the fluids through process equipment and pipelines; this has a direct impact on quality and safety.
When calibrating machines that use helium or hydrogen, the leak standard should be traceable to NIST, with calibration ranges of 2 x 10^-10 atm·cc/sec and larger for helium and 1 x 10^-8 atm·cc/sec and larger for other gas leak standards.
Pipette calibration service is performed to calibrate single-channeled pipettes, multiple-channeled pipettes, and electronic pipettes in order to dispense accurate volumes of liquid. Pipettes are widely used in analytical laboratories.
Pipette calibration is performed by weighing a liquid carefully dispensed by the pipette at a known temperature. The volume dispensed by the pipette is calculated by dividing the weight of the dispensed liquid by its density, and the result is compared to the nominal volume the pipette was set to dispense.
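A sketch of this gravimetric calculation, with hypothetical figures and an assumed density of water near 20 °C:

```python
# Sketch: gravimetric pipette check, volume = mass / density.
WATER_DENSITY_G_PER_ML = 0.9982  # approx. density of water near 20 C (assumed)

def dispensed_volume_ul(mass_g: float, density_g_per_ml: float = WATER_DENSITY_G_PER_ML) -> float:
    return mass_g / density_g_per_ml * 1000.0  # mL -> uL

# A 100 uL pipette dispenses 0.0996 g of water:
volume = dispensed_volume_ul(0.0996)
print(f"dispensed {volume:.2f} uL, deviation {volume - 100:+.2f} uL")
```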
Electrical calibration service is performed to calibrate instruments that measure electrical parameters such as voltage, resistance, current, inductance, and capacitance. Instruments covered by this calibration service include oscilloscopes, multimeters, data loggers, and clamp meters.
Dimensional calibration service is performed to calibrate devices that measure dimensional properties such as length, volume, flatness, and angle. Instruments covered by this calibration service include micrometers, calipers, and height gauges.
Force calibration service is performed to calibrate measuring devices which measure parameters related to force such as weight, torque, and tensile and compressive forces. It involves the comparison of the measurement of applied forces to the DUT to a calibration standard. During measurement, adapters are used to ensure that the applied force is centered on the DUT to avoid assignable measurement errors.
Traceable deadweights are used as standards in force calibration, which is performed in a controlled environment. Instruments calibrated under force calibration include tensiometers, load cells, scales and balances, force gauges, compression and tensile testers, force dynamometers, hardness testers, and proving rings.
A calibration certificate is issued by an accredited service provider after the calibration has been successfully performed under their supervision. It summarizes the details, procedure, and results of the calibration performed, including the identification of the DUT, the standards used and their traceability, the results obtained, and the associated measurement uncertainty.
A calibration sticker is attached to the equipment to verify the calibration validity and status. It usually indicates the equipment serial number and the date the next calibration is due. However, it does not equate to the legitimacy and traceability of a calibration certificate. It is only used as a visual reference to inspect the calibration of the equipment.
Unaccredited calibration uses the owner's or in-house methods to calibrate instruments. It is also known as commercial calibration, standard calibration, quality assurance calibration, or NIST-traceable calibration and is performed under the standards of a calibration laboratory. A calibration certificate is issued that describes the basic equipment used for the calibration and the traceability of the standards. However, it is not a calibration sanctioned by an official accrediting body and does not include the appropriate legal documentation.
Although the use of an unaccredited calibration service may be less expensive or more convenient, unaccredited calibration services do not have to adhere to calibration regulations, are not audited, and do not meet the same high level of standards as an accredited calibration service. Their potentially inaccurate measurements, inadequate quality assurance, and uncertain results can lead to process errors, fines, and product recalls.
The important values to look for in a calibration certificate are the following:
Calibration correction is the difference between the reference standard's exact value and the measurement obtained by the DUT during calibration. It is reported on the calibration certificate and is added to future readings from the DUT, bringing them closer to the true value and thereby improving accuracy.
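A minimal sketch (hypothetical numbers) of how a certificate correction is applied to a reading:

```python
# Sketch: the certificate's correction (reference value minus the DUT's
# reading during calibration) is added to future readings.
reference_value = 100.000  # value applied by the standard during calibration
dut_reading = 100.250      # what the DUT indicated

correction = reference_value - dut_reading  # -0.25, reported on certificate
future_reading = 100.250
corrected = future_reading + correction
print(corrected)  # 100.0
```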
Expanded uncertainty is the interval, stated in the calibration report, within which the true value can be confidently asserted to lie. It is determined statistically and accounts for all uncertainty sources. The lower the expanded uncertainty of a DUT, the higher the precision of its measurements.
The coverage factor, or the K-factor, indicates the confidence level the expanded uncertainty is derived from. K-factors of 2 or 3 are recommended for most industries.
A K-factor of 2 corresponds to a 95.45% confidence level, meaning that 95.45% of the time the measurements lie within the expanded uncertainty. The same concept applies to other K-factors; a K-factor of 3 corresponds to a 99.73% confidence level. A higher K-factor is applied to DUTs performing critical measurements, where measurement failure can be costly and dangerous.
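A sketch of how an expanded uncertainty might be assembled from a hypothetical uncertainty budget, combining independent contributions by root-sum-square and applying a coverage factor:

```python
import math

# Hypothetical standard uncertainty contributions, all in the same unit:
contributions = [0.012, 0.007, 0.004]  # e.g. reference, environment, repeatability

# Independent contributions combine by root-sum-square; the expanded
# uncertainty U multiplies the combined value by the coverage factor k.
u_combined = math.sqrt(sum(u**2 for u in contributions))
k = 2  # ~95.45% confidence level
U = k * u_combined
print(f"U = +/-{U:.3f} (k={k})")
```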
A key factor in the acceptance of calibrations is measurement decision risk, expressed as false accept risk or false reject risk; these are the metrics used to determine the quality of a calibration.
False accept risk has two definitions, unconditional and conditional. Unconditional false accept risk is the probability that an equipment parameter is out of tolerance but is accepted as being in tolerance. Conditional false accept risk is the probability that the equipment is out of tolerance given that it has been accepted. A high false accept risk carries the possibility of severe negative outcomes in the performance of the item.
False reject refers to in-tolerance readings being rejected, which can lead to increased costs from unnecessary adjustments, repairs, re-calibrations, and shortened calibration intervals.