References and Standards - IS & ISO Standards

FORCE:
1. IS 4169:1988 - Method for calibration of force-proving instruments used for the verification of uniaxial testing machines.
2. ASTM E74-06 - Standard Practice of Calibration of Force-Measuring Instruments for Verifying the Force Indication of Testing Machines.
3. ISO 376:2011(E) - Metallic materials - Calibration of force-proving instruments used for the verification of uniaxial testing machines.

Torque Transducer:
1. BS 7882:2008 - Method for calibration and classification of torque measuring devices.
2. EURAMET cg-14 - Static torque measuring devices.

Relationship between the mass and conventional mass of a weight in Calibration

The conventional mass value of a body is equal to the mass m_c of a standard that balances this body under conventionally chosen conditions. The unit of the quantity "conventional mass" is the kilogram.

The conventionally chosen conditions are:
t_ref = 20 °C (reference temperature)
ρ_0 = 1.2 kg m⁻³ (reference air density)
ρ_c = 8 000 kg m⁻³ (reference density of the mass standard)

The conventional mass has the same unit as mass, because its values are defined through the multiplication of a mass by a dimensionless quantity. 
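The relation between mass and conventional mass can be sketched numerically. The formula below is the standard air-buoyancy relation m_c = m · (1 − ρ_0/ρ) / (1 − ρ_0/ρ_c) using the conventionally chosen values above; the steel density of 7 850 kg m⁻³ in the example is an illustrative assumption.

```python
# Sketch of the conventional-mass relation; the density rho = 7 850 kg/m^3
# used in the example is an illustrative assumption, not a prescribed value.

RHO_0 = 1.2       # conventional air density, kg/m^3
RHO_C = 8000.0    # conventional density of the reference standard, kg/m^3

def conventional_mass(mass_kg: float, density_kg_m3: float) -> float:
    """m_c = m * (1 - rho_0/rho) / (1 - rho_0/rho_c)."""
    return mass_kg * (1 - RHO_0 / density_kg_m3) / (1 - RHO_0 / RHO_C)

# A 1 kg steel weight (assumed density 7 850 kg/m^3) differs from its
# conventional mass only by a tiny air-buoyancy term.
mc = conventional_mass(1.0, 7850.0)
print(f"conventional mass: {mc:.8f} kg")
```

For a weight whose density equals ρ_c exactly, the correction vanishes and conventional mass equals mass, which is the point of choosing 8 000 kg m⁻³ as the reference.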

Difference between % of reading and % FSD (full-scale deflection) and its importance

Uncertainty of the equipment:
There are two ways of stating measurement error and uncertainty over the entire range of a measuring instrument.
1. Percent of full scale deflection or FSD
2. Percent of reading or indicated value

The difference between the two concepts becomes highly significant when an instrument is operating near the bottom of its turndown range. The following example shows the difference between the two.

Assume you have a 100 Nm torque tester (maximum capacity) with a stated uncertainty of ±0.5 % FSD.
Case 1: ±0.5 % FSD gives an uncertainty of 0.5 Nm across the entire range. At full scale (100 Nm) this represents the "best case", 0.5 % of reading. However, at lower readings this fixed 0.5 Nm becomes far more significant: at 10 Nm it amounts to 5 % of the reading, whereas a ±0.5 % of reading specification would give only 0.05 Nm there.
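The example above can be tabulated for a few test points. The 100 Nm capacity and 0.5 % figure come from the example; the chosen test points are illustrative.

```python
# Compare %FSD and %-of-reading uncertainty for a 100 N·m torque tester
# with a stated ±0.5 % specification. Test points are illustrative.

FULL_SCALE = 100.0   # N·m
PERCENT = 0.5        # stated uncertainty, %

for reading in (100.0, 50.0, 10.0):
    u_fsd = FULL_SCALE * PERCENT / 100       # constant over the whole range
    u_reading = reading * PERCENT / 100      # shrinks with the reading
    print(f"at {reading:5.1f} N·m: ±{u_fsd:.2f} N·m (FSD) "
          f"= {100 * u_fsd / reading:.1f} % of reading; "
          f"±{u_reading:.2f} N·m (% of reading)")
```

The fixed-FSD figure is identical at every point, so its relative weight grows tenfold between 100 Nm and 10 Nm, which is exactly why the distinction matters at the bottom of the turndown range.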

Difference between a manufacturer's traceability certificate and an accredited calibration certificate

An accredited calibration certificate is issued by a laboratory accredited to ISO/IEC 17025 by a body such as NABL or DKD. In such laboratories calibration is performed according to relevant standards and procedures, which are accepted and approved during audits; these procedures and standards are the basis for comparing laboratories' capabilities. An accredited calibration certificate therefore has more credibility, since the calibration is performed to recognised standards verified by the accreditation body, and it gives the user much more information, such as the uncertainty budget, the classification as per the relevant standard, and the measurement values at various points across the range.

Difference between Calibration, testing and validation

A calibration is a process that compares a known (the standard) against an unknown (the customer's device). During calibration, the offset between the two devices is quantified and the customer's device is adjusted back into tolerance (if possible). A true calibration usually records both "as found" and "as left" data.
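The as-found / as-left idea can be sketched as a simple comparison against the standard. All numeric values below (applied torque, tolerance, device readings) are illustrative assumptions, not figures from the text.

```python
# Minimal sketch of "as found / as left" calibration data: compare the
# device under test against a standard, record the offset, adjust,
# then re-check. All values are illustrative.

def error(reading: float, standard: float) -> float:
    """Offset of the device under test relative to the standard."""
    return reading - standard

STANDARD = 100.0   # N·m applied by the reference standard
TOLERANCE = 0.5    # N·m acceptance limit (assumed)

as_found = 100.8   # device reads high before adjustment
print("as found error:", error(as_found, STANDARD),
      "-> in tolerance:", abs(error(as_found, STANDARD)) <= TOLERANCE)

as_left = 100.1    # after adjustment
print("as left error: ", error(as_left, STANDARD),
      "-> in tolerance:", abs(error(as_left, STANDARD)) <= TOLERANCE)
```

Recording both values lets the user judge what errors earlier measurements may have carried, not just the device's state after adjustment.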

Measurement uncertainty - parameter, associated with the result of a measurement, that characterizes the dispersion of the values that could reasonably be attributed to the measurand (International Vocabulary of Basic and General Terms in Metrology).
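In the simplest case, the dispersion in this definition can be estimated from repeated readings as the experimental standard deviation of the mean (a Type A evaluation). The readings below are illustrative.

```python
# Simplest (Type A) estimate of the dispersion in the VIM definition:
# the standard deviation of the mean of repeated readings.
# The readings are illustrative values.
import statistics

readings = [10.02, 10.01, 10.03, 10.00, 10.02]  # N·m, repeated observations

mean = statistics.mean(readings)
s = statistics.stdev(readings)        # sample standard deviation
u = s / len(readings) ** 0.5          # standard uncertainty of the mean
print(f"mean = {mean:.3f} N·m, standard uncertainty u = {u:.4f} N·m")
```

A full uncertainty budget would combine this with Type B contributions (resolution, reference standard, environment), but the spread of repeated readings is the usual starting point.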

Metrology, Traceability & Calibration

Metrology is the science of measurement and its application. It represents the basis for trust in the results obtained. The national metrology system of a country is the infrastructure that enables measurements to be performed and applied for purposes that reflect the economic and social core of the nation.

Purpose of Metrology – legal metrology, industrial metrology and scientific metrology.
These are fields that have been internationally accepted to cover all the technical and practical aspects of measurements.

Standards

In order that investigators in different parts of the country and different parts of the world may compare the results of their experiments on a consistent basis, it is necessary to establish certain standard units of length, mass, time, temperature, pressure, etc.

A Dimension defines a physical variable that is used to describe some aspect of a physical system. The fundamental value associated with any dimension is given by a Unit.

A Unit defines a measure of a dimension.

Fundamental Dimensions: Length, Mass, Time, Temperature, Electric Current, Amount of Substance, Luminous Intensity.

Derived Dimensions: Acceleration, Area, Density, Velocity and Force.

Error and Uncertainty

The precision to which we can measure something is limited by experimental factors, leading to uncertainty. The uncertainty depends on various factors, including the environment, the process, the operator and the machine. The better these factors are controlled, the smaller the uncertainty.

The deviation of a measurement from the "correct" value is termed the error, so error is a measure of how inaccurate our results are.
There are two general types of errors.

1. Systematic Errors - An error that is constant from one measurement to another; for example, an incorrectly marked ruler will make the same mistake every time, measuring something as either bigger or smaller than it actually is. These errors can be quite difficult to eliminate!

2. Random Errors - Errors that vary unpredictably from one measurement to the next, for example due to reading fluctuations or environmental noise; their effect can be reduced by averaging repeated measurements.
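The defining property of a systematic error, that it is constant from one measurement to another, can be shown in a few lines. The true value and the offset below are illustrative.

```python
# Illustrative sketch: a constant (systematic) offset biases every
# reading by the same amount, so averaging repeated readings does not
# remove it. Values are illustrative.
true_value = 50.0
offset = 0.3   # e.g. a mis-marked scale that always reads high

readings = [true_value + offset for _ in range(5)]
mean = sum(readings) / len(readings)
print("mean error after averaging:", mean - true_value)  # stays at the offset
```

This is why systematic errors must be found and corrected (or calibrated out), whereas random scatter shrinks on its own as more readings are averaged.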
