While there are several methods of measuring strain, the most common is with a strain gauge. A strain gauge’s electrical resistance varies in proportion to the amount of strain placed on it. The most widely used gauge is the bonded metallic strain gauge.
The metallic strain gauge consists of a very fine wire or, more commonly, metallic foil arranged in a grid pattern. The grid pattern maximizes the amount of metallic wire or foil subject to strain in the parallel direction (shown as the “active grid length” in the Bonded Metallic Strain Gauge figure). The cross-sectional area of the grid is minimized to reduce the effect of shear strain and Poisson strain.
Bonded Metallic Strain Gauge
It is very important that you properly mount the strain gauge onto the test specimen. This ensures the strain accurately transfers from the test specimen through the adhesive and strain gauge backing to the foil.
A fundamental parameter of the strain gauge is its sensitivity to strain, expressed quantitatively as the gauge factor (GF). The gauge factor is the ratio of the fractional change in electrical resistance to the fractional change in length (strain):

GF = (ΔR/R) / (ΔL/L) = (ΔR/R) / ε
The gauge factor for metallic strain gauges is typically around two.
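The gauge-factor relationship above can be rearranged to recover strain from a measured resistance change, ε = (ΔR/R)/GF. A minimal sketch of that calculation (the 120 Ω nominal resistance and the helper name are illustrative assumptions, not values from the text):

```python
def strain_from_resistance(delta_r, r_nominal, gauge_factor=2.0):
    """Strain (dimensionless) from fractional resistance change: eps = (dR/R) / GF.

    gauge_factor defaults to 2, the typical value for metallic gauges.
    """
    return (delta_r / r_nominal) / gauge_factor

# Hypothetical reading: a 120 ohm gauge whose resistance rises by 0.00024 ohm.
eps = strain_from_resistance(0.00024, 120.0)
print(f"{eps * 1e6:.1f} microstrain")  # -> 1.0 microstrain
```

Note how small the resistance change is: measuring microstrain means resolving fractional resistance changes on the order of parts per million, which is why strain gauges are almost always read with a Wheatstone bridge rather than an ohmmeter.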
Ideally, the resistance of the strain gauge would change only in response to applied strain. However, the strain gauge material, as well as the specimen material to which you apply the gauge, also responds to changes in temperature. Strain gauge manufacturers attempt to minimize sensitivity to temperature by processing the gauge material to compensate for the thermal expansion of the specimen material intended for the gauge. While compensated gauges reduce the thermal sensitivity, they do not remove it completely. For example, consider a gauge compensated for aluminum that has a temperature coefficient of 23 ppm/°C. With a nominal resistance of 1000 Ω and GF = 2, the equivalent strain error is still 11.5 µε/°C. Therefore, additional temperature compensation is important.
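The residual thermal error works out directly from the gauge-factor relationship: the temperature-induced fractional resistance change (23 ppm/°C in the aluminum example) divided by GF gives the apparent strain per degree. A small sketch of that arithmetic (function name is my own):

```python
def thermal_apparent_strain(temp_coeff_ppm_per_c, gauge_factor=2.0):
    """Apparent strain per degree C caused by the gauge's residual
    temperature coefficient: eps_apparent = (dR/R per degC) / GF."""
    return (temp_coeff_ppm_per_c * 1e-6) / gauge_factor

# Aluminum-compensated gauge from the text: 23 ppm/degC, GF = 2.
err = thermal_apparent_strain(23.0)
print(f"{err * 1e6:.1f} microstrain per degC")  # -> 11.5 microstrain per degC
```

Note that the nominal resistance (1000 Ω in the example) cancels out of this calculation, since both ΔR and R scale with it; only the ppm coefficient and the gauge factor matter.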