Sensitivity And Sensitivity Analysis


In any set of measurements, the individual measurements are scattered about the mean, and the precision signifies how well the various measurements performed by the same instrument on the same quality characteristic agree with each other. The difference between the mean of a set of readings on the same quality characteristic and the true value is called the error. The smaller the error, the more accurate the instrument.

SENSITIVITY

Sensitivity may be defined as the rate of displacement of the indicating device of an instrument with respect to the measured quantity. In other words, the sensitivity of an instrument is the ratio of the scale spacing to the scale division value. For example, if on a dial indicator the scale spacing is 1.0 mm and the scale division value is 0.01 mm, then the sensitivity is 100. If the instrument reading x is plotted against the measured quantity y over the full range of the instrument, the sensitivity at any value of y is dx/dy, where dx and dy are increments of x and y; that is, the sensitivity is the slope of the response curve at that value of y. The sensitivity may be constant or variable along the scale. In the first case we get linear transmission, and in the second non-linear transmission. Sensitivity refers to the ability of a measuring device to detect small differences in the quantity being measured. Highly sensitive instruments may drift due to thermal or other effects, and their indications may be less repeatable or less precise than those of an instrument of lower sensitivity.
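A minimal sketch in Python of the two readings of sensitivity described above: the constant ratio of scale spacing to scale division value for a linear scale, and the local slope dx/dy for a non-linear transmission. The function names and the sampled-curve representation are illustrative assumptions, not from the original text:

```python
# Sketch of the two notions of sensitivity described above (illustrative only).

def scale_sensitivity(scale_spacing_mm: float, scale_division_value_mm: float) -> float:
    """Sensitivity of a linear scale: scale spacing / scale division value."""
    return scale_spacing_mm / scale_division_value_mm

def local_sensitivity(x: list[float], y: list[float], i: int) -> float:
    """Approximate dx/dy at index i for a non-linear transmission,
    where x is the indicated displacement and y the measured quantity."""
    return (x[i + 1] - x[i]) / (y[i + 1] - y[i])

# Dial-indicator example from the text: spacing 1.0 mm, division 0.01 mm.
print(scale_sensitivity(1.0, 0.01))  # -> 100.0
```

For a linear transmission, `local_sensitivity` returns the same value at every point of the scale; for a non-linear one it varies along the curve, which is exactly the distinction the text draws.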

The distinction between precision and accuracy becomes clear from the following example. Several measurements are made of one component by different types of instruments (A, B and C respectively) and the results are plotted. In any set of measurements, the individual measurements are scattered about the mean, and the precision signifies how well the various measurements performed by the same instrument on the same quality characteristic agree with each other. The difference between the mean of a set of readings on the same quality characteristic and the true value is called the error, and the smaller this error, the more accurate the instrument.
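A hypothetical sketch of this comparison, with invented readings for instruments A, B and C and an assumed true value, showing how scatter about the mean indicates precision while the offset of the mean from the true value indicates error (and hence accuracy):

```python
from statistics import mean, stdev

true_value = 25.00  # assumed true dimension of the component, mm

# Invented readings for illustration only:
readings = {
    "A": [25.01, 24.99, 25.00, 25.02, 24.98],  # precise and accurate
    "B": [25.20, 25.21, 25.19, 25.20, 25.21],  # precise but not accurate
    "C": [24.80, 25.30, 24.95, 25.25, 24.70],  # neither precise nor accurate
}

for name, vals in readings.items():
    m = mean(vals)
    error = m - true_value   # accuracy: offset of the mean from the true value
    spread = stdev(vals)     # precision: scatter of readings about the mean
    print(f"{name}: mean={m:.3f}  error={error:+.3f}  spread={spread:.3f}")
```

Instrument B illustrates why the two concepts must be kept apart: its readings agree closely with each other (high precision) yet their mean is far from the true value (large error, low accuracy).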
