Definition
Calibration is a comparison between a measurement of known magnitude made with one device and another measurement made in as similar a way as possible with a second device.
Analog vs. digital
Digital multimeters usually have built-in processors that handle measurements using stable internal voltage references. Unlike digital multimeters, analog multimeters need to be calibrated periodically to maintain accuracy. Analog multimeters usually have adjustable resistors, called trimmer resistors, that can be used to compensate for different operating conditions.
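On an analog meter the trimmer resistor performs this compensation physically. As a rough software analogue, the sketch below illustrates the arithmetic of a two-point (offset and gain) correction; all reference readings in it are hypothetical and it is not part of any meter's actual firmware.

```python
# Minimal sketch of a two-point (offset and gain) calibration correction.
# All values are hypothetical; an analog meter's trimmer resistor performs
# the equivalent adjustment in hardware.

# Reference points: (raw reading, true value) taken against known sources.
raw_zero, true_zero = 0.02, 0.00   # reading with no input (offset error)
raw_ref,  true_ref  = 1.05, 1.00   # reading against a known 1.00 V source

# Solve true = gain * raw + offset from the two reference points.
gain = (true_ref - true_zero) / (raw_ref - raw_zero)
offset = true_zero - gain * raw_zero

def corrected(raw: float) -> float:
    """Map a raw meter reading to its calibrated value."""
    return gain * raw + offset

print(corrected(0.02))  # ~0.00
print(corrected(1.05))  # ~1.00
```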
Basic calibration process
The calibration process begins with the design of the measuring instrument that needs to be calibrated.
Ammeter calibration
1. Connect the two terminals of the voltmeter across the resistor.
2. Connect the two terminals of the ammeter in series with the resistor. This allows the current flowing through the resistor to be measured.
3. Switch on the voltage supply and set it to 1 V.
4. Calculate the expected value of the current using Ohm's law, V = IR, so the expected current is I = V/R (for the 1 mA target with a 1 V supply, the resistor must be 1 kΩ). Compare this with the measured value shown on the ammeter. If the values differ, adjust the calibration knob on the ammeter until it reads the expected 1 mA.
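To make step 4 concrete, the short sketch below computes the expected current from Ohm's law and compares it with a measured reading. The 1 kΩ resistor and the measured value are assumptions chosen to match the 1 V supply and 1 mA target above.

```python
# Expected-vs-measured check for the ammeter calibration steps above.
# The resistor value and the measured reading are assumed for illustration;
# 1 V across 1 kOhm matches the 1 mA target in step 4.

V_SUPPLY = 1.0      # volts, set in step 3
R_KNOWN = 1000.0    # ohms (assumed value consistent with the 1 mA target)

expected = V_SUPPLY / R_KNOWN       # Ohm's law: I = V / R -> 0.001 A

measured = 0.00097                  # amps, hypothetical ammeter reading

error_pct = (measured - expected) / expected * 100.0
print(f"expected: {expected * 1e3:.3f} mA")
print(f"measured: {measured * 1e3:.3f} mA")
print(f"error:    {error_pct:+.1f} % -> adjust the calibration knob if nonzero")
```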