timeline1968
- Sep 3, 2006
I'm going to pre-apologize for this question, as it's pretty simple:
I have a multimeter, and I'm trying to figure out its accuracy when measuring small resistances (0.5 to 10 ohms). The booklet for the meter states:
Range: 500.00 ohms, Resolution: 0.01 ohms, Accuracy: 0.05% + 10^3
I don't understand the last part of the accuracy spec. The 0.05% I understand (say 20 ohms: 20 × 0.0005 = 0.01 ohm, so I could read 19.99 to 20.01), but what does the 10^3 mean?
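Just to show my work on the part I do understand, here's a quick sketch of that percentage-term arithmetic (the function name and structure are just my own for illustration; it deliberately leaves out the "+10^3" term, since that's what I'm asking about):

```python
def pct_band(reading_ohms: float, pct: float = 0.05) -> tuple[float, float]:
    """Return (low, high) bounds from the percent-of-reading term alone."""
    err = reading_ohms * pct / 100.0  # 0.05% of the reading
    return reading_ohms - err, reading_ohms + err

low, high = pct_band(20.0)
print(f"{low:.2f} to {high:.2f} ohms")  # prints: 19.99 to 20.01 ohms
```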
Thanks for any assistance.