
How accurate does my reference meter need to be to certify 1% product?

(OP)
I have a meter that I need to certify as being ±1.0% accurate (1% of span, which is 0-100).

What level of accuracy is needed for the reference standard in order to claim this?

One group of co-workers says a ±0.5% meter will suffice, another says ±0.25%, and yet another says ±0.1%.

Any thoughts would be appreciated.  Any references to an actual standard would be MOST appreciated!

Good on y'all,

Goober Dave

RE: How accurate does my reference meter need to be to certify 1% product?

'Normal' practice is for the reference instrument to have an uncertainty at least 10x lower than the measurement you're trying to make. I can't find a standard which 'requires' this, but it's been the accepted norm over here for a long time.
  

----------------------------------
  
If we learn from our mistakes I'm getting a great education!
 

RE: How accurate does my reference meter need to be to certify 1% product?

Here is a guideline from ISA for process instrumentation,
http://www.isa.org/InTechTemplate.cfm?Section=Automation_Basics&template=/ContentManagement/ContentDisplay.cfm&ContentID=56927

And this is a copy of a reference article we use in our shop, along with ANSI and NIST references,
http://en.wikipedia.org/wiki/Calibration

A quote from this article:

"The next step is defining the calibration process. The selection of a standard or standards is the most visible part of the calibration process. Ideally, the standard has less than 1/4 of the measurement uncertainty of the device being calibrated. When this goal is met, the accumulated measurement uncertainty of all of the standards involved is considered to be insignificant when the final measurement is also made with the 4:1 ratio. This ratio was probably first formalized in Handbook 52 that accompanied MIL-STD-45662A, an early US Department of Defense metrology program specification. It was 10:1 from its inception in the 1950s until the 1970s, when advancing technology made 10:1 impossible for most electronic measurements."
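To put rough numbers on the ratio rules above, here is a minimal sketch (my own illustration, not from any standard) of the arithmetic: the required reference uncertainty is just the unit-under-test tolerance divided by the desired test accuracy ratio.

```python
# Sketch: check whether a reference standard supports a given test
# accuracy ratio (TAR) for certifying a unit under test (UUT).
# The function names and values are illustrative only.

def required_reference_accuracy(uut_tolerance_pct, ratio=4.0):
    """Reference uncertainty (% of span) needed for a given ratio, e.g. 4:1."""
    return uut_tolerance_pct / ratio

def meets_ratio(uut_tolerance_pct, ref_uncertainty_pct, ratio=4.0):
    """True if the reference supports the desired accuracy ratio."""
    return uut_tolerance_pct / ref_uncertainty_pct >= ratio

# Certifying a +/-1.0% (of span) meter:
print(required_reference_accuracy(1.0))        # 0.25 -> a 0.25% reference gives 4:1
print(meets_ratio(1.0, 0.5))                   # False -> 0.5% only gives 2:1
print(meets_ratio(1.0, 0.1, ratio=10.0))       # True  -> 0.1% meets the older 10:1 rule
```

On those numbers, the co-worker suggesting ±0.25% lands exactly on the 4:1 guideline, and ±0.1% would satisfy the stricter historical 10:1 practice.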

Hope that helps.

RE: How accurate does my reference meter need to be to certify 1% product?

(OP)
Thank you both!

Scotty for validating my thinking, and catserveng for the definitive answer.

Good on y'all,

Goober Dave

RE: How accurate does my reference meter need to be to certify 1% product?

Weights and Measures in Canada requires 0.3% accuracy for billing purposes, I understand.

Dik
