Measurement data with respect to gage accuracy


Runnik (Mechanical) - Jun 16, 2008
I have a question regarding how to record inspection data from a measurement device. Recently, we had an ISO inspector here who stated that we're recording more decimal places than the gage's accuracy supports. For example, Mitutoyo calipers have an accuracy of .001 in but a resolution of .0005 in, so some data was recorded out to four decimal places.

So was the ISO inspector correct? If so, is there a standard to back up his statement (he didn't provide one)?

I've searched the forums, and came across two threads (thread281-168380 and thread286-122849) that have information close to what I'm looking for, but not exactly.
 
Take a look at ASTM E29

ASTM E29 - 06b Standard Practice for Using Significant Digits in Test Data to Determine Conformance with Specifications
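
From what I can tell, its rounding method boils down to rounding the observed value to the same number of places as the specification limit (half to even on an exact 5) and then comparing against the limits. That is just my reading, so check the standard itself; a quick Python sketch with made-up limits and readings:

[code]
# Sketch of the "rounding method" as I read ASTM E29: round the observation to the
# decimal places of the spec limit (half to even on an exact 5), then compare.
# The limits and readings below are made-up numbers for illustration only.
from decimal import Decimal, ROUND_HALF_EVEN

def conforms_by_rounding(observed, low, high):
    lo, hi = Decimal(low), Decimal(high)
    rounded = Decimal(observed).quantize(lo, rounding=ROUND_HALF_EVEN)
    return lo <= rounded <= hi

print(conforms_by_rounding("1.0095", "1.005", "1.010"))  # rounds to 1.010 -> True
print(conforms_by_rounding("1.0106", "1.005", "1.010"))  # rounds to 1.011 -> False
[/code]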
 
Does that standard incorporate the manufacturer's accuracy specification for a gage?
 
You must have a good system if the inspector is nitpicking something like this.

IMO you are doing exactly what you should be doing, and the inspector is wrong. His way you would have to round, which increases your uncertainty budget. For example, say you have a tolerance of 1.005 to 1.010 and you measure 1.0095. Your way, with .001 uncertainty, the part could be as large as 1.0105. His way, you round to 1.010, the top of your tolerance, and the uncertainty now says it could be as large as 1.011. Both would be out of tolerance, but depending on the application that difference could matter when seeking deviation approval.

On the low end of the tolerance, if you measure say 1.0055, your way the part could actually be as small as 1.0045 with the .001 uncertainty. With the inspector rounding to 1.006 and an uncertainty of .001, you would consider the measurement to be no smaller than 1.005. Now yours is out of tolerance and his is in tolerance. Of course this .001 uncertainty only covers the caliper's accuracy and does not take into account other factors such as operator repeatability, which also need to be added to your budget.
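
If it helps, here is that arithmetic as a quick Python sketch. The ±.001 term stands in for the caliper accuracy alone, the limits are the 1.005/1.010 from the example, and the rounding is half-up the way the inspector would do it:

[code]
from decimal import Decimal, ROUND_HALF_UP

ACCURACY = Decimal("0.001")  # caliper accuracy alone, per the example above

def band(value):
    # Worst-case interval from just the accuracy term.
    return f"{value - ACCURACY} to {value + ACCURACY}"

for reading in (Decimal("1.0095"), Decimal("1.0055")):
    rounded = reading.quantize(Decimal("0.001"), rounding=ROUND_HALF_UP)
    print(f"reading {reading}: recorded as-is -> {band(reading)}; "
          f"rounded to {rounded} -> {band(rounded)}")
[/code]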

Sorry for the long explanation, but it aggravates me when an inspector cites something and does not back it up with a written requirement.

Eddie
 
Yeah, we do have a pretty good system here. We tend to have very few findings, and they're usually minor.

So the gage's accuracy is the same thing as the gage's uncertainty? If so, then I don't believe the inspector fully understands the concept of uncertainty in measurement (which I have to admit, I'm just trying to learn myself).

And if you go by the inspector's logic, then why bother even having that fourth decimal place on the gage?

And don't apologize for a long explanation. I'm looking for all the info I can get.
 
Couldn't an accuracy of .001" equal +/-.0005"?

Believe it if you need it or leave it if you dare. - [small]Robert Hunter[/small]
 
I would not say that accuracy = uncertainty. Only that instrument accuracy is part of the uncertainty equation, along with other factors such as environment, operator repeatability, and anything else that may cause a variance in the measurement. I too am relatively new to all this; I have been out of QC for 15 years and now find myself back in, so I will defer to anyone here who may have a better explanation.
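
For what it's worth, the way I have seen those pieces combined is root-sum-of-squares of the individual standard uncertainties (the GUM approach). The component values and the rectangular-distribution assumption below are made up just to show the shape of the calculation:

[code]
# Made-up standard uncertainties (inches), for illustration only; a real budget
# would come from the actual gage spec, environment data, and a GR&R study.
components = {
    "gage accuracy": 0.001 / 3 ** 0.5,     # treating the +/-.001 as a rectangular limit
    "operator repeatability": 0.0004,
    "thermal effects": 0.0002,
}

combined = sum(u ** 2 for u in components.values()) ** 0.5
expanded = 2 * combined                    # k = 2 coverage factor, roughly 95 %
print(f"combined standard uncertainty ~ {combined:.5f} in")
print(f"expanded uncertainty (k = 2)  ~ {expanded:.5f} in")
[/code]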

"Couldn't an accuracy of .001" equal +/-.0005"?"

On the face of it I would agree. But then I think an accuracy of .001 could also mean within .001 of the measured reading, which would mean +/-.001. Again, do any of the more experienced QC people have any input?

I don't want to hijack the thread so does anyone have any more input on Runnik's original question?

Eddie
 
I was taught to always record data to the precision of the instrument. Accuracy of the measurement is considered in a separate step, e.g. when you present the data (how big should my error bars be?) or when comparing your number to a standard. In the case of inspection, one should always try to use an instrument with a precision much greater than the tolerance range (10x if possible, 3x minimum), so that discussions of the accuracy of the tools become a non-issue, or at least one that doesn't come up very often. When it does come up, you check the caliper measurement (1.0095 vs. the 1.010 limit) with a more precise tool, like a micrometer or optical comparator. Or kick it over to MRB/the customer to let them decide, then go back and fix your process.
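
A trivial sketch of that 10x/3x rule of thumb, using the numbers from this thread; whether you put the gage's accuracy or its resolution in the denominator is a judgment call:

[code]
def tolerance_ratio(tol_low, tol_high, instrument_error):
    # Total tolerance band divided by the instrument contribution (bigger is better).
    return (tol_high - tol_low) / instrument_error

# .005" total tolerance (1.005 to 1.010) measured with calipers accurate to .001".
ratio = tolerance_ratio(1.005, 1.010, 0.001)
print(f"ratio ~ {ratio:.1f}:1  (aim for 10:1 if possible, 3:1 minimum)")
[/code]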

 
"Couldn't an accuracy of .001" equal +/-.0005"?"

It could, but my original post meant to say ±.001". Sorry for the confusion.

edpaq - I wasn't implying that the accuracy reflects the total measurement uncertainty. I was just asking whether the gage accuracy is the same thing as the gage's own uncertainty.

btrueblood - Do you mean the resolution of the instrument? Precision refers to the repeatability and reproducibility of an instrument, which isn't known unless you do a GR&R study. I think the accuracy of the tools would be an issue quite often here, as we see dimensions at the extremes of our tolerance limits quite a bit (which is a whole other issue).
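
For anyone else trying to get their head around that distinction, here is a very stripped-down sketch of what a GR&R study estimates: repeatability (spread within each operator) and reproducibility (spread between operators). A real study follows the AIAG method with multiple parts and the proper constants, so the readings and structure below are illustrative only:

[code]
import statistics as stats

# Hypothetical repeat readings (inches) of one part, three operators, three trials each.
readings = {
    "operator_A": [1.0095, 1.0090, 1.0095],
    "operator_B": [1.0090, 1.0085, 1.0090],
    "operator_C": [1.0095, 1.0095, 1.0090],
}

# Repeatability (equipment variation): pooled within-operator spread.
repeatability = stats.mean(stats.variance(r) for r in readings.values()) ** 0.5

# Reproducibility (appraiser variation): spread of the operator averages.
reproducibility = stats.stdev(stats.mean(r) for r in readings.values())

grr = (repeatability ** 2 + reproducibility ** 2) ** 0.5
print(f"repeatability ~ {repeatability:.5f} in, "
      f"reproducibility ~ {reproducibility:.5f} in, GR&R ~ {grr:.5f} in")
[/code]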
 