Measurement data with respect to gage accuracy
(OP)
I have a question regarding how to record inspection data from a measurement device. Recently, we had an ISO inspector here who stated that we're recording too many decimal places because it's more than the gage's accuracy. For example, Mitutoyo calipers have an accuracy of .001 in, but a resolution of .0005 in, so some data was recorded out to four decimal places.
So was the ISO inspector correct? If so, is there a standard to back up his statement (he didn't provide one)?
I've searched the forums, and came across two threads (thread281-168380: Inspection Accuracy VS. Dimensional Accuracy and thread286-122849: Decimal place inspection) that have information close to what I'm looking for, but not exactly.
RE: Measurement data with respect to gage accuracy
ASTM E29 - 06b Standard Practice for Using Significant Digits in Test Data to Determine Conformance with Specifications
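For anyone unfamiliar with E29: its rounding method uses the round-half-to-even rule (a dropped digit of exactly 5 rounds to the nearest even digit). A minimal sketch of that rule using Python's `decimal` module, with a hypothetical helper name `round_e29`:

```python
from decimal import Decimal, ROUND_HALF_EVEN

def round_e29(value: str, places: int) -> Decimal:
    """Round a reading to `places` decimal places using the
    round-half-to-even rule (the rounding method in ASTM E29)."""
    quantum = Decimal(1).scaleb(-places)  # places=3 -> Decimal('0.001')
    return Decimal(value).quantize(quantum, rounding=ROUND_HALF_EVEN)

# A trailing 5 that is exactly halfway rounds toward the even neighbor:
print(round_e29("1.0095", 3))  # -> 1.010 (9 is odd, rounds up)
print(round_e29("1.0085", 3))  # -> 1.008 (8 is already even)
```

Note that readings are passed as strings so the exact decimal value is preserved; feeding binary floats to `Decimal` can shift a borderline 5 either way.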
RE: Measurement data with respect to gage accuracy
RE: Measurement data with respect to gage accuracy
IMO you are doing exactly what you should be doing; the inspector is wrong. His way, you would have to round, which would increase your uncertainty budget.

For example, say you have a tolerance of 1.005 to 1.010 and you measure 1.0095. Your way, with ±.001 uncertainty, the part could be as large as 1.0105. His way, you would round to 1.010, the top of your tolerance, and the uncertainty then says it could be as large as 1.011. Both would be out of tolerance, but depending on the application this difference could be significant in acquiring deviation approval.

On the low end of the tolerance, say you measured 1.0055. Your way, with the ±.001 uncertainty, it could actually be 1.0045. With the inspector rounding to 1.006 and an uncertainty of ±.001, you would consider the measurement no smaller than 1.005. Now your reading is out of tolerance and his is in tolerance.

Of course, this ±.001 uncertainty uses only the caliper's accuracy and does not take into consideration other factors, such as operator repeatability, which need to be added to your budget.
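The argument above can be sketched numerically. This is only an illustration of the example, taking the ±.001 in accuracy as the entire uncertainty (the numbers are from the post; the helper name `bounds` is made up here):

```python
# Tolerance band and gage uncertainty from the example above.
LOW, HIGH = 1.005, 1.010
U = 0.001  # caliper accuracy taken as the whole uncertainty

def bounds(reading):
    """Worst-case true-value interval for a recorded reading."""
    return reading - U, reading + U

# Recording the full 0.0005 in resolution:
as_read = bounds(1.0095)   # roughly (1.0085, 1.0105)

# Rounding first to the gage accuracy, per the inspector:
rounded = bounds(1.010)    # roughly (1.009, 1.011)

# Both intervals extend past HIGH = 1.010, but rounding first
# pushes the worst case a further half thou out of tolerance.
print(as_read, rounded)
```

The rounding step throws away real information (the half-thou the gage actually resolved), so the worst-case interval can only get wider or shift, never tighter.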
Sorry for the long explanation, but it aggravates me when an inspector cites something and does not back it up with a written requirement.
Eddie
RE: Measurement data with respect to gage accuracy
So the gage's accuracy is the same thing as the gage's uncertainty? If so, then I don't believe the inspector fully understands the concept of uncertainty in measurement (which I have to admit, I'm just trying to learn myself).
And if you go by the inspector's logic, then why bother even having that fourth decimal place on the gage?
And don't apologize for a long explanation. I'm looking for all the info I can get.
RE: Measurement data with respect to gage accuracy
Believe it if you need it or leave it if you dare. - Robert Hunter
RE: Measurement data with respect to gage accuracy
"Couldn't an accuracy of .001" equal +/-.0005"?"
On the face of it, I would agree. But then again, an accuracy of .001 could also mean "within .001 of the measured reading," which would make it ±.001. Do any of the more experienced QC people have input on this?
I don't want to hijack the thread so does anyone have any more input on Runnik's original question?
Eddie
RE: Measurement data with respect to gage accuracy
RE: Measurement data with respect to gage accuracy
It could, but my original post meant to say ±.001". Sorry for the confusion.
edpaq - I wasn't implying that the accuracy reflected the total measurement uncertainty. I was just clarifying that the gage accuracy is equal to the gage uncertainty.
btrueblood - Do you mean the resolution of the instrument? Precision refers to the repeatability and reproducibility of an instrument, which isn't known unless you do a GR&R study. I think the accuracy of the tools would be an issue quite often here, as we see dimensions at the extremes of our tolerance limits quite a bit (which is a whole other issue).
RE: Measurement data with respect to gage accuracy