Guys, let me throw in my 2¢.
It looks to me like you are trying to measure an imperfect part with an imperfect instrument while assuming you live in a perfect world.
In the real world, things are both more complicated and simpler at the same time.
Let's say we have to measure a part according to a print that says DIA 1.000±.005.
We have a Caliper that can display to .001, but its actual measurement uncertainty is .002.
We also have a Micrometer that can display to .001, but its actual measurement uncertainty is .0005.
To be certain that our part is good, we SUBTRACT our measurement uncertainty from both sides of the tolerance zone.
So, while the part print still says DIA 1.000±.005, our QC SHEET (or whatever you call that document) will say:
“DIA 1.000±.003, measured with Caliper” or
“DIA 1.0000±.0045, measured with Micrometer”
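Just to make that arithmetic explicit, here is a minimal sketch (Python is only my choice here) of the guard-banding step; the function name guard_banded_limits is made up for illustration, and the numbers are simply the example values from this post:

```python
def guard_banded_limits(nominal, tol, uncertainty):
    # Shrink the print tolerance by the instrument's uncertainty on each side.
    half_band = tol - uncertainty
    return nominal - half_band, nominal + half_band

for name, u in [("Caliper", 0.002), ("Micrometer", 0.0005)]:
    lo, hi = guard_banded_limits(1.000, 0.005, u)
    print(f"{name}: accept {lo:.4f} .. {hi:.4f}")
# Caliper: accept 0.9970 .. 1.0030
# Micrometer: accept 0.9955 .. 1.0045
```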
So nothing has to be rounded; you just read what your instrument says:
Caliper: 1.003-good, 1.004-bad.
Micrometer: 1.004-good, 1.005-bad.
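And a tiny pass/fail check along the same lines, applying those shrunken limits to the readings above (both instruments display to .001); accept is again just an illustrative name, not anything standard:

```python
def accept(reading, lo, hi):
    # Reading is good only if it falls inside the guard-banded limits.
    return lo <= reading <= hi

# Caliper limits 0.997 .. 1.003
print(accept(1.003, 0.997, 1.003))    # True  -> good
print(accept(1.004, 0.997, 1.003))    # False -> bad

# Micrometer limits 0.9955 .. 1.0045
print(accept(1.004, 0.9955, 1.0045))  # True  -> good
print(accept(1.005, 0.9955, 1.0045))  # False -> bad
```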
So, this is the real-world approach: the numbers on your print are absolute, you measure with the best tool available to you, and you compensate for measurement uncertainty.
Now, an interesting question: will using a more accurate tool (Micrometer vs. Caliper) result in acceptance of more parts? Maybe, maybe not. If your machine is capable of producing parts to DIA 1.000±.003 without effort, you may actually save money by not buying a Micrometer. (Naturally, this is purely a thought experiment; any shop worth its salt has several measuring instruments.)