Tolerances & Relief Valve Testing
(OP)
When you test a relief valve to verify its setpoint, do you account for instrument accuracy and setpoint tolerance together? For example, if you (or the OEM) have a relief valve with a rated setpoint tolerance of +/- 3%, and you measure inlet pressure with a gauge that is accurate to +/- 0.5%, do you look for the actual popping pressure to be within +/- 3% or +/- 2.5%?
If the code were to specify separate tolerances for the setpoint AND for the test instrument, it would seem to make some sense that you would not "add" the tolerances together during the test. But is this a true assumption? And what if the code does NOT specify a separate test instrument tolerance?
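To put numbers on the question, here is a minimal Python sketch (the 100 psi setpoint is made up for illustration) comparing the two readings: accepting any pop within the full +/- 3% band, versus tightening the acceptance band by the gauge uncertainty so the true pop pressure is guaranteed to fall within +/- 3%:

    # Hypothetical example: 100 psi setpoint, +/- 3% setpoint tolerance,
    # gauge accurate to +/- 0.5% of reading.
    set_pressure = 100.0    # psi, illustrative setpoint
    setpoint_tol = 0.03     # +/- 3% rated setpoint tolerance
    gauge_acc = 0.005       # +/- 0.5% gauge accuracy

    # Interpretation 1: tolerances are separate; accept readings within +/- 3%.
    lo_full = set_pressure * (1 - setpoint_tol)
    hi_full = set_pressure * (1 + setpoint_tol)

    # Interpretation 2: guard-band the acceptance window by the gauge
    # uncertainty so the TRUE pop pressure must lie within +/- 3%.
    guarded_tol = setpoint_tol - gauge_acc    # 2.5%
    lo_gb = set_pressure * (1 - guarded_tol)
    hi_gb = set_pressure * (1 + guarded_tol)

    print(f"Full tolerance: accept {lo_full:.2f} .. {hi_full:.2f} psi")  # 97.00 .. 103.00
    print(f"Guard-banded:   accept {lo_gb:.2f} .. {hi_gb:.2f} psi")      # 97.50 .. 102.50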
RE: Tolerances & Relief Valve Testing
It is "implied" that, in order to set your relief valve accurately within the tolerances specified in ASME Section VIII, you will need to take into account (allow for) the pressure gauge accuracy.
RE: Tolerances & Relief Valve Testing
Keep in mind that there are other factors that will affect accuracy: test stand volume, gauge size and range, location of the pressure tap relative to the valve inlet (for water leg), etc.
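Of those factors, the water leg is the one that is straightforward to quantify: a water-filled column between the pressure tap and the valve inlet shifts the reading by roughly 0.433 psi per foot of elevation. A quick sketch (the 3 ft offset is purely illustrative):

    # Rough water-leg correction: ~0.433 psi per foot of water column
    # (fresh water near room temperature). The 3 ft offset is made up.
    PSI_PER_FT_WATER = 0.433

    def water_leg_correction(tap_below_inlet_ft: float) -> float:
        """Correction to add to the gauge reading when the tap sits below
        the valve inlet by the given height of water-filled line."""
        return -PSI_PER_FT_WATER * tap_below_inlet_ft

    # A gauge tapped 3 ft below the valve inlet reads ~1.3 psi high:
    reading = 101.3                                  # psi, observed pop
    true_pop = reading + water_leg_correction(3.0)   # ~100.0 psi at the valve
    print(f"Corrected pop pressure: {true_pop:.1f} psi")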
RE: Tolerances & Relief Valve Testing
I have a different view of measurement uncertainty, which comes at the issue from an instrumentation angle rather than as a Code issue. ASME PTC 19 is a series of documents on instrumentation that calls for a minimum 4:1 accuracy ratio between the calibration standard and the test object. In other words, a test gauge should be at least 4 times more accurate than the PRV being tested.

For ASME Sec. VIII applications, the minimum tolerance is +/- 2 psi for set pressures from 15 psi to 70 psi. This means the test gauge needs to be accurate to +/- 0.5 psi, i.e., 4 times tighter than the 2 psi tolerance. If you "do the numbers", a 0 to 200 psi test gauge with 0.25% accuracy will read to within +/- 0.25% of full scale, or 0.5 psi. Therefore, a 200 psi gauge would be sufficiently accurate to test any PRV set between 15 and 175 psi.

Again, the minimum recommended accuracy ratio is 4:1, but the gauge becomes increasingly more accurate relative to the tolerance as the PRV set pressure increases. For example, at a 150 psi set pressure the tolerance is +/- 3%, or 4.5 psi, which is a 9:1 accuracy ratio. I hope this long explanation is helpful.
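A minimal sketch of that arithmetic, assuming only the tolerance schedule quoted above (+/- 2 psi from 15 to 70 psi, +/- 3% of set pressure above that) and the 0 to 200 psi, 0.25%-of-full-scale test gauge:

    # Accuracy-ratio check for a 0-200 psi test gauge with 0.25% of
    # full-scale accuracy, against the ASME Sec. VIII setpoint
    # tolerances quoted above.
    FULL_SCALE = 200.0                  # psi
    GAUGE_ACC = 0.0025 * FULL_SCALE     # +/- 0.5 psi, constant over the range

    def setpoint_tolerance(set_psi: float) -> float:
        """+/- 2 psi up to 70 psi set pressure, +/- 3% above."""
        return 2.0 if set_psi <= 70.0 else 0.03 * set_psi

    for set_psi in (15, 70, 100, 150, 175):
        tol = setpoint_tolerance(set_psi)
        ratio = tol / GAUGE_ACC
        print(f"{set_psi:>4} psi set: tol +/- {tol:.2f} psi, "
              f"accuracy ratio {ratio:.1f}:1")
    # 15 psi gives exactly 4:1; 150 psi gives 9:1, matching the post.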
J. Alton Cox
www.delucatest.com