For a question that started out very simply, this thread has introduced an enormous amount of confusion. I think we should try to tie the loose ends together rather than leave this thread dangling in space-time with so much uncertainty.
David, you commented that
With a gauge calibrated in psi or kPa, you can adjust the zero as you change elevation without affecting the validity of the calibration. With a gauge calibrated in bar, that adjustment is not to zero but to some fraction of a bar away from zero.
Why would the unit "bar" be treated any differently from "psi", "kPa" or any other unit? My understanding of gauge pressure (irrespective of the units used) is that it is the pressure difference from the local atmospheric pressure. On that basis, a pressure gauge calibrated to read gauge pressure would show zero when open to the atmosphere, at whatever altitude it was calibrated for.
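To put a number on what I mean, here is a minimal sketch only - the function name and the figures are purely illustrative:

```python
def gauge_pressure(p_abs_kpa, p_atm_local_kpa):
    # Gauge pressure is simply absolute pressure minus the local atmospheric pressure
    return p_abs_kpa - p_atm_local_kpa

# A gauge open to the atmosphere sees the local pressure on both sides,
# so it reads zero at any altitude (illustrative standard-atmosphere values):
print(gauge_pressure(101.3, 101.3))  # 0.0 kPa(g) at sea level
print(gauge_pressure(57.2, 57.2))    # 0.0 kPa(g) at roughly 15,000 ft
```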
Quark stated that (btw - good to see you around again Quark - I haven't seen many posts from you recently)
So, one bar is (at sea level) about 1/1.01325 = 0.9869 atm.
I thought that 1 bar was (approximately) equal to 0.9869 atm everywhere, whether at sea level or on the moon. When I use "atm" as a unit in this way it is (IMHO) a well-defined unit of pressure and does not depend on local atmospheric conditions.
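Just to check the arithmetic (both conversion factors here are exact by definition, so the result holds anywhere):

```python
BAR_KPA = 100.0    # 1 bar is defined as exactly 100 kPa
ATM_KPA = 101.325  # 1 standard atmosphere is defined as exactly 101.325 kPa

print(BAR_KPA / ATM_KPA)  # 0.98692... atm per bar, at sea level or on the moon
print(ATM_KPA / BAR_KPA)  # 1.01325 bar per standard atmosphere
```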
The two comments by David and by Quark seem somehow linked, and seem to point to something that I have missed. If anyone can clarify this I would be grateful.
The comments by JohnGP and by alexawy2611 refer to conventions applied to the interpretation of pressure units. Conventions are helpful in that they make things easier for people within the same paradigm, but they can confuse people from the outside. In the USA it is conventional for vessel engineers to specify design pressures as gauge pressures, but this is not always true in Europe.
In the design of a vessel, the engineer is concerned with the pressure differential between the inside and the outside of the vessel (i.e. across the pressure envelope). If the outside of the vessel is subject to atmospheric pressure, then defining the internal pressure in gauge terms gives you this differential directly. But if the vessel has a jacket at a pressure other than local atmospheric, specifying the internal pressure in gauge terms would lead to confusion.
The internal pressure and temperature conditions for the vessel would have been determined by a process engineer (hopefully in absolute terms!) and they would be the same whether the vessel was operated at sea level or at 15,000 ft - but the internal gauge pressure would be different at each location. So in strict terms the design pressure varies with location when expressed in gauge terms, but in absolute terms it remains constant. We do have to stay practical, though, and appreciate that when the inspector comes to do the in-situ pressure test he will have a gauge calibrated to the local conditions, and in fact he can only measure the gauge pressure.
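To illustrate the elevation effect, here is a rough sketch using the standard-atmosphere formula for the troposphere (the 500 kPa(a) design pressure and the 15,000 ft site are just example figures):

```python
def std_atm_kpa(altitude_m):
    # Approximate atmospheric pressure from the ICAO standard-atmosphere model (troposphere)
    return 101.325 * (1.0 - 2.25577e-5 * altitude_m) ** 5.25588

p_design_abs = 500.0  # example internal design pressure, kPa absolute

for alt_ft in (0, 15000):
    p_atm = std_atm_kpa(alt_ft * 0.3048)
    p_gauge = p_design_abs - p_atm
    print(f"{alt_ft:>6} ft: atmosphere = {p_atm:5.1f} kPa(a), design = {p_gauge:5.1f} kPa(g)")

# The absolute design pressure is identical at both sites, but the gauge
# figure the inspector works to differs by roughly 44 kPa.
```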
I believe it is always safer to use absolute pressures, but it is not critical as long as every pressure is clearly specified as gauge or absolute and the engineer on the receiving end can interpret it according to his/her needs.
Katmar Software
Engineering & Risk Analysis Software