Pressure is defined as force per unit area. Common units include pounds per square inch (psi), bar, and pascals, all of which are (or are related to) force per unit area. Some plants use kg/cm2. Interestingly, this is a mass per unit area. Understand the difference. As the acceleration due to gravity changes (such as with elevation), the force per unit area changes, whereas the mass per unit area does not. The effect is generally minor on earth; it is not minor on the moon, where the acceleration due to gravity is a fraction of that on earth.
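The distinction can be made concrete with a short sketch. The calculation below converts a mass per unit area (kg/cm2) to a true pressure by multiplying by the local acceleration due to gravity; the lunar gravity value is an approximate figure used for illustration.

```python
# Sketch: why kg/cm^2 (a mass per unit area) is not, by itself, a pressure.
# The force exerted, and hence the pressure, depends on local gravity.

G_EARTH = 9.80665  # m/s^2, standard gravity
G_MOON = 1.62      # m/s^2, approximate lunar gravity (assumed for illustration)

def pressure_pa(mass_per_area_kg_per_cm2: float, g: float) -> float:
    """Convert a mass per unit area (kg/cm^2) to pressure (Pa) at gravity g."""
    mass_per_area_si = mass_per_area_kg_per_cm2 / 1e-4  # kg/m^2 (1 cm^2 = 1e-4 m^2)
    return mass_per_area_si * g                         # Pa = N/m^2

print(pressure_pa(1.0, G_EARTH))  # 98066.5 Pa on earth
print(pressure_pa(1.0, G_MOON))   # only about 16200 Pa on the moon
```

The same 1 kg/cm2 column thus exerts roughly one-sixth the pressure on the moon that it does on earth.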
Also complicating pressure units are their references. Gauge pressures represent pressure relative to the instrument’s surroundings. Absolute pressures represent pressure relative to a perfect vacuum. One would like to think that the difference between absolute pressure and gauge pressure is always 1.01325 bar (14.696 psi). However, atmospheric pressure at high elevations can be significantly below that at sea level. A gauge pressure instrument measuring ambient pressure will read identically (zero) at sea level and in Denver, CO (altitude approximately 1600 m), even though the respective absolute pressures are nominally 1.01325 bar (14.696 psi) and approximately 0.84 bar.
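The Denver figure can be estimated from the standard-atmosphere barometric formula. The sketch below uses standard sea-level constants; actual barometric pressure on any given day will of course differ.

```python
import math

# Sketch: standard-atmosphere estimate of absolute pressure vs. altitude.
P0 = 101.325   # kPa, sea-level standard pressure
T0 = 288.15    # K, sea-level standard temperature
L = 0.0065     # K/m, temperature lapse rate in the troposphere
G = 9.80665    # m/s^2, standard gravity
M = 0.0289644  # kg/mol, molar mass of dry air
R = 8.31446    # J/(mol*K), universal gas constant

def atm_pressure_kpa(altitude_m: float) -> float:
    """Absolute atmospheric pressure (kPa) at a given altitude, per the
    standard-atmosphere model."""
    return P0 * (1 - L * altitude_m / T0) ** (G * M / (R * L))

print(atm_pressure_kpa(0))     # 101.325 kPa at sea level
print(atm_pressure_kpa(1600))  # roughly 83.5 kPa (~0.84 bar) at Denver's altitude
```

A gauge instrument at either location still reads zero ambient pressure; only an absolute instrument reveals the difference.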
There are often very compelling reasons for choosing to measure absolute pressure instead of (the more common) gauge pressure. Examples include applications where absolute pressure is desirable for process reasons, locations where atmospheric pressure differs significantly from 1.01325 bar (14.696 psia) (see the example above), and measurements where the magnitude of atmospheric pressure changes approaches the error associated with the pressure instrument. An incorrect decision can represent a lost opportunity, but the good (bad?) news is that the measurement user will likely be oblivious to the loss. In other words, it can really matter in a technical (and potentially economic) sense, but it may be a (political) moot point because few would know the difference.
When compensating a flow measurement for operating pressure, one rule of thumb is to use absolute pressure transmitters when the pressure is below 3.5 bar (50 psig) (“Flow Measurement”, ISA, page 30). Assuming atmospheric pressure variations of +/- 2 percent, the absolute pressure corresponding to a 3.5 bar gauge measurement (at sea level) would vary by approximately 0.02 * 1.01325 / 3.5, or 0.6 percent of rate. This variation can introduce errors of similar magnitude to those introduced by the flowmeter — and when the pressure falls below 3.5 bar, the variation will be larger than 0.6 percent of rate. Increasing the rule of thumb to 14 bar (200 psig) would reduce the variation to approximately 0.15 percent of rate at a pressure of 14 bar. Even this may not be adequate for some measurements, so careful consideration should be the norm.
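The arithmetic behind this rule of thumb is simple enough to sketch. Following the text's approximation, the error of rate is taken as the fractional shift in the absolute pressure used for compensation; the +/- 2 percent atmospheric variation is the assumption stated above.

```python
# Sketch: rate error introduced when a gauge transmitter is used for pressure
# compensation and atmospheric pressure varies (assumed +/- 2 percent here).
P_ATM = 1.01325  # bar, nominal sea-level atmospheric pressure

def rate_error_pct(line_pressure_bar_abs: float, atm_variation: float = 0.02) -> float:
    """Approximate percent-of-rate error, per the text's approximation:
    fractional shift in the absolute pressure used for compensation."""
    return 100 * atm_variation * P_ATM / line_pressure_bar_abs

print(round(rate_error_pct(3.5), 2))   # ~0.58 percent of rate at 3.5 bar
print(round(rate_error_pct(14.0), 2))  # ~0.14 percent of rate at 14 bar
```

The error scales inversely with line pressure, which is why raising the changeover point from 3.5 bar to 14 bar shrinks the variation by roughly a factor of four.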
Beyond understanding the concepts, much of the confusion associated with pressure measurement is not really due to difficulty, but rather due to a failure to clearly communicate the units used. For example, the pressure may be reported as “1 pound” (which is a weight, but…). Is this 1 psia, 1 psig, 1 psi of vacuum, or is someone referring to a pressure difference?
What about the measurement of pressure in terms of water column that is commonly used to describe the calibration of differential pressure instruments? If the water temperature changes, its density will change, and the same height of liquid will produce a different pressure. For example, if the plant standard is based upon water at 4 °C, the error associated with using a 20 °C water column for calibration will be approximately 0.2 percent. This may appear to be a fine point, but transmitter accuracy is often advertised to be better than 0.1 percent of the calibrated span. Therefore, 1000 mmWC does not completely describe a pressure. However, 1000 mmWC at a defined water temperature does.
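The 0.2 percent figure follows directly from the densities of water at the two temperatures. The sketch below uses standard handbook density values and computes the hydrostatic pressure of a 1000 mm column at each temperature.

```python
# Sketch: calibration error from using a 20 degC water column when the plant
# standard assumes water at 4 degC. Densities are standard handbook values.
RHO_4C = 999.972   # kg/m^3, density of water at 4 degC
RHO_20C = 998.207  # kg/m^3, density of water at 20 degC
G = 9.80665        # m/s^2, standard gravity

def column_pressure_pa(height_mm: float, rho: float) -> float:
    """Hydrostatic pressure (Pa) at the base of a water column of the given
    height (mm) and density (kg/m^3): p = rho * g * h."""
    return rho * G * height_mm / 1000.0

p_ref = column_pressure_pa(1000, RHO_4C)   # the plant-standard 1000 mmWC
p_cal = column_pressure_pa(1000, RHO_20C)  # the same height at 20 degC
print(round(100 * (p_ref - p_cal) / p_ref, 2))  # ~0.18 percent low
```

That 0.18 percent discrepancy is larger than the 0.1 percent of span accuracy often advertised for transmitters, which is the point of specifying the water temperature along with the column height.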