Fluke 721, 721EX Application Note

Calibrating Gas Custody Transfer Flow Computers
Gas custody transfer flow computers that calculate flow by measuring the differential pressure across a flow restriction, such as an orifice plate, require special calibration to perform at optimum accuracy. In custody transfer applications, where the buying and selling of commodities like natural gas is involved, calibration checks are performed frequently as a matter of fiduciary responsibility. This application note describes the calibration of gas custody transfer flow computers used in the natural gas transmission industry.
Flow computers rely on multiple measurements, and each one must be calibrated. In the typical application, three measurements are made: volumetric flow, static (line) pressure, and temperature. The flow computer uses this data to calculate the actual mass of the gas flowing through the pipeline.
The Fluke 721 Precision Pressure Calibrator has special features that support the complete calibration of natural gas multi-variable electronic flowmeters and other types of flow computers. With two internal pressure ranges, external Fluke 750P pressure modules and an optional precision RTD probe, all of the calibrations required for the flow computer can be performed with just one instrument.
The Fluke 721 is available with two built-in pressure ranges from 16 psi/1 bar up to 5000 psi/345 bar. For this application, the configuration with a 16 psi/1 bar low pressure sensor (P1) and a 1500 psi/100 bar high pressure sensor (P2) is frequently the best fit. Because the Fluke 721 accuracy is specified as a percent of full scale, it is important to closely match the full scale of the calibrator to the scale of the application in order to get the best performance. (See the sidebar on system accuracy determination and the importance of maintaining an adequate accuracy ratio.)
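To illustrate why range matching matters with a percent-of-full-scale specification, the short sketch below compares the absolute uncertainty of a low-range and a high-range sensor at the same test pressure. The 0.025 % of full scale figure is an assumed example value, not the published 721 specification; substitute the accuracy from your calibrator's data sheet.

```python
# Sketch: absolute uncertainty implied by a percent-of-full-scale accuracy spec.
# The 0.025 % FS figure is an assumed example value, not an instrument spec.

def abs_uncertainty_psi(full_scale_psi: float, pct_of_fs: float) -> float:
    """Absolute uncertainty (psi) for a %-of-full-scale specification."""
    return full_scale_psi * pct_of_fs / 100.0

test_point_psi = 8.0  # a typical differential pressure test point

for label, full_scale in [("P1, 16 psi range", 16.0), ("P2, 1500 psi range", 1500.0)]:
    u = abs_uncertainty_psi(full_scale, 0.025)
    print(f"{label}: +/- {u:.4f} psi "
          f"({100.0 * u / test_point_psi:.2f} % of an {test_point_psi} psi reading)")
```

As the output shows, the same percentage specification yields a far smaller absolute uncertainty on the low-range sensor, which is why the 16 psi/1 bar P1 sensor is the better fit for the differential pressure test.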
In addition to the calibrator itself, high and low pressure calibration sources will be needed, along with an accessory RTD probe for measuring temperature. An appropriate low pressure source with 0.01 inH2O of resolution is needed for the low pressure test, and a high pressure source such as a regulated nitrogen bottle or a hydraulic hand pump is required for the high pressure test. One pump style typically does not work well for both tests: extensive cleaning is required to switch from hydraulic (oil or water) to pneumatic testing, and high pressure pumps typically do not have the resolution needed for the low pressure test.
Figure 1. Fluke 721 calibrator measuring pressure in a fluid stream (display reading 7.21 psi).
Gas custody transfer flow computer operational theory
Custody transfer flow computers are called by a variety of names including electronic flowmeters (EFMs) and multi-variable flow computers, but they all feature some common principles of operation.
1. Volumetric flow measurement uses some type of flow restriction, such as an orifice plate, to generate a pressure drop. The differential pressure created across this restriction is measured by the flow computer as the primary measurement. It is based on the principle that flow velocity is proportional to the square root of the pressure drop. The volumetric rate is then calculated from the velocity by knowing the diameter of the pipe in which the gas is flowing.
The measured pressure drop (differential) is typically around 200 inches of water column ("WC), or about 7.2 psi.
2. To convert volumetric flow to mass flow, you also need to know the density (mass per volume) of the flowing media. The flow computer makes this calculation using two additional measurements, plus a range of factors and constants based on the flowing media. The two additional measurements are the static pressure of the gas in the pipeline and the temperature of the gas in the pipeline. (A simplified version of this calculation is sketched in the code example following this list.)
The static pressure in these applications ranges widely from a low of about 300 psi / 20 bar to a high of about 2000 psi / 138 bar.
The temperature of the gas is usually at ambient, so it is within the range of normal environmental conditions.
3. A final consideration about flow computers is how they are typically installed and used.
Industrial applications use either the analog
output of the flowmeter (4 to 20 mA) or a digital output like the HART signal to get data from the flowmeter to a control system or data acquisition system.
This analog output is generally not used in gas pipeline applications. Instead, the flowmeter is a specialized device that operates standalone to measure and record the total mass flow through the pipeline. The total is periodically "downloaded" from the flowmeter to be used in an accounting of gas flow and custody transfer. This information is also often transmitted wirelessly to a central control point in operations.
The flowmeter may be packaged with other electronic devices to perform this function, or it may be purpose-built, which is the most common type.
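The sketch below illustrates, in simplified form, the calculation described in items 1 and 2: differential pressure yields volumetric flow through the square-root relationship, and static pressure plus temperature yield the gas density needed to convert that to mass flow. The discharge coefficient, the omission of the expansion factor, and the ideal-gas density used here are simplifying assumptions for illustration only; real custody transfer computers follow standards such as AGA Report No. 3 and apply compressibility and other corrections.

```python
import math

# Simplified orifice mass-flow sketch (illustration only).
# Real flow computers apply full standards-based corrections.

R_SPECIFIC = 518.3  # J/(kg*K), approx. specific gas constant of methane (assumed gas)

def gas_density(static_pressure_pa: float, temperature_k: float) -> float:
    """Ideal-gas density from static (line) pressure and temperature."""
    return static_pressure_pa / (R_SPECIFIC * temperature_k)

def mass_flow(dp_pa: float, static_pressure_pa: float, temperature_k: float,
              orifice_d_m: float, pipe_d_m: float, cd: float = 0.61) -> float:
    """Mass flow (kg/s) through an orifice from differential pressure."""
    rho = gas_density(static_pressure_pa, temperature_k)
    beta = orifice_d_m / pipe_d_m                 # diameter ratio
    area = math.pi * (orifice_d_m / 2.0) ** 2     # orifice bore area
    # Classic orifice equation: flow is proportional to sqrt(differential pressure)
    return (cd / math.sqrt(1.0 - beta ** 4)) * area * math.sqrt(2.0 * rho * dp_pa)

# Example: 200 inH2O differential, 800 psi line pressure, 15 degC gas
dp = 200 * 249.089          # inH2O -> Pa
p_static = 800 * 6894.76    # psi -> Pa
q_m = mass_flow(dp, p_static, 288.15, orifice_d_m=0.05, pipe_d_m=0.10)
print(f"Mass flow: {q_m:.2f} kg/s")
```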
How to calibrate the flow computer
Each flow computer manufacturer has created a proprietary method of calibration, but they all use the same general technique, which will be described here.
In these proprietary calibrations, the manufacturer provides a software application that runs on a notebook computer (PC). The PC is connected to the serial port or USB port of the flow computer.
System Accuracy Determination
In order to effectively calibrate an instrument, the calibrator used must be more accurate than the instrument by some factor. The factor will vary according to the application, but it should be as large as is practical. The minimum factor is generally considered to be 3 to 4 times. The common term for expressing this factor is Test Uncertainty Ratio, or TUR. If the calibrator is 4 times more accurate than the device being tested, it is referred to as having a TUR of 4:1.
The rationale behind this comes from a technique for the statistical analysis of the error in a system, called Root Sum Square (RSS). To determine the error in a system, you take the square root of the sum of the squares of the errors of all elements in the system. Note that this is not the maximum possible error in the system, but rather the largest error that is statistically likely.
This formula describes the calculation, where $E_t$ is the total error and $E_1, E_2, \ldots, E_n$ are the errors of the individual components of the system:

$$E_t = \sqrt{E_1^2 + E_2^2 + \cdots + E_n^2}$$
By using a TUR of 4:1, the effect of the error in the calibrator is reduced to a small percentage of the error of the instrument under test and can therefore generally be disregarded. As an alternative to having a calibrator with the appropriate ratio, users may elect to de-rate the performance of the instrument to a value four times that of the calibrator.
For example, a calibrator with ±0.05% accuracy would have a TUR of 4:1 when testing an instrument with an accuracy of ±0.2%. Due to the continual advances in instrument technology, calibration technology may, from time to time, fail to provide the necessary TUR to calibrate to the instrument manufacturer's rated specification. Alternately, you can tighten the test tolerance to 80% of the desired specification to gain the same confidence, using a technique called guardbanding.

The fundamental concept of guardbanding is to restrict the Pass/Fail limits applied to a calibration test based on a defined criterion. The purpose of guardbanding is to control the risk of accepting an out-of-tolerance unit or rejecting an in-tolerance unit. Without guardbanding, the result of a test will be Pass or Fail. With guardbanding, the result of a test will be Pass, Fail, or Indeterminate; a result that would have been Pass or Fail without guardbanding may become Indeterminate with guardbanding. For more information on guardbanding, refer to the application note "Guardbanding with Confidence" at www.fluke.com/guardbanding.
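As an illustration of the ideas in this sidebar, the sketch below combines errors with RSS, checks the TUR, and applies the 80% guardbanded acceptance limit described above. The accuracy numbers are example values only, not specifications for any particular instrument.

```python
import math

def rss_error(*errors: float) -> float:
    """Root Sum Square combination of individual error contributions."""
    return math.sqrt(sum(e ** 2 for e in errors))

# Example values only (percent of span); not instrument specifications.
calibrator_error = 0.05   # % accuracy of the calibrator
instrument_spec  = 0.20   # % tolerance of the device under test

tur = instrument_spec / calibrator_error
total_error = rss_error(instrument_spec, calibrator_error)

print(f"TUR = {tur:.1f}:1")
print(f"RSS total error = {total_error:.3f} %  (vs. {instrument_spec} % spec)")

# Guardbanding: tighten the acceptance limit to 80 % of the specification
# so a marginal result is flagged as Indeterminate rather than passed.
guardband_limit = 0.8 * instrument_spec

def judge(measured_error: float) -> str:
    if abs(measured_error) <= guardband_limit:
        return "Pass"
    if abs(measured_error) <= instrument_spec:
        return "Indeterminate"
    return "Fail"

for err in (0.10, 0.18, 0.25):
    print(f"measured error {err} % -> {judge(err)}")
```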