Process Instruments (Measurement Features)


An overview of key concepts of process instrumentation: measurement precision (a static characteristic that expresses the degree of agreement between the indication of the instrument and the value of the measurand being measured or monitored), measurement uncertainty, metrological confirmation, and calibration.

MEASUREMENT PRECISION

By measurement precision we mean:

  • according to ISO-IMV (International Metrology Vocabulary): “…closeness of agreement between indications or measured quantity values obtained by replicate measurements on the same or similar objects under specified conditions”;
  • according to IEC-IEV (International Electrotechnical Vocabulary): “…quality which characterizes the ability of a measuring instrument to provide an indicated value close to a true value of the measurand” (Note: in this case, however, the term used is Accuracy rather than Precision);
  • or we could deduce the following practical definition from the previous ones: “…by testing a measuring instrument under specified conditions and with specified procedures, the maximum positive and negative deviations from a specified characteristic curve (usually a straight line)”.

Therefore, the concept of linearity is also inherent in the term measurement precision (a concept that is nowadays of limited relevance in digital instrumentation), while the concept of hysteresis is not explicitly mentioned (although it is in fact accounted for, since it falls within the maximum positive and negative deviations found).

Figure – Instruments accuracy and precision

Furthermore, the concept of measurement repeatability is not included (it is instead considered when the precision is verified over several measuring cycles).

Therefore, in the practical verification of the precision of measuring instruments with a single up-and-down measurement cycle (generally conducted for instruments affected by hysteresis, such as pressure gauges, pressure transducers, load cells, etc.), a calibration curve of the type shown in Figure 1 is obtained. From it we can deduce the concept of tested accuracy (measured accuracy), which must fall within the so-called nominal accuracy (rated accuracy), i.e. the limits within which the imprecision of an instrument is guaranteed by its specification.


Sometimes this concept of imprecision, for some common types of instruments (such as gauges, resistance thermometers, thermocouples, etc.), is also expressed as a precision or accuracy class, which according to the International Reference Vocabularies ISO-IMV and IEC-IEV is a: “class of measuring instruments or measuring systems that meet stated metrological requirements that are intended to keep measurement errors or instrumental measurement uncertainties within specified limits under specified operating conditions” (i.e., the measured accuracy must be within the rated accuracy: see also Figure 1).

Figure 1 – Exemplification of measurement accuracy concepts

MEASUREMENT UNCERTAINTY

The measurement uncertainty of the measuring instrument is a more recent concept, which takes into account, during calibration, not only the errors or deviations found but also the resolution of the instrument's indication, as well as the uncertainty of the measurement standard used in the calibration itself.

Figure – Instruments measurement uncertainty

By measurement uncertainty we mean:

  • according to ISO-IMV (International Metrology Vocabulary): “non-negative parameter characterizing the dispersion of the quantity values being attributed to a measurand, based on the information used”;
  • according to ISO-GUM (Guide to the Expression of Uncertainty in Measurement): “result of the estimation that determines the amplitude of the field within which the true value of a measurand must lie, generally with a given probability, that is, with a determined level of confidence”.

From the above definitions we can deduce two fundamental concepts of measurement uncertainty:

  1. Uncertainty is the result of an estimate, which is evaluated according to the following two types:
  • Type A: when the evaluation is done by statistical methods, that is, through a series of repeated observations or measurements.
  • Type B: when the evaluation is done using methods other than statistical ones, that is, with data that can be found in manuals, catalogs, specifications, etc.

2. The uncertainty of the estimate must be given with a certain probability, which is normally provided in the three following expressions (see also Table 1):

  • Standard uncertainty (u): at the probability or confidence level of 68% (exactly 68.27%).
  • Combined uncertainty (uc): the standard uncertainty of measurement when the result of the estimate is obtained by means of the values of different quantities and corresponds to the summing in quadrature of the standard uncertainties of the various quantities relating to the measurement process.
  • Expanded uncertainty (U): uncertainty at the 95% probability or confidence level (exactly 95.45%), or 2 standard deviations, assuming a normal or Gaussian probability distribution.
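As an illustration of the expressions above, the following minimal Python sketch performs a Type A (statistical) evaluation on a set of hypothetical repeated readings and expands it with a coverage factor k = 2; the readings and variable names are assumptions, not data from the article.

```python
# A minimal Type A (statistical) evaluation, as in the expressions above.
# The readings below are hypothetical example data, not values from the article.
import math
import statistics

readings_bar = [4.02, 3.98, 4.01, 3.99, 4.00, 4.03, 3.97, 4.00]  # repeated readings (bar)

n = len(readings_bar)
mean = statistics.mean(readings_bar)

# Experimental standard deviation of the readings (Type A evaluation)
s = statistics.stdev(readings_bar)

# Standard uncertainty of the mean value (u), i.e. at ~68% confidence level
u = s / math.sqrt(n)

# Expanded uncertainty (U) with coverage factor k = 2, i.e. ~95% confidence level
k = 2
U = k * u

print(f"mean = {mean:.3f} bar, u = {u:.4f} bar, U (k=2) = {U:.4f} bar")
```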

Standard uncertainty u(x) (a)

The uncertainty of the result of measurement expressed as a standard deviation: u(x) ≡ s(x)

Type A evaluation (of uncertainty)

Method of evaluation of uncertainty by the statistical analysis of series of observations

Type B evaluation (of uncertainty)

Method of evaluation of uncertainty by means other than the statistical analysis of series of observations

Combined standard uncertainty uc(x)

Standard uncertainty of the result of measurement when that result is obtained from the values of a number of other quantities, equal to the positive square root of a sum of terms, the terms being the variances or covariances of these other quantities weighted according to how the measurement result varies with changes in these quantities

Coverage factor k

The numerical factor used as a multiplier of the combined standard uncertainty in order to obtain an expanded uncertainty (normally 2 for a probability of ≈ 95% and 3 for a probability of ≈ 99%)

Expanded uncertainty U(y) = k · uc(y) (b)

Quantity defining an interval about the result of a measurement that may be expected to encompass a large fraction of the distribution of values that could reasonably be attributed to the measurand (normally obtained by multiplying the combined standard uncertainty by a coverage factor k = 2, i.e. with a coverage probability of 95%)

(a) The standard uncertainty u(x), i.e. the root mean square deviation s(x), if not determined experimentally from a normal (Gaussian) distribution, can be calculated using the following relationships:

u(x) = a/√3, for rectangular distributions with an amplitude of variation ± a, e.g. indication errors

u(x) = a/√6, for triangular distributions with an amplitude of variation ± a, e.g. interpolation errors

(b) The expanded measurement uncertainty U(y), unless otherwise specified, is to be understood as provided or calculated from the combined uncertainty with a coverage factor of 2, i.e. with a 95% probability level.

Table 1 – Main terms & definitions related to measurement uncertainty according to ISO-GUM
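The relations in note (a) and the quadrature sum defining the combined uncertainty can be sketched in Python as follows; the half-widths used are hypothetical values chosen only for illustration.

```python
# Type B standard uncertainties per note (a) of Table 1, combined in quadrature
# and expanded with a coverage factor k = 2. Half-widths are hypothetical values.
import math

def u_rectangular(a):
    """Standard uncertainty for a rectangular distribution of half-width a (e.g. indication errors)."""
    return a / math.sqrt(3)

def u_triangular(a):
    """Standard uncertainty for a triangular distribution of half-width a (e.g. interpolation errors)."""
    return a / math.sqrt(6)

u_indication    = u_rectangular(0.05)  # hypothetical half-width: 0.05 bar
u_interpolation = u_triangular(0.02)   # hypothetical half-width: 0.02 bar

# Combined standard uncertainty uc: positive square root of the sum of the variances
uc = math.sqrt(u_indication**2 + u_interpolation**2)

# Expanded uncertainty U = k * uc (k = 2 for a coverage probability of ~95%)
U = 2 * uc

print(f"uc = {uc:.4f} bar, U = {U:.4f} bar")
```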

METROLOGICAL CONFIRMATION

The metrological confirmation is the routine verification and control operation that confirms that the measuring instrument (or equipment) maintains the accuracy and uncertainty characteristics required for the measurement process over time.

By metrological confirmation we mean, according to ISO 10012 (Measurement Management Systems): “set of operations required to ensure that measuring equipment conforms to the requirements for its intended use”, and it generally includes:

  • instrument calibration and verification;
  • any necessary adjustment and the consequent recalibration;
  • the comparison with the metrological requirements for the intended use of the equipment;
  • the labeling of the successful metrological confirmation.
Figure – Metrological confirmation for process instrumentation

The metrological confirmation must be guaranteed through a measurement management system which essentially involves the phases of Table 1.

NORMAL PHASES                                   PHASES IN CASE OF ADJUSTMENT            PHASES IN CASE OF IMPOSSIBLE ADJUSTMENT
0. Equipment scheduling
1. Identification of the need for calibration
2. Equipment calibration
3. Drafting of the calibration document
4. Calibration identification
5. Are there metrological requirements?
6. Compliance with metrological requirements    6a. Adjustment or repair                6b. Adjustment impossible
7. Drafting of the confirmation document        7a. Review of confirmation intervals    7b. Negative verification
8. Confirmation status identification           8a. Recalibration (phases 2 to 8)       8b. Status identification
9. Need satisfied                               9a. Need satisfied                      9b. Need not satisfied

Table 1 – Main phases of the metrological confirmation (ISO 10012)
Table 1 highlights three possible paths of metrological confirmation:

  1. the left path, which normally achieves a positive outcome of the metrological confirmation without any adjustment of the instrument under confirmation (compliance at phase 6);
  2. first the left path and then the middle one, from phase 6a to 9a, in case of successful adjustment or repair of the instrument under confirmation, whose recalibration satisfies the confirmation: in this case, it will only be necessary to reduce the confirmation interval;
  3. first the left path and then the right one, from phase 6b to 9b, in case of unsuccessful adjustment or repair of the instrument under confirmation, which does not satisfy the confirmation: the instrument must therefore be downgraded or disposed of.

Metrological confirmation can usually be accomplished and fulfilled in two ways:

  • by comparing the Maximum Relieved Error (MRE) with the Maximum Tolerated Error (MTE), i.e. MRE ≤ MTE;
  • by comparing the Maximum Relieved Uncertainty (MRU) with the Maximum Tolerated Uncertainty (MTU), i.e. MRU ≤ MTU.
With reference to the previous articles, and in particular to the one on Calibration, where the calibration results of a manometer were evaluated in terms of Error and Uncertainty, respectively equal to:

  • MRE: ± 0.05 bar
  • MRU: 0.066 bar

if the maximum tolerated error and the maximum tolerated uncertainty were both 0.05 bar, the manometer would be compliant when evaluated in terms of MRE, but not compliant when evaluated in terms of MRU; it should therefore follow path 2 of Table 1 or, if adjustment is not possible, path 3, and be downgraded.
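A minimal sketch of these two acceptance checks, using the example figures quoted above for the manometer; the function name and structure are illustrative, not from the article.

```python
# A minimal sketch of the two acceptance criteria (MRE <= MTE, MRU <= MTU).
def metrological_confirmation(mre, mru, mte, mtu):
    """Return the outcome of both acceptance criteria."""
    return {
        "error_compliant":       mre <= mte,   # MRE <= MTE
        "uncertainty_compliant": mru <= mtu,   # MRU <= MTU
    }

# Example figures from the text: MRE = 0.05 bar, MRU = 0.066 bar,
# with tolerated limits MTE = MTU = 0.05 bar
print(metrological_confirmation(mre=0.05, mru=0.066, mte=0.05, mtu=0.05))
# -> {'error_compliant': True, 'uncertainty_compliant': False}
```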

INSTRUMENT CALIBRATION

Instrument calibration is the operation of obtaining, under specified conditions, the relationship between the values of a measurand and the corresponding output indications of the instrument under calibration.

By calibration we mean:

  • according to ISO-IMV (International Metrology Vocabulary): “operation that, under specified conditions, in a first step, establishes a relation between the quantity values with measurement uncertainties provided by measurement standards and corresponding indications with associated measurement uncertainties and, in a second step, uses this information to establish a relation for obtaining a measurement result from an indication”;
  • or we could deduce the following practical definition from the previous one: “operation performed to establish a relationship between the measured quantity and the corresponding output values of an instrument under specified conditions”.
Figure – Instrument calibration

Calibration should not be confused with adjustment, which means: “set of operations carried out on a measuring system so that it provides prescribed indications corresponding to given values of a quantity to be measured” (ISO-IMV).

Hence, adjustment is typically the operation performed before calibration, or the subsequent operation when a loss of calibration of the measuring instrument is found.

Calibration should be performed on 3 or 5 equidistant measuring points, for increasing values (and also decreasing values in the case of instruments affected by hysteresis, e.g. manometers).

Figure 1 presents the calibration setup, while Table 1 presents the calibration results.


Figure 1 – Calibration setup of a manometer

 
Reference pressure (bar)   Relieved value Up (bar)   Relieved value Down (bar)   Relieved error Up (bar)   Relieved error Down (bar)   Max relieved error (bar)
 0        –         0.05       –           + 0.05       0.05
 2        1.95      2.05       – 0.05      + 0.05       0.05
 4        3.95      4.05       – 0.05      + 0.05       0.05
 6        5.95      6.00       – 0.05        0.00       0.05
 8        7.95      8.00       – 0.05        0.00       0.05
10       10.00      –            0.00       –           0.00
Table 1 – Calibration Results
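The error columns of Table 1 can be reproduced from the raw up/down readings with a short script such as the following sketch (readings taken from Table 1; None marks the points not recorded in a cycle).

```python
# Deriving the error columns of Table 1 from the raw up/down readings.
# Readings are those of Table 1; None marks points not recorded in a cycle.
calibration = [
    # (reference, up reading, down reading) in bar
    (0,  None,  0.05),
    (2,  1.95,  2.05),
    (4,  3.95,  4.05),
    (6,  5.95,  6.00),
    (8,  7.95,  8.00),
    (10, 10.00, None),
]

def fmt(err):
    return "  -  " if err is None else f"{err:+.2f}"

max_relieved_error = 0.0
for ref, up, down in calibration:
    err_up   = None if up   is None else up   - ref   # relieved error, increasing cycle
    err_down = None if down is None else down - ref   # relieved error, decreasing cycle
    for err in (err_up, err_down):
        if err is not None:
            max_relieved_error = max(max_relieved_error, abs(err))
    print(f"{ref:>4} bar   up: {fmt(err_up)}   down: {fmt(err_down)}")

print(f"Maximum relieved error: +/- {max_relieved_error:.2f} bar")
```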

From the calibration results shown in Table 1, the metrological characteristics of the manometer (or pressure gauge) can be obtained in terms of:

  • Measurement Accuracy: that is, the maximum positive and negative error: ± 0.05 bar
  • Measurement Uncertainty: or instrumental uncertainty, which takes into account the various factors related to the calibration, namely:

Iref   Uncertainty of the reference standard        0.01 bar (assumed)
Emax   Maximum relieved measurement error           0.05 bar
Eres   Resolution error of the manometer            0.05 bar
from which the combined uncertainty uc can be derived from the following relation, after converting each contribution into a standard uncertainty (see note (a) of Table 1):

uc = √( u(Iref)² + u(Emax)² + u(Eres)² )

and then the expanded uncertainty (U), at the 95% confidence level (i.e. at 2 standard deviations):

U = 2 · uc
NOTE:
Obviously, the measurement uncertainty of the manometer (usually called instrumental uncertainty) is always higher than the measurement accuracy (because it also takes into account the error of resolution of the instrument in calibration and the uncertainty of the reference standard used in the calibration process).
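The numerical combination can be sketched as follows; the divisors used to turn Iref, Emax, and Eres into standard uncertainties are assumptions chosen for illustration (reference uncertainty taken as a standard uncertainty, maximum error treated as rectangular, resolution treated as rectangular over its half-width), not the article's exact convention. With these assumptions the result is of the same order as the MRU of ≈ 0.066 bar quoted in the metrological-confirmation section.

```python
# A hedged numerical sketch of the combination above. The divisors used to turn
# each contribution into a standard uncertainty are assumptions for illustration:
# Iref taken as a standard uncertainty, Emax treated as rectangular (/sqrt(3)),
# the resolution treated as rectangular over its half-width (/(2*sqrt(3))).
import math

I_ref = 0.01   # uncertainty of the reference standard (bar)
E_max = 0.05   # maximum relieved measurement error (bar)
E_res = 0.05   # resolution of the manometer (bar)

u_ref = I_ref
u_err = E_max / math.sqrt(3)
u_res = (E_res / 2) / math.sqrt(3)

uc = math.sqrt(u_ref**2 + u_err**2 + u_res**2)   # combined standard uncertainty
U  = 2 * uc                                      # expanded uncertainty, k = 2 (~95%)

print(f"uc = {uc:.3f} bar, U = {U:.3f} bar")     # with these assumptions, U is ~0.07 bar
```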

This article describes the standardized analog pneumatic signals (20 to 100 kPa) and electrical signals (4 to 20 mA), the hybrid analog-digital HART signals (Highway Addressable Remote Transducer), as well as the current state of the art of digital communication protocols, commonly called BUS.

CONTROL SIGNALS: ANALOG, HYBRID, DIGITAL

Analog Control Signals

The traditional and most commonly used transmission signals are of the following types:

  • Direct current signals (Table 1): for connections between instruments over long distances (i.e. in the field area)
  • Direct voltage signals (Table 2): for connections between instruments over short distances (i.e. in the control room)
LOWER LIMIT (mA)   UPPER LIMIT (mA)   NOTE
4                  20                 (1)
0                  20

(1) Preferential signal

Table 1 – Standardized signals in direct current (IEC 60381-1)

LOWER LIMIT (V)   UPPER LIMIT (V)   NOTE
1                 5                 (1)
0                 5                 (1)
0                 10                (1)
– 10              + 10              (2)

(1) Voltage signals that can be derived directly from normalized current signals

(2) Voltage signals that can represent physical quantities of a bipolar nature

Table 2 – Standardized signals in direct voltage (IEC 60381-2)
A signal value different from 0 (live zero) at the beginning of the measuring range (true zero) is used in electrical instruments to power the instrument and, in general, to reveal connection failures (as is also the case in pneumatic instruments).

Moreover, given their characteristics, the current signals are used in the field instrumentation, while the voltage signals are used in the technical and control room instrumentation.

Finally, compared with the voltage signal, the current signal has the advantage of not being affected by the length, and hence the resistance, of the connection line, at least up to certain resistance values, as illustrated in Figure 1.


Figure 1 – Example of the limit of the operating region for field instrumentation in terms of its connection resistance (Ω) versus the supply voltage (V)
 
Key:

  • Vdc = actual supply voltage in volt
  • Vmax = maximum supply voltage, 30 V in this example
  • Vmin = minimum supply voltage, 10 V in this example
  • RL = maximum load resistance in ohm at the actual supply voltage: RL ≤ (Vdc – 10) / 0.02 in the example reported in Figure 1 (see the sketch below)
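The load-resistance limit in the key above can be evaluated with a one-line function; the figures used (Vmin = 10 V, 20 mA full-scale current) are those of the example in Figure 1, and the function name is illustrative.

```python
# The load-resistance limit from the key above, using the example figures
# of Figure 1 (Vmin = 10 V, 20 mA full-scale current).
def max_load_resistance(v_supply, v_min=10.0, i_max=0.02):
    """Maximum loop resistance (ohm) a 4-20 mA transmitter can drive."""
    return (v_supply - v_min) / i_max

for vdc in (12, 24, 30):
    print(f"Vdc = {vdc} V -> RL <= {max_load_resistance(vdc):.0f} ohm")
# e.g. Vdc = 24 V -> RL <= 700 ohm
```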

Hybrid Control Signals

Hybrid signals, i.e. of the analog-digital protocol type, were standardized “de facto” by a consortium of manufacturers as:

HART (Highway Addressable Remote Transducer), which superimposes on the normalized analog signal (4 to 20 mA) a digital signal, frequency-modulated according to the Bell 202 standard, with an amplitude of ± 0.5 mA and with the frequencies listed in Table 3. Given the high frequency of the superimposed signal, the added energy is virtually zero, so this modulation does not cause any disturbance to the analog signal.

NOTE: Remember that, to operate, the HART protocol requires a resistance of 250 ohms in the output circuit!

Table 3 – HART protocol with the standardized Bell 202 signals
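Table 3 itself is reproduced as an image, but the Bell 202 standard uses 1200 Hz for a logical “1” and 2200 Hz for a logical “0”. The following simplified sketch (not a HART implementation; function name and parameters are illustrative) superimposes such a ± 0.5 mA FSK tone on a steady analog current, showing why the mean value of the loop current is left practically unchanged.

```python
# Simplified illustration (not a HART implementation): a +/- 0.5 mA Bell 202
# FSK tone superimposed on a steady analog loop current. Bell 202 frequencies:
# 1200 Hz for a logical "1", 2200 Hz for a logical "0".
import math

def hart_like_waveform(bits, analog_current_ma=12.0, bit_rate=1200, fs=48000):
    """Return current samples (mA): analog value + continuous-phase FSK tone."""
    samples = []
    phase = 0.0
    for bit in bits:
        freq = 1200 if bit == 1 else 2200
        for _ in range(fs // bit_rate):
            phase += 2 * math.pi * freq / fs      # keep the phase continuous
            samples.append(analog_current_ma + 0.5 * math.sin(phase))
    return samples

wave = hart_like_waveform([1, 0, 1, 1, 0])
mean_ma = sum(wave) / len(wave)
print(f"{len(wave)} samples, mean = {mean_ma:.2f} mA (analog value practically unchanged)")
```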

Digital Control Signals

Digital signals were standardized towards the end of the 1990s by the International Standard IEC 61158 on fieldbus protocols; this standard, however, is still not widely applied, since it standardizes as many as 8 communication protocols. Each digital protocol is essentially characterized by the following features (see Table 4):

  • Transmission encoding: Preamble, frame start, transmission of the frame, end of the frame, transmission parity, etc.
  • Access to the network: Probabilistic, deterministic, etc.
  • Network management: Master-Slave, Producer-Consumer, etc.
PROTOCOL IEC 61158   PROTOCOL NAME          NOTE
1                    Standard IEC           (1)
2                    ControlNet
3                    ProfiBus
4                    P-Net
5                    Fieldbus Foundation
6                    SwiftNet
7                    WorldFip
8                    InterBus

(1) Protocol initially designed as the unique IEC standard protocol

Table 4 – Standardized protocols provided for by the International Standard IEC 61158

Finally, Figure 2 shows the geographic path of the measurement signals from the “field” to the “control room” through the “technical room”, where the sorting (also called “marshalling”) takes place and the current signal is converted into a voltage signal for the controller (DCS: Distributed Control System); the signals then flow in digital form into the “control room” to the operator station and display (HMI: Human Machine Interface).

Figure 2 – Typical path of a measurement chain from the field to the control room

INSTRUMENTATION POWER SUPPLY

  • For pneumatic instrumentation: 140 ± 10 kPa (1.4 ± 0.1 bar) (sometimes the normalized pneumatic power supply is still expressed in English units: 20 psi, corresponding to ≈ 1.4 bar)
  • For electrical instrumentation: direct voltage of 24 V DC for field instrumentation; alternating voltage of 220 V AC for control room and technical room instrumentation

The connection and transmission signals between the various instruments in the measuring and regulating chains are standardized by the IEC (International Electrotechnical Commission):

  • Pneumatic signals (IEC 60382): 20 to 100 kPa (0.2 to 1.0 bar) (sometimes the standardized signal is still expressed in English units: 3 to 15 psi, ≈ 0.21 to 1.03 bar)
  • Electrical signals (IEC 60381): 4 to 20 mA in direct current and 1 to 5 V in direct voltage (see Tables 1 and 2 above, and the scaling sketch below)
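Both standardized ranges imply the same linear scaling of the measured variable; the following sketch (illustrative function and example values, not from the article) converts a process value into the corresponding 4 to 20 mA and 20 to 100 kPa signals.

```python
# Linear scaling of a measured value to the standardized signal ranges above.
# The function and the 0-10 bar example range are illustrative assumptions.
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly convert `value` from the range (in_lo, in_hi) to (out_lo, out_hi)."""
    fraction = (value - in_lo) / (in_hi - in_lo)
    return out_lo + fraction * (out_hi - out_lo)

pv = 6.0  # e.g. a pressure transmitter calibrated 0-10 bar reading 6 bar
print(f"{scale(pv, 0.0, 10.0, 4.0, 20.0):.1f} mA")     # electrical signal: 13.6 mA
print(f"{scale(pv, 0.0, 10.0, 20.0, 100.0):.1f} kPa")  # pneumatic signal: 68.0 kPa
```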

About the Author

Alessandro Brunelli

Author: Dott. Prof. Alessandro Brunelli – Professor of Instrumentation, Automation, and Safety of Industrial Plants

Cavaliere dell’ Ordine al Merito della Repubblica Italiana (OMRI N. 9826 Serie VI)

Author of “Instrumentation Manual” (available in IT):

  • Part 1: illustrates the general concepts of industrial instrumentation, the symbology, the terminology and calibration of measurement instrumentation, the functional and application conditions of instrumentation in normal applications and in applications with danger of explosion, as well as the main directives (ATEX, EMC, LVD, MID, and PED);
  • Part 2: deals with the instrumentation for measuring physical quantities (pressure, level, flow rate, temperature, humidity, viscosity, mass density, force, and vibration) and chemical quantities (pH, redox, conductivity, turbidity, explosiveness, gas chromatography, and spectrography), treating, for each quantity, the measurement principles, the reference standards, the practical implementations, and the application advantages and disadvantages;
  • Part 3: illustrates the control, regulation, and safety valves, then simple feedback regulation techniques and coordinated techniques (feedforward, ratio, cascade, override, split range, gap control, variable decoupling), then Distributed Control Systems (DCS) for continuous processes, Programmable Logic Controllers (PLC) for discontinuous processes, and Communication Protocols (BUS), and finally the aspects relating to plant safety systems, from operational alarms to Fire & Gas systems, to ESD shutdown systems and, lastly, to Safety Instrumented Systems (SIS), with graphic and analytical determination of the Safety Integrity Levels (SIL) and some practical examples.

Download the PDF – Traceability & Calibration Handbook

You can download an excerpt of the “Instrumentation Manual” (Brunelli, 2018-2019) by clicking on the following link:

Traceability & Calibration Handbook (Process Instrumentation)
