
Electronic Transmitter Ranging


Weegie (Petroleum)
Feb 20, 2007
Are there any 'rules of thumb' regarding the calibration ranges of electronic transmitters?
I have a process with norm/max values of 100/150 psig.
I've asked for a calibration range of 0-200 psig (4-20 mA), which puts the normal value at around 12 mA and the max value at 16 mA.
I also have a temperature transmitter whose norm/max process values are 127/149 deg C. I'm ranging the transmitter 0-200 deg C. My boss, however, says it should be 50-200 deg C. I asked why and he said 'it's always been that way'. I'm quite sure my 0-200 range is just as good as his 50-200 in terms of accuracy etc., as we're calibrating modern electronics and not electro-mechanical transmitters from the seventies. Any thoughts or comments?
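
For reference, the 4-20 mA signal is just a linear interpolation over the calibrated span; here's a quick sketch in Python (illustrative only, the function name is my own) showing where the 12 mA and 16 mA figures come from:

```python
def pv_to_ma(pv, lrv, urv):
    """Map a process value onto a 4-20 mA signal, assuming a linear span from lrv to urv."""
    return 4.0 + 16.0 * (pv - lrv) / (urv - lrv)

# 0-200 psig span: normal 100 psig, max 150 psig
print(pv_to_ma(100, 0, 200))  # 12.0 mA
print(pv_to_ma(150, 0, 200))  # 16.0 mA
```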

Thanks.
 

I usually maximise the resolution and minimise the error.
The sensor may measure with good precision but the electronics which convert to analogue usually have an accuracy penalty. It is important not to give too much of that accuracy away.

So if the temperature is only ever 127 to 149 then a useful range is 120 to 150. In fact, there is nothing to prevent you spanning it as 127 degC at 4 mA and 149 degC at 20 mA. The precision with which the range limits are specified suggests either that the limits are critical, or that someone hasn't really thought about them and has simply plopped down the result of a calculation (e.g. a conversion from degF to degC).
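
To put rough numbers on how much resolution the wide span gives away, here is a sketch (assuming, purely for illustration, a receiver that resolves about 1 µA of signal):

```python
def degc_per_microamp(lrv, urv):
    """Temperature change represented by 1 uA of signal over the 16 mA (16000 uA) span."""
    return (urv - lrv) / 16000.0

print(degc_per_microamp(0, 200))    # 0.0125 degC per uA
print(degc_per_microamp(120, 150))  # 0.001875 degC per uA, roughly 7x finer
```

Any error specified as a percentage of span shrinks in the same proportion.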

What is the point of ranging the transmitter down to 0 degC when you will never operate at that temperature?

Presumably, if you do get down to that temperature then it is a fault condition and you need an alarm, not an accurate temperature measurement.
What does it matter whether it reads 50 or 60 degC when both are in low-temperature alarm condition?
If the analogue signal is configured with 2 mA as an alarm trip, you'll get an alarm whenever the reading drops below the bottom of the range; and it seems from the range data that precise temperature values within the range are what really matter.

Hence, if the temperature range is critically 127 to 149 degC, then use 150 degC as the top of the range and something around 125 degC as the bottom.

The reason a lot of people choose 0 to 200 for 4-20 mA dates back to simple analogue dial-type displays and the early, primitive digital displays.
Today's indicators are smarter, so when they are told that 4 mA is 125 degC and 20 mA is 150 degC they have no trouble converting the signal into the correct temperature value.
If there are alarms required then use a trip amp.
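
To make that concrete, the indicator just reverses the same linear scaling, and anything well under 4 mA can be treated as a fault rather than misread as a temperature. A rough sketch (the 125/150 degC span and 2 mA trip are the values discussed above; the function itself is only illustrative):

```python
def ma_to_pv(ma, lrv=125.0, urv=150.0, low_trip_ma=2.0):
    """Convert a 4-20 mA signal back to engineering units; treat very low signal as a fault."""
    if ma <= low_trip_ma:
        return None  # low-signal alarm: sensor fault or far under range
    return lrv + (urv - lrv) * (ma - 4.0) / 16.0

print(ma_to_pv(4.0))   # 125.0 degC
print(ma_to_pv(12.0))  # 137.5 degC
print(ma_to_pv(20.0))  # 150.0 degC
print(ma_to_pv(1.5))   # None -> raise the low-signal alarm instead
```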

Always seek to understand the reasons why "We've always done it that way". There may be good reasons behind it or no good reason at all.
Similarly, validate the range limits. Lots of nasties can creep in when casual estimates of working ranges, made early in a project, are carried through into the final specifications.
Note: you can always add a second span for a coarser temperature display if needed.



JMW
 
Nicely stated, jmw.

Good on ya,

Goober Dave
 
To add a slight bit to jmw's excellent dissertation.

Most electronics tend to have some form of nonlinearity, which is often corrected via a two-point calibration. Since the nonlinearity is fixed, the closer the two calibration points are to the actual expected measurement, the smaller the nonlinearity error; i.e., if you can calibrate just outside the expected range, the nonlinearity error is minimized.


If everything was perfectly linear, then short of a resolution scale change, the calibration points wouldn't matter, but almost nothing is perfectly linear.
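
As a toy illustration (a made-up quadratic nonlinearity, not any particular sensor): fit a straight line through two calibration points and compare the residual error when the points span the full range versus when they just bracket the working range.

```python
def true_raw(pv, a=1e-4):
    """Toy sensor: reading with a small quadratic nonlinearity (illustrative only)."""
    return pv + a * pv**2

def two_point_cal(raw, p1, p2, a=1e-4):
    """Linear correction fitted at the two calibration points pv = p1 and pv = p2."""
    r1, r2 = true_raw(p1, a), true_raw(p2, a)
    return p1 + (p2 - p1) * (raw - r1) / (r2 - r1)

reading = true_raw(138.0)                    # sensor output at a true 138 degC
print(two_point_cal(reading, 0.0, 200.0))    # ~137.16 -> roughly 0.8 degC of error
print(two_point_cal(reading, 125.0, 150.0))  # ~137.98 -> only ~0.02 degC of error
```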

TTFN

FAQ731-376
 
I typically try to have the minimum value at around 20% and the maximum value at around 80% of the range of my sensor. Nonlinearities are often worse at the very top and bottom part of the sensor range. As previously mentioned, however, there is a tradeoff between resolution and span. On occasion I have used a wide range sensor for manual startup and a narrow range sensor for automatic control at the operating condition for that reason.
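
For what it's worth, a quick sketch of that 20%/80% rule of thumb applied to the numbers in the original post (my own back-of-envelope arithmetic; round the resulting ranges as convenient):

```python
def range_from_rule(pv_min, pv_max, lo_frac=0.2, hi_frac=0.8):
    """Pick a calibrated range so pv_min sits at lo_frac and pv_max at hi_frac of span."""
    span = (pv_max - pv_min) / (hi_frac - lo_frac)
    lrv = pv_min - lo_frac * span
    return lrv, lrv + span

print(range_from_rule(127, 149))  # roughly (119.7, 156.3) degC
print(range_from_rule(100, 150))  # roughly (83.3, 166.7) psig
```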

xnuke
"Live and act within the limit of your knowledge and keep expanding it to the limit of your life." Ayn Rand, Atlas Shrugged.
Please see FAQ731-376 for tips on how to make the best use of Eng-Tips.
 
Gents, thanks for the replies.
My norm/max numbers are as given by the process engineers.
At previous workplaces we had min/norm/max values, which give a better idea of the overall operating range.
With only norm/max numbers and no min value, that's my reason for ranging from zero up to whatever maximum: I'm hoping to avoid nuisance DCS alarms if/when the process sits at some minimum operating state.
Previous experience has shown that process numbers from the design stage do not always hold true after startup, so I tend to go with a broader calibration span rather than calibrating close to the stated design figures; otherwise we would be re-ranging many control loops at site because actual process conditions turn out to be outwith the specified design parameters.
 
Weegie,
that's exactly what I was referring to.

What I suppose one ought to do is span as you suggest for the installation and connection testing during the pre-commissioning stages and then optimise the span and bias settings during commissioning.

... and pigs might fly. Some of these things just never get sorted out during commissioning; one plant manager at a refinery explained to me that if they can get 95% of what they want, they are happy. The last 5% is just too much time, cost and trouble to achieve.

Of course, with today's transmitters, it is increasingly affordable to choose MODBUS/CAN BUS or any other digital signal.
Simple analogue signals are increasingly used for simple display only.
In that case it really doesn't matter, because the digital signal is used for control, data logging etc., while the analogue is there to give a plant operator something to look at (in which case it will be heavily damped, so as not to confuse, and quite unresponsive to step changes; but unless the operator is expected to act on it, so what?).

I just commissioned a sensor where all these factors came up yet again.
Quite why the client had written in such wide-ranging min/nom/max values isn't clear, but presumably, as you say, the values were derived during the original design phase.
Somewhere along the line these values ought to be revisited.

In my case the sensor output is designed to be the measured process variable input to a PID controller.
It will spend all its time at or around one set value... and if there is a "control excursion" the actual accuracy or extent of the deviation is nowhere near as critical as the fact that the product is off spec.

Of course, by whatever law it is that is invoked here, having commissioned the sensor and gone to great lengths to maximise the accuracy (this was a density sensor) the operator then takes a sample and measures the density with a hydrometer and a thermometer, and then looks up the concentration on a chart.
But that is life: it doesn't matter that you have a sensor with a calibration traceable to a national standard; commissioning is where you make the online measurement match the local lab measurement.

But the problems with on-site sampling, sample management and lab analysis are a whole other topic.

JMW
 