IN PREPARATION, let me have any comments or corrections
Measuring hydrocarbon viscosity in-line/on-line (companion piece to FAQ 698-1004)
Viscosity is one of the most difficult measurements to make in the process environment.
There are two types of process viscosity measurement:
Behavioural Measurement: the measurement of viscosity at the process temperature. This is the easiest of the measurements to make and where most process viscometers are successfully applied. Viscosity affects the way fluids behave: how they atomise, flow or coat.
Analytical Measurement: the determination of the viscosity at a reference temperature. This measurement is used to assess and control quality. This is the most difficult measurement to make.
Most refinery applications require analytical measurement.
Many processes that must use viscosity as a quality control parameter are still controlled by collecting samples for laboratory analysis. This usually requires that the processes can be managed as near steady state as possible. In some processes online viscosity is essential.
Why are there not more online measurements? Simply, it is the cost and difficulty of analytical measurement and the limited applicability of the available industry standard technology.
Hydrocarbons, at their operating temperatures, can be treated as Newtonian fluids, the simplest rheological behaviour there is. If the fluid rheology isn't the difficulty, what is?
It is the temperature.
In the process, viscosity may change due to a quality change and it may also change due to a temperature change – with some fluids the sensitivity to temperature may be higher than sensitivity to quality. This presents some serious problems. It is necessary to limit and to compensate for the effects of temperature in order to isolate the effects of quality change.
The temperature viscosity relationship, as found in ASTM D341, reveals a further dimension to the problem:
log10 log10(ν + 0.7) = A − B·log10(T + 273.15), where ν is the kinematic viscosity in cSt at temperature T °C.
This log-log relationship means that the rate of change of viscosity with temperature is not constant: it increases significantly at lower temperatures. Sensitivity to temperature error therefore grows with higher viscosities at lower temperatures, which is why we might prefer a higher reference temperature for the more viscous fluids. If we want to calculate the viscosity, small temperature and viscosity measurement errors can create large base viscosity errors, so designing indirect (calculation) methods can be very demanding. Even so, and even though this equation is not an exact representation of the behaviour, just a good practical model, it lets us consider two different ways to determine viscosity at a reference temperature:
Direct Measurement: a sample stream is temperature conditioned so that when it reaches the viscometer it is at the reference temperature.
Indirect Measurement: the viscosity at the reference temperature is calculated from the viscosity measured at some other temperature.
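The temperature sensitivity described above can be illustrated numerically by evaluating the D341 relationship. This is a sketch only; the constants "A" and "B" below are assumed values in the range of a heavy fuel oil, not measured data:

```python
import math

def d341_viscosity(temp_c, a, b):
    """Kinematic viscosity (cSt) at temp_c (deg C) from ASTM D341 constants a, b."""
    w = a - b * math.log10(temp_c + 273.15)
    return 10 ** (10 ** w) - 0.7

# Illustrative constants only (assumed, not data for any particular fuel)
A, B = 10.0, 3.83

# The slope of viscosity against temperature steepens sharply at lower temperatures
for t in (40.0, 50.0, 100.0):
    slope = d341_viscosity(t, A, B) - d341_viscosity(t + 1.0, A, B)
    print(f"{t:5.1f} degC: {d341_viscosity(t, A, B):8.1f} cSt, falling ~{slope:6.1f} cSt per degC")
```

Running this shows the viscosity change per degree at 40°C is many times larger than at 100°C, which is exactly why small temperature errors matter so much more with viscous fluids at low reference temperatures.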
Many viscometer manufacturers consider the calculation method as little more than a random number generator.
The process capillary viscosity analyser is a first generation instrument (i.e. developed from laboratory technologies and necessarily involving some compromises to enable it to work in the process environment). It is an adaptation of the capillary method in ASTM D445 and became the industry standard some 40-50 years (or more) ago.
In the laboratory, the capillary is immersed in a temperature bath with very precise temperature regulation. A sample of oil is then allowed to flow through the capillary under gravity. The time taken increases as the kinematic viscosity increases.
In the process analyser the flow rate is maintained constant by a small gear pump, and it is the headloss that increases as the dynamic viscosity increases. A density meter is necessary to derive the kinematic viscosity (density is often a required measurement in its own right, so this is not necessarily a negative feature). The conventional process capillary viscometer uses this method and provides a continuous measurement of viscosity because the flow through the capillary is continuous.

There are some process capillary viscometers that adhere more closely to the original laboratory method. A sample is grabbed from the fast loop and repeatedly processed until the temperature is stable and the result repeatable. These may have better temperature stability, be more accurate and may return the kinematic viscosity directly, but the measurement is no longer continuous but intermittent. They are generally more expensive instruments because they are more complex.
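The derivation of kinematic viscosity from the measured dynamic viscosity and density mentioned above is a simple division, but getting the units right matters. A minimal sketch with illustrative numbers:

```python
def kinematic_cst(dynamic_cp, density_kg_m3):
    """Kinematic viscosity in cSt from dynamic viscosity in cP and density in kg/m^3.
    1 cSt = 1 cP divided by density in g/cm^3, so convert the density first."""
    return dynamic_cp / (density_kg_m3 / 1000.0)

# Illustrative fuel oil reading: 342 cP at a measured density of 950 kg/m^3
print(kinematic_cst(342.0, 950.0))  # ~360 cSt
```

This is why the density meter sits alongside the capillary: without a live density value the headloss measurement can only yield dynamic viscosity.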
These analysers do have limitations. The capillary used may not be a true capillary, but it is still vulnerable to dirt and it has a limited viscosity range. They can require a lot of maintenance and consequently have a low on-stream factor. They are also expensive in themselves and expensive to install and operate. Since many are installed in analyser houses remote from the process, response times can be long and installation still more expensive, with the added complication of long sample lines requiring heat tracing and maintenance.
Quality control is a function of both the measurement accuracy and the control response. Where response times are long product quality excursions can become significant, no matter how accurately measured.
But there is much in their favour also. Not least is that for forty years or more they have been the only real way to measure viscosity in the process, and much of refinery practice is built around the advantages and limitations of this technology. They enable inline processes that would otherwise need to be batch processes: fuel oils can be inline blended instead of batch blended in a tank.
This schematic (Capillary) shows a typical arrangement for the conventional continuous flow device. The process viscosity analyser consists of the capillary in a temperature bath. The flow to the capillary first passes through a coil of tubing to allow the temperature to be brought within acceptable limits of the reference temperature. This requires that the fast loop sample line from which the measurement sample is drawn be fitted with heat exchangers that bring the fluid temperature within 5-6 degrees of the target temperature. Flow is delivered by a PD pump at around 50 ml/min. Both the pump and the capillary require protection by filters. A density meter is often also installed.
It is interesting how the various elements are all so complementary. The capillary requires a low flow rate and has a low thermal mass and, through the connections to the dP transmitter, very limited heat flow to the outside environment. The temperature bath is at its most effective with a very low flow rate, because a low sample flow rate has a low heat flow associated with it. This enables the temperature to be brought very close to the target temperature and kept very stable. Other technologies attempting direct measurement have tried to use the temperature bath without success. They are usually more massive sensors with far greater heat conduction to the external environment, and they usually demand much higher flow rates to be effective. Temperature control may prove difficult, but that is often the least of the problems.
The alternative approach might be to use a heat exchanger for temperature control. This is much better suited to the higher flow rates other technologies require, but it suffers from thermal lag; that is, it takes time to respond to a temperature change, and precise temperature control is difficult. The use of heat exchangers to deliver a direct measurement method has been proven possible, but it is a somewhat unnecessary approach given the success and flexibility of indirect methods. (Go to http://viscoanalyser.com/page34.html for details.)
So the Process Capillary Viscosity Analyser has been pretty much the only way to make analytical viscosity measurements in the refinery for a good few years.
So, what about indirect (calculation) methods? Are such systems really random number generators? While this may have been true in the past, modern technology has made the difference with micro-processors and new second generation (designed for process and not compromised lab technologies) sensor designs.
ASTM D341 gives us an equation to use but no methods (and if it did, there is no one universal method anyway).
A successful exploitation of indirect (calculation based) methods depends on:
A suitable mathematical model of temperature viscosity behaviour – from ASTM D341
A suitable sensor properly installed – digital viscometers (displacement vibrational viscometers with bandwidth measurement) that can report the kinematic viscosity, that are comprehensively calibrated, accurate and have sophisticated signal processing – e.g. Emerson's 7827 and LEMIS's DC 52 ViscoAnalytic (made possible by micro-processors).
A choice of methods that will adapt the solution to various applications and system design that accounts for the actual operating conditions and application objectives.
The ASTM D341 equation has two constants, "A" and "B", which seem to be anything but constant. Both can vary significantly even for small changes in quality. That means the values of both must be found before the viscosity can be calculated at the reference temperature; as quality changes, new values must be found. Finding "A" and "B" is straightforward if we have the viscosity of the fluid at two different temperatures. This is the basis of the dual viscometer solution.
Dual viscometer systems use two viscometers installed in series with a heat exchanger between them, so that as the fluid flows from the first viscometer to the second its temperature is changed significantly.
These systems are a well proven solution used for fuel oil blending, pipeline blending of heavy crude with distillate and asphalt/bitumen blending.
A schematic (Dual Viscometer) shows the key components of a typical system. In some applications two heat exchangers may be required, as shown. The temperatures do not need to be precisely controlled to a predetermined value and they can vary, so a heat exchanger is well suited to the application. However, the set point values should be carefully chosen to optimise accuracy and to reflect process conditions.
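The two-point solve at the heart of the dual viscometer method is simply two linear equations in the two D341 unknowns. A minimal sketch, not any manufacturer's algorithm; the temperatures and readings are hypothetical:

```python
import math

def solve_d341(t1_c, nu1_cst, t2_c, nu2_cst):
    """Solve log10(log10(nu + 0.7)) = A - B*log10(T_K) for A and B,
    given viscosities measured at two temperatures (ASTM D341 form)."""
    w1 = math.log10(math.log10(nu1_cst + 0.7))
    w2 = math.log10(math.log10(nu2_cst + 0.7))
    x1 = math.log10(t1_c + 273.15)
    x2 = math.log10(t2_c + 273.15)
    b = (w1 - w2) / (x2 - x1)
    a = w1 + b * x1
    return a, b

def viscosity_at(t_c, a, b):
    """Kinematic viscosity (cSt) at t_c (deg C) from the D341 constants."""
    return 10 ** (10 ** (a - b * math.log10(t_c + 273.15))) - 0.7

# Hypothetical readings: viscometer 1 at 120 degC reads 18 cSt,
# viscometer 2 (after the heat exchanger) at 70 degC reads 120 cSt
a, b = solve_d341(120.0, 18.0, 70.0, 120.0)
print(round(viscosity_at(50.0, a, b), 1))  # inferred viscosity at a 50 degC reference
```

The two measurement temperatures do not need to hit any particular values, which is why a heat exchanger (rather than a precision bath) is sufficient; they only need to be known and sufficiently far apart for the solve to be well conditioned.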
Dual viscometer systems can be complex, and not all processes can justify or require such a system. The ideal would be a single viscometer with as few accessories as possible. Heavy fuel oil blending is such an application. A dual system is acceptable in the refinery, but much blending takes place in smaller amounts at fuel terminals or even on fuel barges, and some operators would like to blend at the engine. What is sought is a simple calculation method that allows the use of a single viscometer. One such attempt is the Shell V50 equation. It was developed specifically for HFO but is often misused and is therefore not very satisfactory.
An alternative is the multi-curve method.
Multi-curve depends on the ratio of the viscosity of one quality to the viscosity of another quality at the same temperature being the same at all temperatures. This is generally true for fuel oils but not for lubricants. The signal processor compares the measured viscosity, at the measurement temperature, to the viscosity of each of a range of programmed reference curves to find the ratio of the sample fluid to each reference curve. It then applies these ratios to the reference curve viscosities at the reference temperature to infer the viscosity of the process fluid at the reference temperature. This usually provides very satisfactory results. It is not generally suitable for lubricants because they do not exhibit the necessary property: two different lubricants may have the same viscosity at one temperature and differing viscosities at other temperatures. (But it depends on the application; in some applications even the simplest methods may be used, even for lubricants.)
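The multi-curve logic can be sketched as follows. The reference curves here are hypothetical (A, B) pairs; a real instrument carries curves programmed from laboratory data, and its selection rule may well differ from the simple nearest-curve choice used in this sketch:

```python
import math

def d341(t_c, a, b):
    """Kinematic viscosity (cSt) at t_c (deg C) from ASTM D341 constants a, b."""
    return 10 ** (10 ** (a - b * math.log10(t_c + 273.15))) - 0.7

def multi_curve(nu_meas, t_meas, t_ref, curves):
    """Infer the viscosity at t_ref from a single measurement, assuming the
    sample-to-reference viscosity ratio holds at all temperatures
    (generally true for fuel oils, not for lubricants)."""
    # Simple selection rule: use the reference curve closest to the measurement.
    # A real signal processor might combine the ratios from several curves.
    best = min(curves, key=lambda ab: abs(d341(t_meas, *ab) - nu_meas))
    ratio = nu_meas / d341(t_meas, *best)
    return ratio * d341(t_ref, *best)

# Hypothetical programmed reference curves as (A, B) pairs -- illustrative only
curves = [(9.5, 3.62), (10.0, 3.83), (10.5, 4.05)]

# A sample reading 55 cSt at 90 degC, inferred at the 50 degC reference temperature
print(round(multi_curve(55.0, 90.0, 50.0, curves), 1))
```

Note how the method needs only one viscometer and one measurement temperature: the programmed curves supply the temperature behaviour that the dual viscometer system would otherwise have to measure.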
We do not often achieve our ideal, which would be simply to install the sensor in the side of the main pipeline. Side-of-pipe insertion viscometers have been used for HFO blending in refinery applications, but the trend in high volume production is toward dual viscometer solutions for commercial reasons, and the change in ISO 8217 from a reference temperature of 100°C to 50°C means that side-of-pipe installation is no longer suitable for heavy fuel oils. The preferred single viscometer solution is illustrated in this schematic (link): a heat exchanger is used to bring the sample temperature from the process temperature of 90-130°C to somewhere in the 50-60°C range.
There is one more basic approach, the equation method.
Equation Methods are best illustrated by referring to the spreadsheet RMI ASTM D341 rev1.xls (available from www.viscoanalyser.com/page8.html), which provides a nice illustration of the principle. The spreadsheet has some default curves preloaded where the values of "A" and "B" are calculated and shown (these values are then used to calculate the viscosity at a range of other temperatures, as also shown). It will be seen that "A" changes more noticeably than "B", and that "B" does not vary much even across the range of different qualities shown. We must be careful not to presume that these small changes in "B" are insignificant, because for fuel oils they are significant; but it leads to the supposition that for small quality changes "B" may not vary much. If, for practical purposes, we can treat "B" as known and constant, then only "A" will change significantly, and we have an equation that can be solved with a single process viscosity measurement. We find "B" by combining a process measurement with the laboratory analysis of a sample collected at the same time. Whether "B" can be treated as constant depends on the acceptable accuracy, and may also require periodic lab testing to confirm it remains within limits. Many process control applications are designed to maintain a constant quality, and in some of these the conditions are such that the Equation method is a valid solution. In application there is no evident difference between the basic installations for multi-curve or Equation methods; however, the conditions under which the Equation method can be used are usually such that heat exchangers are not needed, so the installation often simplifies.
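With "B" held fixed, the Equation method reduces to a one-unknown solve. A minimal sketch, assuming an illustrative "B" established from a paired process/laboratory measurement (all numbers hypothetical):

```python
import math

def calibrate_a(nu_cst, t_c, b_fixed):
    """Find "A" from a single process viscosity measurement, with "B" held at a
    value previously established from a paired process/lab measurement."""
    return math.log10(math.log10(nu_cst + 0.7)) + b_fixed * math.log10(t_c + 273.15)

def viscosity_at(t_c, a, b):
    """Kinematic viscosity (cSt) at t_c (deg C) from the D341 constants."""
    return 10 ** (10 ** (a - b * math.log10(t_c + 273.15))) - 0.7

B_FIXED = 3.8                           # assumed constant from calibration
a = calibrate_a(45.0, 80.0, B_FIXED)    # one live measurement: 45 cSt at 80 degC
print(round(viscosity_at(50.0, a, B_FIXED), 1))  # inferred at the 50 degC reference
```

The weakness is visible in the code: everything rests on B_FIXED staying valid, which is why the method suits processes held at near-constant quality and why periodic lab checks on "B" are prudent.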
Once the principles of indirect measurement are understood and once there is an appreciation of the factors that must be addressed to optimise the accuracy, then with the right sensor, good results can be achieved.
Note: In this article the LEMIS and Emerson bandwidth displacement viscometers have been indicated as being the most suitable for this type of measurement. Care must be taken to recognise that this does not necessarily exclude other technologies or manufacturers.
They have been singled out because they simultaneously measure both density and viscosity in the same sensor at the same conditions. They also have an extensive calibration and sophisticated software and are capable of good accuracy.
If the ASTM D341 equation were an exact description of the temperature/viscosity behaviour, and if the sensor measured without any significant error, then system design could be unconstrained, with the maximum degrees of freedom: we could measure at process conditions and confidently calculate the viscosity at any other temperature. But neither is true, and therefore these factors must be considered in system design. The degree to which sensors differ is the degree to which they are more or less constrained in system design. Consider the fuel oil blending application. Measuring at between 90°C and 130°C with a reference temperature of 100°C, there is sufficient freedom to install in the side of a large pipeline without flow or temperature regulation. But if the reference temperature changes to 50°C, the errors become highly significant and the system must be designed with much tighter constraints (as illustrated previously). This schematic shows how a multi-curve sensor might be installed in this application; the heat exchanger reduces the temperature differential.
What one should consider is that the degree to which other sensors are more or less able will be reflected in the system design constraints.
It might be helpful to consider the direct method as a degenerate form of the indirect (calculation) methods, where the allowable temperature differential (between the measurement temperature and the reference temperature) reduces to zero and temperature regulation must become increasingly precise. Thus other sensors, e.g. those that only measure dynamic viscosity and require a density signal from a separate sensor, or that are less accurate or less extensively calibrated, may also be capable of indirect measurement, but in a more degenerate system, i.e. with more constraints. Much depends on the sensor accuracy, functionality, installation conditions and the application itself.
Some manufacturers may use alternative calculations and methods. When considering any solution, the questions are always "how accurate are they?" and "how much is involved in maintaining them on-stream?"
Outside of hydrocarbons we may encounter some more extreme fluid conditions and more exotic rheological behaviour. There is also a need for analytical measurements which is often satisfied with niche market solutions and niche market sensors.
Question Asked: "Can you expand on vibro type viscometers immersed in flowing fluids? It has been mentioned that the vibration based technology is one of the most promising in the measurement of viscosity."
"....one of the most promising in the measurement of viscosity" : Modern electronics allow sophisticated signal processing and extensive computational power which has transformed many measurements.
Industry expects modern sensors to:
have no moving parts,
require little or no maintenance,
not require cleaning,
not require protection against solids,
have "smart" electronics with sophisticated software and extensive computational capability,
deliver more accurate and faster responding measurements.
It also expects that they will not require any special skills to use or maintain. They should therefore have a very high on-stream factor and justifiably be able to lay claim to such clichés as "fit and forget", "straight from the box" and so on.
But while the technology is, in principle, very well suited to the environment, many manufactured sensors are not suitable for the analytical measurements that dominate in the refinery. This is because many were designed for behavioural applications and therefore do not have sophisticated electronics and may not have much beyond a basic calibration; in behavioural applications there is no need. We can be reasonably confident that with better electronics and a more comprehensive calibration most would deliver much better accuracy than the 1-2% of span most claim.
But not all will be suitable for analytical measurements. What we do know is that density appears to be a significant factor: the successful sensors are those which measure density from the same sensor at the same time as they measure viscosity. (It is dangerous to read too much into this: all dogs have four legs, my cat has four legs, therefore my cat is a dog.) However, the problems may also lie with an inappropriate design or a lack of understanding of how to make a good analytical measurement; there are many factors which are insignificant for behavioural measurements but which assume great significance in analytical measurement. Once these factors are recognised and addressed, many more and different vibrating sensors may become available. The caveat is that the established market for behavioural viscometers is vastly greater than that for analytical measurements, and there may not be much incentive to make the transition.
However, at least some other notable manufacturers of vibrating element sensors are trying to establish themselves in analytical measurements, but their success so far appears limited. In the case of one, the issue was the need to recalibrate periodically (not the sensor, but the temperature-viscosity method it uses); if the calibration is not stable, accuracy drifts and errors in final product quality result. In part the lack of density is an issue. Some manufacturers offer the option to input a fixed density value or to input density as a 4-20 mA signal. The first option is probably where many of the accuracy problems arise, because density variations will impact the kinematic viscosity. The second adds cost, which suggests it is more cost effective to purchase a viscometer that measures density.
So at the moment the successes are with those sensors that measure density as well as viscosity.
How enduring the vibrating sensors will be is another question. There are new technologies that use ultrasonics and some that use electromagnetism. As the process viscosity market develops we may see some significant advances in various technologies.
Vibration type viscometers immersed in flowing fluids: theoretically, if the sensor does not displace the fluid (and so cannot measure density), it is not affected by flow. In reality, mechanical effects can lead to some flow sensitivity.
In theory also, those sensors which displace fluid are affected by flow. However, the extent to which they are affected depends on the sensor itself.
This sensitivity, whatever its cause, may be evident with some sensors at low viscosities. In practice it has not been necessary with either the Solartron (Emerson) or LEMIS sensors to introduce a flow correction, simply a flow limit. (Another technology protects the sensor from flow effects by using a shroud.) There are other reasons for flow limits including, where installed in the side of the pipe, vortex shedding. But for analytical measurements the opportunities for side-of-pipe installation are few. It is far more usual to install in a slip stream or bypass with a PD pump. A PD pump delivers a constant flow rate (allowing a correction, if necessary) even if viscosity varies. In one application, bitumen in the bypass slowed so much it set solid; once a PD pump was installed the problem did not recur.

Having a constant flow is an advantage where heat exchangers are used: it helps ensure that the only thing varying is the fluid input temperature. Flow is also an important part of the self-cleaning function. If the flow rate is sufficiently high, the flow will continuously "wash" the sensor clean. Coupled with a PTFE coating for dirty fluids, these sensors may never require cleaning. This is very important for another reason: good flow rates bring good heat flow, which helps limit temperature gradients (and viscosity gradients). One last factor is that vibrational sensors (and many others) are most sensitive to the fluid in the boundary layer, i.e. the layer of fluid on the surface of the sensor. It is important that this layer is continuously refreshed, and that requires a good flow rate.

Analytical measurements are more complex and expensive than behavioural measurements, but they are often very competitive with capillary methods and deliver many more advantages, including the ability to measure where capillary viscometers cannot be used, including in-tank.
An example of this is in batch reactions such as the polymerisation of Methyl Methacrylate where end point spotting is critical and where temperature and viscosity both vary significantly. Many applications are not as replacements for capillary viscometers but new applications where control previously was by taking samples for lab analysis.
The ability to control Quality within the target limits is a function of the measurement accuracy and the speed of the control response; the fast response and comparable accuracy of vibrating element sensors is another significant advantage.