First of all, thanks for your replies. I was searching on Electrical Engineering Stack Exchange, but for some reason I couldn't get proper feedback there.
This is my first question on this forum, so I hope to start collaborating more actively in this and other sections.
djs said:
The biggest issue would be the receiver or ADC. Precision and speed usually do not come together or if they do it is very expensive.
I agree with that. Nevertheless, the published specs for the
NI 9203 ±20 mA 16-bit AI DAQ module are 0.04%/0.02% gain/offset error and 24 kS/s per channel, which is quite good for my purposes at $541 (plus $820 for the chassis).
djs said:
Lowering the signal to noise ratio is just being careful with shielding and grounding. Current loops are low impedance so have a high immunity to noise. On the other hand because they are current driven, wire inductance and capacitance will affect the signal response time.
I agree as well. Let's assume here that I am following all the shielding and grounding recommendations.
djs said:
Current loops are low impedance so have a high immunity to noise. On the other hand because they are current driven, wire inductance and capacitance will affect the signal response time.
This is indeed the point I am concerned about. Do you know how I could calculate error figures from these parameters? I am really interested in obtaining an estimate from cable specs, so I can calculate a priori how much cable, and which type, I should use for a specific sensor.
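To make the question concrete, here is the kind of first-order estimate I have in mind (all cable values below are my own assumptions for a generic 18 AWG shielded twisted pair, not datasheet figures): treat the total cable capacitance across the receiver's sense resistor as a simple RC low-pass.

```python
import math

# Assumed cable parameters for a generic shielded twisted pair
# (check your actual cable's datasheet; these are illustrative only).
length_m = 200.0
c_per_m = 100e-12     # ~100 pF/m conductor-to-conductor (assumption)
r_per_m = 0.021       # ~21 ohm/km per conductor for 18 AWG (assumption)
r_receiver = 250.0    # typical 4-20 mA sense resistor

c_total = c_per_m * length_m      # total shunt capacitance
r_loop = 2 * r_per_m * length_m   # out-and-back wire resistance

# First-order model: cable capacitance across the sense resistor
# forms an RC low-pass with the receiver resistance.
f_3db = 1.0 / (2 * math.pi * r_receiver * c_total)

print(f"Total cable capacitance: {c_total * 1e9:.1f} nF")
print(f"Loop wire resistance:    {r_loop:.1f} ohm")
print(f"Estimated -3 dB bandwidth: {f_3db / 1e3:.1f} kHz")
```

Under these assumptions the bandwidth comes out in the tens of kHz, far above a 1 kHz sample rate, but I would like to know whether this simple model is even the right one for a current loop.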
danw2 said:
My last 25 years have been in the process field with 1-10Hz sampling rates, so my experience from the early '80's with DAQ at higher speeds is way out of date.
I don't think so... Industry is still full of '70s and '80s machinery.

...
danw2 said:
Given the range of AI's out there - 10 to 16 bit A/D, single ended to fully isolated, I don't know how a "typical" S:N could be stated.
Neither do I.

....
danw2 said:
My thought is that to get a system signal to noise value, you'll have to measure it. Put a solid mA signal in at the field end (can the transducers be put into a held static state at a fixed output?) and see how much jitter you get at the receiver end. The problem with using a battery powered floating calibrator or a battery and a resistor is that neither shows the inherent instability of the transducer and its power supply.
I have no idea whether it is typical or not, but the unit I was working with last week has a factory spec sheet 4-20mA accuracy spec of 0.2%FS. With a calibrator attached at the input, 4.4000mA input gave a 4.4002 or a 4.4003 reading. When I connected the calibrator at the field end, the reading was identical. There is a cal routine to zero any error out (probably the dropping resistor), but given the application, I didn't bother. Rock steady, no noise. Belden 8760. The low pressure DP sensor would jitter about plus or minus 0.0002mA at zero DP.
This is the hardest figure I've obtained in my research so far. A 0.0003 mA offset on a 16 mA span is roughly 0.002% error. May I ask what cable length you used?
Perhaps you are right, and I should just measure instead of calculating.
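Just as a quick sanity check on the figures quoted above (the mA values are copied from your post, and I am expressing everything as a percentage of the 16 mA live span):

```python
# Numbers quoted in the post above.
reading_error_ma = 4.4003 - 4.4000   # worst observed offset vs. calibrator
span_ma = 20.0 - 4.0                 # 4-20 mA live span

error_pct_span = 100 * reading_error_ma / span_ma
print(f"Offset error: {error_pct_span:.4f}% of span")   # ~0.0019%

jitter_ma = 0.0002                   # +/- jitter seen at zero DP
jitter_pct_span = 100 * jitter_ma / span_ma
print(f"Jitter: {jitter_pct_span:.4f}% of span")        # ~0.0013%
```

Both figures are comfortably inside the 0.2% FS factory spec, which matches your "rock steady" observation.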
danw2 said:
Without looking up all the capacitance values, I'll speculate that shielded twisted pair like Belden 8760 (18g) or 8762 (22g) will not have any trouble with 1Khz sample rate; similar cable is used for serious audio all the time (ignoring the market segment that values every dB of deliberate distortion . . .)
Common mode (and the normal mode noise that comes with it) might be more of an issue.
Actually, won't the 1 kHz sample rate and the common-mode rejection depend solely on the DAQ side, and not on the cable... or am I wrong?
danw2 said:
Are your transducers 2 wire, deriving power from the loop, or 3 or 4 wire which use a separate wires for DC power? If 3/4 wire then there's a reason to use 0-20mA, if available on the transducers in order to get an additional 20% range at no sacrifice (unless you need the 'live zero' for long term diagnostics/troubleshooting).
Are the transducers and their power supply 'floating' from ground? My antiquated experience was that higher speed AI's were like scope inputs - one side grounded, even the multiple inputs. That could be a problem over 200 yards with common mode between the grounds of each end. Or maybe higher speed AI's are now isolated.
I guess it would be "safer" to have a 4-wire configuration in this case. And I am from the segment that values every dB of loss.

...
danw2 said:
I'm curious about what your analog receiver is with a 1KHz sample rate; that's not a typical PLC AI.
The application is to measure sensors like the
393B04, with 6e-5% resolution and a 0.02-1700 Hz frequency range. I hope that feeds your curiosity.

...
danw2 said:
0.01% is one part in 10,000, so you'll need a minimum 14 bit (1:16,000) A/D for that kind of resolution. But it'll probably be a 16 bit A/D in the end given that the leading bit is probably a sign bit and the LSB is a 'toggler'.
National Instruments has a variety of PC based AI cards at a range resolutions and sampling rates.
You are completely right here... Do typical high-end PLC 4-20 mA AI modules have 0.1% accuracy or better, with typical sampling rates around 10 Hz? I am also looking at NI modules.
My open questions about how to size 4-20 mA loops are countless, but they can be summarized as follows:
[ul]
[li]If I have a 0.01% or better sensor, such as the one indicated above, how should I read it if the sensors are placed for monitoring, each 200 m away from a DAQ?[/li]
[li]What is the deal breaker in distance/precision? Is 4-20 mA enough?[/li]
[li]Could the cable impedance completely degrade the sensor resolution?[/li]
[li]Should I use a digital converter immediately after the sensor, move to a digital bus (Fieldbus, Profibus, etc.), and forget about 4-20 mA?[/li]
[/ul]
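As my own back-of-the-envelope on the first question (assuming a 16-bit ADC and ignoring cable effects entirely, so take it as a sketch, not an answer):

```python
# Rough resolution budget, all values as fractions of full scale.
sensor_resolution = 6e-7         # 6e-5 % FS, from the 393B04 spec above
adc_bits = 16                    # assumed converter resolution
adc_lsb = 1.0 / (2 ** adc_bits)  # one LSB as a fraction of full scale

print(f"Sensor resolution: {sensor_resolution:.1e} of FS")
print(f"16-bit ADC LSB:    {adc_lsb:.1e} of FS")
# The 16-bit LSB (~1.5e-5) is ~25x coarser than the sensor resolution,
# suggesting the ADC, not the 200 m cable, dominates this error budget.
```

If that reasoning holds, the digitizer choice matters more than the cable run, which is exactly why I am weighing 4-20 mA against digitizing at the sensor.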
The cost difference between 4-20 mA and a digital bus scheme is simply too big not to make that evaluation first.
Thanks for everything, fellows....