A historical background:
0-20 mA was used for many years, back when instruments actually needed some current to work. Moving coil recorders and analogue instruments had a 20 mA (sometimes 60 mA) F.S. input, and wire resistance was a problem when sending a voltage signal to such an instrument. Little adjustable resistors were usually delivered with those recorders and instruments so that the total series resistance could be adjusted to its nominal value.
The idea of using a very (ideally infinitely) high series resistance caught on, and since a source with an infinitely high series resistance behaves as an ideal current source, the current signal was born. First 0-20 and 0-60 mA, and later the now standard 4-20 mA signal.
The reasoning behind the infinitely high resistance was that you can add anything to infinity and still have infinity. So the varying wiring resistance wouldn't influence the signal at all. No calibration necessary.
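To put numbers on that, here is a minimal sketch of the arithmetic. The values (a 250 ohm current-loop receiver, a 10 kohm voltage input, 100 ohm of total wire resistance) are only assumed for illustration, not taken from any particular installation:

```python
# Illustrative comparison: how added wire resistance affects a current
# signal vs. a voltage signal. All values are assumed for the example only.

def current_loop_reading(i_signal_a, r_receiver_ohm, r_wire_ohm):
    """With a (near-ideal) current source, the same current flows through
    the wire and the receiver resistor, so the voltage developed across
    the receiver resistor is unaffected by the wire resistance."""
    return i_signal_a * r_receiver_ohm  # wire resistance only adds burden voltage

def voltage_signal_reading(v_signal, r_input_ohm, r_wire_ohm):
    """A voltage signal into a finite input impedance forms a divider
    with the wire resistance, so the reading drops as the wires get longer."""
    return v_signal * r_input_ohm / (r_input_ohm + r_wire_ohm)

r_wire = 100.0  # assumed total loop wire resistance in ohms
print(current_loop_reading(0.020, 250.0, r_wire))     # 5.0 V, no error from the wire
print(voltage_signal_reading(10.0, 10_000.0, r_wire)) # ~9.90 V, about 1 % low
```

The current-loop reading is unchanged because the same current flows through the wire and the receiver resistor; the extra wire resistance only costs the transmitter some compliance voltage, while the voltage signal sees the wire as part of a divider against the input impedance.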
The 20 mA current was recommended for all long-wire applications, and for good reasons. It has since earned an undeserved reputation for being less sensitive to transients and HF interference than a 0-10 V signal. That is not true. I have had 4-20 mA signals that were very sensitive to inverter interference where a 0-10 V signal (which very often has a well-filtered sending unit and receiving unit) was completely unaffected. It depends a lot on the actual hardware and application. There is no general rule saying that 4-20 mA is better than 0-10 V in "long distance" applications - only an opinion.
But old thinking lives on, and the usual opinion is that 4-20 mA is a lot better than 0-10 V (or 0-5 V). I have used all three for decades. In paper mills, steel works, power stations, anywhere. And I still cannot say that one is "better" than the other.
Gunnar Englund
--------------------------------------
100 % recycled posting: Electrons, ideas, finger-tips have been used over and over again...