Hi guys, this has made for some interesting reading, and it triggered something I noticed recently. I have about 230 VFDs (525 V, 50 Hz) in my plant, and we do our own repairs and fault finding rather than relying on supplier contracts.
Now, although we have become quite good at it, some things are still lacking. For example: measuring on the input side of a drive (no line filter, small load) with a true-RMS meter (Fluke 87III with a Fluke i200 AC current clamp), I get about 1.4 A, but measuring on the output (no filter) I get 14 A, which matches the drive's own indication.
All the drives are PWM, use 6-pulse rectifiers, and have LEM modules (Hall-effect current transducers) on the output for current sensing.
Is this difference caused by noise from the different switching frequencies on the input and output sides, or am I just using the wrong equipment?
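For what it's worth, here is the rough power-balance sanity check I ran. Everything in it except my two measured currents is an assumption on my part: the output voltage at that speed, both power factors, and the drive efficiency are guesses. The idea is that the supply only has to deliver real power, while the output current also carries the motor's magnetizing current, so a 10:1 current ratio at light load might not be a measurement error at all:

```python
# Rough sanity check: can 1.4 A in and 14 A out both be correct?
# Real power in should roughly equal real power out; the output current
# also carries the motor's reactive (magnetizing) current, which does
# not show up as extra RMS current on the supply side.
# All values except the two measured currents are my assumptions.

import math

V_IN = 525.0   # measured line-to-line supply voltage (V)
I_IN = 1.4     # measured input RMS current (A)
PF_IN = 0.6    # ASSUMED true power factor of a 6-pulse rectifier front
               # end (mostly distortion rather than phase shift)

I_OUT = 14.0   # measured output RMS current (A), matches drive display
V_OUT = 60.0   # ASSUMED drive output voltage at low speed (V/Hz scaling)
PF_OUT = 0.5   # ASSUMED motor power factor at light load
EFF = 0.95     # ASSUMED drive efficiency

p_in = math.sqrt(3) * V_IN * I_IN * PF_IN        # real power from the line
p_out = math.sqrt(3) * V_OUT * I_OUT * PF_OUT    # real power into the motor

print(f"Input real power:  {p_in:7.0f} W")
print(f"Output real power: {p_out:7.0f} W")
print(f"Ratio out/in (should be near efficiency {EFF}): {p_out / p_in:.2f}")
```

If the numbers balance roughly like this, then both readings could be correct, and the difference is mostly reactive current circulating between the motor and the DC bus capacitors rather than noise. But I would appreciate confirmation from someone who has measured this more carefully.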
Thanks in advance. I don't think buzzp expected so many responses; obviously this is a subject of much debate!