MHEA
Electrical
- Jan 1, 2011
When testing voltage transformers and their associated wiring, I normally apply about 1% of the rated primary voltage and expect a secondary voltage equal to the applied primary voltage divided by the VT ratio. On some transformers, however, I get ratio errors of up to about 10% when doing this. If I then increase the primary voltage to about 20% of the rated primary voltage, the ratio error drops to a negligible amount. It seems to me that this is tied to the design of the VT.
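For illustration, here is a minimal sketch of the effect, assuming a simplified VT equivalent circuit in which the core's magnetizing impedance is much lower at 1% excitation (the bottom of the B-H curve) than at 20%. All component values, and the helper `ratio_error_pct`, are made up for the example and do not come from any particular transformer:

```python
# Minimal sketch: how a low magnetizing impedance at low excitation
# could produce the observed ratio error. All values are assumptions.

def ratio_error_pct(vp, kn, zp, zm):
    """Percentage ratio error for a simplified, unburdened VT.

    vp : applied primary voltage (V)
    kn : rated transformation ratio (Vp_rated / Vs_rated)
    zp : primary winding impedance (ohms), treated as resistive
    zm : magnetizing impedance (ohms), treated as resistive
    """
    ie = vp / zm                # exciting current drawn by the core
    vs = (vp - ie * zp) / kn    # secondary voltage after the winding drop
    return 100.0 * (kn * vs - vp) / vp

KN = 110_000 / 110              # e.g. a 110 kV / 110 V VT (assumed)
ZP = 2_000.0                    # assumed primary winding impedance

# Assumed magnetizing impedances: much lower at 1% of rated voltage
# (low flux density, low permeability) than at 20%.
for pct, zm in [(1, 20_000.0), (20, 400_000.0)]:
    vp = 110_000 * pct / 100
    err = ratio_error_pct(vp, KN, ZP, zm)
    print(f"{pct:>3}% of rated Vp: ratio error = {err:.1f}%")
```

With these assumed numbers the model reproduces roughly a 10% error at 1% excitation and about half a percent at 20%, which is the same pattern I am seeing.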
Can someone possibly give me a technical explanation of this issue?
Many thanks