Why do 1 Phase Transformers have less Impedance than 3 phase Trans?
(OP)
If you compare the percent impedance of a three-phase transformer to that of three single-phase transformers connected as an equivalent bank, why does the single-phase bank have a smaller percent impedance?

RE: Why do 1 Phase Transformers have less Impedance than 3 phase Trans?
Smaller transformers tend to have lower impedances.
But the impedance is what it is. Connecting single-phase transformers as a three-phase bank doesn't change their percent impedances.
Think about how the impedance is determined by test: The secondary is shorted and the voltage on the primary is increased until rated current is seen on the secondary. This voltage (as a percentage) is the impedance.
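That test arithmetic can be sketched in a few lines; the voltage figures below are made-up illustrative values, not from any particular unit:

```python
# Percent impedance from a short-circuit test (illustrative values only).
rated_primary_v = 7200.0  # rated primary voltage, V (assumed)
test_primary_v = 216.0    # primary voltage that drives rated current into the shorted secondary (assumed)

pct_z = 100.0 * test_primary_v / rated_primary_v
print(f"%Z = {pct_z:.2f}%")  # the fraction of rated voltage needed to push rated current
```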
If you're comparing three single-phase pole-mounted cans with a three-phase pad-mount you are not really comparing apples to apples.
RE: Why do 1 Phase Transformers have less Impedance than 3 phase Trans?
Hehe, this isn't a specific application based question, but a question more on the theory.
I was looking at Table 1, Impedance Data for Three-Phase Transformers, in IEEE Std 242-1986 (the old Buff Book). At the bottom it had a note:
"Three phase banks with three single phase transformers may have values as low as 1.2%."
So if we were comparing "typical" transformers of equivalent kVA size, a higher percent impedance would mean the transformer has more losses (more R and XL). So why would a typical three-phase transformer have more losses than a single-phase bank with the same load rating?
As Davidbeach said: "Single phase pole mount cans generally are built to have a lower impedance than 3-phase pad mount transformers." Just a simple curiosity question on why?
RE: Why do 1 Phase Transformers have less Impedance than 3 phase Trans?
Well, I don't think it has much to do with single-phase versus three-phase. Lower impedances mean less voltage drop across the transformer as the load varies, so better voltage regulation. It may also have something to do with the physical construction of the pole-mounted core and coils versus the pad-mount.
Also, three-phase transformers tend to be higher kVA ratings where the secondary fault current becomes an issue. This is rarely the case with pole-mounted single-phase units.
If you look at single-phase pad-mount transformers, they can have very low impedances as well.
Also, higher impedances equate to higher losses only if the resistance is higher. Transformer reactance is not a source of energy loss.
RE: Why do 1 Phase Transformers have less Impedance than 3 phase Trans?
The full load losses in a transformer are mainly dependent on the current and the resistance of the transformer.
The current and the phase angle of the current are determined by the vector sum of the reactances and resistances of both the transformer and the load, including conductors.
The %impedance is significant only during fault conditions when the transformer impedance is the total impedance.
Under short-circuit conditions the greatest part of the transformer losses will be I^2R losses. We must know the X/R ratio to determine the resistance.
The "available short-circuit current" is a symmetrical current.
The actual short-circuit current is asymmetrical and is usually higher. How much higher depends on the point on the sine wave at which the fault occurred.
A piece of switchgear or a breaker with a symmetrical interrupting rating of, for example, 20,000 A will be able to withstand the actual asymmetrical current associated with an available fault current (calculated from the percent impedance) of 20,000 A.
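The step of recovering R (and X) from %Z and the X/R ratio can be sketched as follows; the 3% Z and X/R = 4 figures are assumptions for illustration:

```python
import math

z_pct = 3.0  # percent impedance (assumed)
xr = 4.0     # X/R ratio (assumed)

# Z^2 = R^2 + X^2, and X = (X/R) * R, so R = Z / sqrt(1 + (X/R)^2)
r_pct = z_pct / math.sqrt(1 + xr**2)
x_pct = r_pct * xr
print(f"%R = {r_pct:.3f}, %X = {x_pct:.3f}")
```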
Two transformers of equal KVA capacity may theoretically have the same percent impedance, but a two to one ratio of resistance. They will both supply the same available short circuit current under impedance testing, but in service, the transformer with the higher resistance will have poorer voltage regulation and greater full load losses.
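That last point can be put in numbers; the 150 kVA, 480 V, 3% Z ratings and the 2:1 resistance split below are assumptions for illustration:

```python
import math

kva, v_ll, z_pct = 150.0, 480.0, 3.0  # assumed three-phase ratings
i_rated = kva * 1000 / (math.sqrt(3) * v_ll)   # full-load line current, A
i_fault = i_rated / (z_pct / 100)              # depends on Z alone: same for both units
print(f"Available fault current: {i_fault:,.0f} A")

# Same %Z but a 2:1 ratio of %R. At full load I = 1 pu, so the
# per-unit copper loss equals R (pu), i.e. loss_kW = kVA * %R / 100.
for r_pct in (0.6, 1.2):
    print(f"%R = {r_pct}: full-load I^2R loss = {kva * r_pct / 100:.1f} kW")
```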
respectfully
RE: Why do 1 Phase Transformers have less Impedance than 3 phase Trans?
The key bit here is the 'as low as' part.
In the old days, there was less concern with available fault levels (AFL) than there is today, and the systems were generally smaller with lower primary-side AFLs.
So the xfmrs were built as cheaply as possible, without regard to AFL and little regard to efficiencies and losses.
Over time, the practices changed, and the minimum impedances for new transformers were increased.
Somewhere in there, padmounts became common, but the average polemount is older than the average padmount.
So, my present-day specs for a 50 kVA polemount and a three-phase 150 kVA padmount provide for the same worst-case AFL on the secondary side, but this only applies to the ones I am buying NOW.
I have old (polemount) xfmrs in service dating back to 1930. The impedances really are 1.15%, 1.3%, etc. My NEW ones are all 3.0% Z (minimum).
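A rough sketch of why that minimum matters, using an assumed 50 kVA single-phase unit with a 240 V secondary (illustrative values, not anyone's actual spec):

```python
# Secondary available fault current vs. percent impedance (bolted fault,
# symmetrical, ignoring source impedance upstream of the transformer).
kva = 50.0
v_sec = 240.0
i_rated = kva * 1000 / v_sec  # full-load secondary current, A

for z_pct in (1.2, 3.0):  # old low-Z unit vs. a modern 3.0% minimum
    i_fault = i_rated / (z_pct / 100)
    print(f"{z_pct}% Z -> {i_fault:,.0f} A available")
```

The 1.2% unit delivers roughly two and a half times the fault current of the 3.0% unit, which is why the minimums were raised.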
RE: Why do 1 Phase Transformers have less Impedance than 3 phase Trans?
These impedances can vary quite a bit.