Off Peak Transformer Heat Gain
(OP)
Does anyone know how to calculate the reduced heat load of a transformer operating at partial load or no load?
I need to simulate a building with multiple transformers and have been looking at how to model off-peak operation.
Please let me know what you think.
Thanks
RE: Off Peak Transformer Heat Gain
A 225 kVA transformer, for example, at 0.8 PF when fully loaded will lose something in the neighborhood of 3% of its load as heat. So 225 x 0.8 = 180 kW of real power; 3% of that is 5.4 kW of heat loss, and 5.4 kW x 3.4 Btuh/W = 18,360 Btuh. If the same transformer is normally loaded to 80% of its capacity, the heat losses would be roughly 80% of that. If off-peak loading were 20% of its rated capacity, the heat losses would be roughly 20% of that.
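The arithmetic above can be sketched as a small helper function. This is only a minimal illustration of the rough linear estimate described here (losses scaled in proportion to load fraction); the function name and parameters are hypothetical, and the 3% loss fraction and 3.4 Btuh/W conversion are the approximate figures used in this example.

```python
def transformer_heat_gain_btuh(kva, power_factor, loss_fraction, load_fraction):
    """Rough linear estimate of transformer heat rejection in Btu/h.

    kva           -- transformer nameplate rating, kVA
    power_factor  -- assumed system power factor
    loss_fraction -- fraction of real power lost as heat at full load (~0.03)
    load_fraction -- actual load as a fraction of rated capacity
    """
    real_power_w = kva * power_factor * 1000.0        # rated real power, W
    full_load_loss_w = real_power_w * loss_fraction   # heat loss at full load, W
    return full_load_loss_w * load_fraction * 3.4     # ~3.4 Btu/h per W

# The worked example from this post: 225 kVA at 0.8 PF, 3% losses, full load
print(transformer_heat_gain_btuh(225, 0.8, 0.03, 1.0))  # 18360.0

# Off-peak at 20% loading
print(transformer_heat_gain_btuh(225, 0.8, 0.03, 0.2))  # 3672.0
```

Note this treats losses as strictly proportional to load, consistent with the rough estimate above; as the caveats below point out, real results vary with load type, power factor, and power quality.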
These are rough estimates. Transformers serving heavy resistive loads will have a higher power factor and hence higher losses. Transformers serving heavy non-linear loads such as fluorescent lighting and switched-mode power supplies will have higher harmonic content that adds to heat dissipation. Transformers serving heavy motor loads will have lower power factors but higher peak loads during motor starting.
So the above is an example, but as they say, your results may vary. The real numbers depend heavily on types of loads, system power factors, power quality and load profiles.
But I hope this was of some use.
Regards,
EEJaime