Transformer Primary Cable Sizing Based on Voltage Drop
(OP)
I would like to get opinions from our experienced folks.
Under our CEC code (Rule 8-102): the maximum voltage drop from the supply point to the feeder bus, and from each branch circuit to the point of utilization, shall not exceed 3%. However, the total voltage drop from the supply point to the point of utilization shall not exceed 5%.
Our designers size the 600V primary cables running from the MCCs to downstream transformers based on a 3% limit. Consequently, the 600V incoming cable from the 4160-600V transformer feeding the MCC is sized to a 2% limit. Again, the 4160V cable on the primary of the upstream transformer is sized to a 3% limit.
My comments are: 1) The transformer is NOT utilization equipment, hence the 3% rule does not apply when sizing its primary cable. 2) Since the primary current is directly proportional to the secondary current through the transformation ratio, the primary cable voltage drop limit should be less than the secondary voltage drop limit, to leave room for the voltage drop across the transformer impedance.
Because of this, we always end up adjusting the primary taps, because the load flow studies indicate very low voltage at the secondary (especially at the 600-208Y/120V transformers). When I asked the designers and presented my argument, they told me that the client instructed them to save cable costs and that, in any case, the transformer taps can be adjusted.
I have always understood taps as providing flexibility for voltage adjustment in cases where the supply becomes weak due to future load additions. What is happening instead is that the primary cables are deliberately sized to the 3% VD limit, with the justification that the taps can be adjusted. I think something is wrong here. Can anybody shed some light on whether I'm wrong?
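The concern above can be illustrated with a rough per-unit sketch, using the first-order approximation that the percentage drops along the chain simply add. All figures below are illustrative assumptions (including the transformer regulation value), not values taken from any code or study:

```python
# Rough per-unit sketch of the cascaded voltage-drop budget described above.
# First-order approximation: percentage drops along the chain simply add.
# All figures are illustrative assumptions, not code or study values.

vd_4160_cable = 3.0    # % drop, 4160 V cable to the upstream transformer primary
vd_600_incoming = 2.0  # % drop, 600 V cable from the 4160-600 V transformer to the MCC
vd_600_primary = 3.0   # % drop, 600 V cable from the MCC to the downstream transformer
vd_xfmr_load = 2.5     # % assumed regulation across each transformer at load

total = vd_4160_cable + vd_600_incoming + vd_600_primary + 2 * vd_xfmr_load
print(f"Cumulative drop at the downstream transformer primary: {total:.1f} %")
if total > 5.0:
    print("Already exceeds the 5 % total limit before any branch-circuit drop")
```

Even ignoring the assumed transformer regulation, the three cable segments alone (3% + 2% + 3%) exceed the 5% total limit before the final transformer, which is the point being raised.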
RE: Transformer Primary Cable Sizing Based on Voltage Drop
However... I've designed a lot of these installations, and my experience has been that the transformer primary breaker for 480 V and 600 V primaries ends up being somewhat oversized in order to handle the transformer inrush. Because of this, the feeder size is generally larger than required for the actual load. I've never had a problem with voltage drop on the primary side of these transformers.
What size transformers are we talking about?
RE: Transformer Primary Cable Sizing Based on Voltage Drop
Theoretically, the taps of a transformer can be adjusted to allow for any possible voltage drop on the incoming feeders. Possible means that the cable can handle the current and heating, and that the taps have enough range to correct the drop. However, keep in mind that this is based on the assumption that the load is constant, so the resulting voltage drop is constant.
In reality, the load probably varies. The suggested voltage drop limit exists to maintain voltage regulation (near-constant voltage) from the supply to the load as it varies from no load to full load. The percentage of voltage drop equals the percentage of voltage regulation that you will see on the system.
You are right that taps can be used to compensate for voltage drop as the system is expanded and load is added. But the rule still applies: the voltage regulation of the system is set by the percentage of voltage drop. For a constant load, this is no problem. For a load center with varying loads, this may be a problem. If you design a system that starts out at the maximum voltage drop, and therefore at the maximum limit of acceptable voltage variation, then there is no room for expansion and no room for error.
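The regulation point above can be sketched numerically: if a fixed tap boost is chosen to cancel the full-load drop, the no-load voltage overshoots nominal by roughly the same percentage. The numbers are illustrative assumptions:

```python
# Sketch: a fixed tap boost sized to cancel the full-load drop causes the
# no-load voltage to overshoot nominal by about the same percentage.
# All numbers are illustrative assumptions.

nominal = 120.0       # V, nominal utilization voltage
full_load_drop = 5.0  # % total drop at full load (cables + transformer)
tap_boost = 5.0       # % fixed tap boost chosen to cancel the full-load drop

def secondary_voltage(load_fraction):
    """Voltage at the load, with drop proportional to load and a fixed tap boost."""
    drop_pct = full_load_drop * load_fraction
    return nominal * (1 + tap_boost / 100) * (1 - drop_pct / 100)

v_full = secondary_voltage(1.0)  # close to nominal, by design
v_none = secondary_voltage(0.0)  # overshoots nominal by the tap boost
print(f"Full load: {v_full:.1f} V, no load: {v_none:.1f} V")
```

So a system designed at the maximum voltage drop swings over the full acceptable voltage band between no load and full load, leaving no margin.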
RE: Transformer Primary Cable Sizing Based on Voltage Drop
A "feeder" starts at the separately derived system (in this case a transformer) or the service.
RE: Transformer Primary Cable Sizing Based on Voltage Drop
DPC, I agree with you in terms of your interpretation of the Code; I am just trying to apply engineering sense to it. I look at the transformer as an impedance inserted into the system, so the resulting secondary voltage depends on the magnitude of the primary voltage. If the cable from the transformer's 208Y/120V secondary terminals to the panelboard is sized at no more than 2% VD, to give me a 3% VD margin for sizing the cables from the panelboard to the utilization equipment, then it seems logical to design the primary cable to no more than a 2% VD limit as well, since the primary voltage drop will be reflected on the secondary. Does this make engineering sense?
In the above scenario, the same thing applies to upstream transformers.
In this way, I can reserve the taps in case electrical loading starts to expand in the future or in case the supply becomes weak.
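The reflection argument above can be sketched as follows: a drop on the primary cable carries through the turns ratio in direct proportion, so the 2% + 2% cable budget plus the transformer's own regulation determines the voltage at the panelboard. The transformer regulation figure is an illustrative assumption:

```python
# Sketch: primary-cable drop reflected through the turns ratio to the
# secondary, stacked with transformer regulation and the secondary cable.
# The 2 % transformer regulation figure is an illustrative assumption.

v_primary_nominal = 600.0
turns_ratio = 600.0 / 208.0

vd_primary_cable = 2.0    # % limit proposed for the primary cable
vd_xfmr = 2.0             # % assumed regulation across the transformer at load
vd_secondary_cable = 2.0  # % limit for the secondary cable to the panelboard

v_primary = v_primary_nominal * (1 - vd_primary_cable / 100)
v_secondary_terminals = v_primary / turns_ratio * (1 - vd_xfmr / 100)
v_panelboard = v_secondary_terminals * (1 - vd_secondary_cable / 100)

drop_total = (1 - v_panelboard / 208.0) * 100
print(f"Voltage at panelboard: {v_panelboard:.1f} V ({drop_total:.1f} % below 208 V)")
```

With these assumptions the panelboard sits just under 6% below nominal before any branch-circuit drop, which is why the remaining budget for the branch circuits, and any tap reserve, depends on holding the primary cable drop down.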
RE: Transformer Primary Cable Sizing Based on Voltage Drop
From a practical viewpoint, the voltage drop on the feeder to the transformer primary is generally not much of a problem, due to the reasons I cited previously.