Relationship of Feeder Resistance, Voltage Drop, and Power Dissipated in a Circuit
(OP)
The new California Energy Code has adopted the fine print notes of the NEC for voltage drop. I am well aware of the undesirable effects of excessive voltage drop on circuits, especially motor loads (decreased starting torque and overheating of the windings), but a coworker and I were wondering why an energy code would stipulate voltage drop requirements. I started thinking that the excess power dissipated in the wires is wasted power, and that reducing this waste in the conductors would lead to energy savings.
My coworker countered that a 120V circuit feeding a 60W bulb would draw a current of 1/2 amp, and that the resistance in the wires wouldn't change the power dissipated in the circuit.
Going along that thought process, I realized that on panel schedules we record the circuit as consuming only 60W, so I started to think maybe my original assumption was incorrect. Would the circuit actually consume 60W for the bulb, plus some additional wattage that is wasted in the lines? Or would the overall power consumption stay at 60W, with increased wire resistance shifting more of that power into the wires and less than 60W into the load? Or do we simply not account for the power dissipated in the feeders because it is marginal compared to the overall load of the devices throughout a building?
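To put rough numbers on the question, here is a quick sketch treating the bulb as a fixed resistance (a real filament's resistance changes with temperature, so this is only an approximation), with exaggerated line resistances picked purely for illustration:

```python
# 60 W bulb at 120 V modeled as a fixed resistance, fed through a
# series line resistance. Line resistance values are illustrative.

V_SOURCE = 120.0                      # service voltage, volts
R_BULB = V_SOURCE**2 / 60.0           # 240 ohm, from P = V^2 / R at rated voltage

for r_line in (0.0, 1.0, 5.0):
    i = V_SOURCE / (R_BULB + r_line)  # series circuit current
    p_bulb = i**2 * R_BULB            # power delivered to the bulb
    p_line = i**2 * r_line            # power wasted in the conductors
    print(f"R_line={r_line:4.1f} ohm  I={i:.4f} A  "
          f"bulb={p_bulb:.2f} W  line={p_line:.2f} W  total={p_bulb + p_line:.2f} W")
```

For a purely resistive load, neither of the guesses above is exactly right: adding line resistance reduces the current, so the bulb gets less than 60W, the wires dissipate some power, and the total drawn from the service also falls below 60W.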
I feel kind of foolish having an issue thinking this one through, so I appreciate the input.
Thanks all.
DJR
RE: Relationship of Feeder Resistance, Voltage Drop, and Power Dissipated in a Circuit
Bill
--------------------
"Why not the best?"
Jimmy Carter
RE: Relationship of Feeder Resistance, Voltage Drop, and Power Dissipated in a Circuit
The other designer disagreed with my first reasoning and that is why I took it to the online community.
Kinda not cool to respond to posts like that if you just want to be cute and insulting and not really try to help the other person out. I thought that type of behavior was for teenagers on AOL chat forums.
DJR
RE: Relationship of Feeder Resistance, Voltage Drop, and Power Dissipated in a Circuit
If you're talking about a simple load like an incandescent light bulb, a voltage drop (power loss) in the feeding lines makes no functional difference other than the light being dimmer. Dim enough for your eye to even notice? You'll never know. You don't buy bulbs because they are the precise wattage you need, or you'd be buying 63W and 87W bulbs, so the fact that your 60-watter may be a little dimmer is of no consequence.
If you're talking about a more complicated load like a motor, then the losses do carry power bill ramifications. Any line drop to the motor reducing its voltage will generally cause it to draw more current to keep its power addiction fed. If it needs 746W for 1HP of load and you reduce the voltage, it is going to have to increase its current draw to keep V x I the same. Note: you actually pay for the CURRENT you use, since you have no say in the voltage you get from the power company.
But to your first question: there will ALWAYS be voltage drop between the service and the loads. Ohm's law, as waross was alluding to, demands it. There are reasonable amounts to have. Too much and things like motors outright fail, and wires in tight bundles overheat. Rules came to pass that other things began to depend on, so the rules became design points that will stay with us, probably forever.
Have you priced copper wire lately? Most people will be fine with a little power loss over time versus a staggering out-of-pocket cost today. Having little or no drop, hence no resistance, can actually have a detrimental result on some devices, as they depend on that slight drop to provide protection from line disturbances that would otherwise need to be dropped entirely inside the products.
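The resistive-versus-motor distinction above can be sketched numerically. This is an idealized comparison (a motor at a fixed mechanical load approximated as a perfect constant-power device); the numbers are illustrative, not from any particular motor:

```python
# Resistive load vs. idealized constant-power load under voltage sag.
# The resistive load draws LESS current as voltage falls; the
# constant-power load draws MORE, holding V x I roughly constant.

V_NOMINAL = 120.0
P_MOTOR = 746.0                      # ~1 HP of constant mechanical demand
R_HEATER = V_NOMINAL**2 / P_MOTOR    # resistive load sized for the same power

for v in (120.0, 114.0, 108.0):      # 0%, 5%, 10% voltage drop
    i_resistive = v / R_HEATER       # Ohm's law: current falls with voltage
    i_constant_p = P_MOTOR / v       # current rises to hold V * I = P
    print(f"V={v:5.1f}  resistive I={i_resistive:.3f} A  "
          f"constant-power I={i_constant_p:.3f} A")
```

Since conductor loss goes as I squared, the constant-power load's extra current at reduced voltage makes the line losses worse, which is part of why voltage drop matters more for motors than for bulbs.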
Keith Cress
kcress - http://www.flaminsystems.com
RE: Relationship of Feeder Resistance, Voltage Drop, and Power Dissipated in a Circuit
RE: Relationship of Feeder Resistance, Voltage Drop, and Power Dissipated in a Circuit
Um, er, uh, say what? I'm not quite sure where you're going, itsmoked...
True, you have no say over the precise voltage the power company supplies you; but to say you pay for the current you use seems to me to imply that customers are billed for their current draw and not their energy usage, which I don't believe is correct. To the best of my knowledge, electrical energy consumption meters [whether of the old-style spinning-disc or new-fangled digital type] employ both voltage and current coils, so that the vectorial in-phase product of the two is integrated into what is in fact an energy pass-through measurement.
As the old saying goes, the devil is in the details.
CR
"As iron sharpens iron, so one person sharpens another." [Proverbs 27:17, NIV]
RE: Relationship of Feeder Resistance, Voltage Drop, and Power Dissipated in a Circuit
Keith Cress
kcress - http://www.flaminsystems.com
RE: Relationship of Feeder Resistance, Voltage Drop, and Power Dissipated in a Circuit
For resistance loads:
Increase the resistance in series (longer or smaller supply conductors) and you increase the total resistance, so the current drops slightly.
Less current means you are using fewer watts and paying less.
But more resistance in the supply conductors means more losses.
You are getting a smaller percentage of the power that you are paying for, due to the increased loss in the longer or smaller supply conductors.
But voltage drop limits are more about usability than about losses.
Motors may fail to start and may also be damaged by excessive voltage drop. Remember that the starting surge of a motor may be 600%. That means that when a motor is trying to start, the voltage drop may be 5 or 6 times normal.
The color balance of incandescent lamps may shift towards the red end of the spectrum.
Battery chargers may fail to charge properly.
Fluorescent lamps may fail to light.
But back to the light bulb: this is a site for electrical professionals. I expected that an electrical professional would be able to apply Ohm's law to the light bulb example and determine that as the voltage drop in the supply conductors increases, you are paying for less energy but wasting a larger percentage of the energy that you are paying for.
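A quick numeric check of that point, using the same fixed-resistance approximation of the bulb (240 ohm corresponds to 60W at 120V); the line resistances are illustrative:

```python
# With a fixed-resistance load, adding conductor resistance reduces
# the total metered power but INCREASES the fraction of that power
# lost in the conductors.

V = 120.0
R_LOAD = 240.0   # 60 W bulb at 120 V, treated as constant resistance

for r_line in (1.0, 5.0, 10.0):
    i = V / (R_LOAD + r_line)
    p_total = V * i                 # what the meter bills you for
    p_lost = i**2 * r_line          # dissipated in the conductors
    print(f"R_line={r_line:4.1f} ohm  billed={p_total:.2f} W  "
          f"lost fraction={100 * p_lost / p_total:.2f}%")
```

Note that the lost fraction works out to simply R_line / (R_load + R_line), since the same current flows through both resistances.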
Bill
--------------------
"Why not the best?"
Jimmy Carter
RE: Relationship of Feeder Resistance, Voltage Drop, and Power Dissipated in a Circuit
The light bulb example was obviously a bad example.
Davidbeach, you wrote, "..but motors or anything with a switchmode power supply or a ballast is much more likely to respond as a constant power load."
Why do SMPS act as constant power loads, the way a motor does?
RE: Relationship of Feeder Resistance, Voltage Drop, and Power Dissipated in a Circuit
RE: Relationship of Feeder Resistance, Voltage Drop, and Power Dissipated in a Circuit
Raising the source voltage too high also causes excess losses due to higher magnetic saturation in motors and transformers. Minimizing voltage drop may allow for more uniformly optimal voltage to all loads. Many utilities are experimenting with various forms of voltage/var optimization/reduction (with various degrees of success as loads move from constant impedance to constant power SMPS).
As distributed generation without voltage regulation capabilities gets added to the grid, utilities are finding the voltage along distribution feeders is much more variable and unpredictable than in the past. Reducing voltage drop on the customer premises may allow more flexibility on the utility side before customers complain about voltage or power quality issues.
Government regulation is often proposed by lobbyists or other folks without a firm grasp on the actual physical principles governing the world. Impractical things sometimes get approved.
Government regulations may have a different discount rate assumption than private investment. The immediate capital cost of oversized cables to reduce losses rarely makes economic sense at the discount rates of 7% or more common in private investment. Government regulations may use significantly lower discount rates, which would tend to place a higher value on future losses. White House Circular A-4 discusses such assumptions. https://www.whitehouse.gov/omb/circulars_a004_a-4/...
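To illustrate the discount-rate point, here is a sketch with entirely made-up inputs (100W of continuous conductor loss, $0.15/kWh, a 30-year horizon) comparing the present value of those losses at a low "government" rate versus a typical private rate:

```python
# Present value of a stream of conductor losses at two discount
# rates. All inputs are illustrative assumptions, not real data.

LOSS_KW = 0.100                          # continuous conductor loss, kW (assumed)
PRICE_PER_KWH = 0.15                     # $/kWh (assumed)
YEARS = 30
annual_cost = LOSS_KW * 8760 * PRICE_PER_KWH   # 8760 hours in a year

for rate in (0.03, 0.07):
    # Present value of an annuity: sum of annual_cost / (1 + rate)^t
    pv = sum(annual_cost / (1 + rate) ** t for t in range(1, YEARS + 1))
    print(f"rate={rate:.0%}  present value of losses = ${pv:,.2f}")
```

The lower discount rate yields a substantially larger present value for the same physical losses, which makes heavier conductors pencil out in a regulatory analysis where they would not in a private one.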
RE: Relationship of Feeder Resistance, Voltage Drop, and Power Dissipated in a Circuit
RE: Relationship of Feeder Resistance, Voltage Drop, and Power Dissipated in a Circuit
Thanks for taking the time to help clear this up for me. Sometimes I let myself get carried away and overcomplicate matters...
DJR