
Voltage Reduction for power savings

Status
Not open for further replies.

Sparky0

Electrical
Jan 11, 2011
8
Greetings -

Can you all help me brainstorm on this topic?

What issues should be considered when thinking about reducing the voltage at a substation to lower the power use for a limited time?

Meaning, could the power provider save money by dropping the voltage at its substation at peak time(s)?

How would reactive loads behave with this drop?

Am I correct that if the loads were resistive, then the power distributor would indeed drop its power consumption with a voltage drop?

Could having many reactive loads actually cause more problems?

Can this be mathematically modeled / calculated?

Has this ever been considered or studied?

Thanks
-Sparky0
 

It's been done for a long time. The utility can reduce voltage at peak loads to reduce demand charges imposed by the transmission company. Most voltage regulator and LTC controls have provisions for temporary voltage reductions.

It does tend to reduce utility company revenue while the voltage is reduced, but loads such as water heating and heat pumps will tend to equalize this by running longer.

There is a lot of data available from EPRI and other sources regarding the impact of voltage reduction on various types of loads.

This type of voltage reduction (to cut peak demand) is not the same as the current push for "Conservation Voltage Reduction" (CVR), which is a method for reducing customer power usage on a continuous basis by keeping voltages just within the acceptable voltage limits using line drop compensation and other methods.



David Castor
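(For reference, the line drop compensation mentioned above works by having the regulator or LTC control regulate an estimated load-center voltage rather than the bus voltage, roughly V_compensated = V_bus − I_load·(R_set + jX_set), where R_set and X_set are control settings that model the feeder impedance out to the load center. A CVR scheme then chooses the setpoint and compensation so that this estimated load-center voltage sits just above the lower service limit, e.g. 114 V on a 120 V base per ANSI C84.1, instead of in the middle of the band.)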
 
Benefits from voltage reduction vary according to load type (constant impedance, constant current, constant power) and usage.

Some CVR system vendors will tell you that for each % of voltage reduction you will get the same % in energy savings... it's a little optimistic!

If you have access to simulation software, just play with the voltage settings and load (P, Q values) and check the resulting power and network losses.

Daniel
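As a rough illustration of that kind of experiment, here is a minimal Python sketch assuming a simple ZIP (constant impedance / constant current / constant power) load model. The ZIP fractions, the 5 MW feeder load, the 2% loss figure, and the unity power factor are all made-up illustrative assumptions, not data from any particular system:

# Real power drawn by a ZIP load at a given per-unit voltage.
# p0_kw is the load at nominal voltage; z, i, p are the ZIP fractions
# (constant impedance, constant current, constant power) and should sum to 1.
def load_kw(v_pu, p0_kw, z=0.4, i=0.3, p=0.3):
    return p0_kw * (z * v_pu**2 + i * v_pu + p)

P0 = 5000.0          # kW of load at 1.0 pu voltage (illustrative)
LOSS0 = 0.02 * P0    # kW of series I²R feeder loss at 1.0 pu (illustrative)

for v in (1.00, 0.975, 0.95):
    p_load = load_kw(v, P0)
    # Feeder loss scales with current squared; current is approximated
    # here as p_load / v, i.e. unity power factor.
    i_pu = (p_load / P0) / v
    loss = LOSS0 * i_pu**2
    print(f"V = {v:.3f} pu  load = {p_load:7.1f} kW  feeder loss = {loss:5.1f} kW")

Swapping in different ZIP fractions (all constant impedance versus all constant power) shows the two extremes discussed in the following posts.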



 
If your loads are all incandescent lights and resistance heating, you can get great energy savings by lowering voltage. If your loads are all switch-mode power supplies (fluorescent ballasts, VFDs, computer power supplies, etc.) and induction motors you get greater losses by lowering the voltage as the kW load remains unchanged but the current necessary goes up.
 
Thermostatically controlled heating loads like ovens and water heaters will have little energy savings because they will run longer to achieve the set temperature. There may be some demand reduction, but after a while the loss of diversity will eliminate most demand reduction for heating loads.

You need to make sure that the voltage to customers on the end of the line is adequate.

Remember that any energy savings will also result in lower revenues.

Older studies indicated that for typical distribution circuits, a 1% reduction in voltage would result in a 1% reduction in kW demand.
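To put rough numbers on those two extremes (neglecting power factor and efficiency changes): with a 5% voltage reduction, a purely resistive load draws 5% less current and about 10% less power (0.95² ≈ 0.90), while a constant power load such as a loaded motor or switch-mode supply draws the same power but about 5% more current (1/0.95 ≈ 1.05).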
 
Thanks for the info.

Follow up question - why would (or what would cause) one to see no change in current flow when the voltage is dropped at the substation?

- I lowered the voltage at the substation transformer (LTC) while watching power, voltage, and current, and saw no change in the current.

Thanks again
-Sparky0
 
Follow up question - why would (or what would cause) one to see no change in current flow when the voltage is dropped at the substation?
If there is a 1% load reduction for a 1% voltage reduction, this implies a constant current type of load overall. Voltage (V) goes down, power (V·I·pf) goes down by the same amount, so current (I) must stay the same. A combination of resistive loads where power is proportional to the square of the voltage (V²/R) and constant power loads like motors could produce an overall constant current load.
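Written out: I = P/(V·pf), so if P and V both fall by 1% with the power factor unchanged, the new current is (0.99·P)/(0.99·V·pf) = P/(V·pf), exactly the original value. That is why a 1%-for-1% response looks like a constant current load in aggregate.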
 
I haven't caught up with you yet.

Can you explain further why I don't see the current change when the voltage is lowered?

What does this tell me about my load? Obviously it's not purely resistive.

What components would model this type of load so that the current does not change?

I see that power = V*I*pf.

I have lowered V. It seems to me like I should have seen the current drop some.

Since the current isn't changing, does that mean something out on my system is increasing its current to make up for the resistive components (which are lowering theirs)? For the loads that are resistive, the current will drop some - right?

Thanks for helping
-Sparky0
 
There are three types of loads: constant impedance, constant current, and constant power. I don't have any good examples of a single load that would be constant current, but they are said to exist. However, you could combine a bunch of constant impedance loads and a bunch of constant power loads in the right balance and create a composite constant current load.

The constant impedance load will draw less current as the voltage declines, while the constant power load will draw more current as the voltage declines. It would not be surprising on many circuits to see the current go up as the voltage goes down.
 
hmmmm -

that's making sense

Is there a connection to power factor when the current doesn't change, or is the power factor fixed?

Meaning, am I correct that changing the voltage at a substation will not affect the power factor? (Or does it?)

Thanks again
Sparky0
 
Depends on the load, and whether you are talking displacement power factor or total power factor.
 
Can you provide a brief explanation of which one would (or could) cause the current to remain unchanged when dropping the voltage?

I assumed power factor was set or fixed based upon the components.

Thanks
Sparky0
 
Changing the voltage at the substation will probably change the power factor, but that's not what I'm talking about with a constant current load. Assume for argument's sake that the power factor remains constant.

If the voltage goes down, the current in constant impedance loads like water heaters will go down: I = V/R (and the power, V²/R, falls even faster).

If the voltage goes down, the current in constant power loads will go up: I = P/(V·pf)

If there is the right combination of loads, the decrease in current from constant impedance loads will balance the increase in current from constant power loads. Total current may stay the same.
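A quick per-unit check of that cancellation, assuming (purely for illustration) a 50/50 split between constant impedance and constant power load at nominal voltage and a fixed power factor:

# Per-unit current of a composite load versus voltage, assuming a 50/50 split
# between constant impedance and constant power components at 1.0 pu voltage.
Z_SHARE, P_SHARE = 0.5, 0.5   # illustrative split; real feeders will differ

def total_current_pu(v_pu):
    i_z = Z_SHARE * v_pu      # constant impedance: current proportional to V
    i_p = P_SHARE / v_pu      # constant power: current proportional to 1/V
    return i_z + i_p

for v in (1.00, 0.97, 0.95):
    print(f"V = {v:.2f} pu -> I = {total_current_pu(v):.3f} pu")
# Prints roughly 1.000, 1.000 and 1.001 pu: the two effects nearly cancel.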
 
Would central heat and air units be constant power loads?

What would be examples of constant power loads for highly residential areas?

Thanks
Sparky0
 
It takes a certain amount of mechanical power to run a compressor. The motor running the compressor will provide that power. Neglecting changes in efficiency and power factor, the current will increase to provide that power if the voltage goes down.
 
So lowering the voltage on a motor or a switch-mode power supply has no local effect on power or energy, but since distribution losses increase, the demand and energy measured at the substation go up somewhat.
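As a rough per-unit illustration of that loss effect (again neglecting power factor changes): a constant power load that sees a 5% voltage reduction draws about 1/0.95 ≈ 1.05 pu current, so the I²R loss in the conductors feeding it rises to about 1.05² ≈ 1.11 pu, roughly 11% higher, even though the load's own kW is unchanged.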

Thermostatically controlled constant resistance loads simply run longer, so energy is the same and demand is lowered. Distribution losses are similar.

And lowering the voltage on an incandescent lamp has no effect on run time, so demand and energy are both lowered, as are the associated distribution losses. Light output will also be reduced. While a small change in voltage probably will not cause folks to immediately up all their bulb wattages, I am sure some will rethink it when replacement is needed. Probably just chalk it up to aging eyes when they opt for a 75 W where they used to run a 60. This long-term, staggered 15 W step increase would be buried by other fluctuations in loading. Of course these bulbs are being phased out in favor of CFLs, moving more load to the first category above.

I remain somewhat skeptical of the very long term energy and carbon benefits of CVR.



 
Hopefully by "limited time" you mean just a few hours at a time. When first lowering the voltage, constant impedance loads will drop and thermostatic loads will not change, thus likely lowering demand. As time goes on, thermostatically controlled loads will lose diversity and cause demand to rise again.

One problem we ran into trying this out long term during the last energy crunch was coking problems on LTC reversing switches. The voltage regulator settings were changed enough so that the LTC never went through Neutral, and after several months we started developing hot spots.

Be sure whether you care about lowering demand (kW) or saving energy (kWh).

Also consider the voltage profile of your feeders. Creating a flatter profile or using line drop compensation may result in savings 24x7, not just the few hours each year CVR is called upon.

Bonneville Power has been sponsoring a number of CVR projects.
 
Suppose a significant reduction in energy use is possible. How much would your typical free-enterprise business spend to reduce its sales?
 
It all depends on the marginal cost of those additional sales.

The marginal cost of the last sales might be too high. For example: if you are selling electricity at 10¢/kWh but the top kWh costs you 15¢ because you have to fire up a gas turbine, you might just want to pass... by reducing the voltage.

Energy can be very expensive at peak load.
 
What is being discussed here is not really CVR. CVR is focused on reduction in actual energy consumption by customers. This will reduce utility revenue unless rates are restructured to account for this change in philosophy.

This is not quite the same thing as reducing voltage during peak demand periods to reduce the utility's demand charge. The CVR approach would be to INCREASE the substation voltage during peak demand, not decrease.

David Castor
 