Eng-Tips Forums

Voltage reduction w/low heat...possible?

(OP)
I'm looking for ways to reduce (and regulate) a voltage source that ranges from +12.5 to +14.5 V down to an even +12.0 V, with a current range around 3-4 A.  This is a small, inexpensive item, so regulator size and cost are major concerns.  It will end up in a car and be handled from time to time, so heat is also a major concern.

Shunt regulators are out of the question due to guaranteed high heat from wasted energy in the resistors (not too concerned over the wasted energy itself, though).  Linear regulators are the current candidate, but again, the high heat involved in dissipating 10+ W doesn't make me a huge fan of those, either.  The only other option I can think of is a separate switching regulator IC, but the cost on that would more than likely be prohibitive.

Am I overlooking any options, or am I stuck choosing between these?

RE: Voltage reduction w/low heat...possible?

A "Simple Switcher" needn't cost an arm, and certainly not a leg. Have a look at National Semiconductor's site. Look for "Simple Switcher". Not expensive at all. Good luck.

RE: Voltage reduction w/low heat...possible?

(OP)
Yeah, the more I look at things though, it seems as if switching supplies really begin to shine when the input voltage is significantly different than the output voltages.  For example, a linear regulator will burn through 7.5W at Vi=14.5V  and Io=3A.  A switching regulator with 85% efficiency will drop it a tad to 6.4W.  At Vi=12.5V, the linear regulator actually wins out hands down by burning a "mere" 1.5W.
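For concreteness, here is the arithmetic behind those numbers as a quick sketch (the 85% efficiency is the figure assumed above, not a measured one, and is treated as constant over the input range):

```python
# Loss comparison from the post: linear regulator vs. an ~85%-efficient
# switcher, both delivering Vout = 12 V at Iout = 3 A.
def linear_loss(vin, vout=12.0, iout=3.0):
    """A linear regulator dissipates the full headroom times the load current."""
    return (vin - vout) * iout

def switcher_loss(vout=12.0, iout=3.0, eff=0.85):
    """Switcher loss = input power minus output power at an assumed efficiency."""
    p_out = vout * iout
    return p_out / eff - p_out

print(round(linear_loss(14.5), 2))    # 7.5 W at the top of the input range
print(round(switcher_loss(), 2))      # ~6.35 W -- only a tad better
print(round(linear_loss(12.5), 2))    # 1.5 W -- linear wins at low Vin
```

The crossover is clear: with only 0.5-2.5 V of headroom, the linear regulator's loss scales down with the input, while the switcher's loss (at fixed efficiency) does not.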

8W is completely unacceptable.  The plastic case would not be just warm, it would be extremely hot!  I need a stable voltage for a long string of LEDs so their currents can be calculated and remain constant over varying input voltages.  This is the last major hurdle in my project, and I'm now regretting putting it off until the end :(

RE: Voltage reduction w/low heat...possible?

I see your problem. And I guess that you have tried the other options, like going for more efficient LEDs. One thing that I actually did once was to PWM the current through the LEDs using a comparator that sensed the voltage drop across a resistor between the last LED in the chain and GND. The comparator switched the LED string ON/OFF. The switching frequency can be kept fairly low (controlled by filtering the sense voltage with a 2-5 millisecond time constant) to minimise switching losses (100 Hz is OK to the eye), and you do not lose more than a few hundred millivolts if you drive the transistor correctly. Using a low-resistance FET and a low switching frequency will probably be your best choice.  And you don't need any inductor!
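A rough numerical sketch of that idea (all component values here are illustrative, not from any actual circuit): the series resistor sets the on-current at the lowest expected supply voltage, and the duty cycle shrinks as the supply rises so the *average* LED current stays at the target.

```python
# Low-frequency PWM regulation sketch: D * I_on = I_avg_target.
# Values are illustrative: a ~10 V LED stack, resistor sized for
# 20 mA at the 12.5 V (engine-off) supply extreme.
def duty_for_avg(v_supply, v_led, r_series, i_avg_target):
    """Duty cycle needed so the average current equals the target."""
    i_on = (v_supply - v_led) / r_series  # current while the FET is on
    return min(1.0, i_avg_target / i_on)  # can't exceed 100% on-time

for vs in (12.5, 13.5, 14.5):
    d = duty_for_avg(vs, v_led=10.0, r_series=125.0, i_avg_target=0.020)
    print(f"{vs} V supply -> duty cycle {d:.0%}")
```

At 12.5 V the string runs at 100% duty; at 14.5 V the comparator only needs to keep it on a bit over half the time, and the excess headroom is chopped rather than burned.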

RE: Voltage reduction w/low heat...possible?

(OP)
I wish I could throw my entire design in front of a room full of engineers, but unfortunately for me I'm a lone engineer ;)

My design entails a large number of LEDs in parallel (well, each "LED" is actually 2 LEDs in series for an increased voltage drop).  The combined voltage drop of 2 LEDs can be in excess of 10V.  I have set the max current in each LED through individual resistors.  If I had a stable supply voltage everything would be golden, but a car's voltage swings from about 12.5V when off to 14.5V when the alternator is running.  With such a small voltage differential between the LED voltage drop and the supply, minor changes in the supply equal quite large changes in the LED current (the current set at 12.5V can double or more at 14.5V, a very bad thing).
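To put numbers on that sensitivity (the forward voltages below are illustrative, not the actual parts), the current ratio between the two supply extremes depends only on the headroom left over after the LED drop:

```python
# Why a 2 V supply swing is so punishing with a ~10-11 V LED stack.
def led_current(v_supply, v_f, r):
    """Current through a series LED+resistor string (Vf assumed constant)."""
    return (v_supply - v_f) / r

for v_f in (10.0, 10.5, 11.0):
    r = 100.0  # arbitrary; the ratio below is independent of R
    lo = led_current(12.5, v_f, r)
    hi = led_current(14.5, v_f, r)
    print(f"Vf={v_f} V: current rises {hi/lo:.2f}x from 12.5 V to 14.5 V")
```

With a 10 V drop the current rises 1.8x over the swing; push the combined Vf past about 10.5 V and it doubles or worse, exactly as described above.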

So, I considered putting in some sort of stable supply (but currents could change slightly as LED forward voltages varied from lot to lot).  I also considered actually increasing the supply voltage with a boost converter to increase the voltage differential between supply and LED (and thereby reducing current swings as the LED forward voltage changed from lot to lot).

I think the sensible thing to do now might be to reduce the LED series combinations to individual packages to increase the voltage differential between supply and LED, leave the supply voltage alone and let it vary a couple of volts, and count my blessings that the current won't change more than about 5 mA from one supply voltage extreme to the other.  It's a chicken's way out and does NOT make me a happy camper, but it's inexpensive and simple.

Of course, I'm always open to ideas from you guys.

RE: Voltage reduction w/low heat...possible?

(OP)
This idea has good and bad sides to it.

What if I use a boost converter to up the voltage to a stable 16V (no specific reason for 16V, other than 3 LEDs in series could max out to about 15V)?  I use a transistor in diode configuration (base connected to collector) with a resistor between the collector and supply, sized to allow a specific current through the transistor.  The base of the transistor is connected to other transistors in a current mirror configuration (all bases tied together).  This way, only one resistor is needed, the supply is stable, and each set of LEDs will have equal current through them.

Of course, the down side is power dissipation doesn't change much (assuming I'm looking at this correctly).  Skipping all of the calculations, the converter will still be around 85-90% efficient.  I trade off voltage for current and vice versa, but I still have to dissipate 6-8W of power somewhere.

RE: Voltage reduction w/low heat...possible?

Is this a one-off application?  MPJA.com has surplus HP units (in a plastic case), 9-24V input and 12V 4A out.

RE: Voltage reduction w/low heat...possible?

(OP)
This will be a several-hundred to several-thousand units a year type of project; that's why it has to be perfect the first time out.

RE: Voltage reduction w/low heat...possible?

What about the charge pumps for LEDs that are announced for laptops and PDAs?  The efficiency is good.  Maybe there's a way to translate the application into a higher current one like yours.

RE: Voltage reduction w/low heat...possible?

LED drive needs current limiting rather than voltage regulation.  A simple switcher with an output inductor sized so it never saturates and limiting the current to the appropriate level will give you exactly that.  A linear regulator will give you a lot of heat if it is designed for the voltage ranges you give.

RE: Voltage reduction w/low heat...possible?

Or just put a series diode, or two, on the input to the linear regulator.  The diodes will dissipate some of the heat, so the regulator won't have to...

RE: Voltage reduction w/low heat...possible?

(OP)
felixc, the problem with that method is expense.  I would need quite a number of those ICs (and associated components) to regulate all of the LEDs.  It would be a perfect solution if I was only using a handful, but I'm talking hundreds scattered over a wide area.


Brian, The voltage regulation was not directly for the LEDs, but a method to allow me to control the current (within certain bounds).  I know the Vf for LEDs will vary from piece to piece, but the minor voltage changes there would vary my current by a few percent, quite acceptable.  I've come to the conclusion that a linear regulator will not work for this situation, due not only to the heat generated, but also to the low voltage drop required in the regulator at certain supply voltages.


melone, Diodes or not, I'm still left with 8W of dissipated power to dump into the atmosphere.  They would also lower my available supply headroom in the regulator, so even with an LDO regulator I wouldn't have enough leeway to stay in regulation.


I'll be running some tests in the next few days to determine if I can handle the brightness drop from one supply voltage to the other.  No other manufacturer I've found so far regulates the voltage, but they also have some advantages I do not, which allows them to get away with it.

RE: Voltage reduction w/low heat...possible?

I just realised that you don’t have to use a regulator that handles the range between 12.0V and 14.4V. The battery is either being charged or it isn't. In other words, the engine is either running or it isn’t. You could therefore use a MOSFET to switch a resistor out of circuit when the low level voltage is encountered.

RE: Voltage reduction w/low heat...possible?

LM2642 or LM5642 might work for you. Almost everybody has some kind of buck chip. You may want to contact an apps engineer at one of the many chip manufacturers, i.e. TI, National, Fairchild, International Rectifier, or Analog Devices, etc. For the range you want you should be able to get 95% efficiency or more. I just don't know if, at 12.5V in, you will have enough overhead for 12V out. You may try splitting the power between two Simple Switchers. This should get the efficiency up and spread out the heat.

Good luck!

RE: Voltage reduction w/low heat...possible?

You might try using the FET that is pulsing the LEDs and change the pulse width dependent on the voltage to maintain the brightness level.

RE: Voltage reduction w/low heat...possible?

The solution to this may be simpler than we are all thinking.  Some more information may be helpful here in diagnosing a correct fix for your problem, but I will throw out some ideas.

First off, if your goal is to be as efficient as possible, you may want to look the other way from where you are going.  That is to say, you don't have to match the voltage drop of the LEDs exactly to power them; you just have to be close.  For instance, if you are using 5V LEDs, as you have hinted, you can try using 3 in series without a resistor at all.  The circuit should stay safe until you get over about 14.7 volts.  Throw some stuff on a breadboard and see what kind of current you are pulling.  If you stay below the max current, you should be safe without any resistors/PS.  You will get lower brightness per LED at the lower voltages, but you will be making up for this with more LEDs.

Secondly, you may want to look into other LEDs.  You may be able to get an LED that uses a different technology and has a lower minimum voltage requirement.  If you can get the LEDs down to, say, 4V each, your resistor will be bleeding off 4-6 volts.  You can tune the system to give 22 mA at max (assuming 20 mA LEDs), which will decrease the life only slightly (10%).  At this level you will still be able to pull approx 15 mA at the lowest voltages you are likely to encounter.  This will not produce a profound difference in brightness (20% or so, depending on the LED).

Both of these suggestions assume you are not using the system for headlights, as you may run into certification problems with LEDs there; they will not run as low as the car will (7 V or so before you no longer get ignition) and could pose safety problems.

RE: Voltage reduction w/low heat...possible?

(OP)
pezas,

Very astute observation skills you have there.  I am indeed limiting myself to two LEDs and a resistor due to the large Vf of the LEDs.  With such a large drop across the LEDs, I have very little voltage across the resistor.  For argument's sake, let's assume the supply will range from 12.5-14.5V and the LEDs grab 10V for themselves.  This leaves a resistor dropping anywhere from 2.5-4.5V.  If I specify a resistor for 20mA at 14.5V (225 ohms), current drops to 11mA at 12.5V, a drop of nearly 50%...not insignificant, for sure.  I could mitigate the percentage variation by moving to a single LED, but the current requirements would double (again, not insignificant).  I'm almost damned if I do, damned if I don't.
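The arithmetic above, spelled out (using the same assumed 10V combined LED drop and 20mA target from the paragraph):

```python
# Resistor sized for 20 mA at the 14.5 V extreme with a 10 V LED drop,
# then re-evaluated at the 12.5 V extreme.
v_led = 10.0
r = (14.5 - v_led) / 0.020       # resistor value: 225 ohms
i_low = (12.5 - v_led) / r       # current at the 12.5 V extreme

print(r)                          # 225.0
print(round(i_low * 1000, 1))     # ~11.1 mA: a ~44% drop from 20 mA
```

Sizing at the other extreme instead (20 mA at 12.5 V) flips the problem: the current would rise to 36 mA at 14.5 V, which is why neither end of the range is a comfortable design point.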

While driving LEDs strictly with voltage control may have merit in some applications, I think it could go one of two ways...either 1) dangerous for the LED as a higher voltage allows beyond-spec currents to flow, or 2) dismal performance as a too-low voltage makes the LEDs glow too dim.  Finding a middle ground would be quite difficult for the reasons mentioned in my last paragraph.  Even if I could guarantee a slightly tighter supply range, finding LEDs with just the right Vf to get within that range would be an accomplishment unto itself.

I'm getting the feeling I may have to bite the bullet and accept a fairly wide range of brightnesses based upon whether or not the car is running.  I do have a few minor items to my advantage.  First, since LEDs are square-law devices, dropping the current by 50% doesn't mean I lose half of the brightness (that's what, uh, a 25% drop in brightness, give or take?).  Second, as the current drops, the Vf of the LEDs drops slightly, giving a bit more headroom to the resistor...so, as the supply voltage drops the 2V or so, it may only manifest itself as a 30% drop in actual current compared to the first-order calculation of 50% (I'm pulling rough figures out of the air).  Both combined, I may see a brightness loss of 15-20% <crossing fingers>.  As long as this brightness level remains the same for a long period of time (no pulsing or fluctuation), this may be acceptable.

Does that sound like a fair assessment?

RE: Voltage reduction w/low heat...possible?

(OP)
I had to test my theory for peace of mind, so I hooked up several LEDs (to also check lot-to-lot differences).  While there was a noticeable difference in brightness between 100% and 50% current, I don't believe the difference is significant enough to warrant further worries...you would notice a drop in brightness as the car is turned off, but the brightness should still be substantial.  Of course, I'll still think on it from time to time, but I don't consider it enough of a problem to squander my precious resources on.  See my further replies below for why.

The following is also for future reference and newbies...

According to the charts, the LED is rated at Vf=4.6V at 35mA.  I measured 4.20V @ 30mA and 3.53V @ 15mA.  Now, I changed from a supply voltage of 5.91V to 4.25V (from another viewpoint, the voltage across the resistor went from 1.71V to 0.72V).  The supply voltage changed 1.66V, but the voltage across the current-controlling resistor only changed by 0.99V due to the LED's Vf dropping with decreased current.  A first-order approximation then says a change in supply voltage only causes a 60% change in current (I'm neglecting the non-linear nature of changes in Vf versus If).
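Putting the bench figures above into one ratio makes the compensation effect explicit:

```python
# Bench measurements from the post: the LED's Vf droop absorbs part of
# the supply swing, so the resistor (and thus the current) sees less of it.
dv_supply = 5.91 - 4.25           # 1.66 V change at the supply
dv_resistor = 1.71 - 0.72         # 0.99 V change across the resistor

ratio = dv_resistor / dv_supply
print(round(ratio, 2))            # ~0.60: only ~60% of the supply swing
                                  # shows up as a change in resistor voltage
```

In other words, the first-order current sensitivity is roughly 0.6x what a fixed-Vf calculation would predict, which is where the "60%" figure above comes from.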

That being the case, and going back to actual component values (nominal specs), the current through my production version will probably only drop about 20% between when the car is running and when it's off (which begins to line up with the first-order approximation/guesstimate I made in my previous post).  If a 50% current drop shows a mostly insignificant change in brightness (as I've proven to myself here on the lab bench), I think the change in brightness for a 20% current drop will be all but unnoticeable.

RE: Voltage reduction w/low heat...possible?

Just for grins, if you are using 4.6V LEDs, try hooking 3 of them up without a resistor and measure the current when the supply voltage is between 12-15V.  Your brightness should be based on this current.  You may find that you don't need a resistor at all (or possibly 10 ohms or something like that if you are really worried).  Don't worry about LED life if you are staying below the rated current of the LED.

RE: Voltage reduction w/low heat...possible?

'Scuse me for asking, but what colour are the leds?

4.6V forward voltage seems pretty high. Are they blue?

rgds
Zeit.

RE: Voltage reduction w/low heat...possible?

(OP)
zeit, Yep, blue.


pezas, I'm very hesitant to put little to no current limiting in the string itself.  Should the user install the system in a vehicle that has a higher than average voltage with the alternator, my poor little LEDs would be toast in a very short time.

RE: Voltage reduction w/low heat...possible?

I have an alternate simple circuit to regulate the current to LEDs instead of the voltage on my web page:
http://www.inventgineering.com/LEDdriver.htm
The problem with trying to hold a fixed voltage is that LEDs are current devices with a more or less constant voltage drop.  This voltage changes with temperature, so a constant voltage source to the LEDs does not ensure a constant current!  The circuit I have uses the LEDs themselves to regulate, and what is left are a few transistors and resistors.  You can parallel strings of LEDs, each with its own transistor and emitter resistor, to ensure constant current.  Typical automotive spikes and surges will not be a problem with this circuit.  Heat dissipation is spread among several transistors and resistors.
 

RE: Voltage reduction w/low heat...possible?

(OP)
Dave,

I've seen this circuit before (or one similar to it), but unfortunately it will not work in my application for several reasons.  I do not need absolute current control, and a fluctuation of a few mA here and there isn't a problem.

The voltage drop of an LED is significantly larger with changes in current when compared with current changes versus temperature, particularly near the max operating point of the LED.  In at least one manner the LED is self-regulating (to an extent)...when connected in series with a resistor, the forward voltage drops as the current level drops, thereby allowing more current to flow.  An equilibrium is reached that, while not a scheme one would want to use on an LED-based display board, makes the setup reasonably decent.

RE: Voltage reduction w/low heat...possible?

LED brightness changes with current, but not linearly.  Are you sure you need regulated power at all?  Why not just run the LEDs with their resistors off the raw voltage and be done with it?  The variation between 12.5V and 14.5V is not all that great percentage-wise anyway.  However, dropping 12.5V to 12V is a stretch, as you don't really have enough time between pulses to properly charge the inductor in a switching supply.
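That last headroom concern can be quantified with the ideal buck relation D = Vout/Vin (a sketch that ignores losses, which push the required duty cycle even higher in practice):

```python
# Duty cycle an ideal buck converter needs for a given conversion ratio.
def buck_duty(v_in, v_out):
    """Ideal (lossless) buck duty cycle: D = Vout / Vin."""
    return v_out / v_in

print(f"{buck_duty(14.5, 12.0):.0%}")  # ~83% at the high supply extreme
print(f"{buck_duty(12.5, 12.0):.0%}")  # 96% at the low extreme: very
                                       # little off-time left per cycle
```

Many buck controllers specify a maximum duty cycle in the 90-something-percent range, so a 96% ideal requirement leaves essentially no margin once diode/FET and inductor drops are added.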

RE: Voltage reduction w/low heat...possible?

(OP)
Wow, this is an older thread... ;)

14.5V down to 12.5V is only about a 14% change when your reference point is 0V, but when the LEDs claim 10V of that for themselves, the voltage left for the resistor falls from 4.5V to 2.5V, and the current drops to roughly 56% of what it was.  THAT'S a significant difference in brightness...
