Voltage reduction w/low heat...possible?
(OP)
I'm looking for ways to reduce (and regulate) a voltage source from +12.5-14.5 V down to an even +12.0 V, with a current range around 3-4 A. This is a small, inexpensive item, so regulator size and cost are major concerns. It will end up in a car and be handled from time to time, so heat is also a major concern.
Shunt regulators are out of the question due to guaranteed high heat from wasted energy in the resistors (not that I'm too concerned over the wasted energy, though). Linear regulators are the current candidate, but again, the high heat levels involved in dropping 10+ W don't make me a huge fan of those, either. The only other option I can think of is a separate switching regulator IC, but the cost of that would more than likely be prohibitive.
Am I overlooking any options, or am I stuck choosing between these?
RE: Voltage reduction w/low heat...possible?
8 W is completely unacceptable. The plastic case would not be just warm; it would be extremely hot! I need a stable voltage for a long string of LEDs so their currents can be calculated and remain constant over varying input voltages. This is the last major hurdle in my project, and I'm now regretting putting it off until the end :(
RE: Voltage reduction w/low heat...possible?
http://www.maxim-ic.com/quick_view2.cfm/qv_pk/2166
or
http://www.monolithicpower.com/mp1517rev1.1f.pdf
There are several other chips and circuits around doing just this.
Usually they do voltage step-up, though.
RE: Voltage reduction w/low heat...possible?
My design entails a large number of LEDs in parallel (well, each "LED" is actually 2 LEDs in series for an increased voltage drop). The combined voltage drop of 2 LEDs can be in excess of 10V. I have set the max current in each LED through individual resistors. If I had a stable supply voltage everything would be golden, but a car's voltage swings from about 12.5V when off to 14.5V when the alternator is running. With such a small voltage differential between the LED voltage drop and the supply, minor changes in the supply equal quite large changes in the LED current (the current set at 12.5V can double or more at 14.5V, a very bad thing).
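A quick numeric sketch of that swing (illustrative values: an assumed 10 V combined Vf for the series pair and a resistor sized for 20 mA at the engine-off voltage):

```python
def led_current(v_supply, v_led, r):
    """First-order series-resistor LED current, treating Vf as fixed."""
    return (v_supply - v_led) / r

V_LED = 10.0                     # assumed combined Vf of the two series LEDs
R = (12.5 - V_LED) / 0.020       # sized for 20 mA with the engine off: 125 ohm

i_off = led_current(12.5, V_LED, R)   # 0.020 A with the engine off
i_on = led_current(14.5, V_LED, R)    # 0.036 A with the alternator running
print(f"{i_off * 1000:.0f} mA -> {i_on * 1000:.0f} mA")
```

With only 2.5 V of headroom at the engine-off voltage, the 2 V supply swing nearly doubles the current, consistent with the behavior described above.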
So, I considered putting in some sort of stable supply (but currents could change slightly as LED forward voltages varied from lot to lot). I also considered actually increasing the supply voltage with a boost converter to increase the voltage differential between supply and LED (and thereby reducing current swings as the LED forward voltage changed from lot to lot).
I think the sensible thing to do now might be to reduce the LED series combinations to individual packages to increase the voltage differential between supply and LED, leave the supply voltage alone and let it vary a couple of volts, and count my blessings that the current won't change more than about 5 mA from one supply voltage extreme to the other. It's a chicken's way out and does NOT make me a happy camper, but it's inexpensive and simple.
Of course, I'm always open to ideas from you guys.
RE: Voltage reduction w/low heat...possible?
What if I use a boost converter to up the voltage to a stable 16V (no specific reason for 16V, other than 3 LEDs in series could max out to about 15V)? I use a transistor in diode configuration (base connected to collector) with a resistor between the collector and supply, sized to allow a specific current through the transistor. The base of the transistor is connected to other transistors in a current mirror configuration (all bases tied together). This way, only one resistor is needed, the supply is stable, and each set of LEDs will have equal current through them.
Of course, the downside is that power dissipation doesn't change much (assuming I'm looking at this correctly). Skipping all of the calculations, the converter will still be around 85-90% efficient. I trade off voltage for current and vice versa, but I still have to dissipate 6-8 W of power somewhere.
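As a rough check on that trade-off, the converter's own loss can be estimated from the 3-4 A at 12 V load mentioned in the opening post, at the 85-90% efficiencies assumed above:

```python
def converter_loss(p_out, efficiency):
    """Power dissipated inside a converter delivering p_out watts."""
    return p_out * (1.0 / efficiency - 1.0)

for amps in (3.0, 4.0):
    p_out = 12.0 * amps
    best = converter_loss(p_out, 0.90)
    worst = converter_loss(p_out, 0.85)
    print(f"{p_out:.0f} W load: {best:.1f}-{worst:.1f} W lost in the converter")
```

That lands the converter's dissipation in the roughly 4-8.5 W range, broadly consistent with the 6-8 W figure above.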
RE: Voltage reduction w/low heat...possible?
Brian, The voltage regulation was not directly for the LEDs, but a method to allow me to control the current (within certain bounds). I know the Vf for LEDs will vary from piece to piece, but the minor voltage changes there would vary my current by a few percent, quite acceptable. I've come to the conclusion that a linear regulator will not work for this situation due not only to the heat generated, but also to the low voltage drop required in the regulator at certain supply voltages.
melone, Diodes or not, I'm still left with 8 W of dissipated power to dump into the atmosphere. They would also lower my available supply headroom in the regulator, so even with an LDO regulator I wouldn't have enough leeway to stay in regulation.
I'll be running some tests in the next few days to determine if I can handle the brightness drop from one supply voltage to the other. No other manufacturer I've found so far regulates the voltage, but they also have some advantages I do not, which allow them to get away with it.
RE: Voltage reduction w/low heat...possible?
Almost everybody has some kind of buck chip. You may want to contact an apps engineer at one of the many chip manufacturers, i.e. TI, National, Fairchild, International Rectifier, or Analog Devices, etc. For the range you want you should be able to get 95% efficiency or more. I just don't know if at 12.5 V in you will have enough overhead for 12 V out. You may try splitting the power between two simple switchers. This should get the efficiency up and spread out the heat.
Good luck!
RE: Voltage reduction w/low heat...possible?
and change the pulse width depending on the voltage to maintain the brightness level.
RE: Voltage reduction w/low heat...possible?
First off, if your goal is to be as efficient as possible, you may want to look the other way from where you are going. That is to say, you don't have to reach the voltage drop of the LED to power it; you just have to be close. For instance, if you are using 5 V LEDs, as you have hinted, you can try using 3 in series without a resistor at all. The circuit should stay safe until you get over about 14.7 volts. Throw some stuff on a breadboard and see what kind of current you are pulling. If you stay below the max current you should be safe without any resistors/PS. You will get lower brightness per LED at the lower voltages, but you will make up for this with more LEDs.
Secondly, you may want to look into other LEDs. You may be able to get an LED that uses a different technology and has a lower minimum voltage requirement. If you can get the LEDs down to, say, 4 V each, your resistor will be bleeding off 4-6 volts. You can tune the system to give 22 mA at max (assuming 20 mA LEDs), which will decrease the life only slightly (10%). At this level you will still be able to pull approx 15 mA at the lowest voltages you are likely to encounter. This will not produce a profound difference in brightness (20% or so, depending on the LED).
Both of these suggestions assume you are not using the system for headlights, as you may run into certification problems with LEDs there; they will not run as low as the car will (7 V or so before you no longer get ignition) and could pose safety problems.
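The 4 V LED arithmetic above can be checked with a short sketch (assuming two such LEDs per string and the 12.5-14.5 V automotive range discussed earlier in the thread):

```python
V_LED_PAIR = 8.0          # two hypothetical 4 V LEDs in series
I_MAX = 0.022             # 22 mA allowed at the highest supply voltage
V_HIGH, V_LOW = 14.5, 12.5

r = (V_HIGH - V_LED_PAIR) / I_MAX    # resistor value, ~295 ohm
i_low = (V_LOW - V_LED_PAIR) / r     # current with the engine off
print(f"R = {r:.0f} ohm, I at {V_LOW} V = {i_low * 1000:.1f} mA")
```

The low end comes out near 15 mA, matching the approximation above.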
RE: Voltage reduction w/low heat...possible?
Very astute observation skills you have there. I am indeed limiting myself to two LEDs and a resistor due to the large Vf of the LEDs. With such a large drop across the LEDs, I have very little voltage across the resistor. For argument's sake, let's assume the supply will range from 12.5-14.5V and the LEDs grab 10V for themselves. This leaves a resistor dropping anywhere from 2.5-4.5V. If I specify a resistor for 20mA at 14.5V (225 ohms), current drops to 11mA at 12.5V, a drop of nearly 50%...not insignificant, for sure. I could mitigate the percentage variation by moving to a single LED, but the current requirements would double (again, not insignificant). I'm almost damned if I do, damned if I don't.
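For reference, the numbers in that example work out as follows (first order, with the combined Vf held fixed at 10 V):

```python
V_LED = 10.0                  # assumed combined Vf of the two series LEDs
R = (14.5 - V_LED) / 0.020    # sized for 20 mA at 14.5 V: 225 ohm

i_low = (12.5 - V_LED) / R    # ~11 mA with the engine off
drop = 1.0 - i_low / 0.020    # ~44% reduction in current
print(f"R = {R:.0f} ohm, {i_low * 1000:.1f} mA, {drop:.0%} drop")
```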
While driving LEDs strictly with voltage control may have merit in some applications, I think it could go one of two ways: either 1) dangerous for the LED, as a higher voltage allows beyond-spec currents to flow, or 2) dismal performance, as a too-low voltage makes the LEDs glow too dimly. Finding a middle ground would be quite difficult for the reasons mentioned in my last paragraph. Even if I could guarantee a slightly tighter supply range, finding LEDs with just the right Vf to get within that range would be an accomplishment unto itself.
I'm getting the feeling I may have to bite the bullet and accept a fairly wide range of brightnesses based upon whether or not the car is running. I do have a few minor items to my advantage. First, since LEDs are square-law devices, dropping the current by 50% doesn't mean I lose half of the brightness (that's what, uh, a 25% drop in brightness, give or take?). Second, as the current drops, the Vf of the LEDs drops slightly, giving a bit more headroom to the resistor...so, as the supply voltage drops the 2 V or so, it may only manifest itself as a 30% drop in actual current compared to the first-order calculation of 50% (I'm pulling rough figures out of the air). Both combined, I may see a brightness loss of 15-20% <crossing fingers>. As long as this brightness level remains the same for a long period of time (no pulsing or fluctuation), this may be acceptable.
Does that sound like a fair assessment?
RE: Voltage reduction w/low heat...possible?
The following is also for future reference and newbies...
According to the charts, the LED is rated at Vf=4.6V at 35mA. I measured 4.20V @ 30mA and 3.53V @ 15mA. Now, I changed from a supply voltage of 5.91V to 4.25V (from another viewpoint, the voltage across the resistor went from 1.71V to 0.72V). The supply voltage changed 1.66V, but the voltage across the current-controlling resistor only changed by 0.99V due to the LED's Vf dropping with decreased current. A first-order approximation then says a change in supply voltage only causes a 60% change in current (I'm neglecting the non-linear nature of changes in Vf versus If).
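Those bench measurements reduce to the 60% figure like this (values copied from the measurements above):

```python
# Measured operating points: (supply voltage, LED forward voltage)
v_sup_hi, vf_hi = 5.91, 4.20      # at 30 mA
v_sup_lo, vf_lo = 4.25, 3.53      # at 15 mA

dv_supply = v_sup_hi - v_sup_lo                        # 1.66 V supply change
dv_resistor = (v_sup_hi - vf_hi) - (v_sup_lo - vf_lo)  # 0.99 V resistor change
fraction = dv_resistor / dv_supply                     # ~0.60
print(f"{fraction:.0%} of the supply change appears across the resistor")
```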
That being the case, and going back to actual component values (nominal specs), the current through my production version will probably only drop about 20% between when the car is running and when it's off (which begins to line up with the first-order approximation/guesstimate I made in my previous post). If a 50% current drop shows a mostly insignificant change in brightness (as I've proven to myself here on the lab bench), I think the change in brightness for a 20% current drop will be all but unnoticeable.
RE: Voltage reduction w/low heat...possible?
4.6V forward voltage seems pretty high. Are they blue?
rgds
Zeit.
RE: Voltage reduction w/low heat...possible?
pezas, I'm very hesitant to put little to no current limiting in the string itself. Should the user install the system in a vehicle that has a higher than average voltage with the alternator, my poor little LEDs would be toast in a very short time.
RE: Voltage reduction w/low heat...possible?
http://www.inventgineering.com/LEDdriver.htm
The problem with trying to have a fixed voltage is that LEDs are current devices with a more or less constant voltage drop. This voltage changes with temperature, so a constant voltage source to the LEDs does not ensure a constant current! The circuit I have uses the LEDs themselves to regulate, and what is left are a few transistors and resistors. You can parallel strings of LEDs, each with its own transistor and emitter resistor, to ensure constant current. Typical automotive spikes and surges will not be a problem with this circuit. Heat dissipation is spread among several transistors and resistors.
RE: Voltage reduction w/low heat...possible?
I've seen this circuit before (or one similar to it), but unfortunately it will not work in my application for several reasons. I do not need absolute current control, and a fluctuation of a few mA here and there isn't a problem.
The voltage drop of an LED changes significantly more with current than it does with temperature, particularly near the max operating point of the LED. In at least one manner the LED is self-regulating (to an extent)...when connected in series with a resistor, the forward voltage drops as the current level drops, which leaves more voltage across the resistor and pushes the current back up. An equilibrium is reached that, while not a scheme one would want to use on an LED-based display board, makes the setup reasonably decent.
RE: Voltage reduction w/low heat...possible?
14.5 V to 12.5 V is not a large percentage change when your reference point is 0 V, but when your reference point is 10 V, the same 2 V swing jumps from a 14% change to roughly a 44% change. THAT'S a significant difference in brightness...
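Working the percentages for the same 14.5 V to 12.5 V swing, with the series LEDs taking an assumed 10 V (note the headroom figure comes out close to 44%):

```python
v_hi, v_lo, v_led = 14.5, 12.5, 10.0

swing = v_hi - v_lo                    # 2 V supply swing
vs_ground = swing / v_hi               # ~14% of the full supply
vs_headroom = swing / (v_hi - v_led)   # ~44% of the resistor's headroom
print(f"{vs_ground:.0%} vs ground, {vs_headroom:.0%} vs headroom")
```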