How to match a voltage source to a 3W star LED?


rgbbv

New Member
Hello,

I read on a 3W star LED datasheet that at If = 700mA the minimum Vf is 3.4V and the maximum Vf is 4.9V. Does that mean I can drive the LED directly (direct drive) from a voltage source of 3.4V and increase it up to 4.9V without any issue?
How much does that affect the brightness of the LED and the current?

Thanks in advance
 

Welcome, rgbbv!

If I read your post correctly, yes, you can drive that LED directly (with an appropriate current limiting resistor). I would expect the brightness (and the current) to increase as you increase the voltage.

And although you can probably exceed the listed 4.9V max, it would greatly reduce the life of the LED.
 
Don't ever run an LED without a resistor (or other current limiting circuit) between the voltage source and the LED!
 
Thanks for your replies.

My question was whether I can run that LED without any resistor (or other limiting circuit).
I see that you said not to do that, but why?
If I run the LED at a steady 3.4VDC, or increase the voltage to any value up to 4.9VDC, it seems to be within the specification. The LED will get a voltage that is within spec, and the current should settle according to the voltage value, shouldn't it?

And by the way, why does the datasheet give a max Vf (4.9V) and a min Vf (3.4V) for one specific If (700mA)? Isn't there a specific If for every Vf on the LED?
 
This is why it's common practice to run powerful LEDs like yours from a constant current power supply. This means the current is set at a specific value (like your If of 700mA) and the voltage will be whatever the LED needs it to be at that current - in your case somewhere between the min and max limits you describe.

The problem with fixed voltage supplies is that LEDs, like any other junction semiconductor, have a specific voltage at which they start to conduct: this is Vf. For example, for silicon devices it's usually around 0.65V, and for germanium it's around 0.2V I believe. LEDs are made of different materials depending on what colour they are, so they have different Vf values, and the supply needs to be close to that voltage. Since it's not normal to have power supplies at LED supply voltages, a current limiting resistor in series with the LED is used to set the voltage and current.

If you connect a voltage significantly higher than Vf without a current limiting resistor, the current will rise as high as the power supply will allow, the limiting factor being the power supply's internal resistance. In the worst case the LED lets out its magic blue smoke, without which it won't work (possibly accompanied by flames).

That said, doing what you describe - connecting a supply variable between min and max Vf to vary the brightness - is probably okay, though I think you will find the range which is actually useful is smaller than that, and you should have at least a small current limiting resistor!

To work out the value of your current limiting resistor, subtract the typical Vf for the LED from your supply voltage, and divide that by the If you require. This gives you the value in ohms. You can vary the resistance or vary the supply voltage to change the brightness, as long as Vf doesn't go above the max value.
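If it helps, here's that arithmetic as a small Python sketch. The 12V supply and 3.5V typical Vf are just figures I've assumed for the example, not values from your datasheet:

```python
def series_resistor(v_supply, vf_typical, i_f):
    """Return (resistance in ohms, resistor dissipation in watts) for a series limiter."""
    r = (v_supply - vf_typical) / i_f      # R = (Vsupply - Vf) / If
    p = (v_supply - vf_typical) * i_f      # power burned in the resistor
    return r, p

# Example: 12V supply, 3.5V typical Vf, 700mA target current (assumed figures)
r, p = series_resistor(12.0, 3.5, 0.7)
print(f"R = {r:.1f} ohm, resistor dissipation = {p:.2f} W")
# -> R = 12.1 ohm, resistor dissipation = 5.95 W
```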

I'm sure someone else could give a better, more accurate and more concise explanation, but I think that's about the gist of it.

Hope this is helpful and isn't complete gobbledegook!
 

If you grab a hundred LEDs and drive each one at a constant current of If = 700mA, the Vf values will fall on a bell curve: the lowest may have Vf ≈ 3.4V and the highest Vf ≈ 4.9V, while most will be somewhere in between.

If you grab a single LED and carefully tune a constant voltage power supply to get If = 700mA, the If will drift all over the place as the ambient temperature changes and the LED self-heats. It can also go into thermal runaway, where the self-induced heating causes If to progressively increase, which causes more self-heating, which causes a further increase in If until the die melts...

Starting with a power supply voltage higher than Vf(max) and then putting a current-limiting resistor between the supply and the LED is the minimum required to prevent thermal runaway. There are better circuits for current regulation, but they are more complicated than a resistor...
 
LEDs work best when operated from a constant current supply; commercial LED lighting supplies are usually constant current.
 
Also, when running some of those high wattage LEDs, some of them get really hot, really fast. Think "heat sink"! I ran one once at the max on the breadboard just to see how bright it was, and it was bright (don't look at them directly). I took it off the breadboard and let go real fast. It left a blister. Just FYI.
 
Hi,


This question comes up a lot when someone goes to use an LED for the first time. It stems from our past use of BULBS with a filament, and then we want to use an LED and find that it is nothing like a bulb.

In the past when we went out and purchased a filament bulb, we'd look for a particular voltage. For example, for the car we probably go looking for a bulb that is rated for 12 volts. We buy the bulb, screw it in, and everything works. Why it works is because we knew we needed a 12 volt bulb, we found one, and we screwed it into the socket.
So with bulbs we are looking for a particular voltage and once we find that bulb we're good to go. Sometimes we look for a number, but that number will be a number of a bulb of a certain voltage anyway. So for bulbs the most important specification is *VOLTAGE*.

But with LEDs it's entirely different. The most important specification is *CURRENT*, not voltage. This leads to a different drive method. We can no longer simply connect it to a certain voltage and expect it to work. We have to pay MORE attention to the current than to the voltage. We still have to pay attention to the voltage, but the current is the most important of all for an LED.

To power a bulb that is 12 volts we supply a voltage that is 12 volts (or thereabouts) and it works.
To power an LED that is 350ma we supply a current that is 350ma (or thereabouts) and it works, as long as we are getting that current from a source that can also supply the required voltage range for the LED.
One of the problems with the voltage of an LED is that it is not specified as exactly as the voltage of a bulb. There is usually a range of voltages specified for the LED. This means the manufacturer cannot guarantee what the voltage is going to be. If the range is, say, 3 to 5 volts, then it could be 3V, 4V, 4.3V, 4.8V, etc., up to 5V. So we have to assume that we don't know what the voltage really is.
But the current is usually well specified. If they spec 350ma max then that is the maximum current the LED can take and that produces the most brightness. So the idea is to drive the LED with a current source that comes from a voltage source that is at least as high as the maximum voltage rating of the LED. This usually involves a battery and a resistor or other current limiting device, and we guess at what current we'll get and then test it to see that we get the required current, or we use a constant current drive.

Since a battery and resistor is the most common setup for a first flashlight, connecting a battery to a resistor and that to the LED provides current to the LED. It may not be the right current at first, however, so we have to experiment a little: start with a resistor value that is high enough not to overdrive the LED (too much current), then measure the current to see if it is what we want. For example, say we connect a 10 ohm resistor to a 4.2V cell and an LED that is rated for 350mA, and its voltage in this case is 3 volts. The current would be (4.2-3)/10 = 1.2/10 = 120mA, which is not enough current, although the LED will still light up.
We might want to stop here if we are getting enough light out of the LED, or we could try a lower value. Going down to 5 ohms would give us approximately twice the current, or 240mA. We then evaluate and decide if this is enough light. If it is, we're done; if not, we go a little lower and test again. This is the only way to get it right. It's good to leave a little safety margin too, so maybe we'll drive it at 300mA instead of 350mA.
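Here's that trial-and-error process as a short Python sketch, using the same example numbers (4.2V cell, an assumed Vf of about 3V, 350mA rating):

```python
V_CELL = 4.2    # battery voltage
VF_LED = 3.0    # assumed LED forward voltage (measure your own LED to confirm)
I_MAX = 0.350   # LED current rating in amps

# Try progressively lower resistor values and check the resulting current
for r_ohms in (10.0, 5.0, 4.0, 3.0):
    i_led = (V_CELL - VF_LED) / r_ohms
    verdict = "OK" if i_led <= I_MAX else "too much current!"
    print(f"R = {r_ohms:4.1f} ohm -> I = {i_led * 1000:3.0f} mA  ({verdict})")
```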

Underdriving the LED leads to longer life and less heat while operating, so that's something to consider too.
 
Thank you all.

I definitely understand now why it is better to use a resistor, a constant current driver or another limiting circuit. You explained it very well.

Best regards
 
I thought again about the thermal runaway that MikeMI mentioned above, and I don't understand the difference: why in one case (without a resistor) can the LED go into thermal runaway (caused, maybe, by the LED resistance decreasing as the heat rises), while in the other case (with a resistor) it won't? After all, in both cases the LED runs at the same voltage and the same current, and as a result the same electrical power (heat dissipation). If in one case the heat causes the LED to go into thermal runaway, what prevents the other case from going into thermal runaway? The resistor? How? (While the voltage and current I provide to the LED in both cases are within the specification.)
 
 

The answer is to look at how much the current would change were the voltage of the LED to change slightly.

If there is a constant current supply, the current doesn't change at all if the LED voltage changes a bit.

If there is a resistor, the current changes a bit if the LED voltage changes a bit.

If there is a constant voltage supply, the current changes a lot if the LED voltage changes a bit.

The LED voltage will fall slightly as the temperature increases. If that voltage fall causes a large increase in current, there will be a significant increase in temperature, the voltage will fall further, and you can get thermal runaway.

Of course, if the voltage fall does not cause a significant increase in current, you won't get thermal runaway.
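A rough numerical illustration of those three cases (the 2.6 ohm resistor and the 0.05 ohm supply resistance are just assumed figures for this sketch):

```python
# How much does the current change if the LED's forward voltage drops by 0.1V
# as it warms up? Three drive schemes, illustrative numbers only.

d_vf = 0.1  # assumed drop in LED forward voltage (volts)

# 1) Ideal constant-current supply: the regulator holds the current, so no change.
print("constant current supply: dI = 0 mA")

# 2) Voltage supply plus a series resistor (say 2.6 ohm): dI = dVf / R
r_series = 2.6
print(f"resistor drive:          dI = {d_vf / r_series * 1000:.0f} mA")

# 3) Stiff constant-voltage supply: only its internal resistance (say 0.05 ohm)
#    limits the change, so dI = dVf / R_internal
r_internal = 0.05
print(f"constant voltage supply: dI = {d_vf / r_internal * 1000:.0f} mA")
```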
 
So let's say, for example, that I choose a 2.6 ohm resistor to get Vf = 3.2V at 700mA, powered by a 5VDC voltage source, after calculating (5V - 3.2V)/0.7A = 2.6 ohm. How can I calculate the voltage and current on the LED, and the voltage on the resistor, when I increase or decrease the source voltage after I have already chosen the resistor value? Let's say I increase the source voltage to 5.3V.
 
Way back in this thread I mentioned that an appropriate resistor will prevent thermal runaway, but using a resistor is not the best way. It wastes power, gets hot, and it is hard to pick a starting supply voltage and a resistor value that works for all the Vfs in a particular line of LEDs. You unfortunately have to customize the resistor value on a per-LED basis.

The dilemma comes if the goal is to have one common DC constant-voltage power supply driving several 3W LEDs wired in parallel. First, the voltage has to be high enough to allow for a couple of volts across the resistor on the LED with the highest Vf. Say the highest Vf you have in your batch of LEDS is 4.8V. The minimum supply voltage for this LED would be 4.8+2 =6.8V. To get 700mA, the resistor would be R=E/I = (6.8-4.8)/0.7 = 2.86Ω, dissipating P=I*E = 2*0.7 = 1.4W.

Now, say you reach into the bag and pull out an LED that has a Vf of 3.4V. The supply is 6.8V, so the resistor will have to drop (6.8-3.4)V, and its value would have to be R=E/I = (6.8-3.4)/0.7 = 4.86Ω. It would dissipate (6.8-3.4)*0.7 = 2.4W.
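The same per-LED arithmetic as a quick Python sketch (same 6.8V supply and 700mA target as above; the 4.1V middle value is just an extra illustrative Vf):

```python
V_SUPPLY = 6.8   # chosen so the highest-Vf LED still has about 2V across its resistor
I_TARGET = 0.7   # amps

# Forward voltages of a few LEDs pulled from the same bag
for vf in (4.8, 4.1, 3.4):
    v_r = V_SUPPLY - vf    # voltage the resistor must drop
    r = v_r / I_TARGET     # required resistor value
    p = v_r * I_TARGET     # power wasted in the resistor
    print(f"Vf = {vf:.1f} V -> R = {r:.2f} ohm, dissipation = {p:.2f} W")
```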

All this is way too tedious. I would just go buy a 700mA constant-current LED driver (link removed), wire all my LEDs in series, and let the driver take care of everything...
 
Thanks for your replies.
I had already been convinced, before my last post, of the necessity of a constant-current driver and why it is better to use one, and I understand how to calculate a resistor and the problem of the energy it wastes. I am only trying to understand the phenomenon: what happens in the circuit after I have already chosen a resistor and then increase or decrease the source voltage? What will the Vf on the LED be? What will the voltage drop on the resistor be? And the current?

Regarding the 3W star LED: if I am going to mount the 3W star LED on a rectangular piece of aluminium, how should I mount the 3W 700mA constant current regulated LED driver on the same aluminium?
 
Hi,

There are a couple of things coming up here that we should probably discuss further.

First, a constant current driver isn't necessarily better than a resistor. It depends on what kind of constant current driver it is, and what your goals for the flashlight are.

But before we talk about the constant current driver vs resistor question, let's talk about the other issue: thermal runaway.

All LEDs effectively have a built-in equivalent resistor. It's part of the diode itself. The diode also acts like a resistance, but it's simpler to look at it as a voltage source in series with a small resistor. We'll look at it this way for now, and it will reveal how thermal runaway works.

Let's say we have a 3.1V LED at 1 amp, and we'll say that the diode part drops 3V and the resistive part drops 0.1V because the resistive part is 0.1 ohms. So far we have:
vLED=3 (LED starting voltage at room temperature)
R=0.1 (internal LED resistance apart from the diode)
Vs=3.1 (our source voltage)
I=(Vs-vLED)/R (the current of the LED)

and because we set the source voltage to 3.1V we get I = 1 amp exactly. But then something else happens: the LED heats up. Let's say it heats up by 20 degrees C. The LED voltage drops by approximately 2.5mV per degree C, so the voltage falls and the LED voltage is now described by:
vLED=3-0.0025*T

where T is the temperature rise in degrees C. With a 20 degree C rise, we now have:
vLED=3-0.0025*20=2.95 volts

Doesn't seem like much, does it? We started with 3.00 volts, now we have 2.95 volts. But look what happens to the current with only 0.1 ohms of series resistance:
I=(Vs-vLED)/R=(3.1-2.95)/0.1=1.5 amps.

Wow, the current went up by 500mA just because the diode heated up. And it doesn't stop there. Because we now have even more current than before, the diode voltage drops even further, and that again causes an increase in current, which again causes a decrease in voltage, and so on. This is thermal runaway.

The system described above does in fact converge; it does not increase to infinity (in theory). The problem is that the point where it converges (becomes stable) is far above the maximum operating temperature of the diode, so the diode fails (burns out).

Now let's put a 1 ohm resistor in series with the LED and see what happens...

Now that we have a 1 ohm resistor in series with the original internal resistance we have a total of 1.1 ohms in series with the LED diode, so we have to increase the voltage source to 4.1 volts (up from 3.1 volts). This is a drawback, but it is necessary to get the same operating current of 1 amp. So we have:

T=0 (start with no temperature rise)
vLED=3-0.0025*T
R=1.1
Vs=4.1
I=(Vs-vLED)/R=1 amp

Now with that 1 amp we still have a 20 degree C temperature rise, so we now have:
T=20 (because of the 1 amp current)
vLED=3-0.0025*T=2.95 volts
R=1.1
Vs=4.1
I=(Vs-vLED)/R=1.045 amps

You can quickly see how much less the current increased here. Before it was 1.5 amps; now it is 1.045 amps. The increase is only about 10 percent of the increase we got before we added the 1 ohm resistor. That extra current will again cause a little more heating, but much, much less than before, so the result is that when the system converges (reaches equilibrium) it converges at a temperature that is BELOW the maximum operating temperature of the LED. In this way we avoid thermal runaway.
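Here is that feedback loop as a small Python sketch, using the same toy model (3V diode drop at room temperature, 2.5mV/°C, 0.1 ohm internal resistance) plus one extra assumption of my own: the temperature rise is proportional to the power dissipated in the diode, scaled so the original 1 amp operating point gives roughly the 20°C rise used above.

```python
# Toy model of LED thermal feedback: vLED falls 2.5mV per deg C of temperature rise.
# The thermal part (rise proportional to diode power, ~20 C at the starting 3W)
# is an assumption added for this sketch, not from any datasheet.

TEMPCO = 0.0025      # V per deg C of forward-voltage drop
THETA = 20.0 / 3.0   # deg C of rise per watt dissipated in the diode (assumed)

def settle(v_supply, r_series_total, steps=30):
    """Iterate the electro-thermal loop until it converges; return (T_rise, I)."""
    t_rise = 0.0
    for _ in range(steps):
        v_led = 3.0 - TEMPCO * t_rise            # diode voltage at this temperature
        i = (v_supply - v_led) / r_series_total  # current set by the series resistance
        t_rise = THETA * v_led * i               # new temperature rise from diode power
    return t_rise, i

t, i = settle(3.1, 0.1)   # only the 0.1 ohm internal resistance, 3.1V supply
print(f"no external resistor: settles at T rise = {t:.1f} C, I = {i:.2f} A")
t, i = settle(4.1, 1.1)   # add a 1 ohm resistor, raise the supply to 4.1V
print(f"with 1 ohm resistor:  settles at T rise = {t:.1f} C, I = {i:.2f} A")
```

With these made-up numbers the bare 3.1V supply case settles at nearly double the intended 1 amp, while the 1 ohm resistor case settles within about 5 percent of 1 amp, which is the whole point of the example above.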


Now about the constant current vs resistor...

If we use a constant current drive that is a switching type regulator, we get higher efficiency than with a resistor. But with a linear constant current regulator we don't gain anything except constant light output over the entire discharge of the battery.

A switching constant current generator is not the same as a pulsed source either. A pulsed source is no more efficient than a resistor. In fact, it is slightly less efficient for the same light output unless it is run on high (full output).

There is a drawback to using a constant current drive when running from a battery, however: although the light output is constant over the full battery discharge, when the battery finally reaches the lower operating voltage of the constant current circuit the light output suddenly drops very low or even goes out entirely. This can be a problem because we get very little warning that the light is about to go. This leads us to design a constant current source that is mostly constant, but does allow the current to fall somewhat as the battery drains.
With a resistor, the light output drops slowly as the battery drains, and in fact it takes a very long time to lose all light output, because as the battery drains the LED draws less and less current, which also means the battery lasts much longer. Reduced light, but still some light to see by. This matters sometimes, such as when caving (cave exploration).
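To make that concrete, here's a rough Python sketch comparing the two schemes as a single lithium cell drains. The LED model (fixed 3V Vf), the 3.4 ohm resistor and the 3.3V driver dropout are all illustrative assumptions:

```python
# LED current vs. battery voltage: series resistor vs. an idealized
# constant-current driver that "drops out" below a minimum input voltage.

VF_LED = 3.0      # assumed LED forward voltage, treated as constant here
R_SERIES = 3.4    # sized for roughly 350mA at a full 4.2V cell
I_CC = 0.350      # regulated current of the assumed driver
V_DROPOUT = 3.3   # below this the assumed driver stops regulating

for v_batt in (4.2, 4.0, 3.8, 3.6, 3.4, 3.2, 3.0):
    i_resistor = max(0.0, (v_batt - VF_LED) / R_SERIES)
    i_driver = I_CC if v_batt >= V_DROPOUT else 0.0
    print(f"Vbatt = {v_batt:.1f} V: resistor -> {i_resistor * 1000:3.0f} mA, "
          f"driver -> {i_driver * 1000:3.0f} mA")
```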

Also a single resistor and LED is going to be more reliable than a constant current source circuit with many components and connections.
 
Maybe I didn't ask the question right. I know how to calculate the voltage, the resistance, etc. in the circuit, how to choose a resistor, and how to measure the components with a multimeter while the circuit is working. The data I gave was theoretical, only an example (for an LED with Vf of 3V-3.4V). The question I am trying to get an answer to is this: what happens after I have chosen a resistor, done all the calculations, and have a theoretical working circuit where I know the Vf, the current, the resistance, etc. of each part (and the circuit works in practice exactly as I planned)? What happens then, when I suddenly increase the source voltage from 5V to 5.3V? What happens to the LED's Vf? Does it stay the same? Does it increase? What happens across the resistor? Will it absorb all of the increase in voltage? How can I calculate the change in these values without measuring the components while the circuit is working?
 
Hello,

Did you read post #17 at all?

You have to make at least one measurement to be sure, because the manufacturer does not specify the voltage exactly.
If you really don't want to make any measurements, then you have to build a constant current source circuit and use that instead, knowing it will put out the right current.
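For a rough pencil-and-paper answer to the 5V versus 5.3V question, though, you can treat Vf as approximately constant and let the resistor absorb the change. A Python sketch of that approximation, using the 2.6 ohm / 3.2V example from earlier:

```python
# First-order estimate: assume Vf stays at roughly 3.2V, so the resistor
# takes up almost all of the supply-voltage change. In reality Vf creeps up
# a little as the current rises, so the true current increase is a bit smaller.

VF = 3.2   # assumed (ideally measured) forward voltage at the original operating point
R = 2.6    # the resistor already chosen

for v_supply in (5.0, 5.3):
    i = (v_supply - VF) / R    # current through LED and resistor
    v_r = i * R                # voltage across the resistor
    print(f"Vs = {v_supply:.1f} V: I = {i * 1000:3.0f} mA, "
          f"V(resistor) = {v_r:.2f} V, V(LED) ~ {VF:.1f} V")
```

So in this approximation the extra 0.3V lands almost entirely on the resistor and the current rises by roughly 0.3V / 2.6 ohm ≈ 115mA; on the bench you would likely see the LED's Vf creep up by a few tens of millivolts, making the real current increase slightly smaller.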
 