hello
I read on a 3 W star LED datasheet that at If = 700 mA the minimum Vf is 3.4 V and the maximum Vf is 4.9 V. Does that mean I can directly drive the LED from a voltage source, starting at 3.4 V and increasing it up to 4.9 V, without any issue?
How much does this affect the brightness of the LED, and the current?
Thanks in advance.
...
My intention was to ask whether I can run that LED without any resistor (or other current-limiting circuit).
I see that you said not to do that, but why?
And I don't understand the difference: why, in one case (without a resistor), can the LED go into thermal runaway (caused, perhaps, by the LED's resistance decreasing as it heats up), while in the other case (with a resistor) it won't? After all, in both cases the LED runs at the same voltage and the same current, and as a result the same electrical power (heat dissipation). And if in one case the heat causes the LED to go into thermal runaway, what prevents the other case from going into thermal runaway? The resistor? How? (While the voltage and current I provide to the LED are within the specification in both cases.)
...
If you grab a single LED and carefully tune a constant-voltage power supply to get If = 700 mA, If will drift all over the place as the ambient temperature changes and the LED self-heats. It can also go into thermal runaway, where the self-induced heating causes If to progressively increase, which causes more self-heating, which causes a further increase in If, until the die melts...
...because the LED's resistance decreases as the heat rises) and in the other case (with a resistor) it won't. After all, in both cases the LED runs on the same voltage... (My emphasis)
That's just it - an LED with a series resistor is no longer "running" on the same voltage when the LED's resistance has decreased due to heating.
This might help you understand:
https://www.electro-tech-online.com/custompdfs/2013/01/AP01.pdf Note Figure 1.
The whole point of an in-line current-limiting resistor is to limit (but not totally eliminate) this overheating by taking up some of the voltage that would otherwise be available to the LED.
Increased current results in an increased voltage drop across the resistor.
This compensates for the LED's normal decrease in forward voltage (due to heating) and keeps the voltage across the LED in check.
Up to a point, of course. There is the assumption that the voltage source is relatively stable.
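The feedback described above can be sketched numerically. This is a toy model, not from the thread: the LED is treated as a knee voltage whose Vf drops about 2 mV per degree C of junction heating, plus a small dynamic resistance. The tempco, thermal resistance, and dynamic resistance values are illustrative assumptions, not datasheet numbers.

```python
# Sketch: why a series resistor stabilizes LED current against self-heating.
# Assumed (illustrative) LED model: Vf = 3.4 V at 25 C junction, tempco of
# -2 mV/C, 0.5 ohm dynamic resistance, 20 C/W thermal resistance to ambient.

def led_current(v_source, r_series, t_rise,
                vf_25c=3.4, r_dyn=0.5, tc=-0.002):
    """Loop current at a given junction temperature rise.

    Solves v_source = I*r_series + (vf_25c + tc*t_rise) + I*r_dyn for I.
    """
    return (v_source - vf_25c - tc * t_rise) / (r_series + r_dyn)

def settle(v_source, r_series, r_th=20.0, iterations=50):
    """Iterate the electro-thermal loop: current -> power -> temp -> current."""
    t_rise = 0.0
    for _ in range(iterations):
        i = led_current(v_source, r_series, t_rise)
        vf = v_source - i * r_series   # voltage actually across the LED
        t_rise = r_th * vf * i         # junction rise = Rth * LED power
    return i

# Direct drive: source tuned for 700 mA when cold, no resistor.
i_direct = settle(3.75, 0.0)
# Resistor drive: 5 V source with a 1.8 ohm series resistor, also ~700 mA cold.
i_res = settle(5.0, 1.8)
print(f"direct drive settles at {i_direct:.3f} A")
print(f"with resistor settles at {i_res:.3f} A")
```

With these assumed numbers, the directly driven LED drifts from 0.70 A up to about 1.0 A as it heats, while the resistor-fed LED only creeps up to about 0.74 A: the extra current drops extra voltage across the resistor, which takes that voltage away from the LED and damps the loop.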
I'm thinking again about the thermal runaway that MikeMI mentioned above: why can the LED go into thermal runaway without a resistor but not with one, when in both cases it runs at the same voltage, the same current, and therefore the same power? What does the resistor do to prevent it, given that the voltage and current I provide are within the specification either way?
What will Vf on the LED be? What will the voltage drop across the resistor be? And the current? Say I choose, for example, a 2.6 Ω resistor to get Vf = 3.2 V at 700 mA from a 5 V DC source, calculated as (5 V − 3.2 V) / 0.7 A ≈ 2.6 Ω. How can I calculate the voltage and current on the LED, and the voltage across the resistor, if I increase or decrease the source voltage after I have already chosen the resistor value? Say I increase the source to 5.3 V.
That depends on the specific LED. You have to measure it. It could be anywhere from 3.4 to 4.9 V!...
What will Vf on the LED be?
What will the voltage drop on the resistor be?
And the current?...
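Once you have (or assume) a model for the LED, the 5.3 V question can be worked out. The sketch below uses a piecewise-linear model, Vf = v_knee + r_dyn × I, with a knee voltage and dynamic resistance chosen so it reproduces the 3.2 V at 700 mA point from the question; these are illustrative guesses, not datasheet values, and should be calibrated from two measured (I, Vf) points on the real LED.

```python
# Sketch: operating point of a series resistor + LED, assuming a
# piecewise-linear LED model Vf = v_knee + r_dyn * I.
# v_knee = 2.85 V and r_dyn = 0.5 ohm are illustrative assumptions chosen
# so that Vf = 3.2 V at I = 0.7 A, matching the example in the question.

def operating_point(v_source, r_series, v_knee=2.85, r_dyn=0.5):
    """Solve v_source = I*r_series + v_knee + I*r_dyn for the loop current."""
    i = (v_source - v_knee) / (r_series + r_dyn)
    vf = v_knee + i * r_dyn      # voltage across the LED
    vr = i * r_series            # voltage across the resistor
    return i, vf, vr

# Resistor from the question: (5 V - 3.2 V) / 0.7 A = 2.57 ohm.
for vs in (5.0, 5.3):
    i, vf, vr = operating_point(vs, 2.57)
    print(f"Vs={vs:.1f} V: I={i * 1000:.0f} mA, Vf={vf:.2f} V, Vr={vr:.2f} V")
```

With these assumed numbers, raising the source from 5.0 V to 5.3 V pushes the current from about 700 mA to about 800 mA while Vf only rises from 3.20 V to about 3.25 V: almost all of the extra 0.3 V lands on the resistor, which is the stabilizing effect discussed above.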