THEME: Hunting for THE most efficient IR emitter money can buy.
Datasheets specify approximate values for the Radiant Intensity (Ie) of an LED, in the form of Min-Typ-Max mW/sr at a given current. Just like with virtually all electronic components, lot variation is inherent; however, LEDs seem to vary quite widely. One of the reasons [I think] is mechanical, i.e. die position relative to the reflector, lens distortion, etc. The other reason [I think] is electrical, i.e. efficiency - Total Radiant Flux (ΦE) vs. power consumption.
A very nice, narrow-beam (±15°) IR LED, the OSRAM SFH4231 [datasheet].
The -S, -T, -U, -V represent different groups or "bins".
- IF = 70mA
- Total Radiant Flux, ΦE = 33mW (typical) @ 70mA
- VF = 1.6V (<2.0V) @ 70mA
I'm guessing that the chip's efficiency is
ΦE / (IF × VF) = 33mW / (70mA × 1.6V) = 33mW / 112mW ≈ 30%
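Just to sanity-check the arithmetic, here is the same calculation as a tiny Python snippet (the function name is my own, and the numbers are just the datasheet typicals quoted above):

```python
# Wall-plug efficiency estimate from the "typical" datasheet values above.
def wall_plug_efficiency(radiant_flux_w, i_f_a, v_f_v):
    """Optical output power divided by electrical input power."""
    return radiant_flux_w / (i_f_a * v_f_v)

flux_typ = 0.033   # Phi_E = 33 mW typical @ 70 mA
i_f = 0.070        # forward current, 70 mA
v_f_typ = 1.6      # typical forward voltage @ 70 mA

print(f"{wall_plug_efficiency(flux_typ, i_f, v_f_typ):.1%}")  # ~29.5%
```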
Note that this is based on the typical values. It is my understanding that LEDs in general have quite a high "tolerance" when it comes to VF specs, and our example here is no exception: "1.6V typical, 2.0V maximum". This is where it gets confusing. If, for example, one particular LED drops 2.0V @ 70mA, it consumes 140mW (instead of 112mW for a 1.6V unit) - what happens to the radiant flux? Will it increase from the "typical" 33mW? Will it stay the same?
1. Obviously, if the output (ΦE) stays at approx. 33mW, a 2.0V unit is less efficient: 33mW / 140mW ≈ 24%.
2. However, if the output increases along with the extra input power, then efficiency is virtually the same across units.
I'm pretty sure this applies to all LEDs, regardless of color, material, etc., but which one is it - 1 or 2?
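To make the two cases concrete, here is a quick back-of-the-envelope comparison in Python; note that the "case 2" flux figure is purely an assumption (output scaling in proportion to input power), which is exactly the thing I'm unsure about:

```python
# Comparison of the two cases for a high-VF (2.0V) unit.
# Case 1: flux stays at the 33 mW datasheet typical regardless of VF.
# Case 2: flux grows with input power so efficiency stays constant (assumed).

I_F = 0.070               # forward current, 70 mA
V_F_TYP, V_F_MAX = 1.6, 2.0
FLUX_TYP = 0.033          # 33 mW typical radiant flux @ 70 mA

p_typ = I_F * V_F_TYP     # 112 mW electrical input, typical unit
p_max = I_F * V_F_MAX     # 140 mW electrical input, 2.0V unit
eff_typ = FLUX_TYP / p_typ            # ~29.5% for the typical unit

eff_case1 = FLUX_TYP / p_max          # case 1: flux fixed at 33 mW
flux_case2 = eff_typ * p_max          # case 2: flux scales with input power
eff_case2 = flux_case2 / p_max        # ...so efficiency is unchanged

print(f"Typical unit: {eff_typ:.1%}")
print(f"Case 1 (flux fixed at 33 mW): {eff_case1:.1%}")
print(f"Case 2 (flux scales to {flux_case2 * 1e3:.0f} mW): {eff_case2:.1%}")
```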
Any help is greatly appreciated!
THANK YOU