Gandledorf said:
Would I be safer limiting it to 200mA? Nigel suggests a 10% duty cycle, and after working out some diagrams last night, I can see why. So given this, can I assume 1A forward current?
I didn't suggest 10% as an exact value, just to show how the dissipation works out for the resistor. If the maximum continuous current is only 100mA, I would like to see less than 10% - I would probably aim for no more than 5%.
There are a number of decisions which will affect the duty cycle, the first being the mark/space ratio of the carrier modulation. The carrier is usually around 38kHz, so at 50/50 we're talking roughly 13µs on, 13µs off - so we're already down to a 50% duty cycle.
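As a quick sanity check, here's the carrier arithmetic worked out explicitly (a sketch of the figures above, nothing more):

```python
# Period of a 38 kHz carrier; at a 50/50 mark/space ratio the LED
# is on for half of each cycle.
carrier_hz = 38_000
period_us = 1_000_000 / carrier_hz   # ~26.3 us per carrier cycle
on_us = period_us / 2                # ~13.2 us on, ~13.2 us off
print(round(period_us, 1), round(on_us, 1))  # 26.3 13.2
```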
The actual IR encoding should only use fairly short bursts of this 38kHz carrier. The Sony SIRC example in my tutorial uses pulses of 2.4ms (start pulse), 1.2ms (1s) and 0.6ms (0s), separated by pauses of 0.6ms. The entire command is sent every 45ms, so in the worst case (all 1s) the command takes 24ms, followed by a pause of 21ms, and in the best case (all 0s) it takes 16.8ms, followed by a pause of 28.2ms.
Assuming the worst case, this means the LED is pulsed for 37% of the time (at 50/50), giving an 18.5% duty cycle over the 45ms timeslot. In the best case it is only pulsed 21% of the time, giving a 10.5% duty cycle over 45ms.
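The frame-length and duty-cycle sums can be sketched like this (using the SIRC timings quoted above; the exact percentages land a fraction above the rounded figures in the post, which halved the already-rounded 37% and 21%):

```python
# Sony SIRC timings from the post, in milliseconds.
START, ONE, ZERO, PAUSE, FRAME = 2.4, 1.2, 0.6, 0.6, 45.0
BITS = 12  # a full 12-bit SIRC command

def frame_stats(bit_pulse):
    # Frame length: start pulse plus 12 bits of (pulse + pause).
    length = START + BITS * (bit_pulse + PAUSE)
    # LED on-time: the start pulse plus each bit's carrier burst.
    on_time = START + BITS * bit_pulse
    return length, on_time

worst_len, worst_on = frame_stats(ONE)   # all 1s: 24.0 ms frame, 16.8 ms on
best_len, best_on = frame_stats(ZERO)    # all 0s: 16.8 ms frame, 9.6 ms on

# Fraction of the 45 ms timeslot the LED is pulsed, then the overall
# duty cycle once the 50/50 carrier modulation is factored in.
for on in (worst_on, best_on):
    pulsed = on / FRAME
    print(round(pulsed * 100, 1), round(pulsed * 0.5 * 100, 2))
```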
Both of these figures exceed the 10% at 1A we are talking about, but if you alter the carrier modulation to 25/75 the duty cycle halves again - giving 9.25% worst case, and 5.25% best case.
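The halving is just scaling by the carrier's mark fraction - a one-liner, using the rounded pulsed-time fractions from the post:

```python
# Overall duty cycle = (fraction of timeslot pulsed) x (carrier mark fraction).
def duty(pulsed_fraction, mark_fraction):
    return pulsed_fraction * mark_fraction

print(round(duty(0.37, 0.25) * 100, 2))  # 9.25 % worst case at 25/75
print(round(duty(0.21, 0.25) * 100, 2))  # 5.25 % best case at 25/75
```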
Also, Gandledorf has already told me he's not using 12-bit data, only 8-bit - if you keep the 45ms timeslot this reduces the duty cycle still further.
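Running the same arithmetic for 8 data bits in an unchanged 45ms timeslot (keeping the timeslot is my assumption here, as the post suggests) shows how much the duty cycle drops:

```python
# Same SIRC-style timings, but only 8 data bits per frame.
START, ONE, ZERO, FRAME, BITS = 2.4, 1.2, 0.6, 45.0, 8

on_worst = START + BITS * ONE    # 12.0 ms of carrier, all 1s
on_best = START + BITS * ZERO    # 7.2 ms of carrier, all 0s

print(round(on_worst / FRAME * 0.5 * 100, 1))  # ~13.3 % duty at 50/50
print(round(on_best / FRAME * 0.5 * 100, 1))   # 8.0 % duty at 50/50
# At 25/75 carrier modulation these halve again, to roughly 6.7% and 4%.
```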
Sorry for the long post (and for any calculator errors I may have made), but I think it may help with the decisions required. It should be noted that in my tutorial I use two LEDs in series; this gives twice the output at no extra current drain, and reduces waste in the series resistor (which of course has to be a different value). This is common practice in many remote controls!
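To illustrate the two-LEDs-in-series point with some hypothetical numbers - the supply voltage, forward drop and pulse current below are illustrative assumptions, not figures from this thread:

```python
# Assumed example values: 5 V supply, ~1.5 V forward drop per IR LED,
# 1 A pulse current. Pick real figures from your LED's datasheet.
vcc, vf, i_pulse = 5.0, 1.5, 1.0

for n_leds in (1, 2):
    r = (vcc - n_leds * vf) / i_pulse   # series resistor for the target current
    p_resistor = i_pulse ** 2 * r       # power wasted in the resistor per pulse
    print(n_leds, r, p_resistor)        # 1 LED: 3.5 ohm / 3.5 W peak
                                        # 2 LEDs: 2.0 ohm / 2.0 W peak
```

With two LEDs the resistor drops less voltage, so less power is wasted in it - and you get twice the IR output for the same current, as noted above.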