I'm trying to put together an LED matrix. I understand the math around picking a resistor for use with an LED, but this matrix project is pushing the envelope of my knowledge on the subject.
I found a wizard online (**broken link removed**) that does all the math for me. I know the equations to do this myself, but it's nice to see the same results from another source. (My inputs for the wizard are 12, 1.5, 100, and 56, if you want to see the schematic I'm talking about.)
For reference, I plan on putting together a 7x8 LED matrix. The DC source is 12 V, and the LED parameters are Vf = 1.5 V and If = 100 mA (infrared LEDs).
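In case anyone wants to check my numbers, the formula I've been using for each series string is the usual one (assuming all LEDs in a string are identical):

$$ R = \frac{V_{supply} - n \cdot V_f}{I_f} $$

where $n$ is the number of LEDs in the string.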
I have a couple of concerns. I've learned the hard way that nothing in the electronics world is exact. My DC source will probably not be exactly 12 V, my LEDs' Vf will probably not be exactly 1.5 V and will vary from LED to LED, etc.
So, my first concern is whether I should put 8 LEDs in series as the wizard suggests, given that Vf may vary and the source may sit below 12 V due to ripple, current demand, or variation between wall warts (I have one rated at 12 V that only gives me 10.5 V). Would it be smarter to put 7 LEDs in series and use a somewhat larger resistor, so that I have plenty of voltage headroom to drive the LEDs?
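If my arithmetic is right, the 7-in-series option works out to:

$$ R = \frac{12\,\text{V} - 7 \times 1.5\,\text{V}}{0.1\,\text{A}} = 15\,\Omega, \qquad P_R = I_f^2 R = (0.1\,\text{A})^2 \times 15\,\Omega = 0.15\,\text{W} $$

so each string's resistor would dissipate about 0.15 W at nominal values.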
Next, the wizard put in a 1 Ω resistor. I don't have any of these. Is such a small resistor strictly necessary? This LED matrix will be on all the time (pulsed), so I don't want to overheat or short it just to save the hassle of finding a few low-value resistors.
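I suspect the 1 Ω comes from the math hitting zero. With 8 LEDs at nominal values:

$$ R = \frac{12\,\text{V} - 8 \times 1.5\,\text{V}}{0.1\,\text{A}} = \frac{0\,\text{V}}{0.1\,\text{A}} = 0\,\Omega $$

so I'm guessing the wizard just rounds up to a 1 Ω minimum rather than suggest no resistor at all, which is part of why the 8-in-series layout worries me.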