I am installing resistors to allow me to use 1.5-3V 20 mA LEDs on a 12 V system. I already know how to calculate the resistance required:
(DV − AV) / I = R1, where DV = supply voltage, AV = actual voltage dropped across the LED, and I = current in amps, so in this case:
(12 − 2.25) / 0.02 = 487.5 ohms
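For reference, a minimal sketch of that calculation in Python, using my values from above:

```python
# Series-resistor calculation: R = (DV - AV) / I
DV = 12.0    # supply voltage (V)
AV = 2.25    # actual voltage dropped across the LED (V)
I = 0.020    # target LED current (A)

R = (DV - AV) / I
print(f"Series resistance: {R:.1f} ohms")  # prints 487.5 ohms
```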
I also know that volts times amps equals watts (P = V × I).
My problem is that I am going to be running many LEDs and want to use as few resistors as possible. Which voltage do I multiply the 0.02 A by to determine the wattage dissipated in the resistor? To make matters worse, this application is in an old car where the voltage can easily spike, so I am going to be running 1 kΩ resistors to damp any extra voltage. How do I figure out the wattage so that I can maximize the number of LEDs per 0.5 W resistor without overtaxing said resistor? I don't just want the answer; I would also like the formula so that I can figure it out for myself in future applications.
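For concreteness, here is a sketch of the two wattage calculations I am weighing, assuming the 1 kΩ resistor mentioned above (all values are my own numbers, not verified):

```python
# Which voltage do I multiply the current by? A sketch of both candidates.
DV = 12.0     # nominal supply voltage (V)
AV = 2.25     # actual voltage dropped across the LED (V)
R = 1000.0    # the 1 kOhm series resistor I plan to use

# With a fixed resistor, the current follows from Ohm's law:
I = (DV - AV) / R                  # ~0.00975 A (9.75 mA)

# Candidate 1: current times the full supply voltage
P_full = DV * I                    # ~0.117 W

# Candidate 2: current times only the drop across the resistor
P_drop = (DV - AV) * I             # ~0.095 W (identical to I**2 * R)

print(f"I = {I * 1000:.2f} mA")
print(f"P (full supply voltage) = {P_full:.3f} W")
print(f"P (resistor drop only)  = {P_drop:.3f} W")
```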