I need some help with an LED circuit. I have 48 LEDs wired in parallel, each with a voltage drop of 5V, operating at 40 mA, with a 12V supply. Should I use a single 175 ohm resistor for each LED? Should I split the LEDs into 24 pairs, each pair wired in series, using a 50 ohm resistor? What would be the best design?
Hi there,
To add to what others have said...
Before we can determine what the best or cheapest method is, we
have to know how much your 12v source changes. Does it go from
10v to 14v, or over a wider range? For example, an automobile battery
might run from 11v up to 14 volts or so (lower when starting the car),
and that changes the way you can drive the LEDs.
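To see why the supply range matters, here is a quick sketch using the numbers from the question (5v drop, 175 ohm resistor, sized for 40 mA at exactly 12v) showing how much the LED current swings if the supply drifts:

```python
# Current through one LED with its own series resistor, as the supply varies.
# Numbers taken from the question: Vf = 5 V, R = 175 ohm (40 mA at 12 V).
def led_current(v_supply, v_forward=5.0, r=175.0):
    """Current (in amps) through a single LED + series resistor."""
    return (v_supply - v_forward) / r

for v in (10.0, 12.0, 14.0):
    print(f"{v:4.1f} V supply -> {led_current(v) * 1000:.1f} mA")
# 10 V -> 28.6 mA, 12 V -> 40.0 mA, 14 V -> 51.4 mA
```

So a 10v-14v swing nearly doubles the current from one end of the range to the other, which is why we need to know the source before picking a design.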
As others have already mentioned, 40 mA is too high for LEDs no matter
how you drive them, unless you are pulsing them, which I don't think
you want to do.
Also, the typical forward voltage drop for a white LED is about 3.5 volts, not 5.
Paralleling LEDs isn't a good idea either, because one LED may
hog more of the current than the others. You can hand-match
them if you want, but there's no guarantee this will keep working
in the future when the LED forward voltages drift a little.
This means it's best to use individual resistors when possible.
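To get a feel for how badly paralleled LEDs can share current, here is an illustrative calculation with a simplified Shockley diode model. The 50 mV figure for the ideality-factor times thermal-voltage product is an assumed ballpark, not a datasheet value:

```python
import math

# Illustrative only: simplified exponential diode model. Two LEDs wired
# directly in parallel sit at the same voltage, so an LED whose forward
# voltage is delta_vf lower carries exp(delta_vf / (n * Vt)) times the
# current of its neighbor.
N_VT = 0.050  # assumed n * Vt product, in volts (ballpark, not measured)

def current_ratio(delta_vf):
    """Current ratio between two paralleled LEDs whose forward
    voltages (at equal current) differ by delta_vf volts."""
    return math.exp(delta_vf / N_VT)

# A 0.1 V part-to-part spread is well within normal manufacturing variation:
print(f"current ratio ~ {current_ratio(0.1):.1f}x")
```

Even a 0.1v mismatch gives a current ratio around 7x in this model, which is why one LED ends up hogging the current.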
Because you are using 12v and the LEDs only drop 3.5v or so,
you can probably use a series connection of two or three LEDs
per group, but again we need to know your 12v line tolerance
before we can make this call.
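As a sketch of how a series group would be sized, assuming white LEDs at roughly 3.5v each and a safer 20 mA target (both assumptions, since we don't know your LEDs yet):

```python
# Sketch: size a resistor for two assumed 3.5 V white LEDs in series,
# targeting 20 mA at a nominal 12 V, then check the current at the
# extremes of a possible 11-14 V supply range.
V_F = 3.5         # assumed forward voltage per LED, volts
N_LEDS = 2        # LEDs per series string
I_TARGET = 0.020  # 20 mA target current

def series_resistor(v_nominal):
    """Resistor value that gives I_TARGET at the nominal supply."""
    return (v_nominal - N_LEDS * V_F) / I_TARGET

def string_current(v_supply, r):
    """Actual string current at some other supply voltage."""
    return (v_supply - N_LEDS * V_F) / r

r = series_resistor(12.0)
print(f"R = {r:.0f} ohm")
for v in (11.0, 12.0, 14.0):
    print(f"{v:4.1f} V -> {string_current(v, r) * 1000:.1f} mA")
# 250 ohm; 11 V -> 16 mA, 12 V -> 20 mA, 14 V -> 28 mA
```

Notice the current still swings from 16 mA to 28 mA over an 11v-14v supply, which is exactly why the line tolerance has to be known before settling on a design.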
More info you should supply:
1. How much does your 12v source change, or where does it originate?
2. What color LEDs are you using?