
Source voltage drop with parallel LEDs (resistor for each)

Status
Not open for further replies.

livebriand

New Member
I have a bunch of white LEDs rated 3.2-3.8V at up to 30mA, and 56 ohm resistors. I figured that, by V=IR, one LED plus one resistor should work off 5V. The problem is that the source voltage drops as I add more of these LED-resistor sets in parallel, and the current drawn doesn't increase proportionally with the number of sets. After about 12 LEDs or so, the total current pretty much levels off. What's going on here?
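As a sketch of the arithmetic behind that expectation (assuming an ideal, stiff 5V source; the forward-voltage range and the 56 ohm value are from the post):

```python
# Expected current per string and in total, for an ideal 5 V source.
# This is just Ohm's law applied to the series resistor:
# I = (V_supply - Vf) / R.
V_SUPPLY = 5.0
R_SERIES = 56.0

for vf in (3.2, 3.5, 3.8):
    i_string = (V_SUPPLY - vf) / R_SERIES        # amps per LED string
    print(f"Vf = {vf} V: {i_string * 1e3:.1f} mA per string, "
          f"{12 * i_string * 1e3:.0f} mA total for 12 strings")
```

So a stiff 5V source should show roughly 260-390 mA at 12 strings, scaling linearly as strings are added; a total that levels off instead points at the source, or at something in series with it.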

The power brick is an older 5V 700mA Samsung phone charger (it had a proprietary connector that I chopped off). There shouldn't be any negotiation necessary with this to get the full current, right? (It only has two wires coming out of it.)
 
What's the internal resistance of your ammeter? It depends on the range. If it were 10 ohms, it would drop about 3.6 volts at 360 mA (12 × 30 mA × 10 Ω). When you use a DVM as an ammeter you have to be careful of its internal resistance. So what you thought was 5V is suddenly much less.

What happens if you take the ammeter out of the circuit?
What's the internal resistance of the ammeter on the range you're using?
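To put a number on that (the 10 Ω is the hypothetical figure from the post above, not a published Fluke spec), a minimal burden-voltage sketch:

```python
# Burden-voltage sketch: the ammeter's shunt resistance sits in series
# with the entire parallel bank, so its drop subtracts from the 5 V
# before the strings ever see it. 10 ohms is an assumed value.
R_METER = 10.0           # ohms, hypothetical shunt resistance on the mA range
I_TOTAL = 12 * 0.030     # amps, 12 strings at a nominal 30 mA each

v_burden = I_TOTAL * R_METER
print(f"drop across meter: {v_burden:.1f} V "
      f"-> strings would see about {5.0 - v_burden:.1f} V")
```

In reality the LED current falls until the numbers balance, rather than the strings seeing so little voltage at full current; that settling is exactly the kind of leveling-off described in the first post.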
 
I'm actually not sure what the internal resistance is - it's a Fluke 110 True RMS multimeter (I was using this meter for voltage, and another Fluke one for amperage simultaneously).

I don't see much of a drop when I use another power supply I have (MW 122A) on 4.5V (closest setting it has), 2A max.


I'll try removing the ammeter next, though I have to rewire the circuit again (I'm currently running 3 LEDs with a 56Ω resistor off 12V for the sake of testing it).



Edit: I tried running this off 3 AAs; they measured 4.8V unloaded, but with 12 LEDs in parallel, each with a 56 ohm resistor, the pack measured 4.4V. I guess ~300mA is enough to do that? I'm getting 3.2V on each LED there, pretty much the same as with the MW122A power supply. All this is without an ammeter.



Also, WITH the ammeter, I'm getting 3.1V per LED. The ammeter is a Fluke 179 btw.
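Taking the ~300 mA estimate at face value, those two readings give a rough effective source resistance for the AA pack:

```python
# Effective source resistance from the two posted readings:
# R = (V_open - V_loaded) / I_load.
v_open, v_loaded, i_load = 4.8, 4.4, 0.30   # volts, volts, amps (from the post)
print(f"R_source ~ {(v_open - v_loaded) / i_load:.1f} ohm")   # about 1.3 ohm
```

A bit over an ohm seems plausible for three AA cells plus holder contacts and wiring, so that amount of sag looks normal.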
 
Measure the voltage coming out of your supply, and after the ammeter, as you add more LEDs.
 
Hi,

A simple test would be to measure (monitor) the voltage coming out of your supply source (wall wart or whatever) and start adding LED strings one by one. As you add them, see if the voltage drops. It could be that something is wrong with the supply, or that it is made to feed some special circuit; if so, you'd have to investigate that too. See what the circuit it normally feeds looks like, if possible.
If it can put out 700mA, then I would think it should be able to supply at least 600mA without too much problem.

If the 'wall wart' is unregulated, then the peaks could be getting clipped, because the LEDs draw more current as the peaks rise; that would decrease the output too. In that case you might be able to simply add more capacitance across the output to get more current out, as it may have been designed with little or no filter capacitance (some are made that way). Watch out for overheating, though, once you get everything up and running (if you do, that is).
An electrolytic of say about 3300µF should work nicely, but 2200µF might work too.
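For a rough idea of the capacitance that rule of thumb implies, here is a back-of-envelope sketch (assuming 60 Hz mains and a full-wave rectifier inside the wart; both are assumptions, and at 50 Hz the numbers come out about 20% larger):

```python
# Back-of-envelope filter-cap sizing: C = I * dt / dV, where dt is the
# time between recharge peaks (one half-cycle for a full-wave rectifier).
# 60 Hz mains and the ripple targets are assumptions, not measurements.
i_load = 0.60            # amps, roughly 12+ strings
dt = 1.0 / (2 * 60)      # seconds between peaks at 60 Hz, full-wave
for dv in (1.0, 1.5):    # tolerable peak-to-peak ripple, volts
    c_farads = i_load * dt / dv
    print(f"dV = {dv} V -> C ~ {c_farads * 1e6:.0f} uF")
```

Both targets land in the same ballpark as the 3300µF suggestion.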
 
Yep, voltage before, voltage after and voltage across the ammeter.

I didn't see any specs either, except that it has to be a 111 or 112 to measure current. The ammeter's internal resistance can limit the available current.

Also check the voltage across a few LEDs.
 
With the MW 122A supply (nominal 4.5V):
12 LEDs: 4.438V (per LED 3.182V)
9 LEDs: 4.449V (per LED 3.157V)
6 LEDs: 4.462V (per LED 3.213V)
3 LEDs: 4.475V (per LED 3.229V)
1 LED: 4.486V (per LED 3.285V)
No load: 4.491V


With the phone charger power supply (nominal 5V):
12 LEDs: 3.662V (per LED 3.028V)
9 LEDs: 3.794V (per LED 3.031V)
6 LEDs: 3.998V (per LED 3.131V)
3 LEDs: 4.334V (per LED 3.167V)
1 LED: 4.757V (per LED 3.339V)
No load: 5.081V

That's the problem...
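As a sketch of what those readings imply (assuming the resistors are exactly 56 Ω and every string behaves alike): the per-string current is I = (V_supply − V_LED)/56, and the sag gives an effective series resistance R_eff = (V_no-load − V_loaded)/I_total.

```python
# Effective series resistance of each supply, from the posted readings.
# I_string = (V_supply - V_LED) / 56; R_eff = (V_noload - V_supply) / I_total.
R = 56.0
readings = [
    # (label, V_noload, n_strings, V_supply, V_per_LED)
    ("MW 122A",       4.491, 12, 4.438, 3.182),
    ("MW 122A",       4.491,  3, 4.475, 3.229),
    ("phone charger", 5.081, 12, 3.662, 3.028),
    ("phone charger", 5.081,  3, 4.334, 3.167),
]
for label, v_nl, n, v_sup, v_led in readings:
    i_total = n * (v_sup - v_led) / R
    r_eff = (v_nl - v_sup) / i_total
    print(f"{label}, {n} strings: {i_total * 1e3:.0f} mA, "
          f"R_eff ~ {r_eff:.1f} ohm")
```

The bench supply comes out at a fraction of an ohm, while the phone-charger path behaves like roughly 10 Ω in series, which is far more than plain wire would explain.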



FWIW, the phone charger works just fine with my Galaxy Nexus. (I connected the leads to a USB connector and shorted the data ones.) I haven't tried measuring the voltage under that load, though. (Anyone know an Android app to measure the USB voltage?)
 
Even one LED shows a serious regulation problem. With only two wires there is not likely any negotiation going on, but a break in the wire strands would add excessive series resistance. That's just a guess.

I would think the adapter is broken at this point.
 
The problem is, with 12 LEDs anyway (I didn't test any other combinations here), I get the same issue with three other USB adapters I have around: about 3.7V (and they all charge my phone just fine). Surely there has to be something else going on here?


Note: I have the adapters connected to a male USB connector I took off something else (I checked with a multimeter to figure out which terminal is which). Since I'm using this same connector for each of the USB adapters, anything wrong with it would skew the results for all of them.


Edit: WTF? I tried another USB connector and now this works perfectly! 12 LEDs in parallel on the 700mA phone adapter, and measuring 5.017V. I'm not even sure how the other connector would've shorted to begin with, but anyway, it looks like I'll be getting rid of that.

Edit: FYI, I had a switch wired in with the USB cable (and some more wire to extend it), and as it turns out, the USB cable itself was causing the problem. Now that I think about it, maybe it has a resistor in it by design: it came from a USB LED decoration from CVS, and I didn't see any resistors with the LEDs. That makes sense...
 
Hi,

So everything is working now?
 
Not that it matters, but it looks like there was a 10 Ω resistor in series.
 

Attachments

  • junk.png
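For what it's worth, a shared 10 Ω in the cable lines up with the earlier measurements; a quick cross-check using the 12-string numbers from the table above (same 56 Ω assumption as before):

```python
# Cross-check: a 10 ohm resistor shared by all strings, at the current
# implied by the 12-string reading, drops about 1.4 V, which matches
# 5.08 V no-load sagging to ~3.7 V at the LED bank.
i_total = 12 * (3.662 - 3.028) / 56.0     # amps, from the posted readings
print(f"I_total ~ {i_total * 1e3:.0f} mA; "
      f"drop across 10 ohm ~ {i_total * 10:.1f} V")
```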
I just found it odd that the Fluke specs didn't include the series resistor value or the voltage drop for the 110 series. Nor was the input Z listed for the voltage ranges.
 
