Resistor Dilemma


Voltz

New Member
OK, so there's something that's been puzzling me for ages now.

If you have a regulated 12V power supply rated at 1.2A - **broken link removed** - and a 2V, 20mA LED, then the resistor I need would be
12V - 2V = 10V, 10V / 20mA = 500Ω, surely?
But if the current's 1.2A and you want 20mA, then wouldn't you need 1200/20 = 60Ω?

Which of these is correct, and is there any situation where the opposite is true?
 
The first solution is correct. The second one doesn't make any sense.
 
The 1.2A value only means the power supply can put out a maximum of 1.2A @ 12V.

Your first calculation is correct, but I would probably use a value like 560Ω, 1/2 watt.

Electronic devices only draw the current they need. Say you connect a light bulb to your 1.2A power supply: the bulb will only draw the current it needs, say 300mA, not the full 1.2A the supply can provide.
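To sanity-check the numbers: drop the supply voltage minus the LED's forward voltage across the resistor at the target current, then round up to a standard value. A minimal Python sketch of that arithmetic (the E12 list is the standard resistor series; rounding up rather than down is my assumption, so the current lands at or below 20mA):

```python
# LED series resistor: the supply voltage minus the LED forward voltage
# is dropped across the resistor at the desired current.
V_SUPPLY = 12.0   # supply voltage, volts
V_LED = 2.0       # LED forward voltage, volts
I_LED = 0.020     # desired LED current, amps

r_exact = (V_SUPPLY - V_LED) / I_LED          # (12 - 2) / 0.02 = 500 ohms

# Round up to the next standard E12 value so the current stays at or
# below the target.
E12 = [100, 120, 150, 180, 220, 270, 330, 390, 470, 560, 680, 820]
r_std = min(r for r in E12 if r >= r_exact)   # 560 ohms

p_resistor = (V_SUPPLY - V_LED) ** 2 / r_std  # ~0.18 W, so 1/2 W is ample
print(f"exact: {r_exact:.0f} ohms, standard: {r_std} ohms, "
      f"dissipation: {p_resistor:.2f} W")
```

With 560Ω the LED actually runs at about 10V/560Ω ≈ 18mA, slightly under target, and the resistor dissipates roughly 0.18W, which is why the 1/2 watt rating above has comfortable margin.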
 
That's what I thought, but then 10V will be across the resistor and 2V across the LED?
The reason I thought it might be the second one is because if I = V/R, then for I to equal 0.02A with V at 12V you'd get 0.02 = 12/600, or a 600Ω resistor, so why isn't it that?


EDIT: Thanks Birdman - so six 2V LEDs receiving power from a 12V source is fine without a resistor, even if the supply is rated at 1.2A, because they'll only use the 6 x 20mA they need, which is 120mA?
 
But if the current's 1.2A and you want 20mA, then wouldn't you need 1200/20 = 60Ω?
That's the value of resistor you'd need if the power supply voltage were 1200V and the required current 20A, in which case you'd need a huge resistor: P = V²/R = 1200²/60 = 24,000W.

The whole point of the resistor is to limit the current; as mentioned above, the current rating on the power supply is just the maximum it can safely supply.
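A two-line cross-check of that hypothetical case (a minimal sketch, using only the figures above):

```python
# The hypothetical 1200 V / 20 A case: same 60 ohm value as 1200/20,
# but the dissipation is enormous.
V = 1200.0   # volts across the resistor
I = 20.0     # amps through it
R = V / I    # 1200 / 20 = 60 ohms

p = V * I    # 24,000 W; equivalently V**2 / R = 1200**2 / 60
print(f"R = {R:.0f} ohms, P = {p:.0f} W")
```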
 
Thanks Birdman - so six 2V LEDs receiving power from a 12V source is fine without a resistor, even if the supply is rated at 1.2A, because they'll only use the 6 x 20mA they need, which is 120mA?

Or you could connect them as two strings of 3, in which case the current consumption would only be 40mA and the LEDs would be just as bright.
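The arithmetic behind those figures: LEDs in a series string all share one current, while parallel branches add. A minimal sketch, assuming every branch runs at the full 20mA:

```python
# Total supply current for different ways of wiring six 2 V / 20 mA LEDs
# from 12 V: each series string draws one LED current; parallel branches add.
I_STRING = 0.020  # amps drawn by each series branch

configs = {
    "6 LEDs wired individually (6 parallel branches)": 6,
    "2 parallel strings of 3": 2,
    "1 string of 6": 1,
}
for name, n_branches in configs.items():
    print(f"{name}: {n_branches * I_STRING * 1000:.0f} mA total")
```

A single string of 6 would draw only 20mA, but as the following posts point out, 6 x 2V uses up all 12V and leaves no headroom for a current-limiting resistor.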
 
EDIT: Thanks Birdman - so six 2V LEDs receiving power from a 12V source is fine without a resistor, even if the supply is rated at 1.2A, because they'll only use the 6 x 20mA they need, which is 120mA?
No, you still need the resistors. That's what determines the current the LED takes. Without the resistor the LED will take a large amount of current and blow.
 
No, you still need the resistors. That's what determines the current the LED takes. Without the resistor the LED will take a large amount of current and blow.

Or you could connect them as two strings of 3, in which case the current consumption would only be 40mA and the LEDs would be just as bright.

Er... what? So do I need a resistor for a string of 6, or do I not need a resistor for a string of 6?
 
You have to remember LEDs are really just diodes. If you don't limit them, they will just pass as much current as they can.

If you chain 6 together, I think they will all work together and limit each other.

I think crutschow thought you were wiring each one independently, not all together.
 
I see your dilemma. 6 LEDs @ 2V each would be 12V, and so from your earlier equation: R = (12 - 12)/0.02 = 0Ω. While that's technically correct, lots of things can go wrong if you try to do it that way. Designing with a resistor to limit current will ensure nasties like tolerances don't bite you in the butt.
 
I see your dilemma. 6 LEDs @ 2V each would be 12V, and so from your earlier equation: R = (12 - 12)/0.02 = 0Ω. While that's technically correct, lots of things can go wrong if you try to do it that way. Designing with a resistor to limit current will ensure nasties like tolerances don't bite you in the butt.

Thank you - so what do you suggest in such a case? 5 LEDs and a 100Ω resistor, or 6 LEDs and a 1Ω resistor, or what?
 
Here's how I would think about it: each LED is a 20mA device @ 2V, so its effective resistance is 2V/0.02A = 100Ω. That figure can be highly variable. What you want is a device that limits the current and decreases the circuit's sensitivity to variables. As a first-order approximation, the original circuit had a 5:1 ratio between the "fixed" resistance and the "variable" one. That's pretty good, IMO. So I wouldn't cascade more than about two LEDs at the given voltage; for more than two, I think you'd need a higher supply voltage. Alternatively, you can run the LEDs at a lower current and still get pretty good illumination from them. It's good to experiment a little to get the best results.

For a better result, there are sensitivity equations that involve taking derivatives and forming ratios. Even then it's a judgement call as to how much sensitivity you can tolerate.
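That sensitivity argument can be made concrete: with n LEDs in series, I = (V_supply - n*Vf)/R, so the less voltage is left across the resistor, the more a small Vf shift moves the current. A rough sketch using the thread's 12V supply and 2V/20mA LEDs (the 0.1V per-LED Vf shift is my illustrative assumption):

```python
# Sensitivity of LED current to forward-voltage variation: with n LEDs in
# series, I = (V_supply - n*Vf) / R. Size R for 20 mA at the nominal Vf,
# then see what happens if every LED's Vf comes out 0.1 V low.
V_SUPPLY = 12.0
VF_NOM = 2.0
I_TARGET = 0.020

for n in range(1, 6):                        # 1 to 5 LEDs in series
    r = (V_SUPPLY - n * VF_NOM) / I_TARGET   # resistor for 20 mA nominal
    i_shifted = (V_SUPPLY - n * (VF_NOM - 0.1)) / r
    print(f"{n} LEDs: R = {r:.0f} ohms, "
          f"I at Vf = 1.9 V each: {i_shifted * 1000:.1f} mA")
```

With one LED the 0.1V shift moves the current from 20mA to about 20.2mA (1%); with five LEDs in series it moves it to 25mA (25%), which is why long strings behind a small resistor are fragile.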
 
You need a resistor. If you connect six in series and the supply voltage happens to be 12.5V, a high current will flow. The forward voltage isn't accurately controlled either: the Vf of each LED might be 1.9V (11.4V for the whole string), and it also varies with temperature. As an LED heats up its forward voltage goes down, which could cause thermal runaway.

This is why I suggested two strings of three, in which case you need a 330R resistor for each string.
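To see how well the 330R tames that spread, here's the string current over a plausible range of forward voltages (a minimal sketch; the ±0.1V spread is my illustrative assumption):

```python
# Current in one string of three LEDs behind a 330 ohm resistor on 12 V,
# across a plausible spread of per-LED forward voltages.
V_SUPPLY = 12.0
R = 330.0
N = 3  # LEDs per string

for vf in (1.9, 2.0, 2.1):
    i = (V_SUPPLY - N * vf) / R
    print(f"Vf = {vf:.1f} V per LED: {i * 1000:.1f} mA per string")
```

A ±5% spread in Vf moves the current only from about 17.3mA to 19.1mA. With six LEDs in series and no resistor, the same spread would swing the current wildly, and at 6 x 2.1V = 12.6V the string wouldn't light at all from 12V.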
 
Schematic

In the spirit of thread recycling (I don't want to clog up the message board with two posts in one day), I wanted to ask if anyone saw a problem with this, my very first schematic:

**broken link removed**

When the voltage is over 1.2V * ([1k+1k+10k] / ([1k+1k+10k] + 1k)), then the green LED is on.
When the voltage is over 1.2V * ([1k+10k] / ([1k+10k] + [1k+1k])), then the yellow LED is on.
When the voltage is over 1.2V * (10k / (10k + [1k+1k+1k])), then the red LED is on.

Have I dived in too deep and got EVERYTHING wrong? Have I got most of it wrong, or just some of it? Or, by some miracle, have I got all of it right?

Basically it's meant to show that if the battery (good ol' standard AA 1.2V rechargeable) is above a certain charge, all LEDs are on; if it drops, green goes off and yellow and red stay on; if it drops further, only red is on; and if it drops even further, no LED is on... Have I achieved that? :O
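Taking the three formulas at face value (a minimal sketch; whether these are the real switching points depends on the schematic behind the broken link and on the FET gate thresholds):

```python
# The three divider ratios, evaluated as written relative to 1.2 V.
V_REF = 1.2
k = 1000.0

ratios = {
    "green":  (1*k + 1*k + 10*k) / ((1*k + 1*k + 10*k) + 1*k),  # 12/13
    "yellow": (1*k + 10*k) / ((1*k + 10*k) + (1*k + 1*k)),      # 11/13
    "red":    (10*k) / (10*k + (1*k + 1*k + 1*k)),              # 10/13
}
for led, ratio in ratios.items():
    print(f"{led}: on above {V_REF * ratio:.2f} V")
```

The computed thresholds do at least descend (green ≈ 1.11V, yellow ≈ 1.02V, red ≈ 0.92V), which is consistent with the staged turn-off you describe.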
 
The FETs probably aren't needed. Next time, just start another thread. It's better than hijacking someone else's.
 
The FETs probably aren't needed. Next time, just start another thread. It's better than hijacking someone else's.

You mean hijacking my OWN thread :D

And I used FETs so that the voltage provided to the LEDs is constant. Is that reasonable, or are they really not needed? If so, how do I rearrange it? And other than that, is it OK?
 