Voltage Drop Question

Status
Not open for further replies.

shaneshane1

New Member
I have a simple LED voltage drop question

This is a really stupid question that I can't seem to get my head around :confused:

here goes

I have an LED running off 12V

so 12V - 2.1V (voltage drop) = 9.9V

then 9.9V / 0.020A = 495 ohms


So my question is: do I NEED to include the voltage drop in the math?

When I put a multimeter on the anode (+) of the LED it shows 2.1V,

and when I do the math without the voltage drop included,

so 12V / 0.020A = 600 ohms,

I still get 2.1V on the anode of the LED?

Is the current to the LED decreasing if I don't include the voltage drop in the math?

And also, if I'm reading 2.1V on the LED anode, does that mean there is a voltage loss of 9.9V across the resistor?
 
Your calculation of 495 ohms is the correct method. The voltage across the LED varies a little with different current. I usually go to the next standard value of resistor in this type of calculation, so a 510 ohm would be fine. You should calculate the wattage though: (I^2)R = 0.02 x 0.02 x 510 = 0.204 watts, so a 1/4W resistor is fine.
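To make that arithmetic concrete, here is a quick Python sketch of the same calculation (all values straight from the thread; 510 ohms is the next standard E24 value above 495):

```python
# Resistor sizing for an LED on a 12V supply, as described above.
V_supply = 12.0    # supply voltage (V)
V_led = 2.1        # LED forward voltage drop (V)
I_led = 0.020      # desired LED current (A), i.e. 20 mA

# Ohm's law on the voltage left over after the LED drop
R_exact = (V_supply - V_led) / I_led     # 495 ohms

# Next standard value up, then check its power dissipation
R_standard = 510.0
P_resistor = I_led**2 * R_standard       # I^2 * R = 0.204 W

print(R_exact, P_resistor)
```

Since 0.204 W is under 0.25 W, a 1/4W part works with a little margin.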
 
The 600 ohm resistor does not have 12V across it. It has 12V - 2.1V= 9.9V across it. So the current is 9.9V/600 ohms= 16.5mA.
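The same point as a one-line sketch, using the numbers from the post:

```python
# With a 600 ohm resistor, the resistor only sees the supply voltage
# minus the LED drop, not the full 12V.
V_supply = 12.0
V_led = 2.1
R = 600.0

I = (V_supply - V_led) / R   # 9.9V / 600 ohms = 0.0165 A = 16.5 mA
```

So the LED still lights and still shows about 2.1V on its anode, but it runs at 16.5 mA instead of the intended 20 mA.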
 
audioguru said:
The 600 ohm resistor does not have 12V across it. It has 12V - 2.1V= 9.9V across it. So the current is 9.9V/600 ohms= 16.5mA.


Ok thanks!!!

What gives the LED its "voltage drop"?

And the voltage on the battery (+) still remains at 12V, or close to it anyway?

Is the LED only "using" 2.1V from the 12V supply?
 
The LED "voltage drop" is determined by the physics of the semiconductor junction. Within the range of current you're providing, it doesn't vary much.

The voltage on the battery(+) shouldn't be influenced much by small variations in current.

The LED and resistor "use" power, which is not exactly the same as voltage. Power is current times voltage, so the LED uses 2.1V times 0.02A, and the resistor uses 9.9V times 0.02A.
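A quick check of that power split, again with the thread's numbers:

```python
# Power is current times voltage; in a series circuit the same
# current flows through both parts, so the power splits in
# proportion to the voltage across each part.
I = 0.020          # 20 mA through the whole series circuit
V_led = 2.1        # voltage across the LED
V_res = 9.9        # voltage across the resistor

P_led = V_led * I              # 0.042 W dissipated in the LED
P_res = V_res * I              # 0.198 W dissipated in the resistor
P_total = (V_led + V_res) * I  # 0.24 W, the full 12V supply power
```

Most of the supply power ends up as heat in the resistor, which is why its wattage rating matters more than the LED's.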
 
The LED forward voltage can vary quite a bit, so use the lowest voltage drop and highest power supply voltage when calculating the current and resistor power dissipation, and the highest voltage drop when taking the minimum power supply voltage into account.

For example, powering an LED with a nominal voltage of 3.5V (this can vary between 3.3V and 3.8V) from three 1.5V cells is a bad idea, as the battery voltage will quickly drop to 3V, causing the LED to turn off.

The correct way of doing it is to use four 1.5V cells to give 6V and calculate the resistor value assuming the LED's voltage drop will be 3.3V.
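A sketch of that worst-case approach in Python. The 3.3–3.8V forward-voltage range and the 6V supply are the example values above; the 20 mA target current is an assumption added for illustration:

```python
# Worst-case resistor sizing for an LED with a spread in forward voltage.
V_supply = 6.0               # four 1.5V cells
V_f_min, V_f_max = 3.3, 3.8  # forward-voltage range for this example LED
I_target = 0.020             # assumed target current (20 mA)

# Size the resistor at the LOWEST forward voltage, so the current
# never exceeds the target even with the worst-case LED.
R = (V_supply - V_f_min) / I_target    # 135 ohms

# Then check the current at the HIGHEST forward voltage:
# the LED runs slightly dimmer, but stays safely below the target.
I_min = (V_supply - V_f_max) / R       # roughly 16.3 mA
```

Sizing for the lowest drop protects the LED; the penalty is only a modest brightness loss for high-Vf units.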
 