BTW -- your other question:
does a LED run on current or voltage.....
is best answered by saying BOTH!
As should already be obvious from my last reply, the voltage is what causes current to flow through the LED. The LED appears to the circuit as a load, and voltage is dropped across it. The current through that load is what actually does the work, but that current could not flow without the voltage supplying the force needed to move the electrons.
Consider a device like the L53SGD super bright green LED from Kingbright. The specs on this LED show that it is a 20mA/2.2V LED. Suppose that you wanted to operate this LED from a 9V source, as shown below:
(Schematic: 9V source in series with a current-limiting resistor and the LED.)
As already stated, the LED is a 2.2V device, and the source is a 9V source. We want to use a resistor in series with the LED to limit the current through the circuit so as not to destroy the LED with too high a current. The question is, what value should the resistor be?
The first thing that we need to know is what effective resistance the LED will have in this simple series circuit. So, based upon what we know -- the forward voltage across the LED (2.2V) and the design forward current of the LED (0.02A) -- we can calculate that the LED will effectively be a 110 ohm resistance (2.2V divided by 0.02A). Remember that...
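That first step is just Ohm's Law rearranged; a quick sketch in Python, using the L53SGD figures quoted above:

```python
# Effective resistance of the LED at its design operating point,
# using the datasheet figures quoted above (20mA at 2.2V).
v_forward = 2.2   # forward voltage across the LED, volts
i_forward = 0.02  # design forward current, amps (20mA)

r_led = v_forward / i_forward  # Ohm's Law: R = E / I
print(r_led)  # 110.0 ohms
```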
We know that we want approximately 20mA (0.02A) as our circuit current, right? So all that we need to do is to determine what resistance will cause (or allow, if we are thinking in terms of current limiting) the desired current when supplied by a 9V source.
Using Ohm's Law: R=E/I -- R = 9/0.02 -- R is determined to be 450 ohms. Subtract from that the effective resistance of the LED (110 ohms), and we come up with a calculated value of 340 ohms for the series limiting resistor. The closest standard value in the common 5% tolerance series is 330 ohms. 330 + 110 = 440 ohms, which allows a circuit current of about 20.455mA (9/440).
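The whole resistor calculation, carried out the same way in Python:

```python
# Series resistor calculation for the 9V supply, following the steps above.
v_source = 9.0    # supply voltage, volts
i_target = 0.02   # desired circuit current, amps
r_led = 110.0     # effective LED resistance from the previous step, ohms

r_total = v_source / i_target   # 450 ohms needed in total
r_series = r_total - r_led      # 340 ohms calculated for the resistor
r_standard = 330.0              # nearest common 5% standard value

# Actual current with the standard resistor fitted:
i_actual = v_source / (r_standard + r_led)
print(round(i_actual * 1000, 3))  # 20.455 (mA)
```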
Now let's see how the circuit voltages break down. Again using Ohm's Law, with the actual current of 9/440 = 0.0204545A, the voltage across the 330 ohm resistor works out to 6.75V (0.0204545 x 330), leaving 2.25V across the LED (9.0 - 6.75). Of course, as it is a simple series circuit, the current through the LED is the same as that through the resistor.
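The voltage breakdown can be checked the same way (note the drops sum back to the 9V supply, as they must in a series circuit):

```python
# Voltage breakdown across the series circuit at the actual current.
i_actual = 9.0 / (330.0 + 110.0)  # circuit current, amps

v_resistor = i_actual * 330.0     # drop across the 330 ohm resistor
v_led = 9.0 - v_resistor          # remainder appears across the LED
print(v_resistor, v_led)          # 6.75 2.25
```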
BTW -- the resistor used here should be at least a 1/4-watt type, as the continuous power dissipated in the resistor is about 0.138 watts (6.75V x 0.0204545A), which is more than a 1/8-watt resistor can handle. Also, should the LED fail short, putting the full source voltage across the resistor, the 1/4-watt rating would still be (just) adequate, as the dissipation then would be about 0.245 watts (9^2/330).
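Both power figures are easy to verify -- normal operation against the 1/8-watt limit, and the shorted-LED worst case against the 1/4-watt limit:

```python
# Power dissipation check for the series resistor.
i_actual = 9.0 / 440.0          # normal circuit current, amps

p_normal = 6.75 * i_actual      # ~0.138 W, over the 0.125 W (1/8 W) rating
p_shorted = 9.0**2 / 330.0      # ~0.245 W if the LED fails short (P = E^2/R)

print(p_normal > 0.125)   # True -- need more than a 1/8-watt resistor
print(p_shorted < 0.25)   # True -- a 1/4-watt resistor (just) survives
```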