I could use 500 Ω.
specs for the LED I will be using:
Specifications:
Size (mm) : 5mm
Lens Color : Water Clear
Reverse Current (uA) : <=30
Life Rating : 100,000 Hours
Viewing Angle : 180 Degrees
Absolute Maximum Ratings (Ta=25°C)
Max Power Dissipation : 80mW
Max Continuous Forward Current : 30mA
Max Peak Forward Current : 75mA
Reverse Voltage : 5~6V
Lead Soldering Temperature : 240°C (<5Sec)
Operating Temperature Range : -25°C ~ +85°C
Storage Temperature Range : -30°C ~ +100°C
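The absolute maximum ratings above are easy to sanity-check before picking a current. A minimal sketch, assuming the 3 V forward voltage used in the worked example later in the post (the `within_ratings` helper is my own illustrative name, not from any datasheet):

```python
# Check a proposed LED operating point against the absolute maximum
# ratings quoted above (80 mW dissipation, 30 mA continuous current).
MAX_POWER_W = 0.080         # Max Power Dissipation: 80 mW
MAX_CONT_CURRENT_A = 0.030  # Max Continuous Forward Current: 30 mA

def within_ratings(forward_v: float, current_a: float) -> bool:
    """True if the operating point stays inside both absolute maximums."""
    power_w = forward_v * current_a
    return current_a <= MAX_CONT_CURRENT_A and power_w <= MAX_POWER_W

print(within_ratings(3.0, 0.015))  # 45 mW at 15 mA -> True
print(within_ratings(3.0, 0.030))  # 90 mW at 30 mA exceeds 80 mW -> False
```

Note that even the 30 mA continuous-current limit would push this LED past its 80 mW power limit at a 3 V forward voltage, so the 15-20 mA range is the sensible place to run it.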
I used this calculation to find the correct resistance:
Okay, so how do I determine the resistor to use then?
This is where it may get a bit tricky for you but here goes.
First, determine what we will have in the circuit:
- Our 3V 20mA LED package
- 12V supply from the PSU.
Okay, so our supply is 12 V and our LED is 3 V (its working, or forward, voltage).
Calculate: supply voltage (12.0 V) minus forward voltage (3.0 V) equals 9.0 V.
Calculate: remaining circuit voltage (9.0 V) divided by the desired current through the diode (0.015 A) equals 600 Ω.
So we require a 600 Ω resistor fitted in series with the LED package (conventionally on the anode, or positive, lead, though either lead works since the current is the same everywhere in a series circuit) when placed into the 12 V circuit, which will limit the current through it to 15 mA.
Note: it is important not to mix your units here; convert the mA into A, hence 15 mA = 0.015 A.
If you wish to run the LED at its full rated current of 20 mA, just substitute 0.020 A where the 0.015 A is.
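The steps above boil down to one formula, R = (supply voltage − forward voltage) / desired current. A minimal sketch of it (the `led_resistor` name is mine, not from the guide):

```python
def led_resistor(supply_v: float, forward_v: float, current_a: float) -> float:
    """Series resistor value in ohms for a single LED:
    R = (supply voltage - LED forward voltage) / desired current."""
    return (supply_v - forward_v) / current_a

# The guide's worked example: 12 V supply, 3 V LED, 15 mA.
print(led_resistor(12.0, 3.0, 0.015))  # about 600 ohms
# At the LED's 20 mA rated current instead:
print(led_resistor(12.0, 3.0, 0.020))  # about 450 ohms
```

In practice you round up to the nearest standard resistor value you have on hand; rounding up keeps the current at or slightly below the target.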
Source: How to: Use resistors and LED's in my rig? (a guide for the first timer) - Overclock.net