Opinion on IR LEDs (related to multitouch tables)

Gontarz
So I was thinking of building my own multitouch table; it seems like a pretty simple project (here's a link if interested: Build Your Own Multitouch Surface Computer - Page 1 | Maximum PC).

The only real electronics work in the project is soldering together the infrared LEDs. They did this by running 12V from the computer PSU, and since each LED has a voltage drop of 1.5V, they made strings of 8 LEDs in series (with the strings wired in parallel). Someone mentioned in the comments that not using resistors is a bad idea.

As you may have noticed from my horrible terminology, I don't have a lot of experience in electronics (I can put together a PC, that's about it). Is their approach safe? Is there a better alternative?
 
You do need a resistor in series with LEDs since they are current-operated devices, and their current will change significantly with a small change in voltage. The resistor stabilizes the current. In this case you may only need 6 or 7 LEDs in series per string, with an appropriately sized resistor to give the LED current you need (you typically want about a 2V drop or more across the resistor for reasonably stable current control).
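A quick way to see why 6 or 7 LEDs per string works better than 8 on a 12V supply is to tabulate the headroom left for the resistor. Here's a minimal Python sketch, assuming the 1.5V forward drop quoted in the article and a 15mA target current (check your LED's datasheet for the real figures):

```python
SUPPLY_V = 12.0    # PC PSU rail used in the article
LED_VF = 1.5       # forward drop per IR LED, per the article (check the datasheet)
TARGET_I = 0.015   # assumed 15 mA target current

for n_leds in range(5, 9):
    headroom = SUPPLY_V - n_leds * LED_VF      # voltage left over for the resistor
    if headroom <= 0:
        print(f"{n_leds} LEDs: no headroom left -- current is uncontrolled")
        continue
    r = headroom / TARGET_I                    # Ohm's law: R = V / I
    note = "OK" if headroom >= 2.0 else "marginal (< 2V across the resistor)"
    print(f"{n_leds} LEDs: {headroom:.1f}V across resistor, R = {r:.0f} ohm -> {note}")
```

With 8 LEDs there is nothing left over for a resistor at all, which is why the article's approach leans entirely on the PSU voltage and the LEDs' tolerances.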
 
So if I was going after a 3V drop from 12V to 9V, what resistor would I use?
 
An LED, like all diodes, has a specific voltage drop across it - this varies slightly with temperature and current, but for the most part the datasheet will provide the 'ideal' voltage. Current kills LEDs, so you need a way to limit the current running through one. For simplicity, a resistor in series is used for current limiting; its value is calculated from good ol' V = IR.

We want R. V is the voltage across the resistor. In your case, with a string of 8 LEDs, each with 1.5V across them, that's 12V for the LEDs. If your supply voltage is 15V, then the voltage across the resistor is 15 - 12 = 3V.
I is how much current you want to flow through your LED string. Generally 5mm/3mm LEDs (infrared or otherwise) take a max of 20mA, so say 15mA to run them cooler and make them last longer. 15mA = 0.015A.

R = V/I = 3V / 0.015A = 200 ohm.

Not suggesting that's what you should use (it depends on how many LEDs are in series and your power supply voltage), but it gives you a way of calculating a rough value.
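If it helps, here's the same arithmetic as a small Python helper (just a sketch, using the hypothetical 15V / 8-LED / 15mA numbers from the example above):

```python
def series_resistor(supply_v, led_vf, n_leds, current_a):
    """Series resistor in ohms for an LED string: R = (Vsupply - n * Vf) / I."""
    v_resistor = supply_v - n_leds * led_vf
    if v_resistor <= 0:
        raise ValueError("supply voltage too low for this many LEDs in series")
    return v_resistor / current_a

# Worked example from above: 15V supply, 8 LEDs at 1.5V each, 15mA target
print(series_resistor(15.0, 1.5, 8, 0.015))   # -> 200.0 ohms
```

In practice you'd round to the nearest standard resistor value at or above the calculated one.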

I'm guessing the reason you've heard 'resistors are no good' is because, in order to limit current, they have a voltage across them and a current running through them. P = IV, Power = Voltage x Current. In the above example, with 3V across the resistor and 15mA running through it, that's 45mW (0.045 watts) being wasted as heat in the resistor. These days efficiency is paramount for battery-powered portable (and high-power) applications, so sometimes a simple resistor is just too inefficient.

However, unless you're powering this from a tiny battery and want it to last as long as possible, or you're using 10W LEDs, a series resistor is simple, cheap, and does the job well.

How you arrange your LEDs is up to you, but if your supply is 12V I would advise using 6 LEDs in series with a 220 ohm resistor (or 180 ohm for slightly higher current). Then you can parallel these LED/resistor strings across your power supply. Not 'ultra' efficient, but reliable, simple and cheap.
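As a sanity check on that suggestion (a sketch only, assuming the 1.5V drop quoted earlier), here's the current and resistor dissipation for a 12V supply with 6 LEDs per string:

```python
SUPPLY_V, LED_VF, N_LEDS = 12.0, 1.5, 6

for r in (220.0, 180.0):                      # the two resistor options suggested above
    v_r = SUPPLY_V - N_LEDS * LED_VF          # 3V left across the resistor
    i = v_r / r                               # string current
    p = v_r * i                               # power wasted in the resistor
    print(f"R = {r:.0f} ohm: I = {i * 1000:.1f} mA, resistor dissipates {p * 1000:.0f} mW")
```

That works out to roughly 13.6mA at 41mW with 220 ohm, or 16.7mA at 50mW with 180 ohm, comfortably within a standard 1/4W resistor's rating, and you can parallel as many of these strings as you need LEDs.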
 