Circuit to manage the light level of a backlight LED network from an MCU

CaptainBlood

New Member
DIY DAC logic to manage the light level of a backlight LED network with an MCU

I have different panels which need a variable backlight. The light level will be driven by an MCU. The number of LEDs per panel may differ, as may the number of available output pins. To quantify the problem a little more, let's say that no more than 32 LEDs should be wired together per panel, probably fewer. So I have to find a generic solution that I can then adapt to each panel.

Here is a schematic of my first solution to the problem:

**broken link removed**

This is a KTechLab study I did for a 13-bit DAC, which is probably a higher resolution than any of my real needs. Please note that the LED network is represented by a single LED, which is inaccurate. Each MCU output is represented by a switch connected to Vdd. The 0V reference at the bottom left is probably useless. A pull-up or pull-down resistor may be wired at each MCU output before the diode; they are not represented here.
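
For an idea of the resolution involved, and assuming the usual binary-weighted topology (R, 2R, 4R, … from each MCU output into a common summing node — only a guess at the wiring, since the image above is broken), the unloaded node voltage for an N-bit code is roughly:

V_out ≈ Vdd × code / (2^N − 1)

so 13 pins would give 8192 levels, far more than a backlight should ever need.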

The problem with this solution is the operating state of the transistor: most of the time it will be in an intermediate (linear) state. AFAIK (very little, I must say), the consequence will be energy wasted as heat in the transistor. Maybe it's a good start toward a good solution? Please feel free to comment!

So I've been thinking of an alternate solution with a transistor wired to each MCU output, each one being either fully off or saturated, which may (AFAIK) avoid the energy loss and heat.

**broken link removed**

Again, pull-up/pull-down resistors are not shown at the MCU outputs. Resistor values are mostly irrelevant; it's only the general concept I want to discuss.
So I expect this second scheme to lose less energy than the first one.
I have two questions here:
Do I have to limit the current with a resistor at each transistor collector?
Should I scale my DAC function through the MCU output resistors? Through the transistor beta? Both?

Generally speaking, and regarding energy loss, is there any benefit to having low-drop diodes in either scheme?
Both solutions use ordinary bipolar transistors, but maybe other transistor types would be more interesting?

A subsidiary problem could be the need to address some of these LEDs individually for a short blink, but I guess that is another story.


Please excuse the poor quality of these two pictures; I only have very generic tools and little practice in producing them.

Thanks for your attention.
 
Either circuit will waste the same energy, since both control the light intensity by linear variation of the current through the LEDs. The only difference is that the first dissipates the heat in a transistor and the second dissipates the heat in a resistor.
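
For example, with (say) a 5V supply, about 2V across the LEDs and 300mA of LED current, roughly (5V − 2V) × 0.3A ≈ 0.9W has to be dropped in the linear element either way; the choice of circuit only decides whether that heat ends up in the transistor or in the resistors.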

If you want to minimize heat loss, then use a PWM signal to control the LEDs. Varying the duty cycle of the PWM signal varies the LED brightness. You can build a PWM controller with a 555.
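
Just to illustrate the idea, a minimal software-PWM sketch in C could look like this (set_backlight_pin() and delay_us() are hypothetical helper names standing in for whatever your MCU toolchain actually provides; a hardware PWM peripheral or a 555 does the same job without tying up the CPU):

```c
#include <stdint.h>

/* Minimal software-PWM sketch for LED backlight dimming.
 * set_backlight_pin() and delay_us() are hypothetical helpers standing in
 * for whatever the real MCU toolchain provides.
 */

#define PWM_PERIOD_US 1000u                    /* 1 kHz: fast enough to avoid visible flicker */

extern void set_backlight_pin(int level);      /* 1 = drive the switching transistor on, 0 = off */
extern void delay_us(uint32_t microseconds);   /* busy-wait for the given time */

/* duty: 0 (off) .. 255 (full brightness) */
void backlight_pwm_cycle(uint8_t duty)
{
    uint32_t on_time = ((uint32_t)PWM_PERIOD_US * duty) / 255u;

    set_backlight_pin(1);
    delay_us(on_time);

    set_backlight_pin(0);
    delay_us(PWM_PERIOD_US - on_time);
}

/* Call backlight_pwm_cycle(duty) in a loop, or from a timer interrupt,
 * to hold the chosen brightness. */
```

At a period of, say, 1ms (1kHz) the eye only perceives the average brightness, and the switching transistor spends almost no time in its lossy linear region.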
 
Funny you mention it, that was my guess (a guess only; I'm in IT, not an electronics engineer).

> the second dissipates the heat in a resistor

The one I mentioned at the collector?

So the first circuit should be good enough? And a 555 PWM would be easier to implement, I guess?

Thanks for your attention, interest and support.
 
If you are going with the first approach, it would be much easier to use a D/A IC chip to generate a variable voltage to control the transistor.

If you eliminate the base resistors and put a resistor from the transistor emitter to ground, then the current will be roughly proportional to the base voltage.
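
That is (taking a typical silicon V_BE of roughly 0.6–0.7V):

I_LED ≈ I_E = (V_base − V_BE) / R_E

so the LED current is set mainly by the base voltage and the emitter resistor rather than by the transistor's beta.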
 
> If you eliminate the base resistors and put a resistor from the transistor emitter to ground, then the current will be roughly proportional to the base voltage.

You mean with the second approach, I guess?

Thanks for your attention, interest, time & support.
 
A DAC IC seems great, but they are quite expensive and hard to find here in Europe, so ordering some would mean high shipping rates plus taxes, which are also calculated generously. Some hardware we ordered from the USA ended up costing us twice the original price (probably excluding tax). Although ordering 1K units would be competitive, I must say.

As you probably know, I wired these two resistors at the base because it was recommended to reduce the difference in behaviour between two samples of the same transistor. That was quite a long time ago, so I cannot explain it any more deeply. Anyhow, I will study your proposal, since the first solution seems the cheapest. What wattage would you recommend for the emitter-to-ground resistor if 15mA × 20 LEDs = 300mA is to be supported?
 
For, say, a 5V maximum base signal, the emitter voltage will be about 4.4V. The resistor power would be 0.3A × 4.4V = 1.32W, so a 2W resistor should suffice. Its value would be 4.4V / 0.3A = 14.7 ohms (use 15 ohms).

For DACs, can you order from a place such as this?
 