Driving XHP70 32W POWER LED near max brightness?

Status
Not open for further replies.

Very simple. If you turn the LED on and off at a fast rate, the human eye sees it as always on. This is commonly done in many LED applications. The LED only dissipates heat while it is on, so a 50% duty cycle reduces dissipation by 50%. LEDs come up to full brightness very, very fast, so you can toggle them at rates up to tens of kHz if you want; you only need to toggle them at around a hundred Hz to make them appear to be always on at full brightness. A lower rate would help make the switching device (a FET, for example) more efficient.
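The duty-cycle arithmetic above can be sketched as follows (a minimal illustration; the 32 W figure is just the LED from the thread title, and ideal instantaneous switching is assumed):

```python
# Sketch: average LED dissipation under PWM, assuming ideal
# instantaneous on/off switching (illustrative, not from a datasheet).

def average_dissipation(on_power_w: float, duty_cycle: float) -> float:
    """Average power dissipated when the LED is PWM-driven.

    on_power_w: power dissipated while the LED is on
    duty_cycle: fraction of each period the LED is on (0..1)
    """
    return on_power_w * duty_cycle

# A 32 W LED at 50% duty cycle averages 16 W of dissipation.
print(average_dissipation(32.0, 0.5))  # -> 16.0
```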

One caveat: this works fine because the eye takes care of smoothing the light, but in a projector application there could be aliasing between your toggle rate and the image rates of the projector. Synchronization would solve this.
 
Watch out for the 19 V: if it is not regulated, it will blow the 20 V caps.

You could probably run off the CPU +12 V rail and add a 3 A PTC, or 28 AWG hookup wire if it runs too hot, i.e. 12.25 V instead of 11.75 V CV.
 

I googled these keywords: "Apparent brightness of pulsed LED human eye", and got lots of hits. I read a bunch, and there seems to be no consensus on your point. This discussion typifies the lack of consensus. Here is another.

Please supply some references to published science on this topic. I didn't find any...
 
I'm not a specialist in this area, but I did find some references that may be useful. "An Annotated Bibliography of Flicker Fusion Phenomena 1740-1952" by Carney Landis is a comprehensive list of papers covering the early work and sets the stage for the basics of persistence of vision (flicker fusion).
I found up-to-date discussion of flicker effects in various industry app notes that support the basic premise that human vision generally does not perceive pulsing above a certain rate. The actual rate depends on quite a few factors, so we might not agree on a reasonable critical minimum frequency, but perhaps the claim that a pulse rate above 500 Hz is imperceptible would not be controversial.

I was less certain about my claim "appear to be always on at full brightness" and had to do some more reading. The many application notes discussing PWM for LED dimming quickly made it obvious that I was wrong on this point. A 50% on/off duty cycle of current through the LED apparently produces about half the perceived light intensity, according to most LED manufacturers. So my suggestion was so much uninformed horse dung after all. Well, it happens now and then. Sorry about that.

However, if one is using the LED to project an image that is constructed in frames or lines, there is still some merit in the idea of turning the LED off during times in the image construction where no pixels are being projected. I don't know if there are any such times in modern image display standards.
 
Ok, I was a bit confused about what you were saying since duty cycle PWM is how you dim an LED.

I wonder if I could tap into the projector to get a good supply. It has many voltages, but I don't know which rails can support a 2.4 A load without shutting down some other function. For example, there is one that reads about 14.25 V when the projector is on, but it seems to go to the motor that spins the color wheel.

When I'm home I'll post some details about the projector power supply; it seems strange. Most of the projector mods I've read about have a ballast for the lamp and a totally separate supply for the electronics. Mine only has 120 VAC coming into the ballast; then three wires come out of the ballast into the secondary supply, which then feeds the motherboard. There is no other path from external power to the motherboard. I checked the wires from the ballast to the mobo power supply expecting them to just be passing the AC power from the wall - but no, they seem to carry +300 V, ground, and -300 V DC, and this supply then puts out several lower DC voltages to the mobo.

I feel like I must have made a mistake, because this doesn't make much sense to me, though I have been surprised before by methods that turn out to be common even though they seem odd. Anyway, as I said, I'll post pictures and more info on that. I'm sure there's no way for anyone here to tell what this projector is doing just from this, unless they happen to work with a lot of projectors.
 
For the most primitive workable design, one can use any regulated SMPS at 12.0 V and add AWG 28 or 30 wire resistance to draw the exact current required, knowing how many mΩ per foot the hookup wire has; a long run is OK.

For the nominal LED:

Typical forward voltage: 5.8 V white @ 2100 mA (6 V configuration), or 11.6 V white @ 1050 mA (12 V configuration).

Dropping 0.4 V at 1 A means a 400 mΩ resistor per LED array on 12 V, in a 1/2 W to 1 W size. Switching to 600 mΩ gives about 2/3 A.

Measuring the LED voltage with a DVM at such a value will verify whether adjustments are needed.

Benefit: no flicker, and no regulator needed other than the regulated source. E.g. attached.

If you need dimmer control, a power FET or transistor can be biased to match this drop with a pot in parallel with fixed R's.
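As a sketch of the resistor/wire sizing above (the 64.9 mΩ/ft figure for AWG 28 solid copper is a standard wire-table value, an assumption not stated in the post):

```python
# Sketch: sizing a series resistance (or a length of hookup wire) for a
# 12 V LED array on a 12 V supply, per the numbers in this post.

AWG28_OHMS_PER_FT = 0.0649  # solid copper at room temp (wire-table value)

def series_resistance(v_supply: float, v_led: float, i_target: float) -> float:
    """Resistance needed to drop (v_supply - v_led) at i_target amps."""
    return (v_supply - v_led) / i_target

def wire_length_ft(resistance_ohms: float) -> float:
    """Feet of AWG 28 hookup wire giving the requested resistance."""
    return resistance_ohms / AWG28_OHMS_PER_FT

r = series_resistance(12.0, 11.6, 1.0)  # 0.4 ohm, as in the post
p = r * 1.0 ** 2                        # resistor dissipation: 0.4 W
print(r, wire_length_ft(r), p)          # roughly 6 ft of AWG 28
```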
 

Attachments

  • image.jpg (456.2 KB)
I was wondering if that might work. I mean, if I take a 12V 1A power supply, and my LED can take 12V 1.022A, it does seem like a little series resistance could keep it safe.
 
I was wondering if that might work. I mean, if I take a 12V 1A power supply, and my LED can take 12V 1.022A, it does seem like a little series resistance could keep it safe.

You are forgetting that high-power LEDs go into thermal runaway if driven from a voltage source, even with a small amount of series resistance. You really do need a constant-current driver. Quit fighting it!

You are also forgetting that your 1 A power supply will put out a lot more current when overloaded. At that point, it becomes a contest as to whether the power supply or the LED blows up first.
 
It depends on your supplier. Prefer Toshiba parts and other sources with tighter tolerances, with voltage bins of ±0.5 V or even ±0.1 V.

But Cree's default specs are only nominal and max; the 1-to-5% stuff is what their high-volume customers would reject or spec as above.

Forward voltage (12 V configuration, @ 1050 mA, 85 °C): Vf = 11.6 V typ., 12.4 V max.

So theoretically it is possible you get rejects from an 11.6 ±0.2 V bin that is not shown in the datasheet but is available upon request. The default V-bin K in the datasheet implies 12.4 V max at 1050 mA at 85 °C.

Then again, since that is unlikely, it is unlikely you would be disappointed by a slight, almost insignificant drop from 1 A to 0.8 A.
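A sketch of how a Vf bin shift translates into a current drop, assuming a linearized LED model Vf(I) = V0 + ESR·I, with an ESR of 0.83 Ω and a 0.4 Ω series resistor borrowed from elsewhere in this thread (treat all numbers as illustrative assumptions):

```python
# Sketch: effect of a Vf bin shift on operating current with a fixed
# 12 V supply and a series resistor, using a linearized diode model
# Vf(I) = V0 + ESR*I. The ESR value and the +/-0.2 V bin width come
# from this thread's discussion, not from datasheet guarantees.

def led_current(v_supply, v_f_spec, i_spec, esr, r_series):
    """Operating current of a linearized LED behind a series resistor."""
    v0 = v_f_spec - esr * i_spec        # extrapolated zero-current knee
    return (v_supply - v0) / (r_series + esr)

i_typ  = led_current(12.0, 11.6, 1.05, 0.83, 0.4)  # typical Vf bin
i_high = led_current(12.0, 11.8, 1.05, 0.83, 0.4)  # +0.2 V bin edge
print(round(i_typ, 2), round(i_high, 2))  # roughly 1.03 -> 0.87 A
```

The series resistance is what keeps this drop modest: with Rs = 0, the same Vf shift divided only by the ESR would swing the current much harder.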
 
You are forgetting that high-power LEDs go into thermal runaway if driven from a voltage source, even with a small amount of series resistance. You really do need a constant-current driver. Quit fighting it!
No, Mike, you are wrong. When you add an Rs in the same ballpark as the ESR, the runaway loop formed by the 0.9 °C/W thermal resistance and the NTC voltage coefficient of -8.5 mV/°C is tamed by the small Rs added.

I left out the details needed to compare the worst-case heatsink Rca, but you can figure it out.
When you add an Rs to match the ESR of a power LED, the effect of the -8.5 mV/°C tempco is cut in half, and more with larger Rs.
A 12 V, 1 A LED has a nominal ESR of 0.83 Ω.


This I have discovered; I call it Stewart's Theorem for all diodes: watt rating × ESR ≈ 1 at 25 °C.


Since this is 4 LEDs in series, each 3 W, the ESR at 25 °C is 4 × 1/3 Ω, and at 85 °C it appears to be a bit lower, about 7/8 Ω.
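The rule of thumb above can be sketched as follows (this is the poster's own heuristic, not a datasheet parameter):

```python
# Sketch of the "watt rating x ESR ~= 1 at 25 C" heuristic from the
# post above (the poster's rule of thumb, not a datasheet figure).

def esr_25c(watt_rating: float) -> float:
    """Estimated ESR (ohms) of a diode from its power rating."""
    return 1.0 / watt_rating

# Four 3 W emitters in series: ESRs add.
series_esr = 4 * esr_25c(3.0)
print(series_esr)  # ~1.33 ohm at 25 C
```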

For very simple efficient design, and tolerances considered, this works.

But if no heatsink, then yes.... At max current you will have a problem.


I can show you dozens of my examples with no discrete Rs (Rs = 0) and all LEDs in parallel, using hookup wire from thin AWG 30 up to 16 or 18 AWG around my house and yard.

A good CPU heatsink and fan is 0.1 °C/W; a crappy one with a slow fan is 1 °C/W.
 
Well, let triode build it as you suggest. He is paying for the LEDs; you are not.
Does that mean you don't understand Ohm's law for thermal calculations with NTC, and my positive experience with ESR compensation? I don't expect everyone to understand, but you ought to.
 
They are almost $20 per LED, and I already have the switching current limiter I bought and the linear one. Everything I can find seems to suggest that current limiting is needed. For little 3V 60mA LEDs I routinely just stick a resistor on them, but I can't find any example of a 10W+ LED being used this way, let alone 32W. So to be on the safe side I'll use the current limiter. I'm just not sure how big of a step down the switching one can handle. But I can always test it with a resistor and a multimeter before I run the LED on it.
 
An adjustable CC supply, or one that regulates based on chip temperature, is the best design in case of fan failure.

My rule of thumb, or trick if you wish, to avoid thermal runaway is to add enough wire resistance, Rs power resistor, or driver RdsOn to match or exceed the LED ESR. I have applied this to arrays and single LEDs from 60 mW to 100 W.
The other way is to run them at 1/3 to 1/2 of Pmax.

When using the absolute max power rating, it is very important to understand the heatsink performance and match it to that of the chip, or better, if possible. In this case the chip junction-to-solder-point resistance is 0.9 °C/W; a CPU heatsink can be as low as 0.1 °C/W, but the sink-pad-to-heatsink interface may be poor. Meanwhile, a medium-sized TO-220 heatsink at 5 °C/W would not be good here for more than 12 W.

... If the 32 W LED temperature rises 60 °C, from 25 to 85 °C, they were assuming junction-to-ambient Rja = 60 °C / 32 W ≈ 2 °C/W, split about equally between chip-to-solder and solder-on-board-to-sink plus sink-to-ambient. Forced air can improve thermal conductance about 5× at ~2 m/s flow velocity, and a bit better with more. But since the LED chip area is so small, the watts-per-square-inch (or square cm) job is as challenging in LEDs as in CPUs.
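The thermal budget arithmetic can be sketched numerically (the 0.9 °C/W junction-to-solder figure is quoted from this discussion; the split is back-of-envelope):

```python
# Sketch: back-of-envelope junction-to-ambient thermal budget from the
# post's numbers (60 C rise at 32 W; 0.9 C/W junction to solder point).

def r_theta_ja(delta_t_c: float, power_w: float) -> float:
    """Junction-to-ambient thermal resistance in C/W."""
    return delta_t_c / power_w

total = r_theta_ja(85.0 - 25.0, 32.0)  # ~1.9 C/W overall budget
r_jc = 0.9                             # junction to solder point
r_sink_budget = total - r_jc           # what remains for board, sink, air
print(total, r_sink_budget)
```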


Note in the graph from the Cree datasheet at 85 °C: at If = 2400 mA, Vf = 12.7 to 12.75 V, and at If = 2100 mA, Vf = 12.5 V, so ESR = ΔV/ΔI = 0.25 V / 0.3 A = 5/6 Ω.

Since adding a 5/6 Ω fixed R at 2.4 A would mean a 2 V drop above 12.75 V, and 14.75 V is a pretty non-standard supply voltage, I will retract my previous suggestion to run at 12 W fixed-V with Rs, and instead run at 32 W with a matched SMPS CC power supply for best efficiency and dimming function.
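The slope extraction and supply-voltage arithmetic can be sketched as follows (operating points are as read off the datasheet graph in the post above):

```python
# Sketch: extracting an incremental resistance (ESR) from two points on
# the datasheet Vf-vs-If curve, then the supply voltage implied by a
# matching fixed series R at full current.

def esr_from_points(v1, i1, v2, i2):
    """Slope dV/dI between two operating points, in ohms."""
    return (v2 - v1) / (i2 - i1)

esr = esr_from_points(12.5, 2.1, 12.75, 2.4)  # 0.25 V / 0.3 A = 5/6 ohm
v_drop = esr * 2.4                            # drop across a matching Rs
v_supply = 12.75 + v_drop                     # ~14.75 V, non-standard
print(round(esr, 3), round(v_supply, 2))
```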

The fan should continue running for a short while after the LED is powered off, until it cools.




In small volume, the CC power supply from AC may be 50% to more than 100% of the power LED's cost.
 
Keep in mind CPUs are very stiff ceramic, while aluminum-clad boards are flexible, so the pressure from corner screws may cause warp and coplanarity error in a thin board; surface roughness on the micron scale, like burrs and scratches, will degrade thermal resistance significantly, even with silver grease.
 
I feel like worrying about that might be over thinking this a bit. But I do have plenty of mechanical parts, I could put springs or rubber spacers under the screws, like they do in a CPU mount to allow a little float room. I also have thermal switches, and I plan to mount one near the LED. If the heating is due to bad contact with the heatsink the temperature rise should be slow enough for the thermal cutoff to catch it. I'm not sure what temperature I should use, but I will probably pick a conservatively rated one, since I don't need to run near max temperature.
 
If you have played with CPU temperature rise, you will understand these coplanarity issues better: heat flux versus contact pressure, with a ballpark of 200 W per square inch of contact area on the substrate. Aluminum is best for heat velocity and copper for thermal conductance.

An adjustable 4.2 A CC LED driver is best for you.


... although for me, I would use active DC + AC-biased PWM hysteretic buck-boost modulated low-RdsOn drivers with a DC choke at 20 kHz, run from a small lead-acid battery with a trickle charger, and measure float voltage to sense junction temperature - for a large array of these chips up to 1 kW, at low duty-cycle usage.
 