Why does the resistance between sink and ambient increase when the power dissipation falls? It makes no sense that a bigger heat sink is required when the power decreases. What's wrong?
The sink-to-ambient thermal resistance of a heat sink describes how effectively it releases heat into the surrounding air. Like every thermal resistance in the path from junction to ambient, it must be low enough to keep the junction below its maximum rated temperature. The point you have backwards is the direction: a *higher* allowed thermal resistance means a *smaller* (and cheaper) heat sink will do, not a bigger one. When the power dissipation falls, the sink is allowed to be worse (more °C/W), so a smaller heat sink suffices.
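A minimal sketch of the junction-temperature budget that shows this. All numbers here (T_Jmax, θ_JC, θ_CS, the power levels) are illustrative assumptions, not values from the question:

```python
# Maximum allowed sink-to-ambient thermal resistance:
#   theta_SA(max) = (T_Jmax - T_ambient) / P - theta_JC - theta_CS
# A smaller P leaves a larger theta_SA budget, i.e. a smaller heat sink.

def max_theta_sa(t_j_max, t_ambient, power, theta_jc, theta_cs):
    """Largest sink-to-ambient resistance (°C/W) that keeps the junction safe."""
    return (t_j_max - t_ambient) / power - theta_jc - theta_cs

# Same device, two power levels (all values assumed for illustration):
for p in (10.0, 2.0):  # watts
    theta = max_theta_sa(t_j_max=125.0, t_ambient=25.0, power=p,
                         theta_jc=1.5, theta_cs=0.5)
    print(f"P = {p:4.1f} W  ->  allowed theta_SA = {theta:.1f} °C/W")
```

At 10 W the sink must be 8 °C/W or better; at 2 W anything up to 48 °C/W is fine, which is why the allowed resistance rises as the power falls.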
°C/W (or °F/W, if you like) is simply temperature rise per watt of dissipated power.
Thermal resistance is measured in °C/W: it is the temperature rise above ambient when the sink is dissipating 1 W. A small heat sink might warm by 10 °C while dissipating 1 W, whereas a large heat sink might rise by only 0.5 °C at the same 1 W, so their thermal resistances are 10 °C/W and 0.5 °C/W respectively.
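A quick sketch of that relationship, reusing the two sinks above (the 25 °C ambient and the 5 W load are assumptions for illustration):

```python
# Sink temperature follows T_sink = T_ambient + theta_SA * P.
t_ambient = 25.0                 # °C, assumed
for theta_sa in (10.0, 0.5):     # °C/W, small vs. large sink from the answer
    for power in (1.0, 5.0):     # W; 1 W from the answer, 5 W assumed
        t_sink = t_ambient + theta_sa * power
        print(f"theta_SA = {theta_sa:4.1f} °C/W, P = {power:.0f} W "
              f"-> T_sink = {t_sink:.1f} °C")
```

The small sink hits 75 °C at 5 W while the large one barely moves, which is the same 10 °C/W vs. 0.5 °C/W difference expressed as temperatures.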