What's with all that resistor power rating stuff, anyhow?

carbonzit

Active Member
Following a side discussion in another thread here, where someone warned me that a 2-watt resistor would get very hot if one actually ran 2 watts of electricity through it, I decided to test the hypothesis. I found a 75Ω, 3W resistor in my junk box, connected it to my trusty bench supply, measured the current through it just to be sure (yep, 200mA), and ran 15 volts through it.

Sure enough, it got toasty. Then it got really, really hot. Too hot to touch. Ouch!

So now I'm puzzled: how the hell does one rate resistors, power-wise? I had thought up to now that a 3 watt resistor could withstand 3 watts continuous with no ill effects. Apparently not.

How do you specify a big-enough resistor (power-wise) so that it doesn't get scorching hot? Any rules of thumb? Let's say one has a current-limiting resistor that dissipates 2 watts: what size resistor would be adequate? (Assuming typical construction techniques, no combustible material adjacent to resistor, a little warm is OK.)
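
For reference, the figures in that test do work out to the full rated power of the part. A quick check in Python, using only the numbers quoted above:

    # Quick check of the test described above (Ohm's law only)
    V = 15.0       # volts applied across the resistor
    R = 75.0       # ohms
    I = V / R      # = 0.2 A, matching the measured 200 mA
    P = V * I      # = 3.0 W, i.e. the full rated power of the 3 W part
    print(I, P)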
 
The optimistic power ratings for resistors are typically for free air at 25°C, and possibly with the leads connected to a heat sink such as a circuit board with short leads. At that point the body of the resistor may be at 150 to 175°C. They have to be derated for ambient temperatures above that.

A good rule of thumb is to derate the resistor by at least 50%. Even at that, the resistors will still get fairly warm. After all, they have to dissipate the power from a small volume and surface area.

Similarly, you generally need to significantly derate semiconductors from their power ratings. For example, the maximum power rating of most power transistors is specified at a case temperature of 25°C, which would likely require a water-cooled heat sink, and a junction temperature of 125°C, which is rather high for good reliability.
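
To put that derating into numbers: most datasheets give a curve of full rated power up to some ambient temperature, then a straight line down to zero at the maximum body temperature. A minimal sketch, assuming typical 70°C / 155°C corner points (not from any specific part, so check the actual curve for your resistor):

    # Sketch of a typical datasheet derating curve: full rated power up to
    # 70 C ambient, then a straight line down to zero at 155 C.  These corner
    # temperatures are assumed "typical" values, not from any specific part.
    def allowed_power(p_rated, t_ambient_c, t_full=70.0, t_zero=155.0):
        if t_ambient_c <= t_full:
            return p_rated
        if t_ambient_c >= t_zero:
            return 0.0
        return p_rated * (t_zero - t_ambient_c) / (t_zero - t_full)

    print(allowed_power(3.0, 100.0))   # a "3 W" part is good for only ~1.9 W at 100 C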
 
Another factor is the desired reliability. Typical failure modes have Arrhenius-type rate equations, so the reliability can drop exponentially with increased operating temperature. Hence, careful design dictates measuring the operating temperature of things at rated power -- and increasing the component's power rating if the projected reliability doesn't meet design goals.
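
A rough sketch of what that Arrhenius behaviour means in practice (the 0.7 eV activation energy below is just an assumed, commonly quoted figure, not tied to any particular component):

    import math

    # Arrhenius acceleration factor between two operating temperatures.
    # Ea = 0.7 eV is an assumed, commonly quoted activation energy; real
    # failure mechanisms vary, so treat the result as order-of-magnitude only.
    K_EV = 8.617e-5   # Boltzmann constant, eV/K

    def acceleration_factor(t_cool_c, t_hot_c, ea_ev=0.7):
        t_cool = t_cool_c + 273.15
        t_hot = t_hot_c + 273.15
        return math.exp((ea_ev / K_EV) * (1.0 / t_cool - 1.0 / t_hot))

    print(acceleration_factor(85.0, 125.0))   # ~10x faster failure at 125 C vs 85 C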

Back in the days of carbon composition resistors, I made the same discovery you did on a 2 W resistor -- but it involved the white burned whorls on my thumb's fingerprint... :)
 
carbonzit, welcome to the wonderful world of de-rating power components based on application =) Keep in mind you can slap a heatsink on one of those resistors (as crut kind of hinted at) and put more than 2 watts through it with no problem, and it will run a lot cooler as well. Finding the exact thermal specs for a specific resistor can sometimes be hard; many cheaper ones simply don't state them anywhere.

Try looking up thermal considerations for simple transistors. A well-made PDF for a transistor will include thermal information such as max junction temperature, junction-to-case thermal resistance, and a power rating that's listed for free air (meaning open, free-flowing air, NOT in a case). Once you get into power components (basically anything over 1/4 watt) you have to add basic material thermal properties to your knowledge, understanding heat sink coupling and then getting into air flow. I'm not sure I've ever seen a value similar to 'max junction temperature' for resistors stated in a datasheet, but there is one, and it's exponentially related to longevity as squishy was saying; it's based on the material and its thermal properties.

Semiconductors are a little easier because their junction temperature limits are almost always listed, along with junction-to-case and case-to-ambient thermal resistance, which are the key numbers you need to really figure out if the thing is gonna smoke or not.
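
As a sketch of how those numbers combine (every value below is a made-up placeholder for illustration, not from a real datasheet):

    # Junction temperature from the thermal resistance chain.
    # All values are illustrative placeholders, not from a real datasheet.
    P = 2.0            # watts dissipated
    T_AMBIENT = 25.0   # C
    RTH_JC = 5.0       # C/W, junction to case
    RTH_CS = 1.0       # C/W, case to heat sink (pad or grease)
    RTH_SA = 10.0      # C/W, heat sink to ambient

    t_junction = T_AMBIENT + P * (RTH_JC + RTH_CS + RTH_SA)
    print(t_junction)  # 57 C -- compare against the datasheet's max junction temperature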

Squishy, I'm with you on the burned whorls, tons of fun.
 
My take on it is that a 2W resistor is fine at 2 Watts, but so hot that it will burn stuff that is close to it, such as circuit boards, wires, and my fingers. Combine that with the fact that a 5 W resistor is not much more expensive, and it is easy to see why derating to 50% is very common.

However, resistors are very robust when it comes to short-term overloads. In this respect resistors are much more tolerant than semiconductors. So if you have a resistor that needs to handle 10 kW for 1/4 of a mains cycle, a 50 or 100 W resistor will be fine.
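
A rough energy check on that kind of overload, assuming 50 Hz mains so a quarter cycle is 5 ms (whether a given 50 or 100 W wirewound part absorbs this safely still depends on its single-pulse rating, so check the datasheet's pulse/overload curve):

    # Energy dumped into the resistor during a short fault (assuming 50 Hz mains).
    P_PULSE = 10_000.0           # watts during the fault
    T_PULSE = 0.25 / 50.0        # seconds: a quarter of a 50 Hz cycle = 5 ms
    energy_j = P_PULSE * T_PULSE
    print(energy_j)              # 50 J absorbed by the resistor body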
 