loblahblahblah447
New Member
OK here is the deal, I have a 40W resistor and I want to limit its output to 15W, how would I go about doing this?
Reduce the voltage that feeds the resistor. When the voltage is halved, the dissipation in the resistor drops to a quarter.
Thanks for responding! Reducing the voltage is not an option, I am trying to power from a portable energy source and want full voltage... I want to limit the energy used on the resistor side, not on the power source side, if that makes any sense.
Makes sense, but not possible. The power dissipated by the resistor is determined by its resistance value in ohms and the voltage applied across it. To lower the power dissipated by the resistor, you can either lower the voltage or raise the resistance.
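As a quick sketch of that relationship (the 10-ohm resistance here is just an illustrative value, not from the thread):

```python
def dissipated_power(voltage, resistance):
    """Power dissipated in a resistor: P = V^2 / R."""
    return voltage ** 2 / resistance

full = dissipated_power(12.0, 10.0)  # 14.4 W at full voltage
half = dissipated_power(6.0, 10.0)   # 3.6 W: half the voltage, a quarter of the power
```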
So the only way to get 15W output is to spec a 15W resistor?
Well basically what I want to do is run a heating element on a disposable battery. As I understand it, the heating element is essentially a resistor.
The spec I have for the heating element is a maximum wattage of 40W, but I want the power output to be 15W. Also, there is a spec for 120V and 240V, am I to understand that this is the minimum voltage required?
The heating element has to be run at its rated voltage for the wattage rating to apply. If you run a 120 VAC heater at 12 VDC it will not generate its rated heating power: a 40 watt heater designed to run at 120 VAC will only output 0.4 watts of heat if run at 12 volts.
Lefty
Did you actually mean 4 watts or 10% of voltage = 1% of power? Thanks lefty.
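To check the arithmetic (assuming the 40 W rating applies at 120 V and the element's resistance stays constant):

```python
rated_v, rated_p = 120.0, 40.0
r = rated_v ** 2 / rated_p      # R = V^2 / P = 360 ohms

p_at_12v = 12.0 ** 2 / r        # 0.4 W: 10% of the voltage -> 1% of the power
```

So it is 0.4 W, since power scales with the square of the voltage.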
The power rating of a resistor has nothing to do with how much heat it makes. A low voltage across a resistor makes it heat less. A high voltage makes it heat more.
Ohm's Law determines how much a resistor heats.
A resistor with a high resistance heats less than one with a low resistance if they both have the same voltage.
Well let's say I have a D battery, 1.5V, with a good energy capacity. All I know about the heating element is a max power spec of 40W, is current a somehow fixed value in this situation? Without a resistance value for the heating element am I SOL to calculate anything? I don't know what kind of assumptions I can make.
An alkaline battery cell is 1.5V when brand new. Its voltage drops as it is used.
A brand new D cell will be 1.3V when it has a 1A load into 1.3 ohms which is only 1.3W of power in the resistor.
Its voltage will be 1.2V and the power will be 1.1W in half an hour.
Its voltage will be 1.0V and the power will be 0.77W in 6 hours.
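The dissipated power tracks the square of the cell voltage; as a sketch, using the 1.3-ohm load from the example above:

```python
load_r = 1.3  # ohms, the example load from above

# P = V^2 / R at each stage of the cell's discharge
p_new = 1.3 ** 2 / load_r        # 1.3 W with a fresh cell
p_half_hour = 1.2 ** 2 / load_r  # about 1.1 W after half an hour
p_six_hours = 1.0 ** 2 / load_r  # about 0.77 W after 6 hours
```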
If the heater produces 40W with a 120V supply then it will produce 15W with roughly a 73V supply, since power scales with the square of the voltage. It will produce only about 0.006W with a 1.5V supply.
BUT at that power the battery will last a very long time.
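For concreteness, the supply voltage needed for a target power can be solved from V = sqrt(P * R). This sketch assumes the heater is a constant 360 ohms, derived from the 120 V / 40 W rating:

```python
import math

rated_v, rated_p = 120.0, 40.0
r = rated_v ** 2 / rated_p       # 360 ohms, assumed constant with temperature

v_for_15w = math.sqrt(15.0 * r)  # about 73.5 V needed for 15 W
p_at_1v5 = 1.5 ** 2 / r          # about 0.006 W from a single 1.5 V cell
```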
Hi,
I think you are misunderstanding the heater spec.
It says, quote: "a max power spec of 40W".
This is the max power/heat at which it can be used, NOT any other value less than this.
Let's assume the heater has a resistance of, say, 12 ohms.
If I connect the heater across a 12V battery, then I = 12V / 12 ohms = 1 amp.
Watts = V * I, or in this case:
V^2/R = 12^2/12 = 12 watts.
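The same arithmetic as a quick sketch (12 ohms and 12 V are the assumed example values from above):

```python
v, r = 12.0, 12.0
i = v / r           # 1 A through the heater
p_vi = v * i        # 12 W via P = V * I
p_v2r = v ** 2 / r  # 12 W via P = V^2 / R, the same result
```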
What's the voltage you want to use, and the resistance of the heater coil?
What do you mean when you say that it can't be used at a value less than 40W?
Basically, I need a resistance value for the heater to calculate the possible power output?
Also, thanks guys, this has all been extremely helpful so far!