Thanks for responding! Reducing the voltage is not an option, I am trying to power from a portable energy source and want full voltage... I want to limit the energy used on the resistor side, not on the power source side, if that makes any sense.
Makes sense, but not possible. The power dissipated by the resistor is determined by its resistance value in ohms and the voltage applied across it. To lower the power dissipated by the resistor you can either lower the voltage or raise the resistance.
I think you have to be a little more specific about what you are working with and what you are trying to do with it.
When selecting a resistor there are two key specifications that have to be determined: the resistance value needed in ohms and the maximum power dissipation required. One would not specify a 15 watt resistor that was actually going to dissipate 15 watts; that would subject the resistor to damage, as it would be running right at its maximum allowed wattage. If a specific resistor was going to dissipate 15 watts of power, one would specify a power rating of 25 watts or more.
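The derating rule of thumb above can be sketched in a few lines. The 1.5x safety margin and the list of standard ratings here are my assumptions for illustration, not values from any particular datasheet:

```python
# Rough sketch: pick a standard resistor power rating with headroom.
# The 1.5x margin and the rating list are illustrative assumptions.

STANDARD_RATINGS_W = [0.25, 0.5, 1, 2, 5, 10, 25, 50, 100]

def pick_rating(dissipated_w, margin=1.5):
    """Smallest standard rating at least margin * the actual dissipation."""
    needed = dissipated_w * margin
    for rating in STANDARD_RATINGS_W:
        if rating >= needed:
            return rating
    raise ValueError("no standard rating is large enough")

print(pick_rating(15))  # 15 W actual dissipation -> specify a 25 W part
```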
Well basically what I want to do is run a heating element on a disposable battery. As I understand it, the heating element is essentially a resistor.
The spec I have for the heating element is a maximum wattage of 40W, but I want the power output to be 15W. Also, there is a spec for 120V and 240V, am I to understand that this is the minimum voltage required?
The power rating of a resistor has nothing to do with how much heat it makes. A low voltage across a resistor makes it heat less. A high voltage makes it heat more.
Ohm's Law determines how much a resistor heats.
A resistor with a high resistance heats less than one with a low resistance if they both have the same voltage.
Why are you wasting power by heating a resistor? What is the circuit?
As Audioguru mentioned in the first reply, the heat output is dependent on the voltage applied - to reduce the heat you need to reduce the voltage.
So what voltage is the element rated at for 40W?
Also, how long do you expect the battery to last? Drawing 15W from a disposable battery will drain it pretty rapidly, with the heat output dropping all the time.
The heating element has to be run at the voltage it is rated at for the wattage rating to apply. If you run a 120VAC heater at 12VDC it will not generate its rated heating power. A 40 watt heater designed to run at 120VAC will only output 0.4 watts of heat if run at 12 volts.
A thermostat controls the amount of heat from an electric heater by turning it off when it is warm enough then turning it on when it has cooled a little. If the heater is large then the on-off timing can be slow. You can turn it on and off quickly with an electronic circuit so the temperature varies only a small amount. The duty-cycle of the on and off determines the amount of heat.
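The on/off control described above works out to a simple relation: average power is the full power times the fraction of time the element is on. A minimal sketch, with illustrative numbers:

```python
# Duty-cycle control: switching a heater on and off rapidly gives an
# average heat output of full_power * duty_cycle. Numbers are illustrative.

def average_power(full_power_w, duty_cycle):
    """Average power for on/off switching, duty_cycle in [0, 1]."""
    if not 0.0 <= duty_cycle <= 1.0:
        raise ValueError("duty cycle must be between 0 and 1")
    return full_power_w * duty_cycle

# Run a 40 W element at 37.5% duty cycle to average 15 W:
print(average_power(40.0, 0.375))  # -> 15.0
```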
No, I meant 0.4 watts. It's a square law: watts = (voltage × voltage) divided by resistance. The resistor's ohms value is fixed, but the wattage dissipated depends on the applied voltage.
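Working the square law above through: a heater rated 40W at 120V has a resistance of R = V²/P = 120²/40 = 360 ohms, and at 12V it dissipates 12²/360 = 0.4W:

```python
# A heater's rating implies its resistance: R = V^2 / P.
# The actual dissipation then follows the square law P = V^2 / R.

def resistance_from_rating(rated_volts, rated_watts):
    """Element resistance implied by its voltage/wattage rating."""
    return rated_volts ** 2 / rated_watts

R = resistance_from_rating(120.0, 40.0)
print(R)              # 360.0 ohms
print(12.0 ** 2 / R)  # 0.4 W at 12 V
```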
Well let's say I have a D battery, 1.5V, with a good energy capacity. All I know about the heating element is a max power spec of 40W, is current a somehow fixed value in this situation? Without a resistance value for the heating element am I SOL to calculate anything? I don't know what kind of assumptions I can make.
hi,
I think you are misunderstanding the heater spec,
it says, quote: "a max power spec of 40W".
That is the maximum power/heat at which it can safely be used; it is a limit, not the power it will always produce.
Let's assume the heater has a resistance of, say, 12 ohms.
If I connect the heater across a 12V battery, the current I = 12V / 12 ohms = 1 amp.
Watts = V * I, or in this case,
V^2/R = 12^2/12 = 12 watts.
What's the voltage you want to use, and the resistance of the heater coil?
An alkaline battery cell is 1.5V when brand new. Its voltage drops as it is used.
A brand new D cell will drop to about 1.3V under a 1A load into 1.3 ohms, which is only 1.3W of power in the resistor.
Its voltage will be 1.2V and the power will be 1.1W in half an hour.
Its voltage will be 1.0V and the power will be 0.77W in 6 hours.
If the heater produces 40W from a 120V supply, its resistance is about 360 ohms, so it would need roughly 73V to produce 15W (power scales with the square of the voltage, not linearly). From a 1.5V supply it will produce only about 0.006W.
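Checking those figures with the square law: for a target power, the required voltage is V = √(P·R), since P = V²/R. A quick sketch for the 40W/120V element (implied resistance 360 ohms):

```python
import math

# For a heater rated 40 W at 120 V, R = 120^2 / 40 = 360 ohms.
# The supply voltage needed for a target power P is V = sqrt(P * R).

R = 120.0 ** 2 / 40.0           # 360 ohms
v_for_15w = math.sqrt(15.0 * R)
print(round(v_for_15w, 1))      # about 73.5 V needed for 15 W
print(1.5 ** 2 / R)             # about 0.006 W from a 1.5 V cell
```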