I need an adjustable load circuit so I can load-test various batteries from 1 A up to 100 A in various current increments. Using resistor networks to achieve the various loads is neither efficient nor accurate, so I thought MOSFETs or transistors could be used as an active load instead.
What I'm trying to do is determine the discharge capability and voltage drop of certain batteries at various load currents. The batteries I'll be dealing with are 1.2 V NiMH/NiCd cells and 3.7 V lithium-polymer packs for use in R/C applications, where current draw can be very substantial.
To give you an idea of what I've come up with so far (and to show I'm actually trying, not just seeking an easy answer): put a precision .01 ohm shunt resistor at the emitter of an NPN transistor, use the battery under test as the transistor's supply, and drive the base with a highly accurate voltage source. As the base voltage increases, the emitter voltage follows it, increasing the voltage across the .01 ohm resistor; the transistor drops the rest. I can then measure the voltage across the shunt, which corresponds directly to the load current, and measure the battery voltage to see how much it sags as the current increases. However, most transistors can't handle 100 A, and I'd have to account for the base current (which could be substantial) in the driving circuit. The .01 ohm value is just an example. I already have a VERY large heatsink for the active device(s) and a few fans to help with cooling, and each test measurement will only last about 10 seconds.
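To sanity-check my own numbers, here's a quick calculation of what the shunt reads and where the heat goes at a few operating points (the worst-case pack voltage and currents are just my assumptions, not measured values):

```python
# Quick sanity check on the proposed shunt + pass-transistor load.
# Assumptions: ideal shunt, pass device absorbs everything the shunt doesn't.

R_SHUNT = 0.01  # ohms, precision shunt in the emitter/source leg


def load_numbers(v_batt, i_load, r_shunt=R_SHUNT):
    """Return (shunt voltage, shunt power, pass-device power) at a given load."""
    v_shunt = i_load * r_shunt               # voltage the meter reads
    p_shunt = i_load ** 2 * r_shunt          # heat dissipated in the shunt itself
    p_device = (v_batt - v_shunt) * i_load   # heat the pass transistor must dump
    return v_shunt, p_shunt, p_device


for v_batt, i_load in [(1.2, 100), (3.7, 100), (3.7, 1)]:
    v_s, p_s, p_d = load_numbers(v_batt, i_load)
    print(f"{v_batt} V @ {i_load:>3} A: shunt reads {v_s:.2f} V, "
          f"{p_s:.0f} W in shunt, {p_d:.0f} W in pass device")
```

Two things jump out: at 100 A the .01 ohm shunt itself dissipates 100 W and drops a full volt (most of a 1.2 V cell!), and the pass device on a 3.7 V pack has to sink roughly 270 W, which is why the big heatsink and short test durations matter.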
Also, I know the dangers of exceeding the current ratings of these batteries, so I will be careful. The battery under test will be in an aluminum box for safety.
Any ideas on an accurate circuit for this? Can I simply parallel transistors? If so, how do I keep one device from conducting more than the others (due to parameter tolerances) and hogging the current until it blows? Can MOSFET(s) be used instead? There has to be a more elegant solution.
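My back-of-envelope thinking on the current-hogging problem, assuming the usual fix of a small ballast resistor in each device's emitter/source leg (the 20 mV mismatch figure below is just an illustrative guess, not from any datasheet):

```python
# Rough first-order model of current sharing between paralleled pass devices.
# A Vbe/Vgs mismatch delta_v between two devices at the same base/gate drive
# shows up across the ballast resistors, so the extra current the "hotter"
# device takes is approximately delta_v / r_ballast.

def current_imbalance(delta_v, r_ballast):
    """Approximate extra current (A) taken by the stronger device, given a
    Vbe/Vgs mismatch delta_v (V) and a per-device ballast resistance (ohms)."""
    return delta_v / r_ballast


# e.g. a 20 mV Vbe mismatch between two BJTs, for a few ballast values:
for r in (0.001, 0.005, 0.02):
    di = current_imbalance(0.02, r)
    print(f"R_ballast = {r * 1000:>4.0f} mohm -> imbalance ~ {di:.1f} A")
```

The trade-off is visible in the numbers: tiny ballast resistors barely help, while bigger ones share well but eat into the voltage headroom and add their own dissipation, which matters a lot on a 1.2 V cell.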
Thanks in advance!
(And no, this isn't a homework assignment, just a project for my hobby.)