Each cell in a series string of lithium battery cells must have its charging voltage monitored, and the cell must be clamped (bypassed) when it reaches 4.2V. This is called "charge balancing".
The voltage of each cell must also be monitored so the charging current can be reduced if any cell is below 3V.
Charging must be turned off when the charging current drops to a few percent of the battery's capacity, and/or when a safety timer expires.
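In pseudo-Python, that charge logic looks something like the sketch below; all the names and thresholds are illustrative placeholders, not a real charger API:

```python
# Minimal sketch of the Li-ion charge logic described above.
CELL_MAX_V = 4.20        # per-cell clamp ("balance") voltage
CELL_PRECHARGE_V = 3.0   # below this, charge at reduced current
CAPACITY_AH = 2.2        # example pack capacity
TERMINATE_A = 0.03 * CAPACITY_AH  # stop at a few percent of capacity

def charge_step(cell_voltages, charge_current):
    """One control step: returns (current_limit, bypass_flags, done)."""
    # Clamp/bypass any cell that has reached 4.2V (charge balancing).
    bypass = [v >= CELL_MAX_V for v in cell_voltages]
    # Reduce the charging current if any cell is still below 3V.
    if any(v < CELL_PRECHARGE_V for v in cell_voltages):
        current_limit = 0.1 * CAPACITY_AH
    else:
        current_limit = 0.5 * CAPACITY_AH
    # Terminate when the taper current falls to a few percent of capacity.
    done = charge_current <= TERMINATE_A
    return current_limit, bypass, done

print(charge_step([4.20, 4.05, 3.98], 0.05))  # (1.1, [True, False, False], True)
```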
audioguru,
thanks for the tips
that makes sense
PWM stands for Pulse Width Modulation. PWM can be used to drop voltage efficiently because it varies the duty cycle of a square wave to reduce or increase the power delivered to a load. A PWM square wave at 10% duty cycle delivers roughly 10% of full power, since the switch conducts only 10% of the time. Add filter capacitors (ideally an LC filter) to smooth the peaks of the square wave down to its average voltage; the output will be a lower voltage but less power will be wasted. This is an example of a switch-mode power supply, or SMPS. The PWM signal can be generated with an LM555, and it can drive the base of a large transistor.
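Here is the duty-cycle arithmetic as a quick Python sketch (ideal switch and perfect filtering assumed):

```python
def pwm_average(v_supply, duty):
    """Filtered (average) output of an ideal PWM stage: V_out = D * V_in."""
    return duty * v_supply

def pwm_power_resistive(v_supply, duty, r_load):
    """Unfiltered power into a resistive load: switch is on for D of each cycle."""
    return duty * v_supply**2 / r_load

print(round(pwm_average(12.0, 0.10), 2))              # 1.2 V average from 12 V
print(round(pwm_power_resistive(12.0, 0.10, 10), 2))  # 1.44 W vs 14.4 W at 100% duty
```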
MOSFET KILLER,
thanks for the explanation
so, if I understand this correctly: if the load on the circuit increases, the LM555 would widen the pulse, which in turn would deliver more current to the circuit.
is this how it works?
you said "the output will be a lower voltage but less power will be wasted"
so, let's suppose I have a typical power supply from a PC.
If I modified the PWM to run at maximum duty cycle, then technically I should be able to get 18V or so on the yellow wire that is normally regulated to 12V.
Capacitors can be used to smooth out the ripple.
I am just trying to understand this correctly.
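Here is the arithmetic behind my 18V guess, assuming very roughly that the rail voltage scales with duty cycle (real supplies regulate via feedback, and the transformer ratio caps what the rail can reach, so this is only a ballpark):

```python
# Assumed nominal duty cycle for illustration only; the real figure
# depends on the supply's transformer ratio and feedback loop.
v_rail = 12.0
d_nominal = 0.67
v_at_full_duty = v_rail / d_nominal
print(round(v_at_full_duty, 1))  # ~17.9 V, in the ballpark of the 18 V guess
```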
What is the typical power loss difference between PWM and a simple linear regulator?
I am guessing that in my external laptop battery application it would be a substantial difference.
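My rough understanding is that a linear regulator burns the whole input-to-output voltage drop as heat at the load current, while a decent switcher runs around 85-95% efficient. A back-of-the-envelope comparison with made-up laptop numbers:

```python
# Illustrative numbers only: a 16 V / 3 A laptop load from a 21.6 V pack.
v_in, v_out, i_load = 21.6, 16.0, 3.0
p_out = v_out * i_load                        # 48 W delivered

# Linear regulator: the full voltage drop becomes heat at load current.
p_loss_linear = (v_in - v_out) * i_load       # 16.8 W of heat (~74% efficient)

# Switching regulator at an assumed 90% efficiency.
eff_smps = 0.90
p_loss_smps = p_out * (1.0 / eff_smps - 1.0)  # ~5.3 W of heat

print(round(p_loss_linear, 1), round(p_loss_smps, 1))  # 16.8 vs 5.3
```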
I have seen that the IBM ThinkPad uses 2 cells in parallel x 3 in series, and the iRecharge external battery pack uses the same setup (2 cells in parallel x 3 in series), and then they use some kind of voltage up-converter to raise the voltage to 16, 19, 20, or 21V (depending on the desired output).
That design seems to be quite inefficient, as the external battery pack draws a lot more current (I measured at least 1.9A being pulled out of the battery pack) and the pack gets really hot.
I think this is because the battery pack has to up-convert the voltage from only 3 cells in series and 2 in parallel (10.8V) to roughly double that.
I think that if the design had more cells in series it would be more efficient, and the battery pack would also last longer.
So that is why I was thinking of building a battery pack that would have enough voltage and current capacity and never get more than lukewarm.
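Putting my current argument into numbers (85% converter efficiency and a 16W load assumed purely for illustration):

```python
def battery_current(p_out_w, v_batt, efficiency=0.85):
    """Input current a boost converter pulls: I = P_out / (V_batt * eff)."""
    return p_out_w / (v_batt * efficiency)

p_out = 16.0  # watts, illustrative laptop draw
print(round(battery_current(p_out, 10.8), 2))  # 3 cells in series: ~1.74 A
print(round(battery_current(p_out, 21.6), 2))  # 6 cells in series: ~0.87 A
```

Twice the series cells means half the input current, so roughly a quarter of the resistive heating in the cells and wiring.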
I was thinking of using a linear regulator, but I can now understand the problems after all of the input that I got from people here.
your input is much appreciated.
All true, and the cells' temperatures should be monitored.
In Sony packs there are also series FETs that can open the battery connections in case of overcurrent or a cell that is too low. Sony put a lot of money into the monitor/protection hardware in each Li pack to prevent fires and cell damage. Li batteries are not user-friendly like the Ni cells, which are pretty hardy and forgiving.
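A toy sketch of that protection logic in Python; the thresholds and the set_fets() call are hypothetical placeholders, not Sony's actual firmware:

```python
OVERCURRENT_A = 5.0    # illustrative trip threshold
CELL_MIN_V = 2.5       # illustrative undervoltage cutoff
CELL_MAX_TEMP_C = 60   # illustrative temperature limit

def pack_safe(cell_voltages, cell_temps, pack_current):
    """True only if every cell and the pack current are within limits."""
    if pack_current > OVERCURRENT_A:
        return False                       # overcurrent: open the FETs
    if min(cell_voltages) < CELL_MIN_V:
        return False                       # a cell is too low: disconnect
    if max(cell_temps) > CELL_MAX_TEMP_C:
        return False                       # too hot: disconnect
    return True

# set_fets(pack_safe(cells_v, cells_t, current))  # hypothetical driver call
```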
Yes, the IBM ThinkPad has the same kind of setup: there are some very sophisticated electronics inside the battery, including a main thermal fuse and an overload fuse, all as suggested by the Panasonic Li-ion battery implementation sheet.
Panasonic batteries have internal overcurrent protection as well.
thanks everyone for the lessons!