constant current lithium chargers

Status
Not open for further replies.

slack

New Member
Hi there guys!
I don't do message boards much, but this one has been nagging me for a while:

What's all the hype about constant-current Li-ion chargers? I've just built a simple voltage-regulated power supply using an LM338 and a 4-ohm limiting resistor to see if I could charge a Li-ion cell, and I'm getting good current flow when the battery is discharged (obviously) that slowly falls as the battery charges (great, according to the specs).

My question is: how would a constant-current "phase" improve things, since I'm already supplying 4.2 V and letting the battery take as much as it can? It seems to me that a constant-current regulator would hit the 4.2 V ceiling trying to increase the current and essentially become as good as a fixed 4.2 V rail.

The extensive internet coverage of such chargers makes me think I'm missing something here... can someone give me a physics lesson?

Thank you,
73
 
As I understand it, Li-ion charging is normally done in two phases. The first is the constant-current phase, where the current is limited, and then comes the constant-voltage phase, where the voltage is limited to 4.2 V.

In the constant-current phase, the current is limited to what the battery can accept, or what the charger can supply. It doesn't much matter if the current isn't constant, as long as it doesn't overheat the charger and isn't too much for the battery. What batteries will take varies, but most will happily charge at the 1C rate. If your current isn't constant, charging may take longer, but that is the only problem.

Your 4.2 V has to be very accurate, and you may get a larger number of charge cycles by stopping at 4.1 V.

There are a couple of other rules that commercial chargers apply when needed. If the battery voltage is low, below about 3.2 V, the charge current should be no more than 0.1C. The characteristics of the batteries mean that it takes very little time to come up from 3 V to 3.2 V, even at 0.1C.

Also, you should not charge if the battery temperature is below 5 °C or above 40 °C.
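The rules above can be collected into a small helper. This is a minimal sketch in Python: the 1C default, the 0.1C precharge rate, the ~3.2 V threshold, and the 5-40 °C window come from the post; the function name and everything else are illustrative assumptions.

```python
def charge_current_ma(cell_v, temp_c, capacity_mah, c_rate=1.0):
    """Pick a charge current (mA) following the rules above:
    no charging outside 5-40 C, 0.1C precharge below ~3.2 V,
    otherwise the chosen C rate (1C here)."""
    if temp_c < 5 or temp_c > 40:
        return 0.0                  # temperature out of range: do not charge
    if cell_v < 3.2:
        return 0.1 * capacity_mah   # deeply discharged: gentle 0.1C precharge
    return c_rate * capacity_mah    # normal constant-current phase
```

For a 1200 mAh cell at room temperature this gives 120 mA below 3.2 V and 1200 mA above it.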
 

Hi,

The main reason for the constant-current phase is to limit the current into the battery: there is a limit to how much charge current you should supply to a cell, and it is specified by the manufacturer. Exceeding that limit risks fire or explosion, or at best damages the cell.

The phrase "constant current," however, is sometimes used to mean a current limit, not necessarily a true constant current. The current-limiting circuit often holds the current steady, and on the charging curves it looks constant, but it is really just a limit. The current is limited to some value, say 300 mA, and it may look constant because the circuit is able to hold it there, even though it is allowed to vary; as long as it doesn't exceed 300 mA, it's fine. So, in short, the part of the circuit that limits the current does not need to be super accurate, as long as it stays close to the manufacturer's maximum spec. The actual current could vary from, say, 250 mA to 310 mA (the exact limit depends on the cell, of course), and you will see it drop significantly once the cell gets close to full charge. At some point, as the current drops, charging is terminated, usually at around 5 percent of the maximum current.

The voltage control is also a limit, not a constant, and this is certainly evident when you watch the voltage climb as the cell charges. The voltage limit, however, has to be very accurate. The absolute maximum is 4.25 V, and the cell should really never be allowed to reach that point. Most voltmeters have some tolerance, so a good place to limit the voltage is 4.15 V instead of 4.20 V, just in case the meter is off. The penalty is a little less charge capacity, but the benefit is longer cell life.
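The termination rule described above reduces to a one-line check. A sketch: only the ~5% figure comes from the post; the function name and the alternative cutoffs mentioned in the comment are my own framing.

```python
def cv_phase_done(current_ma, cc_limit_ma, cutoff_fraction=0.05):
    """True once the tapering constant-voltage-phase current has
    fallen to the termination threshold (~5% of the CC limit here;
    some designs use C/10 or C/20 instead)."""
    return current_ma <= cutoff_fraction * cc_limit_ma
```

With a 300 mA limit, charging would end once the current tapers to about 15 mA.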
 
Hey guys, thanks for the answers. It's pretty much as I thought. The confusing part was that a real constant-current supply would increase its voltage to maintain a constant current, and in doing so would exceed the cell's limit (say, by just using an LM317 in constant-current mode without a voltage limit).

As for accurate measurements, I've set up current and voltage monitoring using an Arduino and a Wi-Fi module, sending the data to a GUI app on the desktop so I can plot it and learn more about charging and discharging these things.

Thanks a lot for the info!
73.
 
Hi,

The simplest way to think about it is a power supply set to 4.200 V with a current limit of, say, 300 mA. While the cell is under 4.2 V, the current limit keeps it charging at around 300 mA; once the cell voltage gets close to 4.2 V, the voltage regulation takes over and the current limit no longer does anything.
So at first the voltage limit does nothing while the current limit is in effect, and later the current limit does nothing while the voltage limit is in effect.
The tolerance on the current limit is roughly plus 10 percent and minus 50 percent, while the tolerance on the voltage limit is plus or minus about 1 percent, with the absolute maximum at 4.250 V, the nominal at 4.200 V, and a good setting at about 4.150 V.
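That CC/CV handoff can be modeled in a few lines. A toy sketch: the 4.200 V setpoint and 300 mA limit come from the post, while the series resistance value and function name are arbitrary assumptions for illustration.

```python
def charger_output(v_cell, v_set=4.200, i_limit_a=0.300, r_series=0.15):
    """Model one cell on a CC/CV bench supply: the voltage loop tries
    to drive (v_set - v_cell) / r_series amps; whichever limit binds
    is the active one. Returns (current_a, active_mode)."""
    i = (v_set - v_cell) / r_series     # current the voltage loop would drive
    if i >= i_limit_a:
        return i_limit_a, "CC"          # current limit binding: constant-current phase
    return max(i, 0.0), "CV"            # voltage limit binding: constant-voltage phase
```

A half-charged cell around 3.7 V sits in CC at the full 300 mA; near 4.19 V the same supply is in CV, delivering only the tapering remainder.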
 
Diver, that's what I've read as well: constant current up to 4.2 volts, then constant voltage until the charge current drops to about 2% of the initial charge current. It's important to note, though, that from what I've read, the cell is only at about an 80% state of charge when the first charging stage completes, so you spend quite a bit of extra time getting that last 20%. So-called fast lithium chargers simply skip the second stage completely, charging at 1C or better until 4.2 volts is reached and then cutting off. This loses about 20% of the cell's capacity but increases cell life quite a bit, as that last 20% is where the chemistry is at its limit for holding charge.

MrAl, where did you pluck the magical 300 mA number from? Battery charging current is always related to cell capacity, and none has been given. That's why charge/discharge currents are always given as a whole or fractional multiple of the Ah capacity of the cell.

I've personally hand-charged many lithium cells from a simple constant-current source using an op-amp and a MOSFET with a sense resistor. I'd feel comfortable using a 1C charge rate while simply monitoring the voltage; however, because the failure mode of a lithium cell tends to be a large orange-reddish ball of fire, monitoring the cell temperature with a thermistor is a very basic step toward making a safe charger. When hand-charging lithium cells, I check them with a thermal gun, and a simple touch, to make sure nothing is going on.

For the better lithium cells on the market today, as long as the temperature is monitored, charge rates can be as high as 5C for fast charging; obviously, because of the heat (especially if not monitored), this can stress the cell chemistry. One thing not mentioned so far is the low-voltage cutoff. This is as important as the charging specs, because a cell left between 3 and 3.2 volts and not charged immediately after discharge will passivate its chemistry. The best storage charge for LiPo cells, as far as I can find, is 40% SOC in a chilled (not frozen) environment.
 
Hello,

The 300 mA is just for reference. That's for an AA-size Li-ion cell that I happen to have. My bigger cell takes 1 amp.
The 300 mA is so I can say 310 mA is maybe the max, that's all.
 
MrAl, AA-sized cells are, from what I can find, 800-900 mAh, safe to charge at 800-900 mA. 300 mA is approximately 1/3C; that's a conservative charge rate, but probably best if temperature isn't being monitored during the charge. Again, though, the charge/discharge current ratings for a secondary cell are always listed as a fraction or multiple of the cell's capacity.
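The C-rate bookkeeping used throughout this thread is simple enough to write down. A sketch (the helper names are mine; the 1/3C example is from the post above):

```python
def current_ma_for(c_rate, capacity_mah):
    """1C delivers the rated capacity in one hour, so a given
    C rate corresponds to c_rate * capacity mA."""
    return c_rate * capacity_mah

def c_rate_of(current_ma, capacity_mah):
    """Inverse: express a charge current as a multiple of capacity."""
    return current_ma / capacity_mah
```

300 mA into a 900 mAh AA-size cell is 300/900 = 1/3C, matching the conservative rate mentioned above.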
 
Hi,

Well, then none of those cells are the same as my cell, for which the manufacturer states a maximum charge rate of 330 mA.

As I was saying, though, that 300 mA was just so I didn't have to constantly reference every cell capacity when talking about maximum charge rates, cutoff rates, and so on.
 
Hello,
So I've made a little fast charger using that LM338. It's current-limited to about 1.2 A, and the voltage is set to 4.18 V. I'm monitoring battery voltage and current draw with an Arduino and sending the data wirelessly to my app. So far I've tested a couple of 16340 1200 mAh Li-ion cells and it looks pretty good... they charge from 2.9 to 4.1 V in about 20 minutes, plus 10 more to get the current draw under 150 mA, which is where I stop. There is really no heat generated, and they take about 40 minutes to discharge to 3 V spinning an electric drill motor.

73
 
Are you sure about your numbers? They don't make sense: to charge a 1200 mAh battery in an hour you'd need 1.2 A, and 30 minutes implies more like a 2.4 A charge current.
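The arithmetic behind this sanity check, as a sketch (the function name is mine; note this lower bound ignores the CV taper, which only makes real charging slower):

```python
def min_charge_hours(capacity_mah, current_ma):
    """Idealized lower bound on charge time: you cannot deliver
    capacity_mah any faster than capacity / current hours."""
    return capacity_mah / current_ma
```

1200 mAh at 1.2 A needs at least an hour; finishing in about 30 minutes would imply roughly 2.4 A, as the post says.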
 
Sorry about that, I mixed up my battery capacity and my current limit... the current limit is actually at 2 amps, and I haven't fully charged one yet; I've been stopping when the current drops below 200 mA. I consider this level of charge quite acceptable, although I might experiment with raising the voltage limit to about 4.24 V and letting it saturate down to about a 50 mA draw.
 
According to Wikipedia, 4.235 volts is the maximum tolerable cell voltage, and that's probably at room temperature; it is not a safe charging voltage. I'm assuming, since you're charging these cells yourself, that they're bare cells without protection circuitry. Cells charged to the peak of their chemical holding ability and then discharged too quickly will burst into flames. Modern devices that use lithium cells have built-in current/thermal limiters that appear as small strips on the side of the cell; if yours don't have this, make sure you have a fire extinguisher nearby for any further testing. For practical devices, 4.1 V is a common limit. Practical charging considerations are best left up to the cell maker.
 
Li-ion polymer batteries should not be charged at a rate greater than about 0.7C. A higher rate will cause too much cell heating and reduce longevity. It also will not reduce the charge time, because the cell will hit 4.2 V sooner but at a lower state of charge, requiring more time in the constant-voltage phase.

It takes about 2.5 hours to fully charge a Li-ion battery that has been totally discharged; there is just no way to speed it up. So-called quick charging just does a partial charge, to about 70-80%. Using a slightly larger-capacity battery that is only charged to 80% is perfectly acceptable.

Most cellphones charge at a rate between 0.5 and 0.6C.
 
Hello again,


I know there has been some discrepancy in the maximum charge voltage for Li-ion cells across various web sites, so just to note: I have been using 4.150 V for some ten years now, and even go as high as 4.180 V, but that higher voltage was checked with an accurate 50,000-count voltmeter that had been verified against other quite expensive meters around the time the measurements were made. I often terminate at a current of 1/20 of the cell capacity, in milliamps. I always monitor the current and voltage during the charging process, just to keep an eye on things, and I keep the charger in the same room I'm in. I use a specially designed dual-measurement meter (amps and volts) that can be read from across the room, so I can glance up at it periodically while doing other things on the computer.
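Putting the thread's numbers together, here is a toy end-to-end CC/CV simulation. The 4.15 V setpoint and C/20 termination come from this post; the linear open-circuit-voltage model, the internal resistance value, and the time step are crude assumptions for illustration only, not a design tool.

```python
def simulate_charge(capacity_mah=1200, v_start=3.0, v_set=4.15,
                    i_limit_ma=1200, r_internal_ohm=0.08, dt_h=0.01):
    """Simulate CC/CV charging of a cell modeled as a linear OCV
    source behind an internal resistance. Terminates at C/20 taper.
    Returns (elapsed_hours, final_open_circuit_volts)."""
    v_oc, charged_mah, t = v_start, 0.0, 0.0
    cutoff_ma = capacity_mah / 20          # C/20 termination current
    while True:
        # supply current: CV loop limited by internal resistance, capped by the CC limit
        i_ma = min(i_limit_ma, 1000.0 * (v_set - v_oc) / r_internal_ohm)
        if i_ma <= cutoff_ma:
            return t, v_oc
        charged_mah += i_ma * dt_h
        # crude OCV model: rises linearly from v_start (empty) to v_set (full)
        v_oc = v_start + (v_set - v_start) * min(charged_mah / capacity_mah, 1.0)
        t += dt_h
```

With the defaults, most of the capacity goes in during the constant-current phase and the taper to C/20 adds the remainder; real cells have strongly nonlinear OCV curves, so treat the numbers as qualitative.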
 
RCinFLA, that's only true for earlier lithium chemistry; modern lithium cells are significantly more durable with respect to charge/discharge currents. Heating should never be an issue during lithium charging because the cells are 97-99% charge efficient; if they're heating up during charging, something is wrong, which is why temperature needs to be monitored. I'd challenge that 2.5-hour limit as being an artifact of charger design: you can most definitely fully charge a lithium cell in less than 2.5 hours, safely and without damaging the cell. If it's a series lithium pack, though, all bets are off, because cell balancing is very important for lithium cells; most modern consumer devices, however, use only a single cell.
 