Homemade battery charger - low current cutoff


eimix

Hi

I made a charger for 6-cell and 8-cell packs (NiMH, 1.2V each). I used a power drill battery charger and added voltage regulators to it - a 9V 1.5A regulator for the 6-cell pack, and a 12V 0.1A regulator for the 8-cell pack.

Everything works fine:
at the beginning of charging the regulators work like a constant current source, and at the end they behave like a constant voltage source. The current drops very low - something like trickle charging - with no battery overheating or overcharge.

The problem is that the only indication that charging is finished is that the voltage regulators cool down - meaning low current is flowing through them.

Is there any other way to detect the end of charging in my case?
Some sort of current threshold (circuit cutoff at low current)?
Or an LED indication when very low current is flowing (<10mA or so)?

I tried googling, but I can't find anything like this; probably I'm not searching right.

Please help,
Thanks
Eimis
 
Hi there,

Actually, that is not the correct way to charge NiMH cells.
I think you should look into this a bit further before the cells get
damaged.
Lead Acid cells and Li-ion cells charge on constant current/constant voltage,
but NiMH cells need a different way of detecting the end of charge, typically
what is called the 'minus delta V' technique.


If you want to charge with something simpler than that then you would
have to estimate the charge taken from the cells and try to approximately
charge for the amount of time it would take to replace that charge, but
no longer. For this all you would need is a wall wart and a current limiting
resistor, as you would be charging with current alone.
Another problem with charging with a constant voltage regulated source is
that the cells' characteristic voltage changes over time and the regulator
has no way of knowing what it is at any given time.
Charging with a fixed current, you can roughly replace the charge after
some usage by knowing the current that your device draws from the cells
and how much current your wall wart or charger puts out, on average.

For example, if your device draws 1 amp and you run it for 1 hour, you need
to charge with a 1 amp charger for 1.4 hours, or a 500mA charger for 2.8 hours,
or a 250mA charger for 5.6 hours. This is really the only way to do this without
a more sophisticated charger circuit that can detect the true end of charge.
There's also no need to regulate voltage, unless you want to set it for some
high extreme like 1.8V per cell, or 2V per cell, just for a sanity check.
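
As a quick sketch of that arithmetic (plain C, example values only - not part of any actual charger):

[code]
/* Rough NiMH recharge-time estimate: put back about 1.4x the
   charge you took out. Numbers match the example above. */
#include <stdio.h>

int main(void)
{
    double load_A    = 1.0;   /* current the device draws */
    double run_h     = 1.0;   /* how long it ran          */
    double charger_A = 0.5;   /* charger output current   */

    double hours = 1.4 * load_A * run_h / charger_A;
    printf("Charge for about %.1f hours\n", hours);   /* 2.8 h */
    return 0;
}
[/code]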
 
Thanks for the reply,
but I'm getting more and more confused :) I like building things, and I don't always like redoing them :D

The original charger is a 29-30V, 2.5A charger, and it was working with the same cells (15 cells in series). The charger used a thermal sensor in the battery and stopped charging with a relay cutoff.
If I put the same charger on the 6-cell pack, the charger's 5A fuse blows, so that's why I used the voltage regulators (78L12 and L7809). They decrease the voltage and current.
As a result, the regulated voltage ensures that the voltage per cell is no more than 1.5V.

Why is my solution not going to work?
According to my calculations, a 9V charge voltage would result in 1.5V per cell (for the 6-cell pack), and I measured the current at the end of charging - it is very low (<40mA for the 3000mAh 6-cell pack) - so what kind of damage could it do to the batteries?
Is 1.5V too low, or what?

Timing doesn't look like an option - the batteries are not always discharged completely, and so on.
 
Hi,

it is actually easy.

You MUST have a constant current source and at least 1.4V per cell that you are charging. Obviously the cells must be in series.

To calculate the charge current you simply take 10% of the battery's rated capacity. So 6 cells of 3000mAh each will take a charge of 300mA, which should be applied for 14 hours. This is the safest by far. You could charge them at 800mA for 7 hours but would need to check the temperature.
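
In code form (a tiny sketch of the same numbers, nothing more):

[code]
/* The C/10 rule of thumb: charge at 10% of rated capacity for
   about 14 hours (a ~1.4x overcharge factor). Example values. */
#include <stdio.h>

int main(void)
{
    double capacity_mAh = 3000.0;                          /* rated  */
    double charge_mA    = capacity_mAh / 10.0;             /* 300 mA */
    double hours        = 1.4 * capacity_mAh / charge_mA;  /* 14 h   */

    printf("Charge at %.0f mA for %.0f hours\n", charge_mA, hours);
    return 0;
}
[/code]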

The reason for the thermal cutout is that heat damages the batteries and on some rapid charges high currents are applied.

As MrAL said, it is CONSTANT CURRENT, the current must not reduce at the end of the cycle like a lead acid battery.

This is exactly what you want and very simple; you do need to buy the chip with the source code.

https://www.angelfire.com/electronic/hayles/charge1.html

Andrew
 
Originally the charger was charging at 2.5A (1.5 hours for a full charge), so the upper current limit is not the problem.

Why could the current not be reduced at the end of charging? Any reason for that?

As nothing is overcharged or overheated right now, I'm satisfied with what I have.

The problem is (as in the first post) detecting the end - using an LED which turns on when the current is very low. Some kind of circuit should do this, but I don't even know what kind. Or a total circuit cutoff when only a small current is flowing.
 
Hi,

I did not read your post correctly, apologies.

If I understand you correctly, all you want is some indication to show that the battery is charged, and you want to use the change in current to do so? Well, I don't know how to do that, but what I have on all the chargers I have built is a cheap 0-5A analogue ammeter. The rapid drop in current from 2.5A down to 40mA will appear as a near-zero reading on the meter and will indicate fully charged. Simple and cheap.

Andrew
 
Quite a good solution for indication :)
Thanks

But if anybody has ideas about an automatic cutoff with a relay or something - please let me know ;)
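
Something like this is what I have in mind - sense the current with a shunt, and open a relay once it has stayed low for a while (a rough, self-contained sketch; the tapering current is faked and all values are made up):

[code]
/* Low-current cutoff idea: when the charge current stays below a
   threshold for long enough, open the relay. A fake decaying
   current stands in for a real shunt + ADC. */
#include <stdio.h>

#define CUTOFF_MA    10     /* treat as "charged" below this */
#define HOLD_MINUTES 10     /* must stay low this long       */

/* stand-in for a shunt reading: current tapers as the cells fill */
static int fake_current_mA(int minute)
{
    int mA = 1500 - minute * 20;
    return mA > 5 ? mA : 5;
}

int main(void)
{
    int low_for = 0;
    for (int minute = 0; ; minute++) {
        if (fake_current_mA(minute) < CUTOFF_MA)
            low_for++;              /* consecutive low readings */
        else
            low_for = 0;
        if (low_for >= HOLD_MINUTES) {
            printf("Low for %d min at t=%d min: open relay\n",
                   low_for, minute);
            break;
        }
    }
    return 0;
}
[/code]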
 
Hi again eimix,

I am trying to tell you that this is not the right way to charge NiMH cells.
It doesn't matter what the old charger did; the manufacturer doesn't care
that much how fast your cells get killed.

There are only two basic ways to charge these type of cells correctly...

1. End of charge detection scheme (somewhat complex)
2. Timed charge (simple, fairly easy to set up)

With the voltage detection circuit you are using, 1.5V per cell isn't too bad,
but eventually that might not work anymore for the same cells. That's why
we need constant current.

Anyway, if you want to stick with your regulator scheme then you can at
least do a test to see if it is even working at the present time.
Discharge the cells, then hook up your charger and measure the current
and log the time. If you do this for every minute you will get accurate
results, but that's hard to do unless you have a data logger. You may
wish to do this every 10 minutes instead.
As the cells charge you will log time and current, then as the current drops
and you end the charge you then add up all the current readings and
using the time values you calculate the ampere-hour input to the cells.
This could come out to 1.4 times 3000mAh for your cells, which is 4.2 Ah.
If you indeed calculate this value (or close to it) then your cells are
fully charged, but if not then they are undercharged. This test will
give you a good idea if it is working or not, at least for now. You
could repeat this test in the future to see if it is still working properly
too.
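
If you log a reading every 10 minutes, the ampere-hour sum is just this
(a rough sketch; the current readings here are invented):

[code]
/* Integrate logged charge current into ampere-hours.
   One reading every 10 minutes (1/6 hour). Example data only. */
#include <stdio.h>

int main(void)
{
    double readings_A[] = {1.5, 1.4, 1.3, 1.1, 0.8, 0.3, 0.05};
    int n = sizeof(readings_A) / sizeof(readings_A[0]);
    double interval_h = 10.0 / 60.0;
    double total_Ah = 0.0;

    for (int i = 0; i < n; i++)
        total_Ah += readings_A[i] * interval_h;

    printf("Charge in: %.2f Ah (target ~1.4 x 3.0 = 4.2 Ah)\n",
           total_Ah);
    return 0;
}
[/code]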

The timed method means you put the cells on charge for a given time
period, depending on how long you use the cells in a device.
 
MrAl

I think we have both missed the point to some degree.

He did say that he took an existing working drill charger and added some voltage regulation to it. Therefore I assumed that it was probably for NiCd or NiMH to start off with. It is already a constant current charger. I also think the design may well be such that when the battery is at full charge the current automatically shuts off, with a small residual current floating around to take care of self-discharge while the drill was in the charger.

All he wants is for an LED to go on when the charged state is reached.

Whatcha think?

Andrew

PS: eimix, can you confirm that this was a NiCd/NiMH charger to start with?
 
Yes, it was a NiCd/NiMH charger.

Even the cells I used are from the same drill battery pack
(there were 15 of them, and they are good - rated at 3000mAh, and in reality it seems they still have high capacity).
I disassembled it, made a 6-pack, and added a thermal sensor (some thermistor was in the original battery).
 
Anyway, if you want to stick with your regulator scheme then you can at
least do a test to see if it is even working at the present time.
Discharge the cells, then hook up your charger and measure the current
and log the time. If you do this for every minute you will get accurate
results, but that's hard to do unless you have a data logger. You may
wish to do this every 10 minutes instead.
As the cells charge you will log time and current, then as the current drops
and you end the charge you then add up all the current readings and
using the time values you calculate the ampere-hour input to the cells.
This could come out to 1.4 times 3000mAh for your cells, which is 4.2 Ah.
If you indeed calculate this value (or close to it) then your cells are
fully charged, but if not then they are undercharged. This test will
give you a good idea if it is working or not, at least for now. You
could repeat this test in the future to see if it is still working properly
too.

I will do this as soon as I get the batteries completely discharged.
 
Hi again,

Andrew:
Well, I noticed that he was using voltage regulation, but it is sort of a
voltage limit type circuit, so it will work only as long as the cells can
be fully charged before 1.5V per cell is reached. Once the
internal R increases they will no longer fully charge. That was my
main concern. Yes, I guess we should be thinking of ways to detect
end of charge too.

eimix:
OK, that's good; just keep in mind that you may have to raise that
voltage in the future when the cells age a bit more.
What kind of circuit are you comfortable with - can you use
LM339s or would you only want to use small transistors?
(This is for the end-of-charge detection.)


Perhaps this is a better explanation...

If you limit the voltage across a cell, yes, the current will
eventually decrease, from 2 amps down to 50mA or so,
or even lower, but that doesn't mean the cells are fully
charged. This is because the cells do not have a repeatable
voltage characteristic, unlike lead acid or Li-ion cells, which do.
This is why there are many chips designed just for charging
NiMH cells, and the way these chips work is totally different
from an 'ending voltage' detection scheme.

Here is a short list of charging schemes used with different
battery types:

1. Voltage detection with current limit: lead acid, Li-ion
2. Minus delta V: NiMH, NiCd
3. Minus delta T: NiMH, NiCd
4. Max T: NiMH, NiCd
5. Timed: NiMH, NiCd

Max T (temperature) is often used as a backup to other
types of charge regimes also.

Timed, which is max time, is also often used as a backup
to other charge techniques, just as a fail-safe in case
something else goes wrong with the charge process.
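
To make the minus delta V idea concrete, here is a simplified,
self-contained sketch (the voltage curve is faked so it runs anywhere;
the 5mV-per-cell figure is a typical value, not from any datasheet):

[code]
/* Minus-delta-V end-of-charge detection, simplified.
   Track the peak pack voltage; terminate when the voltage sags
   a few mV per cell below that peak. */
#include <stdio.h>

#define CELLS   6
#define DROP_MV (5 * CELLS)   /* ~5 mV/cell sag = end of charge */

/* stand-in for an ADC read: rises, peaks, then sags slightly */
static int fake_pack_mV(int minute)
{
    if (minute < 60) return 8400 + minute * 10;   /* rising  */
    return 9000 - (minute - 60) * 2;              /* sagging */
}

int main(void)
{
    int peak_mV = 0;
    for (int minute = 0; ; minute++) {
        int v = fake_pack_mV(minute);
        if (v > peak_mV) {
            peak_mV = v;                    /* new peak */
        } else if (peak_mV - v >= DROP_MV) {
            printf("-dV of %d mV at %d min: stop charging\n",
                   peak_mV - v, minute);
            break;
        }
    }
    return 0;
}
[/code]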
 
Thanks for the detailed explanation :)

At my tech level I prefer ICs with no more than 3 pins :D
(anyway, I did build a circuit with a 555 timer successfully)

But you are going to win :) my way is not the best way to charge batteries.
I looked into the charger more deeply; it has a relay which cuts off the circuit if Tmax is reached (a 45-55°C sensor in the battery). That sensor has 1-2 ohm resistance at normal temperature, and >200k ohm when that temperature is reached.

I have already included that sensor in my battery, so it would be good to use it.
But as the charger is 9V max, overcharge is not possible.

If I use a 12-15V 2.5A constant current regulator - would this warm up the batteries, and would the sensor do its job?

But I still need to charge the 8-cell pack, which has no sensor in it. I could put one in, but I cannot find a suitable one (in local stores only >60°C is available). Could a thermistor be a solution?
 
Hi again,

You might be able to use a thermistor with a little circuit to detect
high temperature and shut off the charger; that would be good, I think.

If you were charging at a lower current, like say 300mA, that would be
good too because the cells can stand some overcharge at that level.
At 2.5 amps you have to be very careful or you could damage the
cells quickly. You'd have to buy all new cells again if that happened.
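
To get a feel for the thermistor divider numbers (all values here are assumptions - check your part's resistance curve):

[code]
/* Thermistor divider trip-point arithmetic, example values only.
   NTC on top, fixed resistor on the bottom: the divider output
   rises as the pack heats, and a comparator trips at a threshold. */
#include <stdio.h>

int main(void)
{
    double vcc     = 5.0;
    double r_fixed = 4700.0;    /* bottom resistor      */
    double r_cold  = 10000.0;   /* assumed NTC at 25 C  */
    double r_hot   = 4000.0;    /* assumed NTC at ~50 C */

    double v_cold = vcc * r_fixed / (r_fixed + r_cold);
    double v_hot  = vcc * r_fixed / (r_fixed + r_hot);

    printf("Divider out: %.2f V cold, %.2f V hot\n", v_cold, v_hot);
    printf("Put the comparator reference between them, e.g. %.2f V\n",
           (v_cold + v_hot) / 2.0);
    return 0;
}
[/code]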
 
Hi,

Here's a tried and tested charger I have built and used:
http://www.stefanv.com/electronics/usb_charger.html

It's USB powered (500mA limit), but you can change that. In fact, I think you could modify it to take more current, but you would have to have a thermal sensor for each battery, or in a place that can detect the rise in temperature of all the batteries.

The delta-V technique mentioned by someone else is really tricky with NiMH cells, as it's literally about 5mV per cell... if you consider that every circuit has noise, which can easily reach 10mV, you can see why it's a bit of a sod to detect the drop in voltage... so I think it's mandatory to have a thermal cut-off, regardless of what method you use. You could even use that thermal cut-off as the primary charge termination.
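
The usual software trick for pulling a ~5mV drop out of ~10mV noise is to average a lot of samples before comparing (a sketch; the 64-sample count and the simulated noise are just examples):

[code]
/* Averaging ADC samples: averaging N readings cuts random noise
   by roughly sqrt(N), so 64 samples gives about an 8x reduction. */
#include <stdio.h>
#include <stdlib.h>

#define N_SAMPLES 64

/* stand-in for a noisy ADC read around a true value */
static double noisy_read_mV(double true_mV)
{
    double noise = ((double)rand() / RAND_MAX - 0.5) * 20.0; /* +/-10 mV */
    return true_mV + noise;
}

static double averaged_read_mV(double true_mV)
{
    double sum = 0.0;
    for (int i = 0; i < N_SAMPLES; i++)
        sum += noisy_read_mV(true_mV);
    return sum / N_SAMPLES;
}

int main(void)
{
    printf("single: %.1f mV, averaged: %.1f mV (true 1400.0)\n",
           noisy_read_mV(1400.0), averaged_read_mV(1400.0));
    return 0;
}
[/code]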

The problem with fast chargers is the higher current... because most circuits aren't that efficient, the more current passing through them, the more power they dissipate as heat, and therefore the bigger they get. I'm still trying to design a 'good' NiMH charger with a microcontroller and switched-mode power supply (so it can be small) and it ain't easy.

That said, the comparator circuit given in the link should give an adequate cut-off to stop charging. If you replace the output transistor with a better constant current source (at the moment its current is limited by the transistor and comparator) you could 'beef' it up a bit.

Blueteeth
 
Never been a problem with my desktop or laptop... they'll kick out 450mA happily without the data lines connected. I think it's one of those bits of the protocol that isn't always fully implemented. But you are right of course; technically it should limit the current to 100mA.

Blueteeth
 
I have a question. I want to build or put together a rechargeable battery pack from scratch that I can recharge via USB (5V), a hand dynamo, or solar panels. What kind of battery can I use for this? I want it to be very small. The real question is how do I charge it safely, without burning out the battery or causing it to leak. I have been able to pull this off, but I have to disconnect the battery and keep a close watch. What do I use so that when the battery is fully charged, it stops taking in a charge? Also, is it safe to charge it with different voltages, as far as the hand dynamo and solar panels are concerned?
 