External battery charger circuit


Blueteeth

Well-Known Member
Hi,

Just thought I'd pick people's brains about a little problem I have.

I have designed a battery-powered system for a customer. It has great battery life, but to save them money on alkalines, they want to use rechargeables. That's fine, since the circuit uses an LDO regulator for 3.3V, so almost any battery type can be used (3 x AAs).

However, since the system is sealed for an industrial environment, they don't want to open up the box to take the batteries out for recharging, so most stock off-the-shelf jobs are out. Therefore, I am left with two options:

1. Retrofit a small charging circuit inside the box. Would take in power from a plug-in DC power supply (simple regulated job) via a DC jack, control the charge, and provide indication of 'fully charged' via LED.

2. Somehow provide an external circuit (also powered by a wallwart-type thing) that can charge the batteries inside. Less crap to go into the box, and fewer circuits to build, as they only need two chargers for 8 units. This could have a 'charged' indication either inside the box (signal provided by the external charger, LED inside the box) or on the plug itself, as I've modified DC plugs before to include tricolour LEDs.

Now, as you can tell, I'm not overly keen on option one - it's more work and more hassle - but I do realise that charger circuits are generally 'in-system', since they need to closely monitor the batteries' voltage/temperature.

I don't have much experience with building battery chargers from scratch, and this has to be something that does not require batteries to be put into it, so I've come here. I believe they wish to use Ni-MH, as opposed to Li-ion, and charging time is not that urgent, preferably under 8 hours.

Any suggestions/experiences/links/advice are welcome.

Blueteeth
 
I see two problems with using rechargeables:

1) They are only 1.2V, not 1.5V - so it's a bit tight for the LDO, drastically reducing battery life.

2) As alkalines have a "great battery life", they are likely to be disappointed with how often they have to recharge. NiCd and NiMH self-discharge over a fairly short time compared to alkaline.
 
Sure, but it's not my decision whether or not to use them. They are using these units 24 hours a day, 7 days a week, with a planned life of 4 years, so alkalines aren't really an option for them. That said, I managed to get the current consumption down to 0.6mA (sleep modes and tight timing of operation), so the battery life should be way up there. Unfortunately, the customer's workers insist on destroying the cables that come out of it: frayed wires = contact = shorted power supply. I originally used a switch-mode power supply, but removed it for that reason - it simply blew its diode.

I am kind of stuck here, since despite voicing my concerns and offering my professional advice, it is rarely heeded. I simply 'do as they wish' :/
 
I would suggest then, as a first step, modifying them to take 4 NiMH - with only three it's going to be really bad!
 
Possible option

I'm new to programming but have been working with RC electronics for a while. My opinion is that you should reconsider using LiPo - in an enclosure they can be safe. Here is a link to a very simple, very small LiPo charger that you don't need to babysit (I have left it on overnight many times), takes less than 2 hours to charge (less if you change some resistors), and costs less than $2 to make.
It sounds like they are using the product remotely and then charging it later. Is that correct? This charger could be plugged into most laptop power supplies or any 12V (up to about 20V) DC source. https://www.shdesigns.org/lionchg.html
He shows how to adapt it to many sizes and numbers of cells. I hope that helps a little.
 
I would highly recommend using "alkaline rechargeable batteries". All other rechargeable batteries have a very short shelf life. I had a real problem with my camera: I would charge my batteries ahead of time so I had them when I needed them, and less than a week later all of them were dead, even though I hadn't used them yet. Alkaline rechargeables have a 5-year shelf life.

Get the new "Pure Energy XL" brand - check out their website, very interesting. I now think all other rechargeable batteries are just money grabbers. Also remember these are fully charged when you buy them, at 1.5 volts. My camera is working great now, even after sitting for weeks.

If you have some electronics experience, you could get one of their battery chargers and just run wires from it to your device; this would be easier than designing something from scratch. The charger comes with 4 AA and 4 AAA for about $20 at WalMart. It even tells you when the batteries need to be replaced. Alkaline rechargeables also like to be recharged: the more often you charge them, the longer they last, unlike other rechargeables.
 
What's the dropout voltage of your LDO?

What's the minimum voltage which your load will work down to?

What capacity cells are you using?

Is charging time important?

A simple diode and resistor can be used to charge the battery in 12 to 16 hours if you're not bothered about speed. Fast chargers are more critical, but you can buy specialised ICs for the purpose.
 
Hero.

Dropout voltage of the LDO is approx 60mV at this current draw.
Since the LDO reg has pass-through, my circuit will work down to 3 volts. In fact, the cut-off limit should be about 3.1V - that's when it starts to malfunction - but it will keep settings/EEPROM down to 2V.

Capacity cells? I don't know; I do not have complete autonomy over the situation (come on guys, I'm a contractor, we all know what customers are like). Although I recommended Li-ion to them, they may have got NiMH because they read somewhere that it's 'better' (which, for this app, isn't true at all).

As I stated before, charging time is not critical, although over 8 hours would probably be an issue for them.

Perhaps we could go back to the original question: is it feasible/acceptable for the charging circuit to be separate from the unit? That is: wall adapter -> charging circuit -> 2-3m of cable -> DC socket -> direct to batteries. So at the battery end it would require just a connector to the terminals (of whatever pack we use, 4 x 1.2V AA or whatever), with all the gubbins in a small separate box.

I have a relatively basic understanding of charging batteries - constant current, pulsed, etc. - and for my own projects I would just knock up a little constant-current source with a voltage comparator, or even a little 8-pin PIC. But these guys are tough, and a lot hinges on its success; I was asking because of my lack of experience with charging.

About the battery types: sure, the output voltage will sag, and although NiMH/Li-ion hold their voltage better under heavy discharge currents, this is essentially a low-current app, with sleep mode at 0.4mA and peaks of 35mA for 100ms. I have heard about alkaline rechargeables, but here's the rub: I am not the one deciding which battery types to use! They simply think 'rechargeable' and get whatever floats their boat, so I am preparing myself to design chargers for almost any battery chemistry... at least when they do get delivered, I can pick the one that suits best.

Trust me guys, if it was up to me, it would have been done ages ago.

abbarue, you're right about the self-discharge curves: NiMH have a higher self-discharge than Li-ion, whereas alkalines have a wonderful shelf life.

Perhaps I was overthinking this. I have read a fair bit about Maxim's and Linear's chargers, both of which go into incredible detail; maybe a simple constant-current charger, with a MOSFET and a comparator, would do?

Thank you for all your replies, it's all going in here *taps head*

Blueteeth
 
Energizer and Panasonic recommend switching to a low trickle current when their NiMH cells are fully charged. They recommend detecting full charge by the slight drop in voltage, with temperature and a timer as backups in case the voltage sensor fails.
 
Hi again,

Thanks audioguru, and everyone who has posted.

Right, well, a bit more info now to make things easier. They have 2500mAh AA NiMH cells. From reading the datasheet for these particular batteries, the discharge curve for low-current apps is quite impressive, dropping to 1.05V per cell at 'the knee', where they start to discharge rapidly (end of life for the 3-cell pack = 3.15V). This means I can still use 3 x AAs (3.6V) for the app; as I mentioned before, the LDO regulator has 'pass-through', so it doesn't just die when the input drops below 3.3V. Obviously 4 AAs would be better, but we're limited on space, and they wish to keep everything the same (it's a retro-fit).

Audioguru, thank you for confirming this. I have done a lot of research, but I really wanted to hear that from someone else.
It seems I will have to put a circuit in with the batteries, with the external 'charging jack' just fed from a DC PSU. Not ideal, but at least then I can keep an eye on temperature with a thermistor (as a backup).

I found this rather wonderful simple schem on google:
http://www.stefanv.com/electronics/usb_charger.html

It has a great description of operation, but would need modification for charging 3 x AAs. I am dubious about using temperature alone to switch the charge current down to trickle, but I don't want a fully-fledged PCB with an expensive IC dedicated to this purpose... then again, I don't want the batteries to die after 4 charges.

I really like the idea of using comparators, though... as I also need a 'low batt' warning light, a quad comparator would have a couple spare for that job.

Detecting the 'slight drop' using comparators could be tricky, and may involve something like a peak-hold circuit (charging a cap via a diode) used as a reference, with slight hysteresis, so that when the battery voltage starts to drop, the circuit remembers the peak, sees the difference, switches the charger down to trickle, and lights an LED.

Are there any more schematics like the above one on the web? I'm googling my heart out here, but it's mostly app notes from Maxim, Linear and National Semi :/

Thanks guys,


Blueteeth
 
A bit off-topic here, but have you considered inductive coupling for the charging side of things?

If the units are used and charged extensively, then the wear and tear on the charger jacks/plugs/leads might make this a more viable way to go. The user just has to sit the unit on the charging plate, and when they are ready to use it, just pick it up.

A lot of rechargeable toothbrushes use this method of charging.
 
You didn't say how important the charge time is.

If you can wait 12 to 16 hours, then just use a resistor, diode and mains adaptor to charge it at 25mA.

If you go for the fast-charger option, don't forget you're looking for a voltage knee: the voltage will rise to something like 1.6V per cell, then start to fall, so a bare comparator will switch on, then off, then back on again. The easiest way to build a battery charger that detects this is to use a microcontroller, which also lets you incorporate other functions like a timer while keeping the component count low.
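The comparator-chatter problem described above is exactly where a microcontroller helps: firmware can latch the running peak and only terminate once the pack has fallen a fixed amount below it, with a timer as backup. A minimal sketch of that -dV logic, with all names and thresholds invented for illustration (not from any vendor library):

```c
#include <stdint.h>
#include <stdbool.h>

#define DV_TERMINATE_MV 15U   /* fall below peak that ends fast charge */
#define MAX_SAMPLES     5760U /* safety timer, e.g. 8 h at 5 s/sample  */

static uint16_t peak_mv;  /* highest pack voltage seen so far */
static uint32_t samples;  /* readings taken so far            */

/* Call once per ADC sample with the pack voltage in millivolts.
 * Returns true when fast charge should stop (knee found or timeout). */
bool charge_should_terminate(uint16_t pack_mv)
{
    if (pack_mv > peak_mv)
        peak_mv = pack_mv;          /* track the running peak */
    if (++samples >= MAX_SAMPLES)
        return true;                /* backup timer expired */
    /* Knee detected: voltage has fallen DV_TERMINATE_MV below peak. */
    return peak_mv >= DV_TERMINATE_MV
        && pack_mv <= (uint16_t)(peak_mv - DV_TERMINATE_MV);
}
```

Because the peak is latched, a reading that dips and bounces back never re-arms the way a bare comparator would; the charger stops the first time the drop threshold is crossed.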
 
picbits,

Interesting idea, but for industrial use I think they are after a tried and tested method with off-the-shelf parts. If I had more time I would look into it, as it would be a novelty for them. Of course parts will wear over time, but that's what 'spares' contracts are for.

Hero, as I mentioned, charge time is not overly critical, but preferably under 8 hours. Therefore, I should probably use more than just a current source, though we're not talking a 2-hour charge time here.

Apparently NiMH do indeed have a 'knee', but according to several datasheets from chip manufacturers (I'll get back to you with a link), they must be switched to trickle current (or have the charging current turned off altogether) when the cell voltage is at its peak, whereas NiCds should be terminated roughly 45mV down from the peak. This makes things trickier for me, as I designed a peak-hold circuit to detect the drop.

In order to detect the drop in voltage for 3 x AA NiMH, it looks like I will have to detect something silly, perhaps a maximum of 15mV (5mV per cell).

Of course some analogue circuitry and a uC's ADC will see this no trouble, but primarily it looks like I'm going with temperature change: looking for a rise of 10C above ambient (30-35C). Secondary to that, the voltage knee - or rather the 'peak' - which I will have to calibrate for this particular chemistry (comparator + precision reference). That's two comparators to turn it off.
Although NiMH is very similar to NiCd, I have read in several places of subtle differences which dictate the order of switch-off.


Thankfully, I have just been informed they want this project to have a face-lift... ergo the charging circuit can go on a new PCB, not separate. That sort of makes this post redundant, as I can now just use a Maxim chip or something specifically designed for the job (including a uC, or the existing uC with a bit more code!). That said, it's been a great learning experience - always good to know you guys really know your stuff. I will do some tinkering over the next week and post my solution. As I said, I now have a lot more options, since they want the entire thing redesigned with some extras, which can include the charging circuit on the main PCB.

Finally, Nigel, fantastic link! Very detailed. Usually I just like to get a schematic, but he describes its operation very well - a great basis for a true NiMH charger. I'll go with 400mA charge current; for 2500mAh, that's roughly 6½ hours. Not so quick as to make timing overly critical, but not too slow for them.

Thank you all,

Blueteeth
 
Charging batteries involves losses. A 2500mAh battery is nowhere near fully charged when it has received 2500mAh of charge; it might take 40% more current or time to be fully charged.
 
Sure, but I was hoping not to go above a 500mA power supply. Plug-in DC power supplies generally start to get bigger above 500mA (except SMPSs, of course).

So based on that 40%: C = 2500mAh, so total charge = 2500 x 1.4 = 3500mAh.
For roughly an 8-hour charge, that's 3500/8 = 438mA.

So with a 500mA PSU and a suitable management circuit (cut-off method: dT/dt or dV/dt controlling the current source), I should be able to get them fully charged from almost empty within 8 hours.

Also, they will probably not be completely discharged, although 3.1V will be the 'end of battery life' for the unit. That's 1.03V per cell - pretty much dead.

So either way, we're talking under 8 hours with a charging current of roughly C/6 to C/5 - nothing too hardcore, but far from trickle current.

Cheers,

Blueteeth
 
Energizer recommends detecting the small drop in voltage when a battery is fully charged plus using temperature or pressure and a timer as backups.

You have only a temperature sensor circuit, and that might fail. What will happen then, when a battery over-charges?
 
I would use the voltage method first, then temperature (small thermistors are pretty cheap), then a timer so it charges for no more than eight hours.
 
After reading this:

https://www.electro-tech-online.com/custompdfs/2007/10/nickelmetalhydride_appman.pdf

and these:
http://www.myra-simon.com/bike/charger3.html
http://www.angelfire.com/electronic/hayles/charge1.html

I tend to disagree with you both. Although nothing is clear-cut and there are mixed opinions about the '-dV/dt' behaviour of NiMH, all of the above sources agree that the temperature of NiMH cells increases rapidly at the ideal termination point.

I have read many more sources than the above, both reputable (chip manufacturers' datasheets) and not so reputable (bike owners, lol). Some settle for the idea that NiMH respond just like NiCds when charging, albeit with a much smaller voltage drop at termination... while many others claim that the ideal termination should be before this point, i.e. at the peak, when dV/dt = 0, not after.

I think it's pretty clear, though, that temperature monitoring (and using it to shut off the full charge current) is paramount. It's a hassle, because I simply have a cheap PCB-mount 3 x AA cell holder, which will need a hole in it for a thermistor. And I will include some form of voltage-slope detection - dV/dt = 0 to start with (this isn't for NiCds anyway).

The BIG question is: which method takes priority? A rise in temperature doesn't necessarily mean the cells are being damaged (unless of course it goes up to 45C), and temperature-rise detection doesn't work well for slow(ish) charging. And of course, detecting a levelling-off voltage peak isn't always easy (we're talking sampling times here).

I think I'm gonna grab one of the hundreds of 12F625's and do some testing: ADC, timers, a little bit of maths, she's sweet. I am STILL open to advice and alternatives though; all the research in the world isn't quite as good as hearing experiences. And I'm trying to get the part count down here, preferably under 10, including the constant-current source (an LM317, I guess).

Thanks again, I'm learning a hell of a lot here. Little problems like this always get my brain working, but please stop me if I'm getting stuck on the little things - I'm not overly concerned about getting the batteries up to 100% charge every time, nor am I looking for a 30-minute charge time. My priorities are:
1) Reliability and safety for the batteries
2) Ease of build
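The priority order being debated can be sketched as a single routine: temperature as the primary cut-off, the voltage peak as secondary, and a hard timer as the last-ditch backup. Every name and threshold below is an assumption for illustration (loosely the figures mentioned in this thread), not a datasheet value:

```c
#include <stdint.h>
#include <stdbool.h>

#define DT_RISE_C     10    /* cut off this far above ambient, deg C  */
#define T_ABS_MAX_C   45    /* absolute cell-temperature limit        */
#define DV_PEAK_MV    15    /* pack fall from its peak = the knee     */
#define TIMEOUT_TICKS 5760  /* e.g. 8 h at one sample per 5 seconds   */

typedef struct {
    uint16_t peak_mv;   /* running peak of the pack voltage */
    uint32_t ticks;     /* samples taken so far             */
    int8_t   ambient_c; /* ambient temperature at start     */
} charge_state_t;

/* Call once per sample; returns true when fast charge must stop. */
bool terminate_fast_charge(charge_state_t *s, uint16_t pack_mv, int8_t cell_c)
{
    s->ticks++;
    if (pack_mv > s->peak_mv)
        s->peak_mv = pack_mv;

    if (cell_c >= T_ABS_MAX_C)              return true; /* safety    */
    if (cell_c - s->ambient_c >= DT_RISE_C) return true; /* primary   */
    if (s->peak_mv >= DV_PEAK_MV &&
        pack_mv <= (uint16_t)(s->peak_mv - DV_PEAK_MV))
        return true;                                     /* secondary */
    return s->ticks >= TIMEOUT_TICKS;                    /* backup    */
}
```

Once this returns true, the charger would drop to trickle rather than switch off completely, per the Energizer recommendation quoted earlier in the thread.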

Thanks guys,

Blueteeth
 
Perhaps temperature might be the best way but, as you say, it certainly isn't the most convenient. I've got several battery chargers and none of them seem to use temperature - I've not seen any thermistors or thermocouples on them, and they all have automatic shut-off - so it's probably a safe bet that they use voltage.

Designing anything is always a trade-off between several factors like cost, reliability and efficiency, but it's up to you, the designer, to decide which one takes precedence.
 