How is your math?
Hero999 said: Even so, I find it quite hard to believe that a 2500mAh battery can be damaged by charging it at C/10, as it'll only be dissipating 40mW at most.
Blueteeth said: Hey there,
'Trickle charging' is out I'm afraid, it'll just take too long; I'm really after charge times <= 8 hours, and with 2500mAh AAs, C/30 is a long time! I think anything shorter than 4 hours would probably cause problems for me, so a nice safe charge time would be 6-8 hours. Plus it means I don't need a big heatsink on any components, as I would at C/2 (1250mA); a nice 500mA would do.
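As a sanity check on those charge times, here's the basic arithmetic in C (a quick sketch; the 1.4x overcharge factor is a typical NiMH charge-efficiency rule of thumb, not a figure from this thread):

```c
#include <stdio.h>

int main(void) {
    const double capacity_mAh = 2500.0;              /* AA NiMH cell */
    const double overcharge   = 1.4;                 /* typical NiMH efficiency factor */
    const double rates_mA[]   = { 2500.0 / 30, 250.0, 500.0, 1250.0 }; /* C/30, C/10, ~C/5, C/2 */

    for (int i = 0; i < 4; i++)
        printf("%4.0f mA -> %4.1f hours\n", rates_mA[i],
               capacity_mAh * overcharge / rates_mA[i]);
    return 0;
}
/* 83 mA (C/30) -> 42 h, 250 mA -> 14 h, 500 mA -> 7 h, 1250 mA -> 2.8 h.
 * So 500 mA lands right in the 6-8 hour window. */
```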
Right, I've continued the research, but I don't want to keep this thread going forever, so it's time for phase two. As far as I can see, I have three options:
1) Use an 'off the shelf' dedicated charging IC from Maxim, TI, or Linear, with the recommended application circuit. Medium part count and some hassle sourcing parts, but no real design issues.
2) Build a 'smart' controller similar in operation to the above, except from scratch, using a small 8-pin PIC with its A/D for voltage and temperature measurement. The program shouldn't be too complex; it's essentially just timers and A/D comparisons. The only foreseeable problem is that the ADC may not have enough resolution, which would increase the analogue part count (the total voltage change over the entire charge is only 0.25V, and for an 8-hour charge the rise is so slow that a 10-bit ADC will hardly show any difference when the 'hump' hits; see the sketch after this list).
3) Non-intelligent charger, similar to the comparator-based ones. A simple linear charger that shuts off when either dV/dt = 0 or the temperature rises above a preset. Again small, simple, and low part count, but without full control over what's happening.
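To put numbers on the ADC worry in option 2, here's a quick back-of-envelope in C (the 5V reference and the ~5mV-per-cell -dV dip are assumed typical values, not measurements):

```c
#include <stdio.h>

int main(void) {
    const double vref = 5.0;            /* assumed ADC reference voltage */
    const double lsb  = vref / 1024.0;  /* 10-bit step size: ~4.88 mV */
    const double dip_per_cell = 0.005;  /* typical NiMH -dV dip, ~5 mV/cell */
    const int    cells = 3;

    double dip = dip_per_cell * cells;
    printf("LSB     = %.2f mV\n", lsb * 1000);
    printf("-dV dip = %.0f mV = %.1f counts\n", dip * 1000, dip / lsb);
    return 0;
}
/* ~4.88 mV per count against a ~15 mV pack dip: only about 3 counts, so the
 * dip is visible but needs heavy averaging (or a lower Vref) to beat noise. */
```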
I have ideas for all three of the above, including some clever 'sample/hold' tricks for detecting the roll-off in voltage. Using a PIC 'should' be trivial in this case: internal low-frequency oscillator, A/D for voltage/temperature, and a backup timer. But as always, using a dedicated chip designed by someone as big as Maxim is preferable.
I think we can all agree that cell temperature should be used as a safety cut-off whether the voltage has rolled off or not, so either way the charger WILL stop charging before anything gets too hairy; it just won't be the primary termination point.
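For what it's worth, that whole termination scheme boils down to a few lines. Here's a rough host-side sketch of the logic only; the fake charge curve and temperature read are toy stand-ins for the PIC's ADC, and every threshold here is a placeholder, not a recommended value:

```c
#include <stdbool.h>
#include <stdio.h>

#define TEMP_CUTOFF_C 45       /* absolute safety limit, degrees C */
#define DV_DROP_MV    10       /* drop from peak that counts as -dV */
#define MAX_MINUTES   (8 * 60) /* backup timer */

/* Toy stand-ins so the sketch runs on a PC: a fake pack voltage that peaks
 * at minute 400 then rolls off, and a constant cell temperature. On the
 * real PIC these become ADC reads (ideally averages of many samples). */
static int minute;
static unsigned read_battery_mv(void) {
    return (minute <= 400) ? 4000u + (unsigned)minute
                           : 4400u - (unsigned)(minute - 400) * 2u;
}
static int read_temp_c(void) { return 30; }

int main(void) {
    unsigned peak_mv = 0;
    for (minute = 0; minute < MAX_MINUTES; minute++) {
        unsigned v = read_battery_mv();
        if (v > peak_mv)
            peak_mv = v;

        /* Primary termination: voltage has rolled off past its peak (-dV). */
        bool dv_hit  = (v + DV_DROP_MV <= peak_mv);
        /* Safety termination: cell temperature, regardless of voltage. */
        bool too_hot = (read_temp_c() >= TEMP_CUTOFF_C);

        if (dv_hit || too_hot) {
            printf("stop at minute %d: V=%umV (peak %umV)\n", minute, v, peak_mv);
            return 0;   /* charger off */
        }
    }
    printf("backup timer expired, charger off\n");
    return 0;
}
```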
I still don't know why many schematics on the net are so damn complicated: tons of analogue, some logic, even huge microcontrollers. I guess if they're charging 16+ cells and monitoring each one individually it's justified, but surely everyone can see how a little PIC could handle this and so much more?
Blueteeth said: Nigel, thanks once again for your input.
The more I look at that link you posted, the more convinced I am that I'll just use that idea. The 12F675s are great little workhorses, and will be perfect for this.
You are, of course, correct about the schematics on the web; I don't know why I didn't realise that DIY designers are essentially trying to do what I am, without 'skimping' due to cost. They're either quite complex (rightly so) or overly basic (old circuits for NiCads).
I will start tinkering today. After the huge amount of research I've done looking at charge curves etc., I'm not convinced even a 10-bit ADC will cut it for an 'accurate' voltage cut-off.
That said, combined with a precision NTC thermistor, a timer, and a semi-controlled current source, I reckon we're onto a winner. Maybe I won't get 98%+ charge, but that doesn't matter for this app; above 90% is dandy for me, and the important thing is cell life.
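A rough sketch of the thermistor maths, since it's only one beta-equation call; the 10k NTC (B = 3950, R25 = 10k), the 10k divider resistor, and the 10-bit ADC referenced to Vcc are all assumed example parts, not a parts list:

```c
#include <stdio.h>
#include <math.h>

/* Assumed parts: 10k NTC (B = 3950, R25 = 10k) from Vcc to the ADC pin,
 * 10k fixed resistor from the pin to ground, 10-bit ADC referenced to Vcc. */
#define B_COEFF  3950.0
#define R25      10000.0
#define R_FIXED  10000.0
#define ADC_MAX  1023.0

static double adc_to_celsius(unsigned adc)
{
    /* Divider: adc/ADC_MAX = R_FIXED / (R_FIXED + R_ntc) */
    double r_ntc = R_FIXED * (ADC_MAX / (double)adc - 1.0);
    /* Beta equation: 1/T = 1/T25 + ln(R/R25)/B  (temperatures in kelvin) */
    double inv_t = 1.0 / 298.15 + log(r_ntc / R25) / B_COEFF;
    return 1.0 / inv_t - 273.15;
}

int main(void) {
    /* A mid-scale reading (~512) should land very close to 25 C. */
    for (unsigned adc = 300; adc <= 700; adc += 100)
        printf("adc=%u -> %.1f C\n", adc, adc_to_celsius(adc));
    return 0;
}
```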
So, one more question to get the ball rolling: a current source. Charging 3 NiMH cells (3.0V minimum, up to 4.5V) at roughly 400-500mA. I'm not keen on a large heatsink on a transistor or an LM317, so either I reduce the current (increasing charge time) or find a more efficient method, without stumbling into the world of switch-mode power supplies. An LM317 would be more accurate, but surely a transistor used as a current source would dissipate less power?
Blueteeth said: I'm still confused about how to work out the power dissipation in the transistor. Say we have 0.5A flowing through it; what's the voltage drop? From VCC (charging supply) to GND it'll be: PNP collector-emitter, battery, power resistor (1-3 ohm, 5W). At the start the battery voltage is roughly 3V, so for a 6V supply and a 2 ohm resistor, that's 1V across the resistor and 3V across the battery, leaving 6 - 4 = 2V across the transistor: 2 x 0.5 = 1W max dissipation. Or should I choose a series resistor with a value that drops most of the voltage?
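Those figures check out. A quick sweep over the battery voltage shows where the worst case sits (this uses the 6V / 2-ohm / 0.5A numbers from the post above; the LM317's 1.25V reference is a datasheet fact, but the ~1.7V dropout is a typical value, not measured):

```c
#include <stdio.h>

int main(void) {
    const double vcc = 6.0, i = 0.5, r = 2.0;   /* figures from the post */

    /* Transistor pass element: whatever the resistor and battery don't
     * drop lands across C-E. Worst case is the empty (3.0V) battery. */
    for (double vbatt = 3.0; vbatt <= 4.51; vbatt += 0.5) {
        double vce = vcc - vbatt - i * r;
        printf("Vbatt=%.1fV: Vce=%.2fV, Ptrans=%.2fW, Pres=%.2fW\n",
               vbatt, vce, vce * i, i * i * r);
    }

    /* LM317 current source for comparison: Rset = 1.25V / 0.5A = 2.5 ohm,
     * but it needs the 1.25V reference PLUS ~1.7V of dropout headroom, so
     * 4.5V of battery + ~3V overhead exceeds a 6V supply: it stops
     * regulating near end of charge without a higher input voltage. */
    printf("LM317 minimum Vcc ~= %.2fV\n", 4.5 + 1.25 + 1.7);
    return 0;
}
/* Vbatt=3.0V -> Vce=2.0V, 1.00W in the transistor (the 1W max checks out);
 * Vbatt=4.5V -> Vce=0.5V, 0.25W, but only 0.5V of headroom left. */
```

So a bigger series resistor does cool the transistor, but it eats the headroom needed to keep the current regulated at end of charge; with only 6V in, the 2 ohm resistor is already about as far as it can be pushed.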
Again, this is still in the 'LM317 or MOSFET/bipolar' decision stage :/ Any help (Nigel, I'm looking your way) would be appreciated.
Nigel Goodwin said:A transistor used in a linear fashion will obviously dissipate quite a bit of heat (as would anything else), which is why people have been suggesting a switching supply option.
Blueteeth said: I'm starting to think I'm out of my depth here. Nigel... PNPs? Yes or no? lol. Farnell awaits your reply.