WTP Pepper
Active Member
Hi
This is an observation that appears to go against the generally accepted understanding of how these cells operate.
Firstly, my general understanding is that these cells are charged at constant current up to a constant voltage (4.2 V), then the charge is cut off once the current falls below a pre-determined threshold that depends on the C rating of the cell. The constant current and the cut-off are controlled by the device being charged, in combination with the current limits of the charger.
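To make the CC/CV termination logic described above concrete, here is a toy simulation of a charge cycle. All the numbers (cell model, rates, taper behaviour) are illustrative assumptions, not values from any real charger IC:

```python
# Toy sketch of CC/CV lithium-ion charge termination.
# The cell model is deliberately crude: voltage rises with charge in the
# CC phase, and current tapers exponentially in the CV phase. Every
# parameter here is an assumption for illustration only.

def cc_cv_charge(capacity_mah=2100, cc_rate_c=0.5, v_start=3.6, v_max=4.2,
                 cutoff_rate_c=0.05, dt_h=0.01):
    """Return (hours, mAh delivered) before the cutoff current is reached."""
    i_cc = cc_rate_c * capacity_mah          # constant-current setpoint (mA)
    i_cutoff = cutoff_rate_c * capacity_mah  # terminate below this current (mA)
    charge = 0.0                             # mAh delivered so far
    v = v_start
    t = 0.0
    # CC phase: hold current constant until the cell hits v_max.
    while v < v_max:
        charge += i_cc * dt_h
        v = v_start + (v_max - v_start) * charge / (0.8 * capacity_mah)
        t += dt_h
    # CV phase: hold voltage at v_max; current tapers until the cutoff.
    i = i_cc
    while i > i_cutoff:
        charge += i * dt_h
        i *= 0.95            # crude exponential taper per time step
        t += dt_h
    return t, charge

hours, mah = cc_cv_charge()
print(f"charged {mah:.0f} mAh in {hours:.2f} h")
```

The point of the sketch is the two-phase structure: a charger that stops early in the CV taper (or never reaches it, e.g. because its supply sags) reports "done" with noticeably less charge delivered.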
I have a Nexus 4 phone, a Nexus 7 pad and an eCig that are regularly charged.
e.g. when I charge with the cheap wall-plug USB adaptor supplied with the phone, it eventually shows 100% charge.
When I charge from a higher-power charger (though the current should still be controlled by the charging device, the phone in this case), it also shows 100% after a time.
However, when discharged through use, the device runs down quicker when charged to 100% by the cheap wall plug than when charged to 100% by my in-car USB charger.
I am aware some chargers only charge to 4.1 V to prolong cell life, but even taking that into account, it doesn't add up. I am talking about a 50% reduction in use time.
A different chemistry, I know, but I have some 2800 mAh AA NiMH cells whose instructions said you need a heavy-duty charger, not a cheap supermarket NiMH charger, to maintain their capacity. I have a Revolex charger, and that statement seems correct: I get the best out of them charging at C/2 rather than C/10.
Our battery experts at work say this isn't possible, but that doesn't seem to be the case. I am pitting experience against so-called science.
I'm not after detailed solutions, but I would be interested in a basic explanation or observations from others.