I recall NASA discovered that Li-ion cells charged above 4.10 V/cell tend to degrade due to electrolyte oxidation on the cathode (+), while cells charged to lower voltages still lose capacity as the solid electrolyte interphase (SEI) builds up on the anode (-) with lithium oxide and lithium carbonate.
This was verified at Dalhousie University by Dr. Jeff Dahn, who showed that battery aging accelerates with the time spent above 4.1 V, while NASA found that capacity was reduced to 60% by limiting the CV charge voltage (Vcv) to 3.9 V max. That tradeoff enabled them to use Li-ion batteries for more than 8 years.
Below I show how one major brand's total lifetime Ah delivered (Ah per cycle × number of cycles) was increased 5x over the typical 500-cycle Ah rating.
Depth of Discharge (DoD) also affects the aging rate. If you are familiar with how MIL-HDBK-217 works with MTBF, it models accelerated failure rates with formulas based on stress factors; there may now be a similar model for LiPos that uses %DoD and CV voltage as stress factors.
I would expect time, t, to become an exponential MTBF accelerator for both of these factors beyond certain thresholds, varying with the chemistry and the level of contaminants in the electrodes and electrolyte. A rough sketch of what such a stress-factor model might look like follows.
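To make that concrete, here is a minimal sketch of a MIL-HDBK-217-style multiplicative stress model. The functional form and every coefficient (k_v, k_d, and the 3.9 V / 50% reference points) are hypothetical placeholders, not published values:

```
import math

def aging_acceleration(v_cv, dod, v_ref=3.9, dod_ref=0.5, k_v=8.0, k_d=2.0):
    """Hypothetical MIL-HDBK-217-style acceleration factor.

    Multiplies an exponential stress term for CV voltage above a
    reference threshold by a power-law term for depth of discharge.
    All coefficients here are illustrative placeholders.
    """
    pi_v = math.exp(k_v * max(0.0, v_cv - v_ref))  # CV voltage stress factor
    pi_d = (dod / dod_ref) ** k_d                  # depth-of-discharge stress factor
    return pi_v * pi_d

# Example: 4.2 V / 100% DoD vs. the 3.9 V / 50% DoD baseline
print(aging_acceleration(4.2, 1.0))  # ~44x faster aging (with these placeholders)
print(aging_acceleration(3.9, 0.5))  # 1.0 (baseline)
```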
I once plotted the lifetime Ah delivered for one brand as a function of maximum CV voltage at 50% minimum DoD, using the available Ah per cycle and the number of cycles in the lifetime. I describe it below, but I have lost the source data.
Charging at ANY rate consists of: measure Voc; measure V under a pulse load to get the initial ESR; charge at constant current (CC) up to Vcv; hold Vcv until the current tapers to 10% of the CC rate; measure the final ESR; then shut down.
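A minimal sketch of that sequence, assuming a generic charger object whose read_voltage/read_current/set_current/set_voltage/off methods are hypothetical stand-ins for your actual hardware interface:

```
import time

def charge_cell(chg, i_cc=2.7, v_cv=4.1, taper_frac=0.10, i_pulse=1.0):
    """CC/CV charge sequence with ESR measurements (sketch only)."""
    def measure_esr():
        v_oc = chg.read_voltage()              # rested/open-circuit voltage
        chg.set_current(i_pulse)               # apply a brief pulse load
        time.sleep(0.1)
        v_pulse = chg.read_voltage()
        chg.set_current(0.0)
        return abs(v_pulse - v_oc) / i_pulse   # ESR = dV / dI

    esr_initial = measure_esr()

    chg.set_current(i_cc)                      # CC phase: constant current up to Vcv
    while chg.read_voltage() < v_cv:
        time.sleep(1.0)

    chg.set_voltage(v_cv)                      # CV phase: hold until 10% of CC
    while chg.read_current() > taper_frac * i_cc:
        time.sleep(1.0)

    esr_final = measure_esr()
    chg.off()                                  # shutdown
    return esr_initial, esr_final
```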
The risk here is that one cell reaches 100% SoC before the others, and the balancer cannot bypass enough current during CC mode before the current is reduced. That is, balancers may not be able to dissipate enough heat at a 0.5C rate: for a 5400 mAh cell, 5.4 Ah × 0.5C × 4.2 V = 11.34 watts per cell balancer!
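For reference, the same worst-case bypass dissipation as a one-liner (the function name is just illustrative):

```
def balancer_dissipation(capacity_ah, c_rate, v_cell=4.2):
    """Worst-case power a passive balancer must dump if it has to
    bypass the full charge current of one cell: P = I * V."""
    return capacity_ah * c_rate * v_cell

print(balancer_dissipation(5.4, 0.5))  # 11.34 W, as above
```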
FWIW, here is what the plot showed, with little explanation:
If battery capacity is C (in Ah), the left vertical axis was the number of cycles of battery life achieved vs. slow CV charge voltage at 50% minimum DoD, discharged at any rate from 0.5C to 2C, for DoD ratings of 10%, 25%, 50%, and 100%. (I will try to find the source info.) Note that the longest lifespan came from a 50% DoD recharge threshold. (Lenovo has a smart-charge algorithm using these parameters.)
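To illustrate why 50% DoD can maximize total throughput, here is a toy calculation; the cycle-life numbers below are invented for illustration and are NOT from the lost plot:

```
capacity_ah = 5.4
# Assumed cycle counts per DoD (illustrative only, not measured data)
cycle_life = {0.10: 4700, 0.25: 2500, 0.50: 1500, 1.00: 500}

for dod, cycles in sorted(cycle_life.items()):
    lifetime_ah = cycles * dod * capacity_ah   # total Ah delivered over life
    print(f"DoD {dod:4.0%}: {cycles:5d} cycles -> {lifetime_ah:6.0f} Ah lifetime")
# With these assumed numbers, 50% DoD delivers the most total Ah (4050 Ah)
```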
The solution as I see it is to get a better charger with the following characteristics:
- programmable CV, CC, and %CC shutoff levels,
- measures Rs (ESR) imbalance throughout the charge cycle (very important for self-heating), OR
- commutates bypass current with flying inductors, like an SMPS (a flyback half-bridge between cells, rather than passive TVS diodes or active zeners with excess ESR),
- computes the actual Ah supplied to each cell and compares it with the estimated %DoD from the initial tests above (Voc, ESR), rather than measuring only the whole array or just the ESR at end of charge (a sketch of this follows the list).
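Here is a minimal sketch of that last item, per-cell coulomb counting checked against a Voc-based DoD estimate; the Voc-to-SoC table and the read_current callable are assumptions, not a real charger API:

```
import time

# Hypothetical Voc -> SoC breakpoints (chemistry-dependent, values illustrative)
VOC_SOC = [(3.0, 0.0), (3.6, 0.2), (3.7, 0.5), (3.9, 0.8), (4.1, 1.0)]

def soc_from_voc(voc):
    """Interpolate state of charge from rested open-circuit voltage."""
    for (v0, s0), (v1, s1) in zip(VOC_SOC, VOC_SOC[1:]):
        if v0 <= voc <= v1:
            return s0 + (s1 - s0) * (voc - v0) / (v1 - v0)
    return 0.0 if voc < VOC_SOC[0][0] else 1.0

def coulomb_count(read_current, dt=1.0, duration=3600.0):
    """Integrate one cell's charge current into Ah delivered.
    `read_current` is an assumed callable returning amps for that cell."""
    ah, t = 0.0, 0.0
    while t < duration:
        ah += read_current() * dt / 3600.0   # A·s -> Ah
        time.sleep(dt)
        t += dt
    return ah

# Expected Ah for a cell: (1 - soc_from_voc(voc_initial)) * capacity_ah;
# compare with the counted Ah to flag a cell filling faster than its neighbors.
```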
Consider that in car batteries with an 850 A cranking capacity at 7.5 V, each cell's ESR must be balanced within 1% when new; as this mismatch rises, acid boiling in the weakest cell due to I²·ESR heating accelerates battery death quickly. The same holds true for LiPos: the greater the ESR mismatch with a passive balancer, the greater the risk of cell death from one cell reaching full charge while the others are still in CC mode. A back-of-envelope version of that I²·ESR arithmetic follows.
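Here is that estimate worked out; the 12.6 V rest voltage and the 6-cell split are my assumptions for backing out per-cell ESR from the 850 A / 7.5 V figures above:

```
# Back-of-envelope I^2*R mismatch heating (illustrative numbers only)
v_rest, v_crank, i_crank, n_cells = 12.6, 7.5, 850.0, 6

esr_total = (v_rest - v_crank) / i_crank   # ~6.0 mOhm for the whole pack
esr_cell = esr_total / n_cells             # ~1.0 mOhm per cell
mismatch = 0.01 * esr_cell                 # a 1% mismatch, ~10 uOhm

p_cell = i_crank**2 * esr_cell             # ~722 W per cell while cranking
p_extra = i_crank**2 * mismatch            # ~7 W EXTRA in the weakest cell
print(f"per cell: {p_cell:.0f} W, extra in weak cell: {p_extra:.1f} W")
```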
FYI: Dr. Dahn's publications: https://www.dal.ca/diff/dahn/publications.html