Hi,
Too slow a charge rate is supposed to be bad for the cell. I've never actually measured anything to back this claim up, though, and that's hard to do. For one thing, we need to define what exactly a "slow charge" really is. Is it 10mA, 20mA, 50mA, or 100mA?
The cutoff current is supposed to be 1/20 of the cell rating in Ahr, but some guidelines suggest 1/33. So a 2Ahr cell should be cut off at either 100mA or about 60mA.
What this means is that the lowest charge current for that cell should be either 60mA or 100mA. What happens if we charge at a lower current? Supposedly it damages the cell beyond repair, so it is best to observe this guideline.
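The 1/20 and 1/33 arithmetic above is easy to get backwards, so here is a tiny sketch of the rule of thumb. The function name and the 2Ahr example are mine, not from any datasheet:

```python
def cutoff_current_ma(capacity_ah, divisor=20):
    """Cutoff (minimum) charge current in mA, per the 1/20 rule of
    thumb; pass divisor=33 for the stricter 1/33 guideline."""
    return capacity_ah / divisor * 1000.0

print(cutoff_current_ma(2.0))             # 2Ahr cell, 1/20 rule -> 100.0 mA
print(round(cutoff_current_ma(2.0, 33)))  # 1/33 rule -> ~61 mA
```

So for the 2Ahr cell the two guidelines land at 100mA and roughly 60mA, matching the numbers above.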
I charge my 1.8Ahr cells down to 50mA and have been using them for years. However, I've also noticed that once the current gets down to 50mA it takes what seems like forever to drop any lower. So I check the charger every hour to see where it is, and if it reads 50mA or less I turn it off. This seems to work quite well.
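That hourly-check routine could be automated along these lines. Note that read_current_ma here is a hypothetical hook for whatever meter or charger interface you actually have, not a real API:

```python
import time

def monitor(read_current_ma, cutoff_ma=50, poll_seconds=3600):
    """Poll the charge current once an hour (by default) until it
    tapers down to the cutoff, mirroring the manual check above.
    read_current_ma is a caller-supplied function that returns the
    present charge current in mA."""
    while True:
        current = read_current_ma()
        if current <= cutoff_ma:
            return current  # time to switch the charger off
        time.sleep(poll_seconds)
```

Usage would be something like monitor(my_meter.read) with your own meter object; the function just returns when it's time to pull the plug.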
When it comes to charging any kind of battery, there is also a number usually referred to as the "charge acceptance" of the cell. This is a minimum current level that is REQUIRED for the cell to actually take on any charge at all; below this level the cell accumulates little or no actual charge.

I would think we might be able to combine this number with the cutoff current and argue that the cutoff level is also the lowest current the cell can usefully accept before it stops taking any further charge. With this in mind, we might want to always charge above the cutoff current and never below it, otherwise we might not actually be charging the cell at all!
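If that guess holds (charge acceptance roughly coinciding with the cutoff level), picking a charge current would just mean clamping to the 1/20 floor. This is a sketch of my own reasoning above, not an established charging algorithm:

```python
def pick_charge_current_ma(requested_ma, capacity_ah):
    """Never charge below the 1/20-of-capacity floor, on the theory
    that below it the cell accepts little or no charge anyway."""
    floor_ma = capacity_ah / 20 * 1000.0
    return max(requested_ma, floor_ma)

print(pick_charge_current_ma(50, 2.0))   # bumped up to the 100.0 mA floor
print(pick_charge_current_ma(200, 2.0))  # already above the floor, kept at 200
```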