Some of the older members here will remember the two- or three-bobbin regulator used with a DC generator. It performed three functions: voltage limiting, current limiting and reverse-current cutout. Temperature compensation was provided by the bi-metal leaf spring on the voltage bobbin's armature, hence the higher terminal voltage on a cold start-up. The current limiter would be in operation until the battery started to come up (otherwise the commutator would leak smoke and molten solder). After about 30 minutes, once the under-bonnet temperature rose to normal operating levels, the volts settled back down to about 13.8. This was an effective way of bulk charging, with the lower voltage setting (determined by the bi-metal spring) then float charging, if the vehicle was running long enough.

Note that the excitation of the generator's field was, in a sense, PWM: the contacts in the voltage limiter switched the field current on and off. (Some regulators had changeover contacts and one or two resistors to give smoother control.) This resulted in a somewhat sawtooth waveform, plus commutator ripple. (There's a little toy simulation of this at the end of the post.)

If you connected an oscilloscope to the battery, the peak voltage would be on the order of 15+ volts; your analogue meter responds to the average of the waveform, not the peaks, so it would read about 13 or 14 V. (The second sketch at the end puts rough numbers on this.)

Batteries actually liked this PWM method of charging; that's why we used to get 5+ years out of a battery and never had any sulphation. And we all survived on 15-20 amps of total charging current. I am not aware of any temperature compensation in these modern cars with alternators that put out as much current as (or more than) my welding machine....!!!

Early alternator systems had external electro-mechanical regulators with temperature-sensitive bi-metal leaf springs. These regulators were voltage-limiting only, as alternators are self current-limiting by virtue of the stator core saturating; reverse current was taken care of by those hefty rectifier diodes acting as a one-way valve.

MikeM: I note your comments about temperature compensation; about time the vehicle manufacturers did this.
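Here's a minimal toy sketch (my own illustration, not from any workshop manual) of how that vibrating-contact voltage limiter behaves: the field is switched on when the terminal voltage falls below a setpoint and off when it rises above it, with a crude temperature-compensated setpoint standing in for the bi-metal spring. Every constant (setpoint, compensation slope, charge/discharge rates) is invented for illustration.

```python
# Toy bang-bang ("vibrating contact") voltage limiter with a crude
# temperature-compensated setpoint standing in for the bi-metal spring.
# All constants are invented for illustration only.

def setpoint(temp_c, v_nominal=13.8, comp_v_per_c=-0.01, ref_temp_c=90.0):
    """Regulation voltage: higher when cold, ~13.8 V at normal temperature."""
    return v_nominal + (temp_c - ref_temp_c) * comp_v_per_c

def simulate(minutes=30.0, dt=0.01):
    v = 12.2           # battery terminal volts on a cold start
    temp_c = 20.0      # cold under-bonnet temperature, deg C
    field_on = True    # contacts closed -> field excited
    band = 0.2         # contact make/break hysteresis, volts
    for _ in range(int(minutes * 60 / dt)):
        temp_c += (90.0 - temp_c) * 0.00005   # engine bay slowly warms up
        target = setpoint(temp_c)
        # Vibrating contacts: open above the band, close below it (PWM-like)
        if field_on and v > target + band / 2:
            field_on = False
        elif not field_on and v < target - band / 2:
            field_on = True
        # Field excited -> generator charges; field off -> load pulls v down
        v += (0.8 if field_on else -0.5) * dt
    return v, temp_c, setpoint(temp_c)

v, t, target = simulate()
print(f"after 30 min: {v:.2f} V at {t:.0f} C (setpoint {target:.2f} V)")
```

Run it and you get roughly 14.5 V regulation while cold, settling to about 13.8 V as the "engine bay" warms up: the bulk-then-float behaviour described above.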
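And to put rough numbers on the scope-versus-meter point: a moving-coil meter responds to the average of the waveform, while the scope shows the peaks. A quick back-of-envelope check (the 13.8 V level and 3 V peak-to-peak ripple are assumed figures for illustration, not measurements):

```python
# Numeric check (my own illustration): for a sawtooth riding on a DC level,
# the peak is well above what a moving-coil meter indicates, because such
# meters respond to the average of the waveform, not the peak.
import math

def sawtooth_stats(v_avg=13.8, ripple_pp=3.0, samples=100_000):
    """Peak, average and RMS of a sawtooth of ripple_pp volts peak-to-peak
    centred on v_avg. The figures are invented for illustration."""
    vs = [v_avg + ripple_pp * ((i / samples) - 0.5) for i in range(samples)]
    peak = max(vs)
    avg = sum(vs) / samples
    rms = math.sqrt(sum(x * x for x in vs) / samples)
    return peak, avg, rms

peak, avg, rms = sawtooth_stats()
print(f"peak {peak:.2f} V, average {avg:.2f} V, rms {rms:.2f} V")
# -> peak ~15.3 V (what the scope shows), average ~13.8 V (what the meter shows)
```

Note the RMS comes out barely above the average here, so whichever quantity you credit the meter with, it still reads 13-14 V while the scope catches those 15+ V peaks.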