Consider a typical Electronic Control Unit (ECU), say a powertrain ECU.
It's my understanding that some drivers are calibrated, perhaps current-controlled output drivers. In other words, at the end of the manufacturing line, measurements are made and driver calibration values are stored in NVM for optimal performance.
At runtime, the ECU software fetches these calibration values and applies them when driving actuators.
Could someone provide a comprehensive tutorial on driver calibration theory? I suspect slope/offset linear equations are involved. How are typical driver calibrations scaled so that only a certain range of correction factors needs to be stored?
What do these calibration values represent? Current?