Thanks for the reply.
The problem I have in my head about this is:
When running at 36V input, at one-third speed and (for now) a constant, low torque requirement, the PWM will be applying 1/3 duty-cycle 36V pulses to the motor coils; but the inductance of those coils will tend to smooth the pulses out, so the average voltage acting on the coils will be lower than 36V, and the current drawn will be less than it would be if the full 36V were applied continuously.
Effectively, the PWM and coils are acting as a buck converter (I think).
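To put rough numbers on the buck-converter view, here's a quick sketch; the 100 µH phase inductance and 16 kHz switching frequency are placeholder values, not figures from any particular motor:

```python
# Rough numbers for the buck-converter analogy. Inductance and switching
# frequency are assumed values; substitute the real motor's figures.
V_IN = 36.0       # bus voltage, V
DUTY = 1.0 / 3.0  # PWM duty cycle at one-third speed
L = 100e-6        # phase inductance, H (assumed)
F_SW = 16e3       # PWM switching frequency, Hz (assumed)

# Average voltage the winding "sees", exactly as in an ideal buck converter.
v_avg = DUTY * V_IN

# Peak-to-peak ripple current through the inductance for a hard-switched
# square wave: di = v * dt / L over the on-time.
di_ripple = (V_IN - v_avg) * (DUTY / F_SW) / L

print(f"average winding voltage: {v_avg:.1f} V")      # 12.0 V
print(f"ripple current (pk-pk):  {di_ripple:.2f} A")  # ~5 A with these values
```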
So the question then becomes: are there any efficiency gains to be had by keeping the inverter duty cycle at or close to 100% and varying the input voltage instead?
(Whether an inverter rated at 36V would operate correctly at 12V is another question that I'm ignoring for now.)
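For what it's worth, the usual first-order model of hard-switching loss in the inverter scales linearly with bus voltage, and at ~100% duty the devices barely switch at all, which is where any gain would come from. A sketch with invented device numbers (the 100 ns transition time and 10 A load are placeholders, not from any datasheet):

```python
# First-order hard-switching loss model: P ~ 0.5 * V * I * (t_r + t_f) * f_sw.
# All device numbers below are placeholders, not from any real datasheet.
def switching_loss(v_bus, i_load, f_sw=16e3, t_transition=100e-9):
    """Approximate hard-switching loss for one device, in watts."""
    return 0.5 * v_bus * i_load * t_transition * f_sw

I_LOAD = 10.0  # assumed phase current, A

p_36v = switching_loss(36.0, I_LOAD)  # 36 V bus, PWM at 1/3 duty
p_12v = switching_loss(12.0, I_LOAD)  # 12 V bus; at ~100% duty the real
                                      # figure drops towards zero, since
                                      # the devices stop switching at all

print(f"per-device switching loss at 36 V: {p_36v*1e3:.0f} mW")  # ~288 mW
print(f"per-device switching loss at 12 V: {p_12v*1e3:.0f} mW")  # ~96 mW
```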
My thought is that the vast majority of the losses in the motor are due to eddy currents induced in the cores as a result of the changing voltage across the coils -- 0V...36V...0V, 133 times per minute, across 3 phases.
If you can reduce the maximum voltage, you reduce the rate of change, so the induced eddy currents will be smaller. (I theorise.)
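A quick sanity check of that scaling, under the crude assumption that core eddy loss goes as the mean square of dB/dt, and that dB/dt tracks the instantaneous winding voltage (this ignores hysteresis loss and the fundamental-frequency component entirely):

```python
# Compare the mean-square winding voltage of the two drive schemes, on the
# assumption that eddy loss ~ <(dB/dt)^2> ~ <v^2>. Simplified model only.
def mean_square_voltage(v_bus, duty):
    """Mean square of a rectangular PWM voltage waveform, V^2."""
    return duty * v_bus**2

ms_36v = mean_square_voltage(36.0, 1.0 / 3.0)  # 36 V pulses at 1/3 duty
ms_12v = mean_square_voltage(12.0, 1.0)        # steady 12 V, ~100% duty

print(f"<v^2> at 36 V, 1/3 duty:  {ms_36v:.0f} V^2")  # 432 V^2
print(f"<v^2> at 12 V, full duty: {ms_12v:.0f} V^2")  # 144 V^2
print(f"ratio: {ms_36v / ms_12v:.1f}x")               # 3x in favour of the low bus
```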
Of course, a boost converter of the required rating will also include an inductive component, but in a well-designed circuit this will be chosen for efficiency.
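In the end it comes down to whether the extra converter stage's own loss is smaller than whatever it saves downstream. With purely illustrative efficiencies (both sets of numbers invented for the example):

```python
# Cascade efficiency: pre-regulator followed by inverter + motor.
# Both sets of numbers are illustrative, not measurements.
def net_efficiency(eta_converter, eta_drive):
    return eta_converter * eta_drive

direct     = net_efficiency(1.00, 0.85)  # no pre-regulator; assumed drive efficiency
with_stage = net_efficiency(0.96, 0.91)  # assumed converter and improved drive efficiency

print(f"direct PWM drive:      {direct:.1%}")      # 85.0%
print(f"with voltage tracking: {with_stage:.1%}")  # ~87.4%
```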
I don't know if I'm completely out of my tree with this thinking, but the idea has been floating around my head for a while now, so I thought I'd ask for opinions here.