I can't figure out any explanation for this... If the transistor is on, it ought to let current pass through it. And it has a resistance, so it should definitely get hot...
Remember: P= IV
If you put a BJT into saturation, V(CE) drops to a very low level (0.2V nominal for small-signal BJTs; 1.0V for power BJTs). Now, if a power BJT is pulling 5.0A off a 30V rail with V(CE) at 1.0V, that's a dissipation of 5.0W. But if it were operating in linear mode with the same 5.0A and a V(CE) of 15V, the dissipation would be 75W, considerably higher (and it had better be able to handle that much). I've built Class D (switches between cutoff and saturation) inverters that could burn up 10W resistors, and yet the drive transistors would be just barely warm.
At the other end, in cutoff, there is a small leakage current, usually amounting to a few uA for small-signal types, up to several hundred uA for power transistors. Assuming a leakage current of 250uA with V(CE)= 30V, you get 7.5mW of dissipation. Not much to worry about.
That's how you figure it: P(C) = V(CE) × I(C)
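To double-check, here's a minimal sketch that just plugs the numbers from above into P(C) = V(CE) × I(C) for each operating point (the values are the ones used in this answer, not data-sheet figures):

```python
def dissipation(vce, ic):
    """Collector dissipation in watts: P(C) = V(CE) * I(C)."""
    return vce * ic

# Saturation: power BJT, V(CE)(sat) ~ 1.0 V, pulling 5.0 A
p_sat = dissipation(1.0, 5.0)      # 5.0 W

# Linear (active) region: 15 V across the transistor, same 5.0 A
p_lin = dissipation(15.0, 5.0)     # 75.0 W

# Cutoff: 250 uA leakage with the full 30 V rail across C-E
p_off = dissipation(30.0, 250e-6)  # 7.5 mW

print(f"saturation: {p_sat:.1f} W")
print(f"linear:     {p_lin:.1f} W")
print(f"cutoff:     {p_off * 1000:.1f} mW")
```

Same transistor, same rail: the dissipation spans four orders of magnitude depending on where it sits on the load line, which is why switching (Class D) stays cool and linear operation needs a heatsink.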