It does indeed!
I've been thinking about it a little more and I think I've gone off on a bit of a tangent, but it seems to make sense... so here goes.
Imagine watching a TV. The television flashes up pictures at around 25 frames per second, which is fast enough for your eye to perceive a continuous image. However, the TV picture is not a continuously smooth image; it is a series of flashes, or bursts, of signal. So perhaps, if the time resolution of the eye were faster, we could see the individual bursts.
Perhaps it's the same thing with the receiver. If the signal is bursting on and off quickly enough, the receiver's "persistence of vision" can't tell the difference between this and a continuous signal. So there really are no frequency components in the gaps between the bursts, but the detector will report a continuous signal if its resolution isn't fine enough to see the gaps?
So for a signal that's keyed on and off, say, every half an hour, the modulating frequency is so small in comparison to the carrier that the sidebands sit so close to it they're practically unnoticeable. The detector's bandwidth, its "resolution", is easily adequate to make out the individual bursts. However, if we raise the modulation frequency, the bursts come more often, and the receiver reaches a point where it cannot distinguish one burst from the next: it believes everything is continuous and outputs a constant detection. There are still no frequency components in the blank parts of the signal; the problem lies with the resolution of the detector. And if we broaden the bandwidth of the detector at this point, so that it again takes in the sidebands, the same signal looks 'blippy' once more.
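Here's a rough numpy sketch of that trade-off (all the numbers are made up purely for illustration, not from any real rig): the same on-off keyed carrier analysed with a long FFT window (fine frequency resolution) shows the keying sidebands as separate lines, while a short window (coarse resolution) shows what looks like one clean carrier. The catch is that to resolve sidebands only 10 Hz from the carrier, the window has to be longer than about 1/10 s, by which time it has averaged over a whole keying cycle.

```python
# Toy demonstration: frequency resolution vs. keying sidebands.
# All values (sample rate, carrier, keying rate) are illustrative only.
import numpy as np

fs = 8000.0       # sample rate, Hz
fc = 1000.0       # carrier frequency, Hz
key_rate = 10.0   # keying rate, Hz: one full on/off cycle every 0.1 s
t = np.arange(0, 4.0, 1 / fs)

# 50% duty keying: carrier on for 50 ms, off for 50 ms
keying = (np.floor(2 * key_rate * t) % 2 == 0).astype(float)
signal = keying * np.cos(2 * np.pi * fc * t)

def components(x, n):
    """FFT of the first n samples; return frequencies within 20 dB of the peak."""
    spec = np.abs(np.fft.rfft(x[:n]))
    freqs = np.fft.rfftfreq(n, 1 / fs)
    return freqs[spec > spec.max() / 10]

# 4 s window -> 0.25 Hz resolution: carrier plus sidebands at fc +/- 10, 30, 50 Hz
print("fine:  ", components(signal, 32000))
# 20 ms window -> 50 Hz resolution: sidebands unresolvable, one line at fc
print("coarse:", components(signal, 160))
```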
Does this make sense, or am I going crazy?
PS. Interestingly then, in answer to the original question of which output would be correct... they both are, depending on the bandwidth of the receiver at the detector.
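To illustrate that last point, here's a toy detector along the same lines (again just a sketch under the same made-up numbers, not any particular receiver): mixing the keyed signal to baseband and lowpassing it with a narrow filter, bandwidth well below the keying rate, gives a near-constant output, while a wide filter follows the keying and gives the 'blippy' output. Same signal, two "correct" answers.

```python
# Toy detector: narrow vs. wide bandwidth on the same keyed carrier.
# All values are illustrative only.
import numpy as np

fs = 8000.0
fc = 1000.0
key_rate = 10.0   # keying rate, Hz
t = np.arange(0, 4.0, 1 / fs)
keying = (np.floor(2 * key_rate * t) % 2 == 0).astype(float)
signal = keying * np.cos(2 * np.pi * fc * t)

def detect(x, length):
    """Mix to baseband, lowpass with a moving average, return the envelope.
    The average length sets the detector bandwidth: roughly fs/length Hz."""
    baseband = x * np.exp(-2j * np.pi * fc * t)
    kernel = np.ones(length) / length
    return np.abs(np.convolve(baseband, kernel, mode="same"))

narrow = detect(signal, 4000)   # ~2 Hz bandwidth: slower than the keying
wide = detect(signal, 40)       # ~200 Hz bandwidth: fast enough to follow the key

mid = slice(4000, -4000)        # skip moving-average edge effects
print(f"narrow: min {narrow[mid].min():.3f}, max {narrow[mid].max():.3f}")  # nearly constant
print(f"wide:   min {wide[mid].min():.3f}, max {wide[mid].max():.3f}")      # on/off bursts
```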
Megamox