I just switched on my 60 MHz portable o-scope on its AC adapter, removed the A-channel probe, set the input to its lowest range of 100 mV, and set the sweep to 5 ms. It reads Vpp = 0.00 V, Vrms = 2.81 mV, Vavg = -2.81 mV. There is no bobble in the readings at all; the trace is steady as a rock. I changed the sweep to 5 ns: same as before. This is NOT a high-end, brand-name instrument.
Five minutes later it's bobbling around 125 mV, and the three voltage readings from before are now 65 to 75 mV, with or without a probe. This is more like it; this is what I expected. I might as well do a recalibration and see what it says afterward.
Most o-scope manufacturers state that the unit must be allowed to run 20 to 30 minutes so the input front end can stabilize before doing an auto-calibration. My Tek scope says to remove the probes before starting a recal. This scope's manual says to do the same, but the on-screen prompt at the start of cal says to "short terminals 1 and 2 before starting." I went with the manual and removed the probes before starting. BTW, most o-scope manufacturers recommend you recal if the ambient temperature changes by 5 deg C.
OK, the recal is done. All the settings so carefully arrived at through use are now out the window. After resetting everything in sight and saving it all, I do some testing. I take a fresh AA battery out of the package and measure it with a fairly new DMM: it reads 1.67 Vdc. Channel one reads 1.71 V; channel two reads 1.69 V. Fair enough.
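If you want to put numbers on "fair enough," here's a quick back-of-the-envelope sketch in Python using the readings above. The ~3% figure is only an assumption about typical entry-level DSO DC gain accuracy; check your own scope's datasheet.

    # Quick sanity check of the battery readings above (all in volts)
    dmm = 1.67
    readings = {"ch1": 1.71, "ch2": 1.69}

    for name, v in readings.items():
        err_pct = (v - dmm) / dmm * 100
        print(f"{name}: {v:.2f} V ({err_pct:+.1f}% vs. the DMM)")

    # ch1 comes out about +2.4%, ch2 about +1.2% -- inside the ~3% DC gain
    # accuracy many entry-level DSOs spec (an assumption; check your datasheet)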
And what are the traces doing while sitting there with no signal applied? Both are bouncing up and down slowly, at less than 10 Hz. Channel one's Vpp is 1 V to 1.25 V; channel two's Vpp is 780 mV to 820 mV. If I ground both at the probe, I get hashy traces that measure 340 mV (Vpp) on channel one and 187 mV (Vpp) on channel two. If I set both inputs to GND, they are dead flat, and the measurements read "****", meaning they are below the minimum usable value, something like 2.8 mV.
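For anyone wondering where those readouts come from, here's a rough sketch of how a DSO typically derives Vpp, Vrms, and Vavg from its sample record; the noise level in the simulated samples is just an assumption for illustration.

    import numpy as np

    # Simulated sample record: no signal applied, just front-end noise
    # (values in volts; the 50 mV noise level is an assumption for illustration)
    samples = np.random.normal(loc=0.0, scale=0.05, size=1000)

    vpp = samples.max() - samples.min()    # peak-to-peak
    vrms = np.sqrt(np.mean(samples**2))    # root mean square
    vavg = samples.mean()                  # average (DC) value

    print(f"Vpp = {vpp*1e3:.0f} mV, Vrms = {vrms*1e3:.1f} mV, Vavg = {vavg*1e3:.2f} mV")

    # Even with zero signal, noise alone gives a nonzero Vpp and Vrms,
    # which is why the readings "bobble" once the front end warms up.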
OK, here's the skinny: digital scopes tend to look "noisy" in their traces compared to an old-fashioned analog scope. The same noise was there on the analog scope, but the "bloom" caused by the phosphor on the screen made the trace look "fat" and largely consistent. Digital scopes, with their tiny, distinct dots on the LCD screen, look "noisy." Your front end is a couple of FETs, and like all transistors they generate their own noise. The probes also act like antennas, even with their cables' shielding. Shorting the tip to the ground clip reduces this noise some, but not entirely. An old TV repair trick for dark-screen sets was to short the ground clip to the probe tip and hold it just above the neck of the CRT: if the horizontal sweep was working, its signal showed up on the scope. No signal, dead horizontal oscillator.
So, if you follow the instructions in your manual for a calibration (after a 30-minute warmup) and you get the same sort of results I got, you're probably OK. The main thing is: do you get the proper readings when testing against known signals? Does the battery voltage look right compared to a reading on a known-good meter? If you can get your hands on one, test against a known-good signal generator. Remember, the quoted bandwidth for any instrument is usually for a sine wave. A square wave is only good to about one tenth that value, because a square wave's shape is carried in its odd harmonics, which extend well above its fundamental. I.e., if your scope is rated 20 MHz, that's for a sine wave; it will only accurately display square waves up to about 2 MHz.
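To see where the one-tenth rule comes from, here's a small Python sketch that builds a square wave from its Fourier series and counts how many odd harmonics fit under a 20 MHz bandwidth (idealized as a brick wall; a real scope's roll-off is gentler).

    import numpy as np

    f0 = 2e6    # square-wave fundamental (2 MHz)
    bw = 20e6   # scope bandwidth (20 MHz)
    t = np.linspace(0, 2 / f0, 2000)  # two periods

    def square_partial(t, f0, f_max):
        # Fourier series of a square wave: sum of odd harmonics up to f_max
        s = np.zeros_like(t)
        n = 1
        while n * f0 <= f_max:
            s += np.sin(2 * np.pi * n * f0 * t) / n
            n += 2
        return (4 / np.pi) * s

    wave = square_partial(t, f0, bw)
    passed = [n for n in range(1, int(bw // f0) + 1, 2)]
    print("Odd harmonics within bandwidth:", passed)  # [1, 3, 5, 7, 9]
    # With only harmonics up to the 9th, the edges are already visibly
    # rounded; push f0 much higher and the wave stops looking square.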
An oscilloscope is an extremely useful tool.
Have fun with your projects.
kenjj