OK, I am starting to understand this.
The pictures of the Signal Source, and the Test Record, explain things.
Reading some of the ITU document supplied by augustinetez, I think that I understand the term dBpp.
I have never seen that before, maybe I have learned something today.
Power Level Flatness. What I did and assumed: I set the level at 0dBm and varied the frequency over the unit's spectrum range, watching how much the power would change. But this is the best that I know.
That sounds like a good method and is what I would do to test the Power Level Flatness.
If, sweeping from 0.5 to 15GHz, the highest output power measured was +0.4dBm and the lowest output power measured was -0.3dBm, I would say that the "Flatness" was 0.7dB.
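Purely as an illustration (the readings, and Python itself, are my own assumptions, not anything from your setup), this is the arithmetic I mean:

```python
# Illustrative only: hypothetical level readings taken across the sweep.
readings_dbm = [0.1, 0.4, -0.3, 0.2, 0.0]

# Flatness as I use the term: total peak-to-peak spread of the output level.
flatness_db = max(readings_dbm) - min(readings_dbm)
print(f"Flatness: {flatness_db:.1f} dB")  # 0.7 dB for these example values
```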
Your Test Record:
A level (0dBm) is specified.
A Tolerance (±0.8dBpp) is specified.
But dBpp is with reference to the peak output power, so how can we have something which is higher (+0.8dB) than the highest output power?
I think that my test record would look like this:
A level (0dBm) is specified.
A Tolerance (±0.8dB) is specified.
I would record the highest output power and the lowest output power, each as the dB difference from the reference power (0dBm).
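A rough sketch of how I would log that, again with made-up numbers and Python purely for illustration:

```python
# Illustrative only: record the extremes as dB deviations from the 0dBm reference.
ref_dbm = 0.0
readings_dbm = [0.1, 0.4, -0.3, 0.2, 0.0]  # hypothetical measurements

high_dev_db = max(readings_dbm) - ref_dbm  # +0.4 dB above reference
low_dev_db = min(readings_dbm) - ref_dbm   # -0.3 dB below reference
in_spec = abs(high_dev_db) <= 0.8 and abs(low_dev_db) <= 0.8

print(f"High: {high_dev_db:+.1f} dB, Low: {low_dev_db:+.1f} dB, "
      f"within tolerance: {in_spec}")
```

With a ±0.8dB tolerance, both deviations just have to stay within 0.8dB of the 0dBm reference, which is easy to check against the recorded figures.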
Speaking from a position of zero experience of your system, I think the use of dBpp is wrong.
Does this make sense?
JimB