I wanted to know how much power my 22" LED TV/monitor actually uses. The inline brick power supply is rated at 1.7 A 120 V in, 12 V 5 A out. If it really drew the full rated 1.7 A at the wall, I'm sure it would be a fire hazard from the amount of heat it would put out, but it only gets a little warm.
I've got a clamp-on AC ammeter, so by modifying an IEC power cord I can get it around the hot leg only. Unfortunately, the 200 A full-scale range only reads to tenths of amps, and a 0.2 A reading has pretty wide error margins.
Thinking about it a bit, I wound 5 turns of the hot wire through the clamp. Now I get a reading of 1.7 A, which divided by 5 means the TV is actually drawing 0.34 A from the wall.
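For reference, here's the arithmetic as a quick Python sketch (values are from my setup; note that amps times volts gives volt-amps, which only equals true watts if the power factor is 1):

```python
# Multi-turn clamp trick: the clamp sums the current in every
# conductor passing through it, so N turns multiply the reading by N.
TURNS = 5          # passes of the hot wire through the clamp jaws
READING_A = 1.7    # what the clamp displays
LINE_V = 120       # nominal wall voltage

actual_a = READING_A / TURNS      # real line current
apparent_va = actual_a * LINE_V   # apparent power (VA), not true watts

print(round(actual_a, 2))     # 0.34
print(round(apparent_va, 1))  # 40.8
```

So the TV draws roughly 41 VA; the true wattage would be somewhat lower if the supply's power factor is below 1.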
Is there any reason this trick would not give accurate readings?