Everything I read tells me that watts should always be less than or equal to VA, according to the power factor. However, I am running an electronic ballast to light a metal halide lamp, and I'm taking measurements between the ballast and the lamp.
The ballast is rated:
Input voltage: 108-305 V, 50/60 Hz
Input current: 0.364 A @ 120 V, 0.165 A @ 277 V
Output power: 39 W
High power factor: >0.95
THD: <9%
My meter is reading:
Watts: 83.4
VA: 37.5
V: 84.1
I: 0.447
It's a 39 W lamp, so why is the measured wattage so high, and why is it higher than the VA?
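For reference, here is a quick sanity check of those readings (a rough sketch, assuming the V and I values are RMS, as the meter presumably reports):

```python
# Sanity check of the posted meter readings.
# Assumes the V and I readings are RMS values.

v_rms = 84.1    # volts, from the meter
i_rms = 0.447   # amps, from the meter
p_meas = 83.4   # watts, as displayed by the meter

s = v_rms * i_rms    # apparent power in volt-amps
pf = p_meas / s      # power factor implied by the displayed watts

print(f"Apparent power: {s:.1f} VA")       # ~37.6 VA, matching the meter's VA reading
print(f"Implied power factor: {pf:.2f}")   # ~2.2, which can't be right (PF should be <= 1)
```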
Also, I've read that using a computer power supply to power your breadboard and experiment circuits is a good idea. I have a 500 W power supply, but I am alarmed at the current ratings: at 5 V it indicates 38 A, and at 12 V, 27 A. Why would this not kill me and/or burn up any circuit I make? Also, this has ATX cables; do I just use my meter to find out which voltage each one carries? And is ground referenced to the supply case, or is there an actual ground output? I don't have a meter yet, so I haven't tried anything. Any info would be awesome!
Your ATX supply's voltages are referenced to the ground leads in the output cables, not (necessarily) to the case. The leads are usually color coded, e.g. red = 5 V, yellow = 12 V, black = GND, etc. Use your meter to double check, as you said.
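If it helps, here is a rough reference of the usual ATX wire colors (just the common convention, not a guarantee for your particular supply; verify each lead with your meter before trusting it):

```python
# Typical ATX output wire colors (common convention only; some supplies
# deviate, so always confirm with a meter).
atx_colors = {
    "black":  "ground / common (0 V reference for all rails)",
    "red":    "+5 V",
    "yellow": "+12 V",
    "orange": "+3.3 V",
    "blue":   "-12 V",
    "purple": "+5 V standby (live whenever mains is connected)",
}

for color, rail in atx_colors.items():
    print(f"{color:7s} -> {rail}")
```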
The 5 V and 12 V won't kill you. The currents are substantial, though, as you point out, so don't short the leads or you might end up spot welding things.
Also, the ability of a power supply to deliver its rated current does not mean the maximum current always flows. The load (your circuit) will draw as much as it needs, up to the rated output.
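To put numbers on that (an illustrative sketch with a made-up breadboard load, not anything from your actual setup):

```python
# The load determines the current, not the supply's rating.
# Hypothetical load: a breadboard circuit equivalent to 250 ohms on the 5 V rail.

v_rail = 5.0      # volts, the ATX 5 V rail
r_load = 250.0    # ohms, hypothetical load
i_rating = 38.0   # amps, the supply's 5 V rating

i_drawn = v_rail / r_load   # Ohm's law: I = V / R

print(f"Current drawn: {i_drawn * 1000:.0f} mA")                      # 20 mA
print(f"Fraction of the 38 A rating used: {i_drawn / i_rating:.2%}")  # 0.05%
```

The supply can deliver up to 38 A, but this load only asks for 20 mA; the rating is a ceiling, not what actually flows.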