You need to measure both the voltage waveform and current waveform (since you are making a wattmeter you would be doing this anyway) and compare the difference in phase. The power factor is the cosine of the phase angle between the two.
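A minimal sketch of that phase method, assuming you already have sampled voltage and current arrays from an ADC (the sample rate, line frequency, and synthetic waveforms below are all assumptions for illustration): find a rising zero crossing in each waveform, convert the time offset to a phase angle, and take its cosine.

```python
import math

def zero_cross_time(samples, sample_rate):
    """Time of the first rising zero crossing, with linear
    interpolation between the two straddling samples."""
    for i in range(1, len(samples)):
        if samples[i - 1] < 0 <= samples[i]:
            frac = -samples[i - 1] / (samples[i] - samples[i - 1])
            return (i - 1 + frac) / sample_rate
    raise ValueError("no rising zero crossing found")

def power_factor_from_phase(v, i, sample_rate, line_freq=60.0):
    dt = zero_cross_time(i, sample_rate) - zero_cross_time(v, sample_rate)
    phase = 2 * math.pi * line_freq * dt  # seconds -> radians
    return math.cos(phase)

# Synthetic test: current lagging voltage by 60 degrees -> PF near 0.5
fs = 10000
t = [n / fs for n in range(400)]
v = [math.sin(2 * math.pi * 60 * x) for x in t]
i = [math.sin(2 * math.pi * 60 * x - math.pi / 3) for x in t]
pf = power_factor_from_phase(v, i, fs)
print(round(pf, 2))  # ~0.5
```

In a real meter you would do the same thing with comparator interrupts and a timer instead of arrays, but the arithmetic is identical.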
By the way - you can buy a wattmeter that works like that and measures power factor for $20. The "Kill-a-watt".
Anyway, the purpose is building a digital wattmeter, sort of like the "Kill-a-watt" meter. It would be fantastic if the schematic of said meter were readily available; that would really help.
If not, I would still like to know how to accurately measure Power Factor without the help of an oscilloscope.
Besides the phase method, you can measure the apparent power going to the load, which is the product of volts times amps, then measure the watts delivered to the load, and take the ratio of these two quantities ... if you are capable of taking these measurements.
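The ratio described above can be computed directly from samples, with no phase measurement at all: true power is the mean of the instantaneous v*i product, apparent power is Vrms times Irms. The waveforms below are synthetic assumptions (one full 60 Hz cycle, current lagging by 45 degrees).

```python
import math

def rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def power_factor(v, i):
    true_power = sum(a * b for a, b in zip(v, i)) / len(v)  # watts
    apparent_power = rms(v) * rms(i)                        # volt-amps
    return true_power / apparent_power

# One full cycle sampled at 100 points, 45-degree lag -> PF ~ 0.707
n = 100
v = [170 * math.sin(2 * math.pi * k / n) for k in range(n)]
i = [5 * math.sin(2 * math.pi * k / n - math.pi / 4) for k in range(n)]
pf = power_factor(v, i)
print(round(pf, 3))  # ~0.707
```

Averaging over a whole number of line cycles matters; a partial cycle biases both the mean product and the RMS values.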
That holds for DC. For AC, the product of volts times amps is apparent power. The product of volts times amps times the cosine of the phase angle between them is true power. For example, a capacitor will draw AC current when placed across an AC source, but it will draw no real power from the source, since the current and voltage are 90 degrees out of phase.
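The capacitor case works out numerically like this; the component values are assumptions chosen just to illustrate nonzero apparent power with essentially zero true power.

```python
import math

# Hypothetical example: a 10 uF capacitor across a 120 V RMS, 60 Hz source.
v_rms = 120.0
freq = 60.0
c = 10e-6

i_rms = v_rms * 2 * math.pi * freq * c   # I = V / Xc, with Xc = 1/(2*pi*f*C)
apparent = v_rms * i_rms                 # volt-amps, clearly nonzero
true_power = apparent * math.cos(math.radians(90))  # watts, essentially zero

print(round(i_rms, 3))     # ~0.452 A
print(round(apparent, 1))  # ~54.3 VA
```

So the meter reads about 54 VA of apparent power, but a wattmeter doing the v*i average correctly would read zero watts, and the ratio method gives PF = 0.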
One easy way to do it is to just measure the secondary of a transformer. This will be proportional to the input voltage by a fixed ratio, so if 110Vac gives, say, 3Vac out, then 120Vac would give 3.27Vac.
In the past I've used the same step-down transformer that drives the processor power supply to do this... but that's not as accurate as a separate transformer with a fixed resistive load.
Be careful using the transformer method: small transformers give a significantly higher voltage off load, so you need to calibrate your measurements to account for this.
A transformer's secondary voltage is normally specified at full load.
There are always more turns on the secondary than the nominal secondary-to-primary voltage ratio implies, to make up for the copper losses. A 120V to 12V transformer will not be 10:1; it'll be more like 10:1.2.
To calculate the real turns ratio, connect the primary to a known AC voltage, measure the secondary voltage, and divide the primary reading by the secondary reading.
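A worked example of that calibration step, with assumed readings (the nominal 12 V winding measuring high off load, as described above):

```python
# Assumed measurements for illustration:
primary_v = 120.0    # known AC voltage applied to the primary
secondary_v = 13.4   # measured off-load secondary of a nominal 12 V winding

real_ratio = primary_v / secondary_v
print(round(real_ratio, 2))  # ~8.96, not the nominal 10:1

# Any later secondary reading maps back to mains voltage with this ratio:
mains_estimate = 12.8 * real_ratio
print(round(mains_estimate, 1))  # ~114.6 V
```

The calibration constant folds in both the extra secondary turns and whatever fixed load you leave on the winding, which is why measuring it beats trusting the nameplate ratio.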
Understood, but in this case it's just used as a voltage reference, not a power source. I wouldn't think it would need to be loaded down much at all to make the voltage stable. It doesn't have to be at the correct secondary voltage; it just needs to be stable and linear with the primary.
As in, it wouldn't matter if a 12V secondary had an output of 16V due to no load, as long as it's still linear with what the voltage is on the primary.