Strictly speaking, multimeters don't measure AC voltage or current directly, because the magnitude changes from one instant to the next. Instead, they measure the result of some mathematical function of the AC voltage/current, and in less expensive meters the waveform being measured is assumed to be sinusoidal (whether it actually is or not), so an accurate reading is obtained directly only for sine waves. The output of the math function implemented in a particular multimeter is a DC voltage that can then be measured. To see the detailed shape of an unknown waveform from instant to instant, an oscilloscope is required. Multimeters also impose tight limits on the frequency and crest factor of the AC input if a measurement is to meet a stated accuracy.
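To illustrate the sine-wave assumption, here is a small sketch (the waveform names and sample counts are my own, not from the post): an average-responding meter rectifies the input, averages it, and scales by the sine-wave form factor (π/(2√2) ≈ 1.1107) so that a sinusoid reads its true RMS. Applied to a square wave, that same scaling reads about 11% high.

```python
import math

# Hypothetical sampled waveforms (illustrative only), both with 1.0 peak
N = 10000
sine = [math.sin(2 * math.pi * k / N) for k in range(N)]
square = [1.0 if math.sin(2 * math.pi * k / N) >= 0 else -1.0 for k in range(N)]

def true_rms(samples):
    # Root-mean-square: square root of the mean of the squared samples
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def avg_responding_reading(samples):
    # Average-responding meter: rectify, average, then scale by the
    # sine-wave form factor so a sinusoid reads its true RMS value
    rectified_avg = sum(abs(s) for s in samples) / len(samples)
    return rectified_avg * (math.pi / (2 * math.sqrt(2)))

print(true_rms(sine), avg_responding_reading(sine))      # both ~0.707
print(true_rms(square), avg_responding_reading(square))  # 1.0 vs ~1.11
```

For the sine wave both functions agree at about 0.707 (1/√2 of the peak), but the square wave's true RMS is 1.0 while the average-responding reading comes out around 1.11.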
There are several ways the math function can be implemented and its result measured. Sometimes the function (average, peak, or RMS) is computed digitally after the AC input is sampled with an ADC. Most often the AC input is rectified and applied to a low-pass filter to extract its average value. There are also specialized ICs that compute the RMS of the AC input; numerous circuit examples are easily found on the Internet. AC current is usually converted to an AC voltage by passing the current through a low-value resistor and measuring the voltage across that resistor.
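A minimal sketch of the digital approach combined with the shunt-resistor trick (the shunt value and sampled waveform here are made-up numbers for illustration): compute RMS from ADC samples of the voltage across a low-value resistor, then apply Ohm's law to recover the current.

```python
import math

def rms(samples):
    # Digital RMS of a list of ADC samples
    return math.sqrt(sum(s * s for s in samples) / len(samples))

shunt_ohms = 0.1  # hypothetical low-value current-sense resistor

# Pretend these voltages were sampled across the shunt by an ADC:
# a 0.5 V-peak sine wave, one full cycle
v_samples = [0.5 * math.sin(2 * math.pi * k / 1000) for k in range(1000)]

# Ohm's law: I = V / R, so RMS current = RMS voltage / shunt resistance
i_rms = rms(v_samples) / shunt_ohms
print(i_rms)  # ~3.54 A RMS for a 0.5 V-peak drop across 0.1 ohm
```

Note that a real meter would also have to account for ADC resolution, sample rate relative to the input frequency, and the power dissipated in the shunt; this sketch ignores all of that.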