Domestic meters measure only true energy usage: i.e. they multiply the instantaneous voltage by the instantaneous current to calculate the instantaneous power, then integrate that with respect to time to calculate the total energy usage in kilowatt-hours (kWh).
This means that if you converted a full-wave light to half wave, it would produce half the light, use half the energy, and be billed half as much.
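As a rough illustration of that calculation (the values here are assumptions chosen just for the sketch: a 50 Hz, 100 V RMS supply and a 10 Ω resistive load), you can mimic what the meter does in a few lines of NumPy:

```python
import numpy as np

f = 50.0                                   # assumed mains frequency (Hz)
t = np.linspace(0.0, 1.0, 100_000)         # one second, finely sampled
v = 100 * np.sqrt(2) * np.sin(2 * np.pi * f * t)   # assumed 100 V RMS supply

i_full = v / 10.0                          # assumed 10 ohm resistive load (full-wave)
i_half = np.where(v > 0, i_full, 0.0)      # same load behind an ideal half-wave rectifier

def true_energy_kwh(v, i, hours):
    """What a domestic meter bills: average of v*i, scaled to the run time."""
    avg_power_w = np.mean(v * i)           # instantaneous power, averaged over the window
    return avg_power_w * hours / 1000.0

print(true_energy_kwh(v, i_full, hours=1))   # ~1.0 kWh
print(true_energy_kwh(v, i_half, hours=1))   # ~0.5 kWh -> billed half as much
```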
-----
Industrial electricity meters are often more complex: they take into account not just the actual amount of energy used, but how hard the grid has to work in order to deliver it. They measure two things - the actual power used, using the calculation above, and the 'apparent power'.
The apparent power is calculated by measuring the RMS voltage and the RMS current, and multiplying them. This 'apparent power' is how hard the grid has to work to deliver your electricity. The difference between apparent power and true power is called 'reactive power' (a bit of a misnomer, as reactive power transfers no net energy at all - it's a reflection of how difficult your load is to supply). The meter then integrates this 'reactive power' over time as kilovolt-amp-hours reactive (kVArh). When an industrial user gets their bill, they are billed for the energy used (kWh) and also for the reactive load (kVArh).
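As a minimal sketch of those three quantities, using this answer's framing of 'reactive' as whatever the grid supplies beyond the true power (real meters may define it differently, so treat these helpers as illustrative only):

```python
import numpy as np

def rms(x):
    return np.sqrt(np.mean(x ** 2))

def true_power_w(v, i):
    return np.mean(v * i)            # energy actually delivered, per second

def apparent_power_va(v, i):
    return rms(v) * rms(i)           # how hard the grid has to work

def reactive_va(v, i):
    # Answer's framing: anything supplied beyond the true power.
    return apparent_power_va(v, i) - true_power_w(v, i)
```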
Example:
Let's assume we have a full-wave circuit that draws 10 A at 100 V AC. It consumes 1000 W. The meter records 1 kWh for each hour of operation - and because the current waveform matches the voltage waveform, it records 0 kVArh. If you run the load for 1000 hours a year, you will receive a bill for 1000 kWh and 0 kVArh.
However, if we change that same circuit to half wave, we cut the power consumption to 500 W. The voltage stays at 100 V AC, and the meter records 0.5 kWh for each hour of operation. However, because the half-wave current is not the ideal waveform for an AC system, the RMS current is actually about 7 A (not 5 A) - in other words, if you hooked up an AC ammeter, it would read about 7 A. The electricity meter measures the same thing and records a discrepancy of roughly 0.2 kVArh for each hour of operation. After 1000 hours of operation you receive a bill for 500 kWh and 200 kVArh.
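Plugging the assumed numbers (100 V RMS, 50 Hz, a 10 Ω resistive load behind an ideal half-wave rectifier) into a quick check reproduces the figures above:

```python
import numpy as np

t = np.linspace(0.0, 0.02, 20_000)                 # one 50 Hz cycle (frequency assumed)
v = 100 * np.sqrt(2) * np.sin(2 * np.pi * 50 * t)  # 100 V RMS supply
i = np.where(v > 0, v / 10.0, 0.0)                 # half-wave current into a 10 ohm load

i_rms = np.sqrt(np.mean(i ** 2))       # ~7.07 A - what an AC ammeter would read
p_true = np.mean(v * i)                # ~500 W  -> 0.5 kWh per hour
s_apparent = 100.0 * i_rms             # ~707 VA
print(i_rms, p_true, s_apparent)
print((s_apparent - p_true) / 1000.0)  # ~0.2 kVA -> ~0.2 kVArh per hour
```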
The kVArh are cheaper than the kWh, because you have to burn coal/gas/uranium to generate the kWh, whereas the kVArh just reflect how much load the grid has come under.
This is the basis of 'power factor': if the current waveform doesn't match the voltage waveform in an AC system, RMS current x RMS voltage is higher than the actual number of watts. Electronic devices (e.g. PC PSUs without power-factor correction) have a very highly distorted current waveform - this means that they load down the grid more than they need to for the energy they use. For industrial users, e.g. large offices, using PCs with PFC PSUs can reduce electricity bills significantly - not because the PSUs are more efficient, but because they avoid clocking up huge numbers of kVArh on an industrial meter.
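Using the same assumed 100 V / 10 Ω example, power factor is simply true power divided by apparent power - about 1.0 for the plain resistive load and about 0.71 for the half-wave version:

```python
import numpy as np

t = np.linspace(0.0, 0.02, 20_000)
v = 100 * np.sqrt(2) * np.sin(2 * np.pi * 50 * t)   # assumed 100 V RMS, 50 Hz
i_resistive = v / 10.0                               # current matches the voltage waveform
i_halfwave = np.where(v > 0, i_resistive, 0.0)       # distorted current waveform

def power_factor(v, i):
    p_true = np.mean(v * i)
    s_apparent = np.sqrt(np.mean(v ** 2)) * np.sqrt(np.mean(i ** 2))
    return p_true / s_apparent

print(power_factor(v, i_resistive))   # ~1.0  - no kVArh clocked up
print(power_factor(v, i_halfwave))    # ~0.71 - grid works harder than the watts suggest
```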