The explanations given are pretty much on target, but let me clarify a couple of points.
When we talk about voltages and currents in an AC system, we use their RMS (root-mean-square) values, which are the peak value of the sine wave divided by the square root of two. The RMS values are used because multiplying the RMS voltage times the RMS current gives you the average watts. By average here, I mean the average power delivered over a complete cycle. At each instant, the actual power is equal to the actual voltage times the actual current. Assuming the voltage and current are in phase, the resulting plot of instantaneous power looks like a rectified sine wave. Again, the average value of that rectified sine wave is equal to the RMS voltage times the RMS current. Pretty similar to simple DC circuits so far...
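If you want to convince yourself numerically, here's a minimal sketch (the 120 V RMS, 10 A RMS, 60 Hz values are just example numbers I picked, not from anyone's question) showing that averaging the instantaneous power v(t)*i(t) over one cycle lands right on Vrms times Irms when the two waves are in phase:

```python
import numpy as np

# Assumed example values: 120 V RMS, 10 A RMS, 60 Hz
V_rms, I_rms = 120.0, 10.0
V_peak, I_peak = V_rms * np.sqrt(2), I_rms * np.sqrt(2)

w = 2 * np.pi * 60                              # angular frequency
t = np.linspace(0, 1/60, 10_000, endpoint=False)  # one full cycle

v = V_peak * np.sin(w * t)   # instantaneous voltage
i = I_peak * np.sin(w * t)   # instantaneous current, in phase with voltage
p = v * i                    # instantaneous power (rectified-sine shape)

print(np.mean(p))      # ~1200.0 W, the average over the cycle
print(V_rms * I_rms)   # 1200.0 W, RMS voltage times RMS current
```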
Now if you introduce a phase shift of 90 degrees between the current and voltage waves (either way; it doesn't matter), then you'll see something interesting. The instantaneous power values will be positive for one half of the cycle and negative for the other half of the cycle, making the average power delivery zero! Phase shifts between 0 and 90 degrees will show more positive power and less negative power, so the average power delivery is positive but less than what it would be at 0 degrees. So we need to take the phase angle between the current and voltage waves into account when calculating the actual power delivered. The equation becomes RMS voltage times RMS current times the cosine of the angle between them. And the value of that cosine is also called the "power factor".
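Here's the same sketch extended to show that effect (again with my made-up 120 V / 10 A / 60 Hz numbers): shift the current by some angle, average the instantaneous power numerically, and compare against RMS voltage times RMS current times the cosine of that angle. The 90-degree case averages out to essentially zero.

```python
import numpy as np

V_rms, I_rms = 120.0, 10.0   # assumed example values
w = 2 * np.pi * 60
t = np.linspace(0, 1/60, 10_000, endpoint=False)

for phi_deg in (0, 45, 90):
    phi = np.radians(phi_deg)
    v = V_rms * np.sqrt(2) * np.sin(w * t)
    i = I_rms * np.sqrt(2) * np.sin(w * t - phi)   # current lags voltage by phi
    avg_power = np.mean(v * i)                     # numerical average over a cycle
    formula = V_rms * I_rms * np.cos(phi)          # Vrms * Irms * cos(angle)
    print(phi_deg, round(avg_power, 1), round(formula, 1))
# 0 deg -> 1200.0 W, 45 deg -> ~848.5 W, 90 deg -> ~0.0 W
```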
Of course, this is important in determining how much (RMS) current will actually be needed to deliver a desired amount of power to a load that has significant inductive characteristics (such as electric motors), and therefore in anticipating voltage drops and in selecting transformer and wire sizes. Conductors and transformers are essentially capable of carrying a certain amount of current regardless of the power that current ends up delivering to a load. Transformers usually have a so-called kVA (kilovolt-ampere) rating, which is just their rated RMS voltage times their rated RMS current.
So, using Fandu's numbers, a 1 kVA load with a 0.9 PF would actually consume just 900 watts of power (on average over each cycle). The difference between the 1000 volt-amps and the 900 watts does NOT mean that 100 watts is "lost" somewhere. It does mean that the motor is pulling more current than the minimum that would be required to deliver that same 900 watts to a purely resistive load (where the PF = 1), and that means some loss in efficiency due to greater resistive losses in the conductors and higher voltage drops. As Arcas says, this difference between volt-amps and watts is being used to build up and drain off the magnetic fields from inductances (and electric fields from capacitances) every cycle; real power is being taken and stored during one half of the cycle but then delivered back during the next half cycle -- making the average power delivered by this "extra" current equal to zero.
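To put some current numbers on that (assuming a 120 V RMS supply, which is my assumption, not Fandu's), here's the arithmetic: the 0.9-PF load draws more amps than a purely resistive load delivering the same 900 watts would, and those extra amps are what the conductors and transformers have to be sized for.

```python
V_rms = 120.0            # assumed supply voltage, not given in the original question
S = 1000.0               # apparent power in volt-amps (1 kVA)
pf = 0.9                 # power factor

P = S * pf               # real (average) power: 900 W
I_actual = S / V_rms     # current the 0.9-PF load actually draws: ~8.33 A
I_resistive = P / V_rms  # current a PF = 1 load would need for the same 900 W: 7.5 A

print(P, round(I_actual, 2), round(I_resistive, 2))
```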
This may be more than anyone wants to know, but another way of talking about this difference between volt-amps and watts is to introduce the concept of "VARs" (Volt-Amps-Reactive). The calculation for VARs is RMS voltage times RMS current times the sine of the phase angle between them. Flows on electrical power lines are most often measured in megawatts and megavars.
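Continuing the same 1 kVA, 0.9 PF example, a quick sketch of how the watts and VARs split up: real power uses the cosine of the phase angle, reactive power (VARs) uses the sine, and the two combine (as the two legs of a right triangle) back into the volt-amp figure.

```python
import math

S = 1000.0            # apparent power in volt-amps
pf = 0.9
phi = math.acos(pf)   # phase angle between voltage and current

P = S * math.cos(phi)   # real power in watts: 900
Q = S * math.sin(phi)   # reactive power in VARs: ~436
print(round(P), round(Q), round(math.hypot(P, Q)))   # 900, 436, 1000
```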
One more aside: customer meters only measure the average watts delivered; typical residential customers are not charged more because of the larger current needed to deliver power to reactive loads (i.e. not charged for VARs).
I hope this answers the questions even more completely.
