Can anyone define this:
"with a true power factor of 0.9 or greater"
I don't know how much electrical theory and maths you know, but hopefully this explanation is pitched about right.
Power factor is a curious property of AC circuits. Essentially, volts and amps are constantly changing in an AC circuit. The voltage rises to a peak, falls to zero, goes negative, and comes back again. Because volts and amps are both changing continuously, power (volts x amps) is changing too. As a result, when people talk about '120 V' or '100 W' in an AC circuit, what they actually mean is an average taken over a whole cycle (1/60 second on a 60 Hz supply). Ideally you want the amps to be proportional to the volts. If the current is proportional to the voltage, you get optimal transfer of power from the grid to the device. If they aren't proportional, then at some points in the cycle the current flowing in the circuit is excessive for the amount of power actually being transferred.
This means you can calculate two values. 'Real power' (or true power) is found by measuring V and I instantaneously, thousands of times a second, multiplying each pair of readings together, and then averaging the results. This is an accurate measure of the actual energy being used. The other is 'apparent power', which is the 'average' (strictly, RMS) volts multiplied by the 'average' (RMS) amps. This represents the amount of load on wiring, circuit breakers, transformers, etc. In an ideal circuit these two measures are equal. In circuits where amps and volts aren't proportional (because the device acts as a short-term energy store), the 'apparent power' will be higher than the 'real power'.
The ratio of 'real' power to 'apparent' power is the 'power factor'. PF = 1 means optimal transfer of power. PF = 0.5 means the current (amps) is twice what you would expect for the amount of power (watts) being transferred.
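If a numerical example helps, here's a minimal Python sketch of the 'measure V and I thousands of times a second' idea from above. The numbers (60 Hz mains, 120 V RMS, 5 A RMS, a current lagging the voltage by 60 degrees) are made-up assumptions purely for illustration:

```python
import math

F_HZ = 60.0               # mains frequency
SAMPLES = 10_000          # samples taken over one full cycle
V_RMS = 120.0             # RMS supply voltage
I_RMS = 5.0               # RMS current drawn by the load
PHASE = math.radians(60)  # assume the current lags the voltage by 60 degrees

v_pk = V_RMS * math.sqrt(2)
i_pk = I_RMS * math.sqrt(2)

real_power = 0.0
for n in range(SAMPLES):
    t = n / (SAMPLES * F_HZ)                             # time within one cycle
    v = v_pk * math.sin(2 * math.pi * F_HZ * t)          # instantaneous volts
    i = i_pk * math.sin(2 * math.pi * F_HZ * t - PHASE)  # instantaneous amps
    real_power += v * i                                  # multiply each V,I pair...
real_power /= SAMPLES                                    # ...then average = real power

apparent_power = V_RMS * I_RMS            # RMS volts x RMS amps
power_factor = real_power / apparent_power

print(f"real power:     {real_power:6.1f} W")       # ~300 W
print(f"apparent power: {apparent_power:6.1f} VA")  # 600 VA
print(f"power factor:   {power_factor:6.2f}")       # ~0.50
```

With these assumed numbers the load only uses about 300 W of real power, but the wiring still has to carry the full 600 VA worth of current, hence PF = 0.5.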
Can we get anyone with an engineering background (i.e. not a PSU marketing site) to say why we'd care about .9 TPF vs. .8 TPF, or .1 TPF, for that matter?
For home users, the benefit is pretty marginal:
The lower the PF, the more amps have to flow in the circuit to deliver the same power to your device. If you've got an uber-PC using 500 W and it has PF = 1, it will draw 500 / 120 = 4.2 A. If however it has a more typical PF = 0.6, it will draw 500 / (120 x 0.6) = 7 A.
So if you have a 15 A circuit for your computer room, you'd only be able to run two of the low-PF PCs from it, but three of the high-PF ones. It can also help if you run the PC from a UPS: the lower current drawn from the UPS reduces its operating temperature and prolongs battery life.
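Here's that arithmetic as a quick sketch, using the same made-up 500 W PC, 120 V supply and 15 A breaker as above:

```python
def current_draw(watts, volts=120.0, pf=1.0):
    """Amps drawn for a given real power, supply voltage and power factor."""
    return watts / (volts * pf)

for pf in (1.0, 0.9, 0.6):
    amps = current_draw(500, pf=pf)
    pcs_per_breaker = int(15 // amps)   # whole PCs that fit on one 15 A circuit
    print(f"PF {pf:.1f}: {amps:4.1f} A per PC, {pcs_per_breaker} PCs on a 15 A circuit")
```

At PF = 1.0 or 0.9 you get three PCs per breaker; at PF = 0.6 you only get two.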
If you've got some kind of alternative energy system (e.g. off grid solar, gas generator, etc.) then it may also be in your interest to keep current down. That's pretty much it for home users.
Power/energy meters (this includes the main electricity meter at your property) measure 'real power'; even the cheapest plug-in energy meters do this. If, however, you measure volts and amps separately with a multimeter and multiply them together, what you get is apparent power.
In the US, and many other countries, home electricity customers are charged only for 'real power'; the meters ignore 'apparent power'. So home users don't save a significant amount of money by switching to a high-PF PSU (there is a small saving because the lower current means less resistive loss and voltage sag in your home wiring, but it really is marginal).
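For a rough idea of how marginal, here's a back-of-envelope sketch assuming about 0.1 ohm of wiring resistance in the branch circuit (an assumed figure, not a spec), using the current figures from the 500 W example above:

```python
R_WIRE = 0.1   # ohms of round-trip wiring resistance (assumed, not measured)

for pf, amps in ((1.0, 4.2), (0.6, 7.0)):
    loss = amps ** 2 * R_WIRE           # power wasted as heat in the house wiring
    print(f"PF {pf:.1f}: {amps:.1f} A -> {loss:.1f} W lost in the wires")
# roughly 1.8 W vs 4.9 W - a few watts either way, hence 'marginal'
```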
The people who win are businesses. Business electricity meters usually measure both 'real' and 'apparent' power, and the customer is billed for both separately.
The next bit gets pretty hardcore, so you can skip it if you want.
For highly technical reasons, 'true power factor' can be split into two components (both usually co-exist to some extent): 'displacement power factor', caused by motors and fluorescent lamps (inductive loads) or by capacitors, and 'harmonic power factor', caused by 'non-linear' electronic devices. The harmonic kind is far more troublesome. Some people make a point of specifying 'true power factor' because some older techniques for calculating or measuring PF ignored harmonics; TPF basically means the actual ratio of real to apparent power as described earlier.
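To make the distinction concrete, here's a sketch with a made-up current waveform that is perfectly in phase with the voltage (so displacement PF = 1.0) but badly distorted by a third harmonic, roughly the sort of peaky current a non-PFC electronic supply draws. The amplitudes are illustrative assumptions:

```python
import math

F_HZ = 60.0
SAMPLES = 12_000
V_PK = 170.0          # roughly 120 V RMS mains
I1_PK = 5.0           # fundamental current, perfectly in phase with the voltage
I3_PK = 4.0           # third-harmonic current drawn by the non-linear load

real = v_sq = i_sq = 0.0
for n in range(SAMPLES):
    wt = 2 * math.pi * n / SAMPLES          # one full cycle
    v = V_PK * math.sin(wt)
    i = I1_PK * math.sin(wt) + I3_PK * math.sin(3 * wt)
    real += v * i
    v_sq += v * v
    i_sq += i * i

real /= SAMPLES
v_rms = math.sqrt(v_sq / SAMPLES)
i_rms = math.sqrt(i_sq / SAMPLES)

true_pf = real / (v_rms * i_rms)            # harmonics drag this below 1
displacement_pf = math.cos(0.0)             # fundamental is in phase, so 1.0

print(f"displacement PF: {displacement_pf:.2f}")   # 1.00
print(f"true PF:         {true_pf:.2f}")           # ~0.78
```

The harmonic current transfers no useful power, yet it still has to flow through the wiring, which is why true PF ends up well below 1 even with zero phase shift.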
Harmonic power factor is a problem, and it's for this reason that some countries (notably in Europe) now require electronic PSUs to have PF > 0.95. There's no such requirement in US law, and it's left to the power companies to clean up the problems it causes. Office blocks have burned down because electricians and contractors didn't fully appreciate what harmonic power factor from large numbers of computers does to a building's wiring. Similarly, non-linear loads can interfere with other devices on the grid - big transformers can overheat even when they are well within their 'apparent power' rating, and the performance of motors can be severely degraded if they share a supply with non-linear loads (the non-linear loads produce an electrical braking effect on the motor, causing it to run badly - less torque, more vibration, more heat, poor starting). [Apart from the cost-saving item, these additional benefits are only significant if a business has huge numbers of electronic devices - e.g. hundreds or thousands of PCs - or some extremely heavy-duty electronic devices - e.g. industrial variable-speed motor drives.]