What? The thread you started over at XtremeSystems wasn't good enough?
Watts is Amps multiplied by Volts. The wattage you see on the label (and as part of the model name, etc.) is just the sum of all of the rails combined. Since your power supply supplies different voltages on different "rails", that total power output really doesn't mean much when doing side-by-side comparisons.
For example: 5V @ 40A is 200W. And 12V @ 17A is 200W. Which is more important? Well... considering that your CPU gets its juice from a regulated +12V lead (either a 4-pin or 8-pin power connector) and your video cards regulate the GPU's core voltage off of the +12V rail (either from a PCI-e connector or through the slot), I'd say the +12V rail. Then of course you have fans, drive motors, FireWire... all use +12V.
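If it helps to see the arithmetic, here's a quick Python sketch using the example numbers above (nothing here is from a real PSU label):

```python
# Watts = Amps x Volts, computed per rail
def rail_watts(volts, amps):
    return volts * amps

print(rail_watts(5, 40))   # 5V @ 40A  -> 200W
print(rail_watts(12, 17))  # 12V @ 17A -> 200W
# Same 200W either way, but very different usefulness:
# the CPU and GPU pull almost everything from +12V.
```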
+5V isn't used much anymore, but it's still important. Some boards regulate RAM voltage from the +5V (most still use +3.3V). The logic boards of all of your drives use +5V, PCI cards use +5V, and USB uses +5V. But the loads of all of those devices together don't even come close to the load put on a +12V rail. So if I'm looking at two power supplies of equal build quality and price, and they both have the same "total output wattage" claim, but one has more amperage available on the +12V rail than the other, I'm going to pick the one with more amperage on the +12V rail.
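To put that tiebreaker in code, here's a minimal sketch with two hypothetical PSUs (the per-rail amp numbers are made up, not real models) that claim roughly the same total wattage:

```python
# Made-up amps-per-rail specs for two same-wattage PSUs
psu_a = {"+3.3V": 30, "+5V": 38, "+12V": 20}
psu_b = {"+3.3V": 24, "+5V": 30, "+12V": 25}

def total_watts(psu):
    # Sum of volts x amps across all rails
    return sum(float(rail.strip("+V")) * amps for rail, amps in psu.items())

print(total_watts(psu_a))  # 529.0 -- nearly identical totals...
print(total_watts(psu_b))  # 529.2

# ...so the +12V amperage breaks the tie
better = "psu_a" if psu_a["+12V"] > psu_b["+12V"] else "psu_b"
print(better)  # psu_b, because 25A > 20A on +12V
```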