I never understood why GPU power recommendations aren't for the card itself. Instead they rate the whole computer, without actually knowing what the rest of the system uses. It's like a dryer saying you need a 200 amp electrical panel when the machine itself doesn't draw anywhere close to 200 amps.
When I bought my two video cards the box said 400W, so I figured each card required 400W on its own. Naturally I got a 1000W PSU, thinking that at maximum load it would still leave 200W for the rest of the system. Only later did I learn that the rating on the box assumes the draw of the rest of the PC as well. So how much does the card itself REALLY use? That's the value they should be putting on the box. It should be up to the consumer to figure out the total wattage they need, since assuming a value based on hardware the manufacturer knows nothing about only makes things confusing.
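Just to show what I mean, here's a quick back-of-the-envelope sketch in Python. The component wattages are made-up placeholder numbers, not measurements from my system; swap in the actual board power figures for your own hardware:

```python
# Rough PSU sizing sketch. All wattages below are assumptions for illustration,
# not measured values -- use your own components' published power figures.
components_w = {
    "gpu_1": 250,   # actual board power of one card, NOT the "recommended PSU" number
    "gpu_2": 250,
    "cpu": 150,
    "motherboard_ram_drives_fans": 100,
}

system_load_w = sum(components_w.values())   # worst-case draw of the whole system
headroom = 0.20                              # keep ~20% spare capacity on the PSU
recommended_psu_w = system_load_w * (1 + headroom)

print(f"Estimated peak system load: {system_load_w} W")
print(f"PSU to shop for:            {recommended_psu_w:.0f} W or larger")
```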
The only real way to know is to oversize the PSU (without cheaping out on quality), then do a load test with a clamp-on meter to confirm you're not too close to, or exceeding, the PSU's rating. Keep in mind that what you pull from the wall is more than what the PSU is actually delivering, because of conversion losses inside the PSU, but it still gives you a general idea.
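If you want to turn a wall reading into a rough DC load figure, something like this works. The efficiency number here is an assumption (roughly 80 Plus Gold territory at mid load); check the efficiency curve for your own unit:

```python
# Convert a wall-side reading (clamp meter / kill-a-watt) into the approximate
# DC load the PSU is delivering. Efficiency is an assumed figure, not universal.
wall_reading_w = 700      # what the meter shows at the outlet
efficiency = 0.90         # assumed PSU efficiency at this load point

dc_load_w = wall_reading_w * efficiency   # power actually going to the components
print(f"Approximate DC load on the PSU: {dc_load_w:.0f} W")
```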