>There is no formula.
I'm with Mark R on this.
It is just a rating. Some manufacturers are conservative, some are realistic, some are BSers, some are crooks.
The rating does not represent anything definite. The power supply may continue to supply power just fine well over its rating if it is well designed, or it may overheat and burst into flames before it even gets there if it is one of those $20 400W wonders.
A real professional power supply would have a guaranteed rating. They would say how close the voltage stays to nominal, with how much ripple, at their guaranteed output current. In order to meet that guarantee it would have to be designed to exceed it. They would also state definitely whether all outputs can be used at their maximums continuously (usually not).
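To see why "all outputs at maximum continuously" usually can't be honored, here is a minimal sketch. The rails and their limits below are made-up illustrative numbers, not from any real datasheet; the point is that the per-rail maximums often sum to more than the label wattage.

```python
# Hypothetical per-rail maximums for an imaginary "400W" supply.
# These numbers are illustrative, not taken from any real datasheet.
rails = {
    "+3.3V": (3.3, 28.0),   # (volts, max amps)
    "+5V":   (5.0, 30.0),
    "+12V":  (12.0, 18.0),
    "-12V":  (12.0, 0.8),
    "+5VSB": (5.0, 2.0),
}

label_watts = 400.0
combined = sum(volts * amps for volts, amps in rails.values())
print(f"Sum of per-rail maximums: {combined:.0f} W (label says {label_watts:.0f} W)")
# If combined > label_watts, the rails cannot all be loaded to their
# maximums at once -- the label is a combined limit, not a sum.
```

With these example numbers the rails add up to roughly 478W against a 400W label, so at least one rail has to be derated whenever the others are loaded hard.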
The obstacle to putting out high wattage is the power consumed by the power supply itself, the power that never makes it to the output. This wasted power overheats everything. The ferrite cores are too small, the heatsinks are too small, and the fans are too weak to get rid of the waste heat if these home PC supplies really had to put out 400W.
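As a rough illustration of the thermal arithmetic (the efficiency figures here are assumptions, not measured values): the heat the supply has to shed is input power minus output power, and it grows quickly as efficiency drops.

```python
# Waste heat inside a supply delivering 400 W, at assumed efficiencies.
# The efficiency values are illustrative guesses, not measurements.
output_watts = 400.0

for efficiency in (0.85, 0.75, 0.65):
    input_watts = output_watts / efficiency
    waste_watts = input_watts - output_watts
    print(f"{efficiency:.0%} efficient: draws {input_watts:.0f} W, "
          f"dissipates {waste_watts:.0f} W as heat")
```

At 65% efficiency that is over 200W of heat trapped inside the case, which undersized cores, heatsinks, and fans simply cannot remove.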
I would like to see the test conditions under which these home PC supplies get their ratings. There do not appear to be any. It looks like what they do is find the junkiest 230W supply they can, and then say that if that power supply is a 230, then mine is a 400.