Originally posted by: yh125d
Originally posted by: dclive
Originally posted by: happy medium
Originally posted by: dclive
Originally posted by: Azn
So tell me, do you use a 300/350 watt power supply to power a GTX 280? If you don't, why do you suggest running such a power supply with such video cards when you haven't had any experience with it, other than reading someone else's testing? That testing doesn't tell you much beyond the fact that they ran a much higher rated power supply while measuring wattage.
Someone suggested that a GTX 280 used 236W.
I showed two tests - from two good sources - that showed it doesn't. Both tests show the entire system run using under 300W.
Are Anandtech and Tom's Hardware unreliable?
No, they are not unreliable, but they do give novice users the impression that you can use a cheap 300/350 watt PSU to run a high-end gaming system just because it only draws 275 watts from the wall. It depends on the +12V rail amperage, PSU efficiency, and the ambient temperature in which these components operate.
The OP has a 300 watt (el cheapo) PSU and should not even consider running anything more than a 4770, if that.
Why does PSU efficiency have anything to do with this? It seems like you're now saying it's _nothing_ to do with wattage (which I agree with - but PSU efficiency is usually tied to wattage) and everything to do with the 12V rail power - is that right?
Efficiency always matters. Especially when you're judging based on AC draw. For example...
Systems A and B both require 250w DC at full load. System A has an 85% efficient PSU at that load; system B has a 75% efficient PSU at that load. System A will pull 250w/.85 = 294w AC from the wall. System B will pull 250w/.75 = 333w AC from the wall. By only changing efficiency, system B appears to be using about 40w more than system A, when it's really using the same power.
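The arithmetic above can be sketched in a few lines of Python (the `ac_draw` function name and the specific figures are just illustrations of the quoted example, not measured data):

```python
def ac_draw(dc_load_w: float, efficiency: float) -> float:
    """AC wall draw in watts for a given DC load and PSU efficiency (0-1)."""
    return dc_load_w / efficiency

# Both systems need the same 250 W DC, but the wall meter reads differently:
system_a = ac_draw(250, 0.85)  # ~294 W at the wall
system_b = ac_draw(250, 0.75)  # ~333 W at the wall
print(round(system_a), round(system_b), round(system_b - system_a))
```

The roughly 40 W gap at the wall is pure conversion loss, which is why judging a card's power draw from AC measurements alone can mislead.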
Also, an efficient PSU will run cooler, as more AC power is converted into DC power rather than heat. Cooler-running PSUs deliver cleaner, more stable power and quieter operation.