Why does NVIDIA recommend having such a high-wattage PSU?

PCJake

Senior member
Apr 4, 2008
319
0
0
Well, I ran my planned build, which includes an EVGA GTX Titan, through the eXtreme PSU calculator, and it's suggesting that I need around 475 watts from my PSU. However, on the EVGA GTX Titan Newegg product page, it says under requirements: "600 watt or greater power supply."

Maybe it's EVGA recommending this and not NVIDIA. Regardless, which one should I go by?
 

Saylick

Diamond Member
Sep 10, 2012
3,881
9,011
136
Well, I ran my planned build, which includes an EVGA GTX Titan, through the eXtreme PSU calculator, and it's suggesting that I need around 475 watts from my PSU. However, on the EVGA GTX Titan Newegg product page, it says under requirements: "600 watt or greater power supply."

Maybe it's EVGA recommending this and not NVIDIA. Regardless, which one should I go by?


I'd recommend taking a look at the amperage delivered by your power supply and making sure it has enough amps on the 12V rail to feed your entire rig.

Assuming a GTX Titan uses 260W max, that's roughly 22A required for the graphics card alone, and if you assume an i7 3770K will use 130W under load, that's another 11A. So far, that's 33A for just the CPU and GPU. Factor in your hard drives and whatnot and you're looking at upwards of 36A, hence why NVIDIA recommends a PSU that can deliver upwards of 38A (they don't know what CPU you are using, but 38A is definitely on the conservative side).
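If you want to sanity-check that arithmetic yourself, here's a rough Python sketch. The wattages are just the assumptions above, not measurements, and it simplifies by pretending every component draws from the 12V rail:

Code:
# Rough 12V rail check. Wattages are assumptions, not measurements,
# and this pretends everything draws from the 12V rail.
RAIL_VOLTAGE = 12.0

components_watts = {
    "GTX Titan (max)": 260,
    "i7 3770K (load)": 130,
    "drives, fans, misc": 40,
}

total_watts = sum(components_watts.values())
for name, watts in components_watts.items():
    print(f"{name}: {watts}W -> {watts / RAIL_VOLTAGE:.1f}A")
print(f"Total: {total_watts}W -> {total_watts / RAIL_VOLTAGE:.1f}A")  # I = P / V

That lands right around the 36A figure above.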
 

Red Squirrel

No Lifer
May 24, 2003
69,677
13,316
126
www.betteroff.ca
I never understood why GPUs aren't rated for only the device itself. Rather, they're rated for the whole computer, without actually knowing what the whole computer uses. It would be like a dryer saying that you need a 200 amp electrical panel when the machine doesn't actually use anywhere close to 200 amps.

When I bought my two video cards, the box said 400W, so I figured each card required 400W; naturally, I got a 1000W PSU, figuring that at absolute max load it would leave 200W for the rest of the system. Later on I learned that video card ratings assume the draw of the rest of the PC as well. So how much do they REALLY use, then? That's the value they should be putting on the box. It should be up to the consumer to figure out how much total wattage they need, since assuming a value based on hardware the manufacturer doesn't know about only makes things confusing.

The only real way to know is to oversize the PSU (but don't cheap out), then do a load test with a clamp-on meter to confirm you are not too close to, or exceeding, the PSU's rating. Keep in mind that what you pull from the wall is more than what you pull from the PSU, but it will give you a general idea.
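If you do measure at the wall, a first approximation for converting that to DC load looks like this (the 87% efficiency is purely an assumed figure for a decent unit, not something off a spec sheet):

Code:
# First approximation: DC load = wall draw * PSU efficiency.
# Efficiency varies with load and unit; 0.87 is only a guess here.
def dc_load_from_wall(wall_watts: float, efficiency: float = 0.87) -> float:
    """Estimate what the PSU actually delivers to the components."""
    return wall_watts * efficiency

wall_reading = 450.0  # example clamp-on meter reading, in watts
print(f"~{dc_load_from_wall(wall_reading):.0f}W DC from {wall_reading:.0f}W at the wall")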
 

smakme7757

Golden Member
Nov 20, 2010
1,487
1
81
Basically, they don't want to say, "You need exactly 476.8W to run a GTX Titan and <insert other hardware here>."

They don't know what system the GPU is going to be installed in. 600W is a pretty decent amount of power for most systems, so they take that as an average of what would be required for a system running a GTX Titan. I'm sure they also take into account that most people dumping that much money into a GPU will have a pretty chunky system to go along with it.

It's just common sense that you don't underestimate when you sell a product that's going to be used in an unknown environment by people who most likely don't know any better.

I never understood why GPUs aren't rated for only the device itself. Rather, they're rated for the whole computer, without actually knowing what the whole computer uses. It would be like a dryer saying that you need a 200 amp electrical panel when the machine doesn't actually use anywhere close to 200 amps.

When I bought my two video cards, the box said 400W, so I figured each card required 400W; naturally, I got a 1000W PSU, figuring that at absolute max load it would leave 200W for the rest of the system. Later on I learned that video card ratings assume the draw of the rest of the PC as well. So how much do they REALLY use, then? That's the value they should be putting on the box. It should be up to the consumer to figure out how much total wattage they need, since assuming a value based on hardware the manufacturer doesn't know about only makes things confusing.

The only real way to know is to oversize the PSU (but don't cheap out), then do a load test with a clamp-on meter to confirm you are not too close to, or exceeding, the PSU's rating. Keep in mind that what you pull from the wall is more than what you pull from the PSU, but it will give you a general idea.

I agree and disagree.

They should put BOTH numbers on the box!

:)
 

PCJake

Senior member
Apr 4, 2008
319
0
0
I agree and disagree.

They should put BOTH numbers on the box!

:)

Yeah, I can see this. There's some value in using their "guesstimate" - a less technical builder could pretty safely go by their recommendation without having to use a more in-depth calculator.

But they really should say what the GPU itself uses o_O
 

krnmastersgt

Platinum Member
Jan 10, 2008
2,873
0
0
I'm thinking it's more because the vast majority of people are pretty stupid, especially when it comes to electricity.

That, and to shield themselves from constant complaints of "I have a 470W PSU and it doesn't work!" from people who got a piece of crap that can't output anywhere near its rated numbers. And to shield themselves from the stupid lawsuits that come along with it, because boy do we love a good lawsuit against a corporation.

And maybe because people don't account for all the power draw in their system, efficiency of their PSU, aging, amperage, etc. :p
 

Saylick

Diamond Member
Sep 10, 2012
3,881
9,011
136
Yeah, I can see this. There's some value in using their "guesstimate" - a less technical builder could pretty safely go by their recommendation without having to use a more in-depth calculator.

But they really should say what the GPU itself uses o_O

That would be great if they did. I'd still check with a reputable review site like Anandtech to confirm though. ;)
 

lagokc

Senior member
Mar 27, 2013
808
1
41
I never understood why GPUs aren't rated for only the device itself. Rather, they're rated for the whole computer, without actually knowing what the whole computer uses. It would be like a dryer saying that you need a 200 amp electrical panel when the machine doesn't actually use anywhere close to 200 amps.

Any time you sell an appliance, you want to write the directions for the most stupid possible individual you can. It's the same reason cameras are marketed in megapixels instead of simply giving the x-by-y resolution: manufacturers actually expect that a significant number of potential buyers are too stupid to compare two pairs of numbers.

With power supplies, the worst-case scenario is a consumer with a Chinese "600 watt" power supply that's capable of 250W on a good day, who also has no clue how much power the rest of the components in his system need. So the best thing for manufacturers to do is recommend a completely excessive amount, knowing that even if the customer has a dishonest power supply and a heavy system load, they'll still probably be alright.

For those of us who build computers with high-quality 80+ rated power supplies, it's acceptable to use a smaller power supply than the one recommended by video cards.
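If you do trust the label, sizing the unit yourself is simple arithmetic. A quick sketch, where the component wattages and the 25% margin are purely illustrative:

Code:
# DIY PSU sizing with an honest (80+) unit.
# All wattages and the margin below are illustrative assumptions.
gpu_watts = 260    # e.g. GTX Titan board power
cpu_watts = 130    # e.g. quad core under heavy load
rest_watts = 50    # drives, fans, motherboard, RAM
margin = 1.25      # headroom for spikes and capacitor aging

recommended = (gpu_watts + cpu_watts + rest_watts) * margin
print(f"Look for a quality unit of at least ~{recommended:.0f}W")  # ~550W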
 

Essence_of_War

Platinum Member
Feb 21, 2013
2,650
4
81
Any time you sell an appliance, you want to write the directions for the most stupid possible individual you can. It's the same reason cameras are marketed in megapixels instead of simply giving the x-by-y resolution: manufacturers actually expect that a significant number of potential buyers are too stupid to compare two pairs of numbers.

With power supplies, the worst-case scenario is a consumer with a Chinese "600 watt" power supply that's capable of 250W on a good day, who also has no clue how much power the rest of the components in his system need. So the best thing for manufacturers to do is recommend a completely excessive amount, knowing that even if the customer has a dishonest power supply and a heavy system load, they'll still probably be alright.

For those of us who build computers with high-quality 80+ rated power supplies, it's acceptable to use a smaller power supply than the one recommended by video cards.

Yeah, I always figured this was the case. They have to assume that everyone who buys their product is dealing with the least reputable power supply drawing from the least reputable electrical utility.
 

biostud

Lifer
Feb 27, 2003
19,443
6,487
136
Because there are a lot of users with crap PSUs whose label says 600W but which can only deliver 450W.
 

Larnz

Senior member
Dec 15, 2010
247
1
76
Yeah, they way overestimate ratings, IMO.

I recently bought a Corsair AX860i, which has the Corsair Link software, so I can see exactly how much input/output I draw. You can see from my sig that I have an OC'd i7 and SLI 680s, and the highest draw I have seen so far was in the Digital Ira tech demo, which was only 430 watts... in most of my gaming (HoN/BF3) it sits around 300-350 watts.

Yet they suggest 750W+ for 680 SLI, so yeah, overestimated to be safe, I think. You can get some really cheap and nasty 800W PSUs that are more like a decent 500W unit, but if you have a high-quality PSU you won't need as much grunt, IMO.

Also, I can't recommend the Corsair AX series enough: it's a great unit and it doesn't even make a sound; the fan only spins once it reaches 50% draw, which it basically never does.
 

Automaticman

Member
Sep 3, 2009
176
0
71
I like having a PSU with plenty of headroom, so it never has to work hard driving its maximum output. Plus, most PSUs have the highest efficiency around the 50% max output mark anyway.
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
Plus, most PSUs have the highest efficiency around the 50% max output mark anyway.

The extra electricity cost of running a lower wattage unit over several years is nowhere near the price premium of a higher wattage unit, so that shouldn't really factor in. You buy the power that you need for long-term stability, at the lowest price.
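To put rough numbers on that (every figure here is an assumption; pick your own load, hours, and electricity rate):

Code:
# Back-of-envelope yearly electricity cost difference between two units
# at the same DC load. Efficiencies, hours, and rate are all assumptions.
dc_load_watts = 300.0
hours_per_year = 4 * 365      # four hours of gaming a day
price_per_kwh = 0.12          # example rate, USD

def yearly_cost(efficiency: float) -> float:
    wall_watts = dc_load_watts / efficiency
    return wall_watts * hours_per_year / 1000 * price_per_kwh

smaller_unit = yearly_cost(0.86)  # e.g. a 500W unit at ~60% load
bigger_unit = yearly_cost(0.88)   # e.g. a 700W unit near its sweet spot
print(f"Difference: ${smaller_unit - bigger_unit:.2f} per year")

A couple of percent of efficiency at a few hundred watts works out to a dollar or two a year, which is nothing next to the price gap between the units.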
 

lagokc

Senior member
Mar 27, 2013
808
1
41
The extra electricity cost of running a lower wattage unit over several years is nowhere near the price premium of a higher wattage unit, so that shouldn't really factor in. You buy the power that you need for long-term stability, at the lowest price.

Stone soup: you spend more on a quality higher wattage unit because it will be built with more solid components as a side effect of its higher rating, and will therefore be more reliable.

Also, the nice thing about 80+ rated power supplies isn't that they're more energy efficient; the best feature is that 80+ certification requires them to actually be capable of delivering the power they advertise.
 

Torn Mind

Lifer
Nov 25, 2012
12,004
2,748
136
Stone soup: you spend more on a quality higher wattage unit because it will be built with more solid components as a side effect of its higher rating, and will therefore be more reliable.

Also, the nice thing about 80+ rated power supplies isn't that they're more energy efficient; the best feature is that 80+ certification requires them to actually be capable of delivering the power they advertise.

This heuristic does not apply to every 80+ unit, as some units are still awful despite the higher wattage rating and certification, such as some units from Coolmax.
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
Stone soup: you spend more on a quality higher wattage unit because it will be built with more solid components as a side effect of its higher rating, and will therefore be more reliable.

No. Higher wattage doesn't mean it has more solid components. Why would it mean that? It just has more components and bigger capacitors etc. to deal with the higher load.

There's a point at which it makes no sense to buy a higher wattage unit for reliability reasons, and that point is not 50% load at normal system load. It's more like 70% load for good quality units. That is to say, for a system that uses 350W, a quality 500W unit is just as reliable as a quality 700W unit in the long term, all other things being equal.
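Applying that rule of thumb is straightforward (the 70% figure is the heuristic above, not any kind of spec):

Code:
# Heuristic from above: keep normal full load under ~70% of a quality
# unit's rating. The 0.70 figure is a rule of thumb, not a standard.
def min_quality_psu_watts(system_load_watts: float) -> float:
    return system_load_watts / 0.70

print(f"350W system -> at least ~{min_quality_psu_watts(350):.0f}W quality unit")  # ~500W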

Also, the nice thing about 80+ rated power supplies isn't that they're more energy efficient; the best feature is that 80+ certification requires them to actually be capable of delivering the power they advertise.

Yes, maybe at 25°C ambient, which is abnormally low for any closed-case system where the unit is stressed to its full capacity. And even then, some 80+ rated units can't deliver their rated wattage cleanly at low temperatures. Decent manufacturers rate their units at 40-50°C.
 

XiandreX

Golden Member
Jan 14, 2011
1,172
16
81
I'll chime in and concur about quality > quantity. I highly recommend the XFX 550W Pro (not sure of the exact model). I did a tremendous amount of research, and for the money it was amazing. Should meet your needs perfectly.

Just my $0.02.
 