[GamersNexus] How much wattage do your PC and GPU actually need?

Actaeon

Diamond Member
Dec 28, 2000
8,657
20
76
Great article, thanks for sharing. I do think people go way overboard when stating power requirements, and I hope this clears it up.

Would have loved to have seen OC'd 390Xs in CF as a worst-case scenario, but with a single card well short of 500W, I can't imagine a CF setup being over 800W.
 

Pinstripe

Member
Jun 17, 2014
197
12
81
I did measure my total system power consumption (minus the monitor, obviously) the other day with a wattmeter and was surprised by its relatively moderate power draw.

My system:

Core i7 3770 "Ivy Bridge" w/o OC
16GB RAM
MSI R9 290 4G w/o OC
Bunch of HDDs and other crap

Idle desktop usage was around 65-70 watts.

Unigine Valley (Ultra settings, 1080p, 8x MSAA, 60 fps cap via RTSS)

~320-330 watts @ 100% GPU usage @ stock clocks (977MHz)
~250 watts @ 850MHz downclock; it still just barely manages the same framerate.

Fallout 4 and Unreal Tournament (UE4) test with 60 fps cap and High-to-Ultra settings:
~250 watts @ stock clocks (977MHz)
~220 watts @ 850MHz downclock

All in butter-smooth 60 fps. I'm happy with that.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Would have loved to have seen OC'd 390Xs in CF as a worst-case scenario, but with a single card well short of 500W, I can't imagine a CF setup being over 800W.

Ya, you are going to need heavily overclocked + overvolted GPUs plus an overvolted and overclocked 6-10 core CPU. A lot of PC gamers make the mistake of taking maximum TDP or power consumption figures and manually adding them up. It usually goes something like this: my CPU uses 150W, and each of my cards uses 250-300W, so at peak my system will use 150W + 300W x 2? No, because there is no 'mainstream' real-world application that uses 100% of the CPU + 100% of each GPU simultaneously. The only programs that can create such a scenario are distributed computing projects. The guys who run distributed computing on their CPUs + GPUs (say Folding@Home, SETI@Home, PrimeGrid, MilkyWay@Home, etc.) have already done the research and made the commitment to the cause, so they know well in advance the extreme power supply requirements necessary for system stability under that kind of stress.
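
To make that arithmetic concrete, here's a minimal sketch (Python; the utilization fractions are purely illustrative assumptions of mine, not measurements from any review) contrasting the naive "add up the TDPs" estimate with a GPU-bound gaming load:

```python
# Naive "add up the TDPs" estimate vs. a rough gaming load.
# The utilization fractions below are illustrative assumptions only.
cpu_tdp_w = 150
gpu_tdp_w = 300
num_gpus = 2

naive_peak = cpu_tdp_w + gpu_tdp_w * num_gpus  # 150 + 300 x 2 = 750W

# In a GPU-bound game the CPU is nowhere near 100% loaded:
typical_gaming = 0.5 * cpu_tdp_w + 0.9 * gpu_tdp_w * num_gpus

print(f"Naive sum: {naive_peak}W")                    # 750W
print(f"Typical gaming load: {typical_gaming:.0f}W")  # 615W
```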

Hardware Canucks tested the R9 295X2 with this system:

Processor: Intel i7 4930K @ 4.7GHz
Memory: G.Skill Trident 16GB @ 2133MHz 10-10-12-29-1T
Motherboard: ASUS P9X79-E WS

[Image: R9295X2-2-78.jpg]


I suppose if someone is doing video rendering/encoding on a 6-10 core CPU while playing a game, then it's possible to get 99-100% CPU usage, but you aren't going to get 100% x2-3 SLI/Tri-SLI scaling in games either way. Also, there is a complex relationship between CPU & GPU bottlenecking and driver efficiency that itself impacts power usage of the CPU vs. the GPU in the build. PC users just need to be aware of what exactly they are doing with their PC.

[Image: power-3.png]



:thumbsup:

Something else I want to bring up -- how, IMO, the entire PC reviewing community is not measuring performance/watt from an end user's/gamer's point of view. Instead, they use what I call the "Engineer's performance/watt perspective." What does that mean? An engineer designs a particular component with certain perf/watt characteristics and goals. But an end user/gamer does not use just one component (tool, for simplicity's sake) to achieve the end result. In practice, that means when we use our PC for some application, whether it's rendering, video games, or watching movies, perf/watt to us, the end users, is:

End User's (Gamer's) Performance per Watt = Productivity, Work Result, or Measured Performance / Total (Full) System Power Usage

This is a very important concept. Here is why.

[Image: watt-draw-bench-gpus.png]


Any individual component may have superior perf/watt to another, but that alone doesn't tell the full picture to the end user, since we cannot run games on just the CPU, or just the GPU, or just the motherboard, or just the HDD/SSD. We need all of these components together in order to generate a visual frame.

i7 4790K paired with:
950 = 65% / 199W = 0.326
960 = 75% / 216W = 0.347
280 (~ assuming 270X performance) = 75% / 231W = 0.325
380 = 88% / 260W = 0.338
380X = 96% / 300.8W = 0.319
vs.
390 (~ assuming 290X performance) = 133% / 341W = 0.390
http://www.techpowerup.com/reviews/ASUS/R9_380X_Strix/23.html (using 1080p as just an example to illustrate the point)
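
As a sanity check on those ratios, here is a minimal sketch (Python) that recomputes the End User's Perf/Watt figures from the TechPowerUp-derived numbers above; the only assumptions are the 270X/290X performance stand-ins already noted:

```python
# End User's Perf/Watt = measured performance / total system power.
# Pairs of (relative 1080p performance %, total system watts) from above.
rigs = {
    "950":  (65, 199),
    "960":  (75, 216),
    "280":  (75, 231),    # assuming ~270X performance
    "380":  (88, 260),
    "380X": (96, 300.8),
    "390":  (133, 341),   # assuming ~290X performance
}

for card, (perf, watts) in rigs.items():
    print(f"i7 4790K + {card}: {perf / watts:.3f}")  # e.g. 390 -> 0.390
```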

The whole point about NV "blowing AMD out of the water in perf/watt" is often Engineer's Marketing, because in practice you cannot just take a stand-alone graphics card, put it on your desk, and launch a game on it, can you? The difference in perf/watt between Core i7 4790K rigs paired with a 950/960/280/380/380X is practically non-existent. Surprised?

As you can see, the perf/watt the end user or gamer experiences when using the system is NOT the perf/watt of any individual component. So why do 100% of professional reviewers keep feeding us the Engineer's perf/watt metrics? It's mostly irrelevant for choosing a proper PSU or figuring out your total system efficiency from an end user's point of view -- i.e., assessing how much performance we are getting in an app for a given amount of total system power.

What reviewers need to focus on instead is total system power usage (which helps me calculate my exact cost of running an application) and the End User's Perf/Watt metric, to assess the efficiency of the entire system in a particular application, because a PC is a collection of parts, not any one part, that must work together to produce some measured end result. They need to do this for whatever application they are testing. If they are trying to assess the most efficient rig for rendering or encoding, they need to test the entire system's power usage, not just the CPU's. If they are trying to assess the most efficient gaming rig, they need to test the entire system's power usage and calculate End User's Perf/Watt ratings accordingly.

It's time we stand up, call them out on this marketing BS, and start asking them to include the End User's perf/watt in benchmarks. If they choose to keep the Engineer's perf/watt, then fine, include it for the NV/AMD/Intel engineers and marketers among the readers, but give the rest of us useful results that matter!

Most importantly, using my suggested approach, a gamer would no longer have to guess and estimate how much total system power his i3/i5/i7 or AMD rig will use with any modern GPU, since the data will be there. Think about it: so many gamers are freaking out over whether their i3/FX-8300 would work with a 400W PSU when adding an R9 280X/290/980 Ti. Well, with total system power usage, you never have to guess, and with an End User's Perf/Watt benchmark, you always know the most efficient gaming rig.

If you just look at video card perf/watt, it does not conclusively tell you what the most efficient gaming rig is; and if you just look at stand-alone video card power usage, it also doesn't tell you what the actual total system power usage is, which makes it much harder to estimate the bare minimum PSU required. ;)

For years I've been trying to explain this, and not one review site is doing it, as far as I am aware.
 

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
Ya, you are going to need heavily overclocked + overvolted GPUs plus an overvolted and overclocked 6-10 core CPU. [...] For years I've been trying to explain this, and not one review site is doing it, as far as I am aware.

Very high quality post. Thanks for putting in the time and effort. :thumbsup:
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
RS, your assessment isn't fully correct. Just showing the full system wattage won't inform the buyer how much wattage they need from the 12V rail, which is a huge thing if you are buying a top-of-the-line card. There are many crap PSUs which have 1000W but can't consistently supply 600W from the 12V rail, so yeah, we need to see the video card wattage up there.
 

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
There are many crap PSUs which have 1000W but can't consistently supply 600W from the 12V rail, so yeah, we need to see the video card wattage up there.


Right, but isn't this a function of older generations of PSUs? IIRC, even Bronze PSUs that are being sold today as mainstream are fairly good on this.

And this obscures the fact that you can get high-quality PSUs like the EVGA G2 series for much less today than even 5 years ago, so a lot more mainstream buyers are getting awesome PSUs now than before. It's no longer unreachable for the mass market.

But maybe I'm wrong? I'd love for you to expand on this.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Right, but isn't this a function of older generations of PSUs? IIRC, even Bronze PSUs that are being sold today as mainstream are fairly good on this.

And this obscures the fact that you can get high-quality PSUs like the EVGA G2 series for much less today than even 5 years ago, so a lot more mainstream buyers are getting awesome PSUs now than before. It's no longer unreachable for the mass market.

But maybe I'm wrong? I'd love for you to expand on this.

TBH, my experience with this is completely different. Many peeps will spend a fortune on other components but get all cheap when buying a PSU (it's just a damn power supply, right?), and believe me, they rarely look up a PSU review before buying. The data is also very important if you are planning to go the SLI or CrossFire route.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
Ya, you are going to need heavily overclocked + overvolted GPUs plus an overvolted and overclocked 6-10 core CPU. [...] For years I've been trying to explain this, and not one review site is doing it, as far as I am aware.

The problem with this concept is that you need to test lots and lots of games to get a conclusive result.

Say a 980 Ti uses less than a 390, but because the framerate on the 980 Ti is higher it stresses the CPU more, so the total power consumption is higher.

But another game may stress the CPU more, so the system with the 390 will consume more and the 980 Ti will run understressed because of a CPU bottleneck.

It's safer to take the measured GPU power consumption and add an estimated CPU consumption, in case there's a game that manages to stress both CPU and GPU. I've set my CPU to a max of 100W in the BIOS, so that's an easy value for max CPU load.
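
For illustration, a minimal sketch of that sizing approach (Python; the GPU figure, rest-of-system allowance, and headroom factor are assumptions of mine, not numbers from this thread):

```python
# PSU sizing: measured GPU draw + assumed CPU maximum + an allowance for
# drives/fans/board, times a safety margin. All inputs are illustrative.
def recommended_psu_watts(gpu_w, cpu_w, rest_w=50.0, headroom=1.3):
    """Worst-case DC load times a safety margin."""
    return (gpu_w + cpu_w + rest_w) * headroom

# e.g. a ~280W GPU with the CPU capped at 100W in the BIOS:
print(f"~{recommended_psu_watts(280, 100):.0f}W")  # ~559W
```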
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
I confess that I am that guy who only buys the cheapest power supply possible to satisfy the minimum requirements. I don't believe I have ever spent over $60 on a PSU. In over 20 years, not one failed PSU, fried component, or fire.

I am not an LN2 or quad-SLI/CF type person either, so my use cases aren't that extreme. My most extreme system was probably my OC'd Q6600 with GTX 470 SLI. Since then, I just do simple air OC and a single OC'd GPU.
 

MrTeal

Diamond Member
Dec 7, 2003
3,638
1,860
136
I cheaped out on my last PSU, a Lepa G1600, and it failed after 2.5 years of use. Got myself an EVGA SuperNOVA 1600 T2 now and it's rock solid.

http://forums.anandtech.com/showthread.php?t=2455424

The Lepa G1600 isn't a cheap PSU at all. It's pretty much the exact same unit as the Enermax MaxRevo 1500, and was one of the best 2HP supplies available a few years ago. There are better Platinum and Titanium units available now, but they're still pretty rock-solid units IMO. Anecdotal, I know, but I've had a couple running on 240V, outputting ~1500W 24/7 since 2012, and they're still working.
 

MrTeal

Diamond Member
Dec 7, 2003
3,638
1,860
136
Great article, thanks for sharing. I do think people go way overboard when stating power requirements, and I hope this clears it up.

Would have loved to have seen OC'd 390Xs in CF as a worst-case scenario, but with a single card well short of 500W, I can't imagine a CF setup being over 800W.

Running Prime95 and FurMark, my system will pull 1000W at the plug, so it's definitely possible. Gaming loads are obviously much lower.
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
The Lepa G1600 isn't a cheap PSU at all. It's pretty much the exact same unit as the Enermax MaxRevo 1500, and was one of the best 2HP supplies available a few years ago. There are better Platinum and Titanium units available now, but they're still pretty rock-solid units IMO. Anecdotal, I know, but I've had a couple running on 240V, outputting ~1500W 24/7 since 2012, and they're still working.

It has a 3/5 rating on Newegg, and with the experience I've had over the past 3 years I would give it a 2. It is my second RMA. It's been shutting down for a while now, even under light load. It is a whining unit.

Didn't like it much except for its small form factor.

The EVGA SuperNOVA 1600 is freaking awesome and solid.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
I confess that I am that guy who only buys the cheapest power supply possible to satisfy the minimum requirements. I don't believe I have ever spent over $60 on a PSU. In over 20 years, not one failed PSU, fried component, or fire.

I am not an LN2 or quad-SLI/CF type person either, so my use cases aren't that extreme. My most extreme system was probably my OC'd Q6600 with GTX 470 SLI. Since then, I just do simple air OC and a single OC'd GPU.

It's often not the PSU that outright fails - run a PSU, especially a cheap one, close to its limit and you get a lot more voltage fluctuation, increasing over time as the PSU wears out. That stresses all the voltage regulators on your GPU, motherboard, etc. They in turn become less reliable, and the rest of your system becomes less stable.

Hence I would always recommend buying a decent-quality PSU with a safe margin so you aren't running it at its limit. That gives you the best chance of a long, stable life for your PC.
 

MrTeal

Diamond Member
Dec 7, 2003
3,638
1,860
136
It has a 3/5 rating on Newegg, and with the experience I've had over the past 3 years I would give it a 2. It is my second RMA. It's been shutting down for a while now, even under light load. It is a whining unit.

Didn't like it much except for its small form factor.

The EVGA SuperNOVA 1600 is freaking awesome and solid.

To be fair, even the AX1500i has 4/5 eggs, and that's pretty much considered the best 1500W+ supply available. I won't disagree with you that the new Super Flower units are awesome; I'm just disagreeing that the G1600 is a cheap PSU. It really isn't, and it was a better unit (IMO) than many of the other cheaper big PSUs at the time, like the Rosewill Lightning 1300 and Thermaltake Toughpower. Sucks that you've had so much trouble with it, though.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I did measure my total system power consumption (minus the monitor, obviously) the other day with a wattmeter and was surprised by its relatively moderate power draw. [...] ~320-330 watts @ 100% GPU usage @ stock clocks (977MHz) [...] All in butter-smooth 60 fps. I'm happy with that.

So I'm still lost then. Why do people recommend at LEAST 800+W for CrossFire with the 290?

Shouldn't I be fine with a ~700W PSU? I want to try, I just don't want to end up getting unlucky and having my whole system die.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
My current gaming system (specs in sig) draws just over 380W peak with the CPU and GPU pegged (though the GPU was at stock clocks when I checked).

My previous system, which had a 4GHz AMD 965 BE and the same video card, would hit, I think, 420W(ish).

Like others have said, as long as you don't have a POS PSU, you don't need some giant peak output.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
I've got a 3770K at 4.6GHz @ 1.35V and two 980 Tis that boost to 1455MHz at around 1.18V, and I've never seen it break 620W (660W at the wall).
 

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
So I'm still lost then. Why do people recommend at LEAST 800+W for CrossFire with the 290?

Shouldn't I be fine with a ~700W PSU? I want to try, I just don't want to end up getting unlucky and having my whole system die.

Because they are usually overzealous. I remember when I worked at TigerDirect as an A+ tech, this lady came in and wanted to buy a simple graphics card for running multiple monitors. Her system consisted of a 45nm Core 2 Duo, an mATX board, one HDD, one DVD drive, and a 200W PSU.

I sold her on the GeForce GT 210, and as I was putting the card in, one of my obnoxious co-workers came along and started asking me if I had checked what PSU this card needs, and blah blah blah. I told my co-worker it's just a 210; it'll work fine with a 200W PSU. My co-worker then proceeded to look at the box and tell me that it needs a 350W PSU because the box says so... I've never facepalmed so hard in real life.

So part of the problem is graphics card boxes overstating requirements to cover their a** and people not knowing how much these parts really draw.
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
To be fair, even the AX1500i has 4/5 eggs, and that's pretty much considered the best 1500W+ supply available. I won't disagree with you that the new Super Flower units are awesome; I'm just disagreeing that the G1600 is a cheap PSU. It really isn't, and it was a better unit (IMO) than many of the other cheaper big PSUs at the time, like the Rosewill Lightning 1300 and Thermaltake Toughpower. Sucks that you've had so much trouble with it, though.

Yeah, it sucks because I used to love it, but in my case, a multi-rail PSU is not for me. Single rail is much better for overclocking each card. For heavy benching sessions, I now have the 1600 T2 and a G2 1000W... No need to balance the rails anymore. :thumbsup:
 

amenx

Diamond Member
Dec 17, 2004
4,124
2,405
136
It's been known for ages that we can get by with far less than we think we need. Still, I usually go for twice or more my power requirements, for the simple reason that the PSU will remain cool and unstressed even under sustained peak power conditions. So, for increased longevity, basically.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Honestly, I fret way more about the PSUs in my media servers than the one in my gaming machine. My media servers have nice Gold Seasonic PSUs, while my best gaming rig has a refurbished Bronze Corsair PSU. My reasoning is that my media servers run 24/7, and they have to be able to take the hit of up to 16 drives on a single rail when I do a full spin-up. Meanwhile, my gaming rig gets turned off when not in use.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
I skimmed the article, but I wanted to see a chart where they take the same system and just swap out the PSU to see how that affects power consumption at the wall. Like comparing a no-name unit to Bronze, Silver, Gold, and Platinum efficiencies to see how much you save.

As typical overall system power consumption goes down, I think it becomes less important to have a super-efficient PSU, or at least it doesn't make sense to spend huge amounts of money on one. Even if your PSU is inefficient, when you are talking about overall consumption of less than 100 watts most of the time, wouldn't it take years and years to pay off the difference in price between PSUs?
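
For what it's worth, here's a minimal sketch of that payback math (Python; the load, efficiencies, hours, and electricity price are all illustrative assumptions):

```python
# Annual electricity cost of an average DC load through a PSU of a given
# efficiency. All inputs are illustrative assumptions.
def annual_cost(load_w, efficiency, hours_per_day, price_per_kwh=0.12):
    wall_w = load_w / efficiency  # power actually drawn at the wall
    kwh_per_year = wall_w * hours_per_day * 365 / 1000
    return kwh_per_year * price_per_kwh

cheap = annual_cost(100, 0.80, 4)  # no-name unit, ~80% efficient
gold = annual_cost(100, 0.90, 4)   # Gold unit, ~90% efficient
print(f"Savings: ${cheap - gold:.2f}/year")  # ~$2.43/year at 4h/day
```

At a price difference of $30 or more between units, that does indeed take over a decade to pay back at light usage.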
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
My GTX 980 (1480MHz core, 7900MHz VRAM), 4GHz 4770K, HDD, SSD, 16GB RAM, and 3 case fans draw a max of ~330 watts at the wall in a benchmark loop of Metro: Last Light with vsync off. I'm running a SilverStone 450 watt Gold PSU.