
Recommend a PSU for new build

jjj807

Senior member
Jun 20, 2004
395
0
71
Hey all, I'm stuck on what size and type of PSU to get.

Here are the specs. This is a theoretical build, but I'd like to square away the PSU. There will be no OC'ing, all stock:

i7 2600k
GTX580 1GB
16GB DDR3
128GB SSD
1TB HDD
1 Blu Ray Drive
1 DVD Burner
Antec 300 case
Asus z68 deluxe

This is a theoretical build like I said. Let me make it clear: I might go SLI GTX 580 in the future, or SLI something else high-end. So I would like to have enough power to run any GPUs in SLI for the next year.

Preferably, I'd like to keep it under $200.

Thanks!
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
For one GTX580 and no OC, 600-650W is optimal. For SLI GTX580 and no OC, 850W would handle it, 1000W max.

GTX 580 is bad value for money, not only because of the premium you pay for the highest-performing GPU, but also because the release of the next generation of GPUs is pretty close. If you're gaming at 1080p, I recommend at most a GTX 570 / HD 6970. When it's not fast enough anymore, replace it with a new-generation single GPU. If you still want to keep open the possibility of 6xx-series SLI, an 850W PSU would likely do just fine. Assuming you're buying from the US:

midrange $50 620W: http://www.newegg.com/Product/Produc...82E16817371048 (no point going below 600W because this is so cheap)
midrange $70 650W: http://www.newegg.com/Product/Produc...82E16817139020
high-end $100 650W: http://www.newegg.com/Product/Produc...82E16817139012
high-end $110 650W: http://www.newegg.com/Product/Produc...82E16817151088

high-end $120 850W http://www.newegg.com/Product/Produc...82E16817371053 <-- excellent deal right here, oh yeah baby (can't recommend any other 850W at that price tag...)

midrange $130 950W http://www.newegg.com/Product/Produc...82E16817703028
midrange $140 950W http://www.newegg.com/Product/Produc...82E16817139013
high-end $210 1050W http://www.newegg.com/Product/Produc...82E16817139034
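Those recommendations line up with a rough headroom calculation. A quick sketch (the per-component draws are ballpark TDP-based assumptions, not measurements):

```python
# Ballpark peak draws in watts (TDP-based assumptions for illustration only).
base = {
    "i7-2600K (stock)": 95,           # Intel's rated TDP
    "board, RAM, drives, fans": 75,   # rough allowance for everything else
}

def size_psu(components, target_load=0.75):
    """Suggest a rating that keeps peak draw near target_load of capacity."""
    peak = sum(components.values())
    return peak, peak / target_load

single = dict(base, **{"GTX 580": 244})      # NVIDIA's rated TDP
sli = dict(single, **{"GTX 580 #2": 244})

for name, cfg in [("one GTX 580", single), ("GTX 580 SLI", sli)]:
    peak, rating = size_psu(cfg)
    print(f"{name}: ~{peak}W peak -> ~{rating:.0f}W PSU")
```

Leave a little slack on top of that and you land right around 600-650W for one card and ~850W for SLI.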

P.S. If you don't OC, why the 2600K? You don't need the unlocked multiplier, so you could just get the 2600 for $15 less. I assume you're planning to do encoding or something else that benefits from Hyper-Threading?
 
Last edited:

dpk33

Senior member
Mar 6, 2011
687
0
76
Just out of curiosity, why do you have (or plan to get) a 2600K and Z68 if you aren't overclocking?

And I personally wouldn't go with anything under 750 watts on a quality power supply like a Seasonic.
 

jjj807

Senior member
Jun 20, 2004
395
0
71
I might begin my overclocking adventure with the 2600K; I want to leave the option open to OC.
 

TemjinGold

Diamond Member
Dec 16, 2006
3,050
65
91
Your money would be way better spent on a 2500 (no K) if you aren't OCing. It's WAY cheaper, and you won't notice the performance difference for gaming.

To your question though, my rig draws less than 300 watts from the wall at load. Yours will be less power-hungry if you don't OC. Don't listen to the guy saying at least 750 watts, because that's super overkill. You can SLI the 580 at 750 watts.
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
It's WAY cheaper
It's $10 cheaper...

my rig draws less than 300 watts from the wall at load.
Doing what? My rig draws 330W in 3DMark Vantage even though my GPU uses over 50W less than yours.

You can SLI the 580 at 750 watts.
Not the wisest choice. Dual GTX 580s use 500W at full load by themselves. That's 67% of rated wattage. Add an i5-2500 near max load and you're already near 80% of rated wattage. Then add a motherboard, memory, storage, fans...
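For reference, those percentages are just simple division (the 95W CPU figure is the i5-2500's TDP, an assumption; the 500W GPU figure is from the post above):

```python
RATED = 750   # the PSU rating under discussion
gpus = 500    # dual GTX 580 at full load
cpu = 95      # i5-2500 near max load, per Intel's TDP

print(f"GPUs alone: {gpus / RATED:.0%} of rated wattage")
print(f"GPUs + CPU: {(gpus + cpu) / RATED:.0%} of rated wattage")
```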
 
Last edited:

TemjinGold

Diamond Member
Dec 16, 2006
3,050
65
91
It's $10 cheaper...

Doing what? My rig draws 330W in 3DMark Vantage even though my GPU uses over 50W less than yours.

Not the wisest choice. Dual GTX 580 use 500W at full load by themselves. That's 67% of rated wattage. Add i5-2500 near max load and you're already near 80% rated wattage. Then add a motherboard, memory, storage, fans...

You do realize your rig has a ton more stuff than mine or the OP's, right? Regardless, my 300 and your 330 are "from the wall," which is not the same as what the components actually use. The wattage on a PSU is for what the components actually use. The OP is not well-served with a 750-watter either way.

Dual 580s can't use 500 watts "by themselves": a 580 with two 8-pins can physically only draw 225 watts. That doesn't mean they always do; that's their upper limit. 500 watts for a full system under normal load sounds about right though.
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
You do realize your rig has a ton more stuff than mine or the OP's, right?
Erm... No. I have a sound card, another HDD and a couple more low-RPM fans. That's maybe what, 20-25 watts. I don't think I had the fan controller when I made the wattage measurements.

Regardless, my 300 and your 330 is "from the wall,"
My 330 is not from the wall, it's adjusted for PSU efficiency (0.9). Wall reading was ~365W.

The OP is not well-served with a 750-watter either way.
I completely agree

A 580 with 2 8-pins can physically only draw 225 watts.
Some of the power is drawn directly from the PCI-E slot. Officially, 580 has a TDP of 244W. Guru3D calculated ~522W at load for GTX 580 SLI (the cards only, not the system).
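For anyone following the arithmetic here: a wall meter reads AC input, so you multiply by efficiency to get the DC load the PSU actually delivers. A minimal sketch, assuming the 90% figure used above:

```python
def dc_load(wall_watts, efficiency=0.90):
    """DC power the PSU delivers to components, given an AC wall-meter reading.

    A ~90% efficient unit wastes the remainder as heat, so the components
    draw less than the wall meter shows.
    """
    return wall_watts * efficiency

print(f"{dc_load(365):.0f}W")   # ~365W wall reading -> ~330W of DC load
```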
 

TemjinGold

Diamond Member
Dec 16, 2006
3,050
65
91
You do realize that the PCI-E 2.0 slot provides a max of 75 watts while each 8-pin also provides 75 watts apiece, right? That's 225 watts. If your system is pulling 330 adjusted for efficiency, then you are doing something far more intensive than me. My reading is done at full Crysis load, which I have seen used as a bit of a power standard in reviews.
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
If your system is pulling 330 adjusted for efficiency, then you are doing something far more intensive than me
Nope, just running 3Dmark vantage.

You do realize that the PCI-E 2.0 slot provides a max of 75 watts while each 8-pin also provides 75 watts a piece, right? That's 225 watts.
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/25.html completely invalidates your claim that only 225W can be supplied to the GPU.
GPU-only power consumption: 225W in 3DMark03, 304W in Furmark with power limiter off.

Also, how come a GTX 590 consumes 340W at load (card only) according to Guru3D? That's 115W more than "theoretically possible" with two 8-pin connectors and a pci-e slot. Surely it's some mistake.

So you claim that load power consumption in your system doesn't exceed 300W. Let's see what other systems with GTX580 get in similar circumstances:
http://www.bit-tech.net/hardware/graphics/2010/11/09/nvidia-geforce-gtx-580-review/8 363W in 3DMark06. CPU = i7-965 @ 3.2.
http://techreport.com/articles.x/19934/5 412W in L4D2, CPU = i7-965 @ 3.2
http://www.tomshardware.com/reviews/geforce-gtx-580-gf110-geforce-gtx-480,2781-15.html ~400W in Metro2033. CPU = 980X @ 3.73
http://www.hexus.net/content/item.php?item=27307&page=16 352W in CoD MW2. CPU = 980X @ 3.33.

Do you also claim to invalidate Guru3D's SLI results?

Maybe you should do your readings again. Or bin your watt meter. My readings are perfectly in accordance with what to expect from a system like this. ~190W GPU, ~100W CPU (not stressed to max in vantage, obviously), ~40W for the rest.
 
Last edited:

yours truly

Golden Member
Aug 19, 2006
1,026
1
81
Hello, I was thinking about adding a second GTX 580 Lightning to my rig, but I'm not 100% sure my Corsair AX850 would be able to handle it.

Some say it's more than enough, some say not enough.

The last thing I want is to become reckless and stress the PSU so much that it'll fry my entire system.

Would I be OK?

Thanks
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
It's not more than enough, but it's not not enough either. It'll handle it, but it'll be stressed. You shouldn't really worry about frying your system: it's a Seasonic-built PSU that can deliver 1000W at 45 degrees C ambient (yes, despite its 850W rating), not counting capacitor aging. Even if you had three GTX 580s to reach such a load, it's very unlikely anything would happen; it'd just shut the PC down.

Estimating your absolute max. system power consumption in more detail:

580 SLI ~ 520W (Guru3D review estimate. Overclocked units, so 550W.)
2500K ~ 130W (95W TDP + 33% OC.)
Everything else ~ 75W or so (this is the trickiest part to estimate but also the part where errors matter the least)

Total ~ 755W = 89% of rated wattage. Your system will practically never reach this high a power consumption; it's just a theoretical max. In a real-world scenario, e.g. Crysis or 3DMark, you'll only use part of your CPU power, and your GPUs won't be working at an absolute 100% either. Something like 650W continuous power is probable; that's just 76% of rated wattage, which is alright.
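In sketch form, using the same rough figures (estimates, not measurements):

```python
# Worst-case component draws quoted above, in watts (rough estimates).
worst_case = {
    "GTX 580 SLI, overclocked": 550,
    "2500K with a 33% OC": 130,
    "everything else": 75,
}
total = sum(worst_case.values())
print(f"~{total}W absolute max = {total / 850:.0%} of the AX850's rating")
```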

P.S. what do you need two GTX 580 for on a single monitor >_>
 
Last edited:

yours truly

Golden Member
Aug 19, 2006
1,026
1
81
Hello lehtv, many thanks for your response. I was reading a few comments elsewhere from people using those wall-meter things, and some are pulling 950W during benchmarks.

I was thinking, if I was playing BF3 at ultra settings for say 8 hours a day (slight exaggeration! Maybe on weekends), then I could be putting a huge strain on the PSU.

I'm wondering if I should just call it a day and stick with 1 580.

P.S. what do you need two GTX 580 for on a single monitor >_>

I was mulling over whether I should get a U3011 before Christmas. Gaming on that would be great!

If it were you, would you risk two 580s with an AX850? I know the Lightnings draw a little less power than regular 580s.

edit: my PSU has +12V ~ 70A. Would that make any difference?
 
Last edited:

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
people using those wall meter things and some are pulling 950w during benchmarks.
Could you link? I'd be interested to know what their specs were. In particular their PSU efficiency and CPU/GPU overclocks. And what benchmarks.

I'm wondering if I should just call it a day and stick with 1 580.
Absolutely. One 580 will run BF3 fine, maybe not 60fps smooth all maxed out, but fine. A second 580 is bad value for money not only because of the price premium you pay for the highest performing GPU, but also because of how close 28nm GPUs are. Upgrade early next year when we have HD 7000 and GTX 600. I would wager (just speculating here but not without reason) that given the 28nm production technique, the fastest GPUs will use less power than GTX580, so your 850W should be spot on for, say, GTX 670 SLI. Upgrade your monitor at the same time (or if we get lucky AMD 7000 will be out before xmas).

If it were you, would you risk 2 580's with a AX850? I know the Lightnings draw a little less power than regular 580's.
They do? In that case, I wouldn't call it a risk at all. I don't see why it couldn't handle it.

my PSU has: +12V~70A Would that make any difference?

70A in watts is 70A * 12V = 840W.
 
Last edited:

yours truly

Golden Member
Aug 19, 2006
1,026
1
81
Could you link? I'd be interested to know what their specs were. In particular their PSU efficiency and CPU/GPU overclocks. And what benchmarks.

I beg your pardon, it was 897W. I don't know where I got 950W from: http://www.evga.com/forums/tm.aspx?m=1078520&mpage=2 comment #51. He's running an i7 950, not sure if that makes a huge difference, but I'm sure I read somewhere about a Furmark benchmark that got near 950W. I can't find it though.


Absolutely. One 580 will run BF3 fine, maybe not 60fps smooth all maxed out, but fine. A second 580 is bad value for money not only because of the price premium you pay for the highest performing GPU, but also because of how close 28nm GPUs are. Upgrade early next year when we have HD 7000 and GTX 600. I would wager (just speculating here but not without reason) that given the 28nm production technique, the fastest GPUs will use less power than GTX580, so your 850W should be spot on for, say, GTX 670 SLI. Upgrade your monitor at the same time (or if we get lucky AMD 7000 will be out before xmas).

Really appreciate your advice, and it makes sense. Why pay a premium for technology almost a year old? Thanks, you made my decision easier. :)
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
I beg your pardon it was 897w. Don't know where I got 950w, http://www.evga.com/forums/tm.aspx?m=1078520&mpage=2 Comment #51 He's running an i7 950, not sure if that makes a huge difference. but I'm sure I read somewhere a benchmark with furmark, got near 950w. I can't find it though.

Yep, that's Furmark. GPUs running it use more power than any game would. He got 897W from the wall, so adjusted for 90% efficiency (optimistic) that's 807W. Still under the 850W spec, so using it for a while just to benchmark won't have any lasting effect on a quality PSU.

Any game on his setup would eat closer to 700W. The i7 950 @ 4.2GHz consumes at least 30W more than a 2500K, since the TDP of the 950 is 30W higher to begin with. And being a bit weaker than the 2500K, it will be stressed more to drive GTX 580 SLI, further increasing its power consumption in Furmark compared to a 2500K setup. Finally, he's got a water cooling system, which uses some power. Being water cooled, his GTX 580s are probably overclocked too.

Really appreciate your advice and it makes sense. Why pay a premium for technology almost a year old? Thanks you made my decision easier. :)
No problem :)
 
Last edited:

TemjinGold

Diamond Member
Dec 16, 2006
3,050
65
91
Nope, just running 3Dmark vantage.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/25.html completely invalidates your claim that only 225W can be supplied to the GPU.
GPU-only power consumption: 225W in 3DMark03, 304W in Furmark with power limiter off.

Also, how come a GTX 590 consumes 340W at load (card only) according to Guru3D? That's 115W more than "theoretically possible" with two 8-pin connectors and a pci-e slot. Surely it's some mistake.

So you claim that load power consumption in your system doesn't exceed 300W. Let's see what other systems with GTX580 get in similar circumstances:
http://www.bit-tech.net/hardware/graphics/2010/11/09/nvidia-geforce-gtx-580-review/8 363W in 3DMark06. CPU = i7-965 @ 3.2.
http://techreport.com/articles.x/19934/5 412W in L4D2, CPU = i7-965 @ 3.2
http://www.tomshardware.com/reviews/geforce-gtx-580-gf110-geforce-gtx-480,2781-15.html ~400W in Metro2033. CPU = 980X @ 3.73
http://www.hexus.net/content/item.php?item=27307&page=16 352W in CoD MW2. CPU = 980X @ 3.33.

Do you also claim to invalidate Guru3D's SLI results?

Maybe you should do your readings again. Or bin your watt meter. My readings are perfectly in accordance with what to expect from a system like this. ~190W GPU, ~100W CPU (not stressed to max in vantage, obviously), ~40W for the rest.

I stand corrected in light of this evidence and will own up to that. I don't necessarily believe my readings are wrong, though, because I imagine my 2500K is a lot less power-hungry than those listed systems (and your CPU), and my OC is fairly mild with no voltage adjustment.
 

alkemyst

No Lifer
Feb 13, 2001
83,769
19
81
I have had good luck with Corsair PSUs. Also, OP, you say no OC in the beginning, but that you are going to try it later.

You want the PSU rated for your max settings; the main failure in OCing is building the rig to just stock specs and then going beyond them.
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
I stand corrected in light of this evidence and will own up to that. I don't necessarily believe my readings are wrong though because I imagine my 2500k is a lot less power hungry than those listed systems (and your cpu) and my OC is fairly mild with no voltage adjustment.

[Attached image: ua1ZO.jpg]
 
Last edited: