Future-proofing your power supply?

SymphonyX7

Member
Oct 1, 2009
35
0
0
I'm in a dilemma right now involving future-proofing and my power supply. Long story short, I want to upgrade my rig's graphics card, and I want a power supply that will last me through my next upgrade, not just this one. But I fear that the power supply I get will simply be overkill and may consume additional electricity (is this even the case?). I don't want to get an in-between unit just to sell it again when I upgrade in 12+ months' time.

I've read this article "Everything you need to know about power supplies" by Hardware Secrets (http://www.hardwaresecrets.com/article/Everything-You-Need-to-Know-About-Power-Supplies/181/1), but it doesn't answer my questions.

First question. Will getting a power supply with a much higher rating than my rig's actual draw consume more electricity, compared to say a power supply with just enough overhead? i.e. 500W vs 700W PSU for a rig that eats 400W. I'm assuming the 700W PSU will consume more power.

Second question. Is future-proofing your power supply even advised? I'm under the assumption that a PSU's efficiency deteriorates over time, especially under sustained heavy load for long periods. There's also the fact that PC graphics cards seem to be increasing in power consumption with each generation, despite the current emphasis on mobile graphics and efficiency. The release of the PS4 and Xbox One may exacerbate this too, since developers now have a much more powerful target platform (spoiling them for the short term), and PC ports haven't exactly been optimized to the same degree as the console builds. We're still early in this console generation's lifecycle, so if the console builds aren't that optimized, the PC ports will be even worse. That would necessitate even faster, more power-hungry graphics cards from AMD and Nvidia.

Is future-proofing your power supply a bad idea, or am I just overanalyzing it? I see a lot of people overshooting their power requirements, either to leave room just to be safe or to allow for a future upgrade (which is where I'm at).

Long story is this: after two years, I've returned to PC gaming. But this time I'm going to go with a multi-monitor setup, so I'll really need a much faster graphics card and I'm planning to get the Radeon R9 280X. I'm pretty sure my current PSU can't handle that for sustained periods. Below is my current PSU:

(Photo of my current PSU's label.)


This is an 80 Plus Bronze-rated PSU that has been with me since 2011. It has two PCIe power connectors, one 6-pin and one 8-pin (6+2), evenly distributed across the 12V rails. Although I use my rig sparingly, it's been rock-solid stable during marathon sessions. My rig has the following specs:

Intel Core i7-2600
2x 4 GB DDR3-1333 RAM
Sapphire Radeon HD 6850
WD Blue 1 TB
Seagate Barracuda 320 GB
Lite-On DVD-RW drive
Encore ENMAB-8CM
2x 120 mm LED fans, 2x 80 mm LED fans

Given my current PSU, the best graphics card I can upgrade to is the R9 270X and possibly a 280X, but the 280X won't leave much headroom, and that's dangerous. The 280X alone can spike to 300W. I initially planned to upgrade to a power supply in the 600W region, the FSP Raider 550W or 650W in particular. I'm talking about the new Raider variant, which is also called the 'Raider S' in the US. The new variant (2013 onward) is silver rated but actually performs like the gold-rated Aurum. The old bronze-rated variant of the Raider failed HardOCP's testing due to high ripple under load, but the new variant eliminates that and is much better. I would've gone with Seasonic, but their gold-rated models are too expensive where I live. However, even the 650W Raider might not suffice if I decide to upgrade to an even more powerful card in the future. So I want to buy the 750W Raider or an equivalent model now.

On an off-topic but related note, I could also upgrade to a Radeon R9 270X 4GB instead, so my current PSU will suffice, and then upgrade my PSU when I finally get that truly "uber" fast graphics card. However, I don't think the 270X is powerful enough to run games on high (not ultra), even at just 4800x900 (3x 1600x900) resolution. I really feel like I need at least the 280X for now. But I don't want to upgrade my PSU twice, once for the 280X and again for the truly uber card (possibly the successor to the 290X and 780 Ti, or an R9 295X2).

Any input will be highly appreciated, expert input even more so.
 

_Rick_

Diamond Member
Apr 20, 2012
3,943
69
91
You've got 36A of 12 V coming out of that thing.
That's indeed a bit tight, and makes the 550W peak power a bit of a misnomer, as most power supplies will happily give 95% of their rated power over the 12V lines.
As effectively a ~430W dual-rail PSU, it's not ideally suited to the asymmetric power demands a 300W GPU and an 80-90W CPU will put on it.
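Back-of-the-envelope, using the rough figures being thrown around in this thread (estimates, not measured numbers), the 12V budget looks something like this:

Code:
# Rough 12V budget check. The GPU/CPU wattages are the ballpark
# estimates from this thread, not measured values.
RAIL_VOLTAGE = 12.0

rated_12v_amps = 36     # combined 12V capacity of the current PSU
gpu_watts = 300         # worst-case spike estimate for an R9 280X
cpu_watts = 90          # rough draw for an i7-2600 under load

rail_capacity_w = rated_12v_amps * RAIL_VOLTAGE   # 432 W on 12V
load_w = gpu_watts + cpu_watts                    # 390 W estimated draw
headroom_w = rail_capacity_w - load_w             # ~42 W of headroom

print(f"12V capacity: {rail_capacity_w:.0f} W")
print(f"Estimated 12V load: {load_w} W")
print(f"Headroom: {headroom_w:.0f} W ({headroom_w / rail_capacity_w:.0%})")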

I would go with a decent 500W 80+ Gold unit. (Actually, I did just that with an OC'd i5-2500K and a GTX 770/GTX 560 Ti with a big OC.)

If you get a decently efficient PSU, you shouldn't have to worry about over-provisioning massively - these higher end devices are happy to run even at 110% load for short intervals.

In the medium term, I wouldn't expect a single GPU to draw more than 350W peak load. Even though the 280X pushed the envelope, I still think we won't see triple 8-pin connectors on graphics cards.
Given that your GPU-less system can easily run at 100W under full load, spec'ing more than 500W seems unlikely to be beneficial, unless you go with more than one card in the future.

Between the different 500W gold-rated PSUs, decide on noise, fan, and cable management options. Make sure you get 2x PCIe-8 pin - but that should be on all of them.
 

SymphonyX7

Member
Oct 1, 2009
35
0
0
Thanks for the prompt response. The 36A available on the 12V rail is really pushing it for the 280X. It says here http://forum-en.msi.com/faq/article/power-requirements-for-graphics-cards that the 280X needs 30A. My i7-2600 has a TDP of 95W. 95/12 = 7.92A right? Not enough in the worst case scenario.
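Putting those numbers together (taking MSI's 30A figure at face value):

Code:
# 12V amp budget, taking MSI's 30 A recommendation for the 280X at
# face value (it may actually be a whole-system figure).
cpu_tdp_w = 95
cpu_amps = cpu_tdp_w / 12      # ~7.9 A for the i7-2600
gpu_amps = 30                  # MSI's figure for the 280X
available_amps = 36            # combined 12V rating of my current PSU

total = cpu_amps + gpu_amps
print(f"Estimated draw: {total:.1f} A vs {available_amps} A available")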

Initially, my gut told me to go with the 550W FSP Raider S, which practically performs like a gold-rated unit. Gold-rated units in my country are expensive: 500W gold-rated units like the FSP Aurum, the cheapest gold-rated PSU here, start at $100. The 550W FSP Raider S (or just the Raider here) costs $60.

Problem is, there isn't much headroom for an upgrade on the horizon. But if I go for, say, the 750W FSP Raider S, won't it consume more electricity than the 550W variant? I'm not sure desktop power supplies are fully regulated. I also assume it will last longer since it won't be stressed at all until I get that uber-fast graphics card in the future.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Thanks for the prompt response. The 36A available on the 12V rail is really pushing it for the 280X. It says here http://forum-en.msi.com/faq/article/power-requirements-for-graphics-cards that the 280X needs 30A. My i7-2600 has a TDP of 95W. 95/12 = 7.92A right? Not enough in the worst case scenario.

30A must be for the entire system; there is no way a 280X pulls 360W on its own.

Plus TDP is more of a measurement of how much heat needs to be dissipated.

Apart from that, I think you are on the right track.
 

_Rick_

Diamond Member
Apr 20, 2012
3,943
69
91
The worst number for a 280X under load that I've found is for Sapphire's Toxic version, which will pull around 25A (285W) under a heavy gaming load, with less extreme versions drawing less than 20A (down to just above 210W for the Sapphire 280X Vapor-X).

Under artificial power loads, that Toxic will draw 30A (I can scarcely believe it :awe:), and even the VaporX will hit almost 300W.

These values are according to ht4u, who measure current/power for just the graphics card. http://ht4u.net/reviews/2014/sapphire_radeon_r9_280x_vapor-x_im_test/index44.php
 

zir_blazer

Golden Member
Jun 6, 2013
1,217
507
136
First question. Will getting a power supply with a much higher rating than my rig's actual draw consume more electricity, compared to say a power supply with just enough overhead? i.e. 500W vs 700W PSU for a rig that eats 400W. I'm assuming the 700W PSU will consume more power.
False. You can take the wattage (or rail amperage) rating as the maximum capacity of the power supply to convert alternating current into direct current at the required voltages. Actual power consumption depends on efficiency, and that efficiency varies at different points of the load curve. If the hypothetical 700W unit happens to be more efficient at the point on its curve where a 400W load lands, it would actually draw less from the wall than the 500W unit.
Also, as you go up the ladder of power supplies, higher capacities usually demand higher efficiency. There is limited space for cooling (bigger and heavier heatsinks, fans, etc.), so the only way to fit all that capacity into those units is to aim for very high efficiency, which reduces the power wasted as heat and thus the cooling requirements. So you tend to see more very efficient power supplies at the high-capacity end than at the low end, simply because the high-end units can't get away with lower-quality components that dissipate tons of extra heat.
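To put hypothetical numbers on it (the efficiency figures below are made up purely for illustration, not the specs of any real unit):

Code:
# Wall draw = DC load / efficiency. Efficiency values are invented
# for illustration; real curves come from reviews, not the label.
dc_load_w = 400

eff_500w_at_80pct_load = 0.87   # hypothetical 500 W unit at 80% load
eff_700w_at_57pct_load = 0.90   # hypothetical 700 W unit at ~57% load

print(f"500 W unit: {dc_load_w / eff_500w_at_80pct_load:.0f} W at the wall")
print(f"700 W unit: {dc_load_w / eff_700w_at_57pct_load:.0f} W at the wall")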

Second question. Is future-proofing your power supply even advised? I'm under the assumption that the PSU's efficiency deteriorates over time, especially when under sustained heavy load over long periods.
That depends on how much money you have to burn and the price difference involved. Since at the high end you usually pay exponentially more for diminishing returns, it may be worth going for something bigger, but not total overkill.
I know that power supplies are supposed to lose capacity and efficiency over time, but not how much the load contributes to the wear, nor whether that wear depends on absolute load or on load relative to maximum capacity. For example, with a sustained 24/7 400W load, how much shorter would a 500W power supply's lifespan be compared to a 700W one (assuming component and build quality are comparable, obviously)?
 

_Rick_

Diamond Member
Apr 20, 2012
3,943
69
91
If the hypothetical 700W unit happens to be more efficient at the point on its curve where a 400W load lands, it would actually draw less from the wall than the 500W unit.

That's not really the case if you're already looking at efficient PSUs anyway, which usually have an extremely flat curve from 20% to 110% load, with only 1-2% of variation in efficiency, and they hit the same maximum as units with a higher rated continuous supply.
On the other hand, the sub-10% drop-off in efficiency will be slightly less drastic on the smaller unit, saving a watt or two at idle, when the computer only draws 10-15W from the PSU. You may still end up losing out because it takes 10W more at a 500W load (if it's 2% less efficient there), but you'll also pay much less when buying the unit, and 500W loads are pretty rare. Also, my computers are probably beyond the 10:1 idle-to-load ratio that's needed to make the high-power PSU a sensible investment. If you're running constant loads, those needs may shift ever so slightly.
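As a sketch of that trade-off (every efficiency and hour figure here is an assumption, just to show how idle-heavy use tilts the math):

Code:
# Yearly wall-side energy for an idle-heavy usage pattern.
# All efficiency and hour figures are assumptions for illustration only.
idle_dc_w, load_dc_w = 15, 400        # DC draw at idle / under load
idle_hours, load_hours = 8, 2         # hours per day in each state

units = {
    "500 W unit": (0.72, 0.88),       # (idle efficiency, load efficiency)
    "700 W unit": (0.65, 0.90),       # worse at 15 W idle, better at 400 W
}

for name, (idle_eff, load_eff) in units.items():
    daily_wh = idle_dc_w / idle_eff * idle_hours + load_dc_w / load_eff * load_hours
    print(f"{name}: {daily_wh * 365 / 1000:.0f} kWh/year at the wall")

# The difference works out to roughly a kWh per year, so the purchase
# price matters far more than the efficiency curve at these loads.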

In the end you need to consider your own budget and do your own min-maxing of factors, but personally I prefer to go up one level of quality over going up one level of advertised capacity, if they're available at the same increase over base price, and both fit my budget.
 

SymphonyX7

Member
Oct 1, 2009
35
0
0
Actual power consumption depends on efficiency, and that efficiency varies at different points of the load curve. If the hypothetical 700W unit happens to be more efficient at the point on its curve where a 400W load lands, it would actually draw less from the wall than the 500W unit.

This is one of my concerns regarding the higher-capacity units consuming more power. Most of the time the computer won't be drawing that much energy, and at low loads power supplies aren't that efficient. When you increase the capacity, doesn't the curve widen as well? Problem is, many manufacturers don't publish a load curve, particularly for models that aren't high-end. The reason I like FSP power supplies is that their load curves tend to be flat from 10% to 100% according to Asian reviews of FSP products, even for their lower-end models. Other brands at a similar price (do note that FSP is cheap/budget oriented) tend to have efficiency peaking between 50% and 60% load, but under 10% or 20% load they're usually at around 75% efficiency at best, and they drop several percent from peak as they approach 100% load.

Anyway, I pulled the trigger on the new variant of the Sapphire 280X Dual-X, which has 6-pin + 8-pin PCIe power connectors (the old one had 8+8). I've decided to keep my PSU for now and save my money for a modular 600 to 650W gold- or platinum-rated PSU. I'm thinking FSP Aurum 92, but I'm open to other suggestions if the additional cost isn't too great.

As for the 280X on my 500W PSU, it works surprisingly fine! I don't think it'll handle half-a-day marathon gaming sessions though, so I decided to use PowerTune (the 'power control' slider in CCC, under the AMD Overdrive menu) and benchmarked some of my games at -10% power and 0% power. The difference in actual performance is minimal, not even 10% on average. So I kept my setting at -10% power just to be safe and reduced the max GPU clock from 1020 MHz to 1000 MHz. Max memory is still at 1500 MHz. I played for over 5 hours straight and it's been stable so far: 62°C max GPU temp with an ambient temp of 33°C. Fan speed is set to auto but maxes out at 43% according to Open Hardware Monitor and GPU-Z. This is still good, I suppose. Thank god for PowerTune!

If you look at this PowerTune test done by a site, http://www.geeks3d.com/20101224/tested-radeon-hd-6970-powertune-technology/, at -20% on the power control slider there's a significant drop in power consumption. But I assume the reduction in performance would already be at least in the 20% range due to the aggressive throttling, so -10% on the power control slider is enough for me.

Thanks for all the inputs btw! :)
 

_Rick_

Diamond Member
Apr 20, 2012
3,943
69
91
If your current PSU does fine, then a true 500W (i.e. 500W on 12V), single-rail PSU (because of the asymmetric loading) should be all you need. Dual rail will do fine too, as long as both rails are hooked up to the graphics card, or one of the rails is rated for 25A.

I wouldn't worry about 600+W, if you're going to take an FSP Aurum anyway.
The 500W Aurum S has 39A on 12V; that's plenty for your needs.
A 550W Seasonic S12G doesn't cost much more, and might be an alternative.
 

philipma1957

Golden Member
Jan 8, 2012
1,714
0
76
All of the above answers were pretty good. But they fail to mention warranty and price.

For example, you do not need a Seasonic 750-watt Platinum PSU to run your gear. But when a Seasonic 750-watt Platinum goes on sale, it is under 100 bucks and has a 7-year warranty. So you should buy that if it is on sale.

http://www.newegg.com/Product/Produc...82E16817151120

not on sale.

You could get any decent 650-watt to 850-watt PSU; they would all be enough power-wise. I could list 10 PSUs that work; the trick is the sale price.



http://www.newegg.com/Product/Produc...82E16817151088


http://www.newegg.com/Product/Produc...82E16817151087

http://www.newegg.com/Product/Produc...82E16817151102


Here is a deal on sale. I have 2 of these; under 85 bucks with rebate. Not modular, but 85 bucks and a 3-year warranty.
http://www.newegg.com/Product/Produc...82E16817371056



Today there's a Shell Shocker deal on this item for $109.99:

http://www.newegg.com/Product/Product.aspx?Item=N82E16817438018


It runs 1 pm to 3 pm West Coast time. A good one, with a 10-year warranty, so you are future-proofed!
 

mikeymikec

Lifer
May 19, 2011
19,903
14,133
136
IMO, when you buy a PSU, you ought to buy it with your immediate needs in mind with a bit of spare horsepower (once you've calculated the maximum use scenario with as much research as possible), because the efficiency of PSUs drops over time. If you know for a fact that you're going to buy a new specific bit of power-consuming hardware, then the PSU choice should be affected by that.

In short, I wouldn't spec out a PSU choice based on "I might buy a much more hungry GPU at some point in the future".
 

WhoBeDaPlaya

Diamond Member
Sep 15, 2000
7,414
402
126
IMO, when you buy a PSU, you ought to buy it with your immediate needs in mind with a bit of spare horsepower (once you've calculated the maximum use scenario with as much research as possible), because the efficiency of PSUs drops over time. If you know for a fact that you're going to buy a new specific bit of power-consuming hardware, then the PSU choice should be affected by that.

In short, I wouldn't spec out a PSU choice based on "I might buy a much more hungry GPU at some point in the future".
Nothing wrong with that. Got several BFG EX-1000s for $50 when they were going out of business and can drop in any hardware I wish anytime without worrying about power requirements.