The future of PSUs

Rhezuss

Diamond Member
Jan 31, 2006
4,118
34
91
Hiya!

I post this in the PSU forums since it's "power" related...

I've been thinking about replacing my aging OCZ ModStream 450W, which has been running for 2.5 years.

While shopping for PSUs, and after buying a new mobo and CPU, I noticed, as most of you have, that CPUs need less and less power to operate. The new CPUs are 45nm and smaller while being more powerful, with TDPs of 95W and even 80W or 65W.

GPUs are another story. Just take for example the HD 5870 versus the new HD 5970. The former needs a 500W PSU and the latter a 650W PSU.

- All in all, do you think that in the future power needs will lessen?

- For someone who won't Crossfire but will keep his hardware up to date, how much power would you recommend for the next 3 years?
 

vshah

Lifer
Sep 20, 2003
19,003
24
81
If you're not going to Crossfire, I would be happy with a good 600W PSU.
 

pitz

Senior member
Feb 11, 2010
461
0
0
I suspect the trend in the future is not towards more powerful PSUs, but rather towards increasingly intelligent monitoring functions on the PSUs. For instance, real-time voltage/current/power readouts that can be read by the computer and used to adjust component clocks and settings accordingly.

For instance, let's say you have a 300W PSU and you want to throw 2 high-end Radeon (or Nvidia) boards onto it, along with a 140W CPU, etc. At some point, the interface between the PSU and the motherboard will be able to recognize that the supply is inadequate and will adjust clock rates on components accordingly so that the system remains operational.

The same machines will also be able to down-throttle themselves in response to load-shedding commands issued by electric utilities when power is abnormally expensive or scarce. And the PSUs will be able to monitor and report back stats on their own health, much like SMART does with hard drives. For instance, collecting data on ripple as a function of load can give clues as to capacitor life, collecting data on AC power quality, etc.

Adding these functions in the next few years will, of course, require a few additional pins on the power connector, or some sort of USB interface, although I suspect the industry will just add it as a pair of pins on the power interface, for data transmission between the PSU and the mobo.
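
Something like this already exists for server power supplies (PMBus), but on the desktop it's all speculation. Here's a rough sketch of what the host-side logic might look like; the telemetry interface, the numbers and the throttle hook are all made up for the example:

Code:
# Hypothetical sketch of a host-side monitor polling PSU telemetry and
# throttling when the load gets too close to the rating. None of this
# reflects a real PSU interface; the numbers are placeholders.

RATED_WATTS = 300      # the undersized PSU from the example above
HEADROOM = 0.90        # start backing off at 90% of the rating

def read_psu_telemetry():
    """Pretend to read live figures from the PSU over the extra data pins."""
    # A real implementation would talk PMBus/SMBus or whatever interface
    # ends up in the spec; these values are canned for illustration.
    return {"v12": 12.05, "i12": 22.0, "watts_out": 285.0, "ripple_mv": 38}

def poll_once(history, set_gpu_clock_pct):
    reading = read_psu_telemetry()
    history.append(reading)             # SMART-style log for health trending
    if reading["watts_out"] > RATED_WATTS * HEADROOM:
        # Not enough PSU for the parts: drop clocks instead of browning out.
        set_gpu_clock_pct(75)
    return reading

history = []
print(poll_once(history, lambda pct: print(f"throttle GPUs to {pct}%")))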
 

Absolution75

Senior member
Dec 3, 2007
983
3
81
The industry is moving towards more power efficient processors - or at least performance/watt.

Recent reviews have had a negative spin on power-sucking units (HD 5830) and I can see this continuing. I doubt you'll need anything more than 600W unless you're running Crossfire/SLI + a large CPU OC.

The only place I see PSU makers going is lowering ripple. I haven't bought a PSU since my nForce 3 (RMA'ed an old OCZ because the fan controller was bad, so the fan ran at 100% all the time).
 

Rhezuss

Diamond Member
Jan 31, 2006
4,118
34
91
The industry is moving towards more power efficient processors - or at least performance/watt.

Recent reviews have had a negative spin on power-sucking units (HD 5830) and I can see this continuing. I doubt you'll need anything more than 600W unless you're running Crossfire/SLI + a large CPU OC.

The only place I see PSU makers going is lowering ripple. I haven't bought a PSU since my nForce 3 (RMA'ed an old OCZ because the fan controller was bad, so the fan ran at 100% all the time).

600/650W, that's what I aim for, but my current OCZ 450W has been running for the last 3 years now. My next one will have to be sufficient for my next 2-3 hardware upgrades.

I tend to go for at least 650W because, for instance, the new HD 5970 asks for a minimum 650W PSU. If I decide to upgrade my HD 4850 in 1 or 2 years, I don't want to have to replace a newly bought PSU.

But maybe the HD 6xxx series will be less power hungry and a 500W to 650W unit will be enough.
 

yottabit

Golden Member
Jun 5, 2008
1,671
874
146
I've been thinking about this too, mostly because I'm amazed that some enthusiast rigs are approaching the limit of what a typical US wall socket can provide... I never thought I would see the day of 1000+ W power supplies for the consumer! I mean, lots of wall sockets are 10A, so that means you can draw maybe 1200 watts before blowing a fuse. Yes, there are 15 and 20A 120V sockets, but I just find the idea of blowing a fuse or tripping a circuit breaker with your computer ridiculous!
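
For reference, the back-of-the-envelope math is just volts times amps; the 80% sustained-load derating below is my own rule-of-thumb assumption, not something from a spec:

Code:
# Rough wall-socket ceiling math for US 120V circuits.
# The 0.8 sustained-load factor is a rule-of-thumb assumption.
VOLTS = 120
for breaker_amps in (10, 15, 20):
    absolute = VOLTS * breaker_amps    # above this the breaker/fuse trips
    sustained = absolute * 0.8         # comfortable continuous draw
    print(f"{breaker_amps}A circuit: {absolute}W absolute, ~{sustained:.0f}W sustained")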

That being said most people like to get bigger power supplies than they actually need, so the idea of hitting 1200 W draw is pretty crazy.

So I think there is a practical upper limit to it in that sense... I'd be surprised to ever see 2000W power supplies that require a 20A breaker, because that's not very common. If GPUs keep following this trend maybe we'll see dual-PSU systems. Or auxiliary PSUs made specifically for video cards?

I imagine GPU vendors don't feel the pressure to maximize performance per watt on their high-end cards because the enthusiast is the main consumer in that area and they don't care about the power usage. It's not like with CPUs, where performance per watt really matters in the server arena. Rather, the energy-efficient offerings are things like Nvidia Ion and AMD Fusion that are geared toward the mainstream and low end. Although maybe GPGPU will change that and put pressure on manufacturers to make performance per watt a priority, as more and more people use those in clusters.
 

Daemas

Senior member
Feb 20, 2010
206
0
76
I've been thinking about this too, mostly because I'm amazed that some enthusiast rigs are approaching the limit of what a typical US wall socket can provide... I never thought I would see the day of 1000+ W power supplies for the consumer! I mean, lots of wall sockets are 10A, so that means you can draw maybe 1200 watts before blowing a fuse. Yes, there are 15 and 20A 120V sockets, but I just find the idea of blowing a fuse or tripping a circuit breaker with your computer ridiculous!

That being said most people like to get bigger power supplies than they actually need, so the idea of hitting 1200 W draw is pretty crazy.

So I think there is a practical upper limit to it in that sense... I'd be surprised to ever see 2000W power supplies that require a 20A breaker, because that's not very common. If GPUs keep following this trend maybe we'll see dual-PSU systems. Or auxiliary PSUs made specifically for video cards?

I imagine GPU vendors don't feel the pressure to maximize performance per watt on their high-end cards because the enthusiast is the main consumer in that area and they don't care about the power usage. It's not like with CPUs, where performance per watt really matters in the server arena. Rather, the energy-efficient offerings are things like Nvidia Ion and AMD Fusion that are geared toward the mainstream and low end. Although maybe GPGPU will change that and put pressure on manufacturers to make performance per watt a priority, as more and more people use those in clusters.

I'd say that 1000-1250W will be the limit for a long, long time.

It wouldn't be feasible for Intel to release a part greater than 130W TDP due to OEM sales (which are most of their sales), and GPU makers have a pretty hard limit at 300W per card (the PCIe spec limit). The only way you're going to hit 1250W is:

CPU OCed to 250W (4.0GHz i7 or similar)
2x 5970s
6x DDR3 dimms
High end desktop mobo
2x SATA HDDs
1x SSD
1x Blu Ray Burner
4x random USB devices
2x PCIe Cards (Sound card and Intel NIC or raid card)
fan controller
card reader
15x 120mm fans
2x Laing DDC 3.2s

At 100% peak load with 30% cap aging, that's only 1170W without OCed graphics cards, but HardOCP only got 20 extra watts per card OCed, so add 40 to get 1210W. Certainly towards the limit, but definitely still fine.
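
If you want to play with that kind of estimate yourself, the math is just a sum plus the aging margin. All the per-component numbers below except the 250W CPU and the 30% margin are rough guesses, not official figures, so treat it as a ballpark:

Code:
# Ballpark PSU sizing: sum of rough peak-draw guesses, then add 30%
# headroom for capacitor aging. Only the 250W CPU figure and the 30%
# margin come from the post above; the rest are placeholder estimates.
peak_watts = {
    "CPU OCed to 4.0GHz":            250,
    "2x HD 5970 (stock clocks)":     2 * 255,
    "mobo, 6x DDR3, drives, cards":   90,
    "fans, pumps, misc":              55,
}
CAP_AGING_MARGIN = 1.30   # plan for ~30% capacity loss over the PSU's life

raw_peak = sum(peak_watts.values())
sized = raw_peak * CAP_AGING_MARGIN
print(f"raw peak ~{raw_peak}W, with aging margin ~{sized:.0f}W")  # lands near the 1170W figure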

I mean, that right there is the best of the best of the best, and then taking that best and OCing the crap out of it. If I did that I would certainly get something like an Enermax Galaxy Evo 1250W and then a Seasonic X-750 (because it's awesome, not because of the wattage rating), but most people aren't going to even come close. That rig is probably looking at 1300-1350W peak. But even Crysis (or Crysis 2 or Metro 2033) isn't going to max out the processor and 4 GPUs all the time at the same time. You'd have to be running IBT and FAH or something.

It's just not going to happen.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
I suspect the trend in the future is not towards more powerful PSUs, but rather towards increasingly intelligent monitoring functions on the PSUs. For instance, real-time voltage/current/power readouts that can be read by the computer and used to adjust component clocks and settings accordingly.

For instance, let's say you have a 300W PSU and you want to throw 2 high-end Radeon (or Nvidia) boards onto it, along with a 140W CPU, etc. At some point, the interface between the PSU and the motherboard will be able to recognize that the supply is inadequate and will adjust clock rates on components accordingly so that the system remains operational.

The same machines will also be able to down-throttle themselves in response to load-shedding commands issued by electric utilities when power is abnormally expensive or scarce. And the PSUs will be able to monitor and report back stats on their own health, much like SMART does with hard drives. For instance, collecting data on ripple as a function of load can give clues as to capacitor life, collecting data on AC power quality, etc.

Adding these functions in the next few years will, of course, require a few additional pins on the power connector, or some sort of USB interface, although I suspect the industry will just add it as a pair of pins on the power interface, for data transmission between the PSU and the mobo.

When that happens, it would be the end of ATX as a power-supply spec, I figure. I dunno, perhaps it's time for something like that to happen.