The limits of power

KIAman

Diamond Member
Mar 7, 2001
3,342
23
81
With the HD5970 and the upcoming Fermi knocking on the limits of PSU power output and the PCI-E standard, how can the GPU industry continue on its current path?

Are they now at the stage where they must create new standards for compliance and work with PSU manufacturers on special GPU-only rails?

Look two years into the future, where there could reasonably be a GPU that is 2x as fast as the HD5970. How much power would such a card take? Will it be non-standard?
 

Phil1977

Senior member
Dec 8, 2009
228
0
0
We should move to 28nm soon, and that will allow for faster video cards that still stay within the current power draw limits...

I also see an end to higher resolutions. I believe Full HD will be the mainstream resolution for years to come. Sure, there are 30" LCDs, but I don't see them as mainstream.

Whereas in the last few years people went from 1280 x 1024 to 1680 x 1050 and now 1920 x 1080.
 

Lean L

Diamond Member
Apr 30, 2009
3,685
0
0
Power consumption only goes down as the manufacturing process gets better.
 

KIAman

Diamond Member
Mar 7, 2001
3,342
23
81
So the limit of GPU speed is wholly dependent on die shrinks? Look at the history of GPUs for the last 6 years. Even with die shrinks, the power consumption has continued to rise.

I can't possibly predict, but something more radical needs to happen for Moore's GPU laws (is there such a thing?!?) to continue.
 

Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
So the limit of GPU speed is wholly dependent on die shrinks? Look at the history of GPUs for the last 6 years. Even with die shrinks, the power consumption has continued to rise.

I can't possibly predict, but something more radical needs to happen for Moore's GPU laws (is there such a thing?!?) to continue.

Well, it is partially our fault. It would certainly be possible to keep creating slightly faster, much lower-power cards, but the market for upgrading purely for power reasons is nearly non-existent. What does seem to be common is that the mainstream performance cards of each generation are lower-power, lower-heat, higher-feature-set versions of the previous high end.

As far as Moore's law goes, we are always increasing the density. I don't think anyone ever expected that trend to mean lower power as well, at least not lower power for the total package (though power/performance has been getting better over the years).

What I think they will do is more than likely continue with each generation's high-end cards becoming more and more power hungry, up to a point. By the time PCIe 4.0 comes out we might have a drastically different architecture to play with, so who knows. We are at a point now where we need to migrate to PCIe 3.0 more for power constraints on the high end than for bandwidth... which I find rather silly.

I can't think of a situation where more performance wouldn't be sought after; even if resolution starts to reach a plateau, there are other things we can compute on a GPGPU. Unfortunately, the more transistors we make, the more power it takes to run them, and our improvements in manufacturing efficiency will likely never catch up with our increases in transistor count enough to plateau energy consumption. That is, until we migrate to something totally different.
 

bamacre

Lifer
Jul 1, 2004
21,029
2
81
So the limit of GPU speed is wholly dependent on die shrinks? Look at the history of GPUs for the last 6 years. Even with die shrinks, the power consumption has continued to rise.

But performance per watt has increased as well. I've got a slightly neutered GTX 260 running fine on my Dell OEM 375W PSU.
 

Phil1977

Senior member
Dec 8, 2009
228
0
0
So the limit of GPU speed is wholly dependent on die shrinks? Look at the history of GPUs for the last 6 years. Even with die shrinks, the power consumption has continued to rise.

Pretty much. Sure, there are optimizations in hardware and software, but the bottom line is that there is a huge correlation between the number of transistors and performance.

Without fail, every new generation had more transistors, usually double that of the last one.

Basically, die shrinks couldn't keep up with the performance demanded. That's why power draw requirements kept creeping up...

The first cards didn't need a power plug. Then they had floppy connectors, then Molex plugs. Then PCIe raised the slot power draw. Then they had 6-pin plugs, then two, and now 6+8.

I believe that because we won't see any higher resolutions than Full HD in the next few years, we will be fine with just die shrinks and without raising power requirements.

The last few years we wanted more performance while also increasing resolution at the same time...
 
Last edited:

Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
I believe that because we won't see any higher resolutions than Full HD in the next few years, we will be fine with just die shrinks and without raising power requirements.

The last few years we wanted more performance while also increasing resolution at the same time...

The thing is, no company will release a card that isn't greatly better in performance, for fear of being laughed out of business.

While we may not "need" more powerful cards to play games at 1080p (until the new consoles come out, anyway), companies are still going to offer them.

For the average 1080p consumer the best bet would be to avoid the extreme high end anyway. But you had better believe that the extreme high end will still be there (continuing not for the product and profit, but for the advertising value that having the "best" brings).

Just look at the reactions to the 5770 (no better than a 4870, and more or less hated for a long time despite its lower power requirements). Imagine what would happen if Fermi were no faster than a GTX 285, or even an HD 5870... it could use no power at all and people would still freak.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Look two years into the future, where there could reasonably be a GPU that is 2x as fast as the HD5970. How much power would such a card take? Will it be non-standard?

You are forgetting about the effect of the "die shrink" on power consumption.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
With the HD5970 and the upcoming Fermi knocking on the limits of PSU power output and the PCI-E standard, how can the GPU industry continue on its current path?

Are they now at the stage where they must create new standards for compliance and work with PSU manufacturers on special GPU-only rails?

I think the biggest area of improvement will be in these X2 cards. The HD5970 is a good example: it has two of the highest-binned Cypress cores for only 50% more money than an HD5870. Therefore, it makes sense that something like that is ripe for 2x 8-pin and maybe water cooling (as standard).
 

scooterlibby

Senior member
Feb 28, 2009
752
0
0
Interesting thread. I like the different opinions. I'm at 27.5 inch 1920x1200 now, but am setting up an HTPC/light gaming center downstairs for our 32 inch 720p television. I am curious to see what the difference is like, especially if your theory about super high res being dead turns out to be true.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I believe that because we won't see any higher resolutions than Full HD in the next few years, we will be fine with just die shrinks and without raising power requirements.

The last few years we wanted more performance while also increasing resolution at the same time...

If resolution becomes static (most people preferring a single 1080p monitor), then I think we will need to see more breakthroughs in graphics APIs in order to challenge the GPUs.

As far as triple monitors go, I think a lot of people want to see the bezels removed and/or the inputs/outputs cleaned up (the DisplayPort adapter is just too messy for them).
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I am curious to see what the difference is like, especially if your theory about super high res being dead turns out to be true.

Well, there are diminishing returns to increasing resolution at a fixed dot pitch.

I run two monitors now, but eventually I would like to go triple 1080p in portrait configuration. However, that is probably the limit for me.
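
For a sense of scale, here is a rough back-of-the-envelope sketch (plain Python; raw pixel counts only, ignoring AA, scene complexity and driver overhead, so treat it as illustrative rather than a benchmark) comparing the setups mentioned in this thread:

```python
# Crude pixel-count comparison: a stand-in for how much extra fill work
# each setup asks of the GPU (ignores AA, scene complexity, etc.).
setups = {
    "single 1080p":          (1920, 1080),
    "27in 2560x1440":        (2560, 1440),
    "30in 2560x1600":        (2560, 1600),
    "triple 1080p portrait": (3 * 1080, 1920),  # three rotated panels side by side
}

base = 1920 * 1080
for name, (w, h) in setups.items():
    pixels = w * h
    print(f"{name:>22}: {pixels / 1e6:.2f} MP ({pixels / base:.1f}x a single 1080p screen)")
```

Triple portrait 1080p works out to roughly 3x the pixels of a single screen, which is why that's probably my limit.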
 
Last edited:

Arglebargle

Senior member
Dec 2, 2006
892
1
81
I am guessing dedicated GPU power supplies as the workaround. People will already pay huge sums for fancy GPUs: An extra power supply for that alone is just another expense.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I am guessing dedicated GPU power supplies as the workaround. People will already pay huge sums for fancy GPUs: An extra power supply for that alone is just another expense.

Why not go one step further?

External GPU enclosure, new video card form factor, dedicated PSU. Something like that could have more efficient cooling and three (or more) normal-sized DVI outputs for multiple monitors. Hook it up to a "budget box" or laptop from Best Buy and the person would be good to go for high-resolution gaming/multi-tasking.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
I am guessing dedicated GPU power supplies as the workaround. People will already pay huge sums for fancy GPUs: An extra power supply for that alone is just another expense.

Power isn't just some element that you can care about on its own.
Power means heat, and heat needs to be drawn away from the GPU and expelled.

You can't just freely keep increasing power with no consideration of the impacts it will have, dedicated GPU power supply or not.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Power isn't just some element that you can care about on its own.
Power means heat, and heat needs to be drawn away from the GPU and expelled.

You can't just freely keep increasing power with no consideration of the impacts it will have, dedicated GPU power supply or not.

Yeah, but with a dedicated PSU in an external box, couldn't cooling be vastly improved (especially with a new form factor PCB)?
 

Arglebargle

Senior member
Dec 2, 2006
892
1
81
Hasn't an external solution been tried before? What were the problems that surfaced in those attempts? It sounds like a good idea to me, and it could be a nice adjunct to onboard video.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
With the HD5970 and the upcoming Fermi knocking on the limits of a PSU power output and PCI-E standards, how can the GPU industry continue on their current path?

PCI-e standards can be changed :awe: Whenever PCIe 3.0 arrives they could always move to 2x 8-pin connectors, which would give 375W I believe (75W from the slot, 150 + 150 from the 8-pin connectors). I'd rather have the GPU manufacturers focus on not making ~400W GPUs, though.
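
For reference, here's a quick sketch (Python; it just sums the nominal ceilings from the commonly quoted PCIe figures, not measured draw) of the board power each connector combination allows:

```python
# Nominal board-power ceilings from the commonly quoted PCIe figures:
# 75 W from the slot, 75 W per 6-pin plug, 150 W per 8-pin plug.
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

configs = {
    "slot only":            SLOT,
    "slot + 6-pin":         SLOT + SIX_PIN,
    "slot + 2x 6-pin":      SLOT + 2 * SIX_PIN,
    "slot + 6-pin + 8-pin": SLOT + SIX_PIN + EIGHT_PIN,  # HD5970-class dual-GPU cards
    "slot + 2x 8-pin":      SLOT + 2 * EIGHT_PIN,        # the 375W case above
}

for name, watts in configs.items():
    print(f"{name:>22}: {watts} W")
```

Going past that 375W ceiling is exactly where the non-standard territory from the first post begins.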
 

Zap

Elite Member
Oct 13, 1999
22,377
7
81
Power draw shouldn't be a problem with current power supply connectors. Obviously the solution is to increase the number of power connectors. The 7900 series needed one connector and the 8800 series needed two. Some manufacturers have continued this trend.
 

Attachments

  • funnycard.gif

cbn

Lifer
Mar 27, 2009
12,968
221
106
Hasn't an external solution been tried before? What were the problems that surfaced in those attempts? It sounds like a good idea to me, and it could be a nice adjunct to onboard video.

http://www.eteknix.com/news/ati-xgp-welcomes-hd5830-at-ces-2010/

To me the biggest problem with that external platform is the need for a proprietary connection/connector built into the laptop.

In order for this to work (on a mass scale), the video card would need to share a common high-speed connection with something else that needs a lot of bandwidth. Maybe large external SSDs (with internal RAID controllers) will eventually create the need for such a connection?
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
PCI-e standards can be changed :awe: Whenever PCIe 3.0 arrives they could always move to 2x 8-pin connectors, which would give 375W I believe (75W from the slot, 150 + 150 from the 8-pin connectors). I'd rather have the GPU manufacturers focus on not making ~400W GPUs, though.

Well, MSI has already started to break the ATX form factor. The HD5870 "Lightning" has an extra-wide PCB and two 8-pin power connectors as standard.
 

x3sphere

Senior member
Jul 22, 2009
722
24
81
www.exophase.com
We should move to 28nm soon, and that will allow for faster video cards that still stay within the current power draw limits...

I also see an end to higher resolutions. I believe Full HD will be the mainstream resolution for years to come. Sure, there are 30" LCDs, but I don't see them as mainstream.

Whereas in the last few years people went from 1280 x 1024 to 1680 x 1050 and now 1920 x 1080.

Well, Dell just released a 27" with 2560x1440 res recently, so it's not like the monitor industry is remaining stagnant when it comes to pushing resolutions higher than 1080p. Pricey now, but in another year or two it might be affordable for people outside the realm of enthusiasts/graphics professionals.

However, there have been no new 30" monitors recently, which is interesting. I wonder if we'll see anything that goes beyond 2560x1600 at a consumer level this year.
 
Last edited:

blanketyblank

Golden Member
Jan 23, 2007
1,149
0
0
I say we won't need to care about graphics card power requirements in the future, since all the heavy processing will be done on a server and all you'll need is a more minimal system with a very fast internet connection.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Well, Dell just released a 27" with 2560x1440 res recently, so it's not like the monitor industry is remaining stagnant when it comes to pushing resolutions higher than 1080p. Pricey now, but in another year or two it might be affordable for people outside the realm of enthusiasts/graphics professionals.

I don't think the price will drop on 2560x1440 unless Dell decides to go to a TN panel.

Why does Dell not want to use TN panels on the large monitors? I have no idea. Maybe once a monitor gets to a certain size, viewing angles become more important.