Get ready for 200 Watt GPUs

Pabster

Lifer
Apr 15, 2001
16,986
1
0
I'm not so sure. Current solutions already run awfully close to the 100W ceiling, and if they don't shrink to 65nm I can see these figures being plausible.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Awesome, the rumor was 300 watts for next-gen GPUs, now it's "only" 200! Think of all the power we're saving!
 

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
I wonder how expensive these will be. Didn't AMD and Intel get criticized for doing the same thing a while back?
 

sunzt

Diamond Member
Nov 27, 2003
3,076
3
81
I remember reading an article that referred to the current next-gen GPUs (G80, ATI's whatever it's called) as extremely power hungry, whereas the following generation of GPUs (the fully DX10 ones) will focus on power/performance efficiency like what AMD and Intel are doing.
 

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
Originally posted by: sunzt
I remember reading an article that referred to the current next-gen GPUs (G80, ATI's whatever it's called) as extremely power hungry, whereas the following generation of GPUs (the fully DX10 ones) will focus on power/performance efficiency like what AMD and Intel are doing.

Yeah I read the same thing, and I think the new ATI chip is the R600.
 

jazzboy

Senior member
May 2, 2005
232
0
0
To be honest, I saw this sort of article before the 6xxx and 7xxx series video cards launched, and they ended up consuming LESS power than their previous-series counterparts (the 6800U consumed less than the FX 5800U, and the 7800 used less power than the 6800U).

Short story - I'll believe it when I see it in official reviews.

Of course, if for once this does turn out to be true, then I hope people just avoid these new cards, because that sort of power consumption is nothing short of disgraceful and both ATI and Nvidia should be ashamed of themselves. I'd also hate to think what sort of cooling would be required.
 

Sonikku

Lifer
Jun 23, 2005
15,893
4,901
136
Anandtech already did an article on this, with vendors making separate PSUs that fit in an HDD bay for the sole purpose of feeding these cards.

Cards have gotten larger, hotter, more expensive, and more power hungry over time.

http://img.photobucket.com/albums/v600/Foxy_McCloud/01.jpg

For the love of the Force, Nvidia! The X1900XT is big enough as it is!

As long as consumers keep buying, they'll keep pushing the same tricks until consumers say no.
 

gxsaurav

Member
Nov 30, 2003
170
0
0
Wow, with power requirements this high, maybe I should buy the nuclear power plant next door for my home :p
 

Raduque

Lifer
Aug 22, 2004
13,140
138
106
At those power ratings, I'll have to start turning off my computer when I'm not using it. :(

Sonikku, what card is that in the middle? The top one is a 7950GX2 and the bottom is an X1900XTX, right?
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Raduque
At those power ratings, I'll have to start turning off my computer when I'm not using it. :(

Sonikku, what card is that in the middle? The top one is a 7950GX2 and the bottom is an X1900XTX, right?

That could be an "8850GX2"??? 2x mobile G80s? That thing is ginormous.

 

fbrdphreak

Lifer
Apr 17, 2004
17,555
1
0
When you are at more than twice the power of the CPU, it is time to hit yourself on the head with the same high velocity cluestick that AMD and Intel have already experienced.
Hehehehe

I'm surprised it will come to this, as a lot of NV's desktop development comes from the mobile side, which currently tops out at a 65W TDP, IIRC. Obviously the desktop side goes a bit higher, as not all the same power-saving features are implemented, but stuff like clock gating certainly is. I hope NV and ATI don't get into a high-end pissing match to develop the most disgustingly overpowered thing they can; the true difficulty will be in making a powerful, energy-efficient GPU.
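For what it's worth, the clock-gating point is easy to put rough numbers on with the standard dynamic-power model, P ≈ α·C·V²·f, where α is the fraction of the chip actually switching. A minimal sketch in Python; the capacitance, voltage, and clock figures are illustrative guesses, not real G80 specs:

```python
# Rough dynamic-power model: P = alpha * C_eff * V^2 * f.
# alpha is the activity factor (fraction of the chip switching);
# clock gating lowers alpha by stopping the clock to idle blocks.
# All constants below are illustrative guesses, not real GPU specs.

def dynamic_power(alpha, c_eff, voltage, freq_hz):
    """Switching power in watts for effective capacitance c_eff (farads)."""
    return alpha * c_eff * voltage ** 2 * freq_hz

C_EFF = 250e-9   # 250 nF effective switched capacitance (made up)
VDD = 1.2        # core voltage in volts, typical for the era
FREQ = 600e6     # 600 MHz core clock

ungated = dynamic_power(1.0, C_EFF, VDD, FREQ)  # everything clocked
gated = dynamic_power(0.6, C_EFF, VDD, FREQ)    # 40% of blocks gated off

print(f"no gating: {ungated:.0f} W")  # ~216 W
print(f"40% gated: {gated:.0f} W")    # ~130 W
```

The catch is that gating only helps while blocks sit idle; under a full 3D load alpha stays high, which is why desktop TDPs keep climbing anyway.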
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: keysplayr2003
Originally posted by: Raduque
At those power ratings, I'll have to start turning off my computer when I'm not using it. :(

Sonikku, what card is that in the middle? The top one is a 7950GX2 and the bottom is an X1900XTX, right?

That could be an "8850GX2"??? 2x mobile G80s? That thing is ginormous.
I think he asked a serious question... That is the 7900GX2, the first dual-GPU, quad-SLI attempt made by NVIDIA earlier this year, which was replaced by the 7950GX2 before it ever made it to mass production. It is not a next-gen card...
 

Golgatha

Lifer
Jul 18, 2003
12,396
1,068
126
"Either way, this is simply too much. The current 100+W is borderline insane, and adding half again to it is just silly, speed is OK, but I think we have just passed the point of being ludicrous. When you are at more than twice the power of the CPU, it is time to hit yourself on the head with the same high velocity cluestick that AMD and Intel have already experienced."

Not sure I agree with this whole quote. At extremely high resolutions and IQ settings, pretty much all games are GPU bound (even with SLI or Crossfire). CPUs are all moving to 65nm, and some are there right now. We need to shrink the die on GPUs before getting them out the door, but the vicious product cycle in the GPU arena doesn't really allow for that, so either deal with the power requirements or wait for the refreshed product early next year, IMO.
 

fbrdphreak

Lifer
Apr 17, 2004
17,555
1
0
Originally posted by: tanishalfelven
A new die shrink is just around the corner. These rumors will not pan out.
90nm-->80nm won't do much. They need to hit 65nm to make any big difference.
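For a rough sense of why, here's a back-of-the-envelope sketch assuming idealized area scaling, where switched capacitance (and hence dynamic power at a fixed voltage and clock) shrinks with the square of the feature size; real shrinks recover less than this, so treat it as a best case:

```python
# Idealized scaling: switched capacitance ~ (new_node / old_node)^2,
# so dynamic power at the same voltage and clock scales the same way.
# Real shrinks recover less than this; treat these as best-case numbers.

def ideal_power_scale(old_nm, new_nm, old_watts):
    return old_watts * (new_nm / old_nm) ** 2

for new_node in (80, 65):
    watts = ideal_power_scale(90, new_node, 200.0)  # the rumored 200W part
    print(f"90nm -> {new_node}nm: {watts:.0f} W best case")

# 90nm -> 80nm: 158 W best case (only ~21% savings)
# 90nm -> 65nm: 104 W best case (roughly half)
```

Voltage typically drops a little at a full node shrink as well (and P goes with V²), which is part of why the half-node 80nm step just doesn't buy much.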