8800gtx preview


wanderer27

Platinum Member
Aug 6, 2005
2,173
15
81
I'd be interested in seeing what kind of temps this thing runs at. Looks like there's a pretty decent cooling system on this, and I like that it vents it out of the case.

 

Skott

Diamond Member
Oct 4, 2005
5,730
1
76
Originally posted by: Cabages
Will this case be able to fit this?

Is the GTS version any smaller?


I'd be leery of saying yes on the GTX version. You need to find someone who has that case and take measurements to be sure. One way is to slide an 8.5x11.5 piece of paper in there and see if it touches the back of the HD cage. I'm running on the theory that if the case is 20" or greater in depth there won't be any problems; 19" or shorter in depth and there's potential for problems. Of course, it also depends on the layout of the inside of the case. That's my current working theory until we have more concrete info.
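(The depth rule of thumb above can be sketched as a quick back-of-the-envelope check. The card length matches the commonly reported ~10.5" figure for the GTX; the cage depth and rear offset are assumptions for illustration, not measurements.)

```python
# Back-of-the-envelope fit check for a long card in a tower case.
# CARD_LENGTH_IN matches the commonly reported ~10.5" GTX length;
# the other two figures are assumed for illustration only.
CARD_LENGTH_IN = 10.5   # reported length of the 8800 GTX
CAGE_DEPTH_IN = 4.0     # assumed depth of the HDD cage
REAR_OFFSET_IN = 1.0    # assumed gap behind the rear slot bracket

def card_fits(case_depth_in: float) -> bool:
    """True if the card clears the drive cage, on these assumptions."""
    usable = case_depth_in - CAGE_DEPTH_IN - REAR_OFFSET_IN
    return usable >= CARD_LENGTH_IN

print(card_fits(20.0))  # 20" deep case: 15.0" usable -> True
print(card_fits(15.0))  # 15" deep case: 10.0" usable -> False
```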
 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
Seems a little faster than just "30%", at least in some of the games tested.

This thing runs faster than my 7900GT's in SLI.
I'll probably sell the GT's and get a G80. One will suffice for now.

I think of it this way:
The X1950XTX was almost as fast as my 7900GT's but still slower in most cases.
X1950XTX in CF raped my 7900GT's.
Single G80 will match or better X1950XTX's in CF.

That's at least my way of thinking about it. I've wanted a card with 512MB of VRAM (in this case 768MB) for a long time. The newer games at my resolutions were eating up a lot of memory.

A game that stands out is the Dark Messiah game that just came out.
Totally rapes my 7900GT's and I have no idea why.
The normal Source games with 4xAA/16xAF weren't that demanding.

Oh well.
 

Cabages

Platinum Member
Jan 1, 2006
2,918
0
0
Originally posted by: Skott
Originally posted by: Cabages
Will this case be able to fit this?

Is the GTS version any smaller?


I'd be leery of saying yes on the GTX version. You need to find someone who has that case and take measurements to be sure. One way is to slide an 8.5x11.5 piece of paper in there and see if it touches the back of the HD cage. I'm running on the theory that if the case is 20" or greater in depth there won't be any problems; 19" or shorter in depth and there's potential for problems. Of course, it also depends on the layout of the inside of the case. That's my current working theory until we have more concrete info.

I already have the case. I will try that paper thing, and see if it fits.

Edit: Just tried the paper. 8.5" by 11.5" barely fit, so I don't think I have a chance of fitting one in.
 
darkswordsman17
Mar 11, 2004
23,444
5,852
146
Originally posted by: Childs
Originally posted by: josh6079
Those scores look really degrading to the X1950XTX.

Here Anandtech scored ~20 more than they did with the same card at a higher resolution with the same amount of AA in Quake 4.

You have to remember AT's review used an X6800 (2.93GHz), while DT used a QX6700 (I think 2.66GHz). I think it's reasonable to assume that an 8800GTX with an X6800 would be even faster.

There's no way that a slightly faster CPU is going to cause it to run almost 20FPS faster at a higher resolution with the same settings. When I read their little preview, I thought to myself that something must be off, as I don't believe that an X1950XT would only manage that framerate at 16x12 in HL2.

Text

Look at that link. About 50% improved performance from less than 300MHz on the CPU? It's getting, at 20x15 4xAA, about the frame rates DailyTech is reporting for 16x12 4xAA.

There's definitely something off there. I'd be very curious to know what settings were used and why (did they force the 8800 into HQ? maybe NVIDIA told them that some lower setting of theirs is equal, image-quality-wise, to a higher setting that ATI has?).
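(The "about 50% improved performance" claim above is easy to sanity-check as simple arithmetic. The frame-rate numbers below are illustrative, not taken from either review.)

```python
# Percentage speedup between two average frame rates.
def speedup_pct(fps_old: float, fps_new: float) -> float:
    return (fps_new - fps_old) / fps_old * 100.0

# Illustrative numbers only: 34 fps -> 51 fps is a 50% improvement.
print(speedup_pct(34.0, 51.0))  # 50.0
```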
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Is 8800GTS shorter than a GTX?
Yes - by all accounts it seems to be about the same length as an X1950/7900GTX.

I think I'll be getting one of those this generation (if it's 50% faster than my 7900 GTX I'll be more than happy).
 

Childs

Lifer
Jul 9, 2000
11,313
7
81
Originally posted by: darkswordsman17

There's no way that a slightly faster CPU is going to cause it to run almost 20FPS faster at a higher resolution with the same settings. When I read their little preview, I thought to myself that something must be off, as I don't believe that an X1950XT would only manage that framerate at 16x12 in HL2.

Text

Look at that link. About 50% improved performance from less than 300MHz on the CPU? It's getting, at 20x15 4xAA, about the frame rates DailyTech is reporting for 16x12 4xAA.

There's definitely something off there. I'd be very curious to know what settings were used and why (did they force the 8800 into HQ? maybe NVIDIA told them that some lower setting of theirs is equal, image-quality-wise, to a higher setting that ATI has?).


A different CPU and a different demo benchmarked produce different results: Lost Coast vs. EP1. With HDR and 4xAA/16xAF @ 1600x1200, 60 fps is reasonable.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Looks very nice!! Let's get some games that can really use this power.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: tanishalfelven
Originally posted by: munky
Originally posted by: nitromullet
Originally posted by: munky
In case nobody noticed, the rumored power consumption specs always turn out to be blown out of proportion compared to the actual power consumption. It's good to see the g80 not using ridiculuusly a lot of power, even my 430 watt psu would easily handle it.

I agree, but why the dual 6-pin molex plugs if it only draws 4% more juice than an X1950XT?

I read somewhere that the second one is optional, like on the 6800u. I just cant remember where I saw it...

But even so, ~350 watts at load for the whole system, with a quad-core CPU and the G80, is not too bad at all.

*edited for the grammer polise ;)

the spelling police could also use a tip off.

What about the sarcasm police?
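(The ~350 W whole-system estimate in the quoted post can be sketched as a simple sum against the PSU's rated output. Every wattage below is an assumption for illustration, not a measured figure.)

```python
# Rough PSU headroom check: sum assumed component draws and compare
# against the supply's rating. All wattages are illustrative guesses.
PSU_RATED_W = 430.0

system_draw_w = {
    "quad-core CPU": 130.0,
    "G80 card": 150.0,
    "rest of system": 70.0,
}

total = sum(system_draw_w.values())
print(total)                 # 350.0
print(total <= PSU_RATED_W)  # True: under the 430 W rating
```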
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Originally posted by: munky
Originally posted by: tanishalfelven
Originally posted by: munky
Originally posted by: nitromullet
Originally posted by: munky
In case nobody noticed, the rumored power consumption specs always turn out to be blown out of proportion compared to the actual power consumption. It's good to see the g80 not using ridiculuusly a lot of power, even my 430 watt psu would easily handle it.

I agree, but why the dual 6-pin molex plugs if it only draws 4% more juice than an X1950XT?

I read somewhere that the second one is optional, like on the 6800u. I just cant remember where I saw it...

But even so, ~350 watts at load for the whole system, with a quad-core CPU and the G80, is not too bad at all.

*edited for the grammer polise ;)

the spelling police could also use a tip off.

What about the sarcasm police?

Wow, that's like double irony. I think my head is going to explode.


 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Let's get some games that can really use this power.
I'm playing through Fear Extraction Point right now and let me tell you, a single 7900 GTX just isn't enough to get good performance at decent settings.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: BFG10K
Let's get some games that can really use this power.
I'm playing through Fear Extraction Point right now and let me tell you, a single 7900 GTX just isn't enough to get good performance at decent settings.
i know what i mean by that ... what do you mean?
[i am guessing: 16x12 with 4xAA/16AF; everything maxed/no ss ... dips below 30 fps?]

 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Originally posted by: munky
Originally posted by: tanishalfelven
Originally posted by: munky
Originally posted by: nitromullet
Originally posted by: munky
In case nobody noticed, the rumored power consumption specs always turn out to be blown out of proportion compared to the actual power consumption. It's good to see the g80 not using ridiculuusly a lot of power, even my 430 watt psu would easily handle it.

I agree, but why the dual 6-pin molex plugs if it only draws 4% more juice than an X1950XT?

I read somewhere that the second one is optional, like on the 6800u. I just cant remember where I saw it...

But even so, ~350 watts at load for the whole system, with a quad-core CPU and the G80, is not too bad at all.

*edited for the grammer polise ;)

the spelling police could also use a tip off.

What about the sarcasm police?

*looks behind.
"oh there they are. "
* runs away.

anyway, back to the thread. Like I said, benching these old games (and Prey, which is based on an old game's engine) is quite useless.
Give us the real benches using Oblivion and other newer games.

 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: Cooler
Maybe AMD can add its Cool'n'Quiet technology.

Video cards already do something similar to CnQ. Both lower clock speeds when the GPU/CPU isn't under load.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
DT =/= AT.
True, and I never claimed so. I only claimed that an X1950XTX will get a higher average frame rate than 34 fps in Q4 with 4xAA @ 1600x1200.
Remember, different sites use different methods, settings and timedemos.
This is also true. However, the differences shouldn't be that large. Twenty frames more on the setup with two fewer CPU cores and a higher resolution, at the same amount of AA, in a multi-threaded game? Doesn't sound right. DT's system should have brought more to the table, with four CPU cores and a lower resolution than AT's benches.
They could, for one, have used 16x HQ AF along with TrAA/AAA as well, in a different timedemo.
Except that TrAA/AAA is a Direct3D feature, not an OpenGL one.

I'm not saying the G80's scores are wrong but to me DT's score for the X1950XTX is too low. My X1900XT gets a better average than that.
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
Anything's going be better than that flickerpiss nosecum Rageon extee ninteen hundred extee-ex. ;)
 

KeithTalent

Elite Member | Administrator | No Lifer
Administrator
Nov 30, 2005
50,231
118
116
Originally posted by: MS Dawn
Anything's going be better than that flickerpiss nosecum Rageon extee ninteen hundred extee-ex. ;)

You are the only one I have ever seen complain about this on AT. I have never experienced flickering of any kind, but maybe I am just lucky :D
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
i know what i mean by that ... what do you mean?
I'm currently running it at 1440x1080 with 4xAA, 16xAF and TrAA SS and there are visible slowdowns in many places on my single 7900 GTX.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: BFG10K
i know what i mean by that ... what do you mean?
I'm currently running it at 1440x1080 with 4xAA, 16xAF and TrAA SS and there are visible slowdowns in many places on my single 7900 GTX.
What frame rate do those visible slowdowns correlate to?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
I haven't checked the counter but I'd estimate they'd be well below 40 FPS.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Yeah, when I feel like a game gets a slow down, it normally ends up being about 20-25 frames (excluding Oblivion and NWN2 of course).