8800GTX Tested in SLI

Jun 14, 2003
10,442
0
0
11k in 3DMark06?

I thought that was the score a single 8800GTX pulled with a good Core 2 Duo?

Yeah, something's off there. A 7950GX2 scores 10k according to them, so two of a card that is already more powerful on its own than a 7950GX2 (which is technically two GPUs in SLI) only nets 1,000 extra points?

Either they did it wrong, the SLI profile is busted, or they made a typo.

I would have expected closer to 20k in '06 with two of these.
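
To put numbers on that expectation, here's a quick back-of-envelope sketch in Python. The 80% scaling efficiency is just a typical-SLI assumption for illustration, not a measured figure:

# Back-of-envelope SLI sanity check; scaling efficiency is an assumption.
def expected_sli_score(single_card_score: float,
                       scaling_efficiency: float = 0.8) -> float:
    """Estimate a two-card score, assuming the second GPU contributes
    scaling_efficiency of a full card's worth of performance."""
    return single_card_score * (1 + scaling_efficiency)

single_gtx = 11_000  # rough 3DMark06 score for one 8800GTX, per this thread
print(expected_sli_score(single_gtx))       # 19800.0 -- near the 20k guess
print(expected_sli_score(single_gtx, 0.0))  # 11000.0 -- SLI not engaging

An SLI result barely above the single-card score would match the zero-scaling case, i.e. the profile simply not kicking in.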
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Massive frame rates are great, but also useless if they haven't fixed the Vsync problems that accompany them. That said, I think F.E.A.R. is a game that SLI and Vsync now work well on; NWN2 is a different story.
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
I may have to take back what I said about the GTX not being a good value. Relative to the GTS, it may be the better bang for the buck.

The GTS should be more like $400. Fortunately, it seems to be an excellent overclocker, which may add another 20-30%.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: Dethfrumbelo
Originally posted by: ShadowOfMyself
First review, from a Chinese site, not that Inq bullshit:

http://www.pconline.com.cn/diy/graphics/reviews/0611/898744.html


Hmmm... the GTS is barely faster than an X1950XT (at stock) - except in Oblivion where it's much faster, while the GTX pulls way out in front of everybody (30-50% faster than the GTS on average).

Yep... The GTS seems to be average in raw speed; it only pulls ahead with AA on, probably because of the 320-bit bus. Also interesting how far behind the 7900GTX is compared to the X1950XTX in most games.

Anyway, the 8800GTX is looking to be a killer, especially with its new "perfect" AF and new AA modes.
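
Since the 320-bit bus keeps coming up, here's the bandwidth math as a rough sketch; the clock figures are from memory, so treat them as approximate:

# Peak memory bandwidth = (bus width in bytes) x effective memory clock.
# Clock figures are approximate, quoted from memory.
def bandwidth_gb_s(bus_bits: int, effective_mhz: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

cards = {
    "8800GTX  (384-bit @ 1800MHz)": bandwidth_gb_s(384, 1800),  # ~86.4
    "8800GTS  (320-bit @ 1600MHz)": bandwidth_gb_s(320, 1600),  # ~64.0
    "X1950XTX (256-bit @ 2000MHz)": bandwidth_gb_s(256, 2000),  # ~64.0
    "7900GTX  (256-bit @ 1600MHz)": bandwidth_gb_s(256, 1600),  # ~51.2
}
for name, bw in cards.items():
    print(f"{name}: {bw:.1f} GB/s")

With AA leaning hard on memory bandwidth, the GTS has a healthy edge over the 7900GTX there, which would fit the pattern in those benches.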
 

Centurin

Member
Sep 13, 2006
155
0
71
Originally posted by: Dethfrumbelo
Hmmm... the GTS is barely faster than an X1950XT (at stock) - except in Oblivion where it's much faster, while the GTX pulls way out in front of everybody (30-50% faster than the GTS on average).

If those benchmarks are accurate, the 8800GTS is going to be a flop. Even the X1950XTX is on par or faster in all the benchmarks except Oblivion. Might as well just get a 7950GX2 when the price drops. So disappointing. :(
 
Oct 4, 2004
10,515
6
81
Oblivion performance seems staggering for the 8800GTX. :shocked:
It totally demolishes the 7900GTX in NFS: Carbon.
And I see HDR+AA is working fine in Serious Sam 2. (I can't read Chinese, but I see they didn't bench the 7900GTX with that setting, so it must be bug/artifact-free on the 8800 for them to use it.)

I guess Oblivion will now finally get integrated support for HDR+AA.

Centurin: Wait for more benches and newer drivers. Hopefully, the confusion will wear off.
I can't wait for the Anandtech/Rage3D/bit-tech reviews.
 

moonboy403

Golden Member
Aug 18, 2004
1,828
0
76
G80's pretty darn impressive, especially in Oblivion!!!!
Too bad they didn't test out 8xAA/16xAA performance.

But I guess I'll spend that $650 on a PS3 instead.
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Originally posted by: Centurin
Originally posted by: Dethfrumbelo
Hmmm... the GTS is barely faster than an X1950XT (at stock) - except in Oblivion where it's much faster, while the GTX pulls way out in front of everybody (30-50% faster than the GTS on average).

If that's the case, then I'm better off getting a 7950GX2. I'm so confused. :confused:

To be honest, I wouldn't bother with the 7950GX2. You won't get it any cheaper than $500, and it's just not worth it: the G80s have much better AF/AA quality, and the GX2 has incompatibility issues with certain mobos.

I really want to see some overclocked game benchmarks for the 8800GTS. I know an overclock takes it from 16K to 18.7K in 3DMark05.
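
For reference, that jump works out like this (scores rounded as quoted):

# Percentage gain from the quoted 3DMark05 overclock: 16K -> 18.7K.
stock, overclocked = 16_000, 18_700
gain = overclocked / stock - 1
print(f"{gain:.1%}")  # 16.9% -- roughly in line with the 20-30% headroom
                      # mentioned earlier in the thread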


 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: moonboy403
G80's pretty darn impressive, especially in Oblivion!!!!
Too bad they didn't test out 8xAA/16xAA performance.

But I guess I'll spend that $650 on a PS3 instead.

I was wondering why they didn't bench Oblivion with AA...

It's also odd that they didn't do any SLI benching. Based on the numerous pictures before the benchmarks, they would have had the cards to do it.
 

Giffen

Member
Aug 3, 2006
33
0
0
This looks like the CPU is limiting the graphics, because:
1) a single 8800GTX gets almost 11k in 3DMark06, and
2) even if a 7900GX2 only gets 30k in '05 while the SLI 8800GTXs get about 50k marks... SLI 7900GTXs are faster than a 7900GX2, so they must get around 40k... not much of an improvement over the old 7900GTX tech.

If these numbers are true, it must be due to a CPU limitation, or they aren't cranking up the details.
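
Here's a toy model of what a CPU bottleneck does to SLI scaling; all the numbers below are invented purely for illustration:

# Toy CPU-bottleneck model: the frame rate you observe is capped by
# whichever side is slower. Numbers are invented for illustration.
def effective_fps(gpu_fps: float, cpu_fps: float) -> float:
    """A frame can't complete faster than either the GPU or CPU allows."""
    return min(gpu_fps, cpu_fps)

cpu_limit = 120.0                   # hypothetical rate the CPU can feed
single_gtx, sli_gtx = 110.0, 200.0  # hypothetical GPU-side frame rates
print(effective_fps(single_gtx, cpu_limit))  # 110.0 -- GPU-bound
print(effective_fps(sli_gtx, cpu_limit))     # 120.0 -- CPU-bound: doubling
                                             # the GPUs added under 10%

That would explain why the composite score barely moves when the second card goes in.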
 

moonboy403

Golden Member
Aug 18, 2004
1,828
0
76
Originally posted by: nitromullet
Originally posted by: moonboy403
G80's pretty darn impressive, especially in Oblivion!!!!
Too bad they didn't test out 8xAA/16xAA performance.

But I guess I'll spend that $650 on a PS3 instead.

I was wondering why they didn't bench Oblivion with AA...

It's also odd that they didn't do any SLI benching. Based on the numerous pictures before the benchmarks, they would have had the cards to do it.

Based on what the article says, it's because G80 cannot do AA+HDR in Oblivion, since there is no "Chuck patch" for it.

Hence they chose to do HDR with no AA.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: moonboy403
Originally posted by: nitromullet
Originally posted by: moonboy403
G80's pretty darn impressive, especially in Oblivion!!!!
Too bad they didn't test out 8xAA/16xAA performance.

But I guess I'll spend that $650 on a PS3 instead.

I was wondering why they didn't bench Oblivion with AA...

It's also odd that they didn't do any SLI benching. Based on the numerous pictures before the benchmarks, they would have had the cards to do it.

Based on what the article says, it's because G80 cannot do AA+HDR in Oblivion, since there is no "Chuck patch" for it.

Hence they chose to do HDR with no AA.

Thanks... I was thinking it must be something like that. Isn't Oblivion a TWIMTBP game? If so, I would assume Bethesda will have a patch that allows HDR+AA shortly.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
According to the article, apparently the 8800GTX is everything it promised to be, but not the 8800GTS... Nvidia lowered every aspect of its specs to really distinguish it from the GTX model.

 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Too bad they didn't mention which part of Oblivion they tested. Looking at the numbers, they seem too high for the heavy foliage area, which is what really hurts the video cards.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Munky
Too bad they didn't mention which part of Oblivion they tested. Looking at the numbers, they seem too high for the heavy foliage area, which is what really hurts the video cards.

QFT. Not to mention they didn't use any AA in the Oblivion testing.

Nitro, yes, Oblivion is a TWIMTBP game, but I'd expect the card's driver to be able to make HDR+AA available, although a patch wouldn't surprise me either. I just hope all of the upcoming benchmarks address this instead of skipping over it like a normal nVidia Oblivion benchmark that excludes AA while bumping up the resolution.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: beggerking
According to the article, apparently the 8800GTX is everything it promised to be, but not the 8800GTS... Nvidia lowered every aspect of its specs to really distinguish it from the GTX model.

Well, it all depends on how much the GTS costs... If the GTX is close to $700 and the GTS is $400, it really isn't a bad card by comparison for the money.
 

Centurin

Member
Sep 13, 2006
155
0
71
I would buy an 8800GTS for $400 in a heartbeat, but I highly doubt you'll see it for less than $500.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: nitromullet
Well, it all depends on how much the GTS costs... If the GTX is close to $700 and the GTS is $400, it really isn't a bad card by comparison for the money.

Well, Nvidia got smarter; lesson learned from ATI's X1900XTX vs. X1900XT.
 

enz660hp

Senior member
Jun 19, 2006
242
0
0
Man... people should really be using the top-end parts when benching, especially when it comes to a beast such as the G80. It says they used the fastest known AMD (not a quad-core Intel).
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Originally posted by: beggerking
Originally posted by: nitromullet
Well, it all depends on how much the GTS costs... If the GTX is close to $700 and the GTS is $400, it really isn't a bad card by comparison for the money.

Well, Nvidia got smarter; lesson learned from ATI's X1900XTX vs. X1900XT.

If the GTS shows up weak in the benchmarks vs. the GTX (40-50% under), then I think most people with open budgets will just grab the GTX.

The X1950XT 256MB sells for $280, so anyone on a budget would probably opt for that.
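
A rough bang-for-buck sketch using the prices and the relative-performance guesses floating around this thread (placeholders, not benchmarks):

# Rough performance-per-dollar comparison. Prices are the figures quoted
# in this thread; relative performance values are placeholder guesses.
prices = {"8800GTX": 650, "8800GTS": 500, "X1950XT 256MB": 280}
relative_perf = {"8800GTX": 1.40, "8800GTS": 1.00, "X1950XT 256MB": 0.95}

for card, price in prices.items():
    value = relative_perf[card] / price * 100  # relative perf per $100
    print(f"{card}: {value:.2f} perf/$100")

On those guesses the X1950XT wins the value race by a wide margin, which fits the budget-buyer point.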


 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: beggerking
According to the article, apparently the 8800GTX is everything it promised to be, but not the 8800GTS... Nvidia lowered every aspect of its specs to really distinguish it from the GTX model.


Watch for a GT model on 80nm or 65nm. ;)
Seems to me that there is room to "slip" a model in there between the GTS and GTX.