Xbit Labs' intensive review of 7 games

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
As for single-chip, single-card solutions, the GeForce 7900 GTX is in the lead, but not by much. Its 24 TMUs help this card feel confident at high resolutions with FSAA enabled. This is also the case when a game contains a lot of pixel shaders with multiple texture lookups, or just a lot of high-resolution textures. On the other hand, the Radeon X1900 XTX, though it has somewhat lower average performance than the GeForce 7900 GTX, often surpasses the latter in minimum speed thanks to its ability to process more pixel shaders simultaneously. Thus, it provides a bigger speed reserve in games that make wide use of visual effects created by means of mathematics-heavy shaders. So, your choice will probably depend on which particular games you are going to play.

:thumbsup:
 

Kromis

Diamond Member
Mar 2, 2006
5,214
1
81
What the hell? How can an X1800XT lose to a 7900GT in Oblivion!??! Or maybe that's just dungeon areas...
 

LittleNemoNES

Diamond Member
Oct 7, 2005
4,142
0
0
Every single review changes things around. Before, X1900 Crossfire was faster than 7900GTX SLI in Oblivion -- now Nvidia has the edge. I don't know, man. Anand had a nice thorough review, so I'll stick with that. It always seems like some sites have something to prove: "Hey, the new beta driver is out -- hope it can beat so-and-so card. Hey, it did! Let's publish it!"
 

zendari

Banned
May 27, 2005
6,558
0
0
What I don't get is why the X1900s crushed the 7900s in Oblivion in AT's review, yet lose in this one.....

AT had the x1900xt > 7900gt by 51%, and this is closer to 23%.

Interesting selection of games, though; I've never even heard of Titan Quest.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Silly review. No mention of IQ, and besides, I am totally at a loss as to what exotic Nvidia settings give IQ equal to ATI's default, or whatever. ;)
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
We set up the ATI and Nvidia drivers as follows:

ATI Catalyst:

Catalyst A.I.: Standard
Mipmap Detail Level: Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
Quality AF: Off
Other settings: default

Nvidia ForceWare:

Image Settings: Quality
Vertical sync: Off
Trilinear optimization: On
Anisotropic mip filter optimization: Off
Anisotropic sample optimization: On
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default

:roll:

Both were standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?
 

Praxis1452

Platinum Member
Jan 31, 2006
2,197
0
0
X1900XTX = 7900GTX outdoors... what Oblivion settings were they using??? I stopped on the third page... how sad.
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows:

ATI Catalyst:

Catalyst A.I.: Standard
Mipmap Detail Level: Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
Quality AF: Off
Other settings: default

Nvidia ForceWare:

Image Settings: Quality
Vertical sync: Off
Trilinear optimization: On
Anisotropic mip filter optimization: Off
Anisotropic sample optimization: On
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default

:roll:

Both were standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?

Like it takes some stretch of the mind to figure out why Xbit never changes IQ levels. Oblivion is a game with a lot of different areas that produce a lot of different fps results, but most of them stayed about the same between the Xbit and Anandtech reviews, except the 7900GTX: it got 29fps on Anandtech and 43.3fps on Xbit at the same res and settings. Others increased too.
The X1900XTX went from 32.6fps on Anandtech to 42.2fps.

Crossfire and SLI got a boost too.
Crossfire: Anandtech 46.1fps, Xbit 56.3fps
SLI: Anandtech 43.5fps, Xbit 56.2fps

So:
X1900XTX difference: 9.6fps
7900GTX difference: 14.3fps

Xfire difference: 10.2fps
SLI difference: 12.7fps

I know it is a little hard to compare performance results between two benches; I just wanted to see what the breakdown was and what performed differently on Xbit's bench so that Nvidia came out better in Oblivion. One thing I thought was strange was that they used Nvidia's 91.31 drivers but didn't use ATI's 6.8s.
The Anandtech bench used the 6.4s w/ Chuck patch and Nvidia's 84.43, so that bench is out of date on both sides. But I just wonder why Xbit chose not to test with ATI's newest drivers if they are going to test with Nvidia's?
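If anyone wants to sanity-check those deltas, here is a minimal Python sketch; the fps figures are just the ones quoted above from the two reviews, and the layout is only my own illustration:

# Oblivion fps figures quoted above from the Anandtech and Xbit reviews
results = {
    "X1900XTX":  {"Anandtech": 32.6, "Xbit": 42.2},
    "7900GTX":   {"Anandtech": 29.0, "Xbit": 43.3},
    "Crossfire": {"Anandtech": 46.1, "Xbit": 56.3},
    "SLI":       {"Anandtech": 43.5, "Xbit": 56.2},
}

for card, fps in results.items():
    # e.g. 7900GTX: 43.3 - 29.0 = +14.3 fps in Xbit's favor
    diff = fps["Xbit"] - fps["Anandtech"]
    print(f"{card}: Xbit is {diff:+.1f} fps higher than Anandtech")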
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: redbox
Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows:

ATI Catalyst:

Catalyst A.I.: Standard
Mipmap Detail Level: Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
Quality AF: Off
Other settings: default

Nvidia ForceWare:

Image Settings: Quality
Vertical sync: Off
Trilinear optimization: On
Anisotropic mip filter optimization: Off
Anisotropic sample optimization: On
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default

:roll:

Both were standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?

Like it takes some stretch of the mind to figure out why Xbit never changes IQ levels. Oblivion is a game with a lot of different areas that produce a lot of different fps results, but most of them stayed about the same between the Xbit and Anandtech reviews, except the 7900GTX: it got 29fps on Anandtech and 43.3fps on Xbit at the same res and settings. Others increased too.
The X1900XTX went from 32.6fps on Anandtech to 42.2fps.

Crossfire and SLI got a boost too.
Crossfire: Anandtech 46.1fps, Xbit 56.3fps
SLI: Anandtech 43.5fps, Xbit 56.2fps

So:
X1900XTX difference: 9.6fps
7900GTX difference: 14.3fps

Xfire difference: 10.2fps
SLI difference: 12.7fps

I know it is a little hard to compare performance results between two benches; I just wanted to see what the breakdown was and what performed differently on Xbit's bench so that Nvidia came out better in Oblivion. One thing I thought was strange was that they used Nvidia's 91.31 drivers but didn't use ATI's 6.8s.
The Anandtech bench used the 6.4s w/ Chuck patch and Nvidia's 84.43, so that bench is out of date on both sides. But I just wonder why Xbit chose not to test with ATI's newest drivers if they are going to test with Nvidia's?

Weren't the 6.8s just released today or yesterday?
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: redbox
But I just wonder why Xbit chose not to test with ATI's newest drivers if they are going to test with Nvidia's?

The title of the article is "Seven Games and One Week". 6.8 just came out today. Xbit was probably testing all last week. They could have used 91.45 as well.

Good review overall, it gives a nice baseline without all the little odd tweaks (that each card has) being messed with. I like how they use a broader range of games instead of a handful of FPS games and 3Dmark like most sites.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Wreckage
Originally posted by: redbox
But I just wonder why Xbit chose not to test with ATI's newest drivers if they are going to test with Nvidia's?

The title of the article is "Seven Games and One Week". 6.8 just came out today. Xbit was probably testing all last week. They could have used 91.45 as well.

Good review overall, it gives a nice baseline without all the little odd tweaks (that each card has) being messed with. I like how they use a broader range of games instead of a handful of FPS games and 3Dmark like most sites.

True, I like their range of games as well, but I don't see why they use those default settings when someone would have a top of the line Crossfire or SLI setup. Who honestly has two 7900GTX's in SLI and uses the default "Quality" setting in the driver?
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: josh6079
Originally posted by: Wreckage
Originally posted by: redbox
But I just wonder why Xbit chose not to test with ATI's newest drivers if they are going to test with Nvidia's?

The title of the article is "Seven Games and One Week". 6.8 just came out today. Xbit was probably testing all last week. They could have used 91.45 as well.

Good review overall, it gives a nice baseline without all the little odd tweaks (that each card has) being messed with. I like how they use a broader range of games instead of a handful of FPS games and 3Dmark like most sites.

True, I like their range of games as well, but I don't see why they use those default settings when someone would have a top of the line Crossfire or SLI setup. Who honestly has two 7900GTX's in SLI and uses the default "Quality" setting in the driver?

Well, like I said, it gives a good baseline. I bet nobody has their setup the same. This is the best way to test both cards apples to apples.

I bet all those people who bought SLI Dells left the settings at default. :p
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
I bet all those people who bought SLI Dells left the settings at default. :p

QFT.

Well, like I said, it gives a good baseline. I bet nobody has their setup the same. This is the best way to test both cards apples to apples.

Wouldn't setting both to "High Quality" be an apples-to-apples comparison? I know keeping everything at default is a good baseline, but Xbit is a site for enthusiasts who research particular pieces of hardware, not average Joes with SLI Dell systems that came in a box.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: josh6079
I bet all those people who bought SLI Dells left the settings at default. :p

QFT.

Well, like I said, it gives a good baseline. I bet nobody has their setup the same. This is the best way to test both cards apples to apples.

Wouldn't setting both to "High Quality" be an apples-to-apples comparison? I know keeping everything at default is a good baseline, but Xbit is a site for enthusiasts who research particular pieces of hardware, not average Joes with SLI Dell systems that came in a box.

pretty silly to have super-high resolutions and then dumb down the IQ

And they test low-end cards at 19x12, which is really stupid . . . we get to see the 'fastest' of the 'slowest' at completely UNplayable FPS. :p
:thumbsdown:

what are they smoking?
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: Wreckage
Originally posted by: redbox
But I just wonder why Xbit chose not to test with ATI's newest drivers if they are going to test with Nvidia's?

The title of the article is "Seven Games and One Week". 6.8 just came out today. Xbit was probably testing all last week. They could have used 91.45 as well.

Good review overall, it gives a nice baseline without all the little odd tweaks (that each card has) being messed with. I like how they use a broader range of games instead of a handful of FPS games and 3Dmark like most sites.

I just went to Nvidia.com -- are the 91.45s beta? It took me to the 91.31s. They could have used the 6.7 Cats, couldn't they? I don't think it would have given them any more performance, but you never know. I do like the different games benchmarked; I just wish they would push both cards to the limit, which can be done with the IQ settings, which are anything but odd little tweaks being messed with. If they are going to be apples to apples, they might as well put both IQ settings at max. A true baseline would be IQ maxed or IQ at minimum. Those ceilings and floors can't be passed, and therefore they are convenient baselines. I support this type of benching, but many sites don't use it. Still not a bad bench in the end. I just find Xbit to be a bit strange: nothing really new has happened in the GPU area, yet they feel the need to do a new bench :confused: ? I could see it if a brand new, hard-to-run game came out, or a brand new GPU, but to do a new bench just because it's the middle of summer? It just seems a bit weird.
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows:

ATI Catalyst:

Catalyst A.I.: Standard
Mipmap Detail Level: Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
Quality AF: Off
Other settings: default

Nvidia ForceWare:

Image Settings: Quality
Vertical sync: Off
Trilinear optimization: On
Anisotropic mip filter optimization: Off
Anisotropic sample optimization: On
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default

:roll:

Both were standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?


Because Xbit Labs caters to the noobs now!!

When VR-Zone did some decent benchmarking, the idiot editor didn't realise that he was using the High Quality setting :( ahh the irony
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: tuteja1986
Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows:

ATI Catalyst:

Catalyst A.I.: Standard
Mipmap Detail Level: Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
Quality AF: Off
Other settings: default

Nvidia ForceWare:

Image Settings: Quality
Vertical sync: Off
Trilinear optimization: On
Anisotropic mip filter optimization: Off
Anisotropic sample optimization: On
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default

:roll:

Both were standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?


Because Xbit Labs caters to the noobs now!!

When VR-Zone did some decent benchmarking, the idiot editor didn't realise that he was using the High Quality setting :( ahh the irony

Wait a sec... are you calling the editor an idiot, or Shamino? 'Cause I am pretty sure Shamino knew what he was doing; the guy is a world-class overclocker and is pretty far from what I would call an idiot. There was more wrong in that bench than image quality settings, and he knew it, which is why he pulled it down.
 

Praxis1452

Platinum Member
Jan 31, 2006
2,197
0
0
Actually, it mainly got pulled because of ATI. However, he believed there was something wrong too.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Both were standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?
In terms of image quality, ATi's default quality with optimizations on matches or exceeds nVidia's High Quality with optimizations off.
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: BFG10K
Both were standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?
In terms of image quality, ATi's default quality with optimizations on matches or exceeds nVidia's High Quality with optimizations off.

When you change optimizations to on for Nvidia, does it change IQ? I also thought that at default ATI had optimizations off?