Xbitlabs' intensive review of 7 games


tuteja1986

Diamond Member
Jun 1, 2005
Originally posted by: redbox
Originally posted by: tuteja1986
Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows:

ATI Catalyst:

Catalyst A.I.: Standard
Mipmap Detail Level: Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
Quality AF: Off
Other settings: default

Nvidia ForceWare:

Image Settings: Quality
Vertical sync: Off
Trilinear optimization: On
Anisotropic mip filter optimization: Off
Anisotropic sample optimization: On
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default

:roll:

Both were standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?


Because XBit Labs caters to the noobs now!!

When VR-Zone did decent benchmarking, the idiot editor didn't realise he was using the High Quality setting :( ah, the irony

Wait a sec... are you calling the editor an idiot, or Shamino? Cause I'm pretty sure Shamino knew what he was doing; the guy is a world-class overclocker and pretty far from what I would call an idiot. There was more wrong in that bench than image quality settings, and he knew it, which is why he pulled it down.


check the thread; the dude was confused why Nvidia got a low frame rate in a lot of benchmarks ;*(
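
(If anyone wants to re-run the comparison themselves, the two control-panel profiles from the review are easy to capture in a small harness. A minimal sketch in Python; apply_profile() is hypothetical and stands in for whatever actually writes the settings, with keys simply mirroring the list quoted above:)

    # Sketch of the two driver profiles from the review, assuming a
    # hypothetical harness; apply_profile() is a stand-in for whatever
    # mechanism (registry tweaks, a vendor tool) writes the settings.

    ATI_CATALYST = {
        "catalyst_ai": "Standard",
        "mipmap_detail_level": "Quality",
        "wait_for_vertical_refresh": "Always off",
        "adaptive_antialiasing": "Off",
        "temporal_antialiasing": "Off",
        "quality_af": "Off",
    }

    NVIDIA_FORCEWARE = {
        "image_settings": "Quality",
        "vertical_sync": "Off",
        "trilinear_optimization": "On",
        "anisotropic_mip_filter_optimization": "Off",
        "anisotropic_sample_optimization": "On",
        "gamma_correct_antialiasing": "On",
        "transparency_antialiasing": "Off",
    }

    def apply_profile(profile: dict[str, str]) -> None:
        """Stand-in: push each setting to the driver control panel."""
        for setting, value in profile.items():
            print(f"set {setting} = {value}")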
 

Wreckage

Banned
Jul 1, 2005
Originally posted by: BFG10K
Both were standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?
In terms of image quality, ATi's default quality with optimizations on matches or exceeds nVidia's High Quality with optimizations off.

Both ATI and NVIDIA have shimmering in certain games; HQ reduces this effect. ATI has yet to fix this problem.

Your blanket statement, which is clearly your opinion, does not change the numbers.
 

BFG10K

Lifer
Aug 14, 2000
When you change optimizations to on for Nvidia, does it change IQ?
Of course; that's what this whole issue is about. More importantly, nVidia's out-of-box settings already have those optimizations enabled, and hence most reviewers use the eye-gouging settings when they benchmark.

I also thought that at default ATI had optimizations off?
Yeah; what I'm saying is that even if you turn them on, ATi cards still look better than nVidia cards with optimizations off.

My 7900 GTX looks great under High Quality except with distant busy water shaders in games like Halo, Far Cry and Serious Sam 2, where the shimmering is very noticeable at long distances.

On my X800 XL, even when I turn on both Direct3D optimizations, the water looks perfect.
 

redbox

Golden Member
Nov 12, 2005
Originally posted by: tuteja1986
check the thread; the dude was confused why Nvidia got a low frame rate in a lot of benchmarks ;*(

I don't need to check the thread; I was involved in it anyway. Yeah, the guy was confused why Nvidia got low frames because he ran the test three times and still got results that were too far off to be accounted for by IQ differences. Even the ATI numbers were off. But I guess you're such a smart person, so you can tell us all why all the numbers were off, right? It wasn't a problem with the IQ settings; the whole bench was screwed.
 

redbox

Golden Member
Nov 12, 2005
Originally posted by: BFG10K
When you change optimizations to on for Nvidia, does it change IQ?
Of course; that's what this whole issue is about. More importantly, nVidia's out-of-box settings already have those optimizations enabled, and hence most reviewers use the eye-gouging settings when they benchmark.

So then I take it optimizations make IQ worse but improve performance?
 

BFG10K

Lifer
Aug 14, 2000
Both ATI and NVIDIA have shimmering in certain games; HQ reduces this effect. ATI has yet to fix this problem.
Again, ATi with optimizations on shimmers less than nVidia with High Quality + optimizations off. If High Quality is nVidia's "fix", then they're still inferior to ATi.

Your blanket statement, which is clearly your opinion, does not change the numbers.
It's not opinion, and it most certainly does change the numbers. That you choose to sweep the problem under the rug is your problem, not mine.

Let's try something simple like Pacific Fighters:

In previous tests, NVIDIA had a significant advantage over ATI in this game, but at the expense of quality. Indeed, NVIDIA uses a more aggressive optimisation for texture filtering. The higher the performance gain, the bigger the impact on graphic quality. This is easily confirmed by looking at the images displayed. We decided to test NVIDIA's cards in HQ mode just to be fair, and this brings graphic quality to a similar level as the Radeon's. NVIDIA's performance advantage disappears in this mode.
Now we can see nVidia's mythical performance advantage disappear as the X1900 XTX is on par with the 7900 GTX. The masses would have you believe everything from "inferior OpenGL" to "lack of vertex fetch" is holding back ATi's cards, but in reality the issue is simply nVidia's butt-ugly default settings.

Look at the rest of the benchmarks, like Quake 4: even with just the anisotropic filtering optimization turned off (the main one responsible for the shimmering on nVidia hardware) and nothing else, the X1900 XTX is faster in Quake 4.

Turn off all optimizations on the 7900 GTX and it'll get absolutely torched in what standard benchmarks would have you believe is an nVidia-dominated game.
 

schneiderguy

Lifer
Jun 26, 2006
Originally posted by: BFG10K
Both ATI and NVIDIA have shimmering in certain games; HQ reduces this effect. ATI has yet to fix this problem.
Again, ATi with optimizations on shimmers less than nVidia with High Quality + optimizations off. If High Quality is nVidia's "fix", then they're still inferior to ATi.

wait, so you're saying that ATI with all optimizations on (i.e. high performance mode) shimmers less than nvidia with HQ + no optimizations? :confused:


 

BFG10K

Lifer
Aug 14, 2000
There are two optimizations under Direct3D for ATi: anisotropic filtering and trilinear filtering.

If you enable both of these, you will still get better image quality than nVidia running under High Quality with optimizations off.
 

5150Joker

Diamond Member
Feb 6, 2002
Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows: [...]

Both were standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?



Xbit reviews are misleading, that's why. They always fail to use high quality settings in the CP, thus avoiding nVidia's Achilles heel.
 

5150Joker

Diamond Member
Feb 6, 2002
Originally posted by: BFG10K
Both were standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?
In terms of image quality, ATi's default quality with optimizations on matches or exceeds nVidia's High Quality with optimizations off.


QFT, and I can attest to this since I have both a 7900 Go GTX and an X1900 XTX. The ATi card with optimizations on looks just as good as (or, in some cases with HQ AF, better than) nVidia with HQ settings. On my 37" 1080p display, nVidia's poor default quality is truly an eyesore.
 

Cookie Monster

Diamond Member
May 7, 2005
Originally posted by: 5150Joker
Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows: [...]

Both were standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?


Xbit reviews are misleading, that's why. They always fail to use high quality settings in the CP, thus avoiding nVidia's Achilles heel.


So that makes almost all hardware sites misleading, since they use quality settings, like AT? :confused: Even FiringSquad, for example.
 

imported_Crusader

Senior member
Feb 12, 2006
Originally posted by: schneiderguy
As for single-chip, single-card solutions, the GeForce 7900 GTX is in the lead, but not by much. The 24 TMUs help this card feel confident at high resolutions with FSAA enabled. This is also the case when the game contains a lot of pixel shaders with multiple texture lookups or just a lot of high-resolution textures. On the other hand, the Radeon X1900 XTX, though having a somewhat lower average performance in comparison with the GeForce 7900 GTX, often surpasses the latter in minimum speed thanks to its ability to process more pixel shaders simultaneously. Thus, it provides a bigger speed reserve in games that make wide use of visual effects created by means of mathematics-heavy shaders. So, your choice will probably depend on what particular games you are going to play.

:thumbsup:

 

schneiderguy

Lifer
Jun 26, 2006
Originally posted by: 5150Joker
Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows: [...]

Both were standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?


Xbit reviews are misleading, that's why. They always fail to use high quality settings in the CP, thus avoiding nVidia's Achilles heel.

so by your line of thinking, every review on the internet is misleading because they don't all test a game like, say, Pacific Fighters, thus avoiding ATI's Achilles heel? okay :confused:

 

BFG10K

Lifer
Aug 14, 2000
So that makes almost all hardware sites misleading, since they use quality settings, like AT?
I wouldn't call them misleading since they usually don't know any better (indeed, nVidia's benchmarking guide insists they use Quality and claims there is no need for High Quality).

so by your line of thinking, every review on the internet is misleading because they don't all test a game like, say, Pacific Fighters, thus avoiding ATI's Achilles heel
What Achilles heel?

ATi doesn't have an Achilles heel with that game; nVidia does when running under High Quality mode. They have it in many other games too.
 

josh6079

Diamond Member
Mar 17, 2006
so by your line of thinking, every review on the internet is misleading because they don't all test a game like, say, Pacific Fighters, thus avoiding ATI's Achilles heel? okay :confused:

The Achilles heel Joker was mentioning wasn't the kinds of games that are benched, but rather the driver settings used when benching them. He wasn't talking about ATI's Achilles heel (the only mortal tendons they have are power consumption and marketing) but Nvidia's. Nvidia's performance is flattered more than ATI's until the proper optimizations are set to make the image quality equal.

So that makes almost all hardware sites misleading, since they use quality settings, like AT? :confused: Even FiringSquad, for example.
Yes. It should be well known that Nvidia uses inferior image quality to achieve its performance crown in the default "plug'n'play" mode that sites amazingly continue to bench. That's why Nvidia has conjured up ways to put so many GPUs in one system: to accommodate the performance hit it takes when it finally reaches a level of image quality comparable to ATI's.
 

redbox

Golden Member
Nov 12, 2005
So is every review now going to turn into an Nvidia vs. ATI IQ settings war? I doubt the review sites are going to change because we want them to. Sites like AnandTech have to listen to the board manufacturers, otherwise they don't get nice hardware to bench for us. I don't think Nvidia would allow them to do it.

Regardless of how sites bench, do we all agree that ATI has better IQ, and that Nvidia would have to change its IQ settings at the cost of performance to match?
 

ronnn

Diamond Member
May 22, 2003
Originally posted by: redbox
So is every review now going to turn into an Nvidia vs. ATI IQ settings war?

Yep, we are back in one of these cycles. When behind, ATi competes with botched production, and when behind, Nvidia competes with lower IQ. It's always entertaining, at least. :beer:

 

Cookie Monster

Diamond Member
May 7, 2005
G80, where art thou?

Xbitlabs should try to add minimum fps for all games. That would be interesting.
 

josh6079

Diamond Member
Mar 17, 2006
Originally posted by: Cookie Monster
G80, where art thou?
Right by the R600: December.
Xbitlabs should try to add minimum fps for all games. That would be interesting.
What's interesting about it? Didn't you get the memo?
On the other hand, the Radeon X1900 XTX, though having a somewhat lower average performance in comparison with the GeForce 7900 GTX, often surpasses the latter in minimum speed thanks to its ability to process more pixel shaders simultaneously. Thus, it provides a bigger speed reserve in games that make wide use of visual effects created by means of mathematics-heavy shaders.
ATI's minimum frame rates are normally better than Nvidia's.
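
(For what it's worth, minimum fps falls straight out of the per-frame times a benchmark run already records; the slowest single frame sets the minimum. A minimal sketch in Python, assuming frame times in milliseconds from a capture tool such as FRAPS; the sample numbers are made up:)

    def fps_stats(frame_times_ms: list[float]) -> tuple[float, float]:
        # Average fps: frames rendered divided by total wall time.
        total_s = sum(frame_times_ms) / 1000.0
        avg_fps = len(frame_times_ms) / total_s
        # Minimum fps: set by the single slowest frame.
        min_fps = 1000.0 / max(frame_times_ms)
        return avg_fps, min_fps

    # Four smooth frames plus one 45 ms stall: the average stays high
    # while the minimum collapses, which is the point of the quote above.
    avg, minimum = fps_stats([16.7, 16.9, 45.2, 17.0, 16.8])
    print(f"avg {avg:.1f} fps, min {minimum:.1f} fps")  # avg 44.4, min 22.1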
 

BFG10K

Lifer
Aug 14, 2000
Now that I've knocked nVidia's shimmering, I feel it's only fair to knock ATi's driver quality, which has been on a steady downward trend.

The latest 6.8 drivers are absolutely heinous on my X800 XL.
 

josh6079

Diamond Member
Mar 17, 2006
Does older hardware like that benefit from the newest drivers? I thought that for older cards there comes a point where newer drivers only benefit newer hardware?

I see your point, though. The 6.7s just idiot-proofed Crossfire but didn't do anything good performance-wise. If the 6.8s don't do anything better than 6.6 or 6.7 combined, then I'd just stick with 6.6.
 

BFG10K

Lifer
Aug 14, 2000
Does older hardware like that benefit from the newest drivers?
In my case, "hell no". Like I said, the drivers are getting progressively worse with each iteration.
 

CaiNaM

Diamond Member
Oct 26, 2000
Originally posted by: josh6079
True, I like their range of games as well, but I don't see why they use those default settings when someone would have a top-of-the-line Crossfire or SLI setup. Who honestly has two 7900 GTXs in SLI and uses the default "Quality" setting in the driver?

to satisfy nvidia fans by giving nvidia hardware an advantage? it's well known nv drivers have more opts on in "default" mode than ati. the hit nv takes when using high quality mode does not put them in a favorable light.

at the least, they should compare performance, default, and high quality modes to give readers a better overall picture of how everything stacks up.
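
(A minimal sketch of that three-mode sweep, in Python; run_benchmark() is hypothetical and stands in for launching a timedemo under each driver mode and parsing its average fps:)

    MODES = ("High Performance", "Quality (default)", "High Quality")
    CARDS = ("Radeon X1900 XTX", "GeForce 7900 GTX")

    def run_benchmark(card: str, mode: str) -> float:
        """Stand-in for running a timedemo with the given card and
        driver mode and returning its average fps."""
        return 0.0  # placeholder result

    # One row per card/mode pair, so the cost of each quality tier
    # is visible side by side.
    for card in CARDS:
        for mode in MODES:
            fps = run_benchmark(card, mode)
            print(f"{card:18s} {mode:18s} {fps:6.1f} fps")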