Xbitlabs' intensive review of 7 games


josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: BFG10K
Does older hardware like that benefit from the newest of new drivers?
In my case, "hell no." Like I said, the drivers are getting progressively worse with each iteration.

What do you think the best ones are?
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: BFG10K
Now that I've knocked nVidia's shimmering I feel it only fair to knock ATi's driver quality which has been on a steady downward trend.

The latest 6.8 drivers are absolutely heinous on my X800 XL.


ATi's drivers have been crappy for me since 6.2
 

Dainas

Senior member
Aug 5, 2005
299
0
0
For some reason Oblivion is a game with so much going on that you cannot see all the little annoyances that slap you in the face in games like UT2004 and Half-Life 2. I guess shimmer is hard to track on cobblestone and grass; I play at 1920x1440 with aniso at only 8x and it still looks absolutely gorgeous, while those settings make me puke in any other game.

Not to defend Xbit's choice of settings, but they usually do a pretty damn good job as review sites go.
 

LittleNemoNES

Diamond Member
Oct 7, 2005
4,142
0
0
There's no way in hell I'd use Quality over HQ. I tried it yesterday to see the difference with 6.8, and though it wasn't noticeable @ first, as I was running around a path you could begin to see the texture move slightly. 8x HQAF at High Quality looks better to me than 16x HQAF at Quality.

After using my x1900 cards, I can't go back to nvidia @ the moment. My other PC has a 7900 GT in it (stepped up from a 7800 GT -- on the last day! :Q) and I'm surprised it puts up a fight. However, even with all quality options set to high (HQ, all optimizations off, LOD clamp, forced trilinear, etc.), the ground looked blurry from every angle I looked @ it... yuck.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: BFG10K
Now that I've knocked nVidia's shimmering I feel it only fair to knock ATi's driver quality which has been on a steady downward trend.

The latest 6.8 drivers are absolutely heinous on my X800 XL.

Really? I never notice much difference between drivers on a x1900gt or a x800xtpe (agp). Sort of feel like they have steadily improved. I just install them and they seem to work, but I usually have only 1 or 2 games going at a time, so I probably miss all these problems.
 

Todd33

Diamond Member
Oct 16, 2003
7,842
2
81
What freq is the 7900GT? It's in both the mainstream and high-end lineups -- same card? I think the brands can range from around 450MHz to around 580MHz.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: redbox
Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows:

ATI Catalyst:

Catalyst A.I.: Standard
Mipmap Detail Level: Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
Quality AF: Off
Other settings: default

Nvidia ForceWare:

Image Settings: Quality
Vertical sync: Off
Trilinear optimization: On
Anisotropic mip filter optimization: Off
Anisotropic sample optimization: On
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default

:roll:

Both were standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?

Like it takes some stretch of the mind to figure out why Xbit never changes IQ levels. Oblivion has a lot of different areas that produce a lot of different fps numbers, but most of them stayed the same between the Xbit and AnandTech benches, except the 7900GTX: it got 29fps on AnandTech and 43.3fps on Xbit at the same resolution and settings. Others increased too.
The x1900xtx went from 32.6fps on AnandTech to 42.2fps on Xbit.

Crossfire and SLI got a boost too.
Crossfire: AnandTech 46.1fps, Xbit 56.3fps
SLI: AnandTech 43.5fps, Xbit 56.2fps

So:
x1900xtx difference: 9.6fps
7900GTX difference: 14.3fps

Xfire difference: 10.2fps
SLI difference: 12.7fps

I know it is a little hard to compare performance results between two benches; I just wanted to see what the breakdown was and what performed differently on Xbit's bench such that Nvidia came out better in Oblivion. One thing I thought was strange: they used Nvidia's 91.31 drivers but didn't use ATI's 6.8.
The AnandTech bench used the 6.4s with the Chuck patch and Nvidia's 84.43, so that bench is out of date on both sides. But I just wonder why Xbit chose not to test with ATI's newest drivers if they were going to test with Nvidia's?
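
(The deltas above are easy to double-check; here's a quick, throwaway Python sketch, with the fps numbers copied straight from the two reviews as quoted, that reproduces them.)

```python
# Oblivion fps as reported by each site (same resolution and settings,
# numbers taken from the posts quoted above).
anandtech = {"X1900 XTX": 32.6, "7900 GTX": 29.0, "CrossFire": 46.1, "SLI": 43.5}
xbit      = {"X1900 XTX": 42.2, "7900 GTX": 43.3, "CrossFire": 56.3, "SLI": 56.2}

for card in anandtech:
    delta = xbit[card] - anandtech[card]
    print(f"{card}: +{delta:.1f} fps on Xbit")
# X1900 XTX: +9.6, 7900 GTX: +14.3, CrossFire: +10.2, SLI: +12.7
```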


Xbitlabs uses a 2-3 month cycle for changing driver sets; it just so happens that both of these driver sets are from June.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: schneiderguy
Originally posted by: 5150Joker
Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows:

ATI Catalyst:

Catalyst A.I.: Standard
Mipmap Detail Level: Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
Quality AF: Off
Other settings: default

Nvidia ForceWare:

Image Settings: Quality
Vertical sync: Off
Trilinear optimization: On
Anisotropic mip filter optimization: Off
Anisotropic sample optimization: On
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default

:roll:

Both were standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?



Xbit reviews are misleading, that's why. They always fail to use high quality settings in the CP, thus avoiding nVidia's Achilles heel.

so by your line of thinking, every review on the internet is misleading because they don't all test a game like, say, Pacific Fighters, thus avoiding ATI's Achilles heel? okay :confused:
Only reviews that don't show ATI as the best are misleading... of course :roll:
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: Todd33
What freq is the 7900GT? It's in both the mainstream and high-end lineups -- same card? I think the brands can range from around 450MHz to around 580MHz.

Xbitlabs doesn't use Nvidia's factory-overclocked cards in these lineups, to my knowledge, so the clock rates will adhere to Nvidia's reference designs.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: BFG10K
so by your line of thinking, every review on the internet is misleading because they don't all test a game like, say, Pacific Fighters, thus avoiding ATI's Achilles heel
What Achilles heel?

ATi doesn't have an Achilles heel with that game; nVidia does when running under High Quality mode. They also have it in many other games.

ATI cannot run Pacific Fighters in SM3.0 mode because ATI cards lack a key SM3.0 feature called vertex texture fetch (VTF). Since ATI did not fully support the SM3.0 standard, Pacific Fighters has to be run at a lower quality than on NVIDIA cards.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
this review is particularly bad :p

did you like the low-end cards compared at 19x12?

that was a complete waste of testing time and posting results . . . NONE of them are satisfactory

it was not 'practical' . . . unlike AT's reviews.
Originally posted by: Wreckage
Originally posted by: BFG10K
so by your line of thinking, every review on the internet is misleading because they don't all test a game like, say, Pacific Fighters, thus avoiding ATI's Achilles heel
What Achilles heel?

ATi doesn't have an Achilles heel with that game; nVidia does when running under High Quality mode. They also have it in many other games.

ATI cannot run Pacific Fighters in SM3.0 mode because ATI cards lack a key SM3.0 feature called vertex texture fetch (VTF). Since ATI did not fully support the SM3.0 standard, Pacific Fighters has to be run at a lower quality than on NVIDIA cards.
they don't look ANY different . . . what a joke! :p
:thumbsdown:

nvidia's gfx are NOT 'better' using SM3.0 in PF . . . and ATi has a 'legal workaround' for VTF
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
Originally posted by: Wreckage
Originally posted by: apoppin

they don't look ANY different . . . what a joke! :p
:thumbsdown:

nvidia's gfx are NOT 'better' using SM3.0 in PF . . . and ATi has a 'legal workaround' for VTF

http://www.4gamer.net/specials/2005-2006_nvidia/img/05.jpg

Only took me 5 seconds to find that and prove you wrong.

Well, that is with VTF on and off. If they used the alternative route or added compatibility for ATI's R2VB, it would look the same. Remember, Nvidia's VTF is limited to point sampling of only two texture formats. With ATI's Render to Vertex Buffer you can anisotropically or trilinearly filter any texture format that ATI supports.

So, all in all, granted it is not true VTF, but true VTF is broken; that is why no one uses it.
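
If it helps, the point-sampling limitation is easy to picture with a toy example. This is just an illustrative numpy sketch (the heightmap values and sample point are made up, and it's CPU code, not real shader code) of why a filtered fetch gives smoother displacement values than a point-sampled one:

```python
import numpy as np

# Toy 2x2 heightmap standing in for a displacement texture (values made up).
heightmap = np.array([[0.0, 1.0],
                      [1.0, 0.0]])

def point_sample(tex, u, v):
    # VTF-style fetch on NV4x/G7x hardware: nearest texel, no filtering.
    h, w = tex.shape
    return tex[int(v * (h - 1) + 0.5), int(u * (w - 1) + 0.5)]

def bilinear_sample(tex, u, v):
    # Filtered fetch, the kind of sampling the R2VB path gets from the
    # pixel shader's texturing hardware.
    h, w = tex.shape
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = tex[y0, x0] * (1 - fx) + tex[y0, x1] * fx
    bot = tex[y1, x0] * (1 - fx) + tex[y1, x1] * fx
    return top * (1 - fy) + bot * fy

# Sampling halfway between texels: point sampling snaps to one texel,
# filtering blends the neighbours.
print(point_sample(heightmap, 0.5, 0.5))     # 0.0 (snaps to a corner texel)
print(bilinear_sample(heightmap, 0.5, 0.5))  # 0.5 (smooth average)
```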
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: Wreckage
Originally posted by: apoppin

they don't look ANY different . . . what a joke! :p
:thumbsdown:

nvidia's gfx are NOT 'better' using SM3.0 in PF . . . and ATi has a 'legal workaround' for VTF

http://www.4gamer.net/specials/2005-2006_nvidia/img/05.jpg

Only took me 5 seconds to find that and prove you wrong.

Isn't VTF only an issue in Pacific Fighters? If so, and you are some big-time Pacific Fighters whore, then get Nvidia and don't worry. Otherwise it's a non-issue. I do think ATI should have included VTF, just as I am disappointed in Nvidia for not including HDR+AA. But for my purposes I will use HDR+AA much more than VTF, not to mention HQAF. So the choice is easy for me. Just like Xbit said, the card you pick kind of depends on what games you play.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Wreckage
Originally posted by: apoppin

they don't look ANY different . . . what a joke! :p
:thumbsdown:

nvidia's gfx are NOT 'better' using SM3.0 in PF . . . and ATi has a 'legal workaround' for VTF

http://www.4gamer.net/specials/2005-2006_nvidia/img/05.jpg

Only took me 5 seconds to find that and prove you wrong.

take five minutes . . . or five hours . . . you ain't proved anything . . . that was debunked here a long time ago.

Rollo was 'in' on it and he bought PF to prove it looked 'better' on nvidia HW . . . well, it doesn't, and he refused to let anyone test it . . . and i STILL have the game if anyone wants to prove it for themselves.

Unless i am mistaken, ONLY PF uses nvidia's VTF :p
and they purposely 'broke it' for ATi . . . btw, later ATi drivers completely closed the performance gap in this game.

. . . so who cares? you like PF? . . . i will sell you mine . . . cheap
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Don't know why we bother. If he loses an argument he just fades to black for a day. I remember him bashing Crossfire and I provided my results -- he didn't even address them.
What would there have been to address? Would he say anything that would really surprise you?
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
I found this after sifting through the link that apoppin posted. It was posted by apoppin in that thread, and it sums up the VTF thing a little better than a link to a whole thread that we all have to sift through.

ATI's VTF workaround

quoted from that link:
Anyway, Richard Huddy confirmed that the X1000 generation of R520-based cards, and the derived R530 and RV515, cannot directly support vertex texture fetch (VTF). ATI decided to treat it as an optional feature of Shader Model 3.0. I guess ATI would support it if it could, but it found another way around it.

ATI supports a feature called Render to Vertex Buffer (R2VB), and it allows developers to do anything that they would want from VTF. "In its simplest form you can see that R2VB is capable of everything that VTF can do because you could trivially simply copy the vertex texture data into the vertex stream using R2VB," he added.

He believes that this is better than just doing vertex texture fetch. Render to Vertex Buffer is built on the pixel shader, and therefore you have all of the shader's texturing capabilities.

When Nvidia does vertex texture fetch, it is limited to point sampling of only two texture formats. If you use ATI's Render to Vertex Buffer, you can anisotropically or trilinearly filter any texture format that ATI supports. ATI believes that Render to Vertex Buffer is simply a more flexible and better way to solve the same problem, and more besides.

So ATI can do VTF; it just does it a different way. Furthermore, no one uses VTF except Pacific Fighters, and even Rollo said it is a dull game. The point is that VTF is far from an Achilles heel, and no one benches Pacific Fighters because few people play it. Nvidia's default IQ settings, however, affect just about everyone who plays games on a PC. It would make much more sense for review sites to bench both cards at the same IQ levels.
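
Huddy's "trivially copy the vertex texture data into the vertex stream" comment is the key idea. Here's a toy Python/numpy sketch (the texture, UVs, and function names are all invented for illustration; real R2VB is a Direct3D render pass, not CPU code) of why the two paths end up feeding the vertex shader the same data:

```python
import numpy as np

# A 1-D "displacement texture" and one texture coordinate per vertex
# (all values invented for the example).
tex = np.array([0.1, 0.4, 0.9, 0.2])
vertex_uv = np.array([0.0, 0.33, 0.66, 1.0])

def vtf_displace(uvs):
    # VTF path: the vertex shader fetches the texture itself, per vertex
    # (point-sampled, per the hardware limits discussed above).
    idx = np.rint(uvs * (len(tex) - 1)).astype(int)
    return tex[idx]

def r2vb_displace(uvs):
    # R2VB path: a pixel-shader pass writes one "pixel" per vertex into a
    # render target, which is then re-bound as an extra vertex stream.
    render_target = np.empty(len(uvs))
    for i, uv in enumerate(uvs):
        idx = int(round(uv * (len(tex) - 1)))
        render_target[i] = tex[idx]   # could just as well be a filtered fetch
    return render_target

# Both paths deliver identical per-vertex displacement data.
print(np.allclose(vtf_displace(vertex_uv), r2vb_displace(vertex_uv)))  # True
```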
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: redbox
Nvidia's default IQ settings, however, affect just about everyone who plays games on a PC. It would make much more sense for review sites to bench both cards at the same IQ levels.

just take 5% off of any nvidia card's score. problem solved
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: redbox
even Rollo said it is a dull game.

Way off topic, but I wish you guys would quit quoting a known liar and *#%^ disturber to prove a point. It just brings back negative memories and proves nothing (except that you miss him and hope for a speedy return).

Back on topic: this generation of vid cards is sure playing all my games nicely - time for some new games (DX10) to separate the wheat from the chaff... or a huge monitor - which I don't want, as I like a smaller footprint.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: schneiderguy
Originally posted by: redbox
Nvidia's default IQ settings, however, affect just about everyone who plays games on a PC. It would make much more sense for review sites to bench both cards at the same IQ levels.

just take 5% off of any nvidia card's score. problem solved

Well, that may work for some sites, but it is not exactly scientific. :music:
 

anandtechrocks

Senior member
Dec 7, 2004
760
0
76
Originally posted by: 5150Joker
Originally posted by: BFG10K
Now that I've knocked nVidia's shimmering I feel it only fair to knock ATi's driver quality which has been on a steady downward trend.

The latest 6.8 drivers are absolutely heinous on my X800 XL.


ATi's drivers have been crappy for me since 6.2

For me too. Did you try ATI Tool with the 6.8 Cats yet?