Xbitlabs' intensive review of 7 games


redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: Wreckage
Originally posted by: Ackmed
This is the first review I have ever seen that shows NV faster in Oblivion. That is enough for me not to take it seriously.

Translation: Any review that does not make ATI look good is invalid to Ackmed.

No... he also said that if a review showed ATI doing better than Nvidia in Quake 4, he would view it as off as well.

Their "work around" has yet to be proven to even work. MS never officially stated that ATI fully supports SM3.0.

How do you want us to prove that it works? The only game that uses it codes it exclusively for Nvidia. Even if R2VB works on ATI, PF's VTF path uses NV-specific OpenGL commands.

How would ATI get SM3.0 stamped on their cards if MS never OK'd it?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Josh . . . you are . . . wasting your time.

i am 100% sure he does not read anything but SNIPPETS of our posts . . . he only looks for and actually only SEES out-of-context bits of trivia that he can SEIZE and attack - Rollo pioneered this troll tactic here and we have at least a journeyman exploiting it again.

Clearly Rollo was FORCED to back down . . . on his Precious VTF . . . the one i linked to was his second troll thread on PF.

IF you REMEMBER, he offered to SELL his game when he couldn't figure out how to bench it . . . and then pretended to be p'o'd at me when i called him on something and took his OFFER BACK . . . so it could NOT be benched
:p

i then made ANOTHER thread that called him out on it and actually purchased PF for review by our members . . . he backed down again.

ATi has VTF . . . just not PF's broken version of it.
. . . and later Cats really sped up performance. ;)

 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: BFG10K
just take 5% off of any nvidia card's score. problem solved
:roll:

The 5% number is from my own testing.
Then your testing is obviously wrong.

Q mode in drivers, all optimizations OFF: 57.34 fps
Like I said, your testing is wrong. nVidia doesn't ship their cards with all optimizations off.

:roll:

So any tests that DON'T agree with yours are wrong? What if your testing is obviously wrong? Hmmm?? You always claim the performance hit from going from Q to HQ is 10 to 15 percent; that's what I tested :roll:
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows:

ATI Catalyst:

Catalyst A.I.: Standard
Mipmap Detail Level: Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
Quality AF: Off
Other settings: default

Nvidia ForceWare:

Image Settings: Quality
Vertical sync: Off
Trilinear optimization: On
Anisotropic mip filter optimization: Off
Anisotropic sample optimization: On
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default

:roll:

Both were standard settings, which is horrible for a review of these kinds of high-end systems. Why didn't they put both on "High Quality"?


Probably because ATI doesn't gain much from lowering settings (a stupid thing to do anyway), but Nvidia loses a lot from raising them. I don't like Xbit personally.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: josh6079
I bet all those people who bought SLI Dells left the settings at default. :p

QFT.

Well, like I said, it gives a good baseline. I bet nobody has their setup the same. This is the best way to test both cards apples to apples.

Wouldn't setting both to "High Quality" be an apples-to-apples comparison? I know keeping everything at default is a good baseline, but Xbit is a site for enthusiasts who research particular pieces of hardware, not average Joes with SLI Dell systems that came in a box.

They didn't keep everything at default; they LOWERED ATI's settings. A stupid thing to do for a high-end card.

 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: schneiderguy
Originally posted by: BFG10K
just take 5% off of any nvidia card's score. problem solved
:roll:

The 5% number is from my own testing.
Then your testing is obviously wrong.

Q mode in drivers, all optimizations OFF: 57.34 fps
Like I said, your testing is wrong. nVidia doesn't ship their cards with all optimizations off.

:roll:

So any tests that DON'T agree with yours are wrong? What if your testing is obviously wrong? Hmmm?? You always claim the performance hit from going from Q to HQ is 10 to 15 percent; that's what I tested :roll:

Because at least he has a site that backs up his claimed drop percentage: www.legitreviews.com

The only site I know of that always uses HQ with Nvidia cards... Unfortunately, since it's the only one, Nvidia fans will of course say it's ATI-biased and we should just ignore it, but I don't see a reason to.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Another quick test, this time going with the default settings for Quality (optimizations on):

A64 3500+
7600GT
1.5GB RAM

CS:Source VST
1280x1024, everything on the highest setting except vsync (off)
8xS AA, 16x AF, TRSS

Run #1: HQ mode set in drivers, all optimizations OFF --- 50.4 FPS (screenshot)

Run #2: Q mode set in drivers, "Anisotropic Mip Optimization" and "Anisotropic Sample Optimization" both set to "on" --- 51.19 FPS (screenshot)

Difference between HQ and Q with optimizations on? Less than 2%
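
As a quick sanity check on that figure, here is a minimal Python sketch of the percentage math, using only the two FPS numbers quoted above (the script is an illustration, not part of the original test):

```python
# Percentage difference between the two quoted runs.
hq_fps = 50.40  # Run #1: HQ mode, all optimizations off
q_fps = 51.19   # Run #2: Q mode, both AF optimizations on

# How much slower HQ is, relative to the faster Q run
drop_pct = (q_fps - hq_fps) / q_fps * 100
print(f"HQ is {drop_pct:.2f}% slower than Q")  # ~1.54%, i.e. under 2%
```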

 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: schneiderguy
Another quick test, this time going with the default settings for Quality (optimizations on):

A64 3500+
7600GT
1.5GB RAM

CS:Source VST
1280x1024, everything on the highest setting except vsync (off)
8xS AA, 16x AF, TRSS

Run #1: HQ mode set in drivers, all optimizations OFF --- 50.4 FPS (screenshot)

Run #2: Q mode set in drivers, "Anisotropic Mip Optimization" and "Anisotropic Sample Optimization" both set to "on" --- 51.19 FPS (screenshot)

Difference between HQ and Q with optimizations on? Less than 2%


Maybe it depends on what CPU and RAM you have, etc... some systems will have a big drop and others won't. Since no one tests this, we will never know... unless Nvidia decides to enable HQ by default from the G80 on.
 

CKXP

Senior member
Nov 20, 2005
926
0
0
Originally posted by: schneiderguy
Another quick test, this time going with the default settings for Quality (optimizations on):

A64 3500+
7600GT
1.5GB RAM

CS:Source VST
1280x1024, everything on the highest setting except vsync (off)
8xS AA, 16x AF, TRSS

Run #1: HQ mode set in drivers, all optimizations OFF --- 50.4 FPS (screenshot)

Run #2: Q mode set in drivers, "Anisotropic Mip Optimization" and "Anisotropic Sample Optimization" both set to "on" --- 51.19 FPS (screenshot)

Difference between HQ and Q with optimizations on? Less than 2%

Do you have RivaTuner installed? If so, try checking "Driver settings" -> "Direct3D tweaks" -> the "Intellisample" tab... sometimes, even if you switch from "Q" to "HQ" mode in the NV control panel, the optimizations will still show as enabled in RivaTuner until you disable them there.
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Originally posted by: Wreckage
Originally posted by: Ackmed
This is the first review I have ever seen that shows NV faster in Oblivion. That is enough for me not to take it seriously.

Translation: Any review that does not make ATI look good is invalid to Ackmed.

Funny how you forgot to mention the second part of that post. I wouldn't believe it either if a review showed ATi having the lead in Quake 4. Would you believe something like that? It's just not how it is. No other review agrees or has NV leading ATi in Oblivion in average frames at those settings.

Originally posted by: Wreckage
Originally posted by: apoppin
and NOTHING can be proved - to you . . . you post and then shut your eyes and mind to all reasonable replies and *proof*

No reasonable response or proof has been displayed.

Just denial and complaining.

You've been proven wrong before, and you just ignore it and hide. Example: http://forums.anandtech.com/messageview...adid=1909669&STARTPAGE=2&enterthread=y

 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: CKXP
Originally posted by: schneiderguy
Another quick test, this time going with the default settings for Quality (optimizations on):

A64 3500+
7600GT
1.5GB RAM

CS:Source VST
1280x1024, everything on the highest setting except vsync (off)
8xS AA, 16x AF, TRSS

Run #1: HQ mode set in drivers, all optimizations OFF --- 50.4 FPS (screenshot)

Run #2: Q mode set in drivers, "Anisotropic Mip Optimization" and "Anisotropic Sample Optimization" both set to "on" --- 51.19 FPS (screenshot)

Difference between HQ and Q with optimizations on? Less than 2%

Do you have RivaTuner installed? If so, try checking "Driver settings" -> "Direct3D tweaks" -> the "Intellisample" tab... sometimes, even if you switch from "Q" to "HQ" mode in the NV control panel, the optimizations will still show as enabled in RivaTuner until you disable them there.

No, I don't have RivaTuner installed.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: ShadowOfMyself
Originally posted by: schneiderguy
Another quick test, this time going with the default settings for Quality (optimizations on):

A64 3500+
7600GT
1.5GB RAM

CS:Source VST
1280x1024, everything on the highest setting except vsync (off)
8xS AA, 16x AF, TRSS

Run #1: HQ mode set in drivers, all optimizations OFF --- 50.4 FPS (screenshot)

Run #2: Q mode set in drivers, "Anisotropic Mip Optimization" and "Anisotropic Sample Optimization" both set to "on" --- 51.19 FPS (screenshot)

Difference between HQ and Q with optimizations on? Less than 2%


Maybe it depends on what CPU and RAM you have, etc... some systems will have a big drop and others won't. Since no one tests this, we will never know... unless Nvidia decides to enable HQ by default from the G80 on.

Exactly. The performance hit isn't ALWAYS 10-15% like BFG10K claims. I'm not saying it isn't that high sometimes, but most of the time, it isn't that high.

Also, Nvidia could have made some improvements in their drivers since BFG did his tests. I'm using the 91.31s; not sure what he used.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: josh6079
You're his equal in that slant of light. Do you hear the words you speak? Or are you spouting at the mouth and remaining oblivious to what your lips leak out?
Where were you the last 2 weeks?


PF's VTF has nothing to do with X-bit's benchmarks.
Once again, for you and yours, since you refuse to listen... My comments regarding VTF had nothing to do with the benchmarks; they were a response to a question about ATI's problem running PF.

They cover a wider range of games than some other benches, but they overlooked (or left out) key driver settings that would level the comparison. I don't want to see ATI win all the benchmarks; I want to see both sides compared equally, using the same settings.

Both cards were set to near-identical settings. That's as fair a benchmark as you are going to get. Just because it does not fit your view of the world does not make it wrong. 99.9% of people who don't see red when these benchmarks come out agree that Xbit is one of the best review sites.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Where were you the last 2 weeks?
Same place I'm at now. I don't see how that relates to your similarity to Ackmed.
Once again for you and yours since you refuse to listen.....
This is ironic. I'm the one refusing to listen, yet I have listened to (and seen firsthand) what both sides have to offer for the generation of games we are discussing. Have you?
My comments regarding VTF had nothing to do with the benchmarks...
Exactly. Your ignorant fits about the VTF issue are a textbook example of trolling.
...but were a response to a question about ATI's problem running PF.
You respond to other questions but not to evidence that has been laid in front of you?
Both cards were set to near identical settings. That's as a fair a benchmark as you are going to get.
Incorrect. If they had maxed everything out on both sides, what would they have missed? Even if they had turned everything off, they would have been more accurate. Floating around in the clouds with close-to-similar settings is not what I would call accurate. As I said, next to no one is going to have X1900 Crossfire or 7900GTX SLI and not run HQ.
Just because it does not fit your view of the world does not make it wrong. 99.9% of people who don't see red when these benchmarks come out agree that xbit is one of the best sites for reviews.
Indeed, it is very hard to see red in their benchmarks; they're too full of green. ;)

Seriously, do you think that Doom 3 would be in ATI's favor? Or do you now admit that ATI's OpenGL performance has surpassed Nvidia's?
 

Lucu

Member
Apr 26, 2005
25
0
0
Can anyone tell me how much better a 7600GT is compared to a 6600GT?

I'm thinking about upgrading, but I won't do it unless I can get double the performance of my 6600GT for 150 euros.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: Lucu
Can anyone tell me how much better a 7600GT is compared to a 6600GT?

I'm thinking about upgrading, but I won't do it unless I can get double the performance of my 6600GT for 150 euros.

It's about double, yes, probably more... go for it.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Lucu
Can anyone tell me how much better a 7600GT is compared to a 6600GT?

I'm thinking about upgrading, but I won't do it unless I can get double the performance of my 6600GT for 150 euros.
That's a good upgrade. The 7600GT outperforms the 6600GT by a substantial margin. You'll like it.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: Frackal
Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows:

ATI Catalyst:

Catalyst A.I.: Standard
Mipmap Detail Level: Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
Quality AF: Off
Other settings: default

Nvidia ForceWare:

Image Settings: Quality
Vertical sync: Off
Trilinear optimization: On
Anisotropic mip filter optimization: Off
Anisotropic sample optimization: On
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default

:roll:

Both were standard settings, which is horrible for a review of these kinds of high-end systems. Why didn't they put both on "High Quality"?


Probably because ATI doesn't gain much from lowering settings (a stupid thing to do anyway), but Nvidia loses a lot from raising them. I don't like Xbit personally.


I tested this: I reduced the ATI driver settings on my X1900XT to the Performance option, ran a bench, then moved them up to the highest quality and ran a bench. In FEAR and Doom 3 I gained less than 1 fps when reducing the quality.

My other card is a 7800GT; I enabled HQ and benched, then changed it to Quality. There was a difference of about 5-10 fps in FEAR and Doom 3. There is a margin of error and no two runs gave the same score, but this is an average of what I saw.

I cannot comment on the IQ of either because I always run HQ, and during a benchmark I don't notice things in the scenery that look off.

This could be related to the system used. My Nvidia card is running with an AMD64 3000+ at 2.2GHz, while my X1900XT is running under an X2 3800+ at 2.6GHz.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: cmdrdredd
Originally posted by: Frackal
Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows:

ATI Catalyst:

Catalyst A.I.: Standard
Mipmap Detail Level: Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
Quality AF: Off
Other settings: default

Nvidia ForceWare:

Image Settings: Quality
Vertical sync: Off
Trilinear optimization: On
Anisotropic mip filter optimization: Off
Anisotropic sample optimization: On
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default

:roll:

Both were standard settings, which is horrible for a review of these kinds of high-end systems. Why didn't they put both on "High Quality"?


Probably because ATI doesn't gain much from lowering settings (a stupid thing to do anyway), but Nvidia loses a lot from raising them. I don't like Xbit personally.


I tested this: I reduced the ATI driver settings on my X1900XT to the Performance option, ran a bench, then moved them up to the highest quality and ran a bench. In FEAR and Doom 3 I gained less than 1 fps when reducing the quality.

My other card is a 7800GT; I enabled HQ and benched, then changed it to Quality. There was a difference of about 5-10 fps in FEAR and Doom 3. There is a margin of error and no two runs gave the same score, but this is an average of what I saw.

I cannot comment on the IQ of either because I always run HQ, and during a benchmark I don't notice things in the scenery that look off.


Exactly. I'm coming from a 7800GTX OC. And 5-10 fps in a case where averages were probably around 50 fps is a 10-20% loss!
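
For reference, a short Python sketch of the percentage math behind that claim; note the ~50 fps baseline is the poster's own rough estimate, not a measured figure:

```python
# Percentage loss for 5-10 fps deltas against a rough ~50 fps baseline.
baseline_fps = 50.0  # the poster's estimate, not a measurement
for delta_fps in (5.0, 10.0):
    loss = delta_fps / baseline_fps
    print(f"{delta_fps:.0f} fps off {baseline_fps:.0f} fps = {loss:.0%} loss")
# -> 10% and 20%, matching the quoted range
```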
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
So any tests that DON'T agree with yours are wrong?
Not necessarily, but yours are. We can see this because, by your own admission, you didn't have optimizations on with Quality, which is not how nVidia ships their cards.

What if your testing is obviously wrong?
Unlikely.

Furthermore, we have evidence from BeHardware, ComputerBase, and LegitReviews showing that benchmark scores swing radically away from nVidia's favour when they use High Quality.

The performance hit isn't ALWAYS 10-15% like BFG10K claims.
But I didn't claim that at all; in fact, it's often higher than that.

You claimed "just take 5% off of any nvidia card's score. problem solved" which is rather ridiculous.

but most of the time, it isn't that high.
Most of the time? You run one CPU-limited benchmark and that somehow allows you to extrapolate what happens in most situations? LOL!

Of course, the combination of multiple websites' data and my testing across probably a dozen games in total means nothing because you tested one game?

Also, Nvidia could have made some improvements in their drivers since BFG did his tests,
They could have, but more than likely your test is flawed. Also, I ran some quick checks on my 7900 GTX/91.31 system, and the results pretty much matched what I got in the original tests.