Xbitlabs' intensive review of 7 games


josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: BFG10K
so any tests that DONT agree with yours are wrong?
Not necessarily, but yours are. We can see this because, by your own admission, you didn't have optimizations on with Quality, which is not how nVidia ships their cards.

What if your testing is obviously wrong?
Unlikely.

Furthermore we have evidence from BeHardware, ComputerBase and LegitReviews showing the benchmark scores swing radically away from nVidia's favour because they use High Quality.

the performance hit isnt ALWAYS 10-15% like BFG10K claims.
But I didn't claim that at all; in fact it's often higher than that.

You claimed "just take 5% off of any nvidia card's score. problem solved" which is rather ridiculous.

but most of the time, it isnt that high.
Most of the time? You run one CPU limited benchmark and that somehow allows you to extrapolate what happens in most situations? LOL!

Of course the combination of multiple websites' data and my testing with probably around a dozen games in total means nothing because you tested one game?

also, nvidia could have made some improvements in their drivers since BFG did his tests,
They could've but more than likely your test is flawed. Also I ran some quick checks on my 7900 GTX/91.31 system and the results pretty much matched what I got in the original tests.

p......w.......n.......e......d.......
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: BFG10K

Most of the time? You run one CPU limited benchmark and that somehow allows you to extrapolate what happens in most situations? LOL!

last time i checked, 1280*1024 with 8xS AA and 16x AF isnt cpu limited :roll:

and optimizations for quality mode WERE on with the second benchmark i did... reading comprehension FTW

how is my test flawed? i did exactly the same YOU did, WITH screenshots to back it up.
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: schneiderguy
Originally posted by: BFG10K

Most of the time? You run one CPU limited benchmark and that somehow allows you to extrapolate what happens in most situations? LOL!

last time i checked, 1280*1024 with 8xS AA and 16x AF isnt cpu limited :roll:

and optimizations for quality mode WERE on with the second benchmark i did... reading comprehension FTW

how is my test flawed? i did exactly the same YOU did, WITH screenshots to back it up.

Your test is flawed because you run ONE bench and claim that those results are universal. You just say "take 5% off and it's all peachy" which gives us a perspective on your testing methods. All we are asking is that the sites test with the same IQ for all the games they bench. It seems that would be a better test than just taking 5% off because that is what the difference is in one old game. Do you need any more explanation on why your test is flawed?
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: redbox
Originally posted by: schneiderguy
Originally posted by: BFG10K

Most of the time? You run one CPU limited benchmark and that somehow allows you to extrapolate what happens in most situations? LOL!

last time i checked, 1280*1024 with 8xS AA and 16x AF isnt cpu limited :roll:

and optimizations for quality mode WERE on with the second benchmark i did... reading comprehension FTW

how is my test flawed? i did exactly the same YOU did, WITH screenshots to back it up.

Your test is flawed because you run ONE bench and claim that those results are universal. You just say "take 5% off and it's all peachy" which gives us a perspective on your testing methods. All we are asking is that the sites test with the same IQ for all the games they bench. It seems that would be a better test than just taking 5% off because that is what the difference is in one old game. Do you need any more explanation on why your test is flawed?

where did i claim my test is universal? all i was saying is that BFG is incorrect in saying its always a 10-15% performance hit from going to Q to HQ

 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: schneiderguy
Originally posted by: josh6079
where did i claim my test is universal?
Right here:
just take 5% off of any nvidia card's score. problem solved

5% is average. sometimes the performance hit is a lot, sometimes its next to nothing.

But that's not what you said, nor can you claim it to be average with your test as you only benched one game and last time I checked you needed more than one piece of information to do an average. ;) The best solution is just to have sites do the tests with the IQ levels equal. Even taking off 15% or 20% all the time isn't the way it should be done. I don't see why some people have a problem with sites reviewing the cards with equal settings. If you truly want to see how the cards compare then why wouldn't you want equal testing?
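
For illustration, here's a minimal sketch (Python) of what computing an actual Q-to-HQ average would involve; the fps numbers are made-up placeholders, not anyone's real results:

    # Hypothetical Quality vs High Quality fps pairs for several games.
    games = {
        "CS:S VST": (57.0, 55.0),   # (quality_fps, hq_fps) -- placeholder numbers
        "FEAR":     (62.0, 54.0),
        "Oblivion": (41.0, 36.0),
    }

    hits = []
    for name, (q_fps, hq_fps) in games.items():
        hit = (q_fps - hq_fps) / q_fps   # per-game hit as a fraction of the Q score
        hits.append(hit)
        print(f"{name}: {hit:.1%} slower in HQ")

    print(f"Average over {len(hits)} games: {sum(hits) / len(hits):.1%}")

One game gives one data point; an average only exists once several games are measured the same way.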
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: redbox
Originally posted by: schneiderguy
Originally posted by: josh6079
where did i claim my test is universal?
Right here:
just take 5% off of any nvidia card's score. problem solved

5% is average. sometimes the performance hit is a lot, sometimes its next to nothing.

But that's not what you said, nor can you claim it to be average with your test as you only benched one game and last time I checked you needed more than one piece of information to do an average. ;) The best solution is just to have sites do the tests with the IQ levels equal. Even taking off 15% or 20% all the time isn't the way it should be done. I don't see why some people have a problem with sites reviewing the cards with equal settings. If you truly want to see how the cards compare then why wouldn't you want equal testing?

yeah, testing at the same IQ settings would be the best. i only suggested the 5% thing because its tiring hearing certain people whine about IQ settings EVERY time a new review comes out when its relatively simple for them to just subtract 5% or whatever they want off the score for the nvidia card :confused:
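
The suggestion amounts to a flat correction, something like this rough sketch (Python), where the 5% figure is an estimate rather than a measured constant:

    def adjust_nvidia_score(quality_fps, penalty=0.05):
        """Approximate an equal-IQ (High Quality) result by shaving a flat
        penalty off a Quality-mode benchmark score."""
        return quality_fps * (1.0 - penalty)

    # e.g. a hypothetical 80 fps Quality-mode result becomes 76 fps
    print(adjust_nvidia_score(80.0))

The obvious weakness is that the real penalty varies by game, so any flat constant is only a rough correction.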
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
where did i claim my test is universal?
just take 5% off of any nvidia card's score. problem solved
5% is average. sometimes the performance hit is a lot, sometimes its next to nothing.
...its relatively simple for them to just subtract 5% or whatever they want off the score for the nvidia card :confused:
Maybe it's just me, but nothing seems very accurate about your views. If people were to just subtract "whatever they wanted" from a video card's performance they would be living in a dream world. It's not about what kind of performance the consumer wants out of a video card that they're thinking about buying, it's about how much performance is available from the video card they're thinking of buying.
i only suggested the 5% thing because its tiring hearing certain people whine about IQ settings EVERY time a new review comes out...
Now you're disregarding your own test and implying that you only suggested the 5% because of your annoyance with people asking for accuracy in benchmarks?

I don't think your test is completely invalid. The performance hit could very well be around what you found it to be for Source, an aged game that isn't even a push-up for recent video cards. I understand you are wanting to make a point that the decrease in frames isn't as much as one would think all of the time, but newer and more intensive games have come out since Source's Counter Strike.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: josh6079
Maybe it's just me, but nothing seems very accurate about your views. If people were to just subtract "whatever they wanted" from a video card's performance they would be living in a dream world. It's not about what kind of performance the consumer wants out of a video card that they're thinking about buying, it's about how much performance is available from the video card they're thinking of buying.

taking ~5% off the nvidia card's score is the next best thing to benching the cards at equal IQ settings.

Originally posted by: josh6079
i only suggested the 5% thing because its tiring hearing certain people whine about IQ settings EVERY time a new review comes out...
Now you're disregarding your own test and implying that you only suggested the 5% because of your annoyance with people asking for accuracy in benchmarks?

no, im not disregarding my test. 5% average is what ive seen the performance hit be for my 7600gt when going from Q to HQ; some games are less, some games are more. thats where i got the 5% number that the whiners should subtract from the nvidia card score. sorry if that wasnt clear in my other post.

I don't think your test is completely invalid. The performance hit could very well be around what you found it to be for Source, an aged game that isn't even a push-up for recent video cards. I understand you are wanting to make a point that the decrease in frames isn't as much as one would think all of the time, but newer and more intensive games have come out since Source's Counter Strike.

i see your point there... sure, it isnt as resource intensive as other games, but it is widely played and the graphics are on par with most new games that take two x1900xtx's to run above 1280*1024 (ie oblivion, fear). the amount of stress that a game places on the graphics card shouldnt have any affect on the amount of performance loss that going from Q to HQ makes. (as long as its not CPU limited. 1280*1024 with 8xS AA and 16x AF is NOT going to be cpu limited in any game that was released in the past 2 years on my lowly 7600gt, regardless of what BFG thinks)

 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: schneiderguy
taking ~5% off the nvidia card's score is the next best thing to benching the cards at equal IQ settings.

Maybe, but what if I want to take off 20% just cause I think that's what the difference is? Which is what you suggested. We need a standard to discuss cards performance. Simple as that.

no, im not disregarding my test. 5% average is what ive seen the performance hit be for my 7600gt when going from Q to HQ; some games are less, some games are more. thats where i got the 5% number that the whiners should subtract from the nvidia card score. sorry if that wasnt clear in my other post.

You were asked to show your results, you showed one bench. We said that isn't enough and you then label us whiners. If you are going to make a claim be prepared to back it up. I think it would be good to show those other bench numbers you ran to get your "average" 5%.

i see your point there... sure, it isnt as resource intensive as other games, but it is widely played and the graphics are on par with most new games that take two x1900xtx's to run above 1280*1024 (ie oblivion, fear). the amount of stress that a game places on the graphics card shouldnt have any affect on the amount of performance loss that going from Q to HQ makes. (as long as its not CPU limited. 1280*1024 with 8xS AA and 16x AF is NOT going to be cpu limited in any game that was released in the past 2 years on my lowly 7600gt, regardless of what BFG thinks)

The amount of stress a game places on a graphics card does have an effect on the performance lost when changing IQ levels. That's why your Source engine might show a 5% difference vs. FEAR, which may be more.

I would tend to agree that your settings aren't cpu limited, but it wouldn't be that hard to find out for sure. Just down clock your cpu and see if it changes your frames a bit, or on the other end overclock if you aren't already and see if it increases your frames. However that would be a test and from what we all see you just like to guess what the increase or decrease would be.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: redbox
Originally posted by: schneiderguy
taking ~5% off the nvidia card's score is the next best thing to benching the cards at equal IQ settings.

Maybe, but what if I want to take off 20% just cause I think that's what the difference is? Which is what you suggested. We need a standard to discuss cards performance. Simple as that.

no, im not disregarding my test. 5% average is what ive seen the performance hit be for my 7600gt when going from Q to HQ; some games are less, some games are more. thats where i got the 5% number that the whiners should subtract from the nvidia card score. sorry if that wasnt clear in my other post.

You were asked to show your results, you showed one bench. We said that isn't enough and you then label us whiners. If you are going to make a claim be prepared to back it up. I think it would be good to show those other bench numbers you ran to get your "average" 5%.

i wasnt talking about you and josh when i was talking about whiners. as for the other benches, what other games would you like to see? i have HL2/CSS/DODS, call of duty 2, oblivion, FEAR, and BF2. i dont have actual numbers to post (ie frames per second) from running benches on those games on my computer, but i would be happy to do any of those games above if you would like to see the results. those combined with BFG's numbers from his games should give everyone a good idea of the average performance hit from enabling HQ mode on nvidia cards.

I would tend to agree that your settings aren't cpu limited, but it wouldn't be that hard to find out for sure. Just down clock your cpu and see if it changes your frames a bit, or on the other end overclock if you aren't already and see if it increases your frames. However that would be a test and from what we all see you just like to guess what the increase or decrease would be.

disabling AA makes my average frames per second in the CS Source VST increase from ~55 to ~102. that means its not CPU limited. if it was CPU limited my average framerate wouldnt go past 55
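
That reasoning can be written down as a simple check; a small sketch (Python) using the ~55 and ~102 fps figures above:

    def looks_gpu_bound(fps_with_aa, fps_without_aa, threshold=0.10):
        """If removing GPU load (disabling AA) lifts the framerate by more than
        `threshold`, the run with AA was GPU-bound rather than CPU-bound."""
        gain = (fps_without_aa - fps_with_aa) / fps_with_aa
        return gain > threshold

    print(looks_gpu_bound(55.0, 102.0))  # True: roughly 85% higher fps without AA

A CPU-limited run would stay pinned near the same framerate when AA is dropped.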

 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: schneiderguy
Originally posted by: redbox
Originally posted by: schneiderguy
taking ~5% off the nvidia card's score is the next best thing to benching the cards at equal IQ settings.

Maybe, but what if I want to take off 20% just cause I think that's what the difference is? Which is what you suggested. We need a standard to discuss cards performance. Simple as that.

no, im not disregarding my test. 5% average is what ive seen the performance hit be for my 7600gt when going from Q to HQ; some games are less, some games are more. thats where i got the 5% number that the whiners should subtract from the nvidia card score. sorry if that wasnt clear in my other post.

You were asked to show your results, you showed one bench. We said that isn't enough and you then label us whiners. If you are going to make a claim be prepared to back it up. I think it would be good to show those other bench numbers you ran to get your "average" 5%.

i wasnt talking about you and josh when i was talking about whiners. as for the other benches, what other games would you like to see? i have HL2/CSS/DODS, call of duty 2, oblivion, FEAR, and BF2. i dont have actual numbers to post (ie frames per second) from running benches on those games on my computer, but i would be happy to do any of those games above if you would like to see the results. those combined with BFG's numbers from his games should give everyone a good idea of the average performance hit from enabling HQ mode on nvidia cards.

I would tend to agree that your settings aren't cpu limited, but it wouldn't be that hard to find out for sure. Just down clock your cpu and see if it changes your frames a bit, or on the other end overclock if you aren't already and see if it increases your frames. However that would be a test and from what we all see you just like to guess what the increase or decrease would be.

disabling AA makes my average frames per second in the CS Source VST increase from ~55 to ~102. that means its not CPU limited. if it was CPU limited my average framerate wouldnt go past 55

It may be a big undertaking but I would like to see all of them. I have a volt modded 7800gt and can run some benches too. I would almost suggest that we work on benching most of the games we have and then comparing the results, but in a different thread as this isn't really about xbit's review anymore. It's more about all reviews that don't set the IQ equal. The games I have access to are BF2, HL2, HL2 episode 1, Oblivion, Far Cry, Day of Defeat, and FEAR. Let me know what you think.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
last time i checked, 1280*1024 with 8xS AA and 16x AF isnt cpu limited
For the original Source engine 1280x1024 is a low resolution.

If there is any GPU limitation it's because your xS AA mode has shifted the bottleneck onto the memory bandwidth and hence the HQ and Q modes have less of an impact than they do in shader bound situations.

and optimizations for quality mode WERE on with the second benchmark i did...
Except before you did this you were arguing there was nothing wrong with your first batch of tests.

how is my test flawed?
Already explained.

all i was saying is that BFG is incorrect in saying its always a 10-15% performance hit from going to Q to HQ
Where did I say that? Please quote me or retract that lie.

5% is average
Based on what tests? Show me your results and the statistical analysis that you used to calculate such an average. Otherwise your 5% "average" is just something pulled out of an orifice.

its relatively simple for them to just subtract 5%
It's simple for the nVidia trolls who like to detract from the issue. It's actually much more serious and complicated than "just take 5% off of any nvidia card's score. problem solved".

taking ~5% off the nvidia card's score is the next best thing to benching the cards at equal IQ settings.
:roll:
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: BFG10K
last time i checked, 1280*1024 with 8xS AA and 16x AF isnt cpu limited
For the original Source engine 1280x1024 is a low resolution.

If there is any GPU limitation it's because your xS AA mode has shifted the bottleneck onto the memory bandwidth and hence the HQ and Q modes have less of an impact than they do in shader bound situations.

and optimizations for quality mode WERE on with the second benchmark i did...
Except before you did this you were arguing there was nothing wrong with your first batch of tests.

how is my test flawed?
Already explained.

all i was saying is that BFG is incorrect in saying its always a 10-15% performance hit from going to Q to HQ
Where did I say that? Please quote me or retract that lie.

5% is average
Based on what tests? Show me your results and the statistical analysis that you used to calculate such an average. Otherwise your 5% "average" is just something pulled out of an orifice.

its relatively simple for them to just subtract 5%
It's simple for the nVidia trolls who like to detract from the issue. It's actually much more serious and complicated than "just take 5% off of any nvidia card's score. problem solved".

taking ~5% off the nvidia card's score is the next best thing to benching the cards at equal IQ settings.
:roll:


okay, you win. im stupid, you're smart. im wrong, you're right.
 

Praxis1452

Platinum Member
Jan 31, 2006
2,197
0
0
Schneider, if you lose then you can't say 5% average and have now agreed with BFG to shut up.
 

blckgrffn

Diamond Member
May 1, 2003
9,676
4,308
136
www.teamjuchems.com
Wow, lots of angst here...

I would offer to do some benches too, as I always played with HQ when I remembered to flip it on after installing new drivers and didn't seem to see a huge performance hit, but the best nvidia I have right now is a 6800 256 meg :p

Of course, "feeling" 5-10fps drop isn't the easiest if you aren't running the FPS gauge in whatever game you happen to be playing...

BTW, the shimmering that both brands offer in GW is horrendous. Yuck.

Does HQAF help with that issue? Anybody know?
 

blckgrffn

Diamond Member
May 1, 2003
9,676
4,308
136
www.teamjuchems.com
Originally posted by: redbox
Ya, HQ AF helps with it. Try it out and let us know if it helps in GW.


Waiting for the x1650xt ;)

I'll report back what I find. Right now, my x850xt shimmers more than my 6800 ever did.

What's crazy is that the game is perfect on my 5900SE :Q
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: blckgrffn
Originally posted by: redbox
Ya, HQ AF helps with it. Try it out and let us know if it helps in GW.


Waiting for the x1650xt ;)

I'll report back what I find. Right now, my x850xt shimmers more than my 6800 ever did.

What's crazy is that the game is perfect on my 5900SE :Q

I'll call a quiet and un-confrontational shens on that mate, especially if you used the quality setting on your 6800 ;)

Regarding the 5900, iirc it does the AF differently/properly? or something?
 

w00t

Diamond Member
Nov 5, 2004
5,545
0
0
Originally posted by: schneiderguy
As for single-chip single-card solutions, the GeForce 7900 GTX is in the lead but not by much. The 24 TMUs help this card feel confident in high resolutions with FSAA enabled. This is also the case when the game contains a lot of pixel shaders with multiple texture lookups or just a lot of high-resolution textures. On the other hand, the Radeon X1900 XTX, though having a somewhat lower average performance in comparison with the GeForce 7900 GTX, often surpasses the latter in minimum speed thanks to its ability to process more pixel shaders simultaneously. Thus, it provides a bigger speed reserve in games that make wide use of visual effects created by means of mathematics-heavy shaders. So, your choice will probably depend on what particular games you are going to play.

:thumbsup:


I'd rather have an X1900XT