Originally posted by: Wreckage
Originally posted by: Ackmed
First review ever I have seen that shows NV faster in Oblivion. That is enough for me to not take it seriously.
Translation: Any review that does not make ATI look good is invalid to Ackmed.
Their "work around" has yet to be proven to even work. MS never officially stated that ATI fully supports SM3.0.
Originally posted by: BFG10K
just take 5% off of any nvidia card's score. problem solved
:roll:
the 5% number is from my own testing.
Then your testing is obviously wrong.
Q mode in drivers, all optimizations OFF: 57.34 fps
Like I said, your testing is wrong. nVidia doesn't ship their cards with all optimizations off.
Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows:
ATI Catalyst:
Catalyst A.I.: Standard
Mipmap Detail Level: Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
Quality AF: Off
Other settings: default
Nvidia ForceWare:
Image Settings: Quality
Vertical sync: Off
Trilinear optimization: On
Anisotropic mip filter optimization: Off
Anisotropic sample optimization: On
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default
:roll:
Both were left at standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?
Originally posted by: josh6079
I bet all those people who bought SLI Dells left the settings at default.
QFT.
Well, like I said, it gives a good baseline. I bet nobody has their setup the same. This is the best way to test both cards apples to apples.
Wouldn't setting both to "High Quality" be an apples-to-apples comparison? I know keeping everything at default is a good baseline, but Xbit is a site for enthusiasts who research particular pieces of hardware, not average Joes with SLI Dell systems that came in a box.
Originally posted by: schneiderguy
Originally posted by: BFG10K
just take 5% off of any nvidia card's score. problem solved
:roll:
the 5% number is from my own testing.
Then your testing is obviously wrong.
Q mode in drivers, all optimizations OFF: 57.34 fps
Like I said, your testing is wrong. nVidia doesn't ship their cards with all optimizations off.
:roll:
so any tests that DON'T agree with yours are wrong? What if your testing is obviously wrong? hmmm?? You always claim the performance hit from going from Q to HQ is 10 to 15 percent; that's what I tested :roll:
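For what it's worth, part of the gap between the "take 5% off" and "10 to 15 percent hit" figures is just which mode you treat as the baseline. A quick sketch of that arithmetic (Python; the 15% figure is one of the numbers argued over in this thread, and the 60 fps Q-mode score is purely illustrative):

```python
# Two ways to read "the difference between Q and HQ is 15%":
#   (a) HQ loses 15% relative to Q  -> hq = q * (1 - 0.15)
#   (b) Q gains 15% relative to HQ  -> hq = q / (1 + 0.15)
# The 15% figure is from the thread; the 60 fps Q-mode score is illustrative.
q_score = 60.0

hq_a = q_score * (1 - 0.15)   # 51.0 fps
hq_b = q_score / (1 + 0.15)   # ~52.2 fps

print(f"Reading (a): {hq_a:.1f} fps in HQ, i.e. take 15% off the Q score")
print(f"Reading (b): {hq_b:.1f} fps in HQ, i.e. take ~{(1 - hq_b / q_score) * 100:.0f}% off the Q score")
```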
Originally posted by: schneiderguy
another quick test, this time going with default settings for quality (optimizations on)
A64 3500+
7600gt
1.5gig ram
CS:Source VST
1280*1024 everything on highest setting except vsync (off)
8xS AA, 16x AF, TRSS
Run #1: HQ mode set in drivers, all optimizations OFF --- 50.4 FPS Screenshot
Run #2: Q mode set in drivers, "Anisotropic Mip Optimization" and "Anisotropic Sample Optimization" both set to "on" --- 51.19 FPS Screenshot
Difference between HQ and Q with optimizations on? Less than 2%
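For reference, the gap between those two runs works out to well under 2%; a quick check (Python, using the FPS numbers posted above):

```python
# Relative difference between the two CS:Source VST runs above.
hq_fps = 50.4    # Run #1: HQ mode, all optimizations off
q_fps = 51.19    # Run #2: Q mode, aniso mip/sample optimizations on

gap = (q_fps - hq_fps) / hq_fps * 100
print(f"Q with optimizations is {gap:.2f}% faster than HQ")  # ~1.57%
```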
Originally posted by: Wreckage
Originally posted by: apoppin
and NOTHING can be proved - to you . . . you post and then shut your eyes and mind to all reasonable replies and *proof*
No reasonable response or proof has been displayed.
Just denial and complaining.
Originally posted by: CKXP
Originally posted by: schneiderguy
another quick test, this time going with default settings for quality (optimizations on)
A64 3500+
7600gt
1.5gig ram
CS:Source VST
1280*1024 everything on highest setting except vsync (off)
8xS AA, 16x AF, TRSS
Run #1: HQ mode set in drivers, all optimizations OFF --- 50.4 FPS Screenshot
Run #2: Q mode set in drivers, "Anisotropic Mip Optimization" and "Anisotropic Sample Optimization" both set to "on" --- 51.19 FPS Screenshot
Difference between HQ and Q with optimizations on? Less than 2%
Do you have RivaTuner installed? If so, try checking Driver settings -> Direct3D tweaks -> Intellisample tab. Sometimes, even if you switch from "Q" to "HQ" mode in the NV control panel, the optimizations will still show as enabled in RivaTuner until you disable them there.
Originally posted by: ShadowOfMyself
Originally posted by: schneiderguy
another quick test, this time going with default settings for quality (optimizations on)
A64 3500+
7600gt
1.5gig ram
CS:Source VST
1280*1024 everything on highest setting except vsync (off)
8xS AA, 16x AF, TRSS
Run #1: HQ mode set in drivers, all optimizations OFF --- 50.4 FPS Screenshot
Run #2: Q mode set in drivers, "Anisotropic Mip Optimization" and "Anisotropic Sample Optimization" both set to "on" --- 51.19 FPS Screenshot
Difference between HQ and Q with optimizations on? Less than 2%
Maybe it depends on what CPU and RAM you have, etc... some systems will have a big drop and others won't. Since no one tests this, we will never know... unless Nvidia decides to enable HQ by default from the G80 on.
Where were you the last 2 weeks?
Originally posted by: josh6079
You're his equal in that slant of light. Do you hear the words you speak? Or are you spouting at the mouth and remaining oblivious to what your lips leak out?
Once again for you and yours since you refuse to listen..... My comments regarding VTF had nothing to do with the benchmarks, but were a response to a question about ATI's problem running PF. PF's VTF has nothing to do with X-bit's benchmarks.
They give a wider range of games than some other benches, but overlooked (or left out) key driver settings that level the comparisons. I'm not wanting to see ATI win in all benchmarks. I'm wanting to see both sides compared equally using the same settings.
Originally posted by: Wreckage
Granted this is SLI vs Crossfire, but it still shows NVIDIA ahead of ATI in Oblivion.
http://www.neoseeker.com/Articles/Hardware/Reviews/multigpumostplayed/3.html
Where were you the last 2 weeks?
Same place I'm at now. I don't see how that deals with your similarity to Ackmed.
Once again for you and yours since you refuse to listen.....
This is ironic. I'm refusing to listen, yet I have listened to (and seen first hand) what both sides have to offer for the generations of games we are discussing. Have you?
My comments regarding VTF had nothing to do with the benchmarks...
Exactly. Your ignorant fits about the VTF issue are a textbook example of trolling.
...but were a response to a question about ATI's problem running PF.
You respond to other questions but not to the evidence that has been laid in front of you?
Both cards were set to near-identical settings. That's as fair a benchmark as you are going to get.
Incorrect. If they had maxed everything out on both sides, what would they have missed? Even if they had turned everything off, the comparison would have been more accurate. Floating around in the clouds with close-to-similar settings is not what I would call accurate. As I said, next to no one is going to have an X1900 Crossfire or 7900GTX SLI and not run HQ.
Just because it does not fit your view of the world does not make it wrong. 99.9% of people who don't see red when these benchmarks come out agree that xbit is one of the best sites for reviews.
Indeed, it is very hard to see red in their benchmarks; they're too full of green.
Originally posted by: Lucu
Can anyone tell me how much better a 7600GT is compared to a 6600GT?
I'm thinking about upgrading, but I won't do it unless I can get double the performance of my 6600GT for 150 euros.
Originally posted by: Lucu
Can anyone tell me how much better a 7600GT is compared to a 6600GT?
I'm thinking about upgrading, but I won't do it unless I can get double the performance of my 6600GT for 150 euros.
That's a good upgrade. The 7600GT does outperform the 6600GT by a substantial margin. You'll like it.
Originally posted by: Frackal
Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows:
ATI Catalyst:
Catalyst A.I.: Standard
Mipmap Detail Level: Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
Quality AF: Off
Other settings: default
Nvidia ForceWare:
Image Settings: Quality
Vertical sync: Off
Trilinear optimization: On
Anisotropic mip filter optimization: Off
Anisotropic sample optimization: On
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default
:roll:
Both were left at standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?
Probably because ATI doesn't gain much from lowering settings (a stupid thing to do anyway) but Nvidia loses a lot from raising them. I don't like Xbit, personally.
Originally posted by: cmdrdredd
Originally posted by: Frackal
Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows:
ATI Catalyst:
Catalyst A.I.: Standard
Mipmap Detail Level: Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
Quality AF: Off
Other settings: default
Nvidia ForceWare:
Image Settings: Quality
Vertical sync: Off
Trilinear optimization: On
Anisotropic mip filter optimization: Off
Anisotropic sample optimization: On
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default
:roll:
Both were left at standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?
Probably because ATI doesn't gain much from lowering settings (a stupid thing to do anyway) but Nvidia loses a lot from raising them. I don't like Xbit, personally.
I tested this: I reduced the ATI driver settings on my X1900 XT to the performance option, ran a bench, then moved them up to the highest quality and ran another bench. In FEAR and Doom 3 I gained less than 1 fps when reducing the quality.
My other card is a 7800GT; I enabled HQ and benched, then changed it to Quality. There was a difference of about 5-10 fps in FEAR and Doom 3. There is a margin of error and no two runs gave the same score, but this is an average of what I saw.
I cannot comment on the IQ of either because I always run HQ, and during a benchmark I don't notice things in the scenery that look off.
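Since no two runs gave the same score, numbers like those are averages over several passes. A minimal sketch of that kind of comparison (Python; the per-run FPS values below are made up for illustration, and only the rough 5-10 fps HQ-vs-Q gap mirrors what is described above):

```python
from statistics import mean

# Hypothetical per-run FPS values; only the rough 5-10 fps HQ-vs-Q gap
# mirrors the 7800GT observation above.
hq_runs = [61.8, 63.1, 62.4]   # High Quality, optimizations off
q_runs  = [69.5, 70.2, 68.9]   # Quality, default optimizations

hq_avg, q_avg = mean(hq_runs), mean(q_runs)
delta = q_avg - hq_avg
print(f"HQ avg {hq_avg:.1f} fps, Q avg {q_avg:.1f} fps, "
      f"delta {delta:.1f} fps ({delta / hq_avg * 100:.1f}%)")
```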
so any tests that DON'T agree with yours are wrong?
Not necessarily, but yours are. We can see this because, by your own admission, you didn't have the optimizations on with Quality, which is not how nVidia ships their cards.
What if your testing is obviously wrong?
Unlikely.
the performance hit isn't ALWAYS 10-15% like BFG10K claims.
But I didn't claim that at all; in fact it's often higher than that.
but most of the time, it isn't that high.
Most of the time? You run one CPU-limited benchmark and that somehow allows you to extrapolate what happens in most situations? LOL!
also, nvidia could have made some improvements in their drivers since BFG did his tests,
They could have, but more than likely your test is flawed. Also, I ran some quick checks on my 7900 GTX/91.31 system and the results pretty much matched what I got in the original tests.