Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows:
ATI Catalyst:
Catalyst A.I.: Standard
Mipmap Detail Level: Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
Quality AF: Off
Other settings: default
Nvidia ForceWare:
Image Settings: Quality
Vertical sync: Off
Trilinear optimization: On
Anisotropic mip filter optimization: Off
Anisotropic sample optimization: On
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default
:roll:
Both were standard settings, which is horrible for a review of these kinds of high-end systems. Why didn't they put both on "High Quality"?
Originally posted by: redbox
Originally posted by: josh6079
Both were standard settings, which is horrible for a review of these kinds of high-end systems. Why didn't they put both on "High Quality"?
Like it takes some stretch of the mind to figure out why Xbit never changes IQ levels. Oblivion has a lot of different areas that produce a lot of different frame rates, but most of the results stayed about the same between Xbit and Anandtech. The exception is the 7900 GTX: it got 29 fps on Anandtech and 43.3 fps on Xbit at the same resolution and settings. Others increased too.
The X1900 XTX went from 32.6 fps on Anandtech to 42.2 fps on Xbit.
CrossFire and SLI got a boost too:
CrossFire: Anandtech 46.1 fps, Xbit 56.3 fps
SLI: Anandtech 43.5 fps, Xbit 56.2 fps
So (recomputed in the sketch below):
X1900 XTX difference: 9.6 fps
7900 GTX difference: 14.3 fps
CrossFire difference: 10.2 fps
SLI difference: 12.7 fps
I know it is a little hard to compare performance results between two benchmarks; I just wanted to see what the breakdown was and what performed differently in Xbit's tests such that Nvidia came out ahead in Oblivion. One thing I thought was strange was that they used Nvidia's 91.31 drivers but didn't use ATI's 6.8.
The Anandtech bench used the 6.4 drivers with the Chuck patch and Nvidia's 84.43, so that bench is out of date on both sides. But I just wonder why Xbit chose not to test with ATI's newest drivers if they are going to test with Nvidia's?
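To put those gaps side by side, here is a minimal Python sketch that simply recomputes the absolute and percentage differences from the numbers quoted above; only the figures from this thread are used, nothing else is assumed.

# Minimal sketch: recompute the Anandtech-vs-Xbit Oblivion gaps from the
# frame rates quoted above. No other data is assumed.
results = {
    "7900 GTX":  (29.0, 43.3),   # (Anandtech fps, Xbit fps)
    "X1900 XTX": (32.6, 42.2),
    "CrossFire": (46.1, 56.3),
    "SLI":       (43.5, 56.2),
}

for card, (anand, xbit) in results.items():
    diff = xbit - anand
    pct = 100.0 * diff / anand
    print(f"{card:10s} Anandtech {anand:5.1f} fps | Xbit {xbit:5.1f} fps | "
          f"diff {diff:+5.1f} fps ({pct:+.1f}%)")

Run as-is, it shows the 7900 GTX gaining the most between the two reviews (roughly +49%, versus about +22% to +29% for the other configurations), which is the gap being pointed at here.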
Originally posted by: Wreckage
Originally posted by: redbox
But I just wonder why Xbit chose not to test with ATI's newest drivers if they are going to test with Nvidia's?
The title of the article is "Seven Games and One Week". 6.8 just came out today. Xbit was probably testing all last week. They could have used 91.45 as well.
Good review overall; it gives a nice baseline without all the little odd tweaks (that each card has) being messed with. I like how they use a broader range of games instead of a handful of FPS games and 3DMark like most sites.
Originally posted by: josh6079
Originally posted by: Wreckage
Originally posted by: redbox
But I just wonder why Xbit chose not to test with ATI's newest drivers if they are going to test with Nvidia's?
The title of the article is "Seven Games and One Week". 6.8 just came out today. Xbit was probably testing all last week. They could have used 91.45 as well.
Good review overall; it gives a nice baseline without all the little odd tweaks (that each card has) being messed with. I like how they use a broader range of games instead of a handful of FPS games and 3DMark like most sites.
True, I like their range of games as well, but I don't see why they used those default settings when someone has a top-of-the-line CrossFire or SLI setup. Who honestly has two 7900 GTXs in SLI and uses the default "Quality" setting in the driver?
I bet all those people who bought SLI Dells left the settings at default.
Originally posted by: josh6079
I bet all those people who bought SLI Dells left the settings at default.
QFT.
Well, like I said, it gives a good baseline. I bet nobody has their setup the same. This is the best way to test both cards apples to apples.
Wouldn't setting both to "High Quality" be an apples-to-apples comparison? I know keeping everything at default is a good baseline, but Xbit is a site for enthusiasts who research particular pieces of hardware, not average Joes with SLI Dell systems that came in a box.
Originally posted by: Wreckage
Originally posted by: redbox
But I just wonder why Xbit chose not to test with ATI's newest drivers if they are going to test with Nvidia's?
The title of the article is "Seven Games and One Week". 6.8 just came out today. Xbit was probably testing all last week. They could have used 91.45 as well.
Good review overall, it gives a nice baseline without all the little odd tweaks (that each card has) being messed with. I like how they use a broader range of games instead of a handful of FPS games and 3Dmark like most sites.
I just went to Nvidia.com. Are the 91.45 drivers beta?
Originally posted by: Wreckage
Originally posted by: redbox
I just went to Nvidia.com. Are the 91.45 drivers beta?
Yeah, it's beta.
ftp://download.nvidia.com/Windows/91.45
Originally posted by: tuteja1986
Originally posted by: josh6079
Both were standard settings, which is horrible for a review of these kinds of high-end systems. Why didn't they put both on "High Quality"?
Xbit Labs caters to the noobs now!!
When VR-Zone did decent benchmarking, the idiot editor didn't realise that he was using High Quality settings... ah, the irony.
Originally posted by: BFG10K
Originally posted by: josh6079
Both were standard settings, which is horrible for a review of these kinds of high-end systems. Why didn't they put both on "High Quality"?
In terms of image quality, ATi's default Quality with optimizations on matches or exceeds nVidia's High Quality with optimizations off.