Originally posted by: BFG10K
In my case "hell no". Like I said the drivers are getting progressively worse each iteration.Does older hardware like that benefit from the newest of new drivers?
What do you think the best ones are?
Originally posted by: BFG10K
Now that I've knocked nVidia's shimmering I feel it only fair to knock ATi's driver quality which has been on a steady downward trend.
The latest 6.8 drivers are absolutely heinous on my X800 XL.
Originally posted by: redbox
Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows:
ATI Catalyst:
Catalyst A.I.: Standard
Mipmap Detail Level: Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
Quality AF: Off
Other settings: default
Nvidia ForceWare:
Image Settings: Quality
Vertical sync: Off
Trilinear optimization: On
Anisotropic mip filter optimization: Off
Anisotropic sample optimization: On
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default
:roll:
Both were standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?
Like it takes some stretch of the mind to figure out why Xbit never changes IQ levels. Oblivion is a game with a lot of different areas that produce a lot of different fps numbers, but most of them stayed the same between the Xbit and Anandtech reviews, except the 7900 GTX: it got 29 fps on Anandtech and 43.3 fps on Xbit at the same resolution and settings. Others increased too.
The X1900 XTX went from 32.6 fps on Anandtech to 42.2 fps on Xbit.
Crossfire and SLI got a boost too.
Crossfire: Anandtech 46.1 fps, Xbit 56.3 fps
SLI: Anandtech 43.5 fps, Xbit 56.2 fps
So:
X1900 XTX difference: 9.6 fps
7900 GTX difference: 14.3 fps
Crossfire difference: 10.2 fps
SLI difference: 12.7 fps
I know it is a little hard to compare performance results between two benchmarks; I just wanted to see what the breakdown was and what performed differently on Xbit's bench such that Nvidia came out better in Oblivion. One thing I thought was strange: they used Nvidia's 91.31 drivers but didn't use ATI's 6.8.
The Anandtech bench used the 6.4 drivers with the Chuck patch and Nvidia's 84.43, so that bench is out of date on both sides. But I just wonder why Xbit chose not to test with ATI's newest drivers if they are going to test with Nvidia's.
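For anyone who wants to double-check those gaps, here is a minimal Python sketch of the same arithmetic. It is not from either review; it only uses the Oblivion fps figures quoted in the post above, and the percentage column is simply derived from those numbers.

```python
# Sanity check of the per-card gaps quoted above (illustrative script, not from
# either review). All fps figures are the Oblivion numbers cited in the post.
results = {
    "7900 GTX":  {"anandtech": 29.0, "xbit": 43.3},
    "X1900 XTX": {"anandtech": 32.6, "xbit": 42.2},
    "CrossFire": {"anandtech": 46.1, "xbit": 56.3},
    "SLI":       {"anandtech": 43.5, "xbit": 56.2},
}

for card, fps in results.items():
    diff = fps["xbit"] - fps["anandtech"]          # absolute gap in fps
    pct = diff / fps["anandtech"] * 100            # gap relative to Anandtech
    print(f"{card:9s}  Anandtech {fps['anandtech']:5.1f}  Xbit {fps['xbit']:5.1f}"
          f"  diff {diff:+5.1f} fps ({pct:+4.0f}%)")
```

Run as-is, it reproduces the 14.3 / 9.6 / 10.2 / 12.7 fps deltas listed above and shows the 7900 GTX gaining roughly half again its Anandtech score, the largest relative jump of the four cards.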
Only reviews that don't show ATI as the best are misleading... of course :roll:
Originally posted by: schneiderguy
Originally posted by: 5150Joker
Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows:
ATI Catalyst:
Catalyst A.I.: Standard
Mipmap Detail Level: Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
Quality AF: Off
Other settings: default
Nvidia ForceWare:
Image Settings: Quality
Vertical sync: Off
Trilinear optimization: On
Anisotropic mip filter optimization: Off
Anisotropic sample optimization: On
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default
:roll:
Both were standard settings, which is horrible for a review of those kinds of high-end systems. Why didn't they put both on "High Quality"?
Xbit reviews are misleading, that's why. They always fail to use high quality settings in the control panel, thus avoiding nVidia's achilles heel.
so by your line of thinking, every review on the internet is misleading because they don't all test a game like, say, Pacific Fighters, thus avoiding ATI's achilles heel? okay!
Originally posted by: Todd33
Which frequency 7900GT? It's in both mainstream and high end; same card? I think the brands can range from about 450MHz to about 580MHz.
Originally posted by: BFG10K
"so by your line of thinking, every review on the internet is misleading because they don't all test a game like, say, Pacific Fighters, thus avoiding ATI's achilles heel"
What achilles heel? ATi doesn't have an achilles heel with that game; nVidia does when running under High Quality mode. They also have it in many other games.
they don't look ANY different . . . what a joke!
Originally posted by: Wreckage
Originally posted by: BFG10K
"so by your line of thinking, every review on the internet is misleading because they don't all test a game like, say, Pacific Fighters, thus avoiding ATI's achilles heel"
What achilles heel? ATi doesn't have an achilles heel with that game; nVidia does when running under High Quality mode. They also have it in many other games.
ATI cannot run Pacific Fighters in SM3.0 mode because ATI cards lack a key SM3.0 feature called vertex texture fetch (VTF). Since ATI did not fully support the SM3.0 standard, Pacific Fighters has to be run at lower quality than on NVIDIA cards.
Originally posted by: apoppin
they don't look ANY different . . . what a joke!
:thumbsdown:
nvidia's gfx are NOT 'better' using SM3.0 in PF . . . and ATi has a 'legal workaround' for VTF
Originally posted by: Wreckage
Originally posted by: apoppin
they don't look ANY different . . . what a joke!
:thumbsdown:
nvidia's gfx are NOT 'better' using SM3.0 in PF . . . and ATi has a 'legal workaround' for VTF
http://www.4gamer.net/specials/2005-2006_nvidia/img/05.jpg
Only took me 5 seconds to find that and prove you wrong.
Originally posted by: apoppin
damn dp
well, i guess it is now useful
Pacific Fighters VTF follow up
just for you . . . enjoy!
"Don't know why we bother. If he loses an argument he just fades to black for a day. I remember him bashing Crossfire and I provided my results -- he didn't even address them."
What would there have been to address? Would he say anything that would really surprise you?
Originally posted by: tuteja1986
I think he means the drivers as in the stupid CCC
Anyway, Richard Huddy confirmed that the X1000 generation of R520-based cards and the derived R530 and RV515 cannot directly support vertex texture fetch (VTF). ATI decided to treat it as an optional feature of Shader Model 3.0. I guess ATI would support it if it could, but it found another way around it.
ATI supports a feature called Render to Vertex Buffer (R2VB), and it allows developers to do anything that they would want from VTF. "In its simplest form you can see that R2VB is capable of everything that VTF can do, because you could trivially copy the vertex texture data into the vertex stream using R2VB," he added.
He believes this is better than just doing vertex texture fetch. Render to Vertex Buffer is built on the pixel shader, and therefore you get all of the pixel shader's texturing capabilities.
When Nvidia does vertex texture fetch it is limited to isotropic point sampling of only two texture formats. With ATI's Render to Vertex Buffer you can anisotropically and trilinearly filter any texture format that ATI supports. ATI believes that Render to Vertex Buffer is simply a more flexible and better way to solve the same problem, and more.
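To make the VTF vs. R2VB contrast described above a bit more concrete, here is a minimal CPU-side Python sketch. It is not real shader code: the heightmap values, function names and vertex counts are invented for illustration, and a real implementation would do this in HLSL/GLSL on the GPU. It only mirrors the data flow being described: VTF point-samples a texture per vertex, while R2VB produces filtered values through the pixel pipeline and feeds them back in as ordinary vertex data.

```python
# Conceptual CPU-side sketch of the two techniques described above.
# Everything here (heightmap values, function names) is invented for
# illustration; real code would run in the vertex/pixel shaders on the GPU.

HEIGHTMAP = [0.0, 0.4, 0.9, 0.3, -0.5]   # a tiny 1-D "displacement texture"


def vtf_displace(vertex_u: float) -> float:
    """VTF-style: the vertex stage point-samples the texture directly.
    Mirrors the limitation quoted above: nearest-texel sampling only."""
    texel = int(vertex_u * (len(HEIGHTMAP) - 1) + 0.5)   # nearest texel
    return HEIGHTMAP[texel]


def r2vb_displace(num_vertices: int) -> list:
    """R2VB-style: a pixel-shader-like pass writes *filtered* displacements
    into a buffer, which is then streamed back in as per-vertex data."""
    out = []
    for i in range(num_vertices):
        u = i / (num_vertices - 1) * (len(HEIGHTMAP) - 1)
        lo = int(u)
        hi = min(lo + 1, len(HEIGHTMAP) - 1)
        frac = u - lo
        # linear filtering, which the pixel pipeline provides for free
        out.append(HEIGHTMAP[lo] * (1.0 - frac) + HEIGHTMAP[hi] * frac)
    return out


if __name__ == "__main__":
    n = 8
    verts = [i / (n - 1) for i in range(n)]
    print("VTF  (point sampled):", [round(vtf_displace(u), 2) for u in verts])
    print("R2VB (filtered):     ", [round(h, 2) for h in r2vb_displace(n)])
```

The point of the sketch is only the data flow: in the VTF path the vertex stage reads the texture itself, while in the R2VB path the displacement is produced by the more capable pixel side and then consumed as an ordinary vertex stream, which is why Huddy can claim R2VB covers everything VTF does.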
Originally posted by: redbox
Nvidia's default IQ settings, however, affect just about everyone who plays games on a PC. It would make much more sense for review sites to bench both cards at the same IQ levels.
Originally posted by: redbox
even Rollo said it is a dull game.
Originally posted by: schneiderguy
Originally posted by: redbox
Nvidia's default IQ settings, however, affect just about everyone who plays games on a PC. It would make much more sense for review sites to bench both cards at the same IQ levels.
just take 5% off of any nvidia card's score. problem solved
Originally posted by: 5150Joker
Originally posted by: BFG10K
Now that I've knocked nVidia's shimmering I feel it only fair to knock ATi's driver quality which has been on a steady downward trend.
The latest 6.8 drivers are absolutely heinous on my X800 XL.
ATi's drivers have been crappy for me since 6.2