Originally posted by: Sylvanas
It's great seeing ATi bring out a special Doom driver for us all, what customer service! I don't know any other company that would be as good to us as ATi. Well done!
Originally posted by: firerock
It has been said many times all over the forum: if you're going to spend your money on just one game's benchmark, you are crazy! Either of them will do your money justice, and if you prefer OpenGL, get nVidia; if you prefer D3D, get ATi.
Originally posted by: Rollo
I'll be interested to see if these are application-specific "optimizations" and if they come at the cost of reduced texture quality.
Quack quack quack.
Oh, I forgot. These days they're called "adaptive" and the reduced IQ is "what the IQ really should be."
Originally posted by: oldfart
I wonder if they would stoop so low as to use replacement shaders or static clip planes. Those, application detection, and Brilinear/adaptive trilinear filtering are nothing new. The only difference is that it's OK when nVidia does it, but a horror if ATi does.
Originally posted by: nemesismk2
The difference is that with nVidia you can disable a lot of the optimizations, whereas with ATi you are stuck with them whether you want them or not! :evil:
I can't say I have noticed any application detection with my 6800 GT; I have renamed a few game exe files, for example, and they didn't decrease in performance.
Originally posted by: oldfart
NOW you can disable Brilinear. When it first came out, you couldn't. Yes, ATi needs to put that in their drivers as an option, like nVidia did.
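For anyone who wants to repeat nemesismk2's renamed-exe check for application detection, here is a minimal sketch of the idea. It copies the game binary to a neutral filename (safer than renaming it in place) and runs the same timedemo under both names so the reported FPS can be compared by hand; the install path, the copy's filename, and the timedemo arguments are assumptions for a typical Doom 3 setup, not details taken from this thread.

```python
"""
Rough sketch of the renamed-exe test for driver application detection.
All paths, the copy's filename, and the command-line arguments below are
assumptions for a typical Doom 3 install; adjust them for your own setup.
"""
import shutil
import subprocess
from pathlib import Path

GAME_DIR = Path(r"C:\Games\Doom3")        # assumed install location
REAL_EXE = "doom3.exe"                    # name a driver profile might match on
NEUTRAL_EXE = "benchmark_test.exe"        # arbitrary name with no driver profile
TIMEDEMO_ARGS = ["+timedemo", "demo1"]    # stock Doom 3 benchmark demo (assumed)


def run_once(exe_name: str) -> None:
    """Launch one timedemo pass with the given executable and wait for it."""
    exe_path = GAME_DIR / exe_name
    print(f"Running timedemo with {exe_name} ... note the FPS it reports.")
    subprocess.run([str(exe_path), *TIMEDEMO_ARGS], cwd=GAME_DIR, check=True)


def main() -> None:
    # Make a byte-for-byte copy under a name the driver should not recognize.
    shutil.copy2(GAME_DIR / REAL_EXE, GAME_DIR / NEUTRAL_EXE)

    # Same binary, same demo, same settings; only the filename differs.
    # A meaningful FPS gap between the two runs suggests the driver is
    # detecting the application by name and applying special-case behavior.
    run_once(REAL_EXE)
    run_once(NEUTRAL_EXE)


if __name__ == "__main__":
    main()
```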
Originally posted by: oldfart
I'm not condoning this stuff one way or the other. It's just the usual double standard that I'm commenting on. That, and the assumption that there is something shady going on when it's ATi, but never such a comment when a performance-increasing driver from nVidia comes out.
Double standard.
nVidia wins this round for Doom 3, but I will have to see HL2 benchmarks before we can really crown the winner of this generation of video cards.
Originally posted by: nitromullet
Nice increase... ATi's X800 XT actually beats the GeForce 6800 NU in the AA/AF benchmarks. Nice job, ATi: your $500 MSRP card beat the competitor's $300 MSRP card in Doom 3 in some tests. Then again, I could see how they were caught unaware that Doom 3 was coming out; it really did kinda sneak up on us with all the E3 demos.
Too early to say anything definite, but....
http://www.xbitlabs.com/articles/video/display/graphics-cards-2004_26.html
nVidia fan boy Rollo:
Originally posted by: Rollo
"The XFiles/Journal Agent Old Fart:
Conspiracy is everywhere around me. I think my toaster oven is a device planted by nVidia to monitor my posts. I must expose the works of nVidia Agent Rollo before my neighbor's mail box notifies the mothership in Santa Clara!"
