X-bit Labs update their Doom III benchmarks with Catalyst 4.9 beta

Smilin

Diamond Member
Mar 4, 2002
7,357
0
0
Hm. Always nice to see improvement, but I'm still glad I sold the X800 Pro and got a 6800 GT. Other games may differ, but in Doom 3 ATI is getting pwned.
 

jjyiz28

Platinum Member
Jan 11, 2003
2,901
0
0
Nice. ATI said that with every new Catalyst driver release they will improve OpenGL performance slightly.

So don't expect ATI to ever bring out a single driver that delivers performance on par with the 6800 series all at once.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
If ATi delivered the promised overhauled OpenGL ICD, I'm sure they'd get more performance. The problem is that they've done absolutely nothing about it all this time.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
It's great seeing ATI bring out a special Doom driver for us all. What customer service! I don't know any other company that would be as good to us as ATI. Well done!
 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
Originally posted by: Sylvanas
It's great seeing ATI bring out a special Doom driver for us all. What customer service! I don't know any other company that would be as good to us as ATI. Well done!

Someone please pass me the sick bag. :disgust:

Do you honestly believe what you just said? If ATI had great customer service, they would have done something about their OpenGL performance years ago, not just now that Nvidia is beating them in Doom 3. ATI can see sales slipping through their fingers because of all the bad PR from the Doom 3 articles, which constantly say that the 6800 is the video card for Doom 3! :)

BTW, the 6800 range is STILL the video card for Doom 3!

From that article, the quote I like best is:

"ATI's upcoming 4.9 beta drivers obtained already after the article's first publication deliver a speed bump for ATI's RADEON X800 hardware, however, the additional fps are not enough to outperform any of the GeForce 6800-family graphics processing units."
 

firerock

Senior member
Jun 2, 2004
404
0
0
It has been said many times all over the forum: if you're going to spend your money based on just one game's benchmark, you are crazy! Either of them will do your money justice; if you prefer OpenGL, get Nvidia, and if you prefer D3D, get ATi.
 

imported_tss4

Golden Member
Jun 30, 2004
1,607
0
0
Originally posted by: firerock
It has been said many times all over the forum: if you're going to spend your money based on just one game's benchmark, you are crazy! Either of them will do your money justice; if you prefer OpenGL, get Nvidia, and if you prefer D3D, get ATi.

Not sure I agree with you there. ATI gets pummeled in OpenGL; Nvidia barely gets beaten in D3D. The 6800 series seems to be a better performer across the spectrum of games than ATI's. Just my opinion.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
I'll be interested to see if these are application-specific "optimizations" and if they come at the cost of reduced texture quality.

Quack quack quack.

Oh. I forgot. These days they're called "adaptive" and the reduced IQ is "what the IQ really should be".
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Originally posted by: Rollo
I'll be interested to see if these are application-specific "optimizations" and if they come at the cost of reduced texture quality.

Quack quack quack.

Oh. I forgot. These days they're called "adaptive" and the reduced IQ is "what the IQ really should be".
I wonder if they would stoop so low as to use replacement shaders or static clip planes. Those, along with application detection and brilinear/adaptive trilinear filtering, are nothing new. The only difference is that it's OK when nVidia does it, but a horror if ATi does.
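
For anyone unfamiliar with the term: "brilinear" sits between bilinear and trilinear filtering. True trilinear blends two mip levels across the whole fractional LOD range; brilinear only blends inside a narrow band around each mip transition and snaps to a single mip level elsewhere, saving texture fetches. A minimal sketch of the blend-weight math (the band width here is made up; real drivers tune it per game):

```c
#include <stdio.h>

/* Width of the LOD band around a mip transition where blending still
 * happens. 1.0 would be full trilinear; 0.0 would be pure bilinear.
 * 0.25 is an invented value for illustration only. */
#define BLEND_BAND 0.25

/* Weight given to the next mip level for a fractional LOD in [0, 1).
 * Full trilinear simply returns frac; brilinear snaps to the nearest
 * mip outside a narrow band and only blends inside it, so most samples
 * need one mip level's worth of texture fetches instead of two. */
static double brilinear_weight(double frac)
{
    const double lo = 0.5 - BLEND_BAND / 2.0;
    const double hi = 0.5 + BLEND_BAND / 2.0;

    if (frac <= lo) return 0.0;          /* base mip only */
    if (frac >= hi) return 1.0;          /* next mip only */
    return (frac - lo) / (hi - lo);      /* blend inside the band */
}

int main(void)
{
    for (int i = 0; i <= 8; i++) {
        double frac = i / 8.0;
        printf("lod frac %.3f -> next-mip weight %.3f (trilinear: %.3f)\n",
               frac, brilinear_weight(frac), frac);
    }
    return 0;
}
```

The wider the band, the closer the output is to true trilinear; shrinking it is the quality-for-speed trade that the "adaptive" label glosses over.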
 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
Originally posted by: oldfart
Originally posted by: Rollo
I'll be interested to see if these are application-specific "optimizations" and if they come at the cost of reduced texture quality.

Quack quack quack.

Oh. I forgot. These days they're called "adaptive" and the reduced IQ is "what the IQ really should be".
I wonder if they would stoop so low as to use replacement shaders or static clip planes. Those, along with application detection and brilinear/adaptive trilinear filtering, are nothing new. The only difference is that it's OK when nVidia does it, but a horror if ATi does.

The difference is that with nVidia you can disable a lot of the optimizations; with ATI you are stuck with them whether you want them or not! :evil:

I can't say I have noticed any application detection with my 6800 GT; I have renamed a few game exe files, for example, and they didn't decrease in performance.
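
For what it's worth, the exe-renaming test probes the most common form of application detection: the driver matching the running executable's name against a table of per-game profiles. A minimal sketch of that idea (the profile names and fields here are invented for illustration, not either vendor's actual driver internals):

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical per-game profile table: exe name -> tweaks to apply. */
struct app_profile {
    const char *exe_name;        /* matched against the running process */
    int force_brilinear;         /* reduce filtering quality for speed  */
    int use_replacement_shaders; /* swap in hand-tuned shaders          */
};

static const struct app_profile profiles[] = {
    { "doom3.exe",  1, 1 },
    { "quake3.exe", 1, 0 },
};

/* Returns the matching profile, or NULL for the generic path. */
static const struct app_profile *find_profile(const char *exe_name)
{
    for (size_t i = 0; i < sizeof profiles / sizeof profiles[0]; i++)
        if (strcmp(profiles[i].exe_name, exe_name) == 0)
            return &profiles[i];
    return NULL;
}

int main(void)
{
    /* Renaming doom3.exe to doom4.exe defeats a naive name match. */
    const char *tests[] = { "doom3.exe", "doom4.exe" };
    for (int i = 0; i < 2; i++)
        printf("%s -> %s\n", tests[i],
               find_profile(tests[i]) ? "app-specific profile" : "generic path");
    return 0;
}
```

A rename defeats a naive name match like this one, but detection based on shader fingerprints or file checksums would not be fooled, so a rename test alone can't fully rule application detection out.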
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Originally posted by: nemesismk2
Originally posted by: oldfart
Originally posted by: Rollo
I'll be interested to see if these are application-specific "optimizations" and if they come at the cost of reduced texture quality.

Quack quack quack.

Oh. I forgot. These days they're called "adaptive" and the reduced IQ is "what the IQ really should be".
I wonder if they would stoop so low as to use replacement shaders or static clip planes. Those, along with application detection and brilinear/adaptive trilinear filtering, are nothing new. The only difference is that it's OK when nVidia does it, but a horror if ATi does.

The difference is that with nVidia you can disable a lot of the optimizations; with ATI you are stuck with them whether you want them or not! :evil:

I can't say I have noticed any application detection with my 6800 GT; I have renamed a few game exe files, for example, and they didn't decrease in performance.
NOW you can disable brilinear. When it first came out, you couldn't. Yes, ATi needs to put that in their drivers as an option like nVidia did.

I'm not condoning this stuff one way or the other. It's just the usual double standard that I'm commenting on: the assumption that there is something shady going on when it's ATi, but never such a comment when a performance-increasing driver from nVidia comes out.
Double standard.
 

imported_tss4

Golden Member
Jun 30, 2004
1,607
0
0
Originally posted by: oldfart
Originally posted by: nemesismk2
Originally posted by: oldfart
Originally posted by: Rollo
I'll be interested to see if these are application-specific "optimizations" and if they come at the cost of reduced texture quality.

Quack quack quack.

Oh. I forgot. These days they're called "adaptive" and the reduced IQ is "what the IQ really should be".
I wonder if they would stoop so low as to use replacement shaders or static clip planes. Those, along with application detection and brilinear/adaptive trilinear filtering, are nothing new. The only difference is that it's OK when nVidia does it, but a horror if ATi does.

The difference is that with nVidia you can disable a lot of the optimizations; with ATI you are stuck with them whether you want them or not! :evil:

I can't say I have noticed any application detection with my 6800 GT; I have renamed a few game exe files, for example, and they didn't decrease in performance.
NOW you can disable brilinear. When it first came out, you couldn't. Yes, ATi needs to put that in their drivers as an option like nVidia did.

I'm not condoning this stuff one way or the other. It's just the usual double standard that I'm commenting on: the assumption that there is something shady going on when it's ATi, but never such a comment when a performance-increasing driver from nVidia comes out.
Double standard.

I've only had my 6800 GT a few weeks, and every driver improvement I've seen from nVidia (with the exception of PS 2.0 in Far Cry) has been minimal. I just don't see how you can improve performance 10% or better without incorporating new technology, unless you're doing something that either degrades IQ or "cheats", for lack of a better word. Don't get me wrong, though: I really see nothing wrong with cheating. If a driver can detect a program and optimize for it, I say do it. And I'm not bashing ATI. They make great cards. They're taking a beating right now, but they'll bounce back better than ever. Six months from now I'm sure they'll put something out to shame the nVidia owners, and then the cycle will repeat itself. Competition is good.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: oldfart

I'm not condoning this stuff one way or the other. It's just the usual double standard that I'm commenting on: the assumption that there is something shady going on when it's ATi, but never such a comment when a performance-increasing driver from nVidia comes out.
Double standard.

To be fair, oldfart, the double standard used to go the other way (and rightly so - Nvidia was caught red-handed with optimizations last year). Lately, both companies seem to have reduced their cheaty ways (or at least the public perception of them); neither company has recently been caught making image-quality-degrading optimizations, unlike the Nvidia ones of old.

With that said, it's true that there is an obvious handful here @ AT who love to pounce into every ATI thread and chime in with essentially "I bet ATI is cheating... Nvidia would never do this to us." Ironically, many of these members are longtime AT forum members who were here through the whole Nvidia image-quality debate.

I don't know if it's insecurity or what - Nvidia has won this round so far; rubbing it in is just beating a dead horse, IMO. The X800 XT is too expensive for the masses and the X800 Pro gets clobbered by the GT; until ATI gets their own GT out the door, which will (ideally) make the X800 Pro the competitor to the 6800 NU, they will continue to lose sales to the 6800 series.

It's a bit early, but the 6800 GT thus far is proving to be this round's 9800 Pro, so to speak.
 

stickybytes

Golden Member
Sep 3, 2003
1,043
0
0
Nvidia wins this round for Doom 3, but I will have to see HL2 benchmarks before we can really crown the winner of this generation of video cards.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Nice increase... ATi's X800 XT actually beats the GeForce 6800 NU in the AA/AF benchmarks. Nice job, ATi: your $500 MSRP card beat the competitor's $300 MSRP card in Doom 3 in some tests. Then again, I could see how they were caught unaware that Doom 3 was coming out; it really did kinda sneak up on us with all the E3 demos.

Nvidia wins this round for Doom 3, but I will have to see HL2 benchmarks before we can really crown the winner of this generation of video cards.

Too early to say anything definite, but....
http://www.xbitlabs.com/articles/video/display/graphics-cards-2004_26.html
 

stnicralisk

Golden Member
Jan 18, 2004
1,705
1
0
Originally posted by: nitromullet
Nice increase... ATi's X800 XT actually beats the GeForce 6800 NU in the AA/AF benchmarks. Nice job, ATi: your $500 MSRP card beat the competitor's $300 MSRP card in Doom 3 in some tests. Then again, I could see how they were caught unaware that Doom 3 was coming out; it really did kinda sneak up on us with all the E3 demos.

Nvidia wins this round for Doom 3, but I will have to see HL2 benchmarks before we can really crown the winner of this generation of video cards.

Too early to say anything definite, but....
http://www.xbitlabs.com/articles/video/display/graphics-cards-2004_26.html

Go to the next page and it shows Nvidia owning STALKER.
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Here we go again. It's like a vicious circle...
When HL2 is released and probably takes the performance lead, the fanATIc people will say the same things... :disgust:
I didn't buy my 6800 GT based on Doom III's performance. I would never do that.
Either way you can't go wrong. Both are great this round, based on the vast majority of the benches.
And don't look at benches of unreleased games!
"Highly anticipated DX9 game"? Plz don't make me laugh.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: oldfart
Originally posted by: Rollo
I'll be interested to see if these are application-specific "optimizations" and if they come at the cost of reduced texture quality.

Quack quack quack.

Oh. I forgot. These days they're called "adaptive" and the reduced IQ is "what the IQ really should be".
I wonder if they would stoop so low as to use replacement shaders or static clip planes. Those, along with application detection and brilinear/adaptive trilinear filtering, are nothing new. The only difference is that it's OK when nVidia does it, but a horror if ATi does.

Why not? NV already does, at least for their older cards. Remember what Carmack himself said in that HOCP Doom 3 preview article about older NV3x cards "falling off the fast path"?
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: oldfart
Originally posted by: Rollo
I'll be interested to see if these are application-specific "optimizations" and if they come at the cost of reduced texture quality.

Quack quack quack.

Oh. I forgot. These days they're called "adaptive" and the reduced IQ is "what the IQ really should be".
I wonder if they would stoop so low as to use replacement shaders or static clip planes. Those, along with application detection and brilinear/adaptive trilinear filtering, are nothing new. The only difference is that it's OK when nVidia does it, but a horror if ATi does.

"The XFiles/Journal Agent Old Fart:
Conspiracy is everywhere around me. I think my toaster oven is a device planted by nVidia to monitor my posts. I must expose the works of nVidia Agent Rollo before my neighbor's mail box notifies the mothership in Santa Clara!"
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Originally posted by: Rollo
Originally posted by: oldfart
Originally posted by: Rollo
I'll be interested to see if these are application-specific "optimizations" and if they come at the cost of reduced texture quality.

Quack quack quack.

Oh. I forgot. These days they're called "adaptive" and the reduced IQ is "what the IQ really should be".
I wonder if they would stoop so low as to use replacement shaders or static clip planes. Those, along with application detection and brilinear/adaptive trilinear filtering, are nothing new. The only difference is that it's OK when nVidia does it, but a horror if ATi does.

"The XFiles/Journal Agent Old Fart:
Conspiracy is everywhere around me. I think my toaster oven is a device planted by nVidia to monitor my posts. I must expose the works of nVidia Agent Rollo before my neighbor's mail box notifies the mothership in Santa Clara!"
nVidia fanboy Rollo:
ATi has a new driver. Let's be sure to spam the video forum with the usual useless anti-ATi drivel.