
X-bit labs update their Doom III Benchmarks with Cats 4.9Beta

nVidia fan boy Rollo:
ATi has a new driver. Let's be sure to spam the video forum with the usual useless anti-ATi drivel.

There's nothing untrue about what I said, Old AgentMulder.

ATI cheated on a driver for the 8500 to make it bench Quake3 better. They stopped when they were caught.

ATI didn't provide full trilinear filtering, and didn't tell anyone they weren't providing it. They even went so far as to state that their method is what trilinear filtering should be (even though it's obviously not).

So pardon my skepticism when they come up with 12-15% performance increases for Doom3 a couple days after its release. Where do you think the performance came from? Radical new revelations about how their two-year-old core works, that they just happened to figure out a couple days after Doom3 came out? Uh huh.


Old Fart, my wife will be selling her car within a year. I want to offer it to you first, and give you a real sweet deal......
 
Originally posted by: Rollo
nVidia fan boy Rollo:
ATi has a new driver. Let's be sure to spam the video forum with the usual useless anti-ATi drivel.

There's nothing untrue about what I said, Old AgentMulder.

ATI cheated on a driver for the 8500 to make it bench Quake3 better. They stopped when they were caught.

ATI didn't provide full trilinear filtering, and didn't tell anyone they weren't providing it. They even went so far as to state that their method is what trilinear filtering should be (even though it's obviously not).

So pardon my skepticism when they come up with 12-15% performance increases for Doom3 a couple days after its release. Where do you think the performance came from? Radical new revelations about how their two-year-old core works, that they just happened to figure out a couple days after Doom3 came out? Uh huh.


Old Fart, my wife will be selling her car within a year. I want to offer it to you first, and give you a real sweet deal......

The most likely explanation for ATi suddenly gaining 12-15% with this new driver is simple. They had not touched their OpenGL drivers in over a year. When Doom3 came along, they scrambled and assigned a huge team of programmers to OpenGL.

Honestly, performance in ATi's GL drivers has not gone up in over a year. This has been discussed elsewhere on the internet. I can see why you're skeptical (8500 Quack3 thing) but from what I can tell these drivers are legit.
 
The most likely explanation for ATi suddenly gaining 12-15% with this new driver is simple. They had not touched their OpenGL drivers in over a year. When Doom3 came along, they scrambled and assigned a huge team of programmers to OpenGL.
ATI has been getting beat in OpenGL for many moons, the only exception I know of being the nine months between the 9700P and the 5900U.

Honestly, performance in ATi's GL drivers has not gone up in over a year. This has been discussed elsewhere on the internet. I can see why you're skeptical (8500 Quack3 thing) but from what I can tell these drivers are legit.
I'll wait for some review sites to dissect them. When you're playing, it's hard to say, "Hmmm, that texture looks 10% less detailed."

We need some magnified sections of screenshots, comparison of the mip bands, scrutiny of the drivers, etc. to make a call on these.
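The kind of screenshot comparison suggested above can be done crudely in code. As a sketch (not any reviewer's actual tool), the function below diffs two same-sized grayscale frames, represented here as plain 2D lists of 0-255 values, and reports how many pixels differ beyond a threshold; a big jump in that fraction between two driver versions at identical settings is the kind of red flag reviewers look for. The threshold and frame representation are illustrative assumptions.

```python
def frame_diff_stats(frame_a, frame_b, threshold=8):
    """Compare two same-sized grayscale frames (2D lists of 0-255 ints).

    Returns (max_abs_diff, fraction_of_pixels_over_threshold): a crude
    signal for filtering/LOD differences between two driver versions.
    """
    assert len(frame_a) == len(frame_b) and len(frame_a[0]) == len(frame_b[0])
    max_diff = 0
    flagged = 0
    total = 0
    for row_a, row_b in zip(frame_a, frame_b):
        for pa, pb in zip(row_a, row_b):
            d = abs(pa - pb)          # per-pixel absolute difference
            max_diff = max(max_diff, d)
            if d > threshold:         # count visibly different pixels
                flagged += 1
            total += 1
    return max_diff, flagged / total
```

Two identical frames return `(0, 0.0)`; real use would load screenshots into such arrays and also inspect mip-band transitions, which this simple per-pixel diff doesn't capture.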
 
Why don't we simply ask the Nvidia driver team to take a look at the new 4.9 Cats? I mean, Nvidia has done more (*cough*)optimizing(*cough*) than all the rest of the manufacturers combined. They should be able to spot anything fishy in no time.
 
Originally posted by: SickBeast
I'm pretty sure ATi doesn't want nVidia to see the source code of their drivers. 😛


Maybe if they said, "Canadian buddies, it saddens us to see your less than expected performance at Doom3. As Doom3 will be part of all reviewers' benchmarks for the next few years, we'd like to help you achieve the results we have, and establish a parity in the marketplace."
 
Originally posted by: Rollo
nVidia fan boy Rollo:
ATi has a new driver. Let's be sure to spam the video forum with the usual useless anti-ATi drivel.

There's nothing untrue about what I said, Old AgentMulder.

ATI cheated on a driver for the 8500 to make it bench Quake3 better. They stopped when they were caught.

ATI didn't provide full trilinear filtering, and didn't tell anyone they weren't providing it. They even went so far as to state that their method is what trilinear filtering should be (even though it's obviously not).

So pardon my skepticism when they come up with 12-15% performance increases for Doom3 a couple days after its release. Where do you think the performance came from? Radical new revelations about how their two-year-old core works, that they just happened to figure out a couple days after Doom3 came out? Uh huh.


Old Fart, my wife will be selling her car within a year. I want to offer it to you first, and give you a real sweet deal......
And we all know how squeaky clean nVidia has been. Never once have they ever had a questionable optimization. Never. If they did, you would be just as quick to criticize them for it. You are so unbiased after all.

Not looking for a car, but thanks for the offer anyway.
 
Originally posted by: Rollo
Originally posted by: SickBeast
I'm pretty sure ATi doesn't want nVidia to see the source code of their drivers. 😛

Maybe if they said, "Canadian buddies, it saddens us to see your less than expected performance at Doom3. As Doom3 will be part of all reviewers' benchmarks for the next few years, we'd like to help you achieve the results we have, and establish a parity in the marketplace."

I was gonna say something about beer but I changed my mind. I have a lousy "Canadian" beer sitting in front of me that's no longer Canadian. 🙂

Needless to say, nVidia would never do that. The more likely scenario would be nVidia scouring through ATi's drivers and reverse-engineering a 6900GT w/ X800 features, and then launching a hostile takeover of ATi once they fully PWNED them. 😉
 
Originally posted by: oldfart
Originally posted by: Rollo
nVidia fan boy Rollo:
ATi has a new driver. Let's be sure to spam the video forum with the usual useless anti-ATi drivel.

There's nothing untrue about what I said, Old AgentMulder.

ATI cheated on a driver for the 8500 to make it bench Quake3 better. They stopped when they were caught.

ATI didn't provide full trilinear filtering, and didn't tell anyone they weren't providing it. They even went so far as to state that their method is what trilinear filtering should be (even though it's obviously not).

So pardon my skepticism when they come up with 12-15% performance increases for Doom3 a couple days after its release. Where do you think the performance came from? Radical new revelations about how their two-year-old core works, that they just happened to figure out a couple days after Doom3 came out? Uh huh.


Old Fart, my wife will be selling her car within a year. I want to offer it to you first, and give you a real sweet deal......
And we all know how squeaky clean nVidia has been. Never once have they ever had a questionable optimization. Never. If they did, you would be just as quick to criticize them for it. You are so unbiased after all.

Not looking for a car, but thanks for the offer anyway.

Where did I say nVidia never optimized? I don't remember lying like that.

I don't know why anyone wouldn't look at these new drivers with a critical eye. The performance increase is substantial, and even on Rage3D many have stated that Doom3 may well be a hardware limitation that will have to be lived with.
 
No, but you didn't jump all over it like you do with ATi either. In fact, you either said nothing or defended nVidia's optimizations.

From the benches I've seen, the gain is OK, but not enough IMHO. It's clear ATi needs to get their OpenGL act together. Not just a patch for one game. They have always been behind on OpenGL performance.

I won't jump on them or nVidia assuming the worst. Why not just wait for the reviewers and users to get the real information instead of making negative assumptions? It IS possible to have a performance increase without doing anything funny, right?

XBit's conclusion was:
ATI's new beta CATALYST 4.9 drivers did not bring the Doom III performance crown to ATI's RADEON X800 XT hardware, even though it also did not degrade image quality. Now those who already own, or only plan to acquire, ATI's latest graphics card should probably either hope for totally re-done ATI OpenGL drivers, or pay attention to the more sophisticated GeForce 6800 lineup.
One review, which isn't absolute, but I don't see any wrongdoing here.
 
One review, which isn't absolute, but I don't see any wrongdoing here.
Nobody saw their brilinear either until that German website pointed it out.

I'm just very skeptical of the BS "Doom 3 was ATI's wake-up call! They know they have to make better drivers now and have put their best scientists to work on it! They have just made a breakthrough in understanding how their two-year-old core works with OpenGL!"

BS. They've been getting spanked in OpenGL by nVidia for YEARS. The release of this game didn't prompt some think tank on the order of a Martian invasion with revolutionary new insights into the way old hardware works.

If this happened with the nV40, I might believe it, because it's a new core that probably will see 10% gains on subsequent driver revisions.

In two years, if we get big gains on the nV40 in OpenGL, I'll guess then that they came at the expense of IQ as well, unless they release some document that shows how their Z culling didn't work with the game and they fixed it, or something like that.
 
Actually, I think Dave at B3D pointed it out first in his forums, but 3DC was the first to make an article out of it.

As to ATi's D3 gains, I think their relatively low Wolf:ET performance shows they have room for improvement in their OGL driver. Dunno what the bottleneck in D3 is, though (drivers or hardware). We'll know for sure in a few months, I guess.
 
Problem is, ATI lags in other areas, not only this. Compatibility was ALWAYS an issue for me with my 9800Pro. Some other games would suddenly "break" with a new driver from ATI, which is totally unacceptable and tells me they still have bad drivers. I would have to swap drivers on my 9800Pro almost 7-10 times a week depending on what games I was playing.

So back to Nvidia with me, I've had my fill of ATI... Never again.
 
LOL this is funny.

I've been finding out that D3's engine is not very mathematical in how it organizes its textures, so it uses more lookups and other non-math calculations for texturing, like bump mapping.

This in fact helps the old NV architecture as well as the new architecture like the 6800; the ATI cards are well known for calculating math instructions faster and more efficiently than the NV cards. This happens to help the NV cards run much faster, as they don't have to calculate massive amounts of math.

ATi's cards, on the other hand, still have to go through that process because of the way they're built.

The thing I find most exciting is finding out that the D3 engine has a SEPARATE render and math path specifically for the NV3x architecture. Hmm, so I ask: why couldn't they have this for the ATi architecture?

And people have now found that you can change some properties within some files in D3 so it works out more of the textures in math rather than through the "specific" NV pathways.

And this is what I think the driver did in some way. It could have unlocked the card so it doesn't always have to do the lookups, which weren't using much of its processing power; with the D3 engine, the NV hardware is doing 50% more work than the ATI, which means the engine doesn't use most of what ATI has to offer. So in turn, the drivers may be reconfiguring and sorting everything out so that more of the hardware is used to run the engine...

I would also like to add this link about why ATi might be doing badly in the Doom3 engine: the different use of Z-buffering and colour information. It looks like it can hopefully be sorted out somehow, or maybe not; we don't know...
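The lookup-versus-math tradeoff described above can be sketched in plain Python. The idea is that an engine can bake an expensive function (say, a specular falloff) into a table once, so the shader does a cheap fetch per pixel instead of math, while hardware that is strong at math can just evaluate the function directly; either path produces nearly the same number. The exponent and table size below are illustrative placeholders, not Doom3's actual values.

```python
def make_specular_lut(exponent=16, size=256):
    """Precompute pow(x, exponent) for x in [0, 1] at `size` samples.

    Mimics the lookup-texture approach: pay the math cost once up front,
    then answer every later query with a cheap table fetch.
    """
    return [(i / (size - 1)) ** exponent for i in range(size)]

def specular_via_lut(lut, x):
    """Fetch the nearest table entry for x in [0, 1] (no interpolation)."""
    i = round(x * (len(lut) - 1))
    return lut[i]

def specular_via_math(x, exponent=16):
    """Evaluate the same falloff directly, as a 'math path' would."""
    return x ** exponent
```

Because both paths agree to within the table's quantization error, a driver or engine tweak can swap one for the other without visibly changing the image, which is why such a swap is hard to spot without careful IQ comparisons.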
 