
A major breakthrough for ATI Cat 5.11: Doom3 performance increases up to 35%

Obviously if this is true it's good news for all.

I'll wait until I see more on it to comment. I remember the black>brown Humus tweaks last year, I want to see this in drivers and critiqued by more reviewers.

Certainly good news for ATI buyers if true. Would be nice if they would actually put a few X1800XTs up for sale this year too.
 
I hope this improves all OpenGL games rather than just Doom 3.

It is quite possible that if ATI rewrites and increases their OpenGL performance considerably, they may well be sacrificing their Direct3d performance.
ATi could compile an empty OpenGL driver and it would still have no effect on Direct3D performance because the two are completely separate.

I don't see anyone installing separate OGL and D3D drivers.
Then look harder - atiogl.dll is the OpenGL driver while ATi2mag.dll is the DirectDraw/Direct3D one. The two files are completely separate.

So does this mean that ATI's OGL team is just composed of poor programmers?
Possibly. It could also be that ATi doesn't focus on OpenGL as much as they do on Direct3D. Whatever the case, an OpenGL rewrite has been hinted at for quite a while but hasn't yet materialized.
 
I have two theories about ATI OpenGL:

1. That nVidia, having been formed by ex-SGI people, just has a better handle on OpenGL.

2. That ATI doesn't care much about OpenGL so they've made the business decision to focus on their strength- D3d performance. (sort of like Crossfire, they feel it's not as big of a chunk of the market, so they focus elsewhere)

 
Originally posted by: SickBeast
Originally posted by: crazydingo
So how much die space does the new memory controller take up, anybody know?

I think I read it takes up over 50% of the die. :Q
Are you sure?

Originally posted by: keysplayr2003
Originally posted by: crazydingo
So how much die space does the new memory controller take up, anybody know?

Who would want to know such a thing? 😀
Curious minds. 😛

If it's a big investment, then it means ATI is betting big on it to provide these kinds of boosts.
 
Originally posted by: Amuro
Big deal beating a 7800 GTX reference card by a small margin... Nobody's running their GTX @430/1200.

I'm running mine at 430/1200?

You have a point though, it's interesting how all these X1800XT benches use 430/1200 cards when there are several much faster GTXs - EVGA KO, Asus, Leadtek, and XFX to name a few.

I wonder why that is? You'd think they could give us some of the 490/1300 cards in benches given that they're readily available and the X1800XT is still MIA?
 
It's a moot point until they release the XT, Rollo - as you well know.

But stock vs stock is a legitimate review. Plus ANY Ati card beating its comparable nvidia card at Doom III is a big deal.

That being said I'm feeling more and more certain that as November rolls around we'll have plenty of XTs available for review.
 
Originally posted by: Amuro
Big deal beating a 7800 GTX reference card by a small margin...
Good performance by ATI in OpenGL isn't a big deal? :roll: I guess that is one way of beating down this news. 😀
 
I never understood why ATI paid little attention to, or even at first ignored, OpenGL. When the old 4MB Rage cards were out they had the ability to run OpenGL, but they chose not to and left the market open for 3dfx to own almost the first 2 years of 3D acceleration. I'm glad they're taking a more active view on OpenGL, especially if it is in time for Quake 4.
 
Originally posted by: Rollo
I'm running mine at 430/1200?

You're in a minority. Crank it up! 😀

You have a point though, it's interesting how all these X1800XT benches use 430/1200 cards when there are several much faster GTXs - EVGA KO, Asus, Leadtek, and XFX to name a few.

I wonder why that is? You'd think they could give us some of the 490/1300 cards in benches given that they're readily available and the X1800XT is still MIA?

Probably because while a select group of manufacturers (mainly BFG, eVGA, and XFX) sell factory overclocked GTXs, there are lots of vanilla 430/1200 cards out there as well.

And of course, comparing a stock X1800XT to a factory overclocked GTX might come off as a bit unfair 😀
 
Originally posted by: keysplayr2003
Originally posted by: jasonja
Originally posted by: keysplayr2003
Originally posted by: ZobarStyl
This looks a lot more like a bandaid on Doom3 than a fix for their truly sad OpenGL performance. If your summation is correct, it really only affects AA performance, which doesn't change the fact that even base performance (no AA/AF) in OpenGL is pretty much crap on ATi cards. Whatever happened to the full OGL rewrite?

Edit: like dug said, "up to 35%" typically means they tried it at the most inanely high setting and went from 15 to 20 fps. Grain of salt.

It is quite possible that if ATI rewrites and increases their OpenGL performance considerably, they may well be sacrificing their Direct3d performance. Maybe this is the reason they are just refusing to do it.

How so? The OGL and D3D teams are completely separate.

I don't see anyone installing separate OGL and D3D drivers. The "teams" you mentioned may be separate, but not completely. The end result is, you only install one driver that fits all. I don't disagree with you. Just have another perspective.

They sure do! You don't get one driver when you install the Catalysts (or ANY video card driver). There's a 2D driver, a D3D driver, an OGL driver, a multimedia driver, a miniport driver, and many more. Just because you click setup and it does the work doesn't mean it's all one driver. I just counted them up and there are 19 DLLs in my system32 directory that are ATI drivers. I have the AIW X800 so I probably have a few more than the average Radeon owner, but for sure everyone has at least 5.

In general the teams are separate... they may share ideas, but an OpenGL driver is completely different from a D3D driver. OpenGL drivers sit in user mode, while D3D drivers are in kernel mode. Two very different worlds of operation for drivers.
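As a rough illustration of the "many separate driver files" point above, here's a minimal sketch in Python. It assumes a stock Windows install path and that Catalyst components follow the usual "ati*" filename convention - both assumptions for illustration, not something the installer guarantees:

```python
from pathlib import Path

# Windows system directory where display driver DLLs typically land.
# Assumption: a standard install path and "ati*" naming convention.
SYSTEM32 = Path(r"C:\Windows\System32")

def list_ati_driver_dlls(directory: Path = SYSTEM32) -> list[str]:
    """Return the names of DLLs that look like ATI driver components."""
    return sorted(p.name for p in directory.glob("ati*.dll"))

if __name__ == "__main__":
    dlls = list_ati_driver_dlls()
    print(f"Found {len(dlls)} ATI driver DLLs:")
    for name in dlls:
        print(" ", name)  # e.g. the OpenGL ICD vs. the D3D/display components
```

On a non-ATI box this just prints zero files; the point is only that one "setup" click drops many independent driver DLLs.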
 
Originally posted by: Paratus
It's a moot point until they release the XT, Rollo - as you well know.

But stock vs stock is a legitimate review. Plus ANY Ati card beating its comparable nvidia card at Doom III is a big deal.

That being said I'm feeling more and more certain that as November rolls around we'll have plenty of XTs available for review.

That's sort of my point - the cards clocked at 490/1300 are at factory stock speed for their OEMs, and we never see them in X1800XT comparisons.


 
Originally posted by: Pabster
Originally posted by: Rollo
I'm running mine at 430/1200?

You're in a minority. Crank it up! 😀
I don't OC high-end parts; I don't feel the gains are worth the risk. Also, I have SLI and am already getting more benefit than any non-phase-change OC would provide.


You have a point though, it's interesting how all these X1800XT benches use 430/1200 cards when there are several much faster GTXs - EVGA KO, Asus, Leadtek, and XFX to name a few.

I wonder why that is? You'd think they could give us some of the 490/1300 cards in benches given that they're readily available and the X1800XT is still MIA?

Probably because while a select group of manufacturers (mainly BFG, eVGA, and XFX) sell factory overclocked GTXs, there are lots of vanilla 430/1200 cards out there as well.

And of course, comparing a stock X1800XT to a factory overclocked GTX might come off as a bit unfair 😀
I don't know why that would be unfair; people use BFG cards for reviews all the time, and they're all factory OC'd?
 
It's to benchmark reference card vs. reference card. There's no reason that ATI's partners couldn't release an X1800XT that was overclocked. Judging by the overclocks some people are getting with the X1800XLs, it seems likely that we could see this.

I actually think this will become more common since these partners are trying very hard to separate their cards from the others.
 
The new memory architecture takes up a lot of die space (it's the big thing in the middle, left of center).

A couple people at B3D have recently explained that the reason ATI does so poorly in D3 is because ATI's hier-z falls flat with Carmack's shadow algorithm, so ATI cards end up rendering a lot of unnecessary pixels. Other OGL titles, like Bioware's games, were probably (obviously, in some cases) written with nV in mind. So it may not all be ATI incompetence, but obviously early inattention by ATI (and 3dfx, etc.) was one factor that helped nV gain OGL preeminence. I suppose you could say ATI didn't pay attn to OGL b/c their employee make-up wasn't as much SGI or they didn't deem it as important, but Carmack was pretty close to God back in the early 3D days. I'd be surprised if they didn't spend some time on his games (which basically means OGL games in general). Nowadays, the proportion--or at least the popularity--of OGL is much lower, but you'd think Linux and the workstation market would compel them to put some ppl on it (which apparently they have, per the annual rumor, although the "rewrite" may not be as radical or as tangible as we may hope).
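For anyone curious why Carmack's shadow algorithm defeats hier-z: Doom 3 draws stencil shadow volumes with the depth-fail ("Carmack's reverse") technique, where the interesting stencil updates happen exactly when the depth test fails - the very pixels hierarchical-Z hardware is built to reject early. Here's a minimal sketch of the two stencil passes in Python with PyOpenGL (an active GL context is assumed, and draw_shadow_volumes is a hypothetical callback, not id's actual code):

```python
from OpenGL.GL import (
    GL_ALWAYS, GL_BACK, GL_CULL_FACE, GL_DECR_WRAP, GL_DEPTH_TEST,
    GL_FALSE, GL_FRONT, GL_INCR_WRAP, GL_KEEP, GL_LESS,
    GL_STENCIL_TEST, glColorMask, glCullFace, glDepthFunc, glDepthMask,
    glEnable, glStencilFunc, glStencilOp,
)

def stencil_shadow_zfail_pass(draw_shadow_volumes):
    """Depth-fail ("Carmack's reverse") stencil shadow passes.

    draw_shadow_volumes is a hypothetical callback that submits the
    shadow-volume geometry; an active GL context is assumed.
    """
    glEnable(GL_DEPTH_TEST)
    glEnable(GL_STENCIL_TEST)
    glEnable(GL_CULL_FACE)

    # Don't touch color or depth; only the stencil buffer is updated.
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE)
    glDepthMask(GL_FALSE)
    glDepthFunc(GL_LESS)
    glStencilFunc(GL_ALWAYS, 0, 0xFF)

    # Pass 1: back faces of the volumes increment stencil where the
    # DEPTH TEST FAILS. These depth-fail updates are what hurt
    # hierarchical-Z: the hardware must still process pixels it would
    # normally cull early.
    glCullFace(GL_FRONT)
    glStencilOp(GL_KEEP, GL_INCR_WRAP, GL_KEEP)
    draw_shadow_volumes()

    # Pass 2: front faces decrement stencil, again on depth fail.
    glCullFace(GL_BACK)
    glStencilOp(GL_KEEP, GL_DECR_WRAP, GL_KEEP)
    draw_shadow_volumes()

    # Pixels left with stencil == 0 are lit; the lighting pass then
    # restricts drawing to them with a GL_EQUAL stencil test.
```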

But it's not that much of a surprise to see an ATI card with a similar fillrate and hugely greater bandwidth compete with nV with AA+AF. I'd like to know what's up with Riddick, though. Given the manufacturing date of the review cores (late Sept and early Oct) and the apparently semi-trivial work on the MC that resulted in such huge gains, I guess ATI has some general tweaking to do with R520. The re-reviews comparing a 512MB X1800XT to a 512MB 7800GTX/Ultra should be even more interesting than I expected. 🙂
 
Originally posted by: Pete
The new memory architecture takes up a lot of die space (it's the big thing in the middle, left of center).

A couple people at B3D have recently explained that the reason ATI does so poorly in D3 is because ATI's hier-z falls flat with Carmack's shadow algorithm, so ATI cards end up rendering a lot of unnecessary pixels.

That sounds plausible because I heard Doom3 is designed with UltraShadow in mind (i.e. a heavy stencil shadow load that NV40+ cards are designed to handle well). It seems that he didn't spend nearly as much time coding it for ATI's architecture; 'UltraShadow or bust' if you will.

This may be a touchy subject for some (for those it is touchy for: get over it, you fanboys 😉 ), but throughout the Doom3 development process, I basically heard that Carmack used Nvidia cards as a reference point, and UltraShadow was a sort of give and take between him and NV - he planned on exploiting the feature heavily, and in return Nvidia emphasized it even more with UltraShadow II on the NV40 series.

This is in contrast to Valve, who wrote excellent DirectX 7, 8, and 9 paths for Half-Life 2, allowing multiple generations of cards to perform well in Source games. Despite Valve's "ATI bias," Nvidia cards perform quite well on Source titles due to this more balanced approach - the 7800GTX is basically the #1 GPU for Source these days, but good performance is all around, with no apparent chasm like with Doom3.

Other OGL titles, like Bioware's games, were probably (obviously, in some cases) written with nV in mind. So it may not all be ATI incompetence, but obviously early inattention by ATI (and 3dfx, etc.) was one factor that helped nV gain OGL preeminence. I suppose you could say ATI didn't pay attn to OGL b/c their employee make-up wasn't as much SGI or they didn't deem it as important, but Carmack was pretty close to God back in the early 3D days. I'd be surprised if they didn't spend some time on his games (which basically means OGL games in general). Nowadays, the proportion--or at least the popularity--of OGL is much lower, but you'd think Linux and the workstation market would compel them to put some ppl on it (which apparently they have, per the annual rumor, although the "rewrite" may not be as radical or as tangible as we may hope).

But it's not that much of a surprise to see an ATI card with a similar fillrate and hugely greater bandwidth compete with nV with AA+AF. I'd like to know what's up with Riddick, though. Given the manufacturing date of the review cores (late Sept and early Oct) and the apparently semi-trivial work on the MC that resulted in such huge gains, I guess ATI has some general tweaking to do with R520. The re-reviews comparing a 512MB X1800XT to a 512MB 7800GTX/Ultra should be even more interesting than I expected. 🙂

I just think that (as has been mentioned before) ATI doesn't put much of an emphasis on OpenGL at all, either in their driver team or in their product. Their bread and butter is D3D, and they are right to focus on that with the vast majority of games being D3D titles. But there's no question that there is some serious room for improvement in OpenGL performance, and they have made the issue even more sensitive by promising a completely rewritten OpenGL driver over the past couple of years, which has never materialized...

Regarding Riddick, I don't know exactly why ATI gets such abysmal performance, but I suspect their OpenGL drivers aren't helping the issue much and are mostly to blame. I could be wrong, though, as I'm not as technically versed as many others (especially lots of the Beyond3D regulars).

---------------

This ATI "fix" from Hexus sounds ok, but their results show nowhere near 35% improvements at any resolution; the best I can see is around 20% for a couple, and the results look sketchy. "Ultra quality" supposedly uses up to 512MB of RAM, meaning the NV cards may be unfairly penalized here since performance may be substantially better @ High Quality.

The results @ Hexus just don't look right - Nvidia usually has an even bigger lead in Riddick and Doom3; those are ATI's Achilles' heel.

I'd like to see an Anandtech/Xbit comparison running HQ and Ultra Q with the final 5.11s before I make any judgements. I suspect that ATI's performance will improve, but I doubt this will be their "magic pill," and I predict Nvidia will retain dominance in Riddick and Doom3 at the end of the day.


Finally I have a question: is this fix supposed to be for the X1xxx series only, or does it apply to previous ATI cards as well?
 
The fact that ATi are able to find 35% worth of improvement in a game as old as Doom3 speaks volumes about their (lack of) driver quality in general and OpenGL quality in particular. It's all the more ironic that one ATi engineer goes by the handle "OpenGLguy" in public forums (he also worked at S3 on the Savage series)...
 
I suppose you could say ATI didn't pay attn to OGL b/c their employee make-up wasn't as much SGI or they didn't deem it as important,

It's funny, though, since ATI's CEO Dave Orton was formerly senior management at SGI. He and his cronies left SGI to form ArtX, and then moved on to senior positions at ATI after ATI bought ArtX. Makes me wonder if there may be some disdain for SGI and perhaps OpenGL? It sure seems like they should know their way around OpenGL, as Orton holds several patents himself in graphics and visual technology, and his peers from SGI were top guys.
 