AT's Real World DirectX 10 Performance: It Ain't Pretty

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
Wow, those Call of Juarez framerates are completely pathetic, even worse than the others. Is the game just badly optimized? At least in that screenshot, it looks good but not really any better than existing DX9 games.
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
No, the drivers just suck all around on both sides of the fence. They barely have DX9 games working at close to full speed reliably in Vista; it's going to take a few more months (I hope only a few!) to get DX10 drivers optimized.
 

pcslookout

Lifer
Mar 18, 2007
11,959
157
106
I expected this. I say bring on the push for more powerful hardware! This happened when DX9 first came out as well. The biggest thing to remember is that if you want great playable frame rates at very high resolutions (1600x1200 or above) with 4x FSAA, 16x AF, and all details set to maximum, you always need to wait a while; a video card that can handle all that never comes out the first time a new version of DirectX does. It's always 6 to 9 months later, if not longer. Just look at DX9 for comparison.
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
What I don't get is how old DX9 games run just fine in Vista now, but put DirectX 10 in and the frame rates drop in half... while all the developers are saying "You can do most of the optimizations in games that DX10 does."

So it begs the question: are game developers just bad programmers, or is something else up?

For instance, I get pretty much the same fps in BF2 as I did on XP now (it was terrible with the first drivers). But in Company of Heroes, after the DX10 patch was released my framerate HALVED, with the only noticeable difference being the lighting and more objects on the ground. Is changing the lighting and putting more crap on the ground really worth such terrible performance vs. DX9?
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: imaheadcase
What I don't get is how old DX9 games run just fine in Vista now, but put DirectX 10 in and the frame rates drop in half... while all the developers are saying "You can do most of the optimizations in games that DX10 does."

So it begs the question: are game developers just bad programmers, or is something else up?

For instance, I get pretty much the same fps in BF2 as I did on XP now (it was terrible with the first drivers). But in Company of Heroes, after the DX10 patch was released my framerate HALVED, with the only noticeable difference being the lighting and more objects on the ground. Is changing the lighting and putting more crap on the ground really worth such terrible performance vs. DX9?

It's pretty obviously video card drivers.
 

Fenixgoon

Lifer
Jun 30, 2003
33,160
12,606
136
Originally posted by: imaheadcase
What I don't get is how old DX9 games run just fine in Vista now, but put DirectX 10 in and the frame rates drop in half... while all the developers are saying "You can do most of the optimizations in games that DX10 does."

So it begs the question: are game developers just bad programmers, or is something else up?

For instance, I get pretty much the same fps in BF2 as I did on XP now (it was terrible with the first drivers). But in Company of Heroes, after the DX10 patch was released my framerate HALVED, with the only noticeable difference being the lighting and more objects on the ground. Is changing the lighting and putting more crap on the ground really worth such terrible performance vs. DX9?

Devs are new to DX10, and so are nvidia/ati; both game coding and driver optimization simply will not be there at the debut of a technology like DX10. Give it some more time and then we'll REALLY see what DX10 can do.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
AT's Real World DirectX 10 Performance: It Ain't Pretty

neither is nvidia's ... even their high end struggles through the DX10 paths if you really try to max it at high resolution [i think we are seeing "averages", right?]

CoJ and CoH aren't "full DX10 games" ... they are DX9 games with DX10 features patched in and rewritten ... we read how CoH had to be ReWritten - originally planned and compiled as DX10 - and the devs were crippled by the lack of early HW and SW, so they scaled it way back

... if you run the CoJ benchmark, it explains WHY it runs so much slower than the DX9 pathway ... it does so much more:
The lighting model has been upgraded to be completely per pixel with softer and more shadows. All lights can cast shadows, making night scenes more detailed than on the DX9 version. These shadows are created by generating cube maps on the fly from each light source and using a combination of instancing and geometry shading to create the effect.
...
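to put rough numbers on that "all lights can cast shadows" claim, here's a back-of-envelope sketch (the light count and layout are made up, not from the benchmark): building a shadow cube map the DX9 way means drawing the scene once per cube face, while a DX10 geometry shader can replicate each primitive to all six faces in one submission. note the GPU still shades all six faces either way, so this cuts CPU/API overhead, not fill cost:

```python
# Hypothetical back-of-envelope: CPU-side scene submissions per frame
# needed to build one shadow cube map per shadow-casting light.

def shadow_submissions(num_lights, gs_amplification=False):
    """DX9-style: the scene is drawn once per cube face (6 per light).
    DX10-style: a geometry shader replicates each primitive to all six
    cube faces, so the scene is submitted only once per light."""
    faces_per_light = 6
    per_light = 1 if gs_amplification else faces_per_light
    return num_lights * per_light

# e.g. a night scene with 8 shadow-casting lights:
print(shadow_submissions(8))                          # 48 the DX9 way
print(shadow_submissions(8, gs_amplification=True))   # 8 with GS replication
```

so the per-pixel lighting itself, not the API, is where most of the extra frame time goes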

if you have a DX10 card and Vista, i urge you to DL and run the CoJ benchmark for yourself ... then you will know
- it ain't pretty - at "max" - for any single GPU

we have to see if games like Crysis are coded for efficiency or just to *add* more textures and features like 'HDR-correct AA' or particle-based physics water droplets

i HAD to do an edit:

DW's conclusions are totally right-on: [in part]
For now, AMD does seem to have an advantage in Call of Juarez, while NVIDIA leads the way in Company of Heroes and Lost Planet. But as far as NVIDIA vs. AMD in DirectX 10 performance, we really don't want to call a winner right now. It's just way too early, and there are many different factors behind what we are seeing here ...

there are really three ways a game can come to support DirectX 10, and almost all games over the next few years will ship with a DX9 path as well. The easiest thing is to do a straight port of features from DirectX 9 (which should generally be slightly faster than the DirectX 9 counterpart if drivers are of equal quality). We could also see games offer a DirectX 10 version with enhanced features that could still be implemented in DX9 in order to offer an incentive for users to move to a DX10 capable platform. The most aggressive option is to implement a game focused around effects that can only be effectively achieved through DirectX 10.

Games which could absolutely only be done in DX10 won't hit for quite a while for a number of reasons. The majority of users will still be on DX9 platforms. It is logical to spend the most effort developing for the user base that will actually be paying for the games. Developers are certainly interested in taking advantage of DX10, but all games for the next couple of years will definitely have a DX9 path. It doesn't make sense to rewrite everything from the ground up if you don't have to.

We are also hearing that some of the exclusive DX10 features that could enable unique and amazing effects DX9 isn't capable of just don't perform well enough on current hardware. Geometry shader heavy code, especially involving geometry amplification, does not perform equally well on all available platforms (and we're looking at doing some synthetic tests to help demonstrate this). The performance of some DX10 features is lacking to the point where developers are limited in how intensely they can use these new features.

Developers (usually) won't write code that will work fine on one platform and not at all on another ...

we get the impression that straight ports of DX9 to DX10 won't be the norm either. After all, why would a developer want to spend extra time and effort developing, testing and debugging multiple code paths that do exactly the same thing? This fact, combined with the lack of performance in key DX10 features on current hardware, means it's very likely that the majority of DX10 titles coming out in the near term will only be slightly enhanced versions of what could have been done through DX9.

Both NVIDIA and AMD were very upset over how little we thought of their DX10 class mainstream hardware. They both argued that graphics cards are no longer just about 3D, and additional video decode hardware and DX10 support add a lot of value above the previous generation. We certainly don't see it this way. Yes, we can't expect last year's high-end performance to trickle down to the low-end segment, but we should at least demand that this generation's $150 part will always outperform last generation's.

good, well-thought-out conclusion .. DX10 is here but not quite ready for "primetime" ;)
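on the geometry shader point in that quote: one concrete reason amplification-heavy code can't be used aggressively is that D3D10 caps a single GS invocation at 1024 32-bit output values, so the fatter your per-vertex output, the less one invocation can amplify. a quick sketch of that bound (the vertex layouts are made-up examples):

```python
# D3D10 limits one geometry shader invocation to 1024 scalar (32-bit)
# output values: maxvertexcount * scalars_per_vertex must fit in 1024.

GS_MAX_OUTPUT_SCALARS = 1024

def max_emitted_vertices(scalars_per_vertex):
    """Upper bound on vertices a single GS invocation can emit."""
    return GS_MAX_OUTPUT_SCALARS // scalars_per_vertex

# hypothetical layout: float4 position + float3 normal + float2 uv = 9 scalars
print(max_emitted_vertices(9))   # 113
# position-only output (4 scalars), e.g. a depth-only shadow pass:
print(max_emitted_vertices(4))   # 256
```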
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
I think "AT's" stands for "AnandTech's," not "ATI," referencing the publisher of the article.

I think the jury is still out until we see a lot more finished DX10 products that can be benchmarked reliably.

Still, I agree with Cookie Monster: unless you need the added video decoding features present in the DX10 midrange products from nVidia and ATI, there seems little reason to purchase them over better DX9-based products. It just doesn't look like they'll perform well enough in DX10 to make using DX10 features playable.

People might actually get better gameplay experiences using the DX9 codepath on better (but older) DX9 hardware.

All will be revealed in due time, I suppose.

Now where, oh where is that 8900GS with 96 shaders and 512MB of memory, with a single-slot cooling system?

Pretty please, nVidia, with sugar on it?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: dreddfunk
I think "AT's" stands for "AnandTech's," not "ATI," referencing the publisher of the article.

I think the jury is still out until we see a lot more finished DX10 products that can be benchmarked reliably.

Still, I agree with Cookie Monster: unless you need the added video decoding features present in the DX10 midrange products from nVidia and ATI, there seems little reason to purchase them over better DX9-based products. It just doesn't look like they'll perform well enough in DX10 to make using DX10 features playable.

People might actually get better gameplay experiences using the DX9 codepath on better (but older) DX9 hardware.

All will be revealed in due time, I suppose.

Now where, oh where is that 8900GS with 96 shaders and 512MB of memory, with a single-slot cooling system?

Pretty please, nVidia, with sugar on it?

i am not sure even an 8900GS is enough, or that it matters ... if the low end defines gaming, then this from the conclusion is quite true:
AMD and NVIDIA had the chance to define the minimum performance of a DX10 class part higher than what we can expect from cards that barely get by with DX9 code. By choosing to design their hardware without a significant, consistent performance advantage over the X1600 and 7600 class of parts, developers have even less incentive (not to mention ability) to push next generation features only possible with DX10 into their games. These cards are just not powerful enough to enable widespread use of any features that reach beyond the capability of DirectX 9.

Even our high-end hardware struggled to keep up in some cases, and the highest resolution we tested was 2.3 megapixels. Pushing the resolution up to 4 MP (with 30" display resolutions of 2560x1600) brings all of our cards to their knees. In short, we really need to see faster hardware before developers can start doing more impressive things with DirectX 10.
they are both looking after their own bottom line :p
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
If memory serves me correctly, when the 9700 Pro came out heralding the new generation of DX9, it breezed through the new DX9 titles... no such thing this generation. :(
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Sylvanas
If memory serves me correctly, when the 9700 Pro came out heralding the new generation of DX9, it breezed through the new DX9 titles... no such thing this generation. :(

i don't think so ... performance hogs like DE-IW killed it

and the 9600 Pro was pretty weak also

we survived DX9 and i think it will be OK for DX10 .. if nvidia and AMD get a move on .. perhaps they really are conspiring to max their individual profits by releasing similar low-end junk
:Q

maybe intel will shock 'em both on the low end
 

pcslookout

Lifer
Mar 18, 2007
11,959
157
106
Originally posted by: apoppin
Originally posted by: dreddfunk
I think "AT's" stands for "AnandTech's," not "ATI," referencing the publisher of the article.

I think the jury is still out until we see a lot more finished DX10 products that can be benchmarked reliably.

Still, I agree with Cookie Monster: unless you need the added video decoding features present in the DX10 midrange products from nVidia and ATI, there seems little reason to purchase them over better DX9-based products. It just doesn't look like they'll perform well enough in DX10 to make using DX10 features playable.

People might actually get better gameplay experiences using the DX9 codepath on better (but older) DX9 hardware.

All will be revealed in due time, I suppose.

Now where, oh where is that 8900GS with 96 shaders and 512MB of memory, with a single-slot cooling system?

Pretty please, nVidia, with sugar on it?

i am not sure even an 8900GS is enough, or that it matters ... if the low end defines gaming, then this from the conclusion is quite true:
AMD and NVIDIA had the chance to define the minimum performance of a DX10 class part higher than what we can expect from cards that barely get by with DX9 code. By choosing to design their hardware without a significant, consistent performance advantage over the X1600 and 7600 class of parts, developers have even less incentive (not to mention ability) to push next generation features only possible with DX10 into their games. These cards are just not powerful enough to enable widespread use of any features that reach beyond the capability of DirectX 9.

Even our high-end hardware struggled to keep up in some cases, and the highest resolution we tested was 2.3 megapixels. Pushing the resolution up to 4 MP (with 30" display resolutions of 2560x1600) brings all of our cards to their knees. In short, we really need to see faster hardware before developers can start doing more impressive things with DirectX 10.
they are both looking after their own bottom line :p

There go people with their 30-inch LCD monitors :D
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I read a bunch of reviews (here and elsewhere) and finally decided to just go with a cheap X1950 XT for $149.99 after MIR at Newegg, and see how the DX10 cards look in 6-12 months.
 

pcslookout

Lifer
Mar 18, 2007
11,959
157
106
Originally posted by: apoppin
my prediction:

SLI and X-fire will get much more popular due to these performance-hogging titles

--it IS a conspiracy
:Q

There go my current motherboard and power supply, then.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Damn, I don't even want to buy a video card anymore after looking at those 19x12 numbers.
 

pcslookout

Lifer
Mar 18, 2007
11,959
157
106
Originally posted by: Matt2
Damn, I don't even want to buy a video card anymore after looking at those 19x12 numbers.

This is normal; don't you remember the beginning of the DX9 days?
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Yeah, but I wanted to get a new video card, as my X1900XTX gets blown to bits by almost every game at 1920x1200.

I think I might as well wait for some refreshes.
 

pcslookout

Lifer
Mar 18, 2007
11,959
157
106
Originally posted by: Matt2
Yeah, but I wanted to get a new video card, as my X1900XTX gets blown to bits by almost every game at 1920x1200.

I think I might as well wait for some refreshes.

Then get an EVGA GeForce 8800 GTS and step up to a refresh later. Just make sure you plan the purchase so you can step up to one of the refreshes; remember, they only give you 90 days to step up, so get the card around October.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Matt2
Yeah, but I wanted to get a new video card, as my X1900XTX gets blown to bits by almost every game at 1920x1200.

I think I might as well wait for some refreshes.
how many refreshes?

you need Crossfire
--only problem is you need 2900xt Xfire for your LCD

that's why i started small when my CRT started to go ... ONE 2900xt is fine ... for now. :p


sorry to keep mentioning your LCD
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: pcslookout
Originally posted by: Matt2
Damn, I don't even want to buy a video card anymore after looking at those 19x12 numbers.

This is normal; don't you remember the beginning of the DX9 days?

What games couldn't I play well on a 9800 Pro, or even on a 9500/9600 Pro? It's only in the last year or so that you really NEEDED a better card than the 9xxx series... I got along pretty well with a 9600 (non-Pro) until I built a whole new PC early last year... I played FEAR, Doom 3, DOD:S, CoD2... there was no DX9 game that wouldn't work on it.

With DirectX 10, I think there are three major problems, and combined they kill performance. First of all, developers have not really learned to use DirectX 10 effectively. The first batch of DX10 games are little more than DX9 games with patches... it seems CoH was made with DX10 in mind, but the devs still didn't even have hardware to test their game on until last November. When we see Crysis, Alan Wake, UT3, etc. later this year, I think we are going to see much more optimized DX10 titles that perform better than the first ones we have now.

Part 2 of the problem is the lack of good drivers. Both nVidia and AMD clearly have very immature DX10 drivers; nVidia's are only 5 months old, and AMD's DX10 drivers have only been around for ~2 months. Both AMD and nV have seen big gains with recent drivers, and as their driver teams get more experience with DX10 and see how developers are using it, things will definitely get better. I think this is a big part of the problem.

Part 3 is a simple lack of hardware powerful enough to take advantage of what DX10 can really do. Part of the problem is probably that current hardware is not optimized for DX10; I think nVidia, at least, built the G80 to be very good at DX9 games, with DX10 being less important. The next generation of hardware will be much more powerful than what we have now and likely more optimized for DX10, because by the time it comes out DX10 games will be widespread and DX9 will matter less.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: Extelleron
Originally posted by: pcslookout
Originally posted by: Matt2
Damn, I don't even want to buy a video card anymore after looking at those 19x12 numbers.

This is normal; don't you remember the beginning of the DX9 days?

What games couldn't I play well on a 9800 Pro, or even on a 9500/9600 Pro? It's only in the last year or so that you really NEEDED a better card than the 9xxx series... I got along pretty well with a 9600 (non-Pro) until I built a whole new PC early last year... I played FEAR, Doom 3, DOD:S, CoD2... there was no DX9 game that wouldn't work on it.

With DirectX 10, I think there are three major problems, and combined they kill performance. First of all, developers have not really learned to use DirectX 10 effectively. The first batch of DX10 games are little more than DX9 games with patches... it seems CoH was made with DX10 in mind, but the devs still didn't even have hardware to test their game on until last November. When we see Crysis, Alan Wake, UT3, etc. later this year, I think we are going to see much more optimized DX10 titles that perform better than the first ones we have now.

Part 2 of the problem is the lack of good drivers. Both nVidia and AMD clearly have very immature DX10 drivers; nVidia's are only 5 months old, and AMD's DX10 drivers have only been around for ~2 months. Both AMD and nV have seen big gains with recent drivers, and as their driver teams get more experience with DX10 and see how developers are using it, things will definitely get better. I think this is a big part of the problem.

Part 3 is a simple lack of hardware powerful enough to take advantage of what DX10 can really do. Part of the problem is probably that current hardware is not optimized for DX10; I think nVidia, at least, built the G80 to be very good at DX9 games, with DX10 being less important. The next generation of hardware will be much more powerful than what we have now and likely more optimized for DX10, because by the time it comes out DX10 games will be widespread and DX9 will matter less.

I hope Nvidia is really putting most of the horsepower towards that... Seeing these DX10 benchmarks makes me realize the "close to 1 teraflop" they claim is actually REQUIRED to run DX10 games smoothly.

As for AMD... I really don't see how they are gonna pull off nice DX10 performance until R700... although it's nice to see the R600 sitting between the GTX and GTS again.
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
Originally posted by: Fenixgoon
Devs are new to DX10, and so are nvidia/ati; both game coding and driver optimization simply will not be there at the debut of a technology like DX10. Give it some more time and then we'll REALLY see what DX10 can do.

Well, that's not an excuse with Company of Heroes; they worked WITH Microsoft on DX10. DX10 was not an afterthought with CoH; it was made with it in mind, not rigged on after the fact.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: ShadowOfMyself
Originally posted by: Extelleron
Originally posted by: pcslookout
Originally posted by: Matt2
Damn, I don't even want to buy a video card anymore after looking at those 19x12 numbers.

This is normal; don't you remember the beginning of the DX9 days?

What games couldn't I play well on a 9800 Pro, or even on a 9500/9600 Pro? It's only in the last year or so that you really NEEDED a better card than the 9xxx series... I got along pretty well with a 9600 (non-Pro) until I built a whole new PC early last year... I played FEAR, Doom 3, DOD:S, CoD2... there was no DX9 game that wouldn't work on it.

With DirectX 10, I think there are three major problems, and combined they kill performance. First of all, developers have not really learned to use DirectX 10 effectively. The first batch of DX10 games are little more than DX9 games with patches... it seems CoH was made with DX10 in mind, but the devs still didn't even have hardware to test their game on until last November. When we see Crysis, Alan Wake, UT3, etc. later this year, I think we are going to see much more optimized DX10 titles that perform better than the first ones we have now.

Part 2 of the problem is the lack of good drivers. Both nVidia and AMD clearly have very immature DX10 drivers; nVidia's are only 5 months old, and AMD's DX10 drivers have only been around for ~2 months. Both AMD and nV have seen big gains with recent drivers, and as their driver teams get more experience with DX10 and see how developers are using it, things will definitely get better. I think this is a big part of the problem.

Part 3 is a simple lack of hardware powerful enough to take advantage of what DX10 can really do. Part of the problem is probably that current hardware is not optimized for DX10; I think nVidia, at least, built the G80 to be very good at DX9 games, with DX10 being less important. The next generation of hardware will be much more powerful than what we have now and likely more optimized for DX10, because by the time it comes out DX10 games will be widespread and DX9 will matter less.

I hope Nvidia is really putting most of the horsepower towards that... Seeing these DX10 benchmarks makes me realize the "close to 1 teraflop" they claim is actually REQUIRED to run DX10 games smoothly.

As for AMD... I really don't see how they are gonna pull off nice DX10 performance until R700... although it's nice to see the R600 sitting between the GTX and GTS again.

The HD 2900XT is actually pretty good in DX10... Company of Heroes and Lost Planet were both nVidia-supported titles, while Call of Juarez is somewhat of an ATI-supported game. I believe that when UT3 and Crysis come around, the HD 2900XT is going to be a bit faster than the GTX. I don't think AMD has anything to worry about; the HD 2900XT is very strong in shader power, and that's what DX10 wants. With a 65nm or 55nm R700, perhaps with multiple GPUs on one card, I think they have the opportunity to lead the pack again.