DirectX 10: It was meant to reduce CPU and Bandwidth Overhead

ashishmishra

Senior member
Nov 23, 2005
906
0
76
DirectX 10 will increase game performance by as much as six to eight times [:laugh:]. Much of that will be accomplished with smarter resource management, improving API and driver efficiencies, and moving more work from the CPU to the GPU. "The entire API and pipeline have been redesigned from the ground-up to maximize performance, and minimize CPU and bandwidth overhead[:laugh:]," according to Microsoft. Furthermore, "The idea behind D3D10 is to maximize what the GPU can do without CPU interaction, and when the CPU is needed it's a fast, streamlined, pipeline-able operation." Giving the GPU more efficient ways to write and access data will reduce CPU overhead costs by keeping more of the work on the video card.

Source: DirectX 10 article, GameSpot


I remember reading articles in the press, like the one quoted above, about how DX10 would be so much faster at rendering, etc. We have already had our hearts broken on that front. Now I was just thinking about how Derek observed in his 9800GX2 Quad SLI article that Crysis in DX10 on Very High is CPU limited, and even system bound on some platforms. So there we go: the second promise, about reducing CPU/system limitation in the whole rendering process, is starting to look hollow.

It doesn't end there. Even World in Conflict, which, being an RTS, is understandably CPU bound, loses frames when we simply switch the rendering mode to DX10 without even enabling the DX10-exclusive Shadows From Clouds option. Theoretically, since DX10's lower overhead should lighten the CPU load, we should see improvements at least in CPU-limited scenarios, yet we see no improvement whatsoever.

In conclusion, my theory is that the CPU overheads must not have been significant enough to make a difference even if they managed a 400% reduction (random figure). Why do these marketing types make such bold claims without much substance?

Any thoughts on the matter?
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
It's way too early to judge based on first-gen games whose dev cycles likely pre-dated the public release of the API, the platform it runs on, and any supporting hardware.

There is also the issue that most developers who have bothered to support it this early in its lifecycle opted to massively increase the use of performance-intensive effects in those games' DX10 modes in order to showcase the feature.

At this point, I think the concept of high-level shader-based programming is starting to settle out and mature, and we won't see such drastic changes to D3D in the next few DirectX versions. This will give developers a chance to catch up and start optimizing more, as they won't have to start over with a new shader model paradigm every 24 months.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I have seen examples where DX10 was added to games, resulting in improved FPS AND improved graphics. The reason you see such extreme lag is that those who design DX10 games just went completely wild adding extra eye candy, much of which is not even noticeable.
 

ashishmishra

Senior member
Nov 23, 2005
906
0
76
aka1nas and taltamir: I guess that makes sense; in order to make it noticeable, the developers go overboard on the graphical effects, so low performance from the current generation of graphics cards is understandable. I just wonder why we are seeing increased CPU and system limitations with Crysis, for example. I never expected it to be this CPU limited judging by how it looks.

Also, any clues on the first ground-up DX10 game and an estimated launch timeframe?
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
You will probably not see a true DX10 game until someone actually builds a game engine that supports DX10 fully from the ground up.

CryENGINE 2 (DX9 / hybrid DX10)
Unreal Engine 3.0 (DX9 engine; DX10 support added later)
id Tech 5 (OpenGL / DX9; id has no plans to support DX10)
Source engine (DX9; DX10 support coming with Episode 3)

There is not a single engine that supports DX10 fully, and as far as I know, no one is developing an engine that will. Currently you will only see hybrid engines, or games like BioShock that add a few DX10 features.


CryENGINE 3 or Unreal Engine 4.0 will most probably support DX10 fully, but they are years away from release.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: tuteja1986
You will probably not see a true DX10 game until someone actually builds a game engine that supports DX10 fully from the ground up.

CryENGINE 2 (DX9 / hybrid DX10)
Unreal Engine 3.0 (DX9 engine; DX10 support added later)
id Tech 5 (OpenGL / DX9; id has no plans to support DX10)
Source engine (DX9; DX10 support coming with Episode 3)

There is not a single engine that supports DX10 fully, and as far as I know, no one is developing an engine that will. Currently you will only see hybrid engines, or games like BioShock that add a few DX10 features.


CryENGINE 3 or Unreal Engine 4.0 will most probably support DX10 fully, but they are years away from release.

Yeah, it will be a while yet before we see a 100% DX10-coded game. After all, the devs have to sell games to those who can play them. A large user base still has DX9 hardware, and they can't abandon them lest they shoot themselves in the buttocks. :D
 

CVSiN

Diamond Member
Jul 19, 2004
9,289
1
0
So what is Age of Conan? Is it also a hybrid, or was it written for DX10 and then stripped down for the legacy peeps?
I was under the impression DX10 was added right at the beginning of the dev cycle.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: keysplayr2003
Originally posted by: tuteja1986
You will probably not see a true DX10 game until someone actually builds a game engine that supports DX10 fully from the ground up.

CryENGINE 2 (DX9 / hybrid DX10)
Unreal Engine 3.0 (DX9 engine; DX10 support added later)
id Tech 5 (OpenGL / DX9; id has no plans to support DX10)
Source engine (DX9; DX10 support coming with Episode 3)

There is not a single engine that supports DX10 fully, and as far as I know, no one is developing an engine that will. Currently you will only see hybrid engines, or games like BioShock that add a few DX10 features.


CryENGINE 3 or Unreal Engine 4.0 will most probably support DX10 fully, but they are years away from release.

Yeah, it will be a while yet before we see a 100% DX10-coded game. After all, the devs have to sell games to those who can play them. A large user base still has DX9 hardware, and they can't abandon them lest they shoot themselves in the buttocks. :D

You guys have been saying the same thing for SM3 vs. SM2, and now it's DX10 vs. DX9. I wish people would just abandon the idea that coding "from the ground up" for a certain rendering method would somehow magically free the game of legacy code that's slowing it down, because that's not how things work. DX10 is just a graphics API, and regardless of whether you use DX10, DX9, or OpenGL, all of these get translated by the driver into machine code that the video card understands. Many modern games are already limited by the video hardware at the highest settings, and there's no reason why coding the same game exclusively for DX10 would alleviate that problem. If anything, you could blame the poor performance on immature drivers and the fact that Vista requires more resources to do the same thing as XP. I never expected to see an order-of-magnitude improvement in performance from DX10 like M$ was hyping.
 

SergeC

Senior member
May 7, 2005
484
0
71
HOCP recently wrote an article comparing Crysis DX9 to DX10. It seems DX10 uses significantly less system RAM at the same settings.

Perhaps there's some truth to it?
 

PingSpike

Lifer
Feb 25, 2004
21,754
599
126
Why does everyone think there is some magic bullet that will allow photorealistic graphics right now, on today's hardware? I can understand Microsoft spouting the bullshit about DX10 performing 100 million times better, but I can't understand why anyone believes it, not after every company in existence has peddled a similar lie about their software or hardware.

As for the "DX10-only engine" and its mythically better performance: get used to its mythical status, at least for the foreseeable future. I doubt any game developer in their right mind is going to develop anything but a "hybrid" engine any time soon, not if they don't want their customer base slashed.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: SergeC
HOCP recently wrote an article comparing Crysis DX9 to DX10. It seems DX10 uses significantly less system RAM at the same settings.

Perhaps there's some truth to it?

Do you have a link to the article? I'd be interested to read it, as I've also noticed this and commented on it. DX10 uses less system RAM than DX9 with e_precache_level on: something like 1285MB in DX9 compared to 825MB in DX10 @ 1920x1200, Textures @ High. My guess is that less static texturing was being used to render the scene, with a heavier dose of pixel shaders, particle effects, and post-processing.

As for the original comment about DX10 being a bust, my impression was that DX10 would make possible what was impossible to accomplish in DX9 while maintaining playability. That doesn't mean you would see improved performance compared to DX9; it means you'd get better visuals while maintaining playability, rather than the slideshow you'd get attempting the same visuals in DX9. I think DX10 has definitely accomplished this.

Edit: lol just saw it was a HOCP reference causing the highlights
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Something else the article does not mention is consoles.
The trend now is to develop for consoles first, then port to PC.
The Xbox 360 does not support DX10.
So if I'm a developer coding for the 360 first, I'm going to build a DX9 engine.
Then, if I really want DX10 on the PC, I'll add DX10 support.

That last part rarely happens, and when it does, the DX10 implementation is lacking.

My hope is that developers start looking more at OpenGL, which would let them build engines that look as good as their DX10 counterparts while coding for all platforms at once.
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: munky
Originally posted by: keysplayr2003
Originally posted by: tuteja1986
You will probably not see a true DX10 game until someone actually builds a game engine that supports DX10 fully from the ground up.

CryENGINE 2 (DX9 / hybrid DX10)
Unreal Engine 3.0 (DX9 engine; DX10 support added later)
id Tech 5 (OpenGL / DX9; id has no plans to support DX10)
Source engine (DX9; DX10 support coming with Episode 3)

There is not a single engine that supports DX10 fully, and as far as I know, no one is developing an engine that will. Currently you will only see hybrid engines, or games like BioShock that add a few DX10 features.


CryENGINE 3 or Unreal Engine 4.0 will most probably support DX10 fully, but they are years away from release.

Yeah, it will be a while yet before we see a 100% DX10-coded game. After all, the devs have to sell games to those who can play them. A large user base still has DX9 hardware, and they can't abandon them lest they shoot themselves in the buttocks. :D

You guys have been saying the same thing for SM3 vs. SM2, and now it's DX10 vs. DX9. I wish people would just abandon the idea that coding "from the ground up" for a certain rendering method would somehow magically free the game of legacy code that's slowing it down, because that's not how things work. DX10 is just a graphics API, and regardless of whether you use DX10, DX9, or OpenGL, all of these get translated by the driver into machine code that the video card understands. Many modern games are already limited by the video hardware at the highest settings, and there's no reason why coding the same game exclusively for DX10 would alleviate that problem. If anything, you could blame the poor performance on immature drivers and the fact that Vista requires more resources to do the same thing as XP. I never expected to see an order-of-magnitude improvement in performance from DX10 like M$ was hyping.

I never meant a 100% DX10-exclusive engine. I meant a game with actual native DX10 support, rather than just tacking on code to enable DX10.

HL2 was one of the first to take real advantage of DX9 shaders, and it did show the difference. People were bitching about the same thing then, and always will.

http://au.gamespot.com/pages/i...id=914642&img=323#next
vs
http://image.com.com/gamespot/...hmark/dx_screen001.jpg

But what I am looking for is a game that clearly shows DX10 looking way better than DX9. Crysis was supposed to be that game, but it turned out they just locked the features out of DX9 mode, and some people found a hack to enable them in DX9.

Harald Seeley gave a GDC talk where he said the biggest problem with Crysis's DX10 mode was that they did a tacky job of sticking in DX10 code without real customization, and got some NVIDIA guy to help with performance tweaking. They said they can improve performance, but they have to do a major overhaul of the engine's inherent problems to take advantage of the new GPU architecture. Crysis 2, according to them, is supposed to solve the serious DX10 problems they had with Crysis 1.

Cry engine 2.5:

"Adding native support for DirectX 10 and Vista, re-working our multi-threading approach to support the latest multi-core processors, implementing our greatly expanded integrated physics system, adding a completely new animation system which seamlessly combines procedural and pre-recorded animations, and replacing much of the need for creating game-side C++ code and LUA scripts with the output of our visual flow graph editor, are just a few of the highlights. But of course, what everyone immediately notices is our new near-photorealistic renderer, which has completely eliminated the need for pre-baked light or shadow maps, and provides fully dynamic HDR lighting, soft shadows, sub-surface scattering and ambient occlusion, among other features, all in real time." Harald Seeley @ GDC 2008.

They had a whole seminar @ GDC about why the DX10 implementation failed in Crysis and what lessons Crytek learned.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: tuteja1986

But what I am looking for is a game that clearly shows DX10 looking way better than DX9. Crysis was supposed to be that game, but it turned out they just locked the features out of DX9 mode, and some people found a hack to enable them in DX9.

What someone really needs to do is code a demo that shows off DX10 features and completely blows away anything that could be done in DX9.

I don't think we will see a game doing it anytime soon.
It's just not worth the development time.
If everyone were using DX10, it would be worth it.
But when the majority of your user base is on DX9, it's better to spend the cash there.
 

PingSpike

Lifer
Feb 25, 2004
21,754
599
126
Originally posted by: Modelworks
Originally posted by: tuteja1986

But what I am looking for is a game that clearly shows DX10 looking way better than DX9. Crysis was supposed to be that game, but it turned out they just locked the features out of DX9 mode, and some people found a hack to enable them in DX9.

What someone really needs to do is code a demo that shows off DX10 features and completely blows away anything that could be done in DX9.

I don't think we will see a game doing it anytime soon.
It's just not worth the development time.
If everyone were using DX10, it would be worth it.
But when the majority of your user base is on DX9, it's better to spend the cash there.

Is such a beast even reasonable to expect? I'm not up to snuff on all this stuff, but is DX10 even really capable of doing things that DX9 can't? I remember talk about how it had more efficient ways to do some things that were, I think, already in DX9, and that they removed some limitations. What are the new killer DX10 features, though?

It's worth a side note here that each generation of graphics brings diminishing returns. It is probably unreasonable to expect amazing changes every time.

Anyway, if you want to target all of the PC market with DX10-class features, your choice is clear: OpenGL. To my knowledge, no one created an artificial OS-level schism for that API. :p
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: munky


You guys have been saying the same thing for SM3 vs. SM2, and now it's DX10 vs. DX9. I wish people would just abandon the idea that coding "from the ground up" for a certain rendering method would somehow magically free the game of legacy code that's slowing it down, because that's not how things work. DX10 is just a graphics API, and regardless of whether you use DX10, DX9, or OpenGL, all of these get translated by the driver into machine code that the video card understands. Many modern games are already limited by the video hardware at the highest settings, and there's no reason why coding the same game exclusively for DX10 would alleviate that problem. If anything, you could blame the poor performance on immature drivers and the fact that Vista requires more resources to do the same thing as XP. I never expected to see an order-of-magnitude improvement in performance from DX10 like M$ was hyping.

It's not so much that being able to write a solely DX10 engine makes the difference, but rather giving developers time to become familiar with the new API and what works well in it.

It would be reasonable to expect a team to write relatively worse code on a first production attempt at DX10 that's tacked onto a product near the end of a dev cycle than on an engine/game to which the same team can apply a year or more of experience.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Hmmm, where to start.

D3D from the beginning had enormous problems in numerous aspects of even basic rendering sequences. As time progressed, MS continued to make huge strides in rectifying the situation. One example is small-batch geometric data; I'm using that one because it directly relates to a change made moving from DX9 to DX10. In reality I would say that an order-of-magnitude increase in performance would be a tad on the conservative side; it is likely a bit more than that. Now the problem lies in the fact that most people aren't running DX10 hardware or Vista yet, and taking advantage of the enormous improvements DX10 offers in this particular area requires not just proper code support but also asset reworking. To properly take advantage of this element you would need a duplicate set of art assets for a great deal of content, along with a completely separate code base to handle the different assets. Yes, the API does just transfer calls; the problem is that it did so horrifically poorly for certain types of rendering techniques in the past.

Will you ever see this displayed in games? No, because either the DX10 version would run at 60fps versus 5 for the DX9 version, or you would be comparing completely different art assets and code bases. This is far from the only element that could offer the kind of performance boost MS was talking about, but in realistic terms we shouldn't expect these types of features to actually be utilized until DX10 is considered the bare minimum to run new games (likely not for at least another 4-5 years). Honestly, processor overhead isn't something I know nearly as much about; however, the example the OP used relates to the increased physics load at the highest settings in Crysis, which is outside of D3D and hence can't be used as an example one way or the other.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: ashishmishra
aka1nas and taltamir: I guess that makes sense; in order to make it noticeable, the developers go overboard on the graphical effects, so low performance from the current generation of graphics cards is understandable. I just wonder why we are seeing increased CPU and system limitations with Crysis, for example. I never expected it to be this CPU limited judging by how it looks.

Also, any clues on the first ground-up DX10 game and an estimated launch timeframe?

Side-by-side comparisons in Crysis make it clear that even at identical in-game settings, the DX10 version looks better and uses much more intensive features than the DX9 version.

Incomplete DX9 functions CAN be enabled to give an improved DX9 appearance as well. So basically what happened with Crysis: the DX9 stuff was coded, then some high-end DX9 stuff was dropped and rewritten in DX10.

As for "coding from the ground up": I would imagine programming techniques and familiarity with the API also come into play. Maybe current DX10 implementations are just inefficient because programmers lack experience with it and don't yet know what does and does not work.

I recall when GalCiv2 changed its texturing method in patch 1.5.
It cut RAM usage to a fraction of what it was before, reduced CPU usage, and looked WAAAAY better.
All it took was discovering that there is a "better way" of doing something. People had years to figure that out in DX9; with DX10 they are still learning.
 

ashishmishra

Senior member
Nov 23, 2005
906
0
76
Originally posted by: taltamir
Originally posted by: ashishmishra
aka1nas and taltamir: I guess that makes sense; in order to make it noticeable, the developers go overboard on the graphical effects, so low performance from the current generation of graphics cards is understandable. I just wonder why we are seeing increased CPU and system limitations with Crysis, for example. I never expected it to be this CPU limited judging by how it looks.

Also, any clues on the first ground-up DX10 game and an estimated launch timeframe?

Side-by-side comparisons in Crysis make it clear that even at identical in-game settings, the DX10 version looks better and uses much more intensive features than the DX9 version.

Incomplete DX9 functions CAN be enabled to give an improved DX9 appearance as well. So basically what happened with Crysis: the DX9 stuff was coded, then some high-end DX9 stuff was dropped and rewritten in DX10.

As for "coding from the ground up": I would imagine programming techniques and familiarity with the API also come into play. Maybe current DX10 implementations are just inefficient because programmers lack experience with it and don't yet know what does and does not work.

I recall when GalCiv2 changed its texturing method in patch 1.5.
It cut RAM usage to a fraction of what it was before, reduced CPU usage, and looked WAAAAY better.
All it took was discovering that there is a "better way" of doing something. People had years to figure that out in DX9; with DX10 they are still learning.

Hmm... yeah, point taken. I guess we just need to hang tight and be patient.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
I think this is the first time in a very long time that the available hardware is actually ahead of the software that can make use of it.
We have DX10.1 and OpenGL 3.0 (which rocks, by the way), quad-core CPUs, gigs of RAM, and nothing really using all of that to its full potential.

I understand it's hard to develop for the cutting edge; it's such a small market compared to the lower-end specs.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Modelworks
I think this is the first time in a very long time that the available hardware is actually ahead of the software that can make use of it.
We have DX10.1 and OpenGL 3.0 (which rocks, by the way), quad-core CPUs, gigs of RAM, and nothing really using all of that to its full potential.

I understand it's hard to develop for the cutting edge; it's such a small market compared to the lower-end specs.

So true. Neverwinter Nights 2, for example, suffers from HORRIBLE load times because it will NOT use RAM. You will not be paging with NWN2 even on a 2GB machine running Vista; it takes under a gig. And yet every 3 minutes it dumps 300MB of textures and data and reloads all of it from a highly compressed file that is decompressed by a single-threaded utility, resulting in extremely long load times that depend ENTIRELY on the single-core speed of your CPU.
It is a nightmare to play (I play on the lowest settings to reduce the size of the textures and data that need to be loaded, and thus the load times).

Crysis is one of the few games that claims to be quad-core optimized, and even it loads the cores unevenly.
 

ashishmishra

Senior member
Nov 23, 2005
906
0
76
Originally posted by: taltamir
Originally posted by: Modelworks
I think this is the first time in a very long time that the available hardware is actually ahead of the software that can make use of it.
We have DX10.1 and OpenGL 3.0 (which rocks, by the way), quad-core CPUs, gigs of RAM, and nothing really using all of that to its full potential.

I understand it's hard to develop for the cutting edge; it's such a small market compared to the lower-end specs.

So true. Neverwinter Nights 2, for example, suffers from HORRIBLE load times because it will NOT use RAM. You will not be paging with NWN2 even on a 2GB machine running Vista; it takes under a gig. And yet every 3 minutes it dumps 300MB of textures and data and reloads all of it from a highly compressed file that is decompressed by a single-threaded utility, resulting in extremely long load times that depend ENTIRELY on the single-core speed of your CPU.
It is a nightmare to play (I play on the lowest settings to reduce the size of the textures and data that need to be loaded, and thus the load times).

Crysis is one of the few games that claims to be quad-core optimized, and even it loads the cores unevenly.

I remember reading that Crysis's physics engine makes optimal use of a quad core, but for the rest of the game a dual core is plenty. That means the quad will only be useful in heavily physics-intensive situations, which, as we have seen, is rarely the case in the actual game.