
Why DX10.1 matters to you

JPB

Diamond Member

_____________________



YOU KNOW THE graphics wars are heating up when the 'presentations' start to fly. The latest one is from NV, talking about their upcoming 8800GT/256 which will be out soon, but it is the header that makes me think it is more than innocent advertising.

Kevin Unangst, Senior Global Director for Microsoft Games for Windows: "DX10.1 is an incremental update that won't affect any games or gamers in the near future." He is right, mostly because it won't be out until SP1 hits, so no argument there. More importantly, DX10 and onward is tied to the broken malware infestation known as Vista, so it is really irrelevant, but there is more to it than that.

Let's be honest, DX10.1 brings a lot of new features that don't really matter much if at all, and you can read all about them here. That said, there is one that will matter a lot, contrary to what MS people say. This magic feature is multi-sample buffer reads and writes (MSBRW). If you are wondering how you missed that big one in the feature list, well, shame on you, read better next time.

What MSBRW does is quite simple: it gives shaders access to the depth and colour info for all samples without having to resolve the whole pixel. Get it now? No? OK, we'll go into a bit more detail. DX10 forced you to resolve the samples into a finished pixel for AA (or rather MSAA) to be functional, and this basically destroyed the underlying samples. The data was gone, and to be honest, there was no need for it to be kept around.
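To make the resolve-versus-read distinction concrete, here is a toy Python sketch. This is not real Direct3D code; the sample values and function names are invented purely for illustration of the idea:

```python
# A "pixel" stored as 4 MSAA samples, e.g. coverage straddling an edge.
samples = [0.2, 0.2, 0.9, 0.9]

def resolve(samples):
    """DX10-style resolve: collapse the samples into one pixel value,
    discarding the individual sample data in the process."""
    return sum(samples) / len(samples)

# DX10: only the resolved value survives for later shader passes.
pixel = resolve(samples)  # averages to 0.55; the per-sample data is gone

# DX10.1 (MSBRW): a shader can still address sample i of the buffer,
# so per-sample depth/colour remains available to a later lighting pass.
per_sample = [samples[i] for i in range(len(samples))]
```

The point is just that after a resolve there is nothing left to read, while per-sample access keeps the buffer intact for the deferred passes described below.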

Games like Quake3 would do a lighting pass, then a shader pass, and another lighting followed by shaders and so on until everything was rendered right. This was quite precise but also quite slow. Dog slow.

To optimize around this, a technique called deferred shading was invented. This does all the lighting passes followed by a single shader pass. If you have five passes, you basically can skip four trips through the shaders. The problem? Because the pixel isn't fully computed, just a pile of AA data, there is no way for it to be read. This is horribly simplified, but I don't want to go into the low-level stuff here; go look it up if you really care.
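As a back-of-envelope way to see where the savings come from, here is a hypothetical pass-count model in Python. The counting scheme (one pass per object per light for forward, one geometry pass plus one screen-space lighting pass per light for deferred) is a simplification, not taken from any real engine:

```python
def forward_passes(num_objects, num_lights):
    # Forward/multi-pass rendering: every object gets re-shaded
    # once for every light that touches it.
    return num_objects * num_lights

def deferred_passes(num_objects, num_lights):
    # Deferred shading: rasterise each object once into the G-buffer,
    # then run each light as a screen-space pass over that buffer.
    return num_objects + num_lights

# With 100 objects and 5 lights:
print(forward_passes(100, 5))   # 500 passes
print(deferred_passes(100, 5))  # 105 passes
```

The gap widens as light counts grow, which is exactly why DX10-era engines were drifting toward deferred shading despite the AA problem.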

What this meant is that you couldn't turn on AA if you had deferred rendering unless you did supersampling, which is rendering at a higher resolution and sampling down. This is unusably slow, so it went out the door, meaning if you were designing a game, you picked speed in the form of deferred shading or beauty in the form of AA. Most DX10 games will go for speed, meaning the AA hardware will sit more or less idle.
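For the curious, the supersampling workaround itself is easy to sketch: render at double the resolution in each dimension, then average each 2x2 block down to one pixel. A toy Python version (illustrative only; real hardware does this with filtered resolves, not nested loops):

```python
def downsample_2x(image):
    """Average each 2x2 block of a high-res image (2D list of floats,
    even dimensions) down to one output pixel -- 4x the rendering cost
    for the same final resolution, which is why SSAA was a non-starter."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4.0)
        out.append(row)
    return out

hi_res = [[1.0, 1.0, 0.0, 0.0],
          [1.0, 1.0, 0.0, 0.0]]
print(downsample_2x(hi_res))  # [[1.0, 0.0]]
```

It works with any renderer, deferred included, because it never needs to read individual samples, but paying 4x fill rate for 2x AA is the "unusably slow" part.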

DX10.1 brings the ability to read those sub-samples to the party via MSBRW. To the end user, this means that once DX10.1 hits, you can click the AA button on your shiny new game and have it actually do something. This is hugely important.

The first reaction most people have is that if a game is written for DX10, then new 10.1 features won't do anything; AA awareness needs to be coded into the engine. That would be correct, but we are told it is quite patchable, i.e. you will probably see upgrades like the famous 'Chuck patch' for Oblivion. Nothing is guaranteed, but there is a very good chance that most engines will have an upgrade available.

In the end, DX10.1 is mostly fluff with an 800-pound gorilla hiding among the short cropped grass. MSBRW will enable AA and deferred shading, so you can have speed and beauty at the same time, not a bad trade-off.

Since NV has not done the usual 'we can do it too' song and dance when they are being beaten about the head and neck by a bullet point feature they don't have, you can be pretty sure they can't do it.

Close looks at the drivers, and more tellingly the lack of PR trumpeting that they will have it out before the release of SP1, almost assuredly mean that it will never happen. If you have a G8x or a G9x card, the only feature of DX10.1 you will miss is the important one. µ
 
Interesting stuff I guess, but ya know, I really *don't* care about DX 10.1. AA is good and all, but I'm having a grand time with my games in 16x10 or higher resolutions.
 
that's funny, the huge knock on amd cards right now is their aa perf. Looks like 3870 is just going to get better and better as time goes on... kinda like the last couple of daamit card generations.
 
Originally posted by: bryanW1995
that's funny, the huge knock on amd cards right now is their aa perf. Looks like 3870 is just going to get better and better as time goes on... kinda like the last couple of daamit card generations.

Plus it doesn't matter anyway, since nothing can run DX10 games well enough as it is; AA would just slaughter them.

Having said that, GoW has a DX10 4xAA option, so I wonder how they funked that one in
(AFAIK the game modes are DX9, DX10 and DX10 4xAA)
 
I'm not impressed with deferred shading basically meaning we have no AA options w/o crappy workarounds, so DX 10.1 looks very good to me.

Just also looks like it'll be a long time before the video cards can handle games with that...
 
Originally posted by: bryanW1995
that's funny, the huge knock on amd cards right now is their aa perf. Looks like 3870 is just going to get better and better as time goes on... kinda like the last couple of daamit card generations.

Nowhere in that article does it say the 3870 is going to get better and better.

Just because the damn thing has an option to use DX10.1 AA doesn't mean that it will do it above 15 fps.

My 2900XT hasn't gotten better and better, so I don't expect the 3870 to be any different.
 
Originally posted by: Lonyo

Having said that, GoW has a DX10 4xAA option, so I wonder how they funked that one in
(AFAIK the game modes are DX9, DX10 and DX10 4xAA)

Does GoW use deferred shading? DX10 games/engines don't HAVE to use this method, it's just that many developers are choosing to do so because of the performance advantages.
 
I said before I don't care about DX10.1.. this is good news though. This means that DX10.1 isn't just DX10 only heavier with little improvement.. it means DX10 should be LIGHTER on max settings while producing the same/better image... that's a GOOD thing.
So I guess I no longer "don't care" about DX10.1.


Actually let me retract this... This still isn't a big deal NOW... a "one year from now the card you buy right now will start performing a little bit better when you turn on AA, which it can't even handle ANYWAYS because it can barely run DX10..." well... you see the point. I am not gonna turn on AA either way... and will experience much lag even without it. DX10.1 will matter for cards that get 40 FPS on max everything in DX10 games... those are the cards you will want AA turned on with. And that extra few FPS from the 10.1 vs 10 method of doing AA would be a welcome deal.

I am certain nvidia CAN do it... but they are keeping it in the lab instead of on cards where it gives absolutely no benefit and just complicates things.
 
Originally posted by: taltamir
I said before I don't care about DX10.1.. this is good news though. This means that DX10.1 isn't just DX10 only heavier with little improvement.. it means DX10 should be LIGHTER on max settings while producing the same/better image... that's a GOOD thing.
So I guess I no longer "don't care" about DX10.1.


Actually let me retract this... This still isn't a big deal NOW... a "one year from now the card you buy right now will start performing a little bit better when you turn on AA, which it can't even handle ANYWAYS because it can barely run DX10..." well... you see the point. I am not gonna turn on AA either way... and will experience much lag even without it. DX10.1 will matter for cards that get 40 FPS on max everything in DX10 games... those are the cards you will want AA turned on with. And that extra few FPS from the 10.1 vs 10 method of doing AA would be a welcome deal.

I am certain nvidia CAN do it... but they are keeping it in the lab instead of on cards where it gives absolutely no benefit and just complicates things.

What do you mean CAN do it? They certainly CANNOT do it with their current GPUs, because DX10.1 functionality requires hardware changes; it's not something you can just "enable" in the driver.
 