How important is DX10.1?


Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: Dribble
On a side note, most of those games above which you have marked DX10 are really DX9c with one or two DX10 effects. Even Crysis - the poster child of DX10 - basically looks and runs identically using DX9c (using the ultra-high config tweak on XP). Not one of those games is really using the power of DX10.

DirectX 8.1 was ATI-only and got picked up, with real benefits, in a few games - notably Halo PC and Battlefield 2. (Halo looked almost identical to the DX9 mode with DX8.1, but DX8 looked closer to the DX7 path.)

Was DirectX 9.0b on the X800 or the 9700 Pro? If it was on the 9700 Pro, it got major support; otherwise it's only supported in... every Source engine game.

As for DX10, most of what it really offers over DX9 requires ground-up development around DX10 hardware, much of which would be unable to truly take full advantage of DX10 anyway.
What DX10.1 offers over both DX9 and DX10 is mostly minor tweaks that can be easily implemented without much (or sometimes any) performance loss. There are already more games using DX10.1 than probably ever made use of the other DX side branches.
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: Dribble
On a side note, most of those games above which you have marked DX10 are really DX9c with one or two DX10 effects. Even Crysis - the poster child of DX10 - basically looks and runs identically using DX9c (using the ultra-high config tweak on XP). Not one of those games is really using the power of DX10.

I got excited when I saw The Last Remnant and actually looked into it, as I'm currently playing it on PC. It's actually still only a DX9c game. Square got all Engrishy on their TLR website and calls it DX10 on Vista (as it's the "default" DX installation), then in the footnote states that it's using DX9c.

 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: Fox5
DirectX 8.1 was ATI-only and got picked up, with real benefits, in a few games - notably Halo PC and Battlefield 2. (Halo looked almost identical to the DX9 mode with DX8.1, but DX8 looked closer to the DX7 path.)

Was DirectX 9.0b on the X800 or the 9700 Pro? If it was on the 9700 Pro, it got major support; otherwise it's only supported in... every Source engine game.

As for DX10, most of what it really offers over DX9 requires ground-up development around DX10 hardware, much of which would be unable to truly take full advantage of DX10 anyway.
What DX10.1 offers over both DX9 and DX10 is mostly minor tweaks that can be easily implemented without much (or sometimes any) performance loss. There are already more games using DX10.1 than probably ever made use of the other DX side branches.

The 9700 Pro was only DX9 with Shader Model 2.0 support. The X800 series had DX9.0b with Shader Model 2.0 and 2.0b support and got average support - it was used in games like Splinter Cell: Chaos Theory, Gears of War, TimeShift, the Source engine, Age of Empires 3, Far Cry and some others that I can't remember now. Usually it brought more eye candy without a performance hit, or even brought performance improvements. In some games, like Gears of War and Splinter Cell, it looked identical to SM3.0 in image quality, while in other games, like TimeShift and Age of Empires 3, it looked much better than SM2.0 but slightly worse than SM3.0 - less complex shaders, simpler shadows, or a lack of HDR.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Originally posted by: Fox5
Was DirectX 9.0b on the X800 or the 9700 Pro? If it was on the 9700 Pro, it got major support; otherwise it's only supported in... every Source engine game.

DX9.0b was the X??? range. Pretty sure the HL2 Source engine was DX9.0a - if it did support DX9.0b, it didn't add anything; you could run with full details and all the effects using 9.0a.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: Dribble
Originally posted by: Fox5
Was DirectX 9.0b on the X800 or the 9700 Pro? If it was on the 9700 Pro, it got major support; otherwise it's only supported in... every Source engine game.

DX9.0b was the X??? range. Pretty sure the HL2 Source engine was DX9.0a - if it did support DX9.0b, it didn't add anything; you could run with full details and all the effects using 9.0a.

The Source engine, on ATI cards, even supports features not in DirectX 9 at all - possibly on Nvidia cards as well. I think they only add performance, though, versus running on a less-optimized path, but there might be some difference going from a 9700 Pro to an X1950 XT.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: RussianSensation

3) F.E.A.R. 2 - DX10 - LithTech Jupiter Extended (EX)

Nope, F.E.A.R. 2 is DX9. And so is Call of Juarez: Bound in Blood, a game which wasn't on your list.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
We are supposed to have some games out this fall that use some nice lighting and shading that can only be done with DX10.1. To be DX11 you have to be DX10.1 capable - except when you do it in software, where you can do whatever you want, up to the capabilities of any hardware involved in the final render. With Larrabee that will be the texture units. The NV 300 looks to be a lot like Larrabee; NV is going software also.

ATI has the best all-round tech for where we're heading.

The winner is whoever has the best compilers to run 16 CPU threads. The NV 300 could have that.

ATI could get it right now, except they lack the compiler.

Larrabee will tell us if Intel really has a compiler that can run 16 threads and thousands of vectoring threads at different branch levels. If Intel has such a compiler, there's a good chance AMD will be using it later - but not a lot later.

NV is going the same route, but differently. A year from now nothing will be the same as it is today.

Sun has shown with their CPUs, as has Intel, that they have a good 16-thread compiler, if not more.

Sun chose the SPARC name for a good reason, as it used a Russian compiler. But this occurred during the Cold War - not a good thing. Later, Intel licensed the tech and then immediately bought the company. Sun's Rock didn't get what Intel got - Intel got it all, including the man.

This compiler has been under construction since 1982, I believe, maybe earlier; it's natively VLIW.

So we know Sun made it work well with RISC, and Intel with EPIC (VLIW).

So what this comes down to is how good NV's compilers are, because the software is going to win this thing, not the hardware!

Remember when it was announced that IBM was buying Sun? We discussed it in a thread here. I laughed, then told you guys it wouldn't happen and why: IBM wanted that compiler but found out they couldn't get it. I will say this again - Havok is the key, and the Elbrus (Intel) compiler is the lock. DX10, DX10.1 and DX11 are the path to software, so yes, it's huge. It forced open shut doors, with Intel pushing against them. Back to the future we go.

If you're going down to the OK Corral to battle Intel on threading, you'd best have more than blanks in those six-shooters, because Intel is packing proven compilers. You don't pull on Superman's cape, and you don't mess around with Jim (Abrash).

 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: BFG10K
Compared to DX9, uptake on supporting DX10 has been poor. This is largely due to the negative stigma attached to Vista. Windows 7/DX11 may turn things around for Microsoft.

I thought it was due to the Xbox 360, PS3 and Wii. Hell, when they get DX11, then we will get the ports.

 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
If MS had stuck to the original DX10 specs, things would have been different. That's another subject, though.

If you think about what NV is doing, it really is kind of smart. If NV can get CUDA widely used, then they could also build and sell a CUDA OS. I believe MS uses C a lot, like CUDA.

I think Intel is doing that also: get C++ used as the base programming language. Havok, a separate but Intel-owned company, would and will have a gaming API. Because of that, look for Intel to later intro an OS based off of C++, once they or PhysX can get a lock on physics as the main use. Things are about to move forward like back in the old days - it's anyone's game. Imagination isn't looking too bad here either. If Hydra works, Imagination is looking downright strong on the desktop.

http://www.joystiq.com/2009/07...technology-hydra-chip/

Now I don't know what Hydra implies here, but this low-powered chip is going to kick ass in handhelds, and if the Lucid Hydra works as claimed, look out - these low-powered chips could be awesome high-end chips because of low power and scaling. These times are going to be like they used to be: new tech companies are going to pop up now.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Originally posted by: Nemesis 1
If you think about what NV is doing, it really is kind of smart. If NV can get CUDA widely used, then they could also build and sell a CUDA OS. I believe MS uses C a lot, like CUDA.

I don't think they quite want to do that - Nvidia's vision of the computer is a cheap processor + an Nvidia GPU compute engine. Ion is a good example of what they want: the Atom runs Windows, and Nvidia graphics do the hard stuff - video decode/encode, Flash, rendering for games, etc. Tegra is the same thing - a weak ARM CPU + a powerful Nvidia GPU compute engine.

It's kind of true already on the desktop, in that the graphics card is often more important than the CPU for getting the best gameplay experience. CUDA apps, PhysX, etc. are just more of the same - offload compute-intensive stuff to the GPU.

It is probably the way forward - Intel thinks so, hence Larrabee, and Cell is similar in that it has a weak CPU to run the OS and a load of dedicated compute engines to do the hard stuff. However, I doubt it will end up working out quite the way Nvidia hopes, although by leading the way Nvidia obviously gets more say than they would otherwise.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
One thing I see wrong with your statement: you say NV was the one starting this. Abrash started working on Larrabee in 2003. AMD announced Fusion in '06. ATI announced Stream when? NV announced CUDA when? Sorry, Intel got this going. When the others got wind of what Intel was up to, then they started moving.

It's a lot like Havok right now. People are asking, where's Havok physics? It will arrive when the first game using it arrives - the CL version.

We have seen their demos - meteor, cloth and others. Just because you can't play it right now means very little.

When we do see a game using it - and we will - the same people will say that's only one game.
NV has the advantage short term because of priority; that's the only reason.

We're just about through this thing now. We'll soon see who chose best, who worked the hardest and longest, and who brings the most polished DX11 cards. Then we can talk about what each was doing since 2003. If Intel comes out and lays waste, the answer will be evident.

It really is hard to see everything Intel is up to, with all the different companies they bought. But Daniel Pohl actually gives us more insight. The RT demo of water is as close to real as I have seen. The meteor demo was great also, if you looked at all the physics going on. It was very good - the clouds swirling was a perfect example, but that scene had much more than that. Was it perfect? No. But it's still the best example I have seen. If you've seen better, post it.
 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
That is a good point, Dribble, but you have to consider that both ATI and Nvidia will soon have DX10.1 video cards, which will seriously boost its support. What if DX11 is shown to be a performance hog with no real advantage over DX10.1? I know that DX11 video cards will also be able to run in DX10.1 mode, but what is the point in getting a DX11 video card to run it in DX10?