DirectX


Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: live2game
Originally posted by: Gamingphreek
R520 cannot and will not have support for the next DX. However, I have heard that both ATI and Nvidia are supposed to have something a little bit above 9.0C. Not DX10 (or whatever you want to call it), but a little bit above 9.

Additionally, DX9.0B is not as good as C. There is most certainly a reason to buy a 6800GT over an X800XL or something to that effect. 9.0B does not support hardware displacement mapping, and the ATI cards do not have 32-bit FP support, nor the SM3.0 spec. While they run just as fast as Nvidia (if not faster), they are less "future-proof" because of the lack of these features.

I will say what I said in every other thread: if you plan on keeping your card for a while and holding out next gen, then get the Nvidia card because of the support for those features. If you plan on upgrading first thing next gen, you will probably want all the speed you can get this gen, so unless you have a huge lump of money sitting around for SLI, the ATI cards are more than likely the better buy, because the features do not matter as much when you are upgrading anyway.

More OT: No cards support DXxx because the spec isn't even released yet, hence the reason none of the next-gen cards will be able to support it.

-Kevin
You are an Nvidia fanboy. So why can't they support it?

Because the spec isn't finalized by MS, so they can't claim support.
 

imported_humey

Senior member
Nov 9, 2004
863
0
0
NVM, most here prob ain't ever run Longhorn or Avalon. I've run the L-H POS since the early alphas and it's OK to mess with for a short time; betas may be better soon. Anyhow, what's the problem with admitting some new games are DX9c and only Nvidia supports this now on new cards?

I do have Brothers in Arms in a box in front of me and its requirements state a DX9c card, etc., etc.

Maybe I'll just throw my Ultra in the bin and use my old GeForce4 Ti 4600SE, or try it on my old GeForce2 GTS Ultra.
 

Fern

Elite Member
Sep 30, 2003
26,907
174
106
Originally posted by: humey
NVM, most here prob ain't ever run Longhorn or Avalon. I've run the L-H POS since the early alphas and it's OK to mess with for a short time; betas may be better soon. Anyhow, what's the problem with admitting some new games are DX9c and only Nvidia supports this now on new cards?

AFAIK, there are no "real" DX9.0c games. There are a couple which have little bits of SM 3.0 here or there, as I understand it. Example: Far Cry; get the latest patch and it got some SM 3.0 eye candy added. In other words, no games yet, AFAIK, are built from the ground up to take advantage of DX9.0c or SM 3.0.

I do have Brothers in Arms in a box in front of me and its requirements state a DX9c card, etc., etc.

Does it say DX9c COMPLIANT or COMPATIBLE? I think you're reading a bit too much into the box wording. Although BIA may have some DX9c/SM3.0 code, I doubt it, only because I haven't seen any mention of it. You can be sure the Nvidia peeps would be telling the ATI peeps how much better the game looked on their cards cuz they can do DX9c/SM 3.0 :)

Maybe I'll just throw my Ultra in the bin and use my old GeForce4 Ti 4600SE, or try it on my old GeForce2 GTS Ultra.

 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: LTC8K6
Having to install DX9.0C does not mean that you need a DX9.0C hardware card.

Brothers In Arms will run on a Radeon 8500 and a GF4 ti4200.

ATI does not have a DX9.0C hardware card out yet. Currently that means absolutely nothing, and there is no indication that it will mean anything in the future.

Currently no game requires you to have a DX9.0C hardware card.

You might need a card that will work okay with DX9.0C, but many pretty old cards fit that description.

Are you sure about that? It wouldn't run on my friend's GF4 MX440.

EDIT: Never mind, that's because the MX series lacks the nFiniteFX engine. I just thought it was odd they'd have two cards in a line that support different features, but I guess they do.

From what I've read (the Inq), the R520 will support the "next DX" available. OK, if this is wrong I apologize. Maybe I'm confusing this with "DirectX Next". IF it IS right:

---
Just because it hasn't been codenamed doesn't mean it won't have support for the features of the "next DX". Put it this way: even though the "R520" will not have support for "DirectX 10", it will have support for the features of "(the next DX)" (maybe...) that will be announced in the near future. Happy? :) In turn, that means it will support the new games, if the spec is officially announced later on and includes the new features that may be introduced in the R520. The likelihood of that? I don't know... but if Microsoft and ATI are "in bed" together, it's probable...
---

What are the definitions of compliant and compatible? A GF4 isn't DX9 "compatible" or "compliant". The DX9 spec is an API that is only on cards that include it. Technically, the GF4 doesn't include DX9 support of any kind. Games will run on it because they can fall back on DX8. ...Am I wrong? I'm not quite sure what you're trying to say with compliant/compatible, LTC8K6. Does DX9.0C only use functions that have been available since DX8 or something? Then why's it a new DX?

Here's an example... say these are new function modifications in DX9:
http://msdn.microsoft.com/library/defau...ectx/graphics/ConvertingToDirectX9.asp

No GeForce4 MX card will have those, so it doesn't support them. If a game engine asks for these functions, it's not going to get them. It will have to fall back to something else.

From what I understand, DirectX(9.x) is a set of APIs that either IS or IS NOT implemented on a video chip. If it's not, the game ISN'T using DX9.x. It's falling back to the latest DirectX version that the chip (or reference rasterizer for that matter :evil: ) supports.

...and now I'm more confused than when I started this post. :confused:
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: live2game
Originally posted by: Gamingphreek
R520 cannot and will not have support for the next DX. However, I have heard that both ATI and Nvidia are supposed to have something a little bit above 9.0C. Not DX10 (or whatever you want to call it), but a little bit above 9.

Additionally, DX9.0B is not as good as C. There is most certainly a reason to buy a 6800GT over an X800XL or something to that effect. 9.0B does not support hardware displacement mapping, and the ATI cards do not have 32-bit FP support, nor the SM3.0 spec. While they run just as fast as Nvidia (if not faster), they are less "future-proof" because of the lack of these features.

I will say what I said in every other thread: if you plan on keeping your card for a while and holding out next gen, then get the Nvidia card because of the support for those features. If you plan on upgrading first thing next gen, you will probably want all the speed you can get this gen, so unless you have a huge lump of money sitting around for SLI, the ATI cards are more than likely the better buy, because the features do not matter as much when you are upgrading anyway.

More OT: No cards support DXxx because the spec isn't even released yet, hence the reason none of the next-gen cards will be able to support it.

-Kevin
You are an Nvidia fanboy. So why can't they support it?

Why am I an Nvidia fanboy? I said neither could claim support for it. I recommended Nvidia if you aren't upgrading next gen, simply because they support SM3.

A better question is why didn't you read my response?

-Kevin
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: xtknight
What are the definitions of compliant and compatible? A GF4 isn't DX9 "compatible" or "compliant". The DX9 spec is an API that is only on cards that include it. Technically, the GF4 doesn't include DX9 support of any kind. Games will run on it because they can fall back on DX8. ...Am I wrong? I'm not quite sure what you're trying to say with compliant/compatible, LTC8K6. Does DX9.0C only use functions that have been available since DX8 or something? Then why's it a new DX?

It helps if you expand the terms out a bit, to "API compatible" and "hardware feature compliant". Then it makes a bit more sense. The applications talk to the DirectX system runtimes on the system. The application is written to a certain revision of the DirectX API spec. The application *requires* at least that version of the DirectX runtime present on the system to operate. Note that this hasn't even gotten to the hardware yet; this is purely an issue of the high-level software APIs.

All DirectX system runtimes, being based on MS's OLE/COM technology, are also backwards-compatible. So if you have an app that uses the DX9.0C API, then you have to have the DX9.0C runtime on your system. However, an older game app that only uses DX5.0 should also run fine on top of the DX9.0C runtime, because the runtime is backwards-compatible and includes the COM interfaces from the older API versions. In fact, some applications can themselves detect if the DX9.0C runtime is present, and if not, fall back to attempting to use older DX interfaces. Generally though, if the app doesn't have any code-paths to use the features exposed in the newer interfaces, then it should simply be built to use the older DX API interfaces, period.
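
Rough sketch of that runtime-detection idea in C/C++ (my own made-up snippet, not from any real game; the DLL names are the actual runtime DLLs, the fallback logic is just for illustration):

#include <windows.h>
#include <stdio.h>

int main()
{
    /* d3d9.dll is only present if the DX9 runtime is installed */
    HMODULE d3d9 = LoadLibraryA("d3d9.dll");
    if (d3d9 != NULL)
    {
        printf("DX9 runtime present - use the DX9 interfaces\n");
        FreeLibrary(d3d9);
    }
    else
    {
        /* No DX9 runtime - try the older DX8 runtime instead */
        HMODULE d3d8 = LoadLibraryA("d3d8.dll");
        if (d3d8 != NULL)
        {
            printf("Only the DX8 runtime is here - fall back to the DX8 interfaces\n");
            FreeLibrary(d3d8);
        }
        else
        {
            printf("No usable DirectX runtime found\n");
        }
    }
    return 0;
}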

Ok, now onto the hardware side. Hardware is always evolving and adding new features. However, the DX spec is standardized at various revision levels. (That's one of the drawbacks of DX as compared to OpenGL, which has a provision for allowing access to vendor-specific, non-standardized features. In the DX world, apps have to wait for a new DX API version instead. Thus DX will always (theoretically) be behind OpenGL in terms of supporting new hardware features... but I digress.) These represent incremental hardware-feature support levels, more or less, along with some API cleanup along the way.

So, let's look at it this way. You're writing a game using DX, and the newest hardware includes some new whiz-bang feature that you would like your game to take advantage of, to appear "cutting edge". Let's assume that prior versions of the DirectX API don't support this new hardware feature, and neither did prev-gen hardware. A new DX API standard was just released to take advantage of this feature.

Since you want to be able to use this new hardware feature, you wouldn't want to write your app to the prior DX API spec, because that one doesn't support it. So you code for the newest spec. But in DX, every feature is (generally) available, whether or not the hardware actually supports it. It provides "device caps" to allow your app to find out if the feature is implemented in hardware (fast) or emulated by the DirectX runtime instead (slow). So in this case, you could either code your app expecting it to only be run on cards that are "hardware feature compliant" with the newest DX API spec, and thus it would require that hardware to run efficiently, or you could code your app to detect, using the newer DX API spec, whether or not the underlying hardware supports that feature or whether it is being emulated, so you could code around it. (Not use that graphical special effect, for example.) That would be the case if you were running an older graphics card that was not "hardware feature compliant", but was installed on a system with the newest DX runtime, with drivers that supported the newest DX runtime. (DDI interface version, I think, if you check using DXDIAG.EXE.)
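
If anyone wants to see what a "device caps" check actually looks like, here's a bare-bones DX9 sketch (my own example, not from any particular game; link against d3d9.lib, and the SM 3.0 check is just one caps bit a game might care about):

#include <d3d9.h>
#include <stdio.h>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { printf("DX9 runtime not available\n"); return 1; }

    D3DCAPS9 caps;
    /* Ask what the HAL device (the actual card + driver) supports in hardware */
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0) &&
        caps.VertexShaderVersion >= D3DVS_VERSION(3, 0))
        printf("SM 3.0 is supported in hardware\n");
    else
        printf("No SM 3.0 in hardware - take the SM 2.0/1.x code path\n");

    d3d->Release();
    return 0;
}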

Originally posted by: xtknight
No GeForce4 MX card will have those, so it doesn't support them. If a game engine asks for these functions, it's not going to get them. It will have to fall back to something else.
Well, the game can ask for them, and get them, but they may not be in hardware, so the game will run hella slow. So generally, if a feature is performance-critical to the game but not critical overall, the game code will detect whether the feature is in hardware, and work around it if not.
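
In game code that usually boils down to a tiny check like this (the helper name is hypothetical; only the caps field and the macro are real DX9 definitions):

#include <d3d9.h>

/* Only turn the expensive SM 3.0 effect on when the card does it in hardware;
   otherwise skip it, or use a cheaper approximation, rather than crawling. */
bool UseFancyShaderEffect(const D3DCAPS9& caps)
{
    return caps.PixelShaderVersion >= D3DPS_VERSION(3, 0);
}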

Originally posted by: xtknight
From what I understand, DirectX(9.x) is a set of APIs that either IS or IS NOT implemented on a video chip. If it's not, the game ISN'T using DX9.x. It's falling back to the latest DirectX version that the chip (or reference rasterizer for that matter :evil: ) supports.
No, chips don't implement DX at all. DirectX is implemented by the runtime (high-level APIs) and the video card drivers (low-level APIs). It helps, of course, if the actual chip hardware supports the same sort of data formats as the APIs do, because that eliminates any potential overhead from translating the data between what DirectX understands and what the hardware understands. Some things, though, like shaders, are implemented by the app using DX's HLSL, and then translated into a more hardware-specific format by a driver-level "shader compiler". (Actually, there are two: one in the DirectX high-level runtime, and another in the low-level drivers, at least in Nvidia's case.) But that's what it means when you hear "DirectX 8.1 hardware compliant": it means that the hardware itself is compatible, both in terms of data formats and in terms of supported features, with the DX 8.1 spec. (And additionally, "compliant" usually means "fully (100%) compatible", whereas "compatible" means "partially compatible". Example: a Voodoo3 card doesn't implement T&L in hardware, so it wouldn't be "DirectX 6.1 compliant", but it would be compatible, because it can run in a system with a DX 6.1 runtime, running games written to use the DX6.1 API spec version. It's just that the T&L API functions are emulated in software for those games.)
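
The T&L case maps onto DX9 terms something like this (sketch only; the function name is mine, but the caps flag and device-creation flags are the real DX9 ones):

#include <d3d9.h>

/* If the card's caps report hardware transform & lighting, create the device
   with hardware vertex processing; otherwise DirectX does the vertex work on
   the CPU - the game still runs, just without the hardware T&L. */
IDirect3DDevice9* CreateDeviceForThisCard(IDirect3D9* d3d, HWND hWnd,
                                          D3DPRESENT_PARAMETERS* d3dpp)
{
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    DWORD vp = (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
             ? D3DCREATE_HARDWARE_VERTEXPROCESSING   /* T&L on the GPU */
             : D3DCREATE_SOFTWARE_VERTEXPROCESSING;  /* T&L done on the CPU */

    IDirect3DDevice9* device = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd, vp, d3dpp, &device);
    return device;
}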

Originally posted by: xtknight
...and now I'm more confused than when I started this post. :confused:
I hope all of that made it less confusing, and not more... I hope. :p
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
xtknight, a GF4 MX is actually a GF2 core. That's why I eliminated it in my earlier post.

Most cards from the real GF4 and R8500 up are DX9 compatible. That means they will cause no trouble with the DX9.0c runtime. Anything they can't do in hardware is done in software, as far as I know. Either that, or the game runs in an earlier DX mode when it detects the card.

The game makers list the cards the game will run on, so just look at the site for the game.

BIA will clearly run on a gf4. There is no doubt about it.

It would make no sense for a game maker to have a game that will only run on the top end cards. Few people will even notice that the game is running in DX8.1 or DX9, or DX9b or c anyway. It would likely need to be pointed out to you for you to notice the differences.

Heck, a GF4 running a game in DX8.1 can be faster than a newer card trying to run the game in DX9.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: LTC8K6
Few people will even notice that the game is running in DX8.1 or DX9, or DX9b or c anyway. It would likely need to be pointed out to you for you to notice the differences.

Heck, a GF4 running a game in DX8.1 can be faster than a newer card trying to run the game in DX9.
We're not talking "faster"... a DX9 card (e.g. a 9800 Pro) will display much more eye candy than a DX8.1 card (e.g. an R8500). Just glance at the water textures and lighting and shadows... even a "blind" person can tell the difference. :p
:roll:
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
Yeah, I prob should have left DX8.1 off that comparison.

Regarding running faster in DX8.1, I was thinking of FX5200s vs GF4 Tis, apoppin.

Will an FX5200 really display more eye candy than a Ti4600? Is less more? :D

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: LTC8K6
Yeah, I prob should have left DX8.1 off that comparison.

Regarding running faster in DX8.1, I was thinking of FX5200s vs GF4 Tis, apoppin.

Will an FX5200 really display more eye candy than a Ti4600? Is less more? :D

I don't know much about the 5200 other than it is a relatively low-end card... is it even fully DX9?
I DO know the Radeon 9600 series will display more eye candy than the GF4 Ti4600 ;)
(even if the 4600 may be "faster" on the DX8 pathways)