What DX 10.1 features do you like the most?

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I'm genuinely curious Ben, what DX10.1 features does the GT200 not support, since it seems like it has most things covered?

The only thing that I am aware of that isn't exposed in drivers atm is cubemap arrays, and I couldn't honestly say if they are supported in hardware or not (it seems nV has more functionality than they are willing to admit to; why, I have no idea- perhaps performance is suboptimal?). I'm not saying there isn't more missing, but if there is I haven't seen it.

I'm wondering, is this a sarcasm thread or a thread to show just how clueless most people are about DX10.1?

Actually, both ;)

But the more important question is if NV "unofficially" supports DX10.1, as I've heard in some rumors, then how would one utilize those features on NV hardware? For example, does the driver expose such functionality through OpenGL extensions? And if so, how come no game takes advantage of those features on NV hardware?

They are exposed under DX, do a caps test.

Of course when nVidia pulls stunts like this:

That single page I think sums up AT better than any other when it comes to vid card analysis. It is honestly shocking that they would publish it, perhaps more so than their shockingly ignorant flames in their AC review.

From what I've read (yeah mostly marketing material), I could see at least one clear advantage of DX10.1 (for us) - tightening the standard (requirements). Standards that everyone (be it video card makers or developers) has to follow tend to benefit consumers.

Benefit consumers, horrifically suffocate innovation. You say tomato ;)

When there is no entity to force the standards, it's the consumers who are disadvantaged.

This can go both ways. There is a rather large amount of functionality in BOTH the 48x0 and 2x0 cores that is not exposed under DX10- this functionality won't be properly utilized because developers almost exclusively utilize DX10 at this point.

I also remember John Carmack bemoaning ATI/NV doing the same things the same way with just different names.

That was the fault of OGL, but it was also a benefit in a lot of ways. Because of the open extension support, nV or ATi could release a new part with a new feature and see devs start to take advantage of it right away.

Exactly, so it would be logical to assume a GPU will be able to process the DX API much faster. In the case of DX10, special architectural changes (like those needed to process the geometry shader and vertex buffer pool) are needed to process truckloads of data.

Sounds like you have spent a lot of time with DirectX, and pretty much no time at all with any other 3D API. The problems you are talking about were native to DirectX; they never were an issue using legacy hardware in OpenGL or any other modern 3D API. The hardware didn't need to change- the horrifically coded segment of D3D that tanked small-batch geometric data needed to be reworked.
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Hardly anyone uses OpenGL these days, it is best suited for corridor based games :p and Google Earth, Carmack would know all about it. Longs Peak (OpenGL 3.0) is supposed to be a major revamp of OpenGL, I dunno much about it.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: taltamir
I didn't quote a single word from wiki, I linked it so that you will be able to read up on what multi sample AA is.
If you checked the link you would have seen that NOTHING I said in that post came from wiki.

Some developers may choose to require Vista Service Pack 1, which will be distributed broadly to end-users and includes a series of improvements outside of Direct3D 10.1. These developers can use the Direct3D 10.1 headers and libraries exclusively, taking a dependency on the Direct3D 10.1 DLLs which support both 10.0 and 10.1 hardware (some calls may fail, however, on 10.0 devices where the new functionality is not supported).

Translation to English: you don't need to write separate code for DX10.1, just use it and it will work on both DX10 and 10.1 hardware... except when it FAILS, oops!


Also, i don't see where it says AA suddenly becomes a free operation.

It won't be free, but the performance penalty will be reduced considerably. It's also true that you don't need to write separate code for DX10 and DX10.1; but, as stated, when the game is running on a DX10 card, the developer has to make sure to avoid calling DX10.1 features that aren't available on that card.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: schneiderguy
Originally posted by: taltamir

Assassin creed showed a 20% performance boost because a bug in their DX10.1 implementation caused it to not render some things

No, it allowed them to skip a rendering pass. There's a reason why the performance improvement only shows up when you turn on MSAA.

Also, I'm halfway through the game and I haven't seen any rendering bugs.

Many sites stated that the bug made the game not render some particles on screen; wow, sure, particles take a lot of performance away in a game. Strangely enough, I'm currently playing the game unpatched with the latest drivers and I see dust flying around (when Altair walks, or when the horse is running, etc.- something that didn't happen with the old Catalyst drivers).
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: ViRGE
I'm genuinely curious Ben, what DX10.1 features does the GT200 not support, since it seems like it has most things covered?

But it's still not DX10.1 compliant, since those optimizations aren't exposed in DX10, though they can be used in OpenGL. Look at the X800 card: it had almost everything that SM 3.0 had, but that couldn't be exposed on 2.0 cards- Geometry Instancing, for example. Did developers bother to compile games for a 2.0b card? Gears of War, Timeshift, Age of Empires 3, and maybe a very few more; but since DX10 doesn't use cap bits, I don't see how those nVidia capabilities can be exposed in DX10.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: Aberforth
All DX10 cards are definitely capable of DX 10.1; there is only little or no change in the architecture. Just like you enable PhysX on older G80 cards, you can enable DX 10.1 support with a firmware or driver update. They won't do this because they'd lose the quarterly sales; there is always a risk involved at the user end during a firmware update, and GPUs never undergo reliability testing.

It would be great if it were like you said, but the DX10.1 implementation requires modifications to the shader core, like FP32 precision, that a firmware update cannot provide.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: Aberforth
Originally posted by: VirtualLarry
Originally posted by: Aberforth
All DX10 cards are definitely capable of DX 10.1; there is only little or no change in the architecture. Just like you enable PhysX on older G80 cards, you can enable DX 10.1 support with a firmware or driver update. They won't do this because they'd lose the quarterly sales; there is always a risk involved at the user end during a firmware update, and GPUs never undergo reliability testing.

Flashing the BIOS isn't going to give the GPU new functions, sorry to say.

DX10 programs can be made to run without a DX10-capable GPU. A video card is just a graphics accelerator which is programmed by drivers and firmware.

Yeah, like a CPU, let's just flash a Pentium 3 firmware to add SSE2 and SSE3 instructions!! :roll:
 

pmv

Lifer
May 30, 2008
15,142
10,040
136
Hmmm, that original post seemed an unnecessarily sarcastic and confrontational way of asking a perfectly reasonable question.

Must say I was surprised to see in the Steam user base survey that, apparently, only 5% of Steam users (who responded) actually have DirectX10, let alone 10.1, so perhaps the whole issue is moot? If I read the survey correctly, only 15% have Vista, and of those only 1 in 3 have a DirectX10-capable card. On results like that I would doubt that Valve, for one, will be putting too much effort into DirectX10 in the near future. Presumably it means only a bit over 2% have DirectX10.1.

But these things change quickly, doesn't seem like very long ago I was swearing I'd never go near Vista, now I'm thinking it'll be part of my next upgrade...Though I'll probably end up getting it a month before Windows 7 comes out.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Aberforth
Hardly anyone uses OpenGL these days, it is best suited for corridor based games :p and Google Earth, Carmack would know all about it. Longs Peak (OpenGL 3.0) is supposed to be a major revamp of OpenGL, I dunno much about it.

Um, I use OpenGL, and I wouldn't discount it as unsuited for modern games. In fact, with OpenGL I can use modern gpu features like geometry shaders without the need for Vista or being tied to any specific version of proprietary M$ junk.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: BenSkywalker

That single page I think sums up AT better than any other when it comes to vid card analysis. It is honestly shocking that they would publish it, perhaps more so than their shockingly ignorant flames in their AC review.
Ignore AT's commentary and focus only on nVidia's PR quote then.

You don't think nVidia's quote is quite possibly one of the most ridiculous things ever stated?
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: BFG10K
Originally posted by: BenSkywalker

That single page I think sums up AT better than any other when it comes to vid card analysis. It is honestly shocking that they would publish it, perhaps more so than their shockingly ignorant flames in their AC review.
Ignore AT's commentary and focus only on nVidia's PR quote then.

You don't think nVidia's quote is quite possibly one of the most ridiculous things ever stated?

sure it is... but saying retarded things is natural for marketing...

Remember AMD's infamous "oh, the guy who was supposed to explain the TLB bug to the media was on a 3 month vacation"... There are many such gems.

Anyways, that statement by nVidia is malicious, false, ridiculous, etc... that doesn't change anything about the usefulness of DX10.1.

and AMD and Intel aren't any better... those aren't some family owned business.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: taltamir

sure it is... but saying retarded things is natural for marketing...
You and I know that, but I want to hear it from Ben instead of him pointing the finger at Anandtech's review.

I want Ben to admit the quote from nVidia's PR is utter garbage.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Ignore AT's commentary and focus only on nVidia's PR quote then.

OK-

"We support Multisample readback, which is about the only dx10.1 feature (some) developers are interested in. If we say what we can't do, ATI will try to have developers do it

Does anyone think ATi wouldn't do that? Because if they didn't, I would openly consider them fools. Hell, EVERY major player in the 3D market has tried to get developers to use features their competition can't and they do it all the time. Anyone who remotely hints that that isn't the case is delusional, ignorant, or a liar.

which can only harm pc gaming and frustrate gamers."

And then you have the PR segment of the quote. It would have a negative impact on nVidia's PR; how much of an end impact I honestly couldn't tell you, because I don't know what feature it is that they aren't capable of doing. PR is full of sh!t- equating a negative impact to themselves with that of PC gaming is the real over-the-top stretch, but they always have been, from all the companies; that is their job. We have been over this before- remember, according to ATi a single 4870 is the ultimate gaming experience, superior to 280s, even in SLI. PR departments spin all the time. The only remotely surprising thing about it is the percentage that was accurate versus spin.

Then you have-

NVIDIA insists that if it reveals it's true feature set, AMD will buy off a bunch of developers with its vast hoards of cash to enable support for DX10.1 code NVIDIA can't run. Oh wait, I'm sorry, NVIDIA is worth twice as much as AMD who is billions in debt and struggling to keep up with its competitors on the CPU and GPU side. So we ask: who do you think is more likely to start buying off developers to the detriment of the industry?

Straight up- that is profoundly ignorant fanboy drivel at best. nV saying ATi would encourage developers to use a feature nV can't, which we all know they have openly done in the past (as has nV, 3Dfx- hell, S3 used to do it too), somehow equates out to this inane drivel? Yeah, I'm not sure if this is more demonstrative than the AC article of just how openly and flamboyantly this site backs ATi.

I want Ben to admit the quote from nVidia's PR is utter garbage.

Heh, man this is getting old. nV and ATi both lie through their teeth all the time, that is what PR departments are paid to do. Do not ever confuse me with being as slanted as you are. Until you go off about ATi being the Ultimate Gaming Experience with as much gusto as you have about nV's equally obnoxious TWIMTBP you aren't being close to even handed.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: BenSkywalker

Until you go off about ATi being the Ultimate Gaming Experience with as much gusto as you have about nV's equally obnoxious TWIMTBP you aren't being close to even handed
I'll do that right after the list of GITG titles I've had problems in matches the number of TWIMTBP titles I've had problems in.

So far that list is zero so it may take a while. :roll:
 

tcsenter

Lifer
Sep 7, 2001
18,937
568
126
Originally posted by: tcsenter
Originally posted by: BenSkywalker
I have been hearing a ton of hype from all over the forums
Should I have stated these forums? Would that have been explicit enough? As far as evidence, how many quotes you want, 50? 100? Have to give me a number.
Two different topics will suffice, with special emphasis on actual use of terms such as "enormous" improvements or "enormous" new features that will constitute an "enormous" technological leap. Your words, but I would accept the use of different words that are equivalent in sentiment, character, or meaning (huge, massive, incredible, for example). Thanks.
I'd accept 'large' or 'gigantic' as well. Let me know when you've found them. Getting a little tired of checking back here.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I'll do that right after the list of GITG titles I've had problems in matches the number of TWIMTBP titles I've had problems in.

I'm talking about The Ultimate Gaming Experience, not GITG. I can see you still have your very interesting slant when it comes to PR. I have never had the slightest problem calling BS what it is; I am dumbfounded as to why you think for an instant I would now. The fact is nV's PR was more down to earth than AT's 'journalists'- that is a rather serious issue. PR lies all the time, they are paid to do it. AT comes back with a whole page of BS and that's supposed to be OK because nV had half a sentence's worth in a press statement.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: BenSkywalker

I'm talking about The Ultimate Gaming Experience, not GITG.
What issue do you have with the slogan "the ultimate gaming experience"?

Specifically how does a generic slogan compare to the paid TWIMTBP program which makes claims about specific games?

The only legitimate comparison to TWIMTBP is GITG and only if you have titles in that program that have issues on ATi hardware.

If you can't see that then it's a waste of time continuing this discussion with you.