What DX 10.1 features do you like the most?


taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Aberforth
Originally posted by: taltamir
I didn't quote a single word from wiki, I linked it so that you would be able to read up on what multisample AA is.
If you checked the link you would have seen that NOTHING I said in that post came from wiki.


Also, I don't see where it says AA suddenly becomes a free operation.

That depends on your GPU and its drivers, but the API itself won't have a large impact on performance. Read this again: "multisampling has been enhanced to generalize coverage based transparency and make multisampling work more effectively with multi-pass rendering."

It's also very nice of you to educate me about MSAA, but I already knew it, since my job involves writing application modules based on D3D.

Exactly: work more effectively, increase performance... but not FREE.
Yes, DX10.1 MSAA will slow you down less, but it will still lower your FPS.
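To make the "work more effectively with multi-pass rendering" part concrete: the headline 10.1 change here is letting a multisampled depth buffer be read back directly in a later pass instead of being re-rendered or resolved first. A rough sketch of that setup (assuming an already-created ID3D10Device1* named dev; sizes, formats and error handling are illustrative only):

// Create a 4x MSAA depth buffer with a typeless format so it can be
// viewed both as a depth-stencil target and as a shader resource.
D3D10_TEXTURE2D_DESC td = {};
td.Width            = 1920;
td.Height           = 1080;
td.MipLevels        = 1;
td.ArraySize        = 1;
td.Format           = DXGI_FORMAT_R24G8_TYPELESS;
td.SampleDesc.Count = 4;
td.Usage            = D3D10_USAGE_DEFAULT;
td.BindFlags        = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;

ID3D10Texture2D* depthTex = NULL;
dev->CreateTexture2D(&td, NULL, &depthTex);

// Depth-stencil view used while rendering the scene.
D3D10_DEPTH_STENCIL_VIEW_DESC dsvd = {};
dsvd.Format        = DXGI_FORMAT_D24_UNORM_S8_UINT;
dsvd.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
ID3D10DepthStencilView* dsv = NULL;
dev->CreateDepthStencilView(depthTex, &dsvd, &dsv);

// Shader resource view over the same MSAA depth surface, so a later
// post-processing pass can read per-sample depth.
D3D10_SHADER_RESOURCE_VIEW_DESC1 srvd = {};
srvd.Format        = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
srvd.ViewDimension = D3D10_1_SRV_DIMENSION_TEXTURE2DMS;
ID3D10ShaderResourceView1* srv = NULL;
dev->CreateShaderResourceView1(depthTex, &srvd, &srv);

That last view is the part plain 10.0-class hardware doesn't provide, so the same data costs an extra pass there: less slow with 10.1, but still not free.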
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
This thread is inflammatory, I am disappointed in you BenSkywalker, unless you are being sarcastic that is...

Me, sarcastic? Never ;)

So far, for all the hype, all we have is a not quite clearly defined explanation of how the new MSAA works under DX10.1 (it mainly deals with how it can be applied while using post-processing, which ties in with the data passage mentioned in the specs) and not much else.

Anandtech and Elite Bastards certainly expressed disappointment in nvidia for not supporting it, and I assumed they had good reasons. If you have a site with better coverage of this subject, I would be happy to read it.

edit: Certainly implementation is being held back by Nvidia's lack of compliant hardware.

Tell me what feature nVidia is missing. Hell, get Elite Bastards or AT to tell you what feature is missing. DX 10.1 isn't like a 9-to-10 shift; developers can do a per-feature call check to see exactly what is exposed and what is not. Now I am in no way saying they aren't missing anything; what I want is for some of the vocal supporters of this fantastic new standard to point out what the big features are, so we can investigate and find out exactly what it is that nV doesn't support.
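As a rough illustration, that per-feature/feature-level check looks something like this in D3D10.1 terms (a sketch only; it assumes the standard D3D10CreateDevice1 entry point, and the helper name is made up):

#include <d3d10_1.h>

// Try for a full 10.1 device first, then fall back to 10.0. Whichever
// succeeds tells the engine exactly which feature set it can use.
ID3D10Device1* CreateBestDevice(IDXGIAdapter* adapter)
{
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,   // full DX10.1 feature set
        D3D10_FEATURE_LEVEL_10_0    // plain DX10 fallback
    };

    for (int i = 0; i < 2; ++i)
    {
        ID3D10Device1* device = NULL;
        if (SUCCEEDED(D3D10CreateDevice1(adapter, D3D10_DRIVER_TYPE_HARDWARE,
                                         NULL, 0, levels[i],
                                         D3D10_1_SDK_VERSION, &device)))
            return device;          // caller can query GetFeatureLevel()
    }
    return NULL;                    // no D3D10-class device available
}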

Let's be honest, if anyone is concerned about the issue due to lack of developer support, then isolating exactly what is missing would be a rather important issue, would it not? Figuring out which of the new DX10.1 features will not run flawlessly on all of the newest round of hardware should be something we are all interested in.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
So you know better than anandtech and hanners? Very nice, why are you wasting your time here promoting nvidia's finances? I am sure your own site would be excellent.

:beer:
 

VirtualLarry

No Lifer
Aug 25, 2001
56,343
10,046
126
What about virtualized graphics memory? I thought that was a big feature of DX 10.1. Why hasn't this been mentioned already? Or was this feature dropped, because NV doesn't support it?
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Originally posted by: VirtualLarry
What about virtualized graphics memory? I thought that was a big feature of DX 10.1. Why hasn't this been mentioned already? Or was this feature dropped, because NV doesn't support it?

I thought Video paging was a Windows Vista feature...
 

Obsoleet

Platinum Member
Oct 2, 2007
2,181
1
0
Nvidia and their cronies mission: downplay lack of DX10.1 hw compliance.
Thus, everyone who has financial or personal interest in Nvidia will do this.

The rest of us, unbiased consumers, will buy DX10.1 hardware from ATI. Thanks for your personal attempt at spin though!
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Originally posted by: Obsoleet
Nvidia and their cronies mission: downplay lack of DX10.1 hw compliance.
Thus, everyone who has financial or personal interest in Nvidia will do this.

The rest of us, unbiased consumers, will buy DX10.1 hardware from ATI. Thanks for your personal attempt at spin though!

qft, I couldn't have said it more clearly :thumbsup:
 

thilanliyan

Lifer
Jun 21, 2005
11,871
2,076
126
Originally posted by: Obsoleet
Nvidia and their cronies mission: downplay lack of DX10.1 hw compliance.
Thus, everyone who has financial or personal interest in Nvidia will do this.

The rest of us, unbiased consumers, will buy DX10.1 hardware from ATI. Thanks for your personal attempt at spin though!

Forgot to add: tell us how great PhysX and Cuda are.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
The feature I'm most interested in is the ability to anti-alias shaders without resorting to super-sampling the entire screen, an operation generally far too costly for modern games.

TRAA/AAA has been absolutely awesome for alpha textures, and if we can get something similar for shaders it'd be great.

Originally posted by: VirtualLarry
What about virtualized graphics memory? I thought that was a big feature of DX 10.1. Why hasn't this been mentioned already? Or was this feature dropped, because NV doesn't support it?
I thought this was a feature of the Vista driver model, not DirectX per se. But yeah, I also heard it had been dropped because nVidia couldn't support it.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Obsoleet
Nvidia and their cronies mission: downplay lack of DX10.1 hw compliance.
Thus, everyone who has financial or personal interest in Nvidia will do this.

The rest of us, unbiased consumers, will buy DX10.1 hardware from ATI. Thanks for your personal attempt at spin though!

Errrr....no.

As there are no games that support DX10.1, a person doesn't have to "downplay" lack of DX10.1 HW compliance; it's impossible to point out a need for it at this point.

DX10.1 is an unknown commodity at this point; our only first-hand knowledge of it comes from a game that removed it due to render errors.

It's a little premature to declare DX10.1 an advantage when not one game supports it and few are currently scheduled to. There's also the question of whether DX11 will hit before this matters, or what the difference will be in the DX10.1 games.


 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
ping... he was accusing ALL of us of being nvidia cronies, right? Let's see, yesterday I was an nvidia crony/fanboi 3 times and an AMD crony/fanboi 2 times... today... 2 nvidia crony (but one of them was general and applies to more than just myself), one AMD... I posted less today...
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: nitromullet
Yep, exactly... In the absence of any real price/performance advantages, forum goers have had to make up differences between the HD4870 and GTX 260 based on marketing material provided by the hardware manufacturers.

Ok, so maybe F@H... It might be important to some.

But when you have to use "proof-of-concept" or "Tesla rack" in your list of benefits, it doesn't really translate to a real price/performance advantage, IMO. I'm not arguing that DX10.1 is any better, mind you. The only reason we care about these things that barely matter is because the competing products are so close to each other in terms of price/performance.

Well I was certainly one who felt CUDA and PhysX were pointless features, up until they started showing huge benefits with existing hardware in real-world applications. As for Tesla, it won't directly impact gamers, but it definitely validates NV/Jensen's claims that CUDA and GPGPUs are faster than CPUs in many applications. Here are a few blurbs about it: Tesla Boosts Oil Industry
"[With the latest Tesla series] we're looking at a 10-30x performance improvement. That's a great performance boost and we're expecting more."
Considering you can fit 4 GT200-based Tesla cards in a 1U rack for ~$8,000, it's no surprise the HPC community is getting excited about Tesla.

With gaming, it is important to show proof of concept, as that's quite simply the easiest way to show people it's worthwhile. UT3 is the first example; I'm sure you've seen UT3 on a 9800GTX. From what I've read, you're not only getting better visuals/effects but better performance as well.

On my own system I went from P10,945 with CPU = 12,357 to P12,890 with CPU = 39,467. Sure, it's only 3DMark, but when you actually see the demo run it's very clear PhysX has teeth; it just needs to be implemented in current and future games. NV seems focused on making it work, and developers/gamers are certainly interested in physics as well, as many recent games already have software implementations of PhysX or Havok.
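(For scale, the jump above works out to roughly 39,467 / 12,357 ≈ 3.2x on the CPU test and (12,890 - 10,945) / 10,945 ≈ 18% on the overall score.)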

 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
So you know better than anandtech and hanners? Very nice, why are you wasting your time here promoting nvidia's finances? I am sure your own site would be excellent.

I'm promoting nothing at all. Just tell me what features you care about in 10.1; that's all I'm asking. I can understand if you just blindly do what you are told; if that is your personality type, that is certainly understandable. You were told to believe something, so you do. I don't work that way. All I'm asking is for you to tell me what it is that is great about 10.1.

The rest of us, unbiased consumers, will buy DX10.1 hardware from ATI. Thanks for your personal attempt at spin though!

All I've been trying to get is what feature they don't support. Come on now, this thread is getting plenty long enough for someone to post ONE FEATURE not supported by the 260/280. I'm not saying it isn't there; what is dumbfounding is that not a single person can say what it is.

Forgot to add: tell us how great PhysX and Cuda are.

CUDA is unrelated to gaming atm and will remain that way for some time; it is a feature aimed entirely at the HPC market at this point. If nV hadn't devoted the resources they did to CUDA, gaming performance would have been stronger on the 2x0 parts without a doubt. Don't confuse me with a company spokesman for either camp; I call them like I see them. Oh yeah, PhysX we have seen in proof-of-concept terms, but we have seen nothing to date to make it anything more than a footnote feature. That may change at some point, and yes, it is true we have seen far greater support for PhysX than we have for 10.1, but it simply isn't a factor at this point in time.

The feature I'm most interested in is the ability to anti-alias shaders without resorting to super-sampling the entire screen, an operation generally far too costly for modern games.

That is fully supported by the 260 and 280.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
The rest of us, unbiased consumers, will buy DX10.1 hardware from ATI. Thanks for your personal attempt at spin though!

All I've been trying to get is what feature they don't support. Come on now, this thread is getting plenty long enough for someone to post ONE FEATURE not supported by the 260/280. I'm not saying it isn't there; what is dumbfounding is that not a single person can say what it is.

They already did, repeatedly, weren't you reading?
You wanted someone to give you one feature that they support that the GTX260 or GTX280 does not, and they did: that feature is called "DX10.1". Doh!

That, and the magical ability to anti-alias without any FPS drop when done through DX10.1.



As for PhysX... I rated it as underwhelming. When I finally got it running on an 8800GTS 512 and tried some games and tech demos, all I could think was... Meh.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
I'm genuinely curious, Ben: what DX10.1 features does the GT200 not support, since it seems like it has most things covered?
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
All DX10 cards are definitely capable of DX 10.1; there is little or no change in the architecture. Just as you can enable PhysX on older G80 cards, you could enable DX 10.1 support with a firmware or driver update. They won't do this because they'd lose the quarterly sales, there is always a risk involved at the user end during a firmware update, and GPUs never undergo reliability testing.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
I'm wondering, is this a sarcasm thread or a thread to show just how clueless most people are about DX10.1? I would imagine the two biggest features of DX10.1 are multisample AA depth readback and cubemap arrays. But the more important question is: if NV "unofficially" supports DX10.1, as I've heard in some rumors, then how would one utilize those features on NV hardware? For example, does the driver expose such functionality through OpenGL extensions? And if so, how come no game takes advantage of those features on NV hardware?
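To put the second of those in concrete terms, a cube map array is one texture holding several complete cube maps (6 faces each) that a shader can index dynamically, e.g. for a pile of point-light shadow maps. A sketch of the setup (assuming an already-created ID3D10Device1* named dev; sizes and formats are arbitrary):

const UINT numCubes = 8;

// One array texture holding 8 cube maps = 48 faces.
D3D10_TEXTURE2D_DESC td = {};
td.Width            = 256;
td.Height           = 256;
td.MipLevels        = 1;
td.ArraySize        = 6 * numCubes;
td.Format           = DXGI_FORMAT_R16_FLOAT;
td.SampleDesc.Count = 1;
td.Usage            = D3D10_USAGE_DEFAULT;
td.BindFlags        = D3D10_BIND_SHADER_RESOURCE | D3D10_BIND_RENDER_TARGET;
td.MiscFlags        = D3D10_RESOURCE_MISC_TEXTURECUBE;

ID3D10Texture2D* tex = NULL;
dev->CreateTexture2D(&td, NULL, &tex);

// The cube-array view dimension only exists in the 10.1 interfaces.
D3D10_SHADER_RESOURCE_VIEW_DESC1 srvd = {};
srvd.Format                            = td.Format;
srvd.ViewDimension                     = D3D10_1_SRV_DIMENSION_TEXTURECUBEARRAY;
srvd.TextureCubeArray.MostDetailedMip  = 0;
srvd.TextureCubeArray.MipLevels        = 1;
srvd.TextureCubeArray.First2DArrayFace = 0;
srvd.TextureCubeArray.NumCubes         = numCubes;

ID3D10ShaderResourceView1* srv = NULL;
dev->CreateShaderResourceView1(tex, &srvd, &srv);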
 

tcsenter

Lifer
Sep 7, 2001
18,351
259
126
Originally posted by: BenSkywalker
I have been hearing a ton of hype from all over the forums
Should I have stated these forums? Would that have been explicit enough? As far as evidence, how many quotes do you want, 50? 100? You have to give me a number.
Two different topics will suffice, with special emphasis on actual use of terms such as "enormous" improvements or "enormous" new features that will constitute an "enormous" technological leap. Your words, but I would accept the use of different words that are equivalent in sentiment, character, or meaning (huge, massive, incredible, for example). Thanks.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
Originally posted by: BenSkywalker

That is fully supported by the 260 and 280.
I think this thread would work better if we had a list of things nVidia supports and could compare it to what is left.

Of course when nVidia pull stunts like this:

"If we say what we can't do, ATI will try to have developers do it, which can only harm PC gaming and frustrate gamers."
It's probably going to be quite hard. It's also quite hilarious that nVidia are a member of the PC Gaming Alliance.

http://www.anandtech.com/video/showdoc.aspx?i=3334&p=7
 

VirtualLarry

No Lifer
Aug 25, 2001
56,343
10,046
126
Originally posted by: Aberforth
All DX10 cards are definitely capable of DX 10.1; there is little or no change in the architecture. Just as you can enable PhysX on older G80 cards, you could enable DX 10.1 support with a firmware or driver update. They won't do this because they'd lose the quarterly sales, there is always a risk involved at the user end during a firmware update, and GPUs never undergo reliability testing.

Flashing the BIOS isn't going to give the GPU new functions, sorry to say.
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Originally posted by: VirtualLarry
Originally posted by: Aberforth
All DX10 cards are definitely capable of DX 10.1; there is little or no change in the architecture. Just as you can enable PhysX on older G80 cards, you could enable DX 10.1 support with a firmware or driver update. They won't do this because they'd lose the quarterly sales, there is always a risk involved at the user end during a firmware update, and GPUs never undergo reliability testing.

Flashing the BIOS isn't going to give the GPU new functions, sorry to say.

DX10 programs can be made to run without a DX10-capable GPU. A video card is just a graphics accelerator which is programmed by drivers and firmware.
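Concretely, that usually means the reference rasterizer, which runs the whole pipeline in software. A minimal sketch of that fallback (standard D3D10CreateDevice; error handling omitted):

// Create a D3D10 device backed by the reference (software) rasterizer
// instead of the GPU; every shader and raster op executes on the CPU.
ID3D10Device* refDevice = NULL;
HRESULT hr = D3D10CreateDevice(NULL,                         // default adapter
                               D3D10_DRIVER_TYPE_REFERENCE,  // software pipeline
                               NULL, 0,
                               D3D10_SDK_VERSION, &refDevice);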
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Making the CPU perform tasks that are supposed to happen on the GPU is gonna annihilate your FPS; there is simply no point in doing it.
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
From what I've read (yeah, mostly marketing material), I can see at least one clear advantage of DX10.1 (for us): tightening the standard (requirements). Standards that everyone (be it video card makers or developers) has to follow tend to benefit consumers. When there is no entity to enforce the standards, it's the consumers who are disadvantaged. I also remember John Carmack bemoaning ATI/NV doing the same things the same way with just different names.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Well, nobody is going to program for DX10.1; most companies are not even programming for DX10... there are fewer than 10 million DX10 cards out there, versus several hundred million DX9 cards.
How many DX10.1 cards are there? Even if nvidia made the G200 DX10.1, nobody would program for it; older cards don't magically poof out of existence when a new item shows up. It makes no sense to program three completely different methods of doing the same things just to benefit a progressively smaller market. DX10 also brought huge visual improvements, while DX10.1 just gives a slight performance boost.
The benefit of DX10.1 parts is that DX is cumulative (except when it is not, like DX10); that is, they will be able to do some DX11 functions when DX11 finally arrives (or would they? At least making your cards DX11 compliant would be less work since you already have some features implemented). In the meanwhile, it is a bad enough business decision to program your game for DX10, much less 10.1.

As much as I love the awesome graphical improvement of DX10, I think it is a mistake for most companies to make their games for it right now. 2009, though, would be a good time to release a DX10 game from a business standpoint.

Actually, most games could use a lot more content, fun, and forethought, and a lot less graphics. Eye candy is supposed to be icing on the cake, not the whole game.

That doesn't mean I am not buying expensive video cards to GET more eye candy though :p.
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Originally posted by: taltamir
there is simply no point in doing it.

Exactly, so it would be logical to assume a GPU will be able to process the DX API much faster. In the case of DX10, special architectural changes (like those for processing the geometry shader and vertex buffer pool) are needed to handle truckloads of data.

Also, some of the features of DX11 were removed from DX10 because of time constraints; those include GPU-based multi-threading and tessellation, and we might also see Microsoft's own proprietary physics system integrated into DX.

Get a sneak peek of DX 11 @ NVISION 08: http://speakers.nvision2008.co...ssion.cfm?sessionid=39