nVidia cards can do some DX10.1 features in Far Cry 2??

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
This

and this

It's not DX10.1 per se; it's picking and choosing a single extension and then manually writing a workaround to do the same thing in 10.0.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
10.1 adds no eye candy, so the same effects can be rendered in 10.0. This may or may not yield a performance hit, although judging from the "preview" benchmarks the game is not all that hard on the cards.
 

WelshBloke

Lifer
Jan 12, 2005
32,995
11,187
136
Originally posted by: Wreckage
10.1 adds no eye candy, so the same effects can be rendered in 10.0. This may or may not yield a performance hit, although judging from the "preview" benchmarks the game is not all that hard on the cards.

If 10.1 adds no eye candy, what effects are you talking about? :confused:

 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Actually, the newer nVidia cards aren't really DX10.0 cards; they're more or less DX10.05 cards. IOW, they are able to do some of the DX10.1 features, just not all of them, so they can't be called DX10.1 cards.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: WelshBloke
Originally posted by: Wreckage
10.1 adds no eye candy, so the same effects can be rendered in 10.0. This may or may not yield a performance hit, although judging from the "preview" benchmarks the game is not all that hard on the cards.

If 10.1 adds no eye candy, what effects are you talking about? :confused:

DX10 effects that are "supposed" to be faster in 10.1
 

thilanliyan

Lifer
Jun 21, 2005
12,052
2,271
126
Originally posted by: myocardia
Actually, the newer nVidia cards aren't really DX10.0 cards; they're more or less DX10.05 cards. IOW, they are able to do some of the DX10.1 features, just not all of them, so they can't be called DX10.1 cards.

How new? Only the 200 series?
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
I would swear that I read it was the 9 series and higher, but I'm not positive. It may very well be only the GTX 2x0s.

edit: Maybe keysplayr or nRollo know for sure. I'll dig around, and see if I can find the article, or whatever it was.

edit #2: I know for sure that G80 cards didn't have the capability. Were you wondering if your G92 8800GT has the capability?
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
It will make for odd advertising. "First game to support DX10.1!! Nvidia can do the limited DX10.1 that our game uses, and we recommend Nvidia." Or just get the PS3 version, as it is bug free.....

 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
And this is why MS took out the ability to use DXCAPS in DX10. Pick a standard and stick to it, otherwise you end up coding to specific cards. Both UBIsoft and NV need to be beaten with a large, fresh trout.
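
For anyone who never dealt with the old caps bits, this is roughly what DX9-style capability sniffing looked like. It's just a from-memory sketch of the documented D3D9 query, not anything from a shipping engine, but this per-card branching is exactly what the fixed DX10 feature set was supposed to kill off:

Code:
// Rough sketch of DX9-era capability sniffing, the kind of thing the fixed
// DX10 feature set was meant to eliminate. Error handling kept minimal.
#include <d3d9.h>

bool SupportsPs30(IDirect3D9* d3d)
{
    D3DCAPS9 caps = {};
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;

    // Under DX9 nearly every feature had to be probed like this, and games
    // ended up with separate code paths for whatever each vendor's parts reported.
    return caps.PixelShaderVersion >= D3DPS_VERSION(3, 0);
}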
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Quoted from SAMPSA at XtremeSystems: "Far Cry 2's DUNIA engine supports DirectX 10 and feature called Multi-sampled Depth Buffer Read. Game reads from a multisampled depth buffer to speed up antialiasing performance. Radeon GPUs implement this via DirectX 10.1 and it's implemented to GeForce GPUs via NVAPI (http://developer.nvidia.com/object/nvapi.html). According to NVIDIA there is no image quality or performance difference between the two implementations but I'm sure AMD/ATI will disagree on this."

So, he pretty much explains what is going on.
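
For the curious, here is a rough sketch of what the DX10.1 side of that feature looks like at the API level: creating an MSAA depth buffer that can also be bound as a shader resource. This is just the documented Direct3D 10.1 resource setup, not anything taken from the DUNIA engine, and I won't guess at the NVAPI equivalent since that side is driver specific.

Code:
// Sketch only: a 4x MSAA depth buffer that the engine can later read in a shader.
// As I understand it, combining DEPTH_STENCIL and SHADER_RESOURCE on a
// multisampled texture requires a Direct3D 10.1 device; a plain 10.0 device
// rejects it, which is the gap the NVAPI path works around on GeForce cards.
#include <d3d10_1.h>

ID3D10Texture2D* CreateReadableMsaaDepth(ID3D10Device1* dev, UINT width, UINT height)
{
    D3D10_TEXTURE2D_DESC desc = {};
    desc.Width              = width;
    desc.Height             = height;
    desc.MipLevels          = 1;
    desc.ArraySize          = 1;
    desc.Format             = DXGI_FORMAT_R24G8_TYPELESS; // typeless so the views below can reinterpret it
    desc.SampleDesc.Count   = 4;                           // 4x MSAA
    desc.SampleDesc.Quality = 0;
    desc.Usage              = D3D10_USAGE_DEFAULT;
    desc.BindFlags          = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;

    ID3D10Texture2D* depthTex = nullptr;
    if (FAILED(dev->CreateTexture2D(&desc, nullptr, &depthTex)))
        return nullptr;

    // View used while rendering the scene's depth.
    D3D10_DEPTH_STENCIL_VIEW_DESC dsvDesc = {};
    dsvDesc.Format        = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dsvDesc.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    ID3D10DepthStencilView* dsv = nullptr;
    dev->CreateDepthStencilView(depthTex, &dsvDesc, &dsv);

    // View used later to read the individual depth samples from a shader
    // (the HLSL side would declare Texture2DMS<float> and call Load(pixel, sampleIndex)).
    D3D10_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
    srvDesc.Format        = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    srvDesc.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;
    ID3D10ShaderResourceView* srv = nullptr;
    dev->CreateShaderResourceView(depthTex, &srvDesc, &srv);

    // A real engine would hold on to dsv/srv; they're dropped here to keep the sketch short.
    return depthTex;
}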
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Ubisoft guy 1: Hey, let's use some of those new DX10.1 effects to speed things up a bit.

Ubisoft guy 2: Wait, remember what happened with Assassin's Creed? DX10.1 is not the way it's meant to be played!

Nvidia: How about you code specifically for one feature of ours instead? That way you don't use features our competitor supports exclusively; they can do it their way and we will do it ours.

Ubisoft guy 1: What's the point of a standard then?

Ubisoft guy 2: Psst, we are a TWIMTBP title; whatever they say goes.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: keysplayr2003
Quoted from SAMPSA at XtremeSystems: "Far Cry 2's DUNIA engine supports DirectX 10 and feature called Multi-sampled Depth Buffer Read. Game reads from a multisampled depth buffer to speed up antialiasing performance. Radeon GPUs implement this via DirectX 10.1 and it's implemented to GeForce GPUs via NVAPI http://developer.nvidia.com/object/nvapi.html. According to NVIDIA there is no image quality or performance difference between the two implementations but I'm sure AMD/ATI will disagree on this."

So, he pretty much explains what is going on.

keys, any word on which GPUs support Multi-sampled Depth Buffer Reads through NVAPI? I fixed your link, BTW.
 

thilanliyan

Lifer
Jun 21, 2005
12,052
2,271
126
Originally posted by: myocardia
Were you wondering if your G92 8800GT has the capability?

Yeah.

Originally posted by: Sylvanas
Ubisoft guy 2: Wait, remember what happened with Assassin's Creed? DX10.1 is not the way it's meant to be played!

Lol. :)
 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
Originally posted by: Sylvanas
Ubisoft guy 1: Hey, let's use some of those new DX10.1 effects to speed things up a bit.

Ubisoft guy 2: Wait, remember what happened with Assassin's Creed? DX10.1 is not the way it's meant to be played!

Nvidia: How about you code specifically for one feature of ours instead? That way you don't use features our competitor supports exclusively; they can do it their way and we will do it ours.

Ubisoft guy 1: What's the point of a standard then?

Ubisoft guy 2: Psst, we are a TWIMTBP title; whatever they say goes.

That's so funny and it might also be true. :laugh:
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
And this is why MS took out the ability to use DXCAPS in DX10. Pick a standard and stick to it, otherwise you end up coding to specific cards. Both UBIsoft and NV need to be beaten with a large, fresh trout.

Vendor-specific extensions helped D3D kill off the last remnants of OpenGL. Take them away and I'm sure nVidia, with some help from Sony, would be more than happy to bring the API back into popularity. That enabling an nV extension could cause anyone to get upset should help clear up the issue of bias for you: if it upsets you, you are, decidedly, a fanboy. Ubisoft wouldn't be serving their customers properly if they failed to enable an extension that was readily available to them and worked flawlessly. Carmack used to have to jump through huge hoops to get ATi parts to render things properly under OpenGL; during DooM3 development, nV cards were sailing through while he still didn't have menus working on ATi parts. He did what he needed to do to get it up and running, to the best of his ability, for his customers.

If you are stating that gamers only deserve to have features enabled if they come from your favored vendor, then you should just come right out and say it. Enabling a vendor-provided extension has been commonplace in 3D gaming for more than a decade now.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: BenSkywalker
And this is why MS took out the ability to use DXCAPS in DX10. Pick a standard and stick to it, otherwise you end up coding to specific cards. Both UBIsoft and NV need to be beaten with a large, fresh trout.

Vendor-specific extensions helped D3D kill off the last remnants of OpenGL. Take them away and I'm sure nVidia, with some help from Sony, would be more than happy to bring the API back into popularity. That enabling an nV extension could cause anyone to get upset should help clear up the issue of bias for you: if it upsets you, you are, decidedly, a fanboy. Ubisoft wouldn't be serving their customers properly if they failed to enable an extension that was readily available to them and worked flawlessly. Carmack used to have to jump through huge hoops to get ATi parts to render things properly under OpenGL; during DooM3 development, nV cards were sailing through while he still didn't have menus working on ATi parts. He did what he needed to do to get it up and running, to the best of his ability, for his customers.

If you are stating that gamers only deserve to have features enabled if they come from your favored vendor, then you should just come right out and say it. Enabling a vendor-provided extension has been commonplace in 3D gaming for more than a decade now.
If being a fan of standards makes me a fanboy, then sure, lock me up and have me serve jail time sequentially for being a fan of (in no specific order): DX10, IEEE 1394, 802.11n, ISO 9660, XHTML, etc. :p

Standards ensure a level playing field for all. If S3 or Intel comes out with a card that similarly supports this depth-buffer feature, will UBIsoft patch FC2 to take advantage of it? Whereas if everyone just followed the standard, there wouldn't need to be any special-case handling. Just look at the madness that was Shader Model 2.0: there was what the R300 series did, there was what the NV30 series did, and then there was what the R400 series did, which was different still. Everyone wanted to add their own little feature incrementally, and the result is that if you want to support all the cards equally well, you have to write a shader path for three different SM2 variants. If you just target SM3, which works on NV40, G70, and R500, you only need one path.

And it's funny that you bring up Doom 3, since there's a great example of a disaster of different features. AMD supported some standard features and all of their own extensions correctly, yet other standard features, and NV extensions that were de facto standards, did not work correctly. They were ultimately not standards-compliant in any reasonable manner, and it made Carmack's job all the harder. OpenGL has a problem with standards anyhow, though, and the Khronos Group isn't helping matters with OpenGL 3 (although I wouldn't mind a resurgence of OpenGL, if only because it would force Apple to get their ass in gear on OS X).

Standards mean that you and I can browse sites with something other than Internet Explorer, standards mean that you and I can burn a CD and know it will work on another computer, and standards mean that I can write a document in one program and open it in another. Hardware vendors pushing non-standard features on developers, and developers using them in turn, is not a good thing. It creates uncertainty and incompatibility for users. DXCAPS were removed in order to enforce a specific platform, and it does everyone a disservice when developers won't stick to it.

Pick a standard and stick to it; don't try to ride the line and half-ass it, or you end up with IE6. Just because something is common (again, IE6) doesn't mean it's the right way to do things.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
10.1 is a STANDARD... aka, you have to implement all of the features in list X to be certified as DX10.1. What, you only got 50% of those features? Well, no certification for you; you are still just a DX10.0 card... but yes, some games could directly access and run specific commands that the card supports, even if they're part of a higher specification that the card doesn't fully meet.
It would have been nicer if they supported all of them (it would have made it simpler to develop games that take advantage of such features, too...).

Take MSAA 4x, for example... to be DX10.1 certified a card MUST be able to do 4x MSAA...
But nothing is stopping a DX10.0 card from being capable of doing 4x MSAA...
How many cards are NOT DX10.1 and can do it?
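
You can actually just ask the device. A quick sketch of the standard DX10 query follows; the call itself is plain DX10.0, nothing vendor- or 10.1-specific, and the format parameter is whatever your render target uses (e.g. DXGI_FORMAT_R8G8B8A8_UNORM):

Code:
// Sketch: even a plain DX10.0 device reports whether it can do 4x MSAA for a
// given format. DX10.1 just makes 4x mandatory rather than optional.
#include <d3d10.h>

bool Supports4xMsaa(ID3D10Device* dev, DXGI_FORMAT fmt)
{
    UINT qualityLevels = 0;
    if (FAILED(dev->CheckMultisampleQualityLevels(fmt, 4, &qualityLevels)))
        return false;
    return qualityLevels > 0;   // zero quality levels means 4x isn't supported for this format
}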
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Standards ensure a level playing field for all.

Really? Like DXTC? The 'standard' that was still proprietary (as S3TC) under any non-D3D API, is that a level playing field? Or how about a vendor that can handle 99% of the DX10.1 calls but that MS won't let use a DX10.1 API call, is that level? If nV had been the choice for the 360, then ATi would be the one lacking DX10.1 support, not nV.

Standards in no way whatsoever assure a level playing field, particularly not when the sole party deciding what the standard is going to be has a vested interest in seeing one IHV's parts succeed, and at the same time a vested interest in developers not wanting to work with another IHV's parts, to the tune of a multi-billion-dollar market.

And it's funny that you bring up Doom 3, since there's a great example of a disaster of different features. AMD supported some standard features and all of their own extensions correctly, yet other standard features, and NV extensions that were de facto standards, did not work correctly. They were ultimately not standards-compliant in any reasonable manner, and it made Carmack's job all the harder.

He made it all work. Would it have been better for him to simply use nV as refrast (which he honestly was doing at the time) and only release the game for parts that worked as they did? Not in the least, but that is exactly what you are saying developers should do after MS made up its mind on which IHV it was going to favor. For the record, I am not bashing MS here in any way: the console market is a huge one for them, and they are in a tight race with Sony in worldwide sales, so of course they are going to do what they can to boost their market, just as Sony, as self-serving as MS, will do what it takes to back nVidia. If you think DX10.1 was decided on due to technical merit, I would love to hear why; it was more like a list of features they forgot to add to DX10 publicly until after they checked to see whether nV supported them or not. Nothing in there offers any sort of major benefit except, perhaps, to the 3x00 ATi parts.