Originally posted by: BenSkywalker
And this is why MS took out the ability to use DXCAPS in DX10. Pick a standard and stick to it, otherwise you end up coding to specific cards. Both UBIsoft and NV need to be beaten with a large, fresh trout.
Vendor-specific extensions helped D3D kill off the last remnants of OpenGL. Take them away and I'm sure nVidia, with some help from Sony, would be more than happy to bring the API back into popularity. That enabling an nV extension could cause anyone to get upset should help clear up the issue of bias for you. If it upsets you, you are, decidedly, a fanboy. Ubisoft wouldn't be serving their customers properly if they failed to enable an extension that was readily available to them and worked flawlessly. Carmack used to have to jump through huge hoops to get ATi parts to render things properly under OpenGL; during DooM3 development, nV cards were sailing through while he still didn't have menus working on ATi parts. He did what he needed to do to get it up and running, to the best of his ability, for his customers.
If you are stating that gamers only deserve to have features enabled if they come from your favored vendor, then you should just come right out and say it. Enabling a vendor-provided extension has been commonplace in 3D gaming for more than a decade now.
If being a fan of standards makes me a fanboy, then sure, lock me up and have me serve consecutive jail time for being a fan of (in no specific order): DX10, IEEE 1394, 802.11n, ISO 9660, XHTML, etc.
Standards ensure a level playing field for all. If S3 or Intel comes out with a card that supports a similar depth buffer feature, will UBIsoft patch FC2 to take advantage of it? If everyone just followed the standard, there wouldn't need to be any special-case handling at all. Just look at the madness that was Shader Model 2.0: there was what the R300 series did, what the NV30 series did, and then what the R400 series did, which was different still. Everyone wanted to add their own little incremental feature, and the result is that if you want to support all the cards equally well, you have to write a shader path for three different flavors of SM2. If you just target SM3, which works on NV40, G70, and R500, you only need one path.
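To put that in concrete terms, here is a rough sketch (only a sketch, and nobody's shipping code) of what that path explosion looks like on the D3D9 side. The shader file names are made up, but the profile strings and the D3DXGetPixelShaderProfile helper are the real D3DX ones:

// Illustrative only: roughly what "three SM2 paths plus one SM3 path"
// ends up meaning for a D3D9 renderer. Shader file names are hypothetical.
#include <windows.h>
#include <cstring>
#include <d3d9.h>
#include <d3dx9.h>   // for D3DXGetPixelShaderProfile

const char* PickShaderFile(IDirect3DDevice9* device)
{
    // D3DX reports the best pixel shader profile the hardware supports:
    // "ps_2_0" (R300), "ps_2_a" (NV3x), "ps_2_b" (R4xx), "ps_3_0" (NV40/G70/R5xx)...
    const char* profile = D3DXGetPixelShaderProfile(device);

    if (strcmp(profile, "ps_3_0") == 0) return "game_ps30.fx";  // one path covers every SM3 part
    if (strcmp(profile, "ps_2_a") == 0) return "game_ps2a.fx";  // NV3x flavor of SM2
    if (strcmp(profile, "ps_2_b") == 0) return "game_ps2b.fx";  // R4xx flavor of SM2
    if (strcmp(profile, "ps_2_0") == 0) return "game_ps20.fx";  // baseline R300-class SM2
    return "game_fixed.fx";                                     // pre-SM2 fallback
}

Three separate SM2 branches just to light the same scene the same way on everybody's hardware; that's the cost of everyone "extending" the same shader model in different directions.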
And it's funny that you bring up Doom 3, since that's a great example of the disaster that mixed feature support creates. AMD supported some standard features and all of their own extensions correctly, yet other standard features, and NV extensions that had become de facto standards, did not work correctly. They were ultimately not standards-compliant in any reasonable sense, and it made Carmack's job all the harder. OpenGL has a problem with standards anyhow, and the Khronos group isn't helping matters with OpenGL 3 (although I wouldn't mind a resurgence of OpenGL, if only because it would force Apple to get their ass in gear on OS X).
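For anyone who never had to deal with it, the vendor-extension juggling Carmack was stuck with looks roughly like the sketch below. This is not id's actual code and the backend names are made up, but the ARB/ATI/NV extension strings are the real ones from that era:

// Illustrative sketch only: pre-GL3 renderers picked a whole backend by
// grepping one big extension string. Assumes a current GL context; on
// Windows, <windows.h> must be included before <GL/gl.h>.
#include <cstring>
#include <GL/gl.h>

static bool HasExt(const char* name)
{
    // The classic quick-and-dirty substring check everyone used back then.
    const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return all != 0 && strstr(all, name) != 0;
}

const char* PickRenderBackend()
{
    if (HasExt("GL_ARB_fragment_program")) return "ARB2";           // the vendor-neutral path
    if (HasExt("GL_ATI_fragment_shader"))  return "R200";           // ATi vendor extension
    if (HasExt("GL_NV_register_combiners")) return "NV-combiners";  // nVidia vendor extension
    return "fixed-function";                                        // everything else
}

Every one of those branches is a separate backend to write, test, and debug against drivers that may or may not honor the spec.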
Standards mean that you and I can browse sites with something other than Internet Explorer; standards mean that you and I can burn a CD and know it will work on another computer; standards mean that I can write a document in one program and open it in another. Hardware vendors pushing non-standard features on developers, and developers using them in turn, is not a good thing. It creates uncertainty and incompatibility for users. DXCAPS were removed from DX10 precisely to enforce a single, fixed feature set, and it does everyone a disservice when developers won't stick to it.
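For reference, "DXCAPS" means the D3D9-era device capability queries. Here's a rough, hypothetical sketch of the per-card interrogation that DX10's fixed feature set was designed to eliminate (the struct and field names are mine, the caps flags and API calls are real D3D9):

// Illustrative only: the D3D9 caps dance. Every one of these answers is a
// potential fork in the renderer.
#include <windows.h>
#include <d3d9.h>

struct OldSchoolCaps
{
    bool npotTextures;
    bool separateAlphaBlend;
    bool filterableFP32;
};

OldSchoolCaps QueryOldSchoolCaps(IDirect3D9* d3d, IDirect3DDevice9* device)
{
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps);

    OldSchoolCaps out;
    // Ignoring the conditional-NPOT case for brevity.
    out.npotTextures       = (caps.TextureCaps & D3DPTEXTURECAPS_POW2) == 0;
    out.separateAlphaBlend = (caps.PrimitiveMiscCaps & D3DPMISCCAPS_SEPARATEALPHABLEND) != 0;

    // Even pixel formats are negotiable under D3D9: is R32F filterable here?
    out.filterableFP32 = SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_QUERY_FILTER, D3DRTYPE_TEXTURE, D3DFMT_R32F));

    // Under D3D10 none of these questions exist: the feature set is fixed,
    // which is exactly the "pick a standard and stick to it" point.
    return out;
}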
Pick a standard and stick to it; don't try to ride the line and half-ass it, or you get IE6. Just because something is common (again, IE6) doesn't mean it's the right way to do things.