VirtualLarry
Originally posted by: SickBeast
NVidia tried to do that with NV30 and failed miserably. Instead of making a DX9-compliant card, they made one that complied with their "CineFX" technology and "C for Graphics". We all know how that situation turned out.
So many people forget: NVidia's first PC graphics hardware accelerator was called the NV-1. (I have one.) It was released before DirectX, and it not only used a proprietary API, it had its own proprietary way of rendering 3D scenes, using quadratic patches rather than triangle meshes. It was bundled with several Sega Saturn game ports, and it also had dual Saturn-compatible digital-joystick ports and a 64-voice music synthesizer. (Yes, it was a "multimedia accelerator": video, sound, and game input.)
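To make the quadratic-patch point concrete, here's a minimal sketch in plain C. This is purely illustrative, not NV1's actual hardware algorithm (which reportedly used forward differencing); it just shows why the two primitive types don't translate into each other: a triangle is three vertices and a flat interior, while a quadratic patch carries a 3x3 grid of control points and curves between them.

```c
#include <stdio.h>

typedef struct { float x, y, z; } Vec3;

/* A triangle is fully described by 3 vertices; the surface between
 * them is flat, so triangle-based hardware only interpolates linearly. */
typedef struct { Vec3 v[3]; } Triangle;

/* A biquadratic Bezier patch needs a 3x3 grid of control points and
 * can bulge: the surface is curved everywhere except at the corners. */
typedef struct { Vec3 p[3][3]; } QuadPatch;

/* Quadratic Bernstein basis: B0=(1-t)^2, B1=2t(1-t), B2=t^2. */
static void bern2(float t, float b[3]) {
    float s = 1.0f - t;
    b[0] = s * s;
    b[1] = 2.0f * s * t;
    b[2] = t * t;
}

/* Evaluate the patch at parametric coordinates (u, v). */
static Vec3 eval_patch(const QuadPatch *q, float u, float v) {
    float bu[3], bv[3];
    Vec3 r = {0, 0, 0};
    bern2(u, bu);
    bern2(v, bv);
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++) {
            float w = bu[i] * bv[j];
            r.x += w * q->p[i][j].x;
            r.y += w * q->p[i][j].y;
            r.z += w * q->p[i][j].z;
        }
    return r;
}

int main(void) {
    /* Flat 1x1 patch, except the center control point is pulled up. */
    QuadPatch q = {0};
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++) {
            q.p[i][j].x = i * 0.5f;
            q.p[i][j].y = j * 0.5f;
        }
    q.p[1][1].z = 1.0f; /* the bulge a triangle mesh can only approximate */

    Vec3 c = eval_patch(&q, 0.5f, 0.5f);
    printf("patch center: (%.2f, %.2f, %.2f)\n", c.x, c.y, c.z);
    return 0;
}
```

Tessellating that curved surface into the triangles Direct3D expects means sampling eval_patch() over a grid and losing the exact curvature, which is roughly the compatibility problem NV1 faced.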
Needless to say, when DirectX (well, Direct3D) was released, it was triangle-based. Most game developers were switching from DOS to 32-bit Windows platforms at the same time anyway, and most were tired of supporting half a dozen proprietary 3D APIs for their games. (If you think it's bad that game developers have to release patches for ATI or NV issues now, imagine having to test and issue patches against six different API standards. It truly was hell.)
Anyway, the NV-1 was totally incompatible with Direct3D. Sales tanked, and it very nearly killed the company. From then on, NV has always tried to fully support as many "open" 3D standards as they can, avoiding tying their hardware to any particular proprietary standard. This is probably one reason their OpenGL support is so good: if MS, for whatever reason, ever decides to take DirectX/D3D totally proprietary, or kill it off completely, it won't kill off NVidia in the process.
CineFX was more of a marketing technology, and "C for Graphics" isn't exactly dead; in fact, newer versions of DirectX are going to sport an HLSL compiler of their own, which IMHO is a good thing for developers.
I don't really think either of those initiatives was an attempt to move developers onto a proprietary API instead of a "standard" one, but rather a way to get them to prefer NV-specific enhancement technologies over a competitor's. Granted, that distinction may be rather slim, but it means the devs can still fall back to the "standard" path and stay compatible, with the tradeoff that their games will "look better" or otherwise have an advantage on NV-based hardware, if they choose to spend the additional time supporting those enhancements (see the sketch below).
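Here's a minimal sketch of that fall-back pattern in plain C. The driver_extension_string() function is a hypothetical stand-in for a real capability query such as OpenGL's glGetString(GL_EXTENSIONS), hard-coded so the sketch runs anywhere; GL_NV_register_combiners is a real NV extension from that era, and the render functions are stubs.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical stand-in for a real capability query such as OpenGL's
 * glGetString(GL_EXTENSIONS); hard-coded here so the sketch runs anywhere. */
static const char *driver_extension_string(void) {
    return "GL_ARB_multitexture GL_NV_register_combiners GL_NV_vertex_program";
}

static int has_extension(const char *name) {
    /* Naive substring match; real code should tokenize on spaces so that
     * e.g. "GL_NV_vertex_program" doesn't also match a longer name. */
    return strstr(driver_extension_string(), name) != NULL;
}

/* The render paths themselves are stubs; the point is the structure:
 * always ship the standard path, and opt into the vendor path only
 * when the hardware advertises it. */
static void render_standard_path(void)    { puts("standard multitexture path"); }
static void render_nv_enhanced_path(void) { puts("NV register-combiner path"); }

int main(void) {
    if (has_extension("GL_NV_register_combiners"))
        render_nv_enhanced_path(); /* nicer output on NV hardware */
    else
        render_standard_path();    /* everything else still works */
    return 0;
}
```

The game stays compatible everywhere because the standard path is always there; the NV-specific path is an optional bonus, which is exactly the "slim distinction" from a proprietary API lock-in.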