The same way the entire Fermi generation was a "fail" for being 6 months late? It brought working 3D gaming and unparalleled tessellation performance (about 2 years ahead of AMD), achieved 60% market share in discrete GPUs, gave us awesome price/performance in the GTX 460 and GTX 560 Ti, and put price pressure on AMD, resulting in such gems as the HD 6950 2GB at sub-$299. Do you think AMD would have focused on tessellation performance improvements in the HD 6900 and HD 7900 if NV hadn't been dominant in that area first? Competition can improve the landscape for all of us, even when the competitor arrives late.
In the current PC era of console ports and game engines that lack realism, it would be amazing to see a GPU maker add unique features that encourage developers to produce more realistic games. Sure, PhysX hasn't worked out so far, but that doesn't mean a company should just give up. I'd rather have more realistic physics effects (regardless of whether NV or AMD brings that advantage) than 80-120 fps in DX9 game engines and/or 20-30 fps in DX11 games with extreme levels of tessellation (arguably the only standout DX11 feature).
At the end of the day, the HD 7900 series didn't revolutionize much aside from ZeroCore Power technology. The new cards still can't cope with heavy tessellation in games like Metro 2033, struggle in demanding games like Dragon Age 2, and are still more or less unable to pull off Eyefinity on 3 monitors with a single card. That leaves us resorting to inefficient deferred AA settings in modern engines, which more often than not produce a blur fest and/or massive performance hits in games like BF3. Are we at a point of diminishing returns?
Fundamentally, games barely look better or feel more realistic than Crysis 1 did in 2007. Maybe I'm getting older, but I'm no longer impressed by 40% performance increases. I want more realistic PC games. Sure, having a card powerful enough to game on a 30-inch 2560x1600 monitor is a nice option, and being able to play on 3 screens is decent, but none of that improves the realism of gaming whatsoever.
If NV is 6 or even 9 months late but actually adds something with long-term potential to improve the gaming experience, it might be worth the wait for them and for all of us gamers (it might even push AMD to focus on physics for once). The unfortunate side effect of being late is the lack of pricing pressure on AMD in the meantime.
Personally, though, I don't care about Eyefinity (because of the bezels) or 120 Hz PC gaming at 120 fps. It would be a breath of fresh air if any company actually incorporated features that improve realism in games. After the incredible hype around BF3's graphics and realism, the game is a laughing stock. 5 years after Crysis 1 and PC gaming hasn't moved an inch, aside from more fluid character animations borrowed from EA Sports game engines. Sad, really.
Maybe it's time games started focusing more on physics and worrying less about AA and high resolutions. For examples of what's possible, see:
Softimage Lagoa ICE - Mousetrap 1080p HD
and
Real-Time Grass Rendering
and
Physically Guided Animation of Trees
GRAW with Ageia PhysX (not bad)
Or do people really want 5-monitor gaming at super-high resolutions with 128x AA filters and the same ragdoll and physics effects from 2008?