Interesting article, though.
Tamasi even admitted that the PlayStation 2 was faster than a PC at the time of its launch.
That's just not true. There's no way in hell a PlayStation 2 was running some of the games that were out around its launch. Deus Ex launched months before the PS2, and the PS2 could never hope to run it. (There was an awesome stab at a port, and I'm glad they did it, and I'm impressed with the results, but it had to be downgraded in many big ways to get it onto a PS2.)
That generation, Nvidia was already way ahead. The original Xbox basically used a GeForce 3+ (or GeForce 4-, LOL), and the GameCube's GPU was made by a team (ArtX) that ATi quickly acquired (and seemingly the source of their success after that point?).
At any rate, save for that, I have no idea why people are complaining about this. Sure, it's self-serving, but so what? It's also true.
They're really upset over losing out on this round of consoles. All they can do at this point is whine, since bad publicity is better than no publicity.
That's not whining, and it's not bad publicity. It's also unknown what kind of deal AMD struck with these companies. It may heavily factor in that AMD could provide both the CPU and the GPU, or it could just be that AMD was willing to concede more than Nvidia was. Back on the original Xbox, Microsoft acted incredibly slimy towards Nvidia, demanding GPUs for less than what their contract stated, and then, when Nvidia wouldn't budge, retaliating with SM2.0. Microsoft cut Nvidia out of the loop on that spec, so Nvidia's GPUs ended up designed around the precision targets it presumed (the ones its OpenGL extensions used), while ATi could design for the actual SM2.0 targets. That made Nvidia look bad even though, technically, it had the more programmable part.
One of the problems is we've already had technology pushed so far that the increments we get are very small now.
I don't even remotely agree. We're probably centuries away from "maxing out" what could be done graphically and in terms of game engines, assuming Moore's Law could actually hold up that long, which it presumably can't. I heard that same ridiculous claim last time, and this set of consoles has done more than the last did, IMO. There's no reason to think the next set won't jump forward just as much, and the one after that, and... (assuming we still have dedicated consoles then).
The true pushing of technology isn't coming from slightly better graphics than the previous generation. It is in completely different areas like the Oculus Rift and even in technology like the Kinect (which isn't implemented as more than a gimmick, but the technology has vast potential).
Could not possibly disagree more with every part of that. The graphics aren't "slightly better"; the jump is massive. It's massive even in some first-gen stuff, let alone what we'll see in 5 years. And the gameplay possibilities will, as always, be just as massive.
Gimmicks are not leaps forward; they're gimmicks. Kinect serves no purpose for actual games. The Wii's "motion" controls have been largely dumped, and offhand I can't think of a single real game that used them successfully.
Controls will undoubtedly evolve, but replacing well-tested, accurate controls with ones that can't even function reliably, and that at best do a half-assed job of mimicking things we can do better in other ways, is not progress.
Nvidia doesn't get this, it seems.
If you were right, then Sony, Microsoft, AMD, Qualcomm, Apple, etc., etc., etc. don't get it either. The whole industry doesn't get it. They should all be adding more stupid gimmicks and never bothering to upgrade hardware.
Can't comment on Crysis yet since I still haven't played it, but I've heard good things about the gameplay, and regardless, it's weird to single out one company.