Forward+-style Global Illumination is utter garbage (don't make me post Dirt Showdown pics), rivaled only by Metro 2033's DOF.
What you're saying is that what Global Illumination adds isn't worth the performance penalty. A lot of people would agree with you. However, what most people don't realize is just how extremely intensive global illumination is. If you don't want this feature in next-gen games, no problem. But say you do?
"Thankfully, we were able to mitigate the performance penalty by migrating the lighting computation to DirectCompute, giving the breathtaking compute capabilities of the GCN Architecture a runway to strut its stuff. Graphics Core Next’s compute resources are actually so powerful that we were able to take this lighting one step further with a separable bilateral filter, which further improves the performance of the simulation and scrubs the scene of potential artifacts in the shadowing."
In other words, the performance hit without DirectCompute would have been far worse (as it is on the HD6970 or GTX680). I don't disagree that Global Illumination is too advanced for today's GPUs given the huge performance hit, even with the DirectCompute path. However, as GPUs get 4-5x more powerful in the future, wouldn't you rather reduce the performance hit by 70-80% if you could do so using DirectCompute? What you guys aren't reading is that DirectCompute is used in cases where the GPU performs the same calculations faster, not slower. This is nothing like tessellation. The downside is that some of these more advanced graphical features aren't worth their cost on most GPUs today, because existing architectures aren't advanced enough for games to lean heavily on compute shaders. At least under the Forward+ model MSAA works great in Dirt Showdown, while games like BF3/Crysis 3 roughly double the MSAA performance hit under their deferred lighting path.
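On that "separable bilateral filter" AMD mentions: the blog doesn't spell out what the shader actually does, but the general idea is well known, so here's a rough CPU-side sketch in Python/numpy. The function names, parameters, and the wrap-around edge handling are my own illustration, not Codemasters' actual DirectCompute code.

```python
import numpy as np

def bilateral_1d(img, radius=3, sigma_s=2.0, sigma_r=0.1, axis=0):
    """One 1D bilateral pass along the given axis.
    Each weight combines spatial distance and intensity difference,
    so hard edges are preserved while shadow noise is smoothed.
    (np.roll wraps at the borders; a real shader would clamp instead.)"""
    out = np.zeros_like(img)
    weight_sum = np.zeros_like(img)
    for d in range(-radius, radius + 1):
        shifted = np.roll(img, d, axis=axis)
        w_spatial = np.exp(-(d * d) / (2.0 * sigma_s ** 2))
        w_range = np.exp(-((shifted - img) ** 2) / (2.0 * sigma_r ** 2))
        w = w_spatial * w_range
        out += w * shifted
        weight_sum += w
    return out / weight_sum

def separable_bilateral(shadow_mask, radius=3, sigma_s=2.0, sigma_r=0.1):
    # Horizontal pass then vertical pass: 2*(2r+1) taps per pixel
    # instead of (2r+1)^2 for a full 2D bilateral filter.
    h = bilateral_1d(shadow_mask, radius, sigma_s, sigma_r, axis=1)
    return bilateral_1d(h, radius, sigma_s, sigma_r, axis=0)
```

The point of the "separable" part is the cost: two 1D passes touch 2*(2r+1) samples per pixel instead of (2r+1)^2 for the full 2D filter, which is exactly the kind of bandwidth-friendly workload that maps well to compute shaders.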
Both companies play dirty.
From the AMD blog on Sleeping Dogs:
"You, as the user, have configured your game to run at 1920×1080, and you’ve selected 4xSSAA as your anti-aliasing method. These settings tell the graphics card to render the game’s content at a 4x larger resolution of 3840×2160 (ultra-high definition), then resize that frame back down to 1920×1080 before display on the monitor. At 3840×2160, the game might have a hard edge with 16 pixels that are obviously jagged. After the resize to 1920x1080p, however, these pixels are reduced by our SSAA factor (4x in our example), leaving you with a considerably smaller jagged edge of just four pixels. This effect is applied across the entire scene, hiding visual artifacts in the same way shrinking a picture in an image editor can hide flaws in a photo. As we did with HDAO, however, we take AA one step further in Sleeping Dogs. The “Extreme” anti-aliasing setting uses the compute horsepower of Graphics Core Next to do another anti-aliasing pass on the final frame, which will smooth out those last four pixels of aliasing we described in the example above. The resources required to drive the extreme setting are quite intense." ~ Source
Remember how I joked that once AMD started throwing $ at developers through AMD Gaming Evolved, we'd need an AMD GPU for AMD GE titles and an NV GPU for NV's TWIMTBP titles?
Anyway, most of us here have already bought HD7000/GTX600 series cards. What's the point of arguing so intensely when it's not as if any of us will switch to the other brand's card? No, we'll just be upgrading to much faster cards.
Supposedly a leaked image of the Titan = 8+6 pin power connectors. The card has 12 memory chips on the front and another 12 on the back, which are expected to be covered by a backplate. The PCB holds an 8-phase VRM situated toward the back of the PCB along with the ICs and MOSFETs, and two additional VRM phases that power the memory are located near the SLI connectors. There are two SLI fingers on the card, which allows quad-way SLI support.
Here is GTX480 for comparison.