PCPer is out with an article about how the PS4 version of the UE4 demo was scaled back. I guess at the end of the day, raw GFLOPS matter (and can't be replaced by efficiency gains).
Honestly, that editorial is complete rubbish, although Epic can certainly take some of the blame for throwing out random numbers with imprecise language.
Let's do some math:
A GTX 680 is rated at 3.09 TFLOPs, so in order to render a scene that requires a "theoretical" 2.5 TFLOPs, the GTX 680 would need to be (2.5 / 3.09 * 100) 80.9% efficient, which is incredibly high.
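If you want to check that arithmetic yourself, here's a quick Python sketch (the 3.09 TFLOPs is the 680's rated peak, the 2.5 TFLOPs is just the number Epic threw out):

gtx_680_peak_tflops = 3.09   # GTX 680 rated peak throughput
claimed_tflops = 2.5         # Epic's "theoretical" requirement for the demo

required_efficiency = claimed_tflops / gtx_680_peak_tflops * 100
print(f"Required efficiency: {required_efficiency:.1f}%")  # ~80.9%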
Let's look at the latest ratings for 1920x1200 from TPU. It's a bit imprecise, but will fit our needs adequately.
GTX 580 = 60% at 1581 GFLOPs, or 26.35 GFLOPs per %
GTX 680 = 76% at 3090 GFLOPs, or 40.66 GFLOPs per %
Based on these numbers we can tell that the GTX 680 is only (26.35 / 40.66 * 100) 64.8% as efficient per FLOP as the GTX 580. Can you see the issue now?
Assuming that the GTX 580 is 100% efficient, which it obviously isn't -- but for the sake of argument we'll assume it is -- this leaves the GTX 680 with a (2500 - (3090 * 0.648)) 497.7 GFLOP performance shortfall compared to the 2.5 TFLOPs that are supposedly required to run the demo. The shortfall would be far greater still if we factored in the GTX 580's actual efficiency instead of assuming 100%.
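And here's the rest of the math in one place, for anyone who wants to poke at it -- a small sketch using the TPU ratings and rated GFLOPs from above:

# TPU relative performance ratings at 1920x1200, plus rated peak GFLOPs
gtx_580_rating, gtx_580_gflops = 60, 1581
gtx_680_rating, gtx_680_gflops = 76, 3090

gflops_per_pct_580 = gtx_580_gflops / gtx_580_rating   # ~26.35
gflops_per_pct_680 = gtx_680_gflops / gtx_680_rating   # ~40.66

# How efficient the 680 is per FLOP relative to the 580 (580 treated as 100%)
relative_efficiency = gflops_per_pct_580 / gflops_per_pct_680  # ~0.648

# Effective GFLOPs the 680 actually delivers under that assumption,
# versus the 2500 GFLOPs the demo supposedly requires
effective_gflops_680 = gtx_680_gflops * relative_efficiency    # ~2002
shortfall = 2500 - effective_gflops_680                        # ~498 GFLOPs
print(f"Shortfall: {shortfall:.1f} GFLOPs")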
The fact that the GTX 680 can run the Samaritan demo at all debunks the claim that it requires 2.5 TFLOPs of theoretical power -- the GTX 680 isn't anywhere close to efficient enough to bring that much power to bear on the task.
Epic is guilty of pulling some random numbers out of their ass (and/or misusing the word "theoretical"), and the author of the editorial is guilty of not seeing how he's contradicting himself.
This isn't to say that PS4 can or cannot run the Samaritan demo, just that the editorial is complete garbage.