My point is that this is yet another example of a poorly optimized, broken console-to-PC port. Sure, it has great graphics, but the level of PC hardware it requires just to match a $200 console is cringe-worthy. Just because a game is gorgeous-looking does not give it a waiver from being labelled poorly optimized in that context.
This is something that most of us are aware of, so repeating it doesn't really do any good. At any rate, I don't know if I would consider Quantum Break a gorgeous game. Too much blur..
My other point was that despite your consistent claims that console hardware has little to no benefit over PC parts when it comes to extracting low-level performance, this has been proven wrong time and time again this generation.
Console hardware is the same as PC hardware, just closed and more integrated.. It has no intrinsic advantage over similar PC hardware when it comes to low level performance. The big difference between them is the
SOFTWARE. That's where consoles have a significant advantage, because being a fixed platform means that developers can target much more focused optimizations than what's possible on PC; even if the latter uses DX12 or Vulkan.
First, an i3 + GTX750Ti could play most XB1/PS4 games, and generally an i5 and a GTX950/960 are needed just to provide similar IQ/performance to the XB1/PS4. What makes Forza Horizon 3 such a startling stand-out is the horrendous level of performance on a PC with an i7 + GTX970/R9 390 at 1080p, given the gigantic gulf in hardware advantage.
You answered this question yourself. Optimization clearly matters, especially for PC games. The question is, why do you keep repeating it as though it's some revelation? Usually it takes about 3 months to see final performance on a PC game, especially if the game is using a new and untested engine, or the developer is using a new API like DX12.
The other point you made, that optimizing a game for 30 fps is much easier, is unproven. First, at 1080p with 4xMSAA, neither the 970 nor the 390 can provide a locked 30 fps. The XBOne, otoh, more or less manages just that with a GPU 3X slower. Second, there is little doubt that once XB Scorpio comes out, this game will run at 4K, or at 1080p 60 FPS, something that today requires a $700-1200 1080/Titan XP.
You're using Forza Horizon 3, a poorly optimized title on the PC, as evidence? Come on, that's just disingenuous. If we're going to do comparisons, we need to do so on an equal footing; that is, with examples of great optimization on both consoles and PC. Some good examples that come to mind are The Division, GTA V, Star Wars Battlefront, etc. By all reports, Gears of War 4 will also be highly optimized on the PC, so that will be another great example to use, as it's well optimized for Xbox One as well.
But the point is, when you use highly optimized games for both, then the PC obviously comes out ahead by a significant margin. I mean, look at Doom. Consoles have to use dynamic resolution scaling (both will drop below 1080p) to maintain the 60 FPS target, and that's with below ultra settings. My PC on the other hand delivers a full native 1440p resolution, at 150 FPS, with
ABOVE ultra settings.. There's just no comparison!
Do you realize that NV themselves rated GTX580 as 9X faster than the GPU inside PS3?
Doesn't matter, because no game ever came close to fully exploiting the GTX 580, unlike the GPU in the PS3 which was totally maxed out by developers.
Assuming 50-80% SLI scaling, your setup was between 9X (with no SLI scaling) and 16X faster than the PS3. There is nothing special about $1000 USD of 580 SLI destroying 90%+ of 2010-2011 console ports, considering the XB 360/PS3 came out in 2005/2006.
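As a quick sanity check of the multipliers quoted above (taking NV's 9X single-card rating as given, and 50-80% as a typical SLI scaling range):

```python
# Rough check of the GTX 580 SLI vs PS3 multipliers quoted above.
base = 9.0  # NV's rating: one GTX 580 is ~9X the PS3's GPU

# SLI adds anywhere from nothing (0%) up to ~80% of a second card
for scaling in (0.0, 0.5, 0.8):
    total = base * (1 + scaling)
    print(f"{scaling:.0%} SLI scaling -> ~{total:.1f}X PS3")
# spans roughly the 9X (no scaling) to 16X range quoted
```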
I didn't say there was anything special. I was just making a point of how easy it was to max out console ports during the 360/PS3 era..
I am not sure why you bring up GTX580 SLI being adequate for end-of-generation Xbox 360/PS3 games in the current context of Quantum Break or FH3. By contrast, Forza Horizon 3 came out about 3 years after the Xbox One did, and it's wiping the floor with a $700 780Ti. Are you saying that's legit?
You're totally missing the point. My point was that technological trends matter just as much as optimization when it comes to final performance. Games developed in the 360/PS3 era are nothing like what they are now, as they are
WAY more complex and use compute much more heavily, so the performance profiles have obviously changed..
You are just stating the obvious without providing any of the details. You actually set yourself up by claiming that GTX580 SLI didn't need to be upgraded until the PS4/XB1 launched, forgetting that a single GTX580 is much faster than the HD7790 in the Xbox One.
I haven't set myself up for anything, because as I said before, no game whatsoever came close to tapping the GTX 580. To do so would require a low-level API like what you see on consoles..
Considering a GTX970/390 cannot even do 1080p 4xMSAA at a locked 30 fps on the PC when paired with an i7 5820K @ 4.4GHz, how do you think a GTX580 3GB would do? It would bomb.
I'm going to ignore comments about FH3, because the game is clearly unoptimized at this stage for PC. This will likely change in the future, so until then, no point in theorizing at all..
Yes, that is how it should be, except once again you are missing the details. It wasn't unusual for the Xbox 360 or PS3 to have a similar level of graphics/performance to a 2005-2006 PC with a $599 7800GTX 256MB in it. That's because the GPUs inside those consoles were very similar in performance to that card. Did you know that the GTX780 is almost 2.5X faster than the HD7790?
Don't know why you keep bringing this up. Optimization
MATTERS a great deal. So when you repeatedly ask why PC hardware that is much more powerful than console hardware doesn't stack up as well, the answer is obviously that the idiot developers didn't optimize their own game properly for the PC..
It doesn't matter how powerful the hardware is, if the game isn't optimized to use the hardware properly then performance is going to suffer..
Except for two things: (1) NV magically improved performance in Project CARS and The Witcher 3 on Kepler cards after Kepler owners complained, so we have a proven history of NV's drivers adding huge performance gains post-launch; (2) Kepler's performance degraded dramatically against GCN over the same 1.5-2 year period, while GCN and Maxwell remained a lot closer. Both Kepler and Maxwell have static schedulers, and neither architecture was ever a compute monster. Neither of them even has async compute, and I don't recall much proof that Maxwell is a superior compute architecture to Kepler. Right now a GTX780/780Ti trails an R9 290/290X by 20-30% in modern titles.
And after those driver improvements, Kepler was still much slower than Maxwell in those two games. Also, Maxwell significantly beefed up compute performance over Kepler..
You are trying to argue that older GPUs such as the GTX780Ti/R9 290/290X/970 shouldn't play modern titles well at 1080p, since technological advancement means future games get more demanding/complex, and these GPUs/architectures were never meant to play those future titles well. This argument would be all fine and dandy EXCEPT that a 1.75GHz 8-core Jaguar + HD7790 is miles behind the GTX780/780Ti/970 in performance, and architecturally no better than Hawaii. How, then, do you explain such mediocre performance on a 970/R9 390? Are you saying the HD7790 is more advanced for modern game engines?
Again, you're using a specific example of a poorly optimized title on PC, and comparing it with one of the best examples of optimization on the Xbox One..
If you're going to do that, then I will just cite Doom. The Xbox One can't even maintain 1080p at 60 FPS with a mixture of high and medium settings; it has to drop down to 900p, and even 720p at times.. A GTX 970, on the other hand, hits and maintains triple-digit framerates with Vulkan at 1080p ultra quality..
Notice how Quantum Break, Forza Horizon 3 and Gears of War Ultimate all launched with performance issues? A trend is starting to form: Xbox One exclusives are poorly optimized when ported to the PC.
Since the developer has already acknowledged stuttering and performance issues, I am surprised you won't acknowledge that FH3 has them.
Personally I don't really care about FH3, as I'm not a big fan of racing games. Also, whilst GoW Ultimate did launch with performance issues, The Coalition patched it up fairly quickly. Last time I played, I was getting triple-digit frame rates at 1440p max settings, which is impressive seeing as Unreal Engine 3.5 isn't really a modern engine to begin with..
It will be exciting to see what they do with Unreal Engine 4, which is much more powerful and capable when it comes to using modern hardware..