Quantum Break GPU test (Gamegpu.com)

Page 8

dogen1

Senior member
Oct 14, 2014
739
40
91
The other point you made that optimizing a game for 30 fps is much easier is unproven.

Of course optimizing for 30 fps is easier: the frame budget is ~33 ms instead of ~17 ms, so you get twice as much CPU and GPU time per frame.
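The arithmetic behind that claim is straightforward; a quick sketch:

```python
def frame_budget_ms(target_fps: float) -> float:
    """Time available to produce one frame, in milliseconds."""
    return 1000.0 / target_fps

budget_30 = frame_budget_ms(30)  # ~33.3 ms per frame
budget_60 = frame_budget_ms(60)  # ~16.7 ms per frame
print(f"30 fps: {budget_30:.1f} ms, 60 fps: {budget_60:.1f} ms, "
      f"ratio: {budget_30 / budget_60:.0f}x")
# → 30 fps: 33.3 ms, 60 fps: 16.7 ms, ratio: 2x
```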

I don't recall much proof that Maxwell is a superior compute architecture to Kepler. Right now a GTX780/780Ti trails an R9 290/290X by 20-30% in modern titles.

Just want to mention that Maxwell was actually a big improvement in compute performance.

In some areas it improved more than 3x over Kepler.
https://forum.beyond3d.com/posts/1946116/

Benches also showed large improvements.
http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/20
 
  • Like
Reactions: Carfax83

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
My point is this is yet another example of a poorly optimized and broken console-to-PC port. Sure, it has great graphics but the level of PC hardware it requires to match a $200 console is cringe-worthy. Just because a game is gorgeous looking does not give it a waiver from being labelled poorly optimized when considering the context.

This is something that most of us are aware of, so repeating it doesn't really do any good. At any rate, I don't know if I would consider Quantum Break a gorgeous game. Too much blur..

My other point was that, despite your consistent claim that console hardware has little to no benefit over PC parts when it comes to extracting low-level performance, time and time again this has been proven wrong this generation.

Console hardware is the same as PC hardware, just closed and more integrated.. It has no intrinsic advantage over similar PC hardware when it comes to low level performance. The big difference between them is the SOFTWARE. That's where consoles have a significant advantage, because being a fixed platform means that developers can target much more focused optimizations than what's possible on PC; even if the latter uses DX12 or Vulkan.

First, an i3 + GTX750Ti could play most XB1/PS4 games, and generally an i5 and a GTX950/960 are needed just to provide similar IQ/performance to the XB1/PS4. What makes Forza Horizon 3 such a startling stand-out example is the horrendous level of performance on a PC with an i7 + GTX970/R9 390 at 1080p, given the gigantic gulf in hardware advantage.

You answered this question yourself. Optimization clearly matters, especially for PC games. The question is, why do you keep repeating it as though it's some revelation? Usually it takes about 3 months to see final performance on a PC game, especially if the game is using a new and untested engine, or the developer is using a new API like DX12.

The other point you made, that optimizing a game for 30 fps is much easier, is unproven. First, at 1080p with 4xMSAA, neither the 970 nor the 390 can provide a locked 30 fps. The XBOne, otoh, manages to more or less do just that with a GPU 3X slower. Second, there is little doubt that once XB Scorpio comes out, this game may even run at 4K, or at 1080p 60 FPS, something that today requires a $700-1200 1080/Titan XP.

You're using Forza Horizon 3, a poorly optimized title for the PC, as evidence? Come on, that's just disingenuous. If we're going to do comparisons, we need to do so on an equal footing; that is, with an example of great optimization for both consoles and PC. Some good examples that come to mind are The Division, GTA V, Star Wars Battlefront, etc. By all reports, Gears of War 4 will also be highly optimized on the PC, so that will be another great example to use, as it's well optimized for Xbox One as well.

But the point is, when you use highly optimized games for both, then the PC obviously comes out ahead by a significant margin. I mean, look at Doom. Consoles have to use dynamic resolution scaling (both will drop below 1080p) to maintain the 60 FPS target, and that's with below ultra settings. My PC on the other hand delivers a full native 1440p resolution, at 150 FPS, with ABOVE ultra settings.. There's just no comparison!
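As an aside, the dynamic resolution scaling mentioned here is conceptually simple: measure GPU frame time, and shrink or grow the render resolution to stay inside the frame budget. A minimal sketch, where the step size, headroom threshold, and clamp range are illustrative assumptions rather than what any console title actually uses:

```python
def next_resolution_scale(scale: float, gpu_frame_ms: float,
                          budget_ms: float = 1000.0 / 60.0) -> float:
    """Adjust the render-resolution scale so GPU frame time stays in budget.

    Rendering cost is roughly proportional to pixel count (scale**2), so we
    step the scale down when over budget and back up when there is headroom.
    The 5% step and 90% headroom threshold are made-up tuning values.
    """
    if gpu_frame_ms > budget_ms:
        scale -= 0.05          # over budget: drop resolution
    elif gpu_frame_ms < 0.9 * budget_ms:
        scale += 0.05          # comfortable headroom: raise resolution
    return min(1.0, max(0.5, scale))  # clamp, e.g. between 540p and 1080p
```

At a scale of ~0.83, a 1080p target renders at roughly 900p, which matches the kind of drops reported for Doom on consoles.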

Do you realize that NV themselves rated GTX580 as 9X faster than the GPU inside PS3?

Doesn't matter, because no game ever came close to fully exploiting the GTX 580, unlike the GPU in the PS3 which was totally maxed out by developers.

Assuming 50-80% SLI scaling, your setup was between 9X (no SLI scaling) and 16X faster than the PS3. There is nothing special about $1000 USD of 580 SLI destroying 90%+ of 2010-2011 console ports considering the XB 360/PS3 came out in 2005/2006.
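For reference, the multiplier range quoted here falls straight out of the arithmetic, taking NV's 9X single-card figure as given:

```python
base = 9.0  # NV's claimed GTX 580 vs. PS3 GPU (RSX) ratio
for sli_scaling in (0.0, 0.5, 0.8):
    total = base * (1 + sli_scaling)
    print(f"SLI scaling {sli_scaling:.0%}: {total:.1f}x faster than PS3")
# → SLI scaling 0%: 9.0x, 50%: 13.5x, 80%: 16.2x
```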

I didn't say there was anything special. I was just making a point of how easy it was to max out console ports during the 360/PS3 era..

I am not sure why you bring up the comparison of GTX580 SLI being adequate for end of generation Xbox 360/PS3 games and the current context of Quantum Break or FH3. In contrast, Forza Horizon 3 came out about 3 years since Xbox One did and it's wiping the floor with a $700 780Ti. You are saying that's legit?

You're totally missing the point. My point was that technological trends matter just as much as optimization when it comes to final performance. Games that were developed in the 360/PS3 era, are nothing like what they are now as they are WAY more complex and use compute much more heavily, so the performance profiles have obviously changed..

You are just stating the obvious without providing any of the details. You actually set yourself up by claiming that GTX580 SLI didn't need to be upgraded until the PS4/XB1 launched, but you forgot that a single GTX580 is much faster than the HD7790 in the Xbox One.

I haven't set myself up for anything, because as I said before, no game whatsoever, came close to tapping the GTX 580. To do so would require a low level API like what you see in consoles..

Considering GTX970/390 cannot even do 1080p 4xMSAA 30 fps locked on the PC when paired with an i7 5820K @ 4.4Ghz, how do you think a GTX580 3GB would do? It would bomb.

I'm going to ignore comments about FH3, because the game is clearly unoptimized at this stage for PC. This will likely change in the future, so until then, no point in theorizing at all..

Yes, that is how it should be, except once again you are missing the details. It wasn't unusual for the Xbox 360 or PS3 to have a similar level of graphics/performance to a 2005-2006 PC with a $599 7800GTX 256MB in it, because the GPUs inside those consoles are very similar in performance to that card. Did you know that the GTX780 is almost 2.5X faster than the HD7790?

Don't know why you keep bringing this up. Optimization MATTERS a great deal, so repeatedly asking why PC hardware which is much more powerful than console hardware doesn't stack up as well, it's obviously because the idiot developers didn't optimize their own game properly for the PC..

It doesn't matter how powerful the hardware is, if the game isn't optimized to use the hardware properly then performance is going to suffer..

Except 2 things: (1) NV magically improved performance in Project CARS and The Witcher 3 on Kepler cards after Kepler owners complained -- so we have a proven history of NV's drivers adding huge performance gains post-launch; (2) Kepler's performance degraded dramatically against GCN during the same 1.5-2 year period in which GCN and Maxwell remained a lot closer. Both Kepler and Maxwell have static schedulers, and neither architecture was ever a compute monster. Neither of these architectures even has async compute, and I don't recall much proof that Maxwell is a superior compute architecture to Kepler. Right now a GTX780/780Ti trails an R9 290/290X by 20-30% in modern titles.

And after these driver improvements, Kepler was still much slower than Maxwell in those two games. Also, Maxwell had significantly beefed up compute performance over Kepler.. Case in point:

[compute benchmark chart]


You are trying to make the argument that older GPUs such as the GTX780Ti/R9 290/290X/970 shouldn't play modern titles well at 1080p, since technological advancement means future games get more demanding/complex and these GPUs/architectures were never meant to play those future titles well. This argument would be all fine and dandy EXCEPT that a 1.75Ghz 8-core Jaguar + HD7790 is miles behind GTX780/780Ti/970 GPUs in performance, and architecturally it is not any better than Hawaii. Then how do you explain such mediocre performance on the 970/R9 390? You are saying the HD7790 is more advanced for modern game engines? ;)

Again, you're using a specific example of a poorly optimized title on PC, and comparing it with one of the best examples of optimization on the Xbox One..

If you're going to do this, then I will just cite Doom. Xbox One can't even maintain 1080p at 60 FPS with a mixture of high and medium settings, but has to drop down 900p, and even 720p at times.. GTX 970 on the other hand is hitting and maintaining triple digit framerates with Vulkan at 1080p ultra quality..

Notice how Quantum Break, Forza Horizon 3 and Gears of War Ultimate all launched with performance issues? It's starting to form a trend that Xbox One exclusives are poorly optimized when ported to the PC. Since the developer is already acknowledging stuttering and performance issues, I am surprised you are not acknowledging that FH3 has performance issues.

Personally I don't really care about FH3 as I'm not a big fan of racing games. Also, whilst GoW Ultimate did launch with performance issues, The Coalition patched it up nicely fairly quickly. Last time I played, I was getting triple digit frame rates at 1440p max settings, which is impressive seeing as Unreal Engine 3.5 isn't really a modern engine to begin with..

It will be exciting to see what they do with Unreal Engine 4, which is much more powerful and capable when it comes to using modern hardware..
 
  • Like
Reactions: tviceman

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
But the point is, when you use highly optimized games for both, then the PC obviously comes out ahead by a significant margin. I mean, look at Doom. Consoles have to use dynamic resolution scaling (both will drop below 1080p) to maintain the 60 FPS target, and that's with below ultra settings. My PC on the other hand delivers a full native 1440p resolution, at 150 FPS, with ABOVE ultra settings.. There's just no comparison!

Your CPU costs $600 and the GPU costs $700+, or $1500+ for just two components of your build. Those alone are five times the cost of the Xbox One S.

You are ignoring the price of the parts in your post, which is what he was trying to reference. The XB1/PS4 are running very outdated hardware which performs better than a lot of newer hardware. Comparing their much cheaper, older hardware to your much more expensive brand-new stuff isn't showing that PC games are especially well optimized ;)
 
  • Like
Reactions: Final8ty

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Your CPU costs $600 and the GPU costs $700+, or $1500+ for just two components of your build. Those alone is 5 times the cost of the xbox one s.

I paid $349.99 for my CPU, on sale from Microcenter (the best!).

You are ignoring the price of the parts in your post which is what he was trying to reference. XB1 / PS4 are running very outdated hardware which performs better than a lot of newer hardware. Comparing their much cheaper older hardware vs your much more expensive brand new stuff isn't showing that PC games are super optimized well ;)

The price is irrelevant, it's the experience that matters. If I wanted the standard experience, I would buy a console and put up with blurry graphics, framerate drops, poor performance, low or no customization, and all the other crap.. I play on PC because I cannot tolerate such things.. But if you want to consider price, then as I said later on in my post, even the GTX 970 can easily attain triple digit framerates at 1080p ultra quality in Doom. Also, the Xbox One and PS4 may only perform better if the PC version is badly programmed..

But that can happen to consoles as well. For instance, The Witcher 3 had plenty of performance issues on both PS4 and Xbox One (it took many patches to rectify), whilst on PC it was very well optimized from the outset. AC Unity also ran much better on PC, and the performance issues were never really fixed on consoles, if I recall correctly. Buying Dragon Age Inquisition on Xbox 360 and PS3 was akin to watching a slideshow, even though the IQ was scaled back tremendously. See, the consequences of technological advancement affect consoles as well:

 

psolord

Golden Member
Sep 16, 2009
1,939
1,195
136
Hello. I did custom gameplay benchmarks on my systems for anyone interested.

I benchmarked the campus escape mission. It's a 13-minute run which, as I understand it, is not the heaviest part of the game, but it seemed suitable for my benchmarking format. I only care about the performance delta of my systems anyway.

Speaking of which, it's around 60%, but I have to note here that the 7950 showed 90% GPU load. I was afraid that this was some peculiarity of MSI Afterburner, but even with it unloaded, GPU-Z also showed 90% GPU load. So I guess we are looking at plain old CPU limits.

The CPU usage graphs are suspiciously similar, which typically indicates thread switching.

Anyhoo, here are the benches if anyone is interested (spicy wallpapers warning :) ).

Quantum Break 1920x1080 Ultra GTX 970 @1.5Ghz Core i5 2500k @4.8GHz - 72fps

Quantum Break 1920x1080 Ultra 7950 @1.1Ghz CORE i7-860 @4GHz - 44fps
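Plugging those two averages into a delta calculation gives ~64%, consistent with the "around 60%" figure mentioned above:

```python
fps_970 = 72.0   # GTX 970 @ 1.5 GHz + Core i5 2500K @ 4.8 GHz
fps_7950 = 44.0  # HD 7950 @ 1.1 GHz + Core i7-860 @ 4 GHz

# Relative performance advantage of the faster system, in percent.
delta = (fps_970 / fps_7950 - 1) * 100
print(f"Performance delta: {delta:.0f}%")
# → Performance delta: 64%
```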

Even if this is not the heaviest part of the game, and granted, most of the benchmark takes place indoors, I expected worse from what I had heard. OK, these are overclocked systems, but still! My GPUs are barely 10% overclocked.

Other than that, seems like an interesting and fun game to play. Nice story too.

Regarding image quality, I didn't notice anything weird on either system. I tried to put together a side-by-side YouTube Doubler link, but it won't help much. The GTX 970 footage seems more washed out, but that's due to a peculiarity of my recording device (which I haven't figured out, since I use the exact same recording settings on both).

http://youtubedoubler.com/?video1=https://www.youtube.com/watch?v=tcdLFrELtSs&start1=195&video2=https://www.youtube.com/watch?v=UtpCVNT1Ga0&start2=202&authorName=quantum+break+970+vs+7950
 
  • Like
Reactions: AtenRa