AMD Ryzen (Summit Ridge) Benchmarks Thread (use new thread)

Page 172
Status
Not open for further replies.

DeletedMember377562

I agree with this. Low percentile frame rates are where higher core counts seem to shine though, possibly due to poorly controlled variables like background processes.

That's a myth, and it's untrue. Did you not bother to read the TH link I just gave you? They even have frame rate charts to look at...
 

crashtech

Lifer
Jan 4, 2013
10,521
2,111
146
That's a myth, and it's untrue. Did you not bother to read the TH link I just gave you? They even have frame rate charts to look at...
I've seen it myself plenty of times, don't just swoop in here and think everyone's going to see it your way on the basis of a few graphs.
 

DeletedMember377562

I've seen it myself plenty of times, don't just swoop in here and think everyone's going to see it your way on the basis of a few graphs.


Unlike you, I have actual test evidence to back up my claims. The TH numbers are right there for you to see. Even other sites with similar tests (1440p) will show that any difference in performance is as big in averages as in minimums (low percentile).
 

dfk7677

Member
Sep 6, 2007
64
21
81
Heard that story thousands of times. What you and many others don't seem to understand is that just because more threads are used, it doesn't mean the game is reliant on them. Even the developers themselves admit that only 4 threads are necessary, whereas the rest are "worker" threads that they can split tasks across. And you even see it in the performance gains. You mention Battlefield, for example. I don't know of any proper CPU tests of that game. What I do know, however, is that BF4 (running the same Frostbite engine) will give you exactly the same performance on 4 cores as on 6 and 8 cores, even at 1080p, with a GTX 1080. As proven here: http://www.tomshardware.com/reviews/multi-core-cpu-scaling-directx-11,4768-2.html

Ok, man, suit yourself. The IPC tests were done by me using Intel's PCM. All CPU tests were done with the lowest settings, to make sure the game is not bottlenecked by the GPU.
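(For anyone unfamiliar with what an "IPC test" actually measures: tools like Intel's PCM read hardware counters for instructions retired and core clock cycles, and IPC is simply the ratio of the two over a sampling window. Below is a minimal sketch of that arithmetic; the counter values are made-up illustration numbers, not dfk7677's measurements.)

```python
# Hypothetical counter readings over one sampling window; real values would
# come from a counter tool such as Intel PCM or Linux perf.
instructions_retired = 41_300_000_000   # instructions executed in the window
core_cycles          = 28_700_000_000   # unhalted core clock cycles in the window

# IPC is just instructions divided by cycles for the same window.
ipc = instructions_retired / core_cycles
print(f"IPC over this window: {ipc:.2f}")   # ~1.44 with the numbers above
```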

BF1 and BF4 use different versions of the Frostbite engine (that is why BF1 is heavier on the CPU and also makes great use of HT).

BF1's CPU load does not depend on resolution or graphics settings (less than 5% difference), only on the individual map (I am guessing it depends on the amount of destructible environment, as the 'Amiens' map is the heaviest one).

I am just giving you proper CPU tests for that game. I am not trying to convince anyone that more threads are (or will be) better for gaming. I just shared some of my own specific knowledge about this game and its CPU usage; it is everybody's choice whether to believe it or not.

I can attest that this game is the first one I have owned for which 4 threads are not enough.
 

DeletedMember377562

Ok, man, suit yourself. The IPC tests were done by me using Intel's PCM. All CPU tests were done with the lowest settings, to make sure the game is not bottlenecked by the GPU.

Which makes the tests completely irrelevant. You don't do benchmark tests for games if they don't give a correct description of reality. Simple as that. Games are GPU-bound, have been and will continue to be. It won't stop now that we are moving to resolutions higher than 1080p. Either play the game at the resolutions people play at, with those settings, or don't. Also, things like RAM latency and RAM frequency have been shown to matter a lot.

BF1 and BF4 use different versions of the Frostbite engine (that is why BF1 is heavier on the CPU and also makes great use of HT).

Battlefield 4: Frostbite 3

Battlefield 1: Frostbite 3

BF1's CPU load does not depend on resolution or graphics settings (less than 5% difference),

More lies from you, I see:
https://www.youtube.com/watch?v=BQUSq2N6218

CPU usage with GTX 1070 and 4790K at 1440p: 70%
CPU usage with GTX 1070 and 4790K at 4K: 40%

[QUOTE="dfk7677, post: 38694001, member: 221661"

I can attest that this game is the first one I have owned that 4 threads are not enough.[/QUOTE]

Wouldn't be the first time you lied or made up stuff (as this post has shown). You clearly don't know what you're talking about. Of course, we are using two completely different definitions too. I'm sure you are talking about trying a 6600K with a GTX 1080 at 1080p -- because that makes a <redacted>ton of sense, right?

Profanity and insulting others are not allowed here. Knock it off
Markfw
Anandtech Moderator
 
Last edited by a moderator:

dfk7677

Member
Sep 6, 2007
64
21
81
Which makes the tests completely irrelevant. You don't do benchmark tests for games if they don't give a correct description of reality. Simple as that. Games are GPU-bound, have been and will continue to be. It won't stop now that we are moving to resolutions higher than 1080p. Either play the game at the resolutions people play at, with those settings, or don't. Also, things like RAM latency and RAM frequency have been shown to matter a lot.

Battlefield 1 is not GPU bound:
Here are the minimum PC requirements for Battlefield 1.


OS: 64-bit Windows 7, Windows 8.1 and Windows 10
Processor (AMD): AMD FX-6350
Processor (Intel): Core i5 6600K
Memory: 8GB RAM
Graphics card (AMD): AMD Radeon HD 7850 2GB
Graphics card (NVIDIA): nVidia GeForce GTX 660 2GB
https://forums.battlefield.com/en-us/discussion/19282/battlefield-1-minimum-pc-requirements

If you believe that a game which needs the best and latest i5 (when BF1 launched) and a 4.5-year-old mid-range GPU as minimum requirements is GPU bound, you are just ignorant.

Battlefield 4: Frostbite 3

Battlefield 1: Frostbite 3

There is a Frostbite Team that handles and upgrades the engine continuously. Keeping the same major version number doesn't mean the engine is unchanged. But of course I am a liar and the FB engine is exactly the same as it was 3 years ago.

More lies from you, I see:
https://www.youtube.com/watch?v=BQUSq2N6218

CPU usage with GTX 1070 and 4790K at 1440p: 70%
CPU usage with GTX 1070 and 4790K at 4K: 40%



Wouldn't be the first time you lied or made up stuff (as this post has shown). You clearly don't know what you're talking about. Of course, we are using two completely different definitions too. I'm sure you are talking about trying a 6600K with a GTX 1080 at 1080p -- because that makes a shit ton of sense, right?

I don't really see where I made up stuff. If you want to see how a CPU performs in a game, you make certain that the GPU is not utilized at 100%. That is obviously not the case in the video shown: the game is GPU bound at both 1440p and 4K on ultra settings (what a surprise). The CPU usage in that particular video is useless, as it tells us nothing about how the CPU performs...
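(The methodology point is checkable: during a CPU test you want confirmation that the GPU never pegs at 100%, otherwise the frame rate is telling you about the GPU, not the CPU. A rough sketch of how one might log that on an NVIDIA card; it assumes nvidia-smi is on the PATH, and the 60-second window and ~100% threshold are arbitrary choices, not anyone's actual test procedure.)

```python
import subprocess
import time

# Poll GPU utilization once per second for 60 seconds while the benchmark runs.
# If the GPU sits near 100%, the run is GPU-bound and says little about the CPU.
samples = []
for _ in range(60):
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    samples.append(int(out.splitlines()[0]))  # first GPU only
    time.sleep(1)

print(f"max GPU load: {max(samples)}%  avg: {sum(samples) / len(samples):.0f}%")
if max(samples) >= 97:
    print("GPU hit ~100%: this run is GPU-limited, not a clean CPU test.")
```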

There are people who have 4K monitors and there are people who have 144Hz monitors (with the latter being more common, in my opinion). If you want all of the eye candy and 4K, yes, BF1 can be GPU bound. If you want more FPS, the game is certainly CPU bound.

I don't know if you are an Intel fan, which probably is the case, as you are very persistent in trying to convince everyone that IPC and clocks matter in all games. I am not a fan of any particular microprocessor company, I just want the best for consumers (and/or gamers like myself).
 
Last edited:

DeletedMember377562

Battlefield 1 is not GPU bound:

https://forums.battlefield.com/en-us/discussion/19282/battlefield-1-minimum-pc-requirements

If you believe that a game which needs the best and latest i5 (when BF1 launched) and a 4.5-year-old mid-range GPU as minimum requirements is GPU bound, you are just ignorant.


So, let me get this straight: your argument is based on minimum requirement lists (which almost never give a proper picture of reality)? Not actual tests and benchmarks (which point towards it not being the case), that show numbers, CPU load and all other data? Bwhahahahhahahahha.


There is a Frostbite Team that handles and upgrades the engine continuously. Keeping the same major version number doesn't mean the engine is unchanged. But of course I am a liar and the FB engine is exactly the same as it was 3 years ago.

It was you who made the claim that a different Frostbite engine was used. It was you who alluded to this "new Frostbite engine" somehow being more CPU-dependent than the one in BF4, despite BF4 using 8 threads and having the exact same amount of CPU usage as BF1.

But, please, do tell about this "Frostbite Team" that has made their new game soooooooo much more CPU bound. You clearly know a lot of inside info that we don't. I'm all ears!

I don't really see where I made up stuff. If you want to see how a CPU performs in a game, you make certain that the GPU is not utilized at 100%. That is obviously not the case in the video shown: the game is GPU bound at both 1440p and 4K on ultra settings (what a surprise). The CPU usage in that particular video is useless, as it tells us nothing about how the CPU performs...

It's actually more useful than the DigitalFoundry test, as it tests setups that normal people actually use. What's useless is testing with hardware and at settings that are abnormal and hardly used. Then the numbers produced are artificial. They don't give an actual representation of how people use their systems. If DF or you want to find out how CPUs bottleneck, then go out and try it with a GTX 1060 or RX 480. Testing cards like a GTX 1080 or an overclocked Titan XP at 1080p is stupid; hardly anybody plays like that. I get that the point is to stress the CPUs as much as possible; but why bother, when it gives no proper picture of reality? It's more than enough that they have picked out the most CPU-intensive games out there...

There are people who have 4K monitors and there are people who have 144Hz monitors (with the latter being more common, in my opinion). If you want all of the eye candy and 4K, yes, BF1 can be GPU bound. If you want more FPS, the game is certainly CPU bound.

Again wrong. BF1 is probably one of the most intensive FPS games (you know, the ones you actually get 144Hz monitors for). And for that game, you can reach your average 120 FPS target with your GTX 1080 even at 1440p. I can easily reach that same target with a GTX 1070 at 1080p. Then there's the factor of graphical settings: not everyone has everything on max at all times either -- especially not if achieving 144Hz is more important to you than graphics.

And ask the people in here who own GTX 1080s what kind of monitors they own. I have this odd feeling that the overwhelming majority of them have either 1440p displays or higher.....

I don't know if you are an Intel fan, which probably is the case, as you are very persistent in trying to convince everyone that IPC and clocks matter in all games. I am not a fan of any particular microprocessor company, I just want the best for consumers (and/or gamers like myself).

After being called out for knowing nothing about the topic you speak of, you resort to calling me a fanboy of a specific company (despite me never even comparing Intel to AMD once, but rather Intel to Intel).
I never said all games. One thing is that the games usually tested by sites amount to maybe the 5-10% of all games that are the most CPU bound. Another is that even in those games, as shown by TH's test and many other CPU benchmarks that include 1440p settings for said powerful GPUs, it has been proven that 4 threads are enough.

And what the hell does this have to do with Intel? Aren't AMD releasing 4-core Ryzen chips too?
 
Last edited by a moderator:

DeletedMember377562

This is what we heard years ago when Bulldozer launched.

Don't listen to that guy. He clearly knows nothing about what he's talking about, as I just demonstrated. He just used "PC minimum requirements" lists as an argument for why a game is more CPU bound, against me linking to benchmarks done by recognized sites like Tomshardware, or to videos people have uploaded of themselves playing said games.

The "more games will be multithreaded" line has been repeated over and over again for a decade now. What people like dkte don't seem to understand is that parallelization in games is difficult, and game developers -- who can't even release games without them being severly broken (which is why AMD and Nvidia always have to clean up after them, when they release big drivers at the launch of AAA titles) -- can't do that job easily, and react lazily to this. Nor is Intel making processors with 4 cores just for fun: they do it based on the requests and the needs of actual developers of applications that those chips are being used for. Furthermore, if you'd ever need more than 4 threads, which I'm not saying you don't (though in those small cases, the difference in performance does not justify the price imo), you still have a 4 core i7 to use. So wheras a 4c/4t might give you up to 100% CPU usage in BF1 at even 1440p, you can use the i7 4c/8t and get around 60-70% usage (but still only gain maybe an average of 5% more FPS).
 
  • Like
Reactions: lopri

dfk7677

Member
Sep 6, 2007
64
21
81
@generalako I have made my point, you have made yours. I will not answer to you anymore, because I believe you are just trying to win an argument, not have a conversation. People that own BF1 and an i5 will have understood what I mean by now.
 

Crumpet

Senior member
Jan 15, 2017
745
539
96
@generalako I have made my point, you have made yours. I will not answer to you anymore, because I believe you are just trying to win an argument, not have a conversation. People that own BF1 and an i5 will have understood what I mean by now.

Admittedly I only played during the BF1 beta.

But my i5 6600 wasn't a happy bunny about it. It certainly wasn't buttery smooth; I had some serious FPS drops and decided 1) not to purchase BF1, and 2) that I'm never buying a non-hyperthreaded 4-core processor for gaming ever again.
 

DeletedMember377562

Again, let me make this very clear: we are discussing gaming (and BF1) at 1440p (which is only relevant with GPUs like the GTX 1080 or higher). Naturally, 1080p will stress your CPU more, and you'll notice it with a GTX 1070, for example. Not to mention a processor like the i5 6600 (which has too low a clock speed to even be taken seriously).

But if you want to go for 4c/8t, by all means, do so. I actually recommend you do so. But know this: playing at higher resolutions will drop your CPU usage significantly. There's hardly any necessity for anything more than 4 threads at 1440p. Maybe in very, very few instances. But as TH's test showed, even in the majority of CPU-intensive games, the 6600K will come out as the winner.
 
Jan 15, 2017
39
54
61
This is just ridiculous. Games today use multiple threads a lot more than a decade ago, there is no denying that. It's a long process, and it takes time to get more and more threads utilized. For ages four cores has been the maximum one got in a gaming system, so there really was no point in trying to optimize for more (heavy) threads. When a game is designed, the low end is also considered, and that results in some trade-off between what the game could do and what is feasible. This same argument about "more threads will be utilized" has been going on since the first dual cores; now two cores is basically the minimum anyone has, and those two are well utilized. When we get to the point where four cores is the minimum anyone has, those will be very well utilized, and at that point 16 cores will give some advantage over 8 cores. It's just a long process that is driven by the hardware gamers have, not by what can be done.
 
  • Like
Reactions: lopri

DeletedMember377562

This is just ridiculous. Games today use multiple threads a lot more than a decade ago, there is no denying that. It's a long process, and it takes time to get more and more threads utilized. For ages four cores has been the maximum one got in a gaming system, so there really was no point in trying to optimize for more (heavy) threads. When a game is designed, the low end is also considered, and that results in some trade-off between what the game could do and what is feasible. This same argument about "more threads will be utilized" has been going on since the first dual cores; now two cores is basically the minimum anyone has, and those two are well utilized. When we get to the point where four cores is the minimum anyone has, those will be very well utilized, and at that point 16 cores will give some advantage over 8 cores. It's just a long process that is driven by the hardware gamers have, not by what can be done.

This is actually wrong. The hardware that gamers have is based on the hardware that is made available to them. And Intel aren't pushing 4 cores to desktop consumers just randomly. Chips with that number of cores go through careful and detailed testing across lots of cases, and through discussions and requests from the application (and game) developers themselves. Don't make it sound like those chips just fell down from heaven and somehow the gamers themselves ended up making the rules for how many threads to use by what they purchased. It's the opposite.

But let's say you are right. Gaming consoles combined account for more sales than desktop PCs, not to mention they also sell the same products at much higher prices. So clearly they are the bigger priority. Why is it, then, that after all these years with 8-core CPUs in those consoles, the overwhelming majority of games still don't use more than 4 cores effectively (effectively as in "need to have it")? The only time the CPU becomes a bottleneck is when you use a high-end GPU (but then gaming at 1080p doesn't make any sense at that point). That's why DF has to test the most CPU-bound games at 1080p with an overclocked Titan XP, because if they didn't, the difference would hardly be there. As proven by their Skylake review, where the 6600K performed only a few FPS below the 6700K in all the same tested games with a GTX 980 at 1080p.

And that's what matters: effective use of more than 4 cores. Just because games use more threads, it doesn't mean they are dependent on them. It also doesn't change the fact that those same games will still be heavily single-core bound. That's why the 6700K beat the 5820K and 5960X in DF's Haswell-E vs. Skylake test at the same clock speeds at 1080p. It won by the same amount Skylake was ahead in terms of IPC.

And it's the same reason a 6c/12t 3930K won't fare much better in, say, BF1 than the 6700K. In the end, single-core performance is the most important aspect. A current KL 4-core can do what, 5 GHz? Whereas BW-E can do 4.3 GHz. So you have about 3% better IPC, as well as roughly 16% higher frequency (which of course won't amount to 16% improved gaming performance, but maybe 5%).
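(The rough arithmetic behind that claim, for the record; the clock speeds and the ~3% IPC figure are the post's own assumptions, not measured data.)

```python
kaby_lake_clock = 5.0    # GHz, assumed OC for a Kaby Lake quad (per the post)
broadwell_e_clock = 4.3  # GHz, assumed OC for Broadwell-E (per the post)
ipc_advantage = 1.03     # ~3% IPC edge claimed for the newer core

clock_ratio = kaby_lake_clock / broadwell_e_clock
print(f"clock advantage: {clock_ratio - 1:.0%}")                       # ~16%
print(f"theoretical single-thread edge: {ipc_advantage * clock_ratio - 1:.0%}")
# ~20% in raw single-thread throughput; actual in-game gains are smaller
# because frames are never purely single-thread / CPU limited.
```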

Just to sum it up, I'll copy and paste the interview TH had with one of the Project CARS developers:

Tom’s Hardware: Given the choice between a quad-core Skylake CPU operating at higher clocks and slightly better IPC or 10-core Broadwell-E thermally limited to lower frequencies, which do you choose for gaming and why? Are we destined to always be graphics-bound at the resolutions enthusiasts play at? Even in our 2560x1440 graphs, with all processors normalized at 3.9 GHz, quad-core Skylake is slightly faster than six-core Broadwell-E.

Ged: All tests we've run (and seen) show Skylake to be better than Broadwell-E for gaming, including our games. We don't see that changing any time soon. Most games have relatively limited CPU requirements compared to GPU. We're probably one of the more intensive on CPU (physics mainly, of course, but also AI), but still not enough to seriously stress a well-clocked quad-core Skylake-based i5/i7.
 
Last edited by a moderator:

Crumpet

Senior member
Jan 15, 2017
745
539
96
You should never quote anything related to Project CARS. They lie through their teeth and the game is a buggy, un-optimised mess. It's also almost entirely GPU bound...
 

DeletedMember377562

You should never quote anything related to Project CARS. They lie through their teeth and the game is a buggy, un-optimised mess. It's also almost entirely GPU bound...

<redacted>...I linked the TH test. They interview developers from several games. Read what any one of them says.

Profanity is not allowed in the technical forums
Markfw
Anandtech Moderator
 
Last edited by a moderator:

Crumpet

Senior member
Jan 15, 2017
745
539
96
Again, let me make this very clear: we are discussing gaming (and BF1) at 1440p (which is only relevant with GPUs like the GTX 1080 or higher). Naturally, 1080p will stress your CPU more, and you'll notice it with a GTX 1070, for example. Not to mention a processor like the i5 6600 (which has too low a clock speed to even be taken seriously).

But if you want to go for 4c/8t, by all means, do so. I actually recommend you do so. But know this: playing at higher resolutions will drop your CPU usage significantly. There's hardly any necessity for anything more than 4 threads at 1440p. Maybe in very, very few instances. But as TH's test showed, even in the majority of CPU-intensive games, the 6600K will come out as the winner.

I missed this one.

I have two monitor setups that I switch between depending on the game/design program: a 32" 1440p and a triple 24" 1080p setup running 5760x1080.

And I cannot be convinced that CPU usage makes no difference at higher resolutions. It does.

And if what you said were true, then live streaming at 5760x1080 or 1440p would be significantly easier than at a single 1080p res. Which, no, it REALLY isn't.
 

lolfail9001

Golden Member
Sep 9, 2016
1,056
353
96
So, let me get this straight: your argument is based on minimum requirement lists (which almost never give a proper picture of reality)? Not actual tests and benchmarks (which point towards it not being the case), that show numbers, CPU load and all other data? Bwhahahahhahahahha.
Every single number on the planet demonstrates that unless you force it to be GPU bound, BF1 is CPU bound and very thread-hungry on top of that. If you are in denial about that, I can only direct you to play exclusively at 8K resolution and never visit CPU forums again.
 
  • Like
Reactions: dfk7677 and Crumpet

Sven_eng

Member
Nov 1, 2016
110
57
61
Again, let me make this very clear: we are discussing gaming (and BF1) at 1440p (which is only relevant with GPUs like the GTX 1080 or higher). Naturally, 1080p will stress your CPU more, and you'll notice it with a GTX 1070, for example. Not to mention a processor like the i5 6600 (which has too low a clock speed to even be taken seriously).

But if you want to go for 4c/8t, by all means, do so. I actually recommend you do so. But know this: playing at higher resolutions will drop your CPU usage significantly. There's hardly any necessity for anything more than 4 threads at 1440p. Maybe in very, very few instances. But as TH's test showed, even in the majority of CPU-intensive games, the 6600K will come out as the winner.

CPU usage drops at higher resolutions because the GPU becomes the limiting factor. What do you think happens when you add a faster GPU or two?
 
  • Like
Reactions: dfk7677

CentroX

Senior member
Apr 3, 2016
351
152
116
Which makes the tests completely irrelevant. You don't do benchmark tests for games if they don't give a correct description of reality. Simple as that. Games are GPU-bound, have been and will continue to be. It won't stop now that we are moving to resolutions higher than 1080p. Either play the game at the resolutions people play at, with those settings, or don't. Also, things like RAM latency and RAM frequency have been shown to matter a lot.



Battlefield 4: Frostbite 3

Battlefield 1: Frostbite 3



More lies from you, I see:
https://www.youtube.com/watch?v=BQUSq2N6218

CPU usage with GTX 1070 and 4790K at 1440p: 70%
CPU usage with GTX 1070 and 4790K at 4K: 40%

[QUOTE="dfk7677, post: 38694001, member: 221661"

I can attest that this game is the first one I have owned that 4 threads are not enough.

Wouldn't be the first time you lied or made up stuff (as this post as shown). You clearly don't know what you're talking about. Of course, we are using two complete definitions too. I'm sure you are talking about trying a 6600K with a GTX 1080 at 1080p -- because that makes a shit ton of sense, right?[/QUOTE]

Lol, Frostbite stopped with the version-number thing and is updated on a regular basis. Frostbite back in 2013 is not nearly the same as Frostbite in 2017.
 

DeletedMember377562

I missed this one.

I have two monitor setups that I switch between depending on the game/design program: a 32" 1440p and a triple 24" 1080p setup running 5760x1080.

And I cannot be convinced that CPU usage makes no difference at higher resolutions. It does.

And if what you said were true, then live streaming at 5760x1080 or 1440p would be significantly easier than at a single 1080p res. Which, no, it REALLY isn't.


I just showed you a case in which a guy played with a GTX 1070 at 1440p and then at 4K, with different CPU usage percentages. Or the TH test, where the 6600K, despite falling behind in several games at 1080p, in general performs the best or the same at 1440p. How hard is that for you to see?

Streaming at those resolutions is a completely different matter. That's a strictly CPU-bound issue.

Every single number on the planet demonstrates that unless you force it to be GPU bound, BF1 is CPU bound and very thread-hungry on top of that. If you are in denial about that, I can only direct you to play exclusively at 8K resolution and never visit CPU forums again.


Hahaha, what a horrible argument. It's not me, but rather you and others that force games to be CPU bound. When you play games with GTX 1080s and Titan XPs at 1080p, that's what's called FORCING a game to be something (in this case, CPU bound). When playing those said games with those said GPUs at 1440p, there's no forcing going on. That's the normal case. Of course, you are welcome to make the opposite claim by linking to tests with BF1 in 1440p. It feels strange how I, with my referenced tests and videos, have to argue against the bare words of forum users, as if both carry equal weight...
 

DeletedMember377562

CPU usage drops at higher resolutions because the GPU becomes the limiting factor. What do you think happens when you add a faster GPU or two?

Well, of course it's a limiting factor. And that's my whole point. People play at those settings.

I don't know. But I do know, however, that there are examples of people using 1080 SLI with a 6700K in BF1, but at 4K -- which, let's be fair, is the most probable example to find. You don't use 1080s in SLI for gaming at lower resolutions, exactly. The 1080 alone can already achieve 110-120 FPS in BF1 at 1440p.

This is what happens: https://www.youtube.com/watch?v=MTaCpmvhT-I&t=86s

CPU usage is still around 60% for the OCed 6700K with GTX 1080s in SLI. So it is still not stressed to its limit.
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
Has AMD updated the exact release date of Ryzen? Is there a computer show or similar event where it is likely to be "officially" released?
 

lolfail9001

Golden Member
Sep 9, 2016
1,056
353
96
It's not me, but rather you and others that force games to be CPU bound.
Yes, by pushing frame times as low as we can at reasonable image quality. How bad of us to pursue the smoothest gameplay possible, isn't it.
When playing those said games with those said GPUs at 1440p, there's no forcing going on.
Wrong, you just force more of the load onto the GPU instead of the CPU, forcing GPU limitations, nothing else.
That's the normal case.
There is no "normal" case with enthusiast GPUs, man. Some people pursue 144hz with them, others play at 4k on 30 fps.
Of course, you are welcome to make the opposite claim by linking to tests with BF1 in 1440p.
Did anyone test BF1 at 1440p with SLI'd Titan XPs? Ask them.
 