Official AMD Ryzen Benchmarks, Reviews, Prices, and Discussion


unseenmorbidity

Golden Member
Nov 27, 2016
1,395
967
96
As usual, you are cherry-picking outliers and making sweeping generalizations based on them. Even the host says it is an outlier...

https://youtu.be/LjFu-onLA68?t=4m21s

A CPU pushed to its limits does benefit a lot more from faster memory than it otherwise would, but that isn't a common situation in gaming.

Of course that would apply to a Ryzen 4-core at its limits too.
 
Last edited:

cytg111

Lifer
Mar 17, 2008
23,047
12,715
136
Anyways, I caught wind of ever more support for my hypothesis from the PCPer podcast. They mentioned that a 3.5GHz Ryzen topped out at pushing ~110k IOPS on a single thread (at high QD, but I digress), compared to the ~200k IOPS Intel does. I noticed similar behavior in other I/O tests in other reviews, but once again, I await looncraz's test.
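For context, numbers like those typically come from a single-worker, high-queue-depth 4K random-read test. A sketch of such a run in Python, shelling out to fio; the target file, size, and runtime here are placeholder assumptions, not PCPer's actual setup:

Code:
import subprocess

# Hypothetical fio invocation approximating a single-worker, high-QD
# 4K random-read IOPS test: one job, queue depth 32, direct I/O.
# Target file, size, and runtime are placeholders.
subprocess.run([
    "fio", "--name=st-randread", "--filename=testfile", "--size=4G",
    "--ioengine=libaio", "--direct=1", "--rw=randread", "--bs=4k",
    "--iodepth=32", "--numjobs=1", "--runtime=30", "--time_based",
], check=True)

With --numjobs=1 the result is effectively bound by one core's submission rate, which is why a single-thread deficit would show up in a test shaped like this.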

So drivers, firmware, and updates could possibly change all that tomorrow. A lackluster PCIe implementation could also explain some of the not-so-great gaming results. I suggest we put it on the list of things to mature instead of going on a crusade either way.
A bottleneck identified. Is it fixable? I would think so. The question is when.
 
  • Like
Reactions: Gikaseixas

Agent-47

Senior member
Jan 17, 2017
290
249
76
I only mentioned MP because they are the most obvious. Even single player games get the same treatment in reviews. They test in areas with great CPU performance, so their GPU reviews show the differences best. Unfortunately, when it comes to reviewing CPUs, the FPS are really high compared to other parts of the game.
Probably something to do with the fact that most games are GPU-limited at high resolution and the fact that game benchmarks are usually appended to GPU reviews. Also it is next to impossible to have a decent multiplayer benchmark. But still I see some games giving out CPU benchmarks/fps as well, i.e. how many fps it can draw without GPU bottlenecks.

Regarding SP vs MP, the chance of a CPU bottleneck is far higher in MP games; this I can tell you from experience, and I think you will agree.

While it is not the setup you'd use, that doesn't mean it's useless information. It shows which CPUs perform best in that game, which helps you know which CPU will do best in those areas that drop below the FPS you desire. For example, I want 80 FPS minimum when I game. Using a low-resolution CPU test lets me know which CPU is faster, which should give me an idea of which CPU will help me in those untested areas of the same game that do not normally reach 80 FPS due to a CPU bottleneck.
Yes, for your nausea, I remember. And I agree. But in the end SP will pull that as minimum fps without a CPU bottleneck, unless an FX/i3 CPU is involved.

I know you are not using this as a measure of how long the CPU will last, as core counts become a bigger concern in that case, even in games from 2016.

I'm not entirely sure what point you are making here. If it's that you'll still be GPU bound, despite a CPU bottleneck, then we are clearly different gamers. If I ran into a GPU bottleneck, preventing me from hitting 80 FPS, I'd turn down the settings until I do, or until I'm CPU bottlenecked.
How can you be GPU bottlenecked despite a CPU bottleneck? It's either one or the other. If both are bottlenecks, then the system is in perfect balance. Personally, I would prefer a GPU bottleneck.

What I tried to say is that BF1/4/3 MP do not run into a CPU bottleneck that forces the fps to dip below 80 fps (60 fps for BF1) at 1080p/2K/4K (so obviously not at lower res either). Those videos had both GPUs pushing 100%. Neither of the systems in the two videos was CPU bottlenecked, as both CPUs were at 50-70% on any given core.

Yes, we are obviously different gamers. Personally, I game on a 1080p@60Hz TV, so I could not care less as long as I get a 60 fps minimum. I even keep Vsync on when I am playing SP. An FX six-core can pull 60 fps on BF4/3 at full HD/max details. In other games, which I only play in single player, I have always been able to increase performance by lowering details. Granted, I do not own a Titan XP.

However, BF1 shows dips to 40 fps at max details 1080p. This is due to a CPU bottleneck in my case (100% CPU usage with 60% GPU usage on an RX 470 during intense battles on 64-player servers). So I am awaiting the R5 6-core CPUs.

<This is my story, and I am not saying that FX is good (which it definitely is not) or that the 470 is good enough for 1080p, or anything of that sort>
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Probably something to do with the fact that most games are GPU-limited at high resolution and the fact that game benchmarks are usually appended to GPU reviews. Also it is next to impossible to have a decent multiplayer benchmark. But still I see some games giving out CPU benchmarks/fps as well, i.e. how many fps it can draw without GPU bottlenecks.

Regarding SP vs MP, the chance of a CPU bottleneck is far higher in MP games; this I can tell you from experience, and I think you will agree.


Yes, for your nausea, I remember. And I agree. But in the end SP will pull that as minimum fps without a CPU bottleneck, unless an FX/i3 CPU is involved.

I know you are not using this as a measure of how long the CPU will last, as core counts become a bigger concern in that case, even in games from 2016.


How can you be GPU bottlenecked despite a CPU bottleneck? It's either one or the other. If both are bottlenecks, then the system is in perfect balance. Personally, I would prefer a GPU bottleneck.

What I tried to say is that BF1/4/3 MP do not run into a CPU bottleneck that forces the fps to dip below 80 fps (60 fps for BF1) at 1080p/2K/4K (so obviously not at lower res either). Those videos had both GPUs pushing 100%.

Yes, we are obviously different gamers. Personally, I game on a 1080p@60Hz TV, so I could not care less as long as I get a 60 fps minimum, which an FX six-core can pull on BF4/3 at full HD/max details. In other games, which I only play in single player, I have always been able to increase performance by lowering details. Granted, I do not own a Titan XP.

However, BF1 shows dips to 40 fps at max details 1080p. This is due to a CPU bottleneck in my case (100% CPU usage with 60% GPU usage on an RX 470 during intense battles on 64-player servers). So I am awaiting the R5 6-core CPUs.

<This is my story, and I am not saying that FX is good (which it definitely is not) or that the 470 is good enough for 1080p, or anything of that sort>

I think we are having a slight disconnect here, and I think it revolves around this line: "But in the end SP will pull that as minimum fps without a CPU bottleneck, unless an FX/i3 CPU is involved."

From my experience, that simply is not true. Just because reviewers don't showcase the CPU bottlenecks in games does not mean they do not exist, because they do. I run into them all the time, despite having an i7-5820K @ 4.4GHz. I see many other people complain about their i7-6700Ks also running into them, though they usually have a hard time believing that is why their FPS drops below 60 no matter what settings they use.

I didn't say that you can be GPU bottlenecked and CPU bottlenecked at the same time. What I said is, if you are GPU bottlenecked and you need more FPS, you lower your settings until you hit those FPS. Showing GPU-bound videos just shows the areas that are not CPU bottlenecked. That does not change the fact that there are many games, and areas of many games, which are.
 

Agent-47

Senior member
Jan 17, 2017
290
249
76
From my experience, that simply is not true. Just because reviewers don't showcase the CPU bottlenecks in games does not mean they do not exist, because they do. I run into them all the time, despite having an i7-5820K @ 4.4GHz. I see many other people complain about their i7-6700Ks also running into them, though they usually have a hard time believing that is why their FPS drops below 60 no matter what settings they use.
In single player at 1080p or higher with max/medium settings, they cannot hit a 60 fps minimum due to a CPU bottleneck? (I am discounting 120/144Hz monitors as a niche.) May I ask what games? I am just curious.
I didn't say that you can be GPU bottlenecked and CPU bottlenecked at the same time. What I said is, if you are GPU bottlenecked and you need more FPS, you lower your settings until you hit those FPS. Showing GPU-bound videos just shows the areas that are not CPU bottlenecked. That does not change the fact that there are many games, and areas of many games, which are.

Agreed, a GPU bottleneck is better than a CPU bottleneck. But those videos were very intense matches, although I did not note the number of players in them.
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
In single player at 1080p or higher with max/medium settings, they cannot hit a 60 fps minimum due to a CPU bottleneck? (I am discounting 120/144Hz monitors as a niche.) May I ask what games? I am just curious.

The most commonly complained-about games that come to mind are GTA V (always specific locations with a high-speed chase) and the Arma games. WoW comes up a lot too. You also get RTS and even Civ games which brutalize CPUs. There are spots in quite a few games which flirt with 60 FPS, such as DA:I, Metro 2033 (in the library in particular), Tomb Raider (a few locations, mostly overlooking mountains), Grim Dawn (certain fights), etc. Remembering them all is difficult. Given my sensitivity, I'm quick to spot low FPS and test out those areas. Then when it comes to 3D Vision and VR, you often run into much bigger issues, due to rendering twice the frames for any given FPS.

And yes, a lot of people want 120/144Hz/FPS and throw all kinds of money at it, with Titan Xs, and are befuddled as to why they can't get to 120 FPS no matter what settings are used. You might consider them niche/fringe, but so is anyone who buys a high-end CPU.

Basically what I've been saying all along is that low-resolution CPU benchmarks serve a purpose for gamers right now. It's not simply a matter of future-proofing.
 

Agent-47

Senior member
Jan 17, 2017
290
249
76
Basically what I've been saying all along is that low-resolution CPU benchmarks serve a purpose for gamers right now. It's not simply a matter of future-proofing.
This is where we disagree.

Let's say a CPU bottleneck appears at 40 fps at 720p. What is the CPU-bottlenecked fps at 4K for that exact scene? Approximately the same 40 fps.

Low-res benchmarks will show the same minimum (assuming the minimum is due to a CPU bottleneck) but a larger average FPS gap, say 300 fps vs 350 fps. At those rates the average fps difference is meaningless, and hence the results are misleading.

Any CPU bottleneck will manifest itself equally at 480p/720p/1080p/1440p/4K in the form of a lower minimum FPS, and that minimum fps is all that matters. If you game at 80 fps and the minimum FPS is 70 FPS @ 4K, it will be approx. 70 fps at a lower res too.
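A toy model of that claim (all numbers invented for illustration): the fps you observe is roughly the slower of what the CPU can prepare and what the GPU can render, and only the GPU side scales with resolution.

Code:
# Toy model: observed fps is capped by the slower of CPU and GPU.
# All numbers are illustrative, not measured.
CPU_FPS = 40  # what the CPU can feed the GPU, independent of resolution

# Hypothetical GPU throughput for the same scene at each resolution
GPU_FPS = {"720p": 350, "1080p": 200, "1440p": 110, "4K": 55}

for res, gpu_fps in GPU_FPS.items():
    observed = min(CPU_FPS, gpu_fps)
    limiter = "CPU" if CPU_FPS <= gpu_fps else "GPU"
    print(f"{res}: {observed} fps ({limiter}-bound)")
# The CPU-bound minimum (40 fps) is identical at every resolution;
# only the headroom above it changes.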

You can verify this from the 1080p and 1440p results for Civ V:
1080_Civ.png

1440_Civ.png

http://www.techspot.com/review/1348-amd-ryzen-gaming-performance/page4.html
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
This is where we disagree.

Let's say a CPU bottleneck appears at 40 fps at 720p. What is the CPU-bottlenecked fps at 4K for that exact scene? Approximately the same 40 fps.

Low-res benchmarks will show the same minimum (assuming the minimum is due to a CPU bottleneck) but a larger average FPS gap, say 300 fps vs 350 fps. At those rates the average fps difference is meaningless, and hence the results are misleading.

Any CPU bottleneck will manifest itself equally at 480p/720p/1080p/1440p/4K in the form of a lower minimum FPS, and that minimum fps is all that matters. If you game at 80 fps and the minimum FPS is 70 FPS @ 4K, it will be approx. 70 fps at a lower res too.
I don't disagree that the low minimum FPS from CPU bottlenecks remains the same at all resolutions. That's quite true. But the problem is, review sites do not go out of their way to benchmark the CPU in those CPU-bound areas.

But I know they exist in these games, and the CPU which performs better at low resolution in a given game almost always (or is it always?) performs better in those spots. The low-resolution test serves a purpose: it lets me know which CPU will help my CPU bottlenecks the most in a particular game, without the reviewer hunting down all the CPU-bound locations.

It's certainly better to do a low-res CPU benchmark to fabricate a bottleneck than to run a GPU-bound CPU benchmark because it's too much work to find CPU-bound locations in every game.
 

Agent-47

Senior member
Jan 17, 2017
290
249
76
1. But the problem is, review sites do not go out of their way to benchmark the CPU in those CPU-bound areas.
2. It lets me know which CPU will help my CPU bottlenecks the most in a particular game, without the reviewer hunting down all the CPU-bound locations.

1. How would benchmarking at low res make up for potential CPU bottlenecks in CPU-bound areas which are not tested either way? It's benchmarking approximately the same scene, at lower details with the same AI. So essentially you are just testing the draw-call limits, which are themselves less stressful due to the lower details.

2. Using your logic, the 7700K is a CPU bottleneck with a 1080 Ti in Doom @ 1440p compared to the 1800X, hence it must be equally bad at high res as well. But it has approximately the same min fps as the 1800X at 4K. Note that it's not a GPU bottleneck, as the same GPU can deliver 15% higher min fps. Point being: it's misleading. You should check the scores for the res you will play at, not some unrealistic res.
Ryzen-Doom-1440p-Ultra-Preset-Vulkan.png

Ryzen-Doom-2160p-Ultra-Preset-Vulkan.png
 
Last edited:

R0H1T

Platinum Member
Jan 12, 2013
2,582
162
106
1. How would benchmarking at low res make up for potential CPU bottlenecks in CPU-bound areas which are not tested either way?

2. Using your logic, the 7700K is a CPU bottleneck with a 1080 Ti in Doom @ 1440p compared to the 1800X, hence it must be equally bad at high res as well. But it has approx 10% better min fps compared to the 1800X. Note that it's not a GPU bottleneck, as the same GPU can deliver 15% higher min fps. Point being: it's misleading. You should check the scores for the res you will play at, not some unrealistic res.
Ryzen-Doom-1440p-Ultra-Preset-Vulkan.png

Ryzen-Doom-2160p-Ultra-Preset-Vulkan.png
Sorry, I must've missed the resolution you're talking about; in the charts you posted, the 1800X has better min & avg FPS at both 1440p & 2160p.
 
Last edited:

Agent-47

Senior member
Jan 17, 2017
290
249
76
^^ I thought we established that a CPU bottleneck may occur over small intervals, leading to larger dips
 
Last edited:

R0H1T

Platinum Member
Jan 12, 2013
2,582
162
106
^^ I thought we established that a CPU bottleneck may occur over only small intervals, leading to larger dips
I see that you've edited the earlier post; anyway, I still don't get your point, considering the following conclusion from the review site ~
When we started this testing, we had no idea what the final results would be. Our best guesses were that Ryzen was going to sit between the 7700K and 5820K in most tests or suffer worse performance at resolutions above 1080P due to the DDR4 memory latency issues we saw when testing the CPU and motherboard last week. Of course, in real-world testing like this, it didn’t seem to make much of a difference, at least regarding what you would notice as a consumer. In almost all tests (things didn’t go perfectly in Far Cry Primal) the Ryzen 1800X gave the best frame rates at all resolutions, and even more so when pushed to 1440P and 2160P, where the 8-core 16-thread design of the CPU was able to relieve the GTX 1080 Ti of any bottlenecks in performance.

The most exciting thing we saw so far wasn’t the average frame rates, but the minimum. People are often quick to leap to the maximum frame rate, and I’ll give you a bit of a tip, high frame rates are great, but they’re not all that. You can run some games as low as 20FPS and it’ll play great, just look at classics like Zelda 64 and Goldeneye, they ran at 20FPS and they felt pretty smooth. The trick is they didn’t drop frames. If you’re gaming at 100+ FPS and your frame rate drops below 60FPS for a moment, you’re going to notice; the same is true from 60-40, and so on. The Ryzen 1800X helped maintain the highest minimum frame rates we’ve ever seen, and that means a more consistent, smoother and overall better gameplay experience. When it comes down to it, this higher minimum number is what you want from a gaming chip, not just the bigger average or maximum number.

Our testing may seem strange to some, particularly our choice of CPUs to pit against Ryzen. I chose the 7700K as it’s currently a very popular choice for those building a high-end gaming PC, and it’s an excellent choice too, it’s a powerful chip, features the latest Kaby Lake architecture and works very well. With Ryzen launching, the price of the Intel chip is now £320-340, although it’s not a huge leap up from Skylake, it’s still great for gaming.

The Ryzen 1800X is more expensive at just under £500, but when you see 20-50% improvements in minimum frame rate, and gains regarding average frame rate, that certainly makes sense. As I’m writing this, the Ryzen 1700 is arriving at eTeknix HQ, and the 1700X is on its way, and we’re expecting the 1700X to be more in line with the 7700K performance and price, so that’s food for thought. Of course, this certainly won’t be the last time we test out the gaming capabilities of the new AMD or Intel CPUs for that matter.

The BIOS and drivers in general for Ryzen are still pretty fresh, there's most certainly room for improvement there, and memory performance bugs are hopefully going to be worked out soon. On top of that, AMD is working with thousands of developers to improve overall market support of multi-core processors, and that's sure to bring benefits to all PC gamers, not just AMD users. Either way, it looks like the Ryzen 1800X is offering some serious performance for high-end PC gaming, and we're eager to see both Intel and AMD battle it out over the next couple of years. There are more Intel and AMD chips on the horizon, and the current ones are already pretty exciting, so we're eager to see more battles regarding performance and retail prices, as that's going to be a big win-win for consumers.
Though it could be argued that an OCed 7700K would probably have better min frame rates than the 1800X, going forward the 1800X will be more future-proof.
 

Agent-47

Senior member
Jan 17, 2017
290
249
76
I see that you've edited the earlier post; anyway, I still don't get your point, considering the following conclusion from the review site ~

Ah, yes. I thought you saw that on the plots :)

My point is that looking at lower res does not tell you the right answer; rather, the min fps at the resolution you want to play at does.

This discussion is largely independent of 7700K vs 1800X in terms of which is better; I just used that as an example. But yes, an OCed 7700K will be better, given the 1800X has lower IPC and clocks while being priced considerably higher. It's as much a gaming CPU as a 6900K.
 

looncraz

Senior member
Sep 12, 2011
722
1,651
136
This is what a CPU bottleneck looks like:

CPU_Bottleneck_BF4.jpg


That is at 1080p Ultra, Battlefield 4 multiplayer (Golmud Railway), a 20-minute session, Radeon R9 Fury (1050MHz), with an i7-2600K @ 4.5GHz.

When I first started playing I stayed mostly in a helicopter - you can see the high consistency. I then played on the ground for a good eight minutes or so (all those dips are from large, close explosions and heavy action), then jumped in a helicopter again, then a tank, then back on the ground, then in a chopper again (it didn't last long). I quit after being shot down.

I'll be attempting to play a similar round with this same card in a Phenom II system to see how much worse it might be. Tomorrow, it will be with a Ryzen 1700X.

I should also note gameplay was smooth and perfect the entire time - I was surprised to see that these dips existed. I also have frametimes that align with this, but graphing them is a bit trickier, as I need to time-align them (which I can do from file creation and modification times).
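For anyone wanting to reproduce that alignment step, a minimal sketch of the idea he describes, assuming a simple log format of one frame time in milliseconds per line (adjust the parser for whatever your capture tool actually writes):

Code:
import os

def load_frametimes_ms(path):
    # Assumed format: one frame time in milliseconds per line.
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def timestamp_frames(path):
    # The log file's mtime marks roughly when the session ended,
    # so walk backwards from it to anchor every frame to wall-clock time.
    frametimes = load_frametimes_ms(path)
    end = os.path.getmtime(path)            # seconds since the epoch
    t = end - sum(frametimes) / 1000.0      # approximate session start
    stamped = []
    for ft in frametimes:
        t += ft / 1000.0
        stamped.append((t, 1000.0 / ft))    # (wall-clock time, instantaneous fps)
    return stamped

# Two logs captured in the same session can now share one time axis.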
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
1. How would benchmarking at low res make up for potential CPU bottlenecks in CPU-bound areas which are not tested either way? It's benchmarking approximately the same scene, at lower details with the same AI. So essentially you are just testing the draw-call limits, which are themselves less stressful due to the lower details.

2. Using your logic, the 7700K is a CPU bottleneck with a 1080 Ti in Doom @ 1440p compared to the 1800X, hence it must be equally bad at high res as well. But it has approximately the same min fps as the 1800X at 4K. Note that it's not a GPU bottleneck, as the same GPU can deliver 15% higher min fps. Point being: it's misleading. You should check the scores for the res you will play at, not some unrealistic res.
Ryzen-Doom-1440p-Ultra-Preset-Vulkan.png

Ryzen-Doom-2160p-Ultra-Preset-Vulkan.png
1) Low resolutions do not reduce draw calls. It's the same number regardless of resolution, unless the game scales back details, which is extremely rare. Low settings can sometimes change things, but resolution doesn't affect the number of draw calls. It does affect how much work the GPU has to do. The point of low-resolution testing is to test the CPU in the game, rather than the GPU.
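Schematically (a deliberately simplified sketch, not any real engine's code), the frame loop looks like this, which is why the draw-call count tracks the scene rather than the output resolution:

Code:
def submit_draw(batch):
    pass  # stands in for a real API call such as glDrawElements / Draw()

def render_frame(visible_batches, width, height):
    # width/height set the per-pixel rasterization cost on the GPU,
    # but play no part in how many calls the CPU submits.
    for batch in visible_batches:
        submit_draw(batch)
    return len(visible_batches)  # draw-call count depends on the scene only

scene = ["terrain", "buildings", "units", "particles"]
assert render_frame(scene, 1280, 720) == render_frame(scene, 3840, 2160)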

2) I'm not sure what is going on with that Doom example. It's not exactly a CPU bottleneck, as you can see: at 4K, the minimums are half those at 1440p. It's also not a common example, though it does look as though the lower-resolution benchmarks show the i7-7700K bottlenecking, and not so much at 4K.

Low-resolution CPU tests may not be perfect, but they sure beat benchmarking the CPU while GPU-bound, which says almost nothing. Ideally, reviewers should try to benchmark CPU-bound areas of a game, but they don't; it's too much work, so they compromise and use low resolutions to create a bottleneck.

I realize that some of the low-res gaming comparisons don't show Ryzen in the best light, but so what? Some do show it well. This isn't an Intel vs AMD thing, just a benchmark to help buyers know what to expect.
 
Last edited:

tamz_msc

Diamond Member
Jan 5, 2017
3,708
3,554
136
I want to know for how long the IMC can handle 1.9V on the memory. Sandy Bridge can't handle 1.65V reliably and was designed for 1.5V memory standards...

DDR4 is usually 1.2V... overclocked DDR4 is usually 1.35V...
Yes, this was my concern as well. Memory is much more sensitive to voltage than the CPU.
 

imported_jjj

Senior member
Feb 14, 2009
660
430
136
You do understand that memory has way more than just 4 timings, even if Ryzen right now gives you no control over any beyond the main 4? As I have said, you do not have to take my word for it; buildzoid explains nicely why the good memory OCs are done the way they are, as well as a few other cool bits, if you follow his posts.

Actually, "memory support is getting better literally every day" is news to me. So far i has seen the screenshot of someone who got IMC to run at over 2Ghz bus clock (so 4113MT for memory), announcement of Dancops guide with 3700MTs ( or something) on B-die. But nothing so far indicates that it will actually work without touching BCLK.

Sure, I would be glad to see evidence of this claim, because I must have missed it. Besides, even if the 6900K was running at 4.5GHz, that would not be enough to explain the discrepancy, but I digress: https://www.pcper.com/image/view/79205?return=node/67214

Yeah, there is limited access to timings for now, and that only means there is room for better results once they open up the timings.
Very high memory OCs are done with BCLK because the BIOS settings are lacking for now; that's all, and you know it. Nothing to do with 24/7, though.
The 3700MHz result is an extreme OC at CL12 with 1.9V; it's more about timings than clocks, but you have your point to make, so you are ignoring that.
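To put numbers on why CL12 at 3700MT/s is extreme, standard first-word-latency arithmetic (the comparison kits below are typical illustrative examples, not from the thread):

Code:
def first_word_latency_ns(cl, mt_per_s):
    # CL is counted in memory-clock cycles, and the memory clock is
    # half the transfer rate (DDR), so cycle time in ns = 2000 / MT/s.
    return cl * 2000.0 / mt_per_s

print(first_word_latency_ns(12, 3700))  # ~6.5 ns: the extreme 1.9V B-die result
print(first_word_latency_ns(15, 3000))  # ~10.0 ns: a typical 1.35V DDR4-3000 kit
print(first_word_latency_ns(15, 2133))  # ~14.1 ns: JEDEC-class DDR4-2133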
The evidence of PCPer's failure to set the stated clocks on the 6900K is in their benchmarks; the results show it, and it's a matter of competence and of trusting any numbers they ever post.

6900K at a claimed 3.5GHz: LAME 191 and CB R15 ST 151
6900K at claimed stock: LAME 192 and CB R15 ST 151
Both are wrong: the claimed 3.5GHz run is at 3.7GHz, and the stock run is at the same 3.7GHz, since they don't enable Turbo Boost Max 3.0.
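That inference assumes CB R15 single-thread scales roughly linearly with core clock, which is fair for this workload; under that assumption, a genuine 3.5GHz run should have scored visibly lower than the 3.7GHz one:

Code:
# Assuming CB R15 ST scales ~linearly with clock, equal scores imply
# equal actual clocks.
score_at_3p7 = 151
expected_at_3p5 = score_at_3p7 * 3.5 / 3.7
print(round(expected_at_3p5))  # ~143; both runs scoring 151 points to 3.7GHz in both cases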

clock-audacity.png

clock-cb15-1.png

audacity.png

cb15-1.png
 

imported_jjj

Senior member
Feb 14, 2009
660
430
136
I want to know for how long the IMC can handle 1.9V on the memory. Sandy Bridge can't handle 1.65V reliably and was designed for 1.5V memory standards...

DDR4 is usually 1.2V... overclocked DDR4 is usually 1.35V...

Look at timings, it's not a 24/7 OC, there is a huge difference.
 

looncraz

Senior member
Sep 12, 2011
722
1,651
136
Yeah, there is limited access to timings for now, and that only means there is room for better results once they open up the timings.
Very high memory OCs are done with BCLK because the BIOS settings are lacking for now; that's all, and you know it. Nothing to do with 24/7, though.
The 3700MHz result is an extreme OC at CL12 with 1.9V; it's more about timings than clocks, but you have your point to make, so you are ignoring that.
The evidence of PCPer's failure to set the stated clocks on the 6900K is in their benchmarks; the results show it, and it's a matter of competence and of trusting any numbers they ever post.

6900K at a claimed 3.5GHz: LAME 191 and CB R15 ST 151
6900K at claimed stock: LAME 192 and CB R15 ST 151
Both are wrong: the claimed 3.5GHz run is at 3.7GHz, and the stock run is at the same 3.7GHz, since they don't enable Turbo Boost Max 3.0.

clock-audacity.png

clock-cb15-1.png

audacity.png

cb15-1.png

Don't worry, I'm doing solid 3GHz numbers starting tomorrow - and I have data aplenty at 3GHz for Phenom II, Sandy Bridge, and Excavator. And other sources for 3GHz benchmarks are available as well.