Official AMD Ryzen Benchmarks, Reviews, Prices, and Discussion


Atari2600

Golden Member
Nov 22, 2016
1,409
1,655
136
You keep trying to insert "patches", but I'm struggling to think of patches that have come out and drastically changed game performance.

You are struggling to think full stop.

Not just patches for games, but for firmware and OS.

Unless you think BD didn't benefit from OS optimisation and improved platform performance?
Or Nehalem or Sandy Bridge?

Zen is a massive shift from other architectures on the market. It is currently a bit of an unoptimised mess. Performance will improve from software/firmware patches.



Yes, moving from 90 to 150FPS floats my boat.

Good for you (and the other 0.1% like you).

Go shell out for a 7700K then and be happy... knowing you'll have to repeat the process in about 18 months.
 
  • Like
Reactions: french toast

Puffnstuff

Lifer
Mar 9, 2005
16,178
4,869
136
I don't pay any attention to any benchmarks where the person performing them has OC'd the chips. In the real world a normal person using the hardware will run it stock, and this is what I want to see, as it's indicative of out-of-the-box performance. I also want to see CPU/GPU performance at real-world resolutions, and I wouldn't even play a game on my cell phone at some of the resolutions being used to test Ryzen. 1080p and up are the real-world resolutions that are relevant, so testers should stick to them if they want to be taken seriously. There are just too many apples-to-oranges-to-bananas comparisons going on, and all they are doing is confusing people.

Also if something is performing poorly I won't defend it no matter how much I might like it. For example, the best truck I've ever owned was a Toyota Tundra crew cab but does that make it the best truck right now? Nope, not when they've got a 10 year old powertrain and their competitors are continuously improving their offerings. Too many reviewers are blinded by brand loyalty and a desire to skew results in favor of one over the other. They should take a page from the Fast Lane Truck's Ike Gauntlet truck trailering test where each truck pulls the same weight trailer up the mountain pass for an apples to apples comparison of performance. Ford claims to have the highest output diesel engine and trailering capacity but on the test they lost to a truck with the lowest output and trailering capacity. Testing hardware should be done the same way and it should be used just as the manufacturer has spec'd it to.
 

inolvidable

Junior Member
Mar 30, 2009
5
2
81
It's been a while since I enjoyed a piece of tech this much. All the discoveries about Ryzen's strengths and limitations are thrilling (to me, and to many here for sure). I praise AMD a lot for bringing something truly capable, new and reasonably priced.

All these facts are compelling enough to go AMD for my next workstation rig, except for one thing. IMO, even if the limitations of Ryzen are here to stay (worst case scenario), its strengths make up for them as of today. However, I am worried about Intel heavily investing in adding AVX2 (and AVX-512) support to the productivity software that can make use of it (which is a lot) in the next 1-2 years. In this scenario the value of first-gen Ryzen would plummet over time. Intel CPUs are already beating Ryzen by up to 50% in software that makes use of AVX2.

On the one hand I know that, even with Intel heavily investing in doing so, it would take some time. On the other hand, given the performance boost we are talking about, you could be forced to update the CPU in 1-3 years. One good thing about AM4 is that it is here to stay for next-gen Zen+ CPUs, so you would only need to update the CPU, but Zen+ might still not support 256-bit AVX2, or it might introduce improvements (more PCIe lanes, quad-channel memory, etc.) that require a new mobo anyway.

I know this scenario is only in my head, but IMO it is plausible enough to give it a thought.
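To make the AVX2 worry concrete, here is a minimal C sketch (a hypothetical inner loop, not taken from any real application) of what "adding AVX2 support" typically means: the same multiply-accumulate written one float at a time versus eight floats per instruction with FMA. Workloads dominated by loops like this are where gains of the size quoted above can appear; code that is never rewritten this way sees little benefit.

Code:
#include <immintrin.h>
#include <stddef.h>

/* Scalar multiply-accumulate: one float per loop iteration. */
void madd_scalar(const float *a, const float *b, float *c, size_t n)
{
    for (size_t i = 0; i < n; ++i)
        c[i] += a[i] * b[i];
}

/* AVX2/FMA version: eight floats per iteration.
   Assumes n is a multiple of 8 and the buffers are 32-byte aligned;
   build with -mavx2 -mfma on gcc/clang. */
void madd_avx2(const float *a, const float *b, float *c, size_t n)
{
    for (size_t i = 0; i < n; i += 8) {
        __m256 va = _mm256_load_ps(a + i);
        __m256 vb = _mm256_load_ps(b + i);
        __m256 vc = _mm256_load_ps(c + i);
        vc = _mm256_fmadd_ps(va, vb, vc);   /* c = a*b + c across 8 lanes */
        _mm256_store_ps(c + i, vc);
    }
}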
 
  • Like
Reactions: lak_rok

Puffnstuff

Lifer
Mar 9, 2005
16,178
4,869
136
With the 6800K only being $405 and the 6850K being $580, it's hard to justify a Ryzen. Right now my hope is that Intel will drop the price of the 6900K like a rock before I get ready to build my next PC, and I will use it as the foundation of that build.
 
Last edited:
  • Like
Reactions: hotstocks

PotatoWithEarsOnSide

Senior member
Feb 23, 2017
664
701
106
Ok, so I've watched that YT video of the 4.0GHz OC actually running at 4.2GHz after coming out of sleep mode. Colour me impressed.
Got me thinking about the 1700 being stock 3.0GHz, but with many showing 3.2GHz as what the CPU is defaulting to.

That 4.2GHz looks like there's performance left on the table, but the BIOS is somehow preventing access to it. I know that performance doesn't always scale linearly with frequency, but that's 5% just there.

edit: CB15 score of 1750 @ 4.0GHz, and 1830 @ 4.2GHz
edit2: scratch the above - time stretching; Windows is being tricked into thinking it's running at 4.2GHz when it isn't.
 
Last edited:

french toast

Senior member
Feb 22, 2017
988
825
136
GPU bound benchmarks are also false - they paint a picture of false equivalence between two parts. We have two imperfect methods of benchmarking, so I ask again: what is more useless, contrived low-res CPU-bound benchmarks that give you some idea of the relationship between two parts, or flatline GPU-bound benchmarks that tell you absolutely nothing?
Right, we have two things going on here then.
Do you agree with Adoredtv that thread count, and not low-res gaming, is the best indicator of future performance in NEW GAMES?

Second, we have both stated that totally GPU-bound benchmarks are silly and useless, other than showing that a particular CPU is not bottlenecking 4K gaming.
What I have repeatedly said is that 1080p ultra seems like a great/realistic balance.
1080p is a low or basic resolution now; in fact it's not likely you would buy a lower-resolution screen, which would be totally dumb even if you could, considering the amount of money these things cost. 1080p with ultra settings does put some stress on the GPU, but not so much as to make it unrealistic, and it is representative of the settings someone is likely to run at 1080p, even with a great 1060, never mind a Titan.
 

ManyThreads

Member
Mar 6, 2017
99
29
51
Photoshop is garbage for multi-threaded scaling beyond a few cores. Pretty sad considering how well photo editing should actually lend itself to parallelism.

Frequency and IPC will reign supreme until Adobe fixes that (which is supposedly in the works... so... that's a matter of 7700k being better now, but possibly 70% slower in a couple years).

Still, it's not like Ryzen is slow - it's just not as fast as Intel's fastest quad core... but neither is Intel's fastest 8-core...

Thanks for your reply - your technical knowledge on these boards is appreciated by many I'm sure.

Any guesses why the 6900K was still faster than the Ryzen? Most other benchmarks show the Ryzen beating the 6900K.

Further, what is confusing to me is that Photoshop is not supposed to be multi-threaded, like you say and like I keep reading; however, when I open HWMonitor, all 8 threads of my 3770K are absolutely slammed at 95%-100% with many of the things I do. These include a third-party HDR creation tool (NIK Software's HDR EFEX Pro 2) and batch processing large amounts of 36MP RAW files. Now, I am not sure if my HDR plug-in and Photoshop's built-in HDR processor (which is what Puget Systems benchmarked) work the same. It's great to hear Adobe is working on better multi-threaded support though, I did not know that.
 
  • Like
Reactions: french toast

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
It's been a while since I enjoyed a piece of tech this much. All the discoveries about Ryzen's strengths and limitations are thrilling (to me, and to many here for sure). I praise AMD a lot for bringing something truly capable, new and reasonably priced.

All these facts are compelling enough to go AMD for my next workstation rig, except for one thing. IMO, even if the limitations of Ryzen are here to stay (worst case scenario), its strengths make up for them as of today. However, I am worried about Intel heavily investing in adding AVX2 (and AVX-512) support to the productivity software that can make use of it (which is a lot) in the next 1-2 years. In this scenario the value of first-gen Ryzen would plummet over time. Intel CPUs are already beating Ryzen by up to 50% in software that makes use of AVX2.

On the one hand I know that, even with Intel heavily investing in doing so, it would take some time. On the other hand, given the performance boost we are talking about, you could be forced to update the CPU in 1-3 years. One good thing about AM4 is that it is here to stay for next-gen Zen+ CPUs, so you would only need to update the CPU, but Zen+ might still not support 256-bit AVX2, or it might introduce improvements (more PCIe lanes, quad-channel memory, etc.) that require a new mobo anyway.

I know this scenario is only in my head, but IMO it is plausible enough to give it a thought.

If you take a HandBrake H.265 encode that uses AVX2, a Ryzen 8C is at about 6800 6C performance level. And that's an H.265 corner case in HandBrake.
I mean, this "up to 50%" is optimistic, and even then, so what? For the rest of the stuff Zen is at 6900-like perf and a dirt cheap pro tool.
Zen has 2 FPU units and it shows in a lot of productivity workloads; it compensates a lot for not having 256-bit AVX2, especially in those loads where it can use its FPU resources with its superior SMT.
Intel has segmented themselves into a corner and removed AVX2 from a huge part of their portfolio.
AVX2 uptake is slow. We are more in SSE than AVX for many applications, as far as I can tell.

My guess is AVX2 is a 7nm thing due to transistor cost. So a Zen++ in 2-3 years.

My take on this AVX2 issue is that it's actually the omission of full-width AVX2 that makes Zen so darn fast. It's a fine priority. AMD has historically brought tech to market too early. This time they did it right.
 
Last edited:

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
Ok, so I've watched that YT video of the 4.0GHz OC actually running at 4.2GHz after coming out of sleep mode. Colour me impressed.
Got me thinking about the 1700 being stock 3.0GHz, but with many showing 3.2GHz as what the CPU is defaulting to.

That 4.2GHz looks like there's performance left on the table, but the BIOS is somehow preventing access to it. I know that performance doesn't always scale linearly with frequency, but that's 5% just there.

edit: CB15 score of 1750 @ 4.0GHz, and 1830 @ 4.2GHz
I can change my RAM from 2666 to 2933 and it ran fine for 10 hrs, but I can't make a cold boot and get past the BIOS on my Asus B350M. It stops me. Lol.
I think there is a lot of protection. But fine. When they sort out the kinks we will know, but I personally expect very little uplift, if any. Windows and software are where they are.
 

guachi

Senior member
Nov 16, 2010
761
415
136
A CPU test where every CPU performs the same because the game is GPU bottlenecked is fairly useless (like at 4k, probably) as it tells you nothing about future potential.

However, if there are enough CPUs tested and at least one of them is significantly below the others (like >5%, so it's probably not testing noise), then that tells you something useful. You now know how fast a CPU needs to be today for 4k gaming in that particular game. Test enough games and enough CPUs and you get a good idea of what a baseline 4k CPU is, and you can compare whatever CPU you want with that CPU. If, as I recall, a 2500K or 8350 is roughly good enough for 4k gaming (ironically, this makes 4k gaming fairly cheap as you can get a cheap CPU and you don't need more than 60 fps), then you can say "the 7700k/R7 is 2x, 5x, 1000x faster than that chip. I'll probably be good for 4k gaming for X years."

That's useful. At least, that's useful to me as I have a 4k monitor and I'd love a productivity chip that I am assured will be more than enough to handle any game I toss at it either through its current IPC or its ability to handle 16 threads. At least with the R7, I don't really have to care which it is. I'd rather spend my money on a 4k *Sync monitor with greater than 60 Hz refresh rate than another CPU because I guessed wrong.
 

hotstocks

Member
Jun 20, 2008
81
26
91
I have been building computers and gaming for over 25 years. I was excited to build a Ryzen system but haven't. Why? Firstly, whether you like it or not, you are all beta testers. NOTHING works stably: mobos, RAM, hell, even Windows needs patching for Ryzen. So unless you want to pull your hair out with instability, heat, and dead hardware, Ryzen is not ready yet; they should have waited till these things were fixed. WHEN they are fixed I will then decide if the price difference between Ryzen and a 6850K is worth it or not. If Intel has a 6 or 8 core anywhere close to Ryzen's price, Ryzen will be dead; that is probably why they rushed it out.

And as for games, only about 0.1% of people care about 240Hz or even 120Hz monitors and getting 120-240fps. They are not professional CS:GO players. 99.9% of monitors are 1080p and so are 99.9% of good laptop panels, and these monitors run at 60Hz. 60fps at 1080p is where it is at and will be for quite a while. Devs develop games for PC at that target, and it will be the target for PS4 Pro and Xbox Scorpio (sure, Scorpio will claim 4K, but that will be 30fps; the GPU isn't powerful enough).

I have a 4.7GHz i5 and it beats Ryzen in almost every game benchmark, but I am considering a 6 or 8 core because I don't want to have to shut down my 2nd monitor with browsers, chat apps, video, and whatever else I am doing just to make FUTURE games faster. I know right now a fast 4 core beats a slow 8 core like Ryzen at games, but someone should benchmark what REAL PEOPLE do on their computers WHILE gaming. They run AV software and all the other stuff I mentioned. My guess is that with all that bloatware and crap running in the background while gaming, Ryzen might be faster than what I have.

So I will be building a new system, but I am not going to have a million headaches with incompatible parts, crashes, overheating, etc. just to save $200 over a STABLE Intel system. So I guess we will know which is the better chip soon. Either AMD fixes their mobo/RAM/Windows problems within a month, or everyone will be building cheaper 6 and 8 core Intels in 2 months. Hell, you'll be able to get a 6850K cheap in a month when Intel releases new 6 and 8 cores. In any case, any of these CPUs will get 60fps at 1080p, which is where 99.9% of people are gaming and want to be. My problem is not with Ryzen's slightly worse gaming; it is with all its other problems that make it so you can't even build a reliable system.
 
  • Like
Reactions: CHADBOGA

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
The point for Zen is that it gets exactly those 60fps 100% of the time and can do a lot of stuff at the same time.
For $330 plus $80 for the mobo.
That's cheap vs competing solutions.
I think you missed this:

krumme said:
Everyone on this forum should see this
http://www.youtube.com/watch?v=ylvdSnEbL50
Ryzen - The tech press loses the plot

WOW, I agree good video
 
  • Like
Reactions: looncraz

HutchinsonJC

Senior member
Apr 15, 2007
466
205
126
I'm perplexed by all the comments about how useless lower resolution gaming benchmarks are.

People care about such benchmarks because they show what the CPU is capable of in isolation from the GPU. It's a way to test the CPU's performance in a particular type of task (gaming) without other types of hardware (the GPU) influencing the result.

Low resolution game benchmarks ARE a CPU benchmark. They are NOT a gaming benchmark. They aren't used to show you gaming performance. They are used to show you the CPU's performance in gaming-like tasks, even if not in a realistic gaming environment.

It doesn't matter how unrealistic or bonkers it sounds to the lot of you that hate seeing low resolution gaming benchmarks; there are several of us who are interested in seeing how the CPU performs in isolation from the GPU. It's been that way for a long time in the various tech forums... goes back easily two decades.

Since Ryzen is a CPU, and since people are looking to judge its performance...

Guess what? Low resolution gaming benchmarks are one way to test CPU performance. The idea has been around for a long time. Grow up, folks.
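For anyone who wants the logic spelled out, here is a toy C model (the millisecond figures are invented purely for illustration) of why dropping the resolution exposes the CPU: with the CPU and GPU stages overlapped, a frame ships only as fast as the slower of the two, so shrinking the GPU's share by lowering the resolution makes whatever ceiling the CPU imposes visible.

Code:
#include <stdio.h>

/* Toy model: per-frame CPU work and GPU work are pipelined, so throughput
   is limited by whichever stage takes longer. */
static double fps(double cpu_ms, double gpu_ms)
{
    double bottleneck = cpu_ms > gpu_ms ? cpu_ms : gpu_ms;
    return 1000.0 / bottleneck;
}

int main(void)
{
    /* Hypothetical numbers: the CPU needs 8 ms per frame at any resolution,
       while the GPU needs 4 ms at 720p but 16 ms at 4K. */
    printf("720p: %.1f fps (CPU-bound: differences between CPUs show up)\n", fps(8.0, 4.0));
    printf("4K:   %.1f fps (GPU-bound: most CPUs produce the same number)\n", fps(8.0, 16.0));
    return 0;
}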
 

dahorns

Senior member
Sep 13, 2013
550
83
91
Right, we have two things going on here then.
Do you agree with Adoredtv that thread count, and not low-res gaming, is the best indicator of future performance in NEW GAMES?

You're making a false equivalence here. Assuming Adoredtv's numbers are right, it does appear there is a trend of 8-threaded CPUs outperforming 4-threaded CPUs in certain more recent titles. But that does not mean that we will see a similar trend going from 8-threaded to 16-threaded CPUs. Reasons why that may not be the case include the diminishing returns associated with increased thread count when part of the workload is not parallel. Additionally, Adoredtv's own data showed that, up until 5 years after the release of the CPU, low-res gaming performance was relatively predictive of future gaming performance. Finally, even back in 2012 it was possible to find games where the 8350 outperformed the 2500K. Accordingly, the observed trend may be exaggerated (one way or the other) by sample bias.
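The diminishing-returns point can be put in numbers with Amdahl's law. A quick C sketch (the 70% parallel fraction is an assumption picked only to show the shape of the curve, not a measurement of any real engine):

Code:
#include <stdio.h>

/* Amdahl's law: if a fraction p of the per-frame work can run in parallel,
   the best-case speedup on n threads is 1 / ((1 - p) + p / n). */
static double amdahl(double p, int n)
{
    return 1.0 / ((1.0 - p) + p / (double)n);
}

int main(void)
{
    double p = 0.70;                  /* assumed parallel fraction */
    int threads[] = { 2, 4, 8, 16 };
    for (int i = 0; i < 4; ++i)
        printf("%2d threads: %.2fx speedup\n", threads[i], amdahl(p, threads[i]));
    return 0;
}

Under that assumption, going from 4 to 8 threads is still worth something (about 2.1x to 2.6x), but 8 to 16 adds much less (about 2.9x), which is why the observed 4-thread-vs-8-thread trend can't simply be extended upward.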

Second, we have both stated that totally GPU-bound benchmarks are silly and useless, other than showing that a particular CPU is not bottlenecking 4K gaming.
What I have repeatedly said is that 1080p ultra seems like a great/realistic balance.
1080p is a low or basic resolution now; in fact it's not likely you would buy a lower-resolution screen, which would be totally dumb even if you could, considering the amount of money these things cost. 1080p with ultra settings does put some stress on the GPU, but not so much as to make it unrealistic, and it is representative of the settings someone is likely to run at 1080p, even with a great 1060, never mind a Titan.

I think your suggestion is reasonable as part of the total picture. But there still isn't any reason lower-res benches can't also be utilized, so long as they are provided in proper context (at some point with a low-res bench you're most likely measuring a very particular aspect of the CPU, not generalized performance).
 

french toast

Senior member
Feb 22, 2017
988
825
136
I'm perplexed by all the comments about how useless lower resolution gaming benchmarks are.

People care about such benchmarks because they show what the CPU is capable of in isolation from the GPU. It's a way to test the CPU's performance in a particular type of task (gaming) without other types of hardware (the GPU) influencing the result.

Low resolution game benchmarks ARE a CPU benchmark. They are NOT a gaming benchmark. They aren't used to show you gaming performance. They are used to show you the CPU's performance in gaming-like tasks, even if not in a realistic gaming environment.

It doesn't matter how unrealistic or bonkers it sounds to the lot of you that hate seeing low resolution gaming benchmarks; there are several of us who are interested in seeing how the CPU performs in isolation from the GPU. It's been that way for a long time in the various tech forums... goes back easily two decades.

Since Ryzen is a CPU, and since people are looking to judge its performance...

Guess what? Low resolution gaming benchmarks are one way to test CPU performance. The idea has been around for a long time. Grow up, folks.
Context is everything. No one stated that low-res benchmarks don't stress the CPU, and no one is denying that it's interesting for us tech geeks; you're missing the point entirely.
Reviewers use these tests to establish which CPU is the better performer in future games (= better gaming CPU), and this has been debunked.
I'm all for including these tests if they are taken for what they are: corner-case tests for niche scenarios, of interest to tech geeks, NOT a way to determine future gaming performance.
 
  • Like
Reactions: Vaporizer

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
I'm perplexed by all the comments about how useless lower resolution gaming benchmarks are.

People care about such benchmarks because they show what the CPU is capable of in isolation from the GPU. It's a way to test the CPU's performance in a particular type of task (gaming) without other types of hardware (the GPU) influencing the result.

Low resolution game benchmarks ARE a CPU benchmark. They are NOT a gaming benchmark. They aren't used to show you gaming performance. They are used to show you the CPU's performance in gaming-like tasks, even if not in a realistic gaming environment.

It doesn't matter how unrealistic or bonkers it sounds to the lot of you that hate seeing low resolution gaming benchmarks; there are several of us who are interested in seeing how the CPU performs in isolation from the GPU. It's been that way for a long time in the various tech forums... goes back easily two decades.

Since Ryzen is a CPU, and since people are looking to judge its performance...

Guess what? Low resolution gaming benchmarks are one way to test CPU performance. The idea has been around for a long time. Grow up, folks.

You're right, it is "one" way to bench CPUs. However, it has no bearing on real-world situations. It's an academic test, good to know, but it does not define the CPU in question. The problem is people run around waxing on about low-res testing as if it were the definition, and that is problematic.
 
  • Like
Reactions: french toast

Despoiler

Golden Member
Nov 10, 2007
1,967
772
136
I'm perplexed by all the comments about how useless lower resolution gaming benchmarks are.

People care about such benchmarks because they show what the CPU is capable of in isolation from the GPU. It's a way to test the CPU's performance in a particular type of task (gaming) without other types of hardware (the GPU) influencing the result.

Low resolution game benchmarks ARE a CPU benchmark. They are NOT a gaming benchmark. They aren't used to show you gaming performance. They are used to show you the CPU's performance in gaming-like tasks, even if not in a realistic gaming environment.

It doesn't matter how unrealistic or bonkers it sounds to the lot of you that hate seeing low resolution gaming benchmarks; there are several of us who are interested in seeing how the CPU performs in isolation from the GPU. It's been that way for a long time in the various tech forums... goes back easily two decades.

Since Ryzen is a CPU, and since people are looking to judge its performance...

Guess what? Low resolution gaming benchmarks are one way to test CPU performance. The idea has been around for a long time. Grow up, folks.

Low resolution testing is nothing more than creating a synthetic benchmark. Everyone knows that synthetic benchmarks do not reflect the real world. Yet low-res benchmarking has continually been passed off as just as important as real-world testing; at the very least it's been put above synthetic benchmarks. Therein lies the problem.
 

french toast

Senior member
Feb 22, 2017
988
825
136
You're making a false equivalence here. Assuming Adoredtv's numbers are right, it does appear there is a trend of 8-threaded CPUs outperforming 4-threaded CPUs in certain more recent titles. But that does not mean that we will see a similar trend going from 8-threaded to 16-threaded CPUs. Reasons why that may not be the case include the diminishing returns associated with increased thread count when part of the workload is not parallel. Additionally, Adoredtv's own data showed that, up until 5 years after the release of the CPU, low-res gaming performance was relatively predictive of future gaming performance. Finally, even back in 2012 it was possible to find games where the 8350 outperformed the 2500K. Accordingly, the observed trend may be exaggerated (one way or the other) by sample bias.



I think your suggestion is reasonable as part of the total picture. But there still isn't any reason lower-res benches can't also be utilized, so long as they are provided in proper context (at some point with a low-res bench you're most likely measuring a very particular aspect of the CPU, not generalized performance).
I agree with part of your comment. Low-res gaming DID predict future performance up to the point that multicore became prevalent and software started to take advantage of it, so yes, if taken over a 15-year span you're correct, it doesn't hold up for the era before software matured; but that is not the case today. The only thing that could change this present reality is if some other uarch feature could affect gaming, such as AVX2, but that is also multicore-related, so I suppose the same rules would apply. Regardless, AVX2 is not going to be used in games any time soon, thanks mostly to Intel fusing it off on cheaper SKUs, and because AMD makes console CPUs and dictates the API landscape.
So in summary, yes, there is some merit to what you say, but threads are the best indicator of future performance; they have been for at least 5 years and look to be for the immediate future.

I agree, and have stated, that there is a role for 720p low-settings game benchmarks; it is a valid CPU test. My issue is when those tests are misrepresented to reach false conclusions. Unless you specifically play games at 720p, want to play the same game for years whilst upgrading the GPU, or are simply curious how a CPU behaves under that test scenario, you have no reason to base your CPU purchase on such a test.
 
  • Like
Reactions: JimKiler

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
You're making a false equivalence here. Assuming Adoredtv's numbers are right, it does appear there is a trend of 8-threaded CPUs outperforming 4-threaded CPUs in certain more recent titles. But that does not mean that we will see a similar trend going from 8-threaded to 16-threaded CPUs. Reasons why that may not be the case include the diminishing returns associated with increased thread count when part of the workload is not parallel. Additionally, Adoredtv's own data showed that, up until 5 years after the release of the CPU, low-res gaming performance was relatively predictive of future gaming performance. Finally, even back in 2012 it was possible to find games where the 8350 outperformed the 2500K. Accordingly, the observed trend may be exaggerated (one way or the other) by sample bias.



I think your suggestion is reasonable as part of the total picture. But there still isn't any reason lower-res benches can't also be utilized, so long as they are provided in proper context (at some point with a low-res bench you're most likely measuring a very particular aspect of the CPU, not generalized performance).
Real DX12 games are still not here. They will make a huge jump for PC multithreading. More threads are always a pain, but DX12 more than compensates for it. The problems arise when DX12 is really here in 3 years. At that time we look to be at an fmax/IPC wall and a (less steep) core wall.
 

HutchinsonJC

Senior member
Apr 15, 2007
466
205
126
Watch the video (prior post). You missed the point of the discussion.
No need to say "grow up". I mean ....it kind of bites you 2 sec later :)

No, it doesn't bite me two seconds later, because nowhere in my post did I mention or suggest that the results of the benchmark would be an indicator of future gaming. Everyone knows gaming will move to more threads. It's not hard to take the results of a CPU benchmark and use them only as an indicator of how the CPU performs in that particular game, as a benchmark for that particular game.

People like to extrapolate data in different ways, and THAT'S where people can make mistakes. It's nothing but obvious that games WILL move to more threads. The whole console gaming industry is on weak (low frequency) but highly multi-threaded CPUs. A large majority of gaming, these days, is console first, PC afterthought. If you extrapolate the data correctly, if you write a review properly, it's not hard to write some verbiage in there to indicate that, even though a particular CPU performs poorly in this particular game or that, odds are that, relatively speaking, its gaming prowess will increase over the years as game developers start to write their games to take advantage of more threads.
 
  • Like
Reactions: french toast

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
If you extrapolate the data correctly, if you write a review properly, it's not hard to write some verbiage in there to indicate that, even though a particular CPU performs poorly in this particular game or that, odds are that, relatively speaking, its gaming prowess will increase over the years as game developers start to write their games to take advantage of more threads.

But it's a lot easier getting site traffic when you write "Ryzen Fails at Gaming", all predicated on a ridiculous 640x480 gaming test!
 
  • Like
Reactions: french toast