Official AMD Ryzen Benchmarks, Reviews, Prices, and Discussion


Aristotelian

Golden Member
Jan 30, 2010
http://www.gamersnexus.net/hwreviews/2827-amd-r7-1700-review-amd-competes-with-its-1800x

I didn't see this posted and am unsure how popular GamersNexus is here, but they do very comprehensive reviews that work well for me. I really wanted Ryzen to blow Intel's offerings away in every way (beyond performance per dollar, which looks quite good). The review I linked gave great food for thought, and here's what I took from it:

(i) if you're just gaming, the 1700 is the champion of the current Ryzen lineup, with its overclock outperforming the 1800x's overclock in games;
(ii) the 7700K (overclocked or stock) still beats the Ryzen 1700 (overclocked or stock), and at least where I am, both are roughly EUR 350.

Having said that, I still think that AMD has come really far. I'm optimistic about future offerings/competitiveness, because I suppose I'm used to viewing Intel as being fairly complacent in its market position (and I'm aware that my position is technically naive because it must cost an immense amount of money to research how to make chips faster and faster given physical limitations that exist).

I'm also interested in comments I've read from AMD's side acknowledging that gaming performance is lagging but suggesting (to summarize) that it's an issue that can be dealt with. If AMD could improve its gaming performance with these chips, that could be a fantastic boost to the perception of this lineup.
 
  • Like
Reactions: french toast

guachi

Senior member
Nov 16, 2010
New game engines are increasingly better threaded. This is not a novel idea. That does not invalidate low res CPU benchmarks. Regardless of whether you think they're important or not, there are still a lot of people to whom those benchmarks are relevant. The best compromise would be if the reviewers just included both.

Ceteris paribus, a low res benchmark is probably useful. But between the 8350 and 2500K, ceteris weren't paribus. Just like between the R7 and the 7700K.

If you look at the results of the 8350 vs. the 2600K, the 2600K stays basically the same distance ahead. But this was a test where other things (well, core/thread count) were equal.

Whatever problems the 8350 might have had scaling with faster GPUs were completely overwhelmed by (presumably) games taking advantage of more threads. When we see benchmarks of Ryzen chips with 4 cores/8 threads, I don't think anyone will claim they will get faster over time relative to the 7700K. Once behind, always behind.
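
(A rough way to picture the bottleneck argument above: the delivered frame rate is capped by whichever of the CPU or GPU is slower, so a low-res test mostly exposes the CPU cap while a GPU-bound test hides it. The sketch below is a toy model; the frame-rate caps are invented numbers, not measurements.)

```
# Toy bottleneck model: delivered FPS is capped by the slower of CPU and GPU.
# All numbers here are invented for illustration, not benchmark results.

def delivered_fps(cpu_cap, gpu_cap):
    """Frame rate the whole system actually achieves."""
    return min(cpu_cap, gpu_cap)

# Low resolution: the GPU cap is huge, so the CPU difference is visible.
print(delivered_fps(cpu_cap=110, gpu_cap=400))  # 110 (slower CPU, CPU-bound)
print(delivered_fps(cpu_cap=160, gpu_cap=400))  # 160 (faster CPU, CPU-bound)

# High resolution: both CPUs sit behind the same GPU cap and look identical.
print(delivered_fps(cpu_cap=110, gpu_cap=90))   # 90 (GPU-bound, gap hidden)
print(delivered_fps(cpu_cap=160, gpu_cap=90))   # 90 (GPU-bound, gap hidden)
```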
 
  • Like
Reactions: french toast

french toast

Senior member
Feb 22, 2017
Re-read my last two posts. You're asking me to recreate an Intel version of his flawed analysis and commit exactly the same fallacy that I was criticizing the video for.

New game engines are increasingly better threaded. This is not a novel idea. That does not invalidate low res CPU benchmarks. Regardless of whether you think they're important or not, there are still a lot of people to whom those benchmarks are relevant. The best compromise would be if the reviewers just included both.
Yes, it does invalidate low-res benchmarks when comparing an 8/16 processor vs a 4/8 processor and erroneously suggesting to consumers that such benchmarks predict future performance. THEY DON'T. You have been shown why this is the case; if you feel strongly otherwise, feel free to offer a good explanation, preferably with a valid example.

An example where low-res gaming IS VALID to use as a future performance prediction is when comparing two processors with the same thread count, such as the 7700K vs the 1400X. That is a valid comparison to let consumers weigh up whether it's worth spending nearly twice the money (after mobo/cooler costs are included) for a 7700K that is obviously better performing for present and future.
That is without discussing the effect of Ryzen having a brand new uarch, with many game/BIOS/OS updates that will also affect that decision process.
 

Atari2600

Golden Member
Nov 22, 2016
I'm making an argument against GPU limited CPU benchmarks in CPU reviews/the idea that CPU limited CPU benchmarks have no value.

Well, why do they have value? What do they show?

It would seem all they show is current performance of the CPU on a current GPU on the current version of a game.

They don't show future performance of the same CPU on the same GPU on a future version (patch) of the game. Or a future GPU.
 
  • Like
Reactions: krumme

french toast

Senior member
Feb 22, 2017
Well, why do they have value? What do they show?

It would seem all they show is current performance of the CPU on a current GPU on the current version of a game.

They don't show future performance of the same CPU on the same GPU on a future version (patch) of the game. Or a future GPU.
Technically there would be more future headroom for a more powerful GPU, or for demonstrating 240 fps+ competitive gaming, provided you're comparing two CPUs with the same thread count.
Obviously, basing your purchase solely off of 4K ultra benchmarks is also unwise, as it's far too GPU bound, but I would suggest 1440p, or preferably 1080p ultra, as a realistic balance.
Testing differently threaded CPUs at 720p is as useful as a chocolate teapot; can you even buy a 720p monitor to play Snake at 500 fps?
 

french toast

Senior member
Feb 22, 2017
No it doesn't. If X game doesn't utilize more than 4/8 threads, X game is currently GPU bottlenecked, and you still intend to play X game in 2-4 years' time when GPU horsepower has doubled, then they absolutely do help predict the CPU power reserves you might have in that event. Again, I'm not using this as an argument against Ryzen specifically --right at this point its ST performance is pretty competitive-- but this has played out countless times in the past (games like Skyrim, Crysis).

You're so fixated on the core count of the CPU, and so confident in your assumption that everyone is going to be playing exclusively highly threaded titles from the future in the future, that you've repeatedly missed the point I'm making, despite my having spelled it out in the simplest possible terms.

As I've said from the start, it all comes down to the individual consumer's priorities. You don't help consumers by hiding the potential differences between CPUs by benchmarking them in a quasi-real-world GPU bound scenario.
You can't base a full CPU review on the assumption that consumers are going to be playing the very same game 4 years later! :/
Come on dude.
 

guachi

Senior member
Nov 16, 2010
If you buy a new GPU that's twice as fast and still intend to play a four year old game, I don't think your CPU is particularly relevant at that point. If your resolution hasn't changed then you'll be getting such crazy frame rates it won't matter.

I only have an RX 480, and no game I have that's four years old is even a problem when I game.

"Better at old games!!" isn't a particularly compelling selling point.
 

french toast

Senior member
Feb 22, 2017
Have you noticed how popular Skyrim and GTA 5 still are? And you're not "basing the full CPU review on that". No one will ever answer me this, I think I've already asked it twice in this thread: What is more useless: contrived low-res CPU-bound CPU benchmarks that give you some idea of the relationship between two parts, or flatline GPU-bound CPU benchmarks that tell you absolutely nothing?
You are mixing things up here so this will be my last comment on the subject.
AdoredTV (as were we originally) is talking about reviewers erroneously using super-low-res benchmarks to predict future performance. This is mainly directly related to the 7700K vs the R7 1700/Ryzen, so we were originally speaking of 4/8 vs 8/16.
You have yet to give a valid example as to why Adored is being biased/wrong; instead you have introduced a corner-case argument into the discussion, of consumers playing the same game for 4 years. It would be obvious to anyone reading the review that a more powerful GPU would speed up the currently tested games, without bothering with such benchmarks.
When reviewers use the low-res benchmarks to predict future performance (their words), they are quite obviously expecting you to be playing new games in 2 years, not the same bloody game, therefore future advances in game technology are valid.

PS: Even if they introduce such benchmarks for corner-case usages, they should NOT be used to form the review conclusion, which is what the whole fuss is about in the first place!
 

krumme

Diamond Member
Oct 9, 2009
No it doesn't. If X game doesn't utilize more than 4/8 threads, X game is currently GPU bottlenecked, and you still intend to play X game in 2-4 years' time when GPU horsepower has doubled, then they absolutely do help predict the CPU power reserves you might have in that event. Again, I'm not using this as an argument against Ryzen specifically --right at this point its ST performance is pretty competitive-- but this has played out countless times in the past (games like Skyrim, Crysis).

You're so fixated on the core count of the CPU, and so confident in your assumption that everyone is going to be playing exclusively highly threaded titles from the future in the future, that you've repeatedly missed the point I'm making, despite my having spelled it out in the simplest possible terms.

As I've said from the start, it all comes down to the individual consumer's priorities. You don't help consumers by hiding the potential differences between CPUs by benchmarking them in a quasi-real-world GPU bound scenario.



No, that's what the so-called "real world" GPU bound benchmarks show, and that's precisely why they're useless.



They can show what performance you might be able to attain with a future GPU.
Yep. But this argument is only valid as long as you run an 8350 that couldn't get the most out of the games when it launched. It was too slow out of the gate for even 60 fps gaming.
The situation now is different.

There are a few/a handful of games today where people with a 144 Hz monitor benefit from a 7700. They get e.g. 144 vs 120 fps, with better frametimes, with a very fast graphics card at 1080p.

And if they can buy a new i7/CPU in 2 years and they play those games, certainly the 7700 is clearly fastest. It's pretty apparent, as the numbers are all over. But it's also a 144 Hz situation, and not the 60 fps that the huge majority still runs.

In BF1 today you can make a modern i7 tank below 60 fps. Absolutely a corner case in the game and, as far as I know, the only game that kills the processor like that. But it's actually a point in relation to the 8350's non-60 fps situation of 2012.

As we know, it's the 1%/0.01% minimums that define the experience, for FPS/action games at least.

I think for people playing with a 60/90 Hz monitor and/or keeping their processor for over 2 years, 8C is far more future-proof than a fast 4C.
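
(Since the 1%/0.01% minimums keep coming up: one common way to derive a "1% low" figure is to average the slowest 1% of frametimes in a capture and convert back to FPS. The sketch below uses an invented frametime list purely to show the calculation; it is not data from any review.)

```
# One common way to compute "1% low" FPS from a frametime log (milliseconds).
# The frametime list below is invented purely to illustrate the calculation.

def one_percent_low_fps(frametimes_ms):
    """Average FPS over the slowest 1% of frames."""
    worst = sorted(frametimes_ms, reverse=True)       # slowest frames first
    cutoff = max(1, len(worst) // 100)                # worst 1% of samples
    avg_worst_ms = sum(worst[:cutoff]) / cutoff
    return 1000.0 / avg_worst_ms

frametimes = [7.0] * 980 + [25.0] * 20                # mostly ~143 fps, a few stutters
avg_fps = 1000.0 / (sum(frametimes) / len(frametimes))
print(round(avg_fps))                                 # ~136 average fps
print(round(one_percent_low_fps(frametimes)))         # 40 fps for the 1% lows
```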
 
  • Like
Reactions: Trender

guachi

Senior member
Nov 16, 2010
Jesus wept, is it possible that you guys could do a more spectacular job of missing the point? You won't be getting those "crazy frame rates" if you're CPU bound.

Incidentally, a GTX 780 couldn't sustain 60fps in Crysis at 1920x1080 with 4xAA.

Your example was of a game that was GPU bound, not CPU bound, and that is 2-4 years old. By the time that game becomes CPU bound, the fps is going to be crazy high, because we know that the R7 is fine in every 1920x1080 game, and in 4K gaming you can get by with a lot slower CPU. By the time 4K gaming can stress the CPU, 8/16 should trounce any 4/8 CPU anyway.
 

Kenmitch

Diamond Member
Oct 10, 1999
Predicting future CPU gaming performance longevity using past performance trends seems silly at best. It's probably best to use some imagination and try to predict (based on current trends) where it's headed in the end.

Sadly some bigger players in the market tend to sandbag the evolution process.

Based on the past, Ryzen should have been hot, power hungry, undesirable, and lacking in performance! This philosophy could explain why it kind of looks like the MB makers got caught with their pants down. AMD delivered a competitive product, which seemed impossible... if that makes sense.

The past doesn't dictate the future. In the end it only shows what happened then and doesn't really reflect on what will happen tomorrow, next month, year, etc.
 
  • Like
Reactions: lightmanek

french toast

Senior member
Feb 22, 2017
Predicting future CPU gaming performance longevity using past performance trends seems silly at best. It's probably best to use some imagination and try to predict (based on current trends) where it's headed in the end.

Sadly some bigger players in the market tend to sandbag the evolution process.

Based on the past, Ryzen should have been hot, power hungry, undesirable, and lacking in performance! This philosophy could explain why it kind of looks like the MB makers got caught with their pants down. AMD delivered a competitive product, which seemed impossible... if that makes sense.

The past doesn't dictate the future. In the end it only shows what happened then and doesn't really reflect on what will happen tomorrow, next month, year, etc.
I think Adored showed that thread count is the one metric that does indicate longevity.
Performance at low res vs. threads: history has clearly shown threads are the better indicator.
 

unseenmorbidity

Golden Member
Nov 27, 2016
I have said this like a dozen times already, but maybe it's worth repeating. The future of the gaming industry isn't hard to decipher. In fact, it's basically written in stone! Consoles! Game developers make games for consoles, and then port them to PC. The PC is an afterthought for most developers.

The current generation of consoles is old and outdated. Yet, despite this fact, consumers expect better and better games each year. So, what does that mean for game developers? They need to squeeze more and more performance out of those consoles, which in turn means they have to better optimize the games to work on the old 8-thread CPUs. This will continue for 2 years, until the next generation of consoles comes out. Those consoles will likely have an 8-core Zen CPU.
 

Atari2600

Golden Member
Nov 22, 2016
They can show what performance you might be able to attain with a future GPU.

Sorta. Kinda.

Leaving aside game patches for a minute, it shows very little of relevance.

Unless moving from 90 FPS to 150 FPS really floats your boat?
 

french toast

Senior member
Feb 22, 2017
And I said they predict it in more ways than one. He said low res benches are useless, I raised an objection to that. Don't try to tell me he was referring exclusively to Ryzen vs current Intel, because the majority of his video dealt with legacy hardware.



I introduced that "corner case" clearly in my first post (future gains for current games), so if you want to dismiss my argument based on that, why didn't you do it then? Oh, you didn't expect 2-4 years? Give me a break, that's hardly an outlandish upgrade cycle for a GPU.



Yes, and that's why the CPU is even in question at all. If the game wasn't GPU bound at the time, there would be no dilemma in choosing the CPU because you'd just choose the fastest one.



Who said anything about Ryzen? Again, this is not an anti Ryzen argument.
Yes you did (missed that) but it doesn't change my responses.
I already accounted for one corner case: 240 fps+ low-res gaming such as CS:GO. You introduce another corner case; fine, I do not complain about benchmarking for such things. But I will repeat: the whole discussion is about reviewers using such benchmarking to form overall conclusions when giving their stamp of approval. They make no such comments about those corner cases, and so it is assumed (very clearly) that they are trying to demonstrate better future performance in typical usages (new games) using 720p benchmarking, and this is false.
Corner cases are just that.
 

Despoiler

Golden Member
Nov 10, 2007
Windows sleep bug related to Ryzen: clock rates are reported higher after coming out of sleep AND benchmark results are higher too, although the results don't track the reported clock rate 100%, i.e. if the reported clock rate is 4.8, the score is in line with a 4.2 OC. A 4.2 OC is not possible to achieve manually, though.
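
(The "score is in line with a 4.2 OC" observation is just proportional scaling: assuming the benchmark scales roughly linearly with clock, you can back out an effective clock from a known baseline. The scores and clocks below are placeholders, not real results.)

```
# Back-of-the-envelope estimate of effective clock from a benchmark score,
# assuming the score scales roughly linearly with clock speed.
# All scores/clocks below are placeholders, not measured results.

def effective_clock_ghz(score, baseline_score, baseline_clock_ghz):
    """Estimate the clock the chip is effectively running at."""
    return baseline_clock_ghz * (score / baseline_score)

baseline_score = 1800   # hypothetical score at a verified 4.0 GHz
observed_score = 1890   # hypothetical score after resuming from sleep

print(round(effective_clock_ghz(observed_score, baseline_score, 4.0), 2))  # 4.2
```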

 

beginner99

Diamond Member
Jun 2, 2009
Gears of War on PC is DirectX 12 purely.

It's based on Unreal Engine 4, which itself is not purely DX12. Sorry, no. There is no pure DX12 game or engine anywhere, not even AotS. To fully take advantage of DX12 (or Vulkan), no DX11 path can exist, because of fundamentally different paradigms. It will take years until such a game actually comes to market.
 

Joric

Junior Member
Mar 4, 2017
Sorta. Kinda.

Leaving aside game patches for a minute, it shows very little of relevance.

Unless moving from 90 FPS to 150 FPS really floats your boat?

Actually the initial tests of the FX-8350 that AdoredTV used showed a 17% disadvantage at 640x480 compared to the 2500K in 2012 using a GTX 680 (while showing a 9% advantage in synthetic benchmarks).
By February 2017, using a 980 Ti at 720P, the FX-8370 was at a 12% disadvantage to the 2500K.
So yeah, I'd say that was a pretty reasonable indicator of future GPU performance with equivalent CPUs (and the 1080P performance gap stayed around 9-10%).
In March of course they updated to DX12 benchmarks, games with more multithreading, along with the Titan XP.
Then the FX-8370 overtook the 2500K and achieved its 10% theoretical advantage.
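
(For anyone following along, the percentage gaps cited above are just relative deficits of the slower chip versus the faster one. The FPS figures in the sketch below are placeholders chosen to reproduce a 17% and a 12% gap; they are not the actual review numbers.)

```
# How a percentage gap like "17% disadvantage" is computed: the relative deficit
# of the slower chip versus the faster one. FPS values below are placeholders.

def deficit_pct(slow_fps, fast_fps):
    return 100.0 * (fast_fps - slow_fps) / fast_fps

print(round(deficit_pct(slow_fps=83, fast_fps=100)))   # 17  (e.g. 640x480, 2012)
print(round(deficit_pct(slow_fps=88, fast_fps=100)))   # 12  (e.g. 720p, 2017)
```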
 