
Official AMD Ryzen Benchmarks, Reviews, Prices, and Discussion

Page 108

Rifter

Lifer
Oct 9, 1999
11,518
745
126
It boggles the mind that they didn't work out the Windows driver before launch.

Seems like that's critical.
Yes, I agree. Not optimizing Windows to see the CCXs and to keep threads from moving from one CCX to the other is surely the reason for the bugs. Migration causes a big performance hit because the L3 cache is not shared between CCXs.

Luckily this should be an easy Windows software fix.
 
  • Like
Reactions: Space Tyrant

moonbogg

Diamond Member
Jan 8, 2011
9,892
1,541
126
What? Cherry-picked benchmarks? All companies show themselves in the best light, but there is no evidence AMD led you up the garden path. All the benchmarks, including single-thread, have checked out, with Ryzen being faster in some and slower in others.

They didn't say it would be great at 120fps 1080p gaming, did they? They specifically showed 4K gaming scenarios, which for the most part are representative of the settings people buying these systems would use.
Neither did they promise or even hint at amazing overclocks; Broadwell-E at twice the price only gets about 200MHz more.
You are getting 6900K performance for half the price, and if you ordered the 1700, the value is even better.
By all accounts any issues are likely software-based, which may also marginally improve overclocks.

Be honest: did you buy the system for 120fps 1080p gaming? If not, what are you complaining about?
I game at 3440x1440@100Hz. The Ryzen CPU failed to break 100fps in some games where the Intel chips did, and some of those results were really terrible. I don't care about 200fps vs 500fps, but I do care when Ryzen can't even feed a GPU enough to max out a 100Hz monitor. I'm planning on a 1080 Ti, and I don't want 1070 performance from it because my CPU can't keep up. I'm not alone here; a lot of people are kind of freaking out about this.
 
  • Like
Reactions: Arachnotronic

Crumpet

Senior member
Jan 15, 2017
745
538
96
It's crazy to think that way about a corporation, imo.

Sounds like you are talking about Santa Claus. :)

AMD doesn't give a darn about us, just like Intel.
No, but we should give a damn about Intel having good competition. Plus, AMD has made advances beyond just FPS and IPC gains; some of the things AMD has managed are groundbreaking.
 
Last edited:

psolord

Golden Member
Sep 16, 2009
1,308
352
136
If that CCX performance hit is real, has anyone tested with one CCX disabled? If I'm not mistaken, there are Ryzen Master and BIOS options to disable cores, right?

Maybe playing with Task Manager's affinity settings in problematic games would shed some light too?

Imagine if disabling one CCX (4 cores) improved performance in some games. Now imagine if some R3s were salvaged R7s and others were single-CCX parts. Hmmm...!
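For what it's worth, the one-CCX experiment doesn't strictly need a BIOS option: restricting a process to the first CCX's logical CPUs approximates it. Here's a minimal Linux sketch (Task Manager's affinity dialog does the same thing by hand on Windows); the assumption that logical CPUs 0-7 map to CCX0 on an 8-core SMT part is mine and should be checked against the real topology:

```python
import os

def pin_to_first_ccx(pid=0, cores_per_ccx=4):
    """Restrict a process to the logical CPUs of the first CCX.

    Assumption: on an 8-core Ryzen with SMT, logical CPUs 0-7 belong to
    CCX0 (4 cores, 2 threads each). pid=0 means the calling process.
    """
    available = os.sched_getaffinity(pid)
    # Keep only the logical CPUs assumed to belong to the first CCX.
    first_ccx = {cpu for cpu in available if cpu < cores_per_ccx * 2}
    os.sched_setaffinity(pid, first_ccx)
    return first_ccx
```

Run a problematic game under this restriction and compare frame rates with the unrestricted case; if cross-CCX migration is the culprit, the pinned run should improve.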
 

french toast

Senior member
Feb 22, 2017
978
803
106
I game at 3440x1440@100Hz. The Ryzen CPU failed to break 100fps in some games where the Intel chips did, and some of those results were really terrible. I don't care about 200fps vs 500fps, but I do care when Ryzen can't even feed a GPU enough to max out a 100Hz monitor. I'm planning on a 1080 Ti, and I don't want 1070 performance from it because my CPU can't keep up. I'm not alone here; a lot of people are kind of freaking out about this.
That's fair enough, I can see your point in that case, but honestly I wouldn't worry about it. A Gigabyte board with a different BIOS shows Ryzen at 3.9GHz trading blows with a 7700K at 5GHz; I don't think that is a coincidence.
 

Dygaza

Member
Oct 16, 2015
176
34
101

1. Windows is load-balancing across CCXes.

This means that a thread is being moved around on the CPU - which is normal - so that a single core isn't used more than others. On Ryzen, that needs to happen ONLY within a CCX, otherwise you will incur a massive penalty when that thread no longer finds its data in the caches of the CCX.

2. SMT hurts single-threaded performance due to shared structures.
Ryzen statically partitions three structures to support SMT:
a. Micro-op queue (dispatcher)
b. Retirement queue
c. Store queue

This means that, with SMT enabled, these resources are potentially cut in HALF (mind you, these are just queues that affect the throughput of a single thread).
This can easily be tested with Process Hacker: just give each game thread a pre-defined core affinity and see if it really makes a difference.
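On Linux the same per-thread pinning experiment can be sketched in a few lines. `run_pinned` is a hypothetical helper, not Process Hacker's actual mechanism, and it assumes Python 3.8+ for `threading.get_native_id()`:

```python
import os
import threading

def run_pinned(target, cores):
    """Run target(core) on one thread per core, each pinned to that core.

    Sketch of the affinity experiment: pinning each worker thread to a
    fixed core stops the scheduler from migrating it (e.g. across a CCX
    boundary). On Linux, sched_setaffinity accepts a thread id as "pid".
    """
    results = {}

    def pinned(core):
        # Pin the current OS thread to a single core before doing work.
        os.sched_setaffinity(threading.get_native_id(), {core})
        results[core] = target(core)

    threads = [threading.Thread(target=pinned, args=(c,)) for c in cores]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

If per-thread pinning inside one CCX recovers the lost performance, that points at cross-CCX migration rather than the core itself.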
 
  • Like
Reactions: looncraz

AtenRa

Lifer
Feb 2, 2009
13,548
2,522
126
I game at 3440x1440@100Hz. The Ryzen CPU failed to break 100fps in some games where the Intel chips did, and some of those results were really terrible. I don't care about 200fps vs 500fps, but I do care when Ryzen can't even feed a GPU enough to max out a 100Hz monitor. I'm planning on a 1080 Ti, and I don't want 1070 performance from it because my CPU can't keep up. I'm not alone here; a lot of people are kind of freaking out about this.
You are 99% GPU-limited at that resolution; I don't believe you will see a huge performance difference from one CPU to another.
 

hotstocks

Member
Jun 20, 2008
81
26
91
If you are only getting 47fps with a 1080 at 1080p Ultra, that tells me you are bottlenecked somewhere; I'm certain an R7 1700 @ 3.9GHz would murder it even without any updates.
No, my system is fast as hell. You obviously have not played The Witcher 3 at 1080p Ultra settings in a town. The game is gorgeous, and 47fps is about all you can get at times. My system pegs DOOM at 200fps constantly, so no, a beautiful engine like The Witcher's (and those in games to come) is still GPU-limited even with an Nvidia 1080 at 1080p when using Ultra settings.
 

krumme

Diamond Member
Oct 9, 2009
5,898
1,524
136
The only problem I have with the CPU is the nonexistent OC headroom. Otherwise the CPU is great and will have a huge impact on the server market in the coming months.
A 1700 OCs 25% to 32%.
It costs $320.
That's an insane amount of computing power for a low cost.
It's fast where it counts.
If this is not an enthusiast CPU, I don't know what is.
 

zinfamous

No Lifer
Jul 12, 2006
103,512
18,067
136
I game at 3440x1440@100Hz. The Ryzen CPU failed to break 100fps in some games where the Intel chips did, and some of those results were really terrible. I don't care about 200fps vs 500fps, but I do care when Ryzen can't even feed a GPU enough to max out a 100Hz monitor. I'm planning on a 1080 Ti, and I don't want 1070 performance from it because my CPU can't keep up. I'm not alone here; a lot of people are kind of freaking out about this.
...then why are you basing all of your angst on 1080p/60Hz benchmarks?

You spent an awful lot of money on a fancy name sign to hang over your desk to fail so hard at understanding benchmarks and new-tech growing pains, bro. :p
 
Mar 10, 2006
11,719
2,003
126
I game at 3440x1440@100Hz. The Ryzen CPU failed to break 100fps in some games where the Intel chips did, and some of those results were really terrible. I don't care about 200fps vs 500fps, but I do care when Ryzen can't even feed a GPU enough to max out a 100Hz monitor. I'm planning on a 1080 Ti, and I don't want 1070 performance from it because my CPU can't keep up. I'm not alone here; a lot of people are kind of freaking out about this.
Buy a 7700K. Seriously.
 
  • Like
Reactions: lightmanek

AtenRa

Lifer
Feb 2, 2009
13,548
2,522
126
A 1700 OCs 25% to 32%.
It costs $320.
That's an insane amount of computing power for a low cost.
It's fast where it counts.
If this is not an enthusiast CPU, I don't know what is.
It OCs 25-30% only because it has a low base clock of 3.0GHz; I was hoping for 4.4 to 4.5GHz. Has anyone tried OCing with SMT disabled?
 
  • Like
Reactions: Drazick
Mar 10, 2006
11,719
2,003
126
...then why are you basing all of your angst on 1080p/60Hz benchmarks?

You spent an awful lot of money on a fancy name sign to hang over your desk to fail so hard at understanding benchmarks and new-tech growing pains, bro. :p
Uh, he's not "failing so hard" in his understanding. He's got a 100Hz monitor and he'll have a 1080 Ti, which can push 100fps at his resolution, so having a CPU that can actually keep up is desirable.

If I were him, I'd cancel the Ryzen order, buy a 7700K and a Z270 board to go with the 1080 Ti, and call it a day. Or hold out for Skylake-X.
 
  • Like
Reactions: Conroe

Crumpet

Senior member
Jan 15, 2017
745
538
96
Uh, he's not "failing so hard" in his understanding. He's got a 100Hz monitor and he'll have a 1080 Ti, which can push 100fps at his resolution, so having a CPU that can actually keep up is desirable.

If I were him, I'd cancel the Ryzen order, buy a 7700K and a Z270 board to go with the 1080 Ti, and call it a day. Or hold out for Skylake-X.
At 4K there is no difference between a 7700K and Ryzen, except he'd have four extra cores to help as games develop.
 

Rifter

Lifer
Oct 9, 1999
11,518
745
126
I mean, if we are going to talk about the most-used applications, aren't we talking about office productivity and things like opening PDFs and web pages? And aren't those things that KL also does much better?

The suggestion that KL only wins if all you care about is gaming is clearly false. For the vast majority of users, KL is going to be faster in their day-to-day tasks. But a large segment of people need the kind of horsepower that Ryzen provides, and now they can get it for a fraction of what Intel was charging. We all win!
That depends: anything threaded for more than 4 cores is faster on Zen than on KL. I was referring to productivity/content-creation applications, such as would be used in a professional/office environment, which is likely the most common use of PCs. Everyone I know who works in an office has a PC there to do their job; most don't have PCs at home anymore. Productivity/professional use is likely the most common application for PCs, not gaming. And now you can build a high-end Zen workstation (the whole system) for less than the cost of just the Intel HEDT processor itself. That is a huge cost advantage for anyone who can use the cores.

Like I said above, if you only game (or Facebook, surf the net, etc.), then KL is likely better for you, but if you use any applications that scale past 4 cores, or use your PC for productivity/content creation/professional work, then Zen is a huge cost savings over Intel HEDT.
 
  • Like
Reactions: RussianSensation

french toast

Senior member
Feb 22, 2017
978
803
106
No, my system is fast as hell. You obviously have not played The Witcher 3 at 1080p Ultra settings in a town. The game is gorgeous, and 47fps is about all you can get at times. My system pegs DOOM at 200fps constantly, so no, a beautiful engine like The Witcher's (and those in games to come) is still GPU-limited even with an Nvidia 1080 at 1080p when using Ultra settings.
Sounds like a CPU bottleneck to me, probably combined with memory bandwidth. I'm not saying it is slow, but a GTX 1080 is a beast; hell, a PS4 pulls 1080p medium at 30fps in Novigrad.
 
Last edited:
  • Like
Reactions: IEC

raghu78

Diamond Member
Aug 23, 2012
4,093
1,474
136
I would like to see AMD improve Zen+ in quite a few aspects:
1. Memory latency.
2. Inter-CCX bandwidth.
3. SMT resource allocation in a manner that does not adversely affect single-thread performance.

I also hope AMD and GF tweak the 14nm process and improve the physical design of Zen+ to run at 4.5+ GHz. Anyway, Zen is a decent foundation to build on for the next 4 years.

Sent from my SM-G935V using Tapatalk
 

krumme

Diamond Member
Oct 9, 2009
5,898
1,524
136
It OCs 25-30% only because it has a low base clock of 3.0GHz; I was hoping for 4.4 to 4.5GHz. Has anyone tried OCing with SMT disabled?
I am sure many of you were hoping for 4.5GHz.
But a Broadwell-E 8-core draws 230W at 4.2GHz, even excluding the heavy AVX stuff.
So you either expected a 4.5GHz toaster at 290W, or some miracle that bends the laws of physics.
It's still a 14nm processor.

Time to get back to reality and take a pause from the hype train, guys.
We need a break.
 

french toast

Senior member
Feb 22, 2017
978
803
106
I am sure many of you were hoping for 4.5GHz.
But a Broadwell-E 8-core draws 230W at 4.2GHz, even excluding the heavy AVX stuff.
So you either expected a 4.5GHz toaster at 290W, or some miracle that bends the laws of physics.
It's still a 14nm processor.

Time to get back to reality and take a pause from the hype train, guys.
We need a break.
Arise sir krumme....
 

Majcric

Golden Member
May 3, 2011
1,345
21
81
You are 99% GPU-limited at that resolution; I don't believe you will see a huge performance difference from one CPU to another.
1440p is still a CPU-dependent resolution. With Titan X(P)-like performance, he will need all the CPU that can be thrown at it.
 
