AMD Ryzen (Summit Ridge) Benchmarks Thread (use new thread)


swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
You'll need to spoon-feed me with that link. The top performer is a 4-core part. But that is a far newer part. So, looking at the next two performers, as they were both the same generation, the 6-core is basically identical to the 4-core. Again, I just don't see that as proving that we need 16 threads to game.

So, the processor with more cores was slower, and triple the price, so I'm supposed to be convinced to buy the processor with more cores to game?
Sigh. This generation's technology conservative.

You realize the exact same argument has played out word for word when we moved from single to dual cores, dual to quad, and now quad to octo-core? Yes, most games don't care whether it's 4 or 8 cores. But more and more do, and most people expect to keep their CPU for at least 5 years. Can you promise to know the future 5 years from now? Any smart shopper will reach for at least 6 cores if they are building a gaming rig.
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
832
136
You know I typed out "8", then went back and changed it to 6 for the more conservative people. Personally I'm not upgrading from a 2500K for less than 8 cores.
If you upgrade from a 2500K to a 6 core with HT, you also go from 4 threads to 12 threads. ;)
 

Magic Hate Ball

Senior member
Feb 2, 2017
290
250
96
Sigh. This generation's technology conservative.

You realize the exact same argument has played out word for word when we moved from single to dual cores, dual to quad, and now quad to octo-core? Yes, most games don't care whether it's 4 or 8 cores. But more and more do, and most people expect to keep their CPU for at least 5 years. Can you promise to know the future 5 years from now? Any smart shopper will reach for at least 6 cores if they are building a gaming rig.

That's what I'm saying to him too.

I was playing BF1 today on a 64-player map. My core utilization on my 8 logical cores (i7 4770K @ 4.3GHz) hit 90-100% very often for all 8 threads. We're already pushing the limits in multiplayer especially.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
Any smart shopper will reach for at least 6 cores if they are building a gaming rig.

No way. 7700k is faster in games vs the entire HEDT lineup, and HEDT prices are all jacked up due to the outrageous motherboard pricing. Going >4 cores is simply an absurd waste of money, unless you're buying off-lease, in which case it's actually a great deal. And that in itself only illustrates how broken the HEDT market is.
 

DeletedMember377562

No way. 7700k is faster in games vs the entire HEDT lineup, and HEDT prices are all jacked up due to the outrageous motherboard pricing. Going >4 cores is simply an absurd waste of money, unless you're buying off-lease, in which case it's actually a great deal. And that in itself only illustrates how broken the HEDT market is.

There are several faults in your claim:

1. The 7700K isn't better because it is a 4-core. It's better because of the way Intel has segmented the market: their HEDT parts are on a 1-2 generation older architecture. It's because of architecture and architecture alone that their 4 cores are better. A 6700K doesn't beat the 5820K and 5960X by ~5% in CPU-heavy games because it is a 4-core. It beats them by ~5% (as proven by DF) because Skylake has around 4-5% better IPC than Haswell.

2. 8 true cores are better than 4 cores and 8 threads. As are 8 cores and 16 threads. Of course you can argue that the cases are so few that it's stupid to even care if you don't do CAD work, rendering, streaming and such things. But there are those few cases, and ignoring them is not fair. Take for example Battlefield 1. Even at 1440p, there is a difference between the 6900K and the 6700K in MP: http://techbuyersguru.com/battlefie...700k-6900k-and-r9-270x-through-titan-x?page=1

I think the same is the case with Watch Dogs 2.

Also, not recommending 8 cores has happened for a good reason. When Intel prices their HEDT parts as insanely as they do, not many people buy them. When few buy them, the incentive for developers to make their games depend on much more than a few cores is small. But if AMD can release 6c/12t and 8c/16t parts for relatively good prices, it changes the whole picture.

We also already know that Ryzen has problems increasing frequency because of yield issues. If their 4c/8t can't reach any higher clock speeds than their 8c/16t, then their single-core performance will stay the same, so more cores and threads in Ryzen's case will only give additional performance where a game makes use of them. It's different with Intel, where both clock speeds and architecture differ. Of course you could argue those extra cores/threads mostly go to waste. But for people who play particular games, like BF1 for me, they won't. Not to mention that AMD bringing 8c/16t into the mainstream, as well as supplying consoles with 8-core chips, plus DX12 becoming more prevalent, will make more cores a good thing to have for the future. Just as BF1 is more CPU-heavy than BF4, I expect the next BF in two years' time to be even more CPU-heavy than BF1.


That's what I'm saying to him too.

I was playing BF1 today on a 64-player map. My core utilization on my 8 logical cores (i7 4770K @ 4.3GHz) hit 90-100% very often for all 8 threads. We're already pushing the limits in multiplayer especially.


That's strange. As my link above shows, you can get a bit more FPS with more cores in BF1 MP. But with my OCed 2600K, I get usage of 80%. With the 6700K at stock speeds I topped out at 70% usage across all cores in CPU-heavy moments -- as has been the case with the 4790K and 7700K too, according to friends and videos on YouTube.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
2. 8 true cores are better than 4 cores and 8 threads. As are 8 cores and 16 threads. Of course you can argue that the cases are so few that it's stupid to even care if you don't do CAD work, rendering, streaming and such things. But there are those few cases, and ignoring them is not fair. Take for example Battlefield 1. Even at 1440p, there is a difference between the 6900K and the 6700K in MP: http://techbuyersguru.com/battlefie...700k-6900k-and-r9-270x-through-titan-x?page=1

You're talking about relatively insignificant gains (5% @ 4K) given the massive platform cost difference. You're way better off putting that money into better cooling (for a higher overclock) or a better GPU. In that one cherry-picked case, the GPU is already top of the line, so I guess maybe you could upgrade the cooling on that too and get better overall perf/$? I don't know, because I have no intention of buying a Titan XP lol, let alone buying exotic cooling for it. But I bet if you did, the 7700k system would end up faster for less money.

As for streaming, no one streams 4K, so there is still no real need for cores for that purpose. If you set your bitrate too high, people won't even watch your stream because it buffers too much. 720p30 is still the best choice for streaming, and that takes like half a core, including running the webcam.
 

DeletedMember377562

You're talking about relatively insignificant gains (5% @ 4K) given the massive platform cost difference. You're way better off putting that money into better cooling (for a higher overclock) or a better GPU. In that one cherry-picked case, the GPU is already top of the line, so I guess maybe you could upgrade the cooling on that too and get better overall perf/$? I don't know, because I have no intention of buying a Titan XP lol, let alone buying exotic cooling for it. But I bet if you did, the 7700k system would end up faster for less money.

As for streaming, no one streams 4K, so there is still no real need for cores for that purpose. If you set your bitrate too high, people won't even watch your stream because it buffers too much. 720p30 is still the best choice for streaming, and that takes like half a core, including running the webcam.

Dude, are you even looking at the benchmarks? Look at the 1440p MP chart, where the gains are pretty big. The 6900K is around 10% better on average at minimum and average FPS. And that's taking Broadwell's ~3% lower IPC into account. The 1080p chart shows even bigger differences, but I decided not to include it, as only idiots run GPUs that powerful at 1080p.

Calling this a "cherry-picked case" is incredibly biased. BF1 is one of the most popular FPS games out there. How is it even "cherry picked"? Also, check CPU benchmarks of the game on Gamers Nexus, or check Watch Dogs 2 CPU benchmarks. They all show gains for CPUs with more than 4 cores. And this is today -- the differences will be even more evident tomorrow. Especially when AMD moves 8-core chips into the mainstream.

And once again, I talked about how this would become an alternative because of Ryzen. I never argued in favor of the 6900K, which goes for $1100. I argued in favor of the Ryzen 8c/16t, which will probably debut at less than half the price of the 6900K and be a viable alternative for the mainstream market. I mean, the current top-of-the-line 4c/8t chip from Intel goes for an insane $320. If AMD releases an 8c/16t for $400, would you not recommend that people get it?

As for the Titan XP: the GTX 1070 performs as well as a 980 Ti did a generation ago. I'm sure the performance a Titan XP has will be found in pretty affordable GPUs in the future (Pascal refresh, Volta, etc.).
 

Magic Hate Ball

Senior member
Feb 2, 2017
290
250
96
That's strange. As my link above shows, you can get a bit more FPS with more cores in BF1 MP. But with my OCed 2600K, I get usage of 80%. With the 6700K at stock speeds I topped out at 70% usage across all cores in CPU-heavy moments -- as has been the case with the 4790K and 7700K too, according to friends and videos on YouTube.

It's hard to capture but there are spikes to 90%-100% on my CPU:
[screenshot: per-core CPU utilization with spikes to 90-100%]


If I lowered certain graphics settings I'm sure my CPU instead of my GPU would be the true bottleneck.

Also, a CPU doesn't need to be at 100% usage to be limiting performance, especially when you're aiming for higher FPS.
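
To put a toy number on that last point (everything below is made up for illustration, not measured from BF1): if one main thread is saturated while the other logical cores sit partly idle, the averaged utilization figure stays well under 100% even though frame rate is already capped by that single thread.

Code:
# Toy sketch: one saturated "main" thread caps frame rate even though the
# average across 8 logical cores looks far below 100%. All numbers invented.
main_thread_busy = 1.00                                   # main thread pegged every frame
worker_busy = [0.45, 0.40, 0.35, 0.30, 0.25, 0.20, 0.15]  # the other 7 logical cores

avg_utilization = (main_thread_busy + sum(worker_busy)) / 8
print(f"Average CPU utilization: {avg_utilization:.0%}")  # ~39%

# Frame time is set by the busiest serial thread, not by the average:
main_thread_ms = 16.7                                     # hypothetical ms of main-thread work per frame
print(f"FPS cap from the main thread alone: {1000 / main_thread_ms:.0f}")  # ~60 FPS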
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
No way. 7700k is faster in games vs the entire HEDT lineup, and HEDT prices are all jacked up due to the outrageous motherboard pricing. Going >4 cores is simply an absurd waste of money, unless you're buying off-lease, in which case it's actually a great deal. And that in itself only illustrates how broken the HEDT market is.

First of all, we are discussing this in a thread dealing with Ryzen. Keeping that in mind:

No way. 7700k is faster in games vs the entire HEDT lineup.

Intel artificially constricts their HEDT lineup to a generation-old architecture. A lot of the 7700K's wins are due to it benefiting from a newer uarch. AMD won't be doing this with Ryzen.

HEDT prices are all jacked up due to the outrageous motherboard pricing.

Again this is all Intel's doing. AMD won't be doing this either. This doesn't apply when deciding whether to get an 8-core Ryzen or a 4-core. Platform costs will remain constant.

Going >4 cores is simply an absurd waste of money....And that in itself only illustrates how broken the HEDT market is.

See, the thing is, Intel has done SUCH a wonderfully masterful job of "creating" the HEDT market that people literally can't even think outside of this model now. There are exactly zero reasons why you shouldn't be able to max out cores on cheap chipsets! Welcome to renewed competition! Forget about "HEDT" systems.
 

Dave2150

Senior member
Jan 20, 2015
639
178
116
It's hard to capture but there are spikes to 90%-100% on my CPU:
[screenshot: per-core CPU utilization with spikes to 90-100%]


If I lowered certain graphics settings I'm sure my CPU instead of my GPU would be the true bottleneck.

Also, a CPU doesn't need to be at 100% usage to be limiting performance, especially when you're aiming for higher FPS.

You're using a rather dated Z97 Haswell setup at a low clock rate of 4.3GHz though; it cannot compare with a Skylake/Kabylake at 4.8-5.1GHz.
 

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
You're using a rather dated Z97 Haswell setup at a low clock rate of 4.3GHz though; it cannot compare with a Skylake/Kabylake at 4.8-5.1GHz.

Which is great until a 4.8GHz Kabylake isn't fast enough either. This is literally a repeat of the single vs dual core and dual vs quad core argument. Each and every time, moar cores has proven to age better and give a better return on investment over time.

AMD will eventually be popping Zen cores into consoles, and when that happens we'll be locked into needing moar cores.

I have a 4.5ghz Skylake 6700k. I'm a fan of Intel but not a fan of them sitting on their laurels due to lack of competition. I welcome Zen and hope that AMD succeeds spectacularly.
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
832
136
Which is great until a 4.8GHz Kabylake isn't fast enough either. This is literally a repeat of the single vs dual core and dual vs quad core argument. Each and every time, moar cores has proven to age better and give a better return on investment over time.

AMD will eventually be popping Zen cores into consoles, and when that happens we'll be locked into needing moar cores.
You seem to forget that much code is just serial in nature and likely always will be.

This is far and away the biggest reason we see so few games that benefit from more cores than 4.

Intel and AMD could double core count every two years, but that doesn't mean in 6 years time most games will run better on a 32 core CPU than on an 8 core, especially with a likely speed deficit doing the 32 core no favours.

Moar cores is what we will get because process technology advances are slowing down dramatically and the CPU companies have to offer something that looks like a better CPU.
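
That serial-code point is essentially Amdahl's law. A rough sketch of it (the parallel fractions below are illustrative guesses, not measurements from any game engine):

Code:
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the parallel
# fraction of the work and n is the core count. Values of p are hypothetical.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.50, 0.75, 0.90):
    print(f"parallel fraction {p:.0%}: "
          f"8 cores -> {amdahl_speedup(p, 8):.2f}x, "
          f"32 cores -> {amdahl_speedup(p, 32):.2f}x")
# With 75% parallel work, 32 cores manage only ~3.7x vs ~2.9x for 8 cores,
# before accounting for the clock-speed deficit big core counts usually carry.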
 

beginner99

Diamond Member
Jun 2, 2009
5,208
1,580
136
That's what I'm saying to him too.

I was playing BF1 today on a 64-player map. My core utilization on my 8 logical cores (i7 4770K @ 4.3GHz) hit 90-100% very often for all 8 threads. We're already pushing the limits in multiplayer especially.

Exactly. Battlefield 64-player maps are notoriously CPU limited. It seems most people here forgot what Mantle did in BF4: it greatly helped frametimes, machines with old CPUs, and CrossFire setups. Phenom IIs, as far as I remember, almost got double the FPS with Mantle.

No way. 7700k is faster in games vs the entire HEDT lineup, and HEDT prices are all jacked up due to the outrageous motherboard pricing. Going >4 cores is simply an absurd waste of money, unless you're buying off-lease, in which case it's actually a great deal. And that in itself only illustrates how broken the HEDT market is.

Maybe if all you do is play StarCraft or single-player games, then yes, a quad-core will be good enough for some time. The point you are missing is that with Zen you will be able to buy an octo-core without an expensive mobo and quad-channel RAM. Now you are paying about $150 more than a 7700K for an octo-core with similar IPC (assuming $500 for the 8c/16t Ryzen and Intel not lowering prices; they never have).
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
I have found a link to a PDF explaining AVFS, behind a paywall, on IEEE Xplore. Since I am a researcher, my organization has access to this article and I read it.

In short, for Carrizo (and almost surely for all subsequent products) there are 10 units spread across the chip, each of which has 50 replicas of critical-path circuits (for a total of 500), and a complex circuit collects statistics on their delay at various Vcore levels to calculate Fmax at those voltages. These statistics are gathered on demand by microcode or the SMU and make it possible to know precisely what Vcore to apply for a given frequency. Obviously in Zen this will also be useful to calculate Fmax given the current conditions (temperature, chip age, and so on)...


That would be neat: small replicas of the CPU's critical paths running alongside the core itself but at slightly reduced voltage, with some sort of corruption detection that notices when they fail and determines a safe voltage offset for the actual core.

Sadly, I think it will just be an extended frequency/voltage table that goes slightly (depending on the BIOS) above TDP. If the cooling (and mobo power capacity) is there, the CPU will work out of spec. Something along the lines of an aggressive turbo boost, a la burst mode on mobile.
 

bjt2

Senior member
Sep 11, 2016
784
180
86
That would be neat: small replicas of the CPU's critical paths running alongside the core itself but at slightly reduced voltage, with some sort of corruption detection that notices when they fail and determines a safe voltage offset for the actual core.

Sadly, I think it will just be an extended frequency/voltage table that goes slightly (depending on the BIOS) above TDP. If the cooling (and mobo power capacity) is there, the CPU will work out of spec. Something along the lines of an aggressive turbo boost, a la burst mode on mobile.

The paper is this: http://ieeexplore.ieee.org/document/7062937/ , but without access, the abstract does not say much. This is another interesting extract: "Robust sampling of the chip-wide critical path variations is provided by 10 dispersed instances of the CPA. Each CPA exercises 50 critical paths for a total of 500 paths (300 gate-dominated, 100 wire-dominated and 100 macro replica paths). AVFS extracts Gaussian distribution statistics for the paths and infers the timing margin for the actual core path using sampling statistics (Fig. 4.8.5). Gates, wires, and macros are treated separately to differentiate the distribution, and appropriate guard-bands added for sampling uncertainty. Timing-margin prediction vs. actual timing margin indicates the ability of AVFS to set the minimum voltage required across the entire voltage range, resulting in up to 30% power savings."

Probably for Zen the test is extended to higher voltage, temperature and TDP ranges, maybe under the control of the BIOS or AMD OC software...
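
For the curious, here is a toy numerical sketch of the flow that extract describes: sample the replica paths, fit Gaussian statistics per group (gates, wires, macros), add a guard-band for sampling uncertainty, and pick the minimum Vcore that still meets the target clock. Only the path counts and that overall flow come from the extract; the linear delay-vs-voltage model and every constant below are invented for illustration.

Code:
# Toy model of the AVFS flow from the extract above. Path counts (300/100/100)
# follow the paper; the delay model and all constants are made up.
import random, statistics

PATH_GROUPS = {"gate": 300, "wire": 100, "macro": 100}    # 500 replica paths total

def sample_path_delays_ns(vcore, group, n):
    # Hypothetical model: delay shrinks as voltage rises, with per-group spread.
    base  = {"gate": 0.25, "wire": 0.27, "macro": 0.26}[group]   # ns * V (invented)
    sigma = {"gate": 0.005, "wire": 0.009, "macro": 0.007}[group]
    return [random.gauss(base / vcore, sigma) for _ in range(n)]

def worst_case_delay_ns(vcore, k_sigma=3.0):
    # Gates, wires and macros are treated separately; take the worst guard-banded group.
    worst = 0.0
    for group, n in PATH_GROUPS.items():
        delays = sample_path_delays_ns(vcore, group, n)
        mu, sd = statistics.mean(delays), statistics.stdev(delays)
        guard = k_sigma * sd + sd / n ** 0.5              # spread plus sampling uncertainty
        worst = max(worst, mu + guard)
    return worst

def min_vcore_for(freq_ghz):
    cycle_ns, v = 1.0 / freq_ghz, 0.70
    while worst_case_delay_ns(v) > cycle_ns:              # sweep Vcore upward in small steps
        v += 0.00625
    return v

random.seed(1)
for f in (3.0, 3.5, 4.0):
    print(f"{f:.1f} GHz -> estimated minimum Vcore ~ {min_vcore_for(f):.3f} V")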
 

DrMrLordX

Lifer
Apr 27, 2000
21,583
10,785
136
Would we be able to get a worthwhile manual overclock above the XFR auto-overclock? If we can only get 100MHz, for example, it wouldn't be worth the effort.

I'm guessing not, though I could be wrong. We'll have to see how aggressive it is. But consider this: XFR might be able to momentarily boost clocks above stable levels every few minutes, and sustain those clocks for periods of 5-7 minutes before being pushed back down by thermals. I've had plenty of marginal overclocks where the chip could do Prime95 (or whatever) for 10-15 minutes but not an hour, much less 24 hours. XFR could theoretically take clocks that high (or higher) and then drop them back down before thermals would indicate upcoming instability.
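
To illustrate why a boost like that might settle into a "few minutes on, few minutes off" pattern, here is a toy first-order thermal model. Nothing in it comes from AMD; the power draws, thermal constants and temperature limits are all invented, and real XFR behaviour may look nothing like this.

Code:
# Toy thermal model: hold a boost clock until the package crosses a temperature
# limit, fall back to base, and re-boost once it cools. All constants invented.
AMBIENT_C  = 25.0
THERMAL_R  = 0.45                     # K per W, hypothetical cooler
THERMAL_C  = 900.0                    # J per K, hypothetical heatsink thermal mass
T_LIMIT_C  = 80.0                     # back off above this
T_RESUME_C = 70.0                     # boost again below this
POWER_W    = {"base": 95.0, "boost": 140.0}   # made-up package power at each clock

temp_c, state, boost_s, dt = 45.0, "boost", 0.0, 1.0
for t in range(30 * 60):              # simulate 30 minutes in 1 s steps
    # dT/dt = (P - (T - T_ambient) / R) / C
    temp_c += dt * (POWER_W[state] - (temp_c - AMBIENT_C) / THERMAL_R) / THERMAL_C
    if state == "boost":
        boost_s += dt
        if temp_c >= T_LIMIT_C:
            print(f"{t / 60:5.1f} min: hit {temp_c:.0f} C, dropping out of boost")
            state = "base"
    elif temp_c <= T_RESUME_C:
        print(f"{t / 60:5.1f} min: cooled to {temp_c:.0f} C, boosting again")
        state = "boost"
print(f"Boost residency over 30 minutes: {boost_s / (30 * 60):.0%}")

With these made-up constants, a warmed-up boost stretch lasts roughly five minutes before thermals pull it back, which is the sort of on/off cadence described above.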

No way. 7700k is faster in games vs the entire HEDT lineup.

Ehhh, I disagree. The 6950X and 5960X win some benches just from the L3 alone.
 

dullard

Elite Member
May 21, 2001
24,998
3,327
126
Sigh. This generation's technology conservative.

You realize the exact same argument has played out word for word when we moved from single to dual cores, dual to quad, and now quad to octo-core? Yes, most games don't care whether it's 4 or 8 cores. But more and more do, and most people expect to keep their CPU for at least 5 years. Can you promise to know the future 5 years from now? Any smart shopper will reach for at least 6 cores if they are building a gaming rig.
Quad cores have been around since 2009 and only now (8 years later) are they beginning to be maxed out in games (and like I said above, that is due to not enough cache and not due to more cores). The transition to 8 cores being really needed in games probably won't happen for another 7+ years either. You are arguing to buy an expensive HEDT chip that we don't need now and that there is no evidence we'll need at any point in its typical lifespan. Your point isn't incorrect, but you are demanding more cores several years too soon. Anyone with a 7700k now will be just fine in games 5 years from now as long as they didn't gimp their gaming system in some other way (like not having a good enough GPU, or having insufficient or slow memory). Yes, I can promise that.
 

CatMerc

Golden Member
Jul 16, 2016
1,114
1,149
136
An Intel Lynnfield 4C is 296mm^2 - Dual channel IMC
Sandy Bridge-E is 435mm^2 - Quad channel IMC
Haswell-E 8C is 356mm^2 - Quad channel IMC
Broadwell-E 10C is 246mm^2 - Quad channel IMC
Ryzen 8C is ~190mm^2 - Dual channel IMC

Now tell me, is there any reason we need to pay $1700 for a 10C outside of market segmentation and lack of competition?
Ryzen can break the core stagnation HARD. Intel has been pushing 4C's for so long that it left AMD massive room for undercutting while still having massive margins.

Expect 6C to become the new 4C consumer favorites.
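
As a quick back-of-the-envelope on those numbers (die sizes and core counts as quoted above, Ryzen's being an estimate; Sandy Bridge-E is left out since its core count isn't listed):

Code:
# mm^2 per core from the figures quoted above; nothing here is newly measured.
dies = {
    #  name               (cores, die mm^2, memory channels)
    "Lynnfield 4C":       (4, 296, 2),
    "Haswell-E 8C":       (8, 356, 4),
    "Broadwell-E 10C":    (10, 246, 4),
    "Ryzen 8C (est.)":    (8, 190, 2),
}
for name, (cores, area_mm2, channels) in dies.items():
    print(f"{name:16s}: {area_mm2 / cores:5.1f} mm^2 per core, {channels}-channel IMC")

By area per core, the estimated Ryzen die sits in Broadwell-E territory while keeping a mainstream dual-channel platform, which is the undercutting room the post is pointing at.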
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
But the 7700K is under $350.00, so are you really winning with the more expensive chips? Would you pay north of $500 to beat the 7700K by 5-10% in games?
Clearly some gamers would, but they are probably not the majority.
 

Arzachel

Senior member
Apr 7, 2011
903
76
91
But the 7700K is under $350.00, so are you really winning with the more expensive chips? Would you pay north of $500 to beat the 7700K by 5-10% in games?
Clearly some gamers would, but they are probably not the majority.

The 6 core Ryzen is likely to retail around the same price as the 7700K though.
 

CatMerc

Golden Member
Jul 16, 2016
1,114
1,149
136
But the 7700K is under $350.00, so are you really winning with the more expensive chips? Would you pay north of $500 to beat the 7700K by 5-10% in games?
Clearly some gamers would, but they are probably not the majority.
Would you buy a 7700K for $350 when a 6C Ryzen is available for $350 and performs within 10% worst case?
 