News: AMD previews Ryzen 3rd generation at CES

Hitman928

Golden Member
Apr 15, 2012
#1
More info to come...
 
Apr 27, 2000
I mean, according to AMD the first Zen was already supposed to blow Intel out of the water; they made it look sooooooo gooooood against Intel HEDT in their presentation.
It was "so good" against Intel HEDT in the real world: the $999 6900K got walked on price/performance.

I think TheELF was simply pointing out that we were shown Cinebench only, and it is reasonable to assume that AMD chose the best-case scenario for their chip.
CBR15 was the first head-turning benchmark for Zen. The next one was Blender. So I really would like them to bench it using the same Blender demo they used for Zen. There's a history there, and it's important for us to know how much things have changed between generations of the Zen uarch. Plus Blender can be compiled to use AVX2 (if only The_Stilt were still here).
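For context on why the AVX2 point matters, here's a minimal sketch in C, not actual Blender source (the function name and constants are illustrative): a compiler targeting AVX2 can process 8 single-precision floats per 256-bit vector instruction instead of 4 with baseline SSE2, which is exactly the kind of win a per-pixel render loop sees.

```c
/* Illustrative only -- not Blender code. Compile this translation unit
 * with and without AVX2 to compare the generated vector width:
 *   gcc -O3 -c shade.c          (baseline x86-64: 128-bit SSE2)
 *   gcc -O3 -mavx2 -c shade.c   (256-bit AVX2)                    */
#include <stddef.h>

void shade(float *restrict out, const float *restrict albedo,
           const float *restrict light, size_t n)
{
    /* Simple per-element multiply-add with no loop-carried dependency:
     * exactly the shape auto-vectorizers handle well. */
    for (size_t i = 0; i < n; i++)
        out[i] = albedo[i] * light[i] + 0.05f;
}
```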

But a proper (CES-spirited) test for the new chip would have been some CPU-limited gaming scenario; AMD has just moved the I/O and memory controller further away from the cores, and that's where THE real questions are.
We didn't get game benches out of them when they first demoed Zen, so... are you surprised? Also, I think it's funny that people actually think an FP-friendly benchmark is now a "best case" scenario for AMD processors. My, how times have changed.
 

EXCellR8

Platinum Member
Sep 1, 2010
#14
I can totally live with a mid 2019 release... gives me a chance to stash away a bit more funds and allows my 1700 to have one last good hurrah. Seems like the PCIe 4.0 support might prompt a board upgrade though.
 
Sep 18, 2011
#74
imho, I would rather get a μRipper.

A 64-core Jaguar, that is, split between two 7nm logic chiplets and sharing an I/O die. Yeah, no machine can rival the low-cost, easily ported Jaguar!
 
Apr 7, 2011
#78
But a proper (CES-spirited) test for the new chip would have been some CPU-limited gaming scenario; AMD has just moved the I/O and memory controller further away from the cores, and that's where THE real questions are.
Cinebench mimics a real-world workload and scales well enough that the results can be extrapolated to other CPU-heavy tasks pretty reliably. It's also well known and has a massive database of results. You propose they instead use a CPU-limited gaming scenario where the results are by nature going to be extremely application-specific, a huge pain to verify, and relevant only to a very small subset of consumers.

Might as well ask them for javascript benches ;)
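To make the scaling half of that argument concrete, here is a minimal sketch (a hypothetical toy workload, not Cinebench's actual tile renderer) of the embarrassingly parallel pattern such benchmarks exercise. Because the tiles share no state, throughput tracks core count until power or memory bandwidth limits intervene, which is why the results extrapolate to other parallel CPU-heavy tasks.

```c
/* Toy "render" that splits independent work across N threads, the same
 * structural pattern a tile-based renderer uses. Illustrative only.
 * Build: gcc -O2 -pthread scale.c -o scale */
#include <pthread.h>
#include <stdio.h>

#define THREADS 8
#define TILES_PER_THREAD 1000000

static double results[THREADS];

static void *render_tiles(void *arg)
{
    long id = (long)arg;
    double acc = 0.0;
    /* Stand-in for per-tile shading work: pure ALU, no shared state. */
    for (long t = 0; t < TILES_PER_THREAD; t++)
        acc += (double)(t % 97) * 1.0001 + (double)id;
    results[id] = acc;          /* each thread writes only its own slot */
    return NULL;
}

int main(void)
{
    pthread_t tid[THREADS];
    for (long i = 0; i < THREADS; i++)
        pthread_create(&tid[i], NULL, render_tiles, (void *)i);
    for (int i = 0; i < THREADS; i++)
        pthread_join(tid[i], NULL);
    printf("done; thread 0 checksum %f\n", results[0]);
    return 0;
}
```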
 
Apr 7, 2011
Wow, I was just playing Cinebench last night. Really had a lot of fun. Can't wait to get home from work tonight and play it again. Guess I can ignore my whole Steam library now.

I have no problem with Cinebench as a preliminary benchmark, but it *is* pretty much a best-case scenario for AMD. We will just have to wait for more benchmarks, but gaming is still a question, and a primary use case for many (most) users. Final clocks may be better, but so far the only thing I am impressed with is the power consumption. By the time it comes out it will be almost a year after the 9900K, and to only match it in a best-case artificial benchmark is underwhelming to me.
I'm not sure how to tell you this but there are people out there using computers for things other than videogames.
 
Jan 24, 2014
Your point was that AMD purposefully made the 9900K look bad in order to compare favourably with the Zen 2 sample. Where is your proof?
He's probably unaware that a stock 9900K does 1760 points in CB15. AMD let the 9900K run for its life, then sent Zen 2 to play tag while running at 60% of the power.

Reminds me of the people who did not believe the 9900K would be obsolete within 12 months of launch.
 

Mopetar

Diamond Member
Jan 31, 2011
It's debatable whether this is the best possible combination, though I fail to see what more one could add to this setup to make it look better wrt Cinebench scores.
They could always hire Principled Technologies to set everything up and run the benchmarks. :D
 

Hitman928

Golden Member
Apr 15, 2012
#2
AMD's demo compared Ryzen 3rd gen to Intel's 9900K in Cinebench, with the Ryzen chip not at final clocks.

AMD system running at 133W vs. 180W for the 9900K system

Ryzen at 2,057 points vs. 2,040 points for the 9900K
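Taking those figures at face value, the efficiency gap is the real headline. A quick points-per-watt calculation (bearing in mind these are wall-socket system power numbers, not CPU package power, so only a rough proxy):

```c
/* Perf-per-watt from the demo's published numbers. Wall power for the
 * whole system, so only a rough proxy for CPU efficiency. */
#include <stdio.h>

int main(void)
{
    double zen2_pts = 2057.0, zen2_w = 133.0;  /* Ryzen 3rd-gen sample */
    double i9_pts   = 2040.0, i9_w   = 180.0;  /* Core i9-9900K system */

    double zen2_eff = zen2_pts / zen2_w;       /* ~15.5 pts/W */
    double i9_eff   = i9_pts / i9_w;           /* ~11.3 pts/W */

    printf("Zen 2 sample: %.1f pts/W\n", zen2_eff);
    printf("9900K       : %.1f pts/W\n", i9_eff);
    printf("advantage   : %.0f%%\n", (zen2_eff / i9_eff - 1.0) * 100.0);
    return 0;  /* prints an advantage of roughly 36% at the wall */
}
```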
 

ub4ty

Senior member
Jun 21, 2017
#8
Would have loved to have gotten more info, but I knew that wouldn't happen when it was pushed to the end of the presentation. Very nice to see that they used the chiplet design on desktop Ryzen. They also demo'd it beating the 9900K in Cinebench with 50-60 W less power draw. Given that EPYC has 64 cores across 8 compute chiplets (8 cores per chiplet), the 8 cores on the demo'd Ryzen part all sit on one chiplet and have equal access to I/O through the I/O die.

As I stated in the spec thread, the 9900K is a dead processor, and so is Intel until they deliver in 2020/2021. That is now confirmed. What I await now is how much I/O they've crammed into that I/O chiplet, whether PCIe 4.0 is supported, and the performance, price, and clocks.
 

ub4ty

Senior member
Jun 21, 2017
#10
Only 8 cores/16 threads shown.
The fact that it's wedged so high up leads me to believe they can and might put another 8 cores on there... especially given that they have that massive I/O chip. If they went out of their way to make a custom desktop I/O chip, I am sure there will be another compute chiplet tossed on there.
 

ub4ty

Senior member
Jun 21, 2017
#13
Yep, just caught it... Very nice.

Something tells me CCIX from the CPU to their 7nm GPU is lurking undisclosed, and that the new GPU supports PCIe 4.0 as well... potentially.
 

Mopetar

Diamond Member
Jan 31, 2011
#15
The fact that it's wedged so high up leads me to believe they can and might put another 8 cores on there... especially given that they have that massive I/O chip. If they went out of their way to make a custom desktop I/O chip, I am sure there will be another compute chiplet tossed on there.
I would imagine that they'll eventually do it, but right now they can beat Intel's best with just one chiplet and at less power. I'm willing to bet that they can drive the clocks even higher and have a reasonable performance lead at similar power levels.

What do you think AMD is going to sell that for? I don't know the exact dollar amount, but I'm willing to bet it's a lot less than the $500 9900K.

They probably don't have enough chiplets to make a 16-core Ryzen right now, not when anything fully functional can go into an EPYC CPU instead. But it doesn't really matter what Intel counters with, since AMD will eventually be able to hammer them with a two-chiplet part, even if it only offers 12 functional cores.
 
Oct 10, 1999
#16
The fact that it's wedged so high up leads me to believe they can and might put another 8 cores on there... especially given that they have that massive I/O chip. If they went out of their way to make a custom desktop I/O chip, I am sure there will be another compute chiplet tossed on there.
Quite possible.
 

Saylick

Senior member
Sep 10, 2012
#18
The fact that it's wedged so high up leads me to believe they can and might put another 8 cores on there... especially given that they have that massive I/O chip. If they went out of their way to make a custom desktop I/O chip, I am sure there will be another compute chiplet tossed on there.
So Ian calculates that the Ryzen IO die is slightly larger than a quarter the size of the EPYC IO die. I would not be surprised if the Ryzen IO die is different from the EPYC IO die, making it possible to pair two Zen 2 compute dies, or a Zen 2 compute die with a Vega II compute die. The server IO die doesn't have to accommodate GPUs on the package, whereas APUs will be released for Ryzen 3000.
 

mohit9206

Senior member
Jul 2, 2013
#19
I can totally live with a mid 2019 release... gives me a chance to stash away a bit more funds and allows my 1700 to have one last good hurrah. Seems like the PCIe 4.0 support might prompt a board upgrade though.
Why not keep your 1700 until at least Zen 3?
 
