
Discussion AMD Cezanne/Zen 3 APU Speculation and Discussion

dr1337

Member
May 25, 2020
Fresh leak out today; not much is known, but at least 8 CUs is confirmed. It's probably an engineering sample: the core count is unknown and clocks may not be final.

This is very interesting to me because Cezanne is seemingly 8 CUs only, and it seems unlikely to me that AMD could squeeze any more performance out of Vega. A CPU-only upgrade of Renoir may be lackluster compared to Tiger Lake's quite large GPU.

What do you guys think? Will Zen 3 be a large enough improvement in APU form? Will it have full cache? Are there more than 8 CUs? Has AMD truly evolved Vega yet again, or is it more like RDNA?
 

soresu

Golden Member
Dec 19, 2014
Fresh leak out today; not much is known, but at least 8 CUs is confirmed. It's probably an engineering sample: the core count is unknown and clocks may not be final.

This is very interesting to me because Cezanne is seemingly 8 CUs only, and it seems unlikely to me that AMD could squeeze any more performance out of Vega. A CPU-only upgrade of Renoir may be lackluster compared to Tiger Lake's quite large GPU.

What do you guys think? Will Zen 3 be a large enough improvement in APU form? Will it have full cache? Are there more than 8 CUs? Has AMD truly evolved Vega yet again, or is it more like RDNA?
I'm beginning to wonder if they are holding off strictly RDNA-based APUs until they can make one that puts at least some oomph behind RT, possibly for use in a standalone VR headset.

Even so, if Cezanne is still only 8 CUs, I will be disappointed if it is the only commercially available update for a year after Renoir's time is done.

I think that perhaps the only thing stopping them from making a console-class APU is the major console deals themselves. A shame if so: APUs will be cursed to stay 5 to 7 years behind consoles in perpetuity.
 

ScopedAndDropped

Junior Member
Feb 15, 2020
Would a Renoir refresh on 6nm have been viable? It would end up being way cheaper and not that much slower than the top-end TGL-H. They may as well wait for 5nm and DDR5.
 

amd6502

Senior member
Apr 21, 2017
Yup, it looks like a Zen 3 quad plus 8 CU Vega again, per the _rogame leak. Frequencies are a little higher than Renoir's again. This 8 CU business isn't going to make a few people happy, but whatever; if there is a big 1080p-medium APU market, then they need to address that with an MCM.

Would a Renoir refresh on 6nm have been viable? It would end up being way cheaper and not that much slower than the top-end TGL-H. They may as well wait for 5nm and DDR5.
No, it wouldn't make sense, but a Zen 3 version of it might.
 

gorobei

Diamond Member
Jan 7, 2007
Moore's Law Is Dead did a 2-year roadmap of rumors/leaks from his sources. AMD and Nvidia may have multi-chiplet GPUs coming with the RDNA3/Hopper generation.

If true, that is maybe 2 years (mid to late 2022) until proper multi-chiplet GPU cards and possibly CPU + GPU chiplets on an interposer for APUs. If it is 2 product cycles away, then there is no need to go crazy cramming CUs into a monolithic die for the intervening years. More CUs means more die space and more power/heat, which erodes the speed and power efficiency of a monolithic design. And if laptop makers are going to cram a crappy dedicated Nvidia 1660/MX250-class GPU into most models, there is less to gain in making an APU with monster gaming ability that no one will use.

Intel's 10nm and 7nm troubles mean they won't have an APU capable of reaching 1080p 60fps at high quality for a while, so laptop makers will always have to plan/provision for adding a dGPU. Until all APUs from Intel/AMD can reach that min spec, there won't be demand for powerful CU counts. So unless AMD takes over most of Intel's market share, or Intel graphics somehow make a massive leap, it is a waiting game for the inflection point where all CPU iGPUs are "good enough".
 

beginner99

Diamond Member
Jun 2, 2009
As with Renoir, I really don't see the issue with the small iGPU. Bandwidth is most likely the limit anyway, and it's good enough for a very large share of laptop users, 95% or even 99% of them. A bigger GPU would mostly be a waste of silicon. On top of that, AMD now pretty much has the CPU performance lead in mobile, and any gaming-oriented models will ship with a dGPU anyway. From that alone, this is a very clever design decision.
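Since bandwidth keeps coming up, here is a rough back-of-envelope sketch of the numbers involved; the DDR4 and GDDR5 figures below are common spec-sheet values I'm assuming, not anything AMD has stated:

```python
# Peak memory bandwidth available to a Renoir/Cezanne-class iGPU,
# assuming dual-channel DDR4-3200 shared with the CPU cores.
channels = 2
bytes_per_transfer = 8        # each DDR4 channel is 64 bits wide
transfers_per_sec = 3200e6    # DDR4-3200 = 3200 MT/s

igpu_gbs = channels * bytes_per_transfer * transfers_per_sec / 1e9
print(f"Dual-channel DDR4-3200: {igpu_gbs:.1f} GB/s (shared)")

# For comparison, a bottom-tier discrete card like the RX 550:
# 128-bit GDDR5 at 7 Gbps effective per pin.
dgpu_gbs = (128 / 8) * 7e9 / 1e9
print(f"RX 550 128-bit GDDR5:  {dgpu_gbs:.1f} GB/s (exclusive)")
```

Even before CU counts enter the picture, the iGPU has less than half the bandwidth of a low-end dGPU, and it has to share that with the CPU.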

On top of that, it wouldn't have made sense to create a new "Vega block" for just one short-lived product. They can simply reuse the existing block, maybe tune it a bit, and be done: even more money saved besides the wafers. That's probably why it comes with Vega again and not RDNA1. Either a bigger Vega or an RDNA1 iGPU would have been too much work for too little gain. With the APU after that, they can go RDNA2, which seems much more efficient than RDNA1, plus DDR5 for the needed bandwidth.
 

zacharychieply

Junior Member
Apr 22, 2020
The reasoning behind using Vega in APUs instead of RDNA may be agreements with Microsoft and Sony, who probably don't want AMD releasing other RDNA APUs that potentially "piggyback" off the funding they provided AMD to develop their console APUs.
 

soresu

Golden Member
Dec 19, 2014
The reasoning behind using Vega in APUs instead of RDNA may be agreements with Microsoft and Sony, who probably don't want AMD releasing other RDNA APUs that potentially "piggyback" off the funding they provided AMD to develop their console APUs.
AMD are using that knowledge anyway with RDNA2 dGPUs.

SFF PCs using APUs don't sell nearly as well as any console product; even the loser of any given generation is in another sales league entirely.

Plus, the console manufacturers make most of their money off software rather than hardware, from what I have gathered over the years.

A cheaper console which makes a tidy profit is obviously nothing to sniff at, but they can handle some initial low HW profits from heavier designs as long as software sales are good.

My guess is that AMD are hoping their first RDNA2 APU will coincide with a run of eye tracked VR HMD standalones that can fully utilise foveated rendering to make RT viable on APU class graphics.

The first eye tracked models are already out (VIVE Pro Eye), so that doesn't seem like such a stretch considering Rembrandt is still at least a year to a year and a half away.
 

dr1337

Member
May 25, 2020
My guess is that AMD are hoping their first RDNA2 APU will coincide with a run of eye tracked VR HMD standalones that can fully utilise foveated rendering to make RT viable on APU class graphics.
I'm not sure that's their target market at all with next-generation APUs, and to be quite honest, AMD isn't the best at VR support. RT in VR would be wasted energy compared to solving the screen door effect with higher resolution. VR headsets simply aren't 100% there yet in visual quality, and resolution performance is by far the main bottleneck.

But really, AMD doesn't care about VR; historically and currently, their support is abysmal. Them waiting for DDR5 makes sense in some ways, but I really don't see them pushing RT on APUs much at all. Unless their implementation is over 2x what Nvidia can do, any APU graphics short of 36 CUs would probably be lackluster for it.

Also, sadly there haven't been a lot of developments in the VR world lately. The Index and the Reverb G2 are the latest improvements, and they are in some ways quite incremental. It doesn't seem like we're going to get proper 4K+ headsets with eye tracking and foveated rendering any time soon, which is a total shame, because I would love to buy something better than an Index to replace my Oculus Rift.
 

moinmoin

Golden Member
Jun 1, 2017
Until DDR5 arrives and offers more room to grow into, APU Vega is very likely a learning exercise for AMD in getting the most energy efficiency they can out of an existing GPU IP. They can then apply that knowledge to their future RDNA/CDNA versions; with regular new versions, none of those will ever get the repeated rounds of efficiency improvements that the (otherwise stagnant) APU Vega currently gets.
 

soresu

Golden Member
Dec 19, 2014
RT in VR would be wasted energy compared to solving the screen door effect with higher resolution.
The screen door effect is no more than a function of flat-panel LCD or OLED display pixels and the distance between their active emissive boundaries. As these panels were not originally designed to be viewed from such a close distance, their design emphasized picture quality and power efficiency (and lifetime, in the case of OLED) rather than eliminating a problem that isn't even visible from a normal viewing distance on an HDTV or most monitors.

Solving it doesn't actually require higher resolution at all, as evidenced by the likes of the MEMS SLM-based Avegant Glyph, which has a low 720p resolution per eye but, due to the way its display optics work, does not appear to suffer significant screen door effect.

I believe that waveguide optics may provide a similar solution: using small, high-PPI microdisplays in the side frames and essentially routing the emitted pixel light to the eye through waveguides. Oculus Research have said this is their future goal anyway, for higher resolution and FOV in one go.

Either way, RT in VR need not be nearly as demanding as on a desktop monitor, given a full bag of compute-saving tricks to shrink the actual graphics workload both per frame and per second, especially if the VR HMD uses eye tracking.

Variable Rate Shading, foveated rendering (which VRS also benefits from), and the various warp techniques (Asynchronous Space Warp etc.) all add up to take a serious chunk out of the compute necessary for a pleasant experience, meaning even a measly APU could possibly provide the horsepower for a standalone experience on the right HMD.
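To put a toy number on that, here is a sketch of the foveated rendering saving; the fovea fraction and periphery shading rate are made-up illustrative values, not figures from any shipping HMD:

```python
# Toy model: shade a small foveal region at full rate and the
# periphery at a reduced rate, then compare against naively
# shading the whole frame at full rate.
def relative_shading_cost(fovea_fraction, periphery_rate):
    """Fraction of full-rate shading work still required."""
    fovea_cost = fovea_fraction * 1.0
    periphery_cost = (1.0 - fovea_fraction) * periphery_rate
    return fovea_cost + periphery_cost

# Assume the fovea covers 10% of the frame and the periphery is
# shaded at quarter rate (e.g. VRS 2x2 coarse shading).
cost = relative_shading_cost(fovea_fraction=0.10, periphery_rate=0.25)
print(f"Shading work vs. naive rendering: {cost:.3f}")
```

Under those assumptions, roughly two-thirds of the shading work disappears, which is exactly the kind of cut that could bring APU-class graphics into play.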
 

Shivansps

Diamond Member
Sep 11, 2013
It has nothing to do with the console deal, and it has nothing to do with DDR5, as RDNA is likely to give higher perf on DDR4 due to being more efficient.

The reason is that they have zero competition in iGPUs. If they still use 8 CU Vega, it is because that's more than enough to beat the competition. This is performance stagnation at its finest, last seen in Intel quad cores from 2nd gen through 7th gen, when AMD had nothing. Who needs more than 4 cores if that's enough, right? Well, read that again and replace "4 cores" with "8 CU Vega"; that is exactly what is happening here.
 

soresu

Golden Member
Dec 19, 2014
It has nothing to do with the console deal, and it has nothing to do with DDR5, as RDNA is likely to give higher perf on DDR4 due to being more efficient.
Wait. DDR5 is less efficient than DDR4??!!

I agree with the stagnation part, though; hopefully they are designing with contingencies in mind.
 

NostaSeronx

Diamond Member
Sep 18, 2011
Have we nailed down whether K19.4/K19.5 is on 5nm with Apple?

There has been information since 2018 implying that certain APUs were switched from 7nm EUV to 5nm EUV.
 

turtile

Senior member
Aug 19, 2014
It has nothing to do with the console deal, and it has nothing to do with DDR5, as RDNA is likely to give higher perf on DDR4 due to being more efficient.

The reason is that they have zero competition in iGPUs. If they still use 8 CU Vega, it is because that's more than enough to beat the competition. This is performance stagnation at its finest, last seen in Intel quad cores from 2nd gen through 7th gen, when AMD had nothing. Who needs more than 4 cores if that's enough, right? Well, read that again and replace "4 cores" with "8 CU Vega"; that is exactly what is happening here.
I've read that the improved Vega is actually denser, so it's cheaper. It will be interesting to see whether they use CDNA or RDNA in future APUs.

I don't think AMD expected much out of their mobile chips, so they didn't invest as much in them. I assume that will change in the future, but it's too late for chips that are already designed.
 

beginner99

Diamond Member
Jun 2, 2009
The reason is that they have zero competition in iGPUs. If they still use 8 CU Vega, it is because that's more than enough to beat the competition
Intel is pretty competitive with iGPUs, and Tiger Lake will most likely beat Renoir and Cezanne at the same price point, hence I don't think this is the case. It's how AMD (Lisa Su?) ticks nowadays, and I like it: don't sink money into features that don't have a clear ROI. I still fail to see a use case for powerful iGPUs. Just get a model with a dGPU, giving you easily 2x the performance any iGPU ever will.
 

soresu

Golden Member
Dec 19, 2014
I still fail to see a use case for powerful iGPUs.
Compact, moderately performing PCs, pure and simple.

Intel made Kaby Lake-G with AMD's Vega M, but AMD never made an equivalent package with entirely its own processors.

I don't game that much, but if I could get an SFF system with an APU package at or close to Polaris 10 performance, I would buy one in a heartbeat.
 

uzzi38

Golden Member
Oct 16, 2019
You will get an RDNA2 APU on AM4. Just not now.

Cezanne is TTM- and cost-focused. AMD want something that can compete much better in the -U segment against Tiger Lake than Renoir can, because multi-core performance alone doesn't sell if you don't have comparable ST perf, and they want it out fast. They also want to ensure supply constraints are minimised, so smaller dies and smaller transistor budgets on the iGPU are ideal.
 

Kryohi

Junior Member
Nov 12, 2019
You will get an RDNA2 APU on AM4. Just not now.

Cezanne is TTM- and cost-focused. AMD want something that can compete much better in the -U segment against Tiger Lake than Renoir can, because multi-core performance alone doesn't sell if you don't have comparable ST perf, and they want it out fast. They also want to ensure supply constraints are minimised, so smaller dies and smaller transistor budgets on the iGPU are ideal.
So, should we expect Cezanne to launch in 2020, or is that a pipe dream?
Because if OEMs are as sluggish with Cezanne as they were with Renoir, decent availability of decent products will only materialize about 6 months after launch, and Tiger Lake will dominate sales for most of 2021.
I'm also wondering if the new Vega will at least be able to match Intel's Xe, although that seems unlikely.
 

uzzi38

Golden Member
Oct 16, 2019
2020's too early lol.

I'm expecting late Q1 to early Q2, but as you can imagine things are extremely up in the air given wafer supply.

Cezanne's time to reach products should be significantly lower, though. It's a drop-in upgrade on the same socket as Renoir, and there hopefully won't be a major pandemic causing 2-month delays.
 

Shivansps

Diamond Member
Sep 11, 2013
Intel is pretty competitive with iGPUs, and Tiger Lake will most likely beat Renoir and Cezanne at the same price point, hence I don't think this is the case. It's how AMD (Lisa Su?) ticks nowadays, and I like it: don't sink money into features that don't have a clear ROI. I still fail to see a use case for powerful iGPUs. Just get a model with a dGPU, giving you easily 2x the performance any iGPU ever will.
You are overestimating how powerful entry-level dGPUs are... a 3400G with Vega 11 and DDR4-3200 is on average within 5 to 10% below the performance of an RX 550 with 128-bit GDDR5, and so will the 4300G be, maybe adding an extra 5%. Then you have the GT 1030 GDDR5 and the 64-bit RX 550, which fall below 3200G levels in a few cases. And a $99 3200G is not that much slower than a 3400G.

Not even the 1050 Ti gives you 2x over a 3400G with DDR4-3200 (it's around 60-80% ahead most of the time). So if by dGPU you mean the RX 570/GTX 1650 Super, yeah, I agree. But those cost as much as the 3400G, and you also need a better PSU for the 570/1650; the 3400G alone runs on anything.

I think 1050 Ti performance is possible (or very, very close to it) with RDNA2 and DDR4-4266, but AMD is not interested in doing it. Why would they? The competition has nothing (when Tiger Lake launches we'll see, because being competitive is not the same as being better), and they would hurt their own dGPU sales.
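For what it's worth, the bandwidth jump from DDR4-3200 to DDR4-4266 is easy to put a number on, assuming dual-channel 64-bit DDR4 in both cases:

```python
# Dual-channel DDR4 peak bandwidth at two speed grades.
def dual_channel_ddr4_gbs(mega_transfers):
    # 2 channels x 8 bytes per transfer x MT/s
    return 2 * 8 * mega_transfers * 1e6 / 1e9

base = dual_channel_ddr4_gbs(3200)    # DDR4-3200
fast = dual_channel_ddr4_gbs(4266)    # DDR4-4266
print(f"DDR4-3200: {base:.1f} GB/s")
print(f"DDR4-4266: {fast:.1f} GB/s ({fast / base:.0%} of baseline)")
```

A third more bandwidth is nothing to sneeze at, but it is still far below what even low-end discrete cards get all to themselves.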

BTW, 1080p performance is also not needed; 900p is always a lot faster on APUs, and visual quality is almost the same as 1080p.
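The 900p point is just pixel arithmetic, taking 900p to mean 1600x900 and 1080p to mean 1920x1080:

```python
# Pixel counts behind rendering at 900p instead of 1080p.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_900p = 1600 * 900     # 1,440,000 pixels

print(f"900p renders {pixels_900p / pixels_1080p:.0%} of the pixels")
print(f"Headroom if pixel-bound: {pixels_1080p / pixels_900p:.2f}x")
```

Roughly a 1.44x speedup whenever fill rate or bandwidth is the bottleneck, which on an iGPU it usually is, while the image stays close to 1080p quality on a laptop panel.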
 