Question Speculation: RDNA2 + CDNA Architectures thread


uzzi38

Platinum Member
Oct 16, 2019
2,705
6,427
146
All die sizes are within 5mm^2. The poster here has been right on some things in the past afaik, and to his credit was the first to say 505mm^2 for Navi21, which other people have backed up. Even so, take the following with a pinch of salt.

Navi21 - 505mm^2

Navi22 - 340mm^2

Navi23 - 240mm^2

Source is the following post: https://www.ptt.cc/bbs/PC_Shopping/M.1588075782.A.C1E.html
 

Head1985

Golden Member
Jul 8, 2014
1,867
699
136
Watched the NOAF stream (Not An Apple Fan):
better than +10% IPC
1.3x better clocks
better than +50% perf/watt
faster than 3080
AMD will have own DLSS
He said AMD is not impressed with the 3080's performance, and that RDNA2 is better than they anticipated. It is similar to Zen 1, when they increased IPC by more than 50% and the target was only 50%.
 
Last edited:

maddie

Diamond Member
Jul 18, 2010
4,881
4,951
136
Why is everyone surprised that RDNA2 can clock 2.5 GHz when the Renoir APU's Vega graphics overclocked to 2.4 GHz? If anything, at minimum 2.5 GHz was to be expected, given RDNA2 is AMD's latest and best GPU arch and incorporates the best physical design methodologies utilized by the "Zen" CPU team. Obviously > 2.5 GHz is not so likely; 3 GHz is probably going to happen on TSMC N5P or N3.
Honestly, I don't have a clue as to top speed. The slide that mentioned reduced complexity, aided by physical optimization, is promising.

As an aside, Zen2 can run very efficiently at 3 GHz. How does the AVX circuitry in it compare in complexity to a CU? Anyone?
 
  • Like
Reactions: Tlh97

uzzi38

Platinum Member
Oct 16, 2019
2,705
6,427
146
Just when the hype train looks to be slowing down it suddenly picks up momentum! /s

Lisa Su/AMD will tease RDNA2 during the Zen3 announcement. Odds?
I totally want to see something super smug, like AMD showing off the performance of Zen 3 in a game at 1080p, 1440p and 4K with an "unreleased Radeon graphics card"... Or even just a footnote about the GPU used.

Or for the ultimate memes, they could do a demo showing off a shiny new 5950X + "unreleased Radeon graphics card" vs a 10900K + RTX 3090, with performance numbers at all three resolutions and a Kill-a-Watt reading showing the difference in power draw, preferably at all three resolutions as well.

Probably won't happen, but hey, it'd be hilarious if it did.
 

Gideon

Golden Member
Nov 27, 2007
1,774
4,145
136
Or for the ultimate memes, they could do a demo showing off a shiny new 5950X + "unreleased Radeon graphics card" vs a 10900K + RTX 3090, with performance numbers at all three resolutions and a Kill-a-Watt reading showing the difference in power draw, preferably at all three resolutions as well.

This crossed my mind as well. If performance allows it, they could even pull these rigs out twice: once at the Ryzen launch @ 1080p and again at the RDNA2 launch @ 4K.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
They will sell, but not to gamers. At high prices, brand power is a stronger factor. I can bet you that even if AMD is 20% faster than the 3090 for $1000, we will see the majority of streamers/youtubers and high-end users in the forums still buying Nvidia.

That's a baseless argument. Nvidia has a stronger brand, but that is built on performance leadership over many years. I can bet you that if Nvidia did not have the GPU crown, consumers would not hesitate to switch to AMD. Maybe a few will want to wait one more generation to see if AMD has an overall package which is competitive with Nvidia on features like DLSS and performance (raster and raytracing). But I can guarantee you that performance leadership matters. Just look at the RTX 2070 Super launching to replace the RTX 2070 at $500 because AMD said they were coming with the RX 5700XT at $449. Once the RTX 2070 Super launched, AMD reacted and cut the launch price of the RX 5700XT to $399. If that's not an indication that these companies compete on performance, then I don't know what is. Remember, the RTX 2070 had ray tracing and the RX 5700XT did not. So if the Nvidia brand is all that mattered, why would Nvidia bring out the RTX 2070 Super with a bigger TU104 die and reduce their margins?
 

kurosaki

Senior member
Feb 7, 2019
258
250
86
If that's not an indication that these companies compete on performance, then I don't know what is. Remember, the RTX 2070 had ray tracing and the RX 5700XT did not. So if the Nvidia brand is all that mattered, why would Nvidia bring out the RTX 2070 Super with a bigger TU104 die and reduce their margins?
Was going to reply something similar, plus: branding turns quicker now because of how society works. If something is clearly better than the other, everyone on earth will know instantly. It's all so fluid these days. If one streamer picked up a card from AMD because it was the best, it is going to be an avalanche. AMD just needs that top perf crown to get highly adopted amongst the streaming and gaming crowd; all us other peasants will probably make do with the tier below at 40% of the price. Top tier is most often all marketing and mindshare, and these days, that shit turns fast.
 

Shivansps

Diamond Member
Sep 11, 2013
3,875
1,530
136
So you are saying that if AMD has a flagship N21 GPU with 16 GB HBM2E for $1000 or even $1200 which is 5-10% faster than the RTX 3090 FE, AMD will not sell out every unit they make? I will take a bet with you that they will sell every unit they manufacture.

AMD has one problem: at equal price, Nvidia is offering a better encoder, RTX and DLSS. Everyone streams these days, so a good encoder should be a high priority. For RTX, AMD still has to show their tech; personally I don't care much about it. But DLSS is a must, they need to come up with their own version. So I'll say that until all this is fixed they can't try to offer similar perf at equal price; it's not going to work. And it did not work for Navi.
 

kurosaki

Senior member
Feb 7, 2019
258
250
86
Nvidia is offering a better encoder, RTX and DLSS. Everyone streams these days, so a good encoder should be a high priority. For RTX, AMD still has to show their tech; personally I don't care much about it. But DLSS is a must, they need to come up with their own version. So I'll say that until all this is fixed they can't try to offer similar perf at equal price; it's not going to work.


DLSS is crap. I hope to god AMD just ignores DLSS; it's a [redacted] way of rendering games at a lower resolution and passing it off as a higher one. It's pure marketing, with fine words like AI and unique Tensor cores. All I hear is a waste of real raster space. They do what they have always done, Nvidia: cut corners on IQ to win in pure FPS. It's as old as the street.

My personal opinion is that Nvidia has always put way more into strategic marketing than AMD, and it pays off for them. Does it make them sell cards? Of course! Am I going to buy a card full of caveats just because they claim they have advanced highly in AI? Doubtful.



Profanity in tech isn't allowed.


esquared
Anandtech Forum Director
 
Last edited by a moderator:
  • Like
Reactions: swilli89

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
AMD has one problem: at equal price, Nvidia is offering a better encoder, RTX and DLSS. Everyone streams these days, so a good encoder should be a high priority. For RTX, AMD still has to show their tech; personally I don't care much about it. But DLSS is a must, they need to come up with their own version. So I'll say that until all this is fixed they can't try to offer similar perf at equal price; it's not going to work. And it did not work for Navi.

I did say in my reply that a few users might look at the overall package. AMD needs a competitive response to DLSS and raytracing, that's obvious. The media engine is important, but I do not agree that everyone streams. Having said that, each user looks at a certain set of parameters to make his decision, and I agree that there are folks who require a good encoder and software support for OBS and other popular streaming platforms for their streaming needs.
 

Guru

Senior member
May 5, 2017
830
361
106
We know from the PS5 specs that the new RDNA2 GPUs can pretty much clock to at least 2.23GHz, and we know from the PS5 tech presentation that they can possibly reach even higher, but it starts having diminishing returns!

So a full-blown 80CU (or whatever) desktop product is likely going to be able to add at least 100MHz to that 2.23GHz, which would put it around 2.35GHz. I think with overclocking you can probably push the GPU a good 150MHz more before it starts crashing and becoming unstable.

So I do not expect a 2500MHz core clock for any of their GPUs out of the box, but I think the RX 6800 or RX 6700 is likely going to have a 2300MHz boost clock. Again, these GPUs will probably have a base clock of say 1.9GHz, a "game" clock of 2.1GHz and a boost clock of up to 2.3GHz.
 

ModEl4

Member
Oct 14, 2019
71
33
61
The +40% frequency achieved with Pascal relative to Maxwell 2 was also due to the 16nm vs 28nm process. I think a reasonable limit for what AMD can achieve on the same process is +25% (what Nvidia achieved from Kepler to Maxwell 2). Then on the N7 process it would theoretically achieve 2115MHz + 25% = 2645MHz (the highest on-air overclock achieved with a Sapphire RX 5700XT Special Edition, plus 25%); if it is on N7+ EUV then add +10% and voilà, we have 2910MHz🤪.

On the other hand, if true, the Xbox Series X's 1825MHz would be too low I think in that tower design, unless TSMC, in order to achieve the greater density that N7E enjoys, shed 7-10% performance relative to N7. Of course, if the Xbox Series X is indeed 1825MHz (which is the official spec), the Gtri/s & Gpix/s figures that MS had in the slides at Hot Chips suggest at least 1906MHz; maybe they made a mistake. Anyway, we will see!

The way I see it, the most optimistic scenario for AMD is 3090 performance at $899 competing with a $999 3090 12GB version😋, but Nvidia offers more features (unless AMD surprises us, but for example DLSS, which for me is a superb feature, AMD will probably have nothing equivalent to; maybe a PS4 Pro-style checkerboarding feature, given the 128 RBEs and 256-bit bus...)
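That chain of percentages is easy to mangle, so here is the arithmetic spelled out as a quick back-of-envelope script. Everything in it is the post's speculation, not confirmed figures: the 2115MHz Sapphire overclock baseline, the hoped-for +25% same-node gain, and the commonly quoted +10% N7+ EUV uplift.

```python
# Back-of-envelope for the speculative frequency-scaling chain above.
# Assumptions (from the post, not confirmed): 2115 MHz baseline overclock,
# +25% same-node gain (Kepler -> Maxwell 2 style), +10% N7+ EUV uplift.
base_mhz = 2115

same_node_gain = 1.25   # speculative architecture/physical-design gain
n7plus_gain = 1.10      # speculative N7 -> N7+ EUV process gain

n7_ceiling = base_mhz * same_node_gain          # 2643.75 MHz
n7plus_ceiling = n7_ceiling * n7plus_gain       # ~2908 MHz

print(f"N7 ceiling:  {n7_ceiling:.0f} MHz")
print(f"N7+ ceiling: {n7plus_ceiling:.0f} MHz")
```

The exact products are 2643.75 and about 2908 MHz; the post's 2645/2910 figures are just rounded slightly differently.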
 

Glo.

Diamond Member
Apr 25, 2015
5,803
4,777
136
It's funny.

I looked at this thread's previous pages, and just three days ago we were speculating whether AMD could even touch 2.3 GHz in these designs, and at what power draw. That the PS5 was going to drain the power lines for 15 miles around to maintain its clock speeds.

Today we are discussing: "Where is the ceiling?"
 

Konan

Senior member
Jul 28, 2017
360
291
106
It's funny.

I looked at this thread's previous pages, and just three days ago we were speculating whether AMD could even touch 2.3 GHz in these designs, and at what power draw. That the PS5 was going to drain the power lines for 15 miles around to maintain its clock speeds.

Today we are discussing: "Where is the ceiling?"
Hype train is going full pelt, mate. I feel folks should temper expectations a little.
I always had the impression (and it was the majority impression over the last month) that AMD's dGPU RDNA2 cards can boost high, like 2.3GHz. Nothing about permanently sustained speeds.
 

Glo.

Diamond Member
Apr 25, 2015
5,803
4,777
136
Hype train is going full pelt, mate. I feel folks should temper expectations a little.
I always had the impression (and it was the majority impression over the last month) that AMD's dGPU RDNA2 cards can boost high, like 2.3GHz. Nothing about permanently sustained speeds.
The thing is, is there even a hype train, when we already know the clock speeds of those GPUs, or rather a rough specification that will be within 5% of the end products?

I think the general consensus, based on those leaked clock speeds, can be summed up in one sentence:

"General disbelief".
 

Konan

Senior member
Jul 28, 2017
360
291
106
The thing is, is there even a hype train, when we already know the clock speeds of those GPUs, or rather a rough specification that will be within 5% of the end products?

I think the general consensus, based on those leaked clock speeds, can be summed up in one sentence:

"General disbelief".

The question is whether it is going to be within 5% of the low end or the high end of those clock ranges, because between the low end and the high end in the table there's a massive variance. Like 30%...

Clock speeds aside, we still don't really know the full IPC gain, or even how things will perform with RT turned on. I have some scepticism about the BVH approach.
 

Glo.

Diamond Member
Apr 25, 2015
5,803
4,777
136
The question is whether it is going to be within 5% of the low end or the high end of those clock ranges, because between the low end and the high end in the table there's a massive variance. Like 30%...

Clock speeds aside, we still don't really know the full IPC gain, or even how things will perform with RT turned on. I have some scepticism about the BVH approach.
There are only four things that interest me.

Rasterization performance, price, power draw, and Linux open-source driver quality, feature set and stability.

And whether I will be able to use my GPU on day one with the latest kernel and Mesa drivers (of course, I do not need to buy the GPU on day one; I can simply wait for Mesa and the kernel to be updated...).
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,741
2,717
146
There are only four things that interest me.

Rasterization performance, price, power draw, and Linux open-source driver quality, feature set and stability.

And whether I will be able to use my GPU on day one with the latest kernel and Mesa drivers (of course, I do not need to buy the GPU on day one; I can simply wait for Mesa and the kernel to be updated...).
You forgot availability :D
 

Shivansps

Diamond Member
Sep 11, 2013
3,875
1,530
136
I did say in my reply that a few users might look at the overall package. AMD needs a competitive response to DLSS and raytracing, that's obvious. The media engine is important, but I do not agree that everyone streams. Having said that, each user looks at a certain set of parameters to make his decision, and I agree that there are folks who require a good encoder and software support for OBS and other popular streaming platforms for their streaming needs.

You know what the issue is here? Sometimes when buying AMD I feel like I'm buying a promise. When I got my RX480 at launch, which had the encoder, coming from a GTX750TI with NVENC I felt like I was hit with a ton of bricks: it had quality issues, color issues, I had to use 3rd-party software for game recording, it was not stable... It took AMD close to a year to get everything fully working, with ReLive stable and no weird issues, but hey, they marketed it as having the encoder from day 1. And even so, quality was below the 750TI for both streaming and recording. They had similar issues with the Navi encoder at launch; everything is working fine now.

At this point I feel like they are behind on features, and implementing them and getting them fully working may take a long time. I hope they have something more than a promise to show in the RDNA2 presentation.
 
  • Like
Reactions: ozzy702

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
You know what the issue is here? Sometimes when buying AMD I feel like I'm buying a promise. When I got my RX480 at launch, which had the encoder, coming from a GTX750TI with NVENC I felt like I was hit with a ton of bricks: it had quality issues, color issues, I had to use 3rd-party software for game recording, it was not stable... It took AMD close to a year to get everything fully working, with ReLive stable and no weird issues, but hey, they marketed it as having the encoder from day 1. And even so, quality was below the 750TI for both streaming and recording. They had similar issues with the Navi encoder at launch; everything is working fine now.

At this point I feel like they are behind on features, and implementing them and getting them fully working may take a long time. I hope they have something more than a promise to show in the RDNA2 presentation.

Yep. The 7970 was the clear winner from a hardware perspective and aged well, but it sure did take a while to get there. AMD is typically forward-looking, but doesn't deliver polished products right out of the gate. Hopefully RDNA2 gets decent release drivers and is a fantastic product. With NVIDIA's failed execution, maybe they'll have another Ryzen moment on their hands.
 
  • Like
Reactions: Tlh97 and Elfear

Panino Manino

Senior member
Jan 28, 2017
869
1,119
136
DLSS is crap. I hope to god AMD just ignores DLSS; it's a [redacted] way of rendering games at a lower resolution and passing it off as a higher one. It's pure marketing, with fine words like AI and unique Tensor cores. All I hear is a waste of real raster space. They do what they have always done, Nvidia: cut corners on IQ to win in pure FPS. It's as old as the street.

It's a shame, but you're wrong.
RTG needs to at least announce upcoming support, via an update, for DirectML-based image upscaling. DLSS doesn't need to be perfect; Nvidia just needs to convince the public (with some help from the press) that it's good enough and that they need to use it for a magical performance improvement. Mindshare is all that's needed to keep the lack of image upscaling from becoming a "fatal" flaw of RDNA2 cards.
 

DJinPrime

Member
Sep 9, 2020
87
89
51
DLSS is crap. I hope to god AMD just ignores DLSS; it's a [redacted] way of rendering games at a lower resolution and passing it off as a higher one. It's pure marketing, with fine words like AI and unique Tensor cores. All I hear is a waste of real raster space. They do what they have always done, Nvidia: cut corners on IQ to win in pure FPS. It's as old as the street.
Sometimes when your view is the opposite of what just about everyone else thinks, you might want to reconsider or reevaluate your position, because everyone else isn't dumb. Just about everyone who has used DLSS 2, or just looked at the screenshots and videos, is impressed with it. If you hate it that much, just don't use it. They're completely upfront about it being an intelligent upscaler; no one is hiding that fact. You even choose the base resolution when using it. Tensor cores are used and the software has to be written to use it, so it's not just a regular upscaler.
I don't know how any rational person would be against a technology that gives you a two-resolution-step performance gain with minor visual-artifact trade-offs, and in some cases improved clarity. Do you rage against TN/VA panels? They aren't as visually nice as IPS but offer higher refresh rates. What about having to drop down to YCbCr 4:2:2 from 4:4:4 in order to get ultra-high refresh rates? It's good to have choices.
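To put rough numbers on the "render low, display high" trade-off being argued about, here is a small sketch of the internal render resolutions implied by the commonly cited per-axis DLSS 2 scale factors (treat the factors as approximate, not official spec):

```python
# Rough pixel math behind DLSS-style upscaling: render at a fraction of
# the output resolution, then reconstruct. Scale factors below are the
# commonly cited per-axis DLSS 2 values (approximate, not official spec).
def render_resolution(out_w, out_h, scale):
    """Internal render size and its share of the output pixel count."""
    w, h = round(out_w * scale), round(out_h * scale)
    return (w, h), (w * h) / (out_w * out_h)

for mode, scale in [("Quality", 2 / 3), ("Balanced", 0.58), ("Performance", 0.5)]:
    (w, h), share = render_resolution(3840, 2160, scale)
    print(f"{mode:<12} {w}x{h}  ({share:.0%} of 4K pixels shaded)")
```

Performance mode at 4K shades a 1920x1080 image, a quarter of the output pixels, which is where the large FPS gains come from; the reconstruction step then has to make up the difference.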
 

Tup3x

Golden Member
Dec 31, 2016
1,086
1,085
136
Do you rage against TN/VA panels? They aren't as visually nice as IPS but offer higher refresh rates.
They don't though. But yeah, there's no real reason to complain about DLSS. It improves speed with minimal visual quality loss and in some cases it is an improvement.