Question Speculation: RDNA2 + CDNA Architectures thread

Page 205

uzzi38

Platinum Member
Oct 16, 2019
2,608
5,828
146
All die sizes are within 5mm^2. The poster here has been right about some things in the past afaik, and to his credit was the first to say 505mm^2 for Navi21, which other people have backed up. Even still, take the following with a pinch of salt.

Navi21 - 505mm^2

Navi22 - 340mm^2

Navi23 - 240mm^2

Source is the following post: https://www.ptt.cc/bbs/PC_Shopping/M.1588075782.A.C1E.html
 

Gideon

Golden Member
Nov 27, 2007
1,619
3,643
136
Yeah, my bad, the naming is terrible. It could be something as trivial as "Intelligent Upscaling," but that's probably way too little marketing speak to get anywhere.
 

Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
Well, I got fed up with this stock situation and the miners buying everything, so I ended up getting a used Asus Dual RX 5700 that I know was not used for mining, literally 2 days before all GPU prices increased around 100% in my... let's call it country. Best decision I could have ever made. The stock fan curve is BS: it allows the card to get to 95°C with the fan running at 1100 RPM, and that was causing it to crash in some games like Cyberpunk. I wonder if that is the cause of so many people complaining about crashes and black screens with Navi.
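(For anyone else fighting a lazy stock fan curve: the usual fix is a custom curve in Wattman/Afterburner. A toy sketch of the idea is below, with made-up breakpoints, just to show the shape of a curve that ramps well before 95°C; it's not a recommendation for any specific card.)

```python
import numpy as np

# Hypothetical custom fan curve: temperature (°C) -> fan duty cycle (%).
# The breakpoints are illustrative only, not tuned for any particular card.
TEMPS = [40, 60, 75, 85, 95]    # GPU edge temperature in °C
SPEEDS = [25, 40, 60, 80, 100]  # fan duty cycle in %

def fan_speed(temp_c: float) -> float:
    """Linearly interpolate a fan duty cycle for the given temperature."""
    return float(np.interp(temp_c, TEMPS, SPEEDS))

if __name__ == "__main__":
    for t in (50, 70, 90):
        print(f"{t}°C -> {fan_speed(t):.0f}% fan")
```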

I suggest Artificial Super Resolution since that's what DLSS is. :p

What about calling it "GOIA": Get Over It Already.
Seriously, this amount of hate just for what DLSS does needs to stop.
 

blckgrffn

Diamond Member
May 1, 2003
9,121
3,049
136
www.teamjuchems.com
Well, I got fed up with this stock situation and the miners buying everything, so I ended up getting a used Asus Dual RX 5700 that I know was not used for mining, literally 2 days before all GPU prices increased around 100% in my... let's call it country. Best decision I could have ever made. The stock fan curve is BS: it allows the card to get to 95°C with the fan running at 1100 RPM, and that was causing it to crash in some games like Cyberpunk. I wonder if that is the cause of so many people complaining about crashes and black screens with Navi.

My first 5700 XT had a fan-stop bug that didn't even start the fans until 95°C, which led to all sorts of issues - and then ran them at 100% indefinitely trying to cool the card back down to something like 55°C. I had been running an AMD blower 5700 without issues before that (my Dad's), and I had to install special drivers, endure green screens, and do all sorts of stupid stuff trying to get the card to work.

Since this was in a more sane time, I simply walked into a Best Buy and returned it. I then bought the Blower 5700 XT from Dell for $360 after a coupon and feel fine about it. Visiontek warranty (1 year!) sucks though.

Reading at that time, there were horror stories from every AIB that seemed to have pooped out a custom design. I couldn't figure out how my 300W Hawaii card was fine for years but they all dropped the ball on a 225W part. What did AMD do to piss off all these partners that they would release these cards like this? There was some sort of validation step that had to have been missed.

What about calling it "GOIA": Get Over It Already.
Seriously, this amount of hate just for what DLSS does needs to stop.

Hey fun police, next are you going to tell us we can't complain about the complete lack of Gigastabs that RDNA2 provides? ;)
 
  • Haha
Reactions: Elfear

KompuKare

Golden Member
Jul 28, 2009
1,013
924
136
What about calling it "GOIA": Get Over It Already.
Seriously, this amount of hate just for what DLSS does needs to stop.
No thanks.
I'll keep calling quality-reducing fake resolution what it is: fake resolution.
Just because over-sharpened fake 4K looks sharper than overdone AA at native doesn't make fake 4K better.
For years any quality-reducing trickery would rightly face a big backlash from everyone, but now that Nvidia has figured out what to do with their tensor units on consumer hardware, their marketing department has convinced people that quality-reducing trickery is a good thing?
I get it, people like over-sharpened images, but if it throws out that much fidelity, I don't.
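For reference, the most basic form of the "trickery" being argued about is just a spatial upscale plus a sharpening pass; DLSS itself is a temporal/ML reconstruction and does more than this, so treat the Pillow sketch below (with a hypothetical input file) as the simplest baseline, not as what DLSS actually computes:

```python
from PIL import Image, ImageFilter

# Hypothetical input: a frame rendered at 1440p that we want to present at "4K".
frame = Image.open("frame_2560x1440.png")

# Spatial upscale to the 3840x2160 output resolution (bilinear, no new information added).
upscaled = frame.resize((3840, 2160), Image.BILINEAR)

# Sharpening pass to restore apparent crispness; cranking `percent` too high
# gives the "over-sharpened" look discussed above.
sharpened = upscaled.filter(ImageFilter.UnsharpMask(radius=2, percent=150, threshold=3))

sharpened.save("frame_fake4k.png")
```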
 

Tup3x

Senior member
Dec 31, 2016
957
940
136
No thanks.
I'll keep calling quality-reducing fake resolution what it is: fake resolution.
Just because over-sharpened fake 4K looks sharper than overdone AA at native doesn't make fake 4K better.
For years any quality-reducing trickery would rightly face a big backlash from everyone, but now that Nvidia has figured out what to do with their tensor units on consumer hardware, their marketing department has convinced people that quality-reducing trickery is a good thing?
I get it, people like over-sharpened images, but if it throws out that much fidelity, I don't.
Clearly you haven't seen it in action... At least in Control there's zero over-sharpening going on. It does a wonderful job of keeping the image stable in motion without aliasing. Much better than the default anti-aliasing. Also, in that game it's possible to actually run DLSS at native resolution simply by changing the resolution in the config. Film grain and motion blur don't work when DLSS is on, but that's hardly a big loss...

Also, let's put it this way:
DLSS+RT is much better looking than no DLSS and no RT.
 
  • Like
Reactions: zinfamous

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
Also, let's put it this way:
DLSS+RT is

Like a veggie burger! Not real, fake, a look-alike, etc.... lol

It takes an acquired taste for the product. It tastes great to you, but to others it's unbearable. I see no argument for either side that'll sway the other's opinion.
 

Mopetar

Diamond Member
Jan 31, 2011
7,827
5,971
136
I see no argument for either side that'll sway the other's opinion.

It's not really an argument per se, but AMD just has to have a better implementation of it than Nvidia and then the fanboys on both sides will change their tune. One need look no farther than various arguments about the importance of certain benchmarks, etc. in the CPU space with the most recent Zen 3 CPUs to see this in action.

It's rather fun to watch, particularly when it reverses again and they all have to convince themselves that we've always been at war with East Asia all over again.
 

Ajay

Lifer
Jan 8, 2001
15,410
7,837
136
I really don't care about DLSS or ray tracing. Maybe in a couple of generations, when multi-chip GPUs are perfected, there will be enough horsepower to actually build a game engine in DXR from the ground up.
 

soresu

Platinum Member
Dec 19, 2014
2,650
1,853
136
I really don't care about DLSS or ray tracing. Maybe in a couple of generations, when multi-chip GPUs are perfected, there will be enough horsepower to actually build a game engine in DXR from the ground up.
It's not just about the horsepower but also the software side.

nVidia have pumped a lot of R&D into new real-time RT rendering techniques to increase the viability, going from something as basic as Quake 2 RTX to something a little more modern.

From what I have gathered, on the direct lighting* side at least it will be like the jump from forward to deferred rendering in terms of the possible increase in light sources.

Bear in mind that even with DLSS, nVidia cards still need to denoise every frame to get rid of the Monte Carlo sampling noise that is seen in most kinds of RT rendering when you only use very limited samples per pixel for real-time gaming.

*as opposed to indirect lighting aka global illumination.
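A toy numpy example of why that denoise pass is unavoidable at real-time sample counts: with only a few samples per pixel the Monte Carlo estimate is dominated by variance, and some filter (the crude box blur here is just a stand-in for the real temporal/spatial denoisers) has to smooth it out. All numbers and function names below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

def render(samples_per_pixel: int, size: int = 64) -> np.ndarray:
    """Toy Monte Carlo estimate of a flat lighting value of 0.5 per pixel.

    Each sample is a noisy observation; averaging N samples leaves residual
    noise with a standard deviation proportional to 1/sqrt(N).
    """
    samples = rng.uniform(0.0, 1.0, size=(size, size, samples_per_pixel))
    return samples.mean(axis=-1)

def box_denoise(img: np.ndarray, k: int = 3) -> np.ndarray:
    """Crude stand-in for a real-time denoiser: a k x k box filter."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

for spp in (1, 4, 64):
    noisy = render(spp)
    print(f"{spp:>3} spp: noise std {noisy.std():.3f} -> denoised {box_denoise(noisy).std():.3f}")
```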
 

Gideon

Golden Member
Nov 27, 2007
1,619
3,643
136
Bear in mind that even with DLSS, nVidia cards still need to denoise every frame to get rid of the Monte Carlo sampling noise that is seen in most kinds of RT rendering when you only use very limited samples per pixel for real-time gaming.

*as opposed to indirect lighting aka global illumination.

Actually, the noise and denoising artifacts in games (Control in particular) were the most shocking thing for me about RT when I was finally able to try it out. Compressed YouTube videos don't really show this off very well, but it's very visible in gameplay.
Yet even these extremely minimal (denoised but still noisy) effects plummeted the framerate from ~120 FPS to ~30 FPS. It's pretty clear that these are really early days for RT.

It kinda reminded me of the very first DX 9.0 shader demos and games that I ran on my brand new Radeon 9800 Pro in 2004. While pixel shaders offered a significant improvement in quality in some scenes of 2004-era games (HL2, Far Cry, etc.), most only really used bump maps and stencil shadows with very limited pixel-shading effects (most were still only shading water, like in the DX 8.1 days).

It took quite a few GPU generations after that before games really started to be built with DX 9 in mind from the ground up. I'd say Crysis (in 2007) was the first to really do it well. Obviously by then the 9700, 9800 Pro and even X800 were long forgotten. You needed at least an X1800X at a bare minimum. In reality the 4870 (released a year after the game) was the first AMD card that was able to run it at half-decent framerates with OK settings.

I'd say that for truly immersive and persistent RT we need at least an order of magnitude more performance. We will have new denoising and AA algorithms by then. AI upscaling is also certain not to go anywhere by that time, but it will probably be standardized (across Nvidia, AMD and Intel).
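Putting those rough FPS figures into frame-time terms makes the gap clearer (simple arithmetic on the ~120 and ~30 FPS numbers above):

```python
# Rough frame-time budget math for the ~120 FPS -> ~30 FPS drop described above.
no_rt_ms = 1000 / 120   # ~8.3 ms per frame without RT
rt_ms = 1000 / 30       # ~33.3 ms per frame with RT enabled

rt_cost_ms = rt_ms - no_rt_ms   # ~25 ms of extra GPU work per frame
print(f"RT effects add ~{rt_cost_ms:.1f} ms per frame, "
      f"about {rt_cost_ms / no_rt_ms:.1f}x the cost of the rest of the frame.")

# With an order of magnitude more RT performance, that ~25 ms would shrink to
# ~2.5 ms, which fits comfortably inside a 120 FPS (8.3 ms) frame budget.
print(f"10x faster RT: ~{rt_cost_ms / 10:.1f} ms extra per frame")
```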
 
Last edited:

Mopetar

Diamond Member
Jan 31, 2011
7,827
5,971
136
It's going to take some time for developers to figure out how best to use RT and design around it as well. Realistic lighting is something that a lot of games aim for, but in certain scenes the lighting effects are cheated to get a specific look. If Hollywood could fake lighting as well as games can to achieve an exact result, they'd do it in almost every film.
 

soresu

Platinum Member
Dec 19, 2014
2,650
1,853
136
It's going to take some time for developers to figure out how best to use RT and design around it as well. Realistic lighting is something that a lot of games aim for, but in certain scenes the lighting effects are cheated to get a specific look. If Hollywood could fake lighting as well as games can to achieve an exact result, they'd do it in almost every film.
The time frame for movie VFX and for game production is different.

VFX studios negotiate a fixed rate for x VFX shots and then that number does not change regardless of time overruns or the film studio coming back and changing shots after the fact.

With game production, the decision making is all in-house and can go on for years, versus the months for a film VFX job.

Game studios get paid when the game is finished and selling copies - VFX studios only get paid once, and the consequences of film directors monkeying about with VFX shots can end up costing the VFX studio more than the contract was actually worth.

This actually caused the VFX studio that worked on Life of Pi to go bankrupt just before they accepted an Academy Award for it.
 
Last edited:
  • Wow
Reactions: moinmoin

Mopetar

Diamond Member
Jan 31, 2011
7,827
5,971
136
A bit off topic, but that just sounds like an absolutely idiotic way to contract out work. If I pay someone to do some landscaping, they aren't going to let me keep changing my mind about where I want everything.
 
  • Like
Reactions: GodisanAtheist

moinmoin

Diamond Member
Jun 1, 2017
4,944
7,656
136
A bit off topic, but that just sounds like an absolutely idiotic way to contract out work. If I pay someone to do some landscaping, they aren't going to let me keep changing my mind about where I want everything.
This is the biggest source of frustration and clashes when programmers have to work with visual designers (the kind that have no technical knowledge but offer their services for interface design and the like).
 

Glo.

Diamond Member
Apr 25, 2015
5,700
4,545
136

N22 - trading blows with, and beating, the RTX 3060 Ti.
TBP - even over 200W.
The GPU appears to have a similar or even the same MSRP as the RTX 3060 Ti.

Cut down N22:

There appear to be two cut-down models.
One has 12 GB of VRAM, the second one has 10 GB of VRAM.
For now, the 12 GB model is the one which might be priced above $300.
There is always room for a 10 GB or even 6 GB budget version of this GPU.

So in essence the lineup might look like this:

RX 6700 XT - 40 CUs/192-bit bus/12 GB VRAM - up to $399
RX 6700 - 36 CUs/192-bit bus/12 GB VRAM - above $300 MSRP
RX 6600 XT - 36 CUs/160-bit bus/10 GB VRAM - below $300 MSRP

Doesn't this lineup remind us of anything...?

It mimics exactly the SKU stack that Navi 10 had:
40 CUs/full VRAM, 36 CUs/full VRAM, 36 CUs/cut-down VRAM.

This might suggest that the RX 6500 XT and non-XT are going to be based on Navi 23.

Which is by far the most exciting GPU of them all.
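The VRAM capacities fall straight out of the bus widths: GDDR6 hangs one chip off each 32-bit channel, so a 192-bit bus means six chips (12 GB with 2 GB modules) and a 160-bit bus means five (10 GB). A quick sketch of that relationship; the 16 Gbps memory speed and 2 GB chip size are my assumptions, not part of the leak:

```python
def gddr6_config(bus_width_bits: int, gb_per_chip: int = 2, gbps_per_pin: float = 16.0):
    """Derive channel count, capacity and bandwidth from a GDDR6 bus width.

    Assumes one chip per 32-bit channel; gb_per_chip and gbps_per_pin are
    assumptions, not leaked figures.
    """
    channels = bus_width_bits // 32
    capacity_gb = channels * gb_per_chip
    bandwidth_gbs = bus_width_bits * gbps_per_pin / 8  # bits -> bytes
    return channels, capacity_gb, bandwidth_gbs

for bus in (192, 160, 128):
    ch, cap, bw = gddr6_config(bus)
    print(f"{bus}-bit: {ch} channels, {cap} GB, {bw:.0f} GB/s")
```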
 
Last edited:

uzzi38

Platinum Member
Oct 16, 2019
2,608
5,828
146

N22 - trading blows with, and beating, the RTX 3060 Ti.
TBP - even over 200W.
The GPU appears to have a similar or even the same MSRP as the RTX 3060 Ti.

Cut down N22:

There appear to be two cut-down models.
One has 12 GB of VRAM, the second one has 10 GB of VRAM.
For now, the 12 GB model is the one which might be priced above $300.
There is always room for a 10 GB or even 6 GB budget version of this GPU.

So in essence the lineup might look like this:

RX 6700 XT - 40 CUs/192-bit bus/12 GB VRAM - up to $399
RX 6700 - 36 CUs/192-bit bus/12 GB VRAM - above $300 MSRP
RX 6600 XT - 36 CUs/160-bit bus/10 GB VRAM - below $300 MSRP

Doesn't this lineup remind us of anything...?

It mimics exactly the SKU stack that Navi 10 had:
40 CUs/full VRAM, 36 CUs/full VRAM, 36 CUs/cut-down VRAM.

This might suggest that the RX 6500 XT and non-XT are going to be based on Navi 23.

Which is by far the most exciting GPU of them all.

Not so sure about the middle SKU there; fairly sure it's just 40 CUs + 192-bit bus and 36 CUs + 160-bit bus. That being said, the performance there for the top-end SKU sounds legit.

Also as a totally unrelated side-note, don't expect N24 to ever come to desktop. Laptops only. Desktop product stack is N21, N22 and N23.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,777
7,105
136
Was wondering when info about the 6700 XT was going to start dribbling out.

Competing with the 3060 Ti/2080S is a totally predictable, albeit not unwelcome, result. Again, it puts the 60 CU 6800 in kind of a weird spot, since the gap between the 3060 Ti/3070/6700 XT/6800 will end up being around 15-20% at best.

Also a good opportunity to really dissect the architecture improvements between Navi 1 & 2 thanks to the CU count.

Keep those GPU releases coming, NV and AMD; even if we cannot buy them, most of the fun is just talking about them anyway...
 

Mopetar

Diamond Member
Jan 31, 2011
7,827
5,971
136
I'm sure the MSRP of these cards is as much of a suggestion as it is for the rest of the product stack.

The 3060 Ti is a good card, so although the results aren't unexpected at this point, it's a good result for AMD.

So no matter which card you wind up paying scalper/miner prices for, it's assured to be a good one.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,355
2,845
106
The RX 6700 XT doesn't look so great. Higher TDP, same performance, worse RT performance, more VRAM, same price.

So no N24 for desktop? Will they replace the RX 5500 XT with N23?
 

Glo.

Diamond Member
Apr 25, 2015
5,700
4,545
136
I'm pretty confident, at this point, that the performance target of the 40 CU die being 10-15% faster than the RTX 2080 Super was spot on.

Yeah. If N24 is laptop only, then N23 is going to replace Navi 14.

But I also wouldn't exclude the possibility that a full-die SKU of N23 is going to be used for the RX 6600 non-XT.

Because AMD simply "can" do that. A 32 CU/128-bit GPU will have the same performance as the RTX 3060 while being much, much more efficient.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,355
2,845
106
Because AMD simply "can" do that. A 32 CU/128-bit GPU will have the same performance as the RTX 3060 while being much, much more efficient.
Either N23 won't be on par with the RTX 3060, or it won't be much, much more efficient, if the above info about N22 is true.
For N22 to be on par with the RTX 3060 Ti it will need a higher TDP, and that doesn't bode well for N23: either they will clock it very high, which will hurt efficiency, or be more conservative, which will hurt performance.

BTW, what is "much, much more efficient" to you?
The RTX 3060 has a 170W TBP and Navi 14 has 130W, so what TBP should N23 have, in your opinion?
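For what it's worth, "much, much more efficient" at equal performance boils down to a TBP ratio. A quick sketch using the 170W figure above and some placeholder efficiency advantages (the percentages are hypothetical, not claims about N23):

```python
# Perf/W at equal performance reduces to a TBP ratio. The RTX 3060's 170 W TBP
# is from the post above; the efficiency advantages are hypothetical placeholders.
RTX_3060_TBP_W = 170

for advantage in (0.10, 0.30, 0.50):   # 10%, 30%, 50% better perf/W
    required_tbp = RTX_3060_TBP_W / (1 + advantage)
    print(f"{advantage:.0%} better perf/W at 3060 performance -> ~{required_tbp:.0f} W TBP")
```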
 
Last edited:

CP5670

Diamond Member
Jun 24, 2004
5,510
588
126
Clearly you haven't seen it in action... At least in Control there's zero over-sharpening going on. It does a wonderful job of keeping the image stable in motion without aliasing. Much better than the default anti-aliasing. Also, in that game it's possible to actually run DLSS at native resolution simply by changing the resolution in the config. Film grain and motion blur don't work when DLSS is on, but that's hardly a big loss...

Also, let's put it this way:
DLSS+RT is much better looking than no DLSS and no RT.

A few games use it very well, but others don't. I played through Control and it makes a big difference there. DLSS is needed with RT; even with a 3090 I had to play at 1080p to get 80+ fps consistently, with 2560x1440 running at 45-60 fps. The image looks soft and has sampling noise as mentioned earlier, but it's much nicer than playing at 4K without RT. I didn't care about RT/DLSS when buying the card, but now I think it's an important feature even today.