Question Speculation: RDNA2 + CDNA Architectures thread


uzzi38

Platinum Member
Oct 16, 2019
2,705
6,427
146
All die sizes are within 5mm^2. The poster here has been right about some things in the past afaik, and to his credit was the first to say 505mm^2 for Navi21, which other people have since backed up. Even so, take the following with a pinch of salt.

Navi21 - 505mm^2

Navi22 - 340mm^2

Navi23 - 240mm^2

Source is the following post: https://www.ptt.cc/bbs/PC_Shopping/M.1588075782.A.C1E.html
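For a rough sense of what those die sizes would mean for supply and pricing, here's a quick back-of-the-envelope using the standard gross-dies-per-wafer approximation (a sketch only: it ignores yield/defect density, and the formula is the usual textbook estimate, not anything from the leak):

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic approximation: wafer area over die area, minus edge losses.
    dies ~= pi*(d/2)^2 / A  -  pi*d / sqrt(2*A)
    """
    d, a = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / a - math.pi * d / math.sqrt(2 * a))

for name, area in [("Navi21", 505), ("Navi22", 340), ("Navi23", 240)]:
    print(f"{name}: ~{gross_dies_per_wafer(area)} gross dies per 300mm wafer")
# Navi21: ~110, Navi22: ~171, Navi23: ~251 (before yield)
```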
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
The 8K marketing BS is very suspicious to me. The number of people with 8K displays of any kind is very small, and most of them are not gamers. The 3090 8K gaming stuff makes me think Nvidia knows they are going to be outperformed by a cheaper card at 4K. They can then claim superiority at 8K, even though it's completely irrelevant. This happens all the time in the GPU market: people read the reviews of the top-end cards and use that in a purchasing decision, even though they are buying a low-to-mid-range card.

I really wouldn't fixate on that! Marketing departments produce nonsense at the best of times. Ask one to justify something like the 3090 and, well!

It's a snappy catchphrase. Do you want them to say 'it's got a crazy premium (over the 3080) because we've found out that people will pay for this sort of thing'?
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,340
5,464
136
I agree with basically everything you said, but I'm not sure about the "DSR never caught on" comment. There is nothing to "catch on". It's an SSAA injector built into the driver. If you are playing older games and have a buttload of performance sitting around "wasted", you might as well downsample 4K to 1080p and get buttery delicious image quality in some normally craptastic-looking old game.

You know all those super nice looking marketing shots that your game never matches when you actually run it? Those are supersampled images (a much larger image crammed into a smaller viewing area).

I use DSR all the time when I play older games.

AMD has the same thing, only they call it VSR.

If you're playing an older game and your GPU is sitting at 50% utilization or something, crank up that SSAA and at least get a much better looking game for the same FPS. Of course if you're playing some newfangled hotness and struggling to maintain even 60FPS then DSR/VSR is about the last thing you want to use.

True. DSR is not something that needs in-game support.

But it would probably be better if there were in-game support. Some games have fixed-resolution 2D overlay menus: run at a higher resolution and the menus take up less screen space, then downscale and you're left with a small, poorly rendered UI with fewer pixels than before.
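For anyone wondering what "downsample 4K to 1080p" actually does, here's a minimal sketch of the core idea in Python/NumPy. It's a plain 2x2 box filter, so every display pixel is the average of four rendered samples; the drivers use fancier filters (DSR applies a Gaussian, for instance), so treat this as illustrative only:

```python
import numpy as np

def box_downsample_2x(img: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of a supersampled frame into one output pixel.
    img: (H, W, 3) float array rendered at 2x the display resolution per axis.
    """
    h, w, c = img.shape
    assert h % 2 == 0 and w % 2 == 0
    # Split into 2x2 blocks and average them: 4 shaded samples per display pixel.
    return img.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# Toy usage: a "rendered" 4K frame down to 1080p.
frame_4k = np.random.rand(2160, 3840, 3).astype(np.float32)
print(box_downsample_2x(frame_4k).shape)  # (1080, 1920, 3)
```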
 

PhoBoChai

Member
Oct 10, 2017
119
389
106
When you compare something with TAA, that something is always better than TAA. Here that something is DLSS.
If I'm not wrong, the games that support DLSS only have TAA or a poor implementation of some other AA. That way DLSS will automatically look better than native resolution.

DLSS is only implemented in games with TAA because it is essentially a more advanced TAA with a motion-vector stage for image recombination; this part runs NV's ML algo to determine how best to reconstruct the new image from the prior image & motion-vector samples. This processing of motion vectors is the reason for the performance penalty, typically 1.4 to 2ms depending on the available GPU power.

If devs use another form of AA, you can't have DLSS. Sadly, TAA is usually terrible so that's why so many gamers get fooled into believing DLSS is some magic "looks better than native".
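To make that motion-vector stage concrete, here's a bare-bones sketch of temporal reprojection, the building block DLSS and every TAA share: warp last frame by the per-pixel motion vectors, then blend with the new frame. How the real thing weighs and rejects history samples is exactly the part Nvidia's network decides; this toy version replaces it with a fixed blend factor:

```python
import numpy as np

def reproject_and_blend(prev_frame, cur_frame, motion_vectors, alpha=0.1):
    """Warp the previous frame along per-pixel motion, then blend with the current one.

    prev_frame, cur_frame: (H, W, 3) float arrays.
    motion_vectors: (H, W, 2) screen-space (dy, dx) motion in pixels.
    alpha: weight of the current frame; the accumulated history dominates.
    """
    h, w, _ = cur_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Fetch each pixel from where it came from last frame (nearest-neighbour for brevity).
    src_y = np.clip(np.rint(ys - motion_vectors[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.rint(xs - motion_vectors[..., 1]).astype(int), 0, w - 1)
    history = prev_frame[src_y, src_x]
    # The exponential blend is what accumulates extra effective samples over time.
    return alpha * cur_frame + (1 - alpha) * history

# Toy usage: with zero motion the result converges toward a temporal average.
prev = np.zeros((4, 4, 3)); cur = np.ones((4, 4, 3)); mv = np.zeros((4, 4, 2))
print(reproject_and_blend(prev, cur, mv)[0, 0])  # [0.1 0.1 0.1]
```

Effects rendered without motion vectors (the particle/compute cases mentioned a couple of posts down) break that source lookup, and the blended history smears: that's the artifact case.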
 

PhoBoChai

Member
Oct 10, 2017
119
389
106
Then why not call it what it is? 1440p upscaled to 4K with DLSS should just be named 1440p Nvidia AAx4, or something like that. Why call it another res? (8K gaming cards and whatnot.) They could market-hype the redacted out of it, or be honest about a really great AA method. There lies all the difference in the world.

It's not just AA.

It's quite novel and has higher potential because it samples previous images AND their motion vectors, which are stored in the frame buffers; hence the perf impact associated with it isn't minimal like regular shader AA or plain upscaling. The time cost makes it less useful when you already have high FPS, as the 1.4 to 2ms can eat into any potential gains from rendering at a lower internal res.

NV runs their AI to determine the best algo for each game to produce the best temporal reconstruction. But it's not foolproof: if devs use particle or compute-based effects without motion vectors, DLSS gets confused and produces artifacts.

It's good, and it can get better. AMD needs their own ML equivalent, not just because NV marketing is awesome at getting mindshare, but because these techniques really do have the potential to let weaker GPUs game at 4K with image quality comparable to native 4K, once all the artifacts are solved.
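To put numbers on why that fixed cost "can eat into any potential gains": a toy frame-time model (my own illustrative assumptions: shading scales linearly with pixel count at a 0.25 resolution scale, a 1.8ms reconstruction cost, and no fixed CPU/geometry cost):

```python
def dlss_frametime_ms(native_ms: float, dlss_cost_ms: float = 1.8,
                      res_scale: float = 0.25) -> float:
    # Shade a quarter of the pixels, then pay the fixed reconstruction cost.
    return native_ms * res_scale + dlss_cost_ms

for fps in (30, 60, 144, 240):
    native_ms = 1000 / fps
    up_ms = dlss_frametime_ms(native_ms)
    print(f"{fps:3d} fps native -> {1000 / up_ms:5.0f} fps upscaled "
          f"(speedup {native_ms / up_ms:.2f}x)")
# 30 fps ->  99 fps (3.29x) ... 240 fps -> 352 fps (1.47x)
```

The slower the native frame, the more the quarter-res shading saves relative to the fixed ~2ms, which is why the technique pays off most on weaker GPUs at high output resolutions.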
 

kurosaki

Senior member
Feb 7, 2019
258
250
86
Ever heard of FSAA or supersampling? DLSS is more or less the same thing, just marketed in reverse, with sprinkles of AI fairy dust on top. And could you believe the images coming out of the process look better than the native resolution?! Whooooa!

"Full-scene anti-aliasing by super-sampling usually means that each full frame is rendered at double (2x) or quadruple (4x) the display resolution, and then down-sampled to match the display resolution. Thus, a 2x FSAA would render 4 super-sampled pixels for each single pixel of each frame. Rendering at larger resolutions will produce better results; however, more processor power is needed, which can degrade performance and frame rate. "
 

Panino Manino

Senior member
Jan 28, 2017
869
1,119
136
About the clocks, I had a thought about this, and excuse me if it's obvious to everyone.
I believe that many leaks are "orchestrated", that these companies let the information leak to the public on purpose. This recent lack of leaks from AMD is intentional, letting them work their best until the last minute instead of creating expectations that they'd be forced to meet and possibly fail to. That explains the situation with the clocks, right? First they want to build inventory, waiting until the last week to fulfill the promise of launching before the new consoles, and to have time to test and bin the SKUs with the highest clocks possible to better compete with Nvidia.
Maybe they'll even create some card with a bigger-than-normal cooler as a halo product with insane clocks.

I just hope that TSMC delivers.
But with that Zen 3 leak, things are looking promising...

Now, about image upscaling: it may be a possibility, but the new consoles are launching without such a solution. Bad news?
I still believe that RTG will announce an in-house solution in the works, coming in a future update.
 

blckgrffn

Diamond Member
May 1, 2003
9,301
3,442
136
www.teamjuchems.com
Hype related: I just purchased a double-digit number of AMD shares in my 401k plan. Most industry analysts think I am wrong, but if Zen 3 is anything like Zen 2 and datacenter demand grows, it seems like there is a long runway for increasing revenues (at Intel's expense, of course). RDNA2 might be shiny too, but mostly it matters that both of these things are happening in October.

I can wish that I had committed to this last week and gotten more shares for less money, but as with my Tesla purchase (and sale - in @ $130 and out at $300 when I was trying to get funds in place to buy lower), just getting in can be the important factor.

Anyway, I feel like this will go well enough for a bounce. That and crazy monetary policies inflating stocks that show any promise at all, ha.

I need the markets to feel the hype now too! :D
 

blckgrffn

Diamond Member
May 1, 2003
9,301
3,442
136
www.teamjuchems.com
It's how forums work - if you repeat the same thing enough times (DLSS isn't important) then it somehow becomes truth; since DLSS is actually quite important, it takes several pages of echoes for that to sink in.

It's an Nvidia feature. Let's discuss it in the Nvidia thread. Or make a new thread. It is only tangentially relevant to the RDNA2/CDNA leaks (or lack thereof) we are getting.

The moment AMD drops some feature for RDNA2 that sounds like DLSS by another name, by all means, let's rehash pages of people not changing opinions.
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
Putting together a build for a friend and he's wanting to spend about $400 on a GPU, and it's killing me shopping around. Yes, the 5700 XT is the undisputed king at that price currently, but I'm thinking AMD drastically shakes things up at that price level when the 6000 series drops in a month.

The Navi 10 replacement, the 40CU part rumored to be sitting at around 2.5GHz, will probably slot into the $300-400 range, and it may be even smaller than Navi 10. AMD has a lot of leeway in pricing given how many chips they are getting per wafer, especially on TSMC's mature 7nm node. Could we see a repeat of the 4850/4870 and see a stunner released at $300? Imagine a card at that price level that comes within spitting distance of the 2080 Super.
 

soresu

Diamond Member
Dec 19, 2014
3,230
2,515
136
Oy vey - it seems that Intel are so into pushing OneAPI that they are even willing to invest in getting it working on both Nvidia and AMD GPUs.

That bodes extremely well for open rendering if Intel push their own competitor to Nvidia's OptiX GPU path/ray tracing renderer toolkit, built on OneAPI instead of CUDA.

Given their historic investment in Embree and their increased GPU focus it doesn't seem like such a stretch.
 

Mopetar

Diamond Member
Jan 31, 2011
8,114
6,770
136
Putting together a build for a friend and he's wanting to spend about $400 on a GPU, and it's killing me shopping around. Yes, the 5700 XT is the undisputed king at that price currently, but I'm thinking AMD drastically shakes things up at that price level when the 6000 series drops in a month.

Nvidia is supposed to be launching some of their cards in that range towards the end of October or in November, so there should be plenty of good competition to keep prices in check. That the 3080 performs so close to the 3090 suggests that the 3070 and 3060 cards should have excellent performance as well, since they won't hit the same issues that hold the 3090 back.
 

blckgrffn

Diamond Member
May 1, 2003
9,301
3,442
136
www.teamjuchems.com
Putting together a build for a friend and he's wanting to spend about $400 on a GPU, and it's killing me shopping around. Yes, the 5700 XT is the undisputed king at that price currently, but I'm thinking AMD drastically shakes things up at that price level when the 6000 series drops in a month.

The Navi 10 replacement, the 40CU part rumored to be sitting at around 2.5GHz, will probably slot into the $300-400 range, and it may be even smaller than Navi 10. AMD has a lot of leeway in pricing given how many chips they are getting per wafer, especially on TSMC's mature 7nm node. Could we see a repeat of the 4850/4870 and see a stunner released at $300? Imagine a card at that price level that comes within spitting distance of the 2080 Super.

It's definitely worth waiting the month, and you can always wait for something better, but this is a big generational jump we are about to get. It used to feel like that happened nearly every year, but this could be a 2-3 year point where you spend the money now and enjoy it for that long. I mean, the timing is really great.

If he could save up another $100, or find a really nice monitor to go with the new PC in the meantime, even better. Zen 3 and RDNA 2 pressuring Nvidia (fingers crossed) is, like, right now in the scheme of things.
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
Nvidia is supposed to be launching some of their cards in that range towards the end of October or in November, so there should be plenty of good competition to keep prices in check. That the 3080 performs so close to the 3090 suggests that the 3070 and 3060 cards should have excellent performance as well, since they won't hit the same issues that hold the 3090 back.
I expect the 3070 cards to be around $550, since NVIDIA seems to have production issues at Samsung. Hopefully AMD doesn't repeat NVIDIA's blunder of a launch and has good inventory available at launch.

Having said that, I will be keeping an eye on what the 3060 ends up looking like versus the ~6600 XT or whatever it's named. Imagine spending $400 on a 5700 XT only to see cards 50-75% faster at the same price drop a few weeks later...
 

Mopetar

Diamond Member
Jan 31, 2011
8,114
6,770
136
I expect the 3070 cards to be around $550, since NVIDIA seems to have production issues at Samsung. Hopefully AMD doesn't repeat NVIDIA's blunder of a launch and has good inventory available at launch.

Having said that, I will be keeping an eye on what the 3060 ends up looking like versus the ~6600 XT or whatever it's named. Imagine spending $400 on a 5700 XT only to see cards 50-75% faster at the same price drop a few weeks later...

Wasn't the price for the 3070 already announced as $500? Sure, there will be some third-party cards that go for more, but if the 3070 is already at the far end of what it can achieve out of the gate, there isn't a lot to gain from the premium AIB models.

I'm definitely getting a new rig this fall though. For one reason or another I've been holding off but it seems like all the stars are aligning with good new hardware options and good value for money to boot.
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
Wasn't the price for the 3070 already announced as $500? Sure, there will be some third-party cards that go for more, but if the 3070 is already at the far end of what it can achieve out of the gate, there isn't a lot to gain from the premium AIB models.

I'm definitely getting a new rig this fall though. For one reason or another I've been holding off but it seems like all the stars are aligning with good new hardware options and good value for money to boot.

Yes, the 3070 was announced at $500, but scarcity might drive up real retail prices of AIB cards. I'm not sure there will be an FE 3070, and if there is, I hope supply issues are sorted out by then.

This fall is shaping up to be an AMAZING time to build a new rig. When was the last time we saw potentially game-changing CPUs and GPUs released by all the major players in such a short time span? Myself, I simply plan to upgrade my 1070 Ti to whatever is available in November for $500 or less, and my Ryzen 2700 to a potential 5700/5800 at $350.
 

Guru

Senior member
May 5, 2017
830
361
106
The RX 6900 XT seems like a beast: 80 CUs, 96 ROPs, a 2.2GHz boost clock, and 16GB of 16Gbps GDDR6 memory on a 256-bit interface.

If it's 70% faster than the RX 5700 XT, that would put it at RTX 3080 levels of performance, while consuming less power, being built on a smaller node, and leaving AMD room to price very competitively.

If it's as fast as the RTX 3080, though, I expect it to be in the same price range.
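For what it's worth, the raw-compute math on those rumored specs supports that ballpark (a naive sketch; the 64 lanes per CU and 2 ops per clock for FMA are standard RDNA assumptions, and raw TFLOPS never translate 1:1 into frame rates):

```python
def fp32_tflops(cus: int, clock_ghz: float, lanes_per_cu: int = 64) -> float:
    # 2 FLOPs per lane per clock (fused multiply-add).
    return cus * lanes_per_cu * 2 * clock_ghz / 1000

big_navi = fp32_tflops(80, 2.2)    # rumored 80CU @ 2.2GHz -> ~22.5 TFLOPS
navi10   = fp32_tflops(40, 1.905)  # RX 5700 XT boost      -> ~9.8 TFLOPS
print(f"{big_navi:.1f} vs {navi10:.1f} TFLOPS = {big_navi / navi10:.2f}x raw compute")
```

Roughly 2.3x the raw compute shaking out to +70% real-world performance would be unremarkable scaling, so the claim is at least internally consistent.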
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
The RX 6900 XT seems like a beast: 80 CUs, 96 ROPs, a 2.2GHz boost clock, and 16GB of 16Gbps GDDR6 memory on a 256-bit interface.

If it's 70% faster than the RX 5700 XT, that would put it at RTX 3080 levels of performance, while consuming less power, being built on a smaller node, and leaving AMD room to price very competitively.

If it's as fast as the RTX 3080, though, I expect it to be in the same price range.

The top 80CU part surely must come with an HBM variant. If a much lower-clocked 40CU Navi was paired with a 256-bit bus last gen, I can't imagine how a much faster-clocked 80CU chip could make do with the same bus width simply by moving from GDDR6 to GDDR6X.
 

GoodRevrnd

Diamond Member
Dec 27, 2001
6,801
581
126
The top 80CU part surely must come with an HBM variant. If a much lower-clocked 40CU Navi was paired with a 256-bit bus last gen, I can't imagine how a much faster-clocked 80CU chip could make do with the same bus width simply by moving from GDDR6 to GDDR6X.
I know it's just a dumb render in Fortnite, but it seems like everyone forgot that the rear heatsink anchor layout was highly suggestive of HBM.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
No way the 80CU part has a 256-bit memory interface; we have been through this. That is NOT enough bandwidth for a GPU of that size. It works fine on the 40CU 5700 XT, but not on a chip that is double the size. 16GB means it's either a giant 512-bit interface or, as most of us suspect, HBM2e.
 

Karnak

Senior member
Jan 5, 2017
399
767
136
No way the 80CU part has a 256-bit memory interface; we have been through this. That is NOT enough bandwidth for a GPU of that size. It works fine on the 40CU 5700 XT, but not on a chip that is double the size. 16GB means it's either a giant 512-bit interface or, as most of us suspect, HBM2e.
Or it's actually a 256-bit interface, but with the rumored cache.
Or it's actually a 256-bit interface without the rumored cache, but with one extra stack of HBM2e. Like 14.4Gbps GDDR6 plus 3.6Gbps HBM2e (8GB + 8GB for a total of 16GB, and ~460GB/s + ~460GB/s for a total of ~920GB/s of bandwidth).

Who knows. :laughing: I mean, less than a month till the official announcement and overall we still know close to nothing, except for a few things thanks to macOS. According to AMD it will be a halo product, so.
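For reference, the arithmetic behind all of these options is just bus width times per-pin data rate divided by eight. A quick sketch (the configurations are the thread's speculation, not confirmed specs):

```python
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s for a bus of bus_bits pins at gbps_per_pin each."""
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gb_s(256, 16.0))   # 512.0  GB/s - 256-bit GDDR6 @ 16Gbps
print(bandwidth_gb_s(512, 16.0))   # 1024.0 GB/s - 512-bit GDDR6 @ 16Gbps
print(bandwidth_gb_s(1024, 3.6))   # 460.8  GB/s - one 1024-bit HBM2e stack @ 3.6Gbps
print(bandwidth_gb_s(256, 14.4) + bandwidth_gb_s(1024, 3.6))  # 921.6 GB/s - the mixed option above
```

The 512 GB/s for plain 256-bit GDDR6 versus roughly double for either the 512-bit or HBM2e routes is exactly the gap the rumored cache would have to paper over.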
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Or it's actually a 256-bit interface, but with the rumored cache.
Or it's actually a 256-bit interface without the rumored cache, but with one extra stack of HBM2e. Like 14.4Gbps GDDR6 plus 3.6Gbps HBM2e (8GB + 8GB for a total of 16GB, and ~460GB/s + ~460GB/s for a total of ~920GB/s of bandwidth).

Who knows. :laughing: I mean, less than a month till the official announcement and overall we still know close to nothing, except for a few things thanks to macOS. According to AMD it will be a halo product, so.

If there are two stacks of HBM2e (and there have to be for 16GB of RAM) then it would have a 2048-bit interface, as each stack is 1024 bits wide.

EDIT: Fixed the bus widths above; I originally wrote 512-bit and 256 bits.
 

Karnak

Senior member
Jan 5, 2017
399
767
136
One stack of HBM2(e) is 1024-bit, though IIRC there is some kind of "low cost option" from Samsung at 512-bit.

But I was specifically thinking of mixing GDDR6 and HBM2e, with 8GB of each for a total of 16GB. Prob won't happen, but I don't know if some sort of huge cache is any more realistic at this point than a mix of GDDR6 and HBM2e. 256-bit without anything extra won't happen on an 80CU chip. Pretty sure about this.