Speculation: RDNA3 + CDNA2 Architectures Thread

Page 93

Thunder 57

Platinum Member
Aug 19, 2007
2,993
4,570
136
N33 is mainly for 1080p, but it shouldn't be too bad for 1440p, although 8GB VRAM is not very future-proof.
TBP could be 150W, but I am pretty skeptical about the price.

How sad is it that 8GB is no longer "future proof"? I got an RX 5700 over the RX 5600 XT because I wanted the extra 2GB. Otherwise as far as performance goes they weren't all that far apart. I have had 16GB of memory in my primary computer from 2012 until now, 10 years later!

I certainly overdid it in 2012, but memory was cheap at the time. It was very "future proof". At the time I think my GPU had 2GB. It is a bit surprising how much VRAM creep there has been lately.

The first system I built that wasn't hand-me-down parts had a GeForce 3 with 64MB, while the main memory was 512MB. That is a 1:8 ratio. That 16GB, which was totally overkill, was also 1:8. Then it became 1:2, which is where I am at now. And now we are looking at what, 10-12GB for video cards to be future proof?
 

Thunder 57

Platinum Member
Aug 19, 2007
2,993
4,570
136
The first two were released in the mining boom. Their new cards will be releasing when GPU mining is dead and the economy is in the trash. I hope gamers don't let AMD normalize mining-era MSRPs.

Ha! What do you call NVIDIA going from a $700 MSRP for the RTX 2080 and 3080 to a $900 card that they "unlaunched" and a $1,200 card?
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,523
3,037
136
The first two were released in the mining boom. Their new cards will be releasing when GPU mining is dead and the economy is in the trash. I hope gamers don't let AMD normalize mining-era MSRPs.
Those cards were not sold at MSRP during the mining boom, but for a lot more. You could even say those MSRPs would be "pretty good" for N33.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,523
3,037
136
How sad is it that 8GB is no longer "future proof"? I got an RX 5700 over the RX 5600 XT because I wanted the extra 2GB. Otherwise as far as performance goes they weren't all that far apart. I have had 16GB of memory in my primary computer from 2012 until now, 10 years later!

I certainly overdid it in 2012, but memory was cheap at the time. It was very "future proof". At the time I think my GPU had 2GB. It is a bit surprising how much VRAM creep there has been lately.

The first system I built that wasn't hand-me-down parts had a GeForce 3 with 64MB, while the main memory was 512MB. That is a 1:8 ratio. That 16GB, which was totally overkill, was also 1:8. Then it became 1:2, which is where I am at now. And now we are looking at what, 10-12GB for video cards to be future proof?
The problem is that 16Gbit (2GB) memory chips are the largest ones available, and GDDR6 is not exactly cheap.
If you want more than 8GB of VRAM in an N33 card, for example, then you need to either widen the memory bus from 128-bit to 160-192-bit for 10-12GB, or use clamshell mode for 16GB.
Either of these options would increase the production cost.
Unless manufacturers increase the capacity per memory chip, we can forget about more VRAM on the less expensive cards unless the price goes up.
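
A back-of-the-envelope sketch of that arithmetic (assuming 16Gbit = 2GB GDDR6 chips, one per 32-bit channel, two per channel in clamshell mode):

```python
# VRAM capacity from bus width, assuming 16Gbit (2GB) GDDR6 chips,
# one per 32-bit channel (two per channel in clamshell mode).
def vram_gb(bus_width_bits: int, clamshell: bool = False) -> int:
    chips = bus_width_bits // 32      # one chip per 32-bit channel
    if clamshell:
        chips *= 2                    # clamshell: two chips share a channel
    return chips * 2                  # 2GB per 16Gbit chip

for bus in (128, 160, 192):
    print(f"{bus}-bit: {vram_gb(bus)}GB normal, {vram_gb(bus, clamshell=True)}GB clamshell")
# 128-bit: 8GB normal, 16GB clamshell
# 160-bit: 10GB normal, 20GB clamshell
# 192-bit: 12GB normal, 24GB clamshell
```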
 

Saylick

Diamond Member
Sep 10, 2012
3,532
7,859
136
MLID with a little bit more hedging right before the RDNA 3 announcement. He says no one expects full-fat N31 to beat the 4090 in 4K raster, so we'll just have to wait and see. I feel like those expectations aren't seeing the full picture if we are to believe his statement that AIBs have not been fully briefed or given unlocked cards. I mean, an expectation is just that: an educated guess. If they had the real performance numbers, it would no longer be an expectation. We already know how the 4090 performs.

(Two screenshots of the video attached.)
 

psolord

Platinum Member
Sep 16, 2009
2,094
1,234
136
Ah dang, how did I miss this one when reading the Twitter thread. After 10 years of using a hardware scheduler (AMD transitioned to it with GCN), it looks like software-based scheduling is back on the menu for RDNA 3, likely for the sake of increasing perf/W and perf/area. Of course, this approach requires more software development work on the compiler, but perhaps it will pan out well for AMD given they have far more resources today than before. Also, the side bonus is that compilers can be optimized over time, so FineWine is more of a possibility.


Does that mean that AMD will lose its DX12 advantage over Nvidia? Especially when using weaker CPUs?
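
For illustration of what compiler-side "software scheduling" means in general: the compiler, not the hardware, orders instructions using known latencies. A minimal generic list-scheduler sketch, with an invented four-instruction DAG and made-up latencies (not AMD's actual compiler):

```python
# Minimal list-scheduling sketch: the compiler picks instruction order
# statically. The DAG and latencies below are invented for illustration.
from dataclasses import dataclass, field

@dataclass(eq=False)  # identity-based hashing so instrs can be dict keys
class Instr:
    name: str
    latency: int
    deps: list = field(default_factory=list)  # results we must wait for

def list_schedule(instrs):
    """Each cycle, issue a ready instruction, preferring long-latency
    ones so their results are in flight as early as possible."""
    done_at = {}      # instr -> cycle its result becomes available
    schedule, cycle, pending = [], 0, list(instrs)
    while pending:
        ready = [i for i in pending
                 if all(d in done_at and done_at[d] <= cycle for d in i.deps)]
        if not ready:
            cycle += 1                # statically known stall: no ready work
            continue
        pick = max(ready, key=lambda i: i.latency)
        schedule.append((cycle, pick.name))
        done_at[pick] = cycle + pick.latency
        pending.remove(pick)
        cycle += 1
    return schedule

load = Instr("v_load", 4)             # long-latency memory load
mul  = Instr("v_mul", 1, [load])
add  = Instr("v_add", 1, [mul])
ind  = Instr("v_indep", 1)            # independent work to hide latency
print(list_schedule([load, ind, mul, add]))
# [(0, 'v_load'), (1, 'v_indep'), (4, 'v_mul'), (5, 'v_add')]
```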
 
  • Like
Reactions: Gideon

psolord

Platinum Member
Sep 16, 2009
2,094
1,234
136
@Saylick

A lot of 'fine wine' was AMD simply lacking the development capacity to use their hardware fully at release and only unlocking the actual power of the card much later. So less fine wine is generally better.

Oh great, that means that Intel will be able to talk about fine whisky in a few years regarding Arc?

I bet they will call it something stupid like 24karat malt drivers, or something.
 

leoneazzurro

Golden Member
Jul 26, 2016
1,052
1,716
136
Isn't AMD doing just the same thing Nvidia did, going to launch two 7900 cards? Why is there no 7800 XT, as there was with Navi 21, where the full-fat 6900 XT and cut-down 6800 XT versions of the flagship chip were launched first?
For me it's a sign of not being willing to be aggressive at all price-wise for the 20CU board.

Because the 79XX cards are N31 while the 78XX cards are N32, according to most "reliable" leakers, so they are based on a different GPU. And N32 is rumored to not be ready yet, probably being targeted at a Q1 2023 release.
 

PJVol

Senior member
May 25, 2020
708
632
136
Because the 79XX cards are N31 while the 78XX cards are N32, according to most "reliable" leakers, so they are based on a different GPU. And N32 is rumored to not be ready yet, probably being targeted at a Q1 2023 release.
Thanks, though all you said actually states the "effect" but doesn't answer the "why". :)
Perhaps I wasn't clear enough, sorry.
 

JayMX

Member
Oct 18, 2022
31
73
51
Very weak video by AdoredTV. He just talks about the Skyjuice RDNA3 post on angstronomics.com that was posted here in August.

Actually, you are right that the information he provided was nothing new. Yet the way he presented it was brilliant - I really love his videos. The most important thing IMHO was his conclusion that we are close to a 'Zen era' in GPUs (like the revolution the Zen architecture brought to CPUs), where AMD's design is so promising in terms of cost and scaling while Nvidia can't go any further with their monolithic approach.
 
  • Like
Reactions: Kaluan

H T C

Senior member
Nov 7, 2018
588
427
136
Isn't AMD doing just the same thing Nvidia did, going to launch two 7900 cards? Why is there no 7800 XT, as there was with Navi 21, where the full-fat 6900 XT and cut-down 6800 XT versions of the flagship chip were launched first?

Because, though to a smaller extent, AMD too has A LOT of cards from the current generation to sell due to the crypto crash.

Supposedly, AMD will launch their new flagship cards and lower the prices of the current generation of cards until "their current stock is sufficiently reduced", at which point they'll launch the remaining RDNA3 lineup.
 

leoneazzurro

Golden Member
Jul 26, 2016
1,052
1,716
136
Thanks, though all you said actually states the "effect" but doesn't answer the "why". :)
Perhaps I wasn't clear enough, sorry.

Well, the "why" is that AMD doesn't launch more than one chip at a time because of different manufacturing/design schedules. The same thing happened with the 6000 series: first N21 was launched, then N22, N23 and N24 followed at different times.
The difference this time seems to be that N31 covers only the 79XX series, while N21 covered both the 69XX and 68XX series.
 

H T C

Senior member
Nov 7, 2018
588
427
136
I thought they did that already.

This depends entirely on how much they want to try to undercut nVidia price-wise: whether they want a similar price for X% of nVidia's performance, or a Y% LOWER price for X% of nVidia's performance.

If it's the former, you're likely right.

Keep in mind that nVidia is supposed to have a 4090 Ti "ready to launch" to counter AMD's flagship, should it have higher than RTX 4090 performance.
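
To put toy numbers on the former/latter distinction (all values hypothetical): at a similar price, X% of the performance just means X% of the perf/$; only an actual price cut moves the value needle.

```python
# Toy perf-per-dollar comparison; all numbers are hypothetical.
def rel_perf_per_dollar(perf_pct: float, price_pct: float) -> float:
    """AMD's performance and price expressed as % of the Nvidia card."""
    return (perf_pct / 100) / (price_pct / 100)

print(rel_perf_per_dollar(90, 100))  # similar price, 90% perf -> 0.90x perf/$
print(rel_perf_per_dollar(90, 75))   # 25% lower price, 90% perf -> 1.20x perf/$
```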
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,486
2,023
136
Precisely. It was just so immediately transparent to me that Skyjuice was an order of magnitude better than MLID, RGT (laughable), AdoredTV, pretty much all of them. Literally everyone who does leaks on YouTube spends an inordinate amount of time hedging whatever they leak. Meanwhile, Skyjuice's leaks are succinct, ad-free, and credible. It was just a breath of fresh air.

Note that the monolithic Sapphire Rapids article is very different from their early leaks. Unlike the RDNA3 or the PROM21 article, it starts with "Now that images have surfaced publicly, it's time for us to share what we know", and then proceeds to do a bunch of analysis, all of which can be derived from the picture of the wafer that was out there before the article was released. None of the analysis is bullshit, but that article did not need a source, just good understanding and that photo. The A16 article was similar, in that it was clearly well founded but could be entirely based on public knowledge. The Oberon Plus article is likewise based on hardware that is already out there and can be examined.

I think Skyjuice is generally knowledgeable and well informed, but has exactly one real source, and it's either at AMD or at an AIB/MB maker that works with AMD. They post about other stuff too, so it wouldn't be that obvious.
 

linkgoron

Platinum Member
Mar 9, 2005
2,409
979
136
This depends entirely on how much they want to try to undercut nVidia price-wise: whether they want a similar price for X% of nVidia's performance, or a Y% LOWER price for X% of nVidia's performance.

If it's the former, you're likely right.

Keep in mind that nVidia is supposed to have a 4090 Ti "ready to launch" to counter AMD's flagship, should it have higher than RTX 4090 performance.

If AMD can beat the 4090 (which IMO they should be able to) with decent pricing, Nvidia releasing a 4090 Ti so quickly after the 4090 launch would probably cause some negative feedback from some of their customers, I assume, and maybe some negative media coverage as well. Especially if it's released alongside a price reduction for the current 4080 and 4090 cards.
 
  • Like
Reactions: Tlh97 and Kaluan

H T C

Senior member
Nov 7, 2018
588
427
136
If AMD can beat the 4090 (which IMO they should be able to) with decent pricing, Nvidia releasing a 4090 Ti so quickly after the 4090 launch would probably cause some negative feedback from some of their customers, I assume, and maybe some negative media coverage as well. Especially if it's released alongside a price reduction for the current 4080 and 4090 cards.

They'd just need to wait several months to do it: they did it with the 3090 Ti, so there's already a precedent for it.

Negative publicity didn't seem to affect them much then ... so they might do it again ...
 

Aapje

Golden Member
Mar 21, 2022
1,515
2,065
106
Keep in mind that nVidia is supposed to have a 4090 Ti "ready to launch" to counter AMD's flagship, should it have higher than RTX 4090 performance.

We know that the 4090 is a rather deep cut, with about 11% of the die disabled (128 of AD102's 144 SMs enabled), so they can release a 4090 Ti with roughly 12.5% more enabled units. Although, since their architecture scales so poorly, that should provide far less than a 12.5% improvement.

However, they have also severely voltage-limited the 4090 to prevent it from being OC'ed effectively, so they will probably give the 4090 Ti a much higher voltage limit.
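
For reference, the SM arithmetic behind that, using Nvidia's published AD102 numbers:

```python
# SM counts from Nvidia's published AD102 specs.
total_sms = 144      # full AD102 die
enabled_4090 = 128   # RTX 4090

disabled_share = 1 - enabled_4090 / total_sms   # share of the die fused off
max_uplift = total_sms / enabled_4090 - 1       # ceiling for a full-die 4090 Ti

print(f"disabled: {disabled_share:.1%}")   # disabled: 11.1%
print(f"max SM uplift: {max_uplift:.1%}")  # max SM uplift: 12.5%
```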