Question Speculation: RDNA3 + CDNA2 Architectures Thread

Page 92

leoneazzurro

Senior member
Jul 26, 2016
930
1,465
136
Isn't AMD doing just the same thing Nvidia did, going to launch two 7900 cards? Why is there no 7800 XT, as there was with Navi 21, where the full-fat 6900 XT and cut-down 6800 XT versions of the flagship chip launched first?
To me, it's a sign of not being willing to be aggressive at all price-wise for the 20CU board.

Because the 79XX cards are N31 while the 78XX cards are N32, according to most "reliable" leakers, so they are based on a different GPU. And N32 is rumored to not be ready yet, probably being targeted at a Q1 '23 release.
 

PJVol

Senior member
May 25, 2020
534
447
106
Because the 79XX cards are N31 while the 78XX cards are N32, according to most "reliable" leakers, so they are based on a different GPU. And N32 is rumored to not be ready yet, probably being targeted at a Q1 '23 release.
Thanks, though all you said actually states the "effect" but doesn't answer the "why" :)
Perhaps I wasn't clear enough, sorry.
 
Last edited:

JayMX

Member
Oct 18, 2022
31
73
51
Very weak video by AdoredTV. He just talks about the Skyjuice RDNA3 post on angstronomics.com that was posted here in August.

Actually, you are right that the information he provided was nothing new. Yet the way he presented it was brilliant - I really love his videos. The most important thing IMHO was his conclusion that we are close to a 'Zen era' in GPUs (like the revolution the Zen arch brought to CPUs), where AMD's design is very promising in terms of cost and scaling while Nvidia can't go much further with their monolithic approach.
 
  • Like
Reactions: Kaluan

H T C

Senior member
Nov 7, 2018
555
396
136
Isn't AMD doing just the same thing Nvidia did, going to launch two 7900 cards? Why is there no 7800 XT, as there was with Navi 21, where the full-fat 6900 XT and cut-down 6800 XT versions of the flagship chip launched first?

Because, though to a smaller extent, AMD also has A LOT of cards from the current generation to sell due to the crypto crash.

Supposedly, AMD will launch their new flagship cards and lower the prices of the current generation of cards until "their current stock is sufficiently reduced", at which point they'll launch the remaining RDNA3 lineup.
 

leoneazzurro

Senior member
Jul 26, 2016
930
1,465
136
Thanks, though all you said actually states the "effect" but doesn't answer the "why" :)
Perhaps I wasn't clear enough, sorry.

Well, the "why" is that AMD doesn't launch more than one chip at a time, because of different manufacturing/design schedules. The same thing happened with the 6000 series: first N21 launched, then N22, N23 and N24 followed at different times.
The difference this time seems to be that N31 covers only the 79XX series, while N21 covered both the 69XX and 68XX.
 

H T C

Senior member
Nov 7, 2018
555
396
136
I thought they did that already.

This depends entirely on how much they want to try and undercut Nvidia price-wise: whether they want a similar price for X% of Nvidia's performance, or a Y% LOWER price for X% of Nvidia's performance.

If it's the former, you're likely right.

Keep in mind that nVidia is supposed to have a 4090 Ti "ready to launch" to counter AMD's flagship, should it have higher than RTX 4090 performance.
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,355
1,550
136
Precisely. It was immediately transparent to me that Skyjuice was an order of magnitude better than MLID, RGT (laughable), AdoredTV, pretty much all of them. Everyone who does leaks on YouTube spends an inordinate amount of time hedging whatever they leak. Meanwhile, Skyjuice's leaks are succinct, ad-free, and credible. It was a breath of fresh air.

Note that the monolithic Sapphire Rapids article is very different from their early leaks. Unlike the RDNA3 or PROM21 articles, it starts with "Now that images have surfaced publicly, it's time for us to share what we know", and then proceeds to do a bunch of analysis, all of which can be derived from the picture of the wafer that was out there before the article was released. None of the analysis is bullshit, but that article did not need a source, just good understanding and that photo. The A16 article was similar, in that it was clearly well founded but could be based entirely on public knowledge. The Oberon Plus article is likewise based on hardware that is already out there and can be examined.

I think Skyjuice is generally knowledgeable and well informed, but has exactly one real source, and it's either at AMD or at an AIB/MB maker that works with AMD. They post about other stuff so it wouldn't be that obvious.
 

linkgoron

Platinum Member
Mar 9, 2005
2,300
821
136
This depends entirely on how much they want to try and undercut Nvidia price-wise: whether they want a similar price for X% of Nvidia's performance, or a Y% LOWER price for X% of Nvidia's performance.

If it's the former, you're likely right.

Keep in mind that nVidia is supposed to have a 4090 Ti "ready to launch" to counter AMD's flagship, should it have higher than RTX 4090 performance.

If AMD can beat the 4090 (which IMO they should be able to) with decent pricing, Nvidia releasing a 4090 Ti so quickly after the 4090 launch would probably cause some negative feedback from some of their customers, I assume, and maybe some negative media coverage as well. Especially if it's released alongside a price reduction for the current 4080 and 4090 cards.
 
  • Like
Reactions: Tlh97 and Kaluan

H T C

Senior member
Nov 7, 2018
555
396
136
If AMD can beat the 4090 (which IMO they should be able to) with decent pricing, Nvidia releasing a 4090 Ti so quickly after the 4090 launch would probably cause some negative feedback from some of their customers, I assume, and maybe some negative media coverage as well. Especially if it's released alongside a price reduction for the current 4080 and 4090 cards.

They'd just need to wait several months to do it: they did it with 3090 Ti, so there's already a precedent for it.

Negative publicity didn't seem to affect them much then ... so they might do it again ...
 

Aapje

Golden Member
Mar 21, 2022
1,385
1,865
106
Keep in mind that nVidia is supposed to have a 4090 Ti "ready to launch" to counter AMD's flagship, should it have higher than RTX 4090 performance.

We know that the 4090 has a rather deep cut, with 11% of the chip disabled, so they can release a 4090 Ti with 11% more transistors enabled. Although, since their architecture scales so poorly, that should provide far less than an 11% improvement.

However, they have also severely voltage-limited the 4090 to prevent it from being OC'd effectively, so they will probably give the 4090 Ti a much higher voltage limit.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,501
20,624
146
idk about this future proofing... my brother is running an RX 480 8GB card. Was there ever a game it could run at 60Hz that would use the full vram?
That's an outdated mindset. Years ago, debates about why a card wasn't fast enough to need more VRAM were all the rage here.

Times change. Now, more VRAM even on slower cards means you can turn textures up: one of the best visual improvements, and one that costs little in performance.
 

Timorous

Golden Member
Oct 27, 2008
1,627
2,797
136

That would be 86% faster than the 6950XT at a similar power draw.

Info provided at B3D regarding laptop N32 performance and power draw shows an 80% perf/watt gain; using that number you can get to 10% faster than the 4090 at 350W. That seems a bit far-fetched, as I don't think AMD would have sandbagged their >50% claim that much.

OTOH, 4090 +/- 10% at 350W is probably the ballpark for 4K raster, so it's a high-end guess but not an absolutely impossible one, I don't think.
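The arithmetic above is easy to sketch. A back-of-the-envelope in Python, where the 335W 6950XT TBP and the ~1.7x 4090-vs-6950XT 4K raster ratio are my own assumed figures, not numbers from this thread:

```python
def scaled_perf(base_perf, base_power, ppw_gain, new_power):
    """Project relative performance from a perf/watt multiplier and a new power budget."""
    perf_per_watt = base_perf / base_power * ppw_gain
    return perf_per_watt * new_power

# Assumptions: 6950XT = 1.0x at 335 W; rumored +80% perf/watt; 350 W target.
n31 = scaled_perf(base_perf=1.0, base_power=335, ppw_gain=1.8, new_power=350)
print(f"N31 vs 6950XT: {n31:.2f}x")        # ~1.88x, in the ballpark of the 86% figure
print(f"N31 vs 4090:   {n31 / 1.7:.2f}x")  # ~1.11x, i.e. roughly 10% faster
```

The exact percentages shift with the assumed baseline TBP and raster ratio, which is why these projections are ballpark at best.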
 
  • Like
Reactions: Tlh97 and Joe NYC

SteveGrabowski

Diamond Member
Oct 20, 2014
6,903
5,836
136
Ha! What do you call NVIDIA going from an MSRP of $600 for the RTX 2080 and 3080 to a $900 card that they "unlaunched" and a $1200 card?

I have already pretty much written Nvidia off, no interest in any of their cards unless the 3060 Ti absolutely crashes in price at some point this month.
 

maddie

Diamond Member
Jul 18, 2010
4,749
4,691
136
If the new shaders really do have half the transistors (half the area on an identical node), what does that say about switching power? Why is this seemingly ignored by everyone?
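For context on that question: the switching component of CMOS power scales roughly as P ≈ α·C·V²·f (activity factor, switched capacitance, voltage squared, frequency), so halving the shader area roughly halves C, and at the same voltage and clock the switching power of those shaders would roughly halve too. A toy sketch with purely illustrative numbers (none of these are real RDNA3 figures):

```python
def dynamic_power(alpha, cap, volt, freq):
    """CMOS dynamic power: activity factor * switched capacitance * V^2 * frequency."""
    return alpha * cap * volt**2 * freq

# Illustrative values only: same activity, voltage, and clock; half the capacitance.
base = dynamic_power(alpha=0.2, cap=1.0, volt=1.0, freq=2.5e9)
half = dynamic_power(alpha=0.2, cap=0.5, volt=1.0, freq=2.5e9)
print(half / base)  # 0.5: half the switched capacitance, half the switching power
```

In practice leakage, clock/voltage changes, and the rest of the chip mean the total power saving is smaller than this idealized 50%.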
 

eek2121

Platinum Member
Aug 2, 2005
2,930
4,026
136
The first two were released in the mining boom. Their new cards will be releasing when gpu mining is dead and into a trash economy. I hope gamers don't let AMD normalize mining era MSRPs.
You are pointing fingers at the wrong company.
For AMD to match the 4090 in raster at 350W it needs around a 62% perf/watt gain.

Not impossible, given the 6900XT had a 64% perf/watt gain over the 5700XT, but not easy either.

I don't see that MLID has anything new to offer here.
  • Points 1 and 2 can be taken from that picture of the reference design.
  • Point 3 is new but could just be BS or is just a guess as to the lack of leaks.
  • Point 4 can be taken from the Igor picture.
  • Point 5 can be calculated using the 50% perf/watt gain and a 350W TBP from the 6900XT or 6950XT.
  • The bit about supply could also just be made up BS that will be hard to verify after the fact.
So just looks like a lot of hot air and a restatement of stuff we have already seen.
Glad somebody besides me sees how he makes his videos. I see tweets with various bits of info pop up, then hours or days later he makes a video claiming to have new info. All his new info either came from tweets or is unverifiable.

Follow a few reliable leakers on Twitter and you will find out about stuff faster, and without having to watch a cringy video.
I have already pretty much written Nvidia off, no interest in any of their cards unless the 3060 Ti absolutely crashes in price at some point this month.

I wish I could write them off.
 
  • Like
Reactions: Kaluan

eek2121

Platinum Member
Aug 2, 2005
2,930
4,026
136
If the claims of ~4090 performance at 350W are true, it sounds like 3090 performance may be possible at ~175W? If so, that is pretty amazing.
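That projection is just holding perf/watt constant at half the power. As a sanity check, assuming the 4090 is roughly 2x a 3090 in raster (my assumption, not a figure from this thread):

```python
# Constant-perf/watt projection, with 3090 = 1.0 as the baseline:
perf_at_350w = 2.0                      # ~4090-class, assumed to be ~2x a 3090
perf_per_watt = perf_at_350w / 350
print(perf_per_watt * 175)              # 1.0, i.e. roughly 3090-class at ~175 W
```

Real GPUs don't scale perfectly linearly with power (clocks drop faster than consumption at the low end of the curve, and perf/watt usually improves as power drops), so this is only a rough lower bound on the plausibility of the claim.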
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
God I hope we get a nice 1440p card at 150W for ~$300 or so.
I cannot think of a single 1440p card that has been sold for $300. The 5700 (non-XT) gets close at $349.


The first two were released in the mining boom. Their new cards will be releasing when gpu mining is dead and into a trash economy. I hope gamers don't let AMD normalize mining era MSRPs.
The mining craze started after Ampere/Navi2 launched, not before, which is why the MSRPs went up as the life cycle went on.
 
Last edited:
  • Like
Reactions: Mopetar

Thunder 57

Platinum Member
Aug 19, 2007
2,675
3,801
136
I cannot think of a single 1440p card that has been sold for $300. The 5700 (non-XT) gets close at $349.



The mining craze started after Ampere/Navi2 launched, not before, which is why the MSRPs went up as the life cycle went on.

I got a 5700 for $329. I'd gladly do that again. Before that I had an RX 480 8GB that cost $239. I used that for 1440p in Doom 2016 and BF 1. I had to turn down some settings on BF, but it was doable.
 

biostud

Lifer
Feb 27, 2003
18,251
4,765
136
Is that indicating that if you limit the NVIDIA card to the same wattage as the AMD card, it will perform better? I don't think it's indicating it beats the NVIDIA card when it is allowed to use its full power.
It is indicating that AMD beats a 450W card while using far less power.