Question Speculation: RDNA3 + CDNA2 Architectures Thread


maddie

Diamond Member
Jul 18, 2010
4,722
4,625
136
Most will probably be 5 MCD, but no doubt there will be some 6 MCD with a failed bond, as well as 4 full MCD + 2 half disabled MCD.
Now that is a possibility I had not thought of: 1/2 MCDs. I suppose we could tell by the placement of the memory chips on the PCB. A 1/2 MCD using a single 2GB die would be a giveaway.

edit:
I'm assuming that the MCDs use adjacent paired memory chips.
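
A rough way to sanity-check that (a sketch of my own, assuming each full MCD drives a 64-bit bus fed by two 32-bit, 2GB GDDR6 chips and a half-disabled MCD keeps a single chip; none of this is confirmed):

Code:
# Speculative sketch: bus width and VRAM for possible N31 MCD configurations.
# Assumes each full MCD = 64-bit PHY = two 32-bit GDDR6 chips of 2 GB each,
# and a "half" MCD keeps a single chip (32-bit, 2 GB).

def config(full_mcds, half_mcds=0):
    chips = full_mcds * 2 + half_mcds
    return chips * 32, chips * 2  # bus width in bits, VRAM in GB

for label, (full, half) in {
    "6 full MCDs":          (6, 0),
    "5 full MCDs":          (5, 0),
    "4 full + 2 half MCDs": (4, 2),
}.items():
    bus, vram = config(full, half)
    print(f"{label}: {bus}-bit, {vram} GB")  # 384/24, 320/20, 320/20

The last two land on the same 320-bit/20GB total, so the PCB layout (paired chips per MCD vs. a lone chip) would be the tell, as noted above.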
 

Saylick

Diamond Member
Sep 10, 2012
3,084
6,184
136
Very weak video by AdoredTV. He just talks about the Skyjuice RDNA3 post on angstronomics.com that was posted here in August.
It's so funny. All of the typical "leakers" who use videos as their medium for generating clicks/views pretty much fell in line and deferred to Skyjuice as soon as he dropped that bombshell. Leakers literally went from "Well, I have some insiders telling me some info that I will make a video on shortly, just need to confirm some things" to "Actually, Skyjuice has confirmed what I've been told all along, so there's no point in holding back what I know" like overnight.

What's crazy is that Skyjuice doesn't run a rumor-mill site like Videocardz or WCCFTech, yet he has some of the best leak articles we've seen.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,647
3,706
136
Have to say that I only play single player games, but I have never thought to myself:
"If only this added more fake frames between each real one, things would look smoother."
I really see zero or less value in DLSS 3.0 - certainly at the moment with the artifacts, and in the future too, unless the DL AI becomes truly sentient and plays the game for me. And even then, if I wanted to watch gaming footage I could go to YouTube!

Imagine being the one trying to market DLSS 3.

Fake Frames! "The way it's meant to be played"!
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,324
1,462
136
What's crazy is that Skyjuice doesn't run a rumor-mill site like Videocardz or WCCFTech, yet he has some of the best leak articles we've seen.

He started his site with an article that proved he has actual sources, in a way that was easy to verify very soon after he launched it. The PROM21 article very deliberately contained a lot of details that are unnecessary for any reader except to confirm he has an actual source rather than rumors: they were completely absent from the internet before the article, and information confirming them was made public soon after.

It made everyone who follows these things closely sit down and pay attention.
 

Saylick

Diamond Member
Sep 10, 2012
3,084
6,184
136
He started his site with an article that proved he has actual sources, in a way that was easy to verify very soon after he launched it. The PROM21 article very deliberately contained a lot of details that are unnecessary for any reader except to confirm he has an actual source rather than rumors: they were completely absent from the internet before the article, and information confirming them was made public soon after.

It made everyone who follows these things closely sit down and pay attention.
Precisely. It was just so immediately transparent to me that Skyjuice was an order of magnitude better than MLID, RGT (laughable), AdoredTV, pretty much all of them. Literally everyone who does leaks on YouTube spends an inordinate amount of time hedging whatever they leak out. Meanwhile, Skyjuice's leaks are succinct, ad-free, and credible. It was just a breath of fresh air.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,792
5,750
136
Doubt it. Too many folks equate AMD with having slow GPUs with buggy drivers, which is a shame, because next gen sounds like it is going to lay the smackdown on NVIDIA.

AMD could charge similar prices to last year and still wouldn’t sell as many units as NVIDIA.

God I hope we get a nice 1440p card at 150W for ~$300 or so.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Even though I have always leaned strongly towards Nvidia graphics cards, I sincerely hope that RDNA3 is powerful enough to put a smackdown on Nvidia this cycle. Nvidia has too much power in the GPU market and they need to be taken down a few notches. RDNA2 came close, but its ray tracing performance was definitely an Achilles heel compared to Ampere.

My biggest concern with RDNA3 is ray tracing. Nvidia has such a large lead that even if RDNA3 doubles ray tracing performance, they will still be behind Nvidia in the majority of ray tracing capable titles. They need to triple ray tracing performance to really be on par with Nvidia, and that's a tall order.
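
As a rough illustration of why doubling might not be enough (the ~2.5x lead is my own ballpark assumption, not a measured figure):

Code:
# Illustrative only: if Nvidia's lead in RT-heavy titles is ~2.5x over RDNA2
# (assumed), doubling RDNA3's RT throughput still leaves it behind, while
# roughly tripling it would close the gap.
nvidia_lead = 2.5
for uplift in (2.0, 3.0):
    print(f"{uplift:.0f}x RDNA2 RT -> {uplift / nvidia_lead:.0%} of Nvidia")  # 80%, 120%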

RTX 4090 is out of stock everywhere and I don't feel like paying scalper prices on eBay so I can enjoy gaming at 4K when I build my new computer, so I'm hoping and praying that AMD delivers in a big way.

If not, I may just buy a used RTX 3090 Ti on eBay even though it's nowhere near as good as an RTX 4090 for tackling 4K. This kind of thing is almost enough to turn me into a console gamer :D
 

Thunder 57

Platinum Member
Aug 19, 2007
2,647
3,706
136
N33 is mainly for 1080p, but it shouldn't be too bad for 1440p, although 8GB VRAM is not very future-proof.
TBP could be 150W, but I am pretty skeptical about the price.

How sad is it that 8GB is no longer "future proof"? I got an RX 5700 over the RX 5600 XT because I wanted the extra 2GB. Otherwise as far as performance goes they weren't all that far apart. I have had 16GB of memory in my primary computer from 2012 until now, 10 years later!

I certainly overdid it in 2012, but memory was cheap at the time. It was very "future proof". At the time I think my GPU had 2GB. The VRAM creep that has been going on lately is a bit surprising.

The first system I built that wasn't hand-me-down parts had a GeForce 3 with 64MB, while the main memory was 512MB. That is a 1:8 ratio. That 16GB, which was totally overkill, was also 1:8. Then it became 1:2, which is where I am at now. And now we are looking at what, 10-12GB for video cards to be future proof?
 

Thunder 57

Platinum Member
Aug 19, 2007
2,647
3,706
136
The first two were released in the mining boom. Their new cards will be releasing when gpu mining is dead and into a trash economy. I hope gamers don't let AMD normalize mining era MSRPs.

Ha! What do you call NVIDIA going from an MSRP of $600 for the RTX 2080 and 3080 to a $900 card that they "unlaunched" and a $1200 card?
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,329
2,811
106
The first two were released in the mining boom. Their new cards will be releasing when gpu mining is dead and into a trash economy. I hope gamers don't let AMD normalize mining era MSRPs.
Those cards were not sold at MSRP during the mining boom, but for a lot more. You could even say those MSRPs would be "pretty good" for N33.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,329
2,811
106
How sad is it that 8GB is no longer "future proof"? I got an RX 5700 over the RX 5600 XT because I wanted the extra 2GB. Otherwise as far as performance goes they weren't all that far apart. I have had 16GB of memory in my primary computer from 2012 until now, 10 years later!

I certainly overdid it in 2012, but memory was cheap at the time. It was very "future proof". At the time I think my GPU had 2GB. The VRAM creep that has been going on lately is a bit surprising.

The first system I built that wasn't hand-me-down parts had a GeForce 3 with 64MB, while the main memory was 512MB. That is a 1:8 ratio. That 16GB, which was totally overkill, was also 1:8. Then it became 1:2, which is where I am at now. And now we are looking at what, 10-12GB for video cards to be future proof?
The problem is that 16 Gbit memory chips are the largest ones available, and GDDR6 is not exactly cheap.
If you want more than 8GB of VRAM on an N33 card, for example, then you need to widen the memory bus from 128-bit to 160-192-bit, or use clamshell mode, for 10-12GB or 16GB respectively.
Either of these options would increase the production cost.
Unless manufacturers increase the capacity per memory chip, we can forget about more VRAM on the less expensive cards unless prices go up.
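
Putting rough numbers on those options (my own sketch; 2GB per 32-bit chip as assumed above, doubled in clamshell mode):

Code:
# Rough VRAM calculator for GDDR6 boards: one 16 Gbit (2 GB) chip per 32-bit
# channel, or two per channel in clamshell mode.

def vram_gb(bus_bits, chip_gbit=16, clamshell=False):
    chips = (bus_bits // 32) * (2 if clamshell else 1)
    return chips * chip_gbit // 8  # Gbit -> GB

print(vram_gb(128))                  # 8 GB  (128-bit, N33-style)
print(vram_gb(160))                  # 10 GB (wider 160-bit bus)
print(vram_gb(192))                  # 12 GB (wider 192-bit bus)
print(vram_gb(128, clamshell=True))  # 16 GB (128-bit clamshell)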
 

Saylick

Diamond Member
Sep 10, 2012
3,084
6,184
136
MLID with a little bit more hedging right before the RDNA 3 announcement. He says no one expects full-fat N31 to beat the 4090 in 4K raster, so we'll just have to wait and see. I feel like those expectations are not seeing the full picture if we are to believe his statement that AIBs have not been fully briefed or given unlocked cards. I mean, an expectation is just that: an educated guess. If they had the real performance numbers, it would no longer be an expectation. We already know how the 4090 performs.

[Attached: two screenshots of MLID's RDNA 3 claims]
 

psolord

Golden Member
Sep 16, 2009
1,875
1,184
136
Ah dang, how did I miss this one in reading the Twitter thread. After 10 years of using a hardware scheduler (AMD transitioned to it with GCN), it looks like software-based scheduling is back on the menu for RDNA 3, likely for the sake of increasing perf/W and perf/A. Of course, this approach requires more software development work on the compiler but perhaps it will pan out well for AMD given they have far more resources today than before. Also, the side bonus is that compilers can get optimized over time, so FineWine is more of a possibility.


Does that mean that AMD will lose its DX12 advantage over Nvidia? Especially when using weaker CPUs?
 

Timorous

Golden Member
Oct 27, 2008
1,532
2,535
136
MLID with a little bit more hedging right before the RDNA 3 announcement. He says no one expects full-fat N31 to beat the 4090 in 4K raster, so we'll just have to wait and see. I feel like those expectations are not seeing the full picture if we are to believe his statement that AIBs have not been fully briefed or given unlocked cards. I mean, an expectation is just that: an educated guess. If they had the real performance numbers, it would no longer be an expectation. We already know how the 4090 performs.


For AMD to match the 4090 in raster at 350W it needs around a 62% perf/watt gain.

Not impossible, given the 6900XT had a 64% perf/watt gain vs the 5700XT, but not easy either.
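
Roughly how that 62% falls out (my numbers are ballpark assumptions: a 4090 around 1.9x a 6900 XT in 4K raster, and 350W vs 300W TBP):

Code:
# Back-of-the-envelope: perf/W uplift N31 would need to match a 4090 in 4K
# raster at 350W. The 1.9x performance ratio and 300W baseline are assumptions.
perf_ratio  = 1.9        # assumed 4090 vs 6900 XT, 4K raster
power_ratio = 350 / 300  # assumed N31 TBP vs 6900 XT TBP

print(f"{perf_ratio / power_ratio - 1:.0%}")  # ~63%, in line with the figure above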

I don't see that MLID has anything new to offer here.
  • Points 1 and 2 can be taken from that picture of the reference design.
  • Point 3 is new, but it could just be BS or a guess to explain the lack of leaks.
  • Point 4 can be taken from the Igor picture.
  • Point 5 can be calculated using the 50% perf/watt gain and a 350W TBP from the 6900XT or 6950XT.
  • The bit about supply could also just be made up BS that will be hard to verify after the fact.
So it just looks like a lot of hot air and a restatement of stuff we have already seen.
 

psolord

Golden Member
Sep 16, 2009
1,875
1,184
136
@Saylick

A lot of 'fine wine' was AMD simply lacking the development capacity to use their hardware fully at release and only unlocking the actual power of the card much later. So less fine wine is generally better.

Oh great, that means that Intel will be able to talk about fine whisky in a few years regarding Arc?

I bet they will call it something stupid like 24-karat malt drivers.
 

PJVol

Senior member
May 25, 2020
513
435
106
Isn't AMD doing just the same thing Nvidia did by launching two 7900 cards? Why is there no 7800 XT, as there was with Navi 21, where the full-fat 6900 XT and cut-down 6800 XT versions of the flagship chip were launched first?
For me it's a sign of not being willing to be aggressive at all price-wise for the 20CU board.
 