Question Speculation: RDNA3 + CDNA2 Architectures Thread

Page 242

uzzi38

Platinum Member
Oct 16, 2019
2,552
5,527
146

Frenetic Pony

Senior member
May 1, 2012
218
179
116
The layoffs are all in part because "money is expensive", which is a stupid business term meaning that loans for giant companies were so cheap a few years ago, and everything was growing so fast, that they were basically getting money for free; now it's expensive and they need to pay some of that off.

And since China might randomly imprison your employees or raid your offices or some other redacted you've no defense against, getting rid of exposure to that BS is an easy first choice. That the primary employee count there was from RTG/Graphics is probably as much a coincidence as anything.

Speaking of China, the GRE stuff is marked down in part because it doesn't have to go through any "from China" tariffs (or shipping costs). I mean, it's obvious when you say it out loud, but I just realized it.


We keep telling you no profanity in the tech forums.


esquared
Anandtech Forum Director
 
Last edited by a moderator:

Ajay

Lifer
Jan 8, 2001
15,332
7,787
136
The layoffs are all in part because "money is expensive", which is a stupid business term meaning that loans for giant companies were so cheap a few years ago, and everything was growing so fast, that they were basically getting money for free; now it's expensive and they need to pay some of that off.

And since China might randomly imprison your employees or raid your offices or some other bullshit you've no defense against, getting rid of exposure to that BS is an easy first choice. That the primary employee count there was from RTG/Graphics is probably as much a coincidence as anything.

Speaking of China, the GRE stuff is marked down in part because it doesn't have to go through any "from China" tariffs (or shipping costs). I mean, it's obvious when you say it out loud, but I just realized it.
Hmm, I thought most of RTG's driver team was in China, so this doesn't sound good. Maybe AMD has been building up another driver team somewhere on the down-low?
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,312
2,798
106
RX 7900M 3dmark score
Videocardz.com
RADEON-7900M-3DMARK-TIMESPY.jpg

~4% better than the RTX 4080M.
The raster performance is not bad, but efficiency is worse than Nvidia's, although more VRAM is a plus.
 

jpiniero

Lifer
Oct 1, 2010
14,393
5,112
136
  • Like
Reactions: Tlh97 and psolord

GodisanAtheist

Diamond Member
Nov 16, 2006
6,544
6,743
136

New bundle with the Avatar game.

- "Buy Navi and become Na'Vi"

1699389180446.png
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,312
2,798
106
Nice.
So, who said AMD fell short of a 50%+ PPW gen-to-gen uplift? )
Are you sure this is a great result?
csm_2023_01_04_17_02_30_3DMark_Professional_Edition_83af94e13b.jpg

RX 6850M XT - 11,912 pts, 165W
RX 7900M - 19,743 pts, 180(200)W including SmartShift.
If that score was at 180W TGP, then that's a 52% PPW gain.
If that score was at 200W TGP, then that's a 37% PPW gain.

Game frequency:
RX 6850M XT: 2463 MHz (135%)
RX 7900M: 1825 MHz (100%)

At best only a 52% PPW gain?
That's not a good result considering you have 80% more CUs, 26% lower clocks and a better process.
I would rather call it disappointing compared to Ada.

RTX 4080 Laptop (58 SM) vs RTX 3080 Ti Laptop (58 SM):
19,565 pts at 175W vs 13,835 pts at 175W
A 41% PPW gain despite the same CUDA core count, so the performance came from much higher clocks enabled by the better process.

edit: wrong score for the RX 7900M. Fixed it.
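The PPW percentages above fall out of a simple ratio. A quick sketch, using the Time Spy scores and TGP figures quoted in this post (forum-quoted numbers, not official specs); the 180W and 200W cases bracket the SmartShift uncertainty:

```python
# Perf/W uplift from the Time Spy graphics scores and TGP figures quoted above.
def ppw_uplift(new_score, new_watts, old_score, old_watts):
    """Relative perf-per-watt gain of the newer part over the older one."""
    return (new_score / new_watts) / (old_score / old_watts) - 1.0

# RX 7900M vs RX 6850M XT
best = ppw_uplift(19743, 180, 11912, 165)   # run at the base 180W TGP
worst = ppw_uplift(19743, 200, 11912, 165)  # SmartShift pushed it to 200W
print(f"best case: {best:.0%}, worst case: {worst:.0%}")
# → best case: 52%, worst case: 37%
```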
 
Last edited:
  • Like
Reactions: Tlh97

PJVol

Senior member
May 25, 2020
505
422
106
Are you sure this is a great result? ... At best only a 52% PPW gain?
It would be if 180W was TBP, but it's not.
And 52% "only" ... come on )) Compared to the 6800XT it would be great

But seriously, the main issue with RDNA3 is their chiplet obsession; the core logic power is OK.
Just look at the power consumed by the SoC, VDDCI and Mem rails in the 7900 GRE and XTX compared to my 6800XT at stock
rdna3 vs rdna2 uncore power .png
The 7900 screenshots are taken from the PCGH reviews; the 6800XT one I made in Time Spy.
"Uncore" power consumption in the 7900s is ~40% of TGP!
60W from the Vdd SoC rail, lol
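To make the "~40% of TGP" claim concrete, here is a toy breakdown. Only the ~60W SoC figure and the ~40% conclusion come from this post; the other per-rail watts are placeholder values chosen for illustration, not readings from the screenshot:

```python
# Placeholder per-rail power figures (W); only the 60W SoC value is quoted
# in the post above -- the rest are illustrative stand-ins.
rails_w = {
    "GFX":   155.0,  # shader core ("core logic")
    "SoC":    60.0,  # quoted above
    "VDDCI":  20.0,  # core/memory interface
    "MEM":    25.0,  # memory
}
uncore = sum(w for rail, w in rails_w.items() if rail != "GFX")
tgp = sum(rails_w.values())
print(f"uncore: {uncore:.0f}W of {tgp:.0f}W TGP ({uncore / tgp:.0%})")
# → uncore: 105W of 260W TGP (40%)
```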
 
Last edited:

TESKATLIPOKA

Platinum Member
May 1, 2020
2,312
2,798
106
It would be if 180W was TBP , but it's not.
There is no TBP (whole board) for mobile chips, only TGP (GPU + memory).
TGP is 180W plus an additional 20W from SmartShift. Most likely this score is with SmartShift.
And 52% "only" ... come on )) Compared to the 6800XT it would be great
Would it?
Reduce the frequency and undervolt a 6800XT as shown below and you end up with 82(111)W lower power consumption than stock.
Screenshot_14.png
That's already 300 - 82 = 218W TBP. If you looked at only TGP, then it would be <200W.
So it turns out the 7900M's efficiency improvement is clearly not that great.
 
Last edited:

TESKATLIPOKA

Platinum Member
May 1, 2020
2,312
2,798
106
But seriously, the main issue with RDNA3 is their chiplet obsession; the core logic power is OK.
Just look at the power consumed by the SoC, VDDCI and Mem rails in the 7900 GRE and XTX compared to my 6800XT at stock
View attachment 88697
The 7900 screenshots are taken from the PCGH reviews; the 6800XT one I made in Time Spy.
"Uncore" power consumption in the 7900s is ~40% of TGP!
60W from the Vdd SoC rail, lol
It is still part of the chip, so it doesn't look good for the whole GPU.
From those screens it looks to me like:
VDDR_GFX -> GFX power
VDDR_SOC + VDDCR_USR -> SOC
VDDIO -> MVDD
VDDCI_MEM -> VDDCI
BTW, the watts in your screen combined add up to more than the GPU PPT = 255W.

P.S. Maybe you could do some undervolting (underclocking) with your card, to see how much you'd save on power.
 

PJVol

Senior member
May 25, 2020
505
422
106
Would it?
If the 7900M were a discrete GPU with a 180W TBP and scored 19,700 in Time Spy, sure it would )
Reduce the frequency and undervolt a 6800XT as shown below and you end up with 82(111)W lower power consumption than stock.
I just don't understand: if you need to reduce power consumption, why limit the frequency when there is a power limit for exactly this purpose?
A questionable method to say the least, and besides, there's no comparative data from tests.
 
Last edited:

TESKATLIPOKA

Platinum Member
May 1, 2020
2,312
2,798
106
I just don't understand: if you need to reduce power consumption, why limit the frequency when there is a power limit for exactly this purpose?
A questionable method to say the least, and besides, there's no comparative data from tests.
The power limit was actually set to +15%.
You have a 6800XT, just test it yourself to see what happens. :)
 

PJVol

Senior member
May 25, 2020
505
422
106
The power limit was actually set to +15%.
You have a 6800XT, just test it yourself to see what happens.
IDK if you've had any RDNA card or played with its settings before, but to me this method looks absurd, and whoever posted it seems to have a vague understanding of the basic perf/power management principles in these GPUs.

Without going into the "why and how", I'd rather post my version of a "proper" UV :)
so you can tell me if I still need to test (even though I can tell you what would happen)

BTW, the watts in your screen combined add up to more than the GPU PPT = 255W.
I forgot to mention that the sensors in that pic show the "max" column, not "current", so as you may guess they are not a snapshot of the sensor data. For example, in GT1 the peak Gfx power was ~195W, and more of the "package power budget" went toward the "uncore" rails, hence the peaks retained in the panel.
 

Attachments

  • Proper UV.png
    Proper UV.png
    906.3 KB · Views: 21
Last edited:

PJVol

Senior member
May 25, 2020
505
422
106
It is still part of the chip, so It doesn't look good for the whole GPU.
From those screens It looks to me like:
VDDR_GFX -> GFX power
VDDR_SOC + VDDCR_USR -> SOC
VDDIO -> MVDD
VDDCI_MEM -> VDDCI
Maybe I wasn't clear; it's easier to compare the 6900XT and 7900 GRE, both have 80 CU, 16GB VRAM and the same boost clocks.
The 7900 GRE at 125W GfxCore matches the 6900XT in Time Spy. So, roughly speaking:

.......7900 GRE at 125W GfxCore ~= 6900 XT at 202W GfxCore

Now imagine they packed all the RDNA3 IP blocks into a monolithic piece of silicon while keeping "uncore" power at the RDNA2 level. I'm pretty sure AMD would have done so if they hadn't put a ton of effort into the R&D of something as complex as the MI300.
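Put as a ratio, the iso-performance comparison in the line above works out to (same figures as the post, quick sanity check):

```python
# Same Time Spy score, different Gfx-core power (figures from the post above).
gre_w, xt_w = 125.0, 202.0   # 7900 GRE vs 6900 XT core power at iso-performance
ratio = gre_w / xt_w
print(f"7900 GRE core power: {ratio:.0%} of the 6900 XT's")  # i.e. ~38% less
```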
 
Last edited:

tajoh111

Senior member
Mar 28, 2005
295
309
136
So much for that Chips and Cheese article about RDNA3 being just a better architecture for Starfield.

starfield update.jpg


From an 8% loss to a 23% win for the RTX 4090 in 2 months. AMD bucks definitely crippled Lovelace at launch. These gains were the result not of drivers but of an update from Bethesda, so the burden was on Bethesda.

When was the last time a game gained ~34% (123/92) performance in two months, particularly when both vendors had game-ready drivers at launch?
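Checking the swing arithmetic from the standings quoted above (the 4090 going from 8% behind to 23% ahead of the 7900 XTX):

```python
# RTX 4090 performance relative to the 7900 XTX, before and after the patch.
before, after = 0.92, 1.23
gain = after / before - 1.0
print(f"relative gain: {gain:.0%}")
# → relative gain: 34%
```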
 
  • Wow
Reactions: psolord

hlreijnd

Junior Member
Sep 30, 2016
1
11
81
So much for that Chips and Cheese article about RDNA3 being just a better architecture for Starfield.

View attachment 88756


From an 8% loss to a 23% win for the RTX 4090 in 2 months. AMD bucks definitely crippled Lovelace at launch. These gains were the result not of drivers but of an update from Bethesda, so the burden was on Bethesda.

When was the last time a game gained ~34% (123/92) performance in two months, particularly when both vendors had game-ready drivers at launch?
This guy uses different platforms for the two GPUs. He uses the Intel platform for the 4090, which is faster in Starfield, and doesn't even mention it in the graphs, only in his tweets. CapFrameX manages to be a complete fraud once again.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,312
2,798
106
IDK if you've had any RDNA card or played with its settings before, but to me this method looks absurd, and whoever posted it seems to have a vague understanding of the basic perf/power management principles in these GPUs.

Without going into the "why and how", I'd rather post my version of a "proper" UV :)
so you can tell me if I still need to test (even though I can tell you what would happen)
I sadly don't have any RDNA card.
You have one, that's why I said you can test this out yourself to see if that guy's post is BS or not.
Maybe I wasn't clear; it's easier to compare the 6800XT and 7900 GRE, both have 72 CU, 16GB VRAM and the same boost clocks.
The 7900 GRE at 130W GfxCore scored 8% higher in Time Spy than the 6800 XT. So, roughly speaking:

.......7900 GRE at 125W GfxCore ~= 6800 XT at 202W GfxCore

Now imagine they packed all the RDNA3 IP blocks into a monolithic piece of silicon while keeping "uncore" power at the RDNA2 level. I'm pretty sure AMD would have done so if they hadn't put a ton of effort into the R&D of something as complex as the MI300.
Just a reminder: the 7900 GRE has 80 CU, not 72.

We already have an RDNA3 monolith called the RX 7600, and it didn't impress even against N23.
By far the biggest reason the 7900 GRE looks a lot more efficient is the 5nm process.
 

Mopetar

Diamond Member
Jan 31, 2011
7,784
5,879
136
This guy uses different platforms for the two GPUs. He uses the Intel platform for the 4090, which is faster in Starfield, and doesn't even mention it in the graphs, only in his tweets. CapFrameX manages to be a complete fraud once again.

Time and time again he's proven to be utterly duplicitous. I hope he's either getting paid for what he does or is just a massive troll, because the alternatives are far sadder.

I'd lump him in with MLID, RGT, or UserBenchmark in that posting him is at best good for a laugh.
 

PJVol

Senior member
May 25, 2020
505
422
106
Just a reminder: the 7900 GRE has 80 CU, not 72.

We already have an RDNA3 monolith called the RX 7600, and it didn't impress even against N23.
By far the biggest reason the 7900 GRE looks a lot more efficient is the 5nm process.
Yeah, my mistake - fixed it (I might have confused it with the 7900M), but anyway, this doesn't change much if we take the 6900XT instead.
The RX 7600 is built on a 6nm process, which those in the know say is essentially a tuned 7nm. Besides, N33 seems to have other architectural changes and isn't suitable for gen-to-gen comparison.

But my point still stands:
looking at how power efficient the Gfx core is, it's amazing how much is lost on the SoC, the fabric and the rest.

You have one, that's why I said you can test this out yourself to see if that guy's post is BS or not.
No, not entirely BS, just a kinda strange, roundabout way to achieve the goal, no matter the inherent losses :rolleyes:
 
Last edited:

jpiniero

Lifer
Oct 1, 2010
14,393
5,112
136
Alienware m18 R1 laptop
7945HX + RX 7900M for only $2300.


Dell says a lot of the issues with the 7900M can be solved by using an old driver (NBC used the latest driver in their review). But there are still issues.