Question Speculation: RDNA3 + CDNA2 Architectures Thread

Page 148

gdansk

Platinum Member
Feb 8, 2011
2,123
2,629
136
I'm curious. Since RDNA3 is broken, we can expect marked improvement next year with an RDNA3+ refresh.

But how much does Nvidia have in the tank in comparison? Is the RTX 4090 a cut-down chip? Do they have room for an RTX 4090 Ti to compete with an RX 7970?
Nvidia has 16 SMs and 150W to spare for a 4090 Ti. There is little doubt in my mind it will easily beat a refreshed Navi 31, even if the refresh is 30% faster and hits 3.2GHz like the early rumors suggested.
 

tajoh111

Senior member
Mar 28, 2005
298
312
136
One more variable in the performance improvement is the actual game clock of the 6950 XT being used as the reference point.

AMD specifies the game clock of the 6950 XT as 2.1GHz, but actual testing shows the average clock to be about 2.4GHz on the reference design (per TechPowerUp's review across 23 games).

If AMD is downclocking the 6950 XT to an actual game clock of 2.1GHz, which is quite possible given the 300W testing environment versus the card's actual 340W, you have to subtract about 10% from the performance claims if you want to use current reviews to extrapolate performance.

If we do that, we actually arrive at AMD's claimed 1.54x performance increase:

2.3GHz / 2.1GHz = 1.095
1.095 x 1.175 (IPC increase) x 1.2 (20 percent more CUs) = 1.544, which is basically the performance increase AMD claimed.

However, this means the baseline AMD used for the gen-on-gen comparison is clocked 14% lower than in reviews, which should cost about 10% in performance, putting it roughly at 6900 XT level.

If we are actually getting around a 1.54x increase over a 6900 XT, I can see why AMD wants to compare this card to the RTX 4080 16GB. Using computerbase.de's results, the 7900 XTX would lose to the RTX 4090 by 25%; using TechPowerUp's results, corrected with their updated CPU game comparisons on faster processors, it would lose by 22%. That is not a small gap, and AMD would not want to post graphs like that even with a $600 price difference.

If the RTX 4080 ends up about 20% faster than an RTX 3090 Ti, that puts the 7900 XTX about 7 to 15 percent ahead of it, depending on whether we use TechPowerUp's or computerbase.de's review. Add the ray tracing advantage of Nvidia's cards and all of a sudden the pricing of the 7900 XTX makes sense, along with why AMD is targeting the RTX 4080.
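In code, the back-of-envelope math above looks like this (a minimal sketch; the clock, IPC, and CU figures are the assumptions from this post, not measured values):

```python
# Back-of-envelope RDNA3 uplift estimate using the figures from this post.
# All inputs are assumptions quoted above, not measurements.
spec_clock_ghz = 2.1    # AMD's specified 6950 XT game clock
rdna3_clock_ghz = 2.3   # assumed 7900 XTX game clock
ipc_gain = 1.175        # assumed per-CU IPC increase
cu_gain = 1.2           # 20% more CUs (96 vs 80)

clock_gain = rdna3_clock_ghz / spec_clock_ghz    # ~1.095
claimed_gain = clock_gain * ipc_gain * cu_gain   # ~1.544, AMD's figure

# Reviews show the reference 6950 XT averaging ~2.4GHz, i.e. ~14% above
# the 2.1GHz spec clock, worth roughly 10% performance (sub-linear scaling).
review_penalty = 0.90
gain_vs_reviewed_6950xt = claimed_gain * review_penalty

print(f"claimed gen-on-gen gain:        {claimed_gain:.3f}x")
print(f"gain vs review-clocked 6950 XT: {gain_vs_reviewed_6950xt:.3f}x")
```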

[Image: relative performance chart, 3840x2160]


Again: I am literally the only person who got anywhere close to the actual performance.

The hype train was off the charts. It didn't take deep logic; people have simply become irrational with AMD hype, ignoring the marketing bull and taking what is often best-case, exaggerated performance from marketing material as the average.

The fact that Nvidia represents 80 percent of the market, while this hype thread is twice the size of the Lovelace thread, just shows how ingrained team red is on this forum. Logic gets lost to hype for team red. Rational discussion needs more weight in these hype threads so less time is wasted. The number of people lying to themselves, saying this performed as expected when their previous posts say otherwise, is considerable. I say to these people: you don't need to do free marketing for AMD at your own reputation's expense.
 
Last edited:

Kaluan

Senior member
Jan 4, 2022
500
1,071
96
Yup, but the problem is there seems to be no such +20% performance in actual overclocking. The best I've seen so far is +5% at 2700MHz with some undervolting. Simply raising the power limit doesn't seem to do much.
These are all reference cards (ChipHell reviewed some customs, but IDK what to make of those).

So they're stuck at 366-375W of usable power. Increased power limits don't mean much in that case.

I've seen some limited undervolting data and it's not half bad. I actually saw one review mention that AMD reached out and told them to try a -150mV undervolt plus an increased power limit, but they hadn't included that in the review yet. I don't even remember which outlet it was, lol.
 
  • Like
Reactions: scineram

Saylick

Diamond Member
Sep 10, 2012
3,172
6,407
136
[Image: relative performance chart, 3840x2160]


Again: I am literally the only person who got anywhere close to the actual performance.

The hype train was off the charts. It didn't take deep logic; people have simply become irrational with AMD hype, ignoring the marketing bull and taking what is often best-case, exaggerated performance from marketing material as the average.

The fact that Nvidia represents 80 percent of the market, while this hype thread is twice the size of the Lovelace thread, just shows how ingrained team red is on this forum. Logic gets lost to hype for team red. Rational discussion needs more weight in these hype threads so less time is wasted.
I wouldn't say that this forum is "ingrained red". There's plenty of us with Nvidia GPUs in our rigs. The reason AMD gets more comments is because that's the side with more to speculate about. That's it. AMD is pretty much airtight these days, so the only way you're going to get discussion is through rumors, baseless or not. Meanwhile, Nvidia executes like a well-oiled machine, so there's really no such thing as a major surprise. Lastly, if your concern is wasting time because of a lack of rational discussion, you can always choose not to go into AMD speculation threads. Simple as that. I think a lot of us find it fun to "waste time" discussing rumors, but that's our choice.
 

JujuFish

Lifer
Feb 3, 2005
11,004
735
136
IMO, the biggest problem with these cards is that they don't seem to match AMD's claims. AMD has been reliably forthright and honest about their numbers for a little while now, such that I have been willing to somewhat trust their numbers before reviewers get their hands on the products. Seems like they've broken that trust.
 

Kaluan

Senior member
Jan 4, 2022
500
1,071
96
Nvidia has 16 SMs and 150W to spare for a 4090 Ti. There is little doubt in my mind it will easily beat a refreshed Navi 31, even if the refresh is 30% faster and hits 3.2GHz like the early rumors suggested.
You think the consumer base will take to 600W GPUs just like that? Oh dear...


From my limited understanding, the N31 hardware bugs are not just about failing to clock high enough. But higher clocks and a better V/F curve would be a start.
 
  • Like
Reactions: Leeea

gdansk

Platinum Member
Feb 8, 2011
2,123
2,629
136
You think the consumer base will take to 600W GPUs just like that? Oh dear...
Yes, the type of people who pay $2000 for a GPU do not care at all. You can call them all sorts of names, but it doesn't change the fact that they will buy a 600W TBP card. They'll buy so eagerly that it won't be in stock at MSRP for a year after launch. More worryingly, it might not even have an MSRP, launching with only third-party cooler designs.
 

tajoh111

Senior member
Mar 28, 2005
298
312
136
I wouldn't say that this forum is "ingrained red". There's plenty of us with Nvidia GPUs in our rigs. The reason AMD gets more comments is because that's the side with more to speculate about. That's it. AMD is pretty much airtight these days, so the only way you're going to get discussion is through rumors, baseless or not. Meanwhile, Nvidia executes like a well-oiled machine, so there's really no such thing as a major surprise. Lastly, if your concern is wasting time because of a lack of rational discussion, you can always choose not to go into AMD speculation threads. Simple as that.

I just saw an obvious hype train wreck waiting to happen and simply want to get things back onto rational thinking, so we do not get into the wrecks we have today.

As is, neither the RTX 4080 nor the RX 7900 XTX deserves your money. While I can see why the RTX 4080 was priced as it is, as a result of inventory overload, the 7900 XTX is nearly as bad a value once you take into account the RT deficit and the AMD brand. AMD spends vastly less on R&D, so they can afford to price their chips lower (by more than the current $200 gap). At $999 the card is not a good value, particularly given how much AMD is promoting the cost savings of its chiplet design.

When you add up the 5nm die size, the cooler production cost, and the amortized R&D expense, I suspect the margins on the 7900 XTX and the RTX 4080 are pretty similar. What this means is that both Nvidia and AMD are being greedy.

As a result, both cards need to fail at their price points for pricing to get better. Buying AMD simply because they are the little guy accomplishes nothing when AMD keeps raising prices, positioning more or less equal to Nvidia minus 5-10% (all else being equal). AMD can afford to price lower because, if we separate out CPU R&D, AMD spends a quarter or a fifth of what Nvidia does on GPU R&D, which is Nvidia's second-highest expense after wafers/packaging at $6 to 8 billion a year. The rest of the market is not as red as this forum, and the higher AMD's prices go, the more demand shifts toward Nvidia. That lets Nvidia raise prices, and AMD subsequently follows.
 
  • Like
Reactions: psolord and xpea

maddie

Diamond Member
Jul 18, 2010
4,747
4,691
136
[Image: relative performance chart, 3840x2160]


Again: I am literally the only person who got anywhere close to the actual performance.

The hype train was off the charts. It didn't take deep logic; people have simply become irrational with AMD hype, ignoring the marketing bull and taking what is often best-case, exaggerated performance from marketing material as the average.

The fact that Nvidia represents 80 percent of the market, while this hype thread is twice the size of the Lovelace thread, just shows how ingrained team red is on this forum. Logic gets lost to hype for team red. Rational discussion needs more weight in these hype threads so less time is wasted. The number of people lying to themselves, saying this performed as expected when their previous posts say otherwise, is considerable. I say to these people: you don't need to do free marketing for AMD at your own reputation's expense.
Reputations? I assume most here do this as a hobby, out of general enthusiasm. If this is supposed to be my life's work and reputation, I decline. Being wrong here costs so little.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,825
7,190
136
@tajoh111 I think a 50-60% gen-on-gen increase at the top end was a totally rational perspective to take, and that was the general perspective I saw on this board as we zeroed in on release: a die shrink, an arch bump, and similar total silicon to the prior gen's top part.

AMD seemed to have found its footing with the RDNA arch, and RDNA2 really knocked things out of the park while staying on the same 7nm process. 5700 XT to 6700 XT was a 30% performance bump with the same shader layout (a much larger die, though). The arch scaled fantastically top to bottom. AMD (really any GPU vendor, and AMD also had great CPUs in its corner) was making money hand over fist, so there really shouldn't have been a funding problem.

I think it would have been irrationally pessimistic to say that RDNA3 would at best scrape together 35-40% improvements over the prior arch despite more than doubling the transistor count.

I hope we get a post-mortem on this launch. Part of the reason AMD is discussed more is the variability of their launches as well. NV is a known quantity; we generally know what they're going to do, and there is a good amount of rabble-rousing when they miss the mark (the 2080 Ti only managing a 30% raster improvement over the prior gen).
 

pj-

Senior member
May 5, 2015
481
249
116
Yes, the type of people who pay $2000 for a GPU do not care at all. You can call them all sorts of names, but it doesn't change the fact that they will buy a 600W TBP card. They'll buy so eagerly that it won't be in stock at MSRP for a year after launch. More worryingly, it might not even have an MSRP, launching with only third-party cooler designs.

Idk, I think I'm the target demo for high-end cards (way overpaid for a 3090 during the shortage, constantly buying dumb electronics, etc.) and I was already put off enough by the 4090 not to get one. The physical size, price, and already-high power consumption made it very unappealing. Someone I know who also has a 3090 is in the same boat.

I think Nvidia has found the limit for the average rube who buys high end. There will be an audience for even more ridiculous cards, but I can't imagine it's very big.
 

biostud

Lifer
Feb 27, 2003
18,251
4,765
136
@tajoh111 I think a 50-60% gen-on-gen increase at the top end was a totally rational perspective to take, and that was the general perspective I saw on this board as we zeroed in on release: a die shrink, an arch bump, and similar total silicon to the prior gen's top part.

AMD seemed to have found its footing with the RDNA arch, and RDNA2 really knocked things out of the park while staying on the same 7nm process. 5700 XT to 6700 XT was a 30% performance bump with the same shader layout (a much larger die, though). The arch scaled fantastically top to bottom. AMD (really any GPU vendor, and AMD also had great CPUs in its corner) was making money hand over fist, so there really shouldn't have been a funding problem.

I think it would have been irrationally pessimistic to say that RDNA3 would at best scrape together 35-40% improvements over the prior arch despite more than doubling the transistor count.

I hope we get a post-mortem on this launch. Part of the reason AMD is discussed more is the variability of their launches as well. NV is a known quantity; we generally know what they're going to do, and there is a good amount of rabble-rousing when they miss the mark (the 2080 Ti only managing a 30% raster improvement over the prior gen).
Exactly. Also, when they claim a 54% increase in performance per watt, you kind of expect a 355W card (7900 XTX) to perform more than 54% faster than a 300W card (6900 XT).
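For what it's worth, here's a quick sketch of that expectation, assuming the claimed perf/W uplift holds at these board powers and that performance scales linearly with power (both my assumptions, not AMD's fine print):

```python
# Sanity check of the perf/W expectation above.
# Assumes the claimed uplift applies at these board powers and that
# performance scales linearly with power, which real cards do not.
perf_per_watt_gain = 1.54   # AMD's claimed RDNA3 perf/W uplift
power_6900xt_w = 300
power_7900xtx_w = 355

# perf = (perf/W) * W, so the expected uplift compounds the power increase.
expected_gain = perf_per_watt_gain * (power_7900xtx_w / power_6900xt_w)
print(f"expected 7900 XTX vs 6900 XT: {expected_gain:.2f}x")  # ~1.82x
```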
 

Panino Manino

Senior member
Jan 28, 2017
821
1,022
136
It's not that people became "irrational" with AMD hype; it's just that people are very optimistic, waiting to see a turnaround similar to the CPU side. AMD had the opportunity: it could have made a bigger GPU, it could have reached higher clocks, but as always AMD missed the opportunity. There isn't someone there like Jacket Man to force the team to make a ludicrous GPU no matter the cost.

My disappointment this time is that all the "estimations" said it would land BETWEEN the 4080 and 4090.
Tell me, does that performance look like it's in BETWEEN? It's barely on par with the 4080 even in raster.

AMD should abandon the "leader" talk and just say "it's bad, but at least it's cheaper".
 
  • Like
Reactions: insertcarehere

Kaluan

Senior member
Jan 4, 2022
500
1,071
96
It seems the extra FLOP/cycle is almost impossible to use?
If you take the TFLOPs and divide by 2, then compare to the 6950 XT, it matches TPU's 4K results almost exactly.
So all that complication in the compiler for no real gain.
Well, duh. That's where the "N31 broke" part comes in.

Are there still people around who are gonna tell me N31 is working as intended and the OG leak clocks and performance projections were just based on pure fabrications?

When we have data like this now?

[Image]
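As a rough illustration of the quoted dual-issue argument, using approximate public boost-clock TFLOPs specs (my figures, not the poster's):

```python
# Rough sketch of the dual-issue argument quoted above.
# TFLOPs figures are approximate boost-clock paper specs, not measurements.
tflops_7900xtx = 61.4   # counts RDNA3's dual-issue FP32 (2 ops per lane)
tflops_6950xt = 23.7    # single-issue FP32

# Treating the second issue slot as unusable halves the effective figure.
effective_7900xtx = tflops_7900xtx / 2

uplift = effective_7900xtx / tflops_6950xt
print(f"implied uplift with dual-issue discounted: {uplift:.2f}x")
# ~1.30x, in the same ballpark as TPU's measured 4K gap over the 6950 XT.
```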
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,825
7,190
136
It will be really funny, and I mean reeeeeaaaaalllllyyyy funny, if the N32 7800 XT ends up just a few % off the 7900 XT and the N31 die gets refreshed next year.

At that point I'd have to wonder out loud to myself why the actual **** AMD wouldn't just lead with the N32 die and then launch N31/7900 XTX later once it's sorted out. They basically did it with Tahiti/Hawaii for the first go at the 7xxx series; they really should have done it again (if N32 isn't also a complete disaster).

At this rate AMD is not only going to manage to get people to buy NV's top end, but NV's last-gen stack too (since naturally one would wonder whether the 7800 XT would even catch a 6800 XT at this point, and at what price).

I mean, in this environment, with NV dominating the market and Intel dipping their toes in (for realsies this time!), AMD is honestly not in a good position to handle a flubbed launch.
 
  • Like
Reactions: Joe NYC and RnR_au

exquisitechar

Senior member
Apr 18, 2017
657
871
136
It will be really funny, and I mean reeeeeaaaaalllllyyyy funny, if the N32 7800 XT ends up just a few % off the 7900 XT and the N31 die gets refreshed next year.

At that point I'd have to wonder out loud to myself why the actual **** AMD wouldn't just lead with the N32 die and then launch N31/7900 XTX later once it's sorted out. They basically did it with Tahiti/Hawaii for the first go at the 7xxx series; they really should have done it again (if N32 isn't also a complete disaster).
Probably because then they'd have had nothing in 2022, and they had said they would launch this year.

Full N32 could indeed be very close to the 7900 XT.
 

leoneazzurro

Senior member
Jul 26, 2016
930
1,465
136
It's not that people became "irrational" with AMD hype; it's just that people are very optimistic, waiting to see a turnaround similar to the CPU side. AMD had the opportunity: it could have made a bigger GPU, it could have reached higher clocks, but as always AMD missed the opportunity. There isn't someone there like Jacket Man to force the team to make a ludicrous GPU no matter the cost.

My disappointment this time is that all the "estimations" said it would land BETWEEN the 4080 and 4090.
Tell me, does that performance look like it's in BETWEEN? It's barely on par with the 4080 even in raster.

AMD should abandon the "leader" talk and just say "it's bad, but at least it's cheaper".

It's 5-10% faster than the 4080 outside of RT, not "barely on par", and it's $200 cheaper.
Yes, it is not the groundbreaking product many hoped for, but at that price it's not bad at all considering the market and what the competition has. Please stop all the whining.
 

lucasworais

Junior Member
Dec 11, 2022
1
0
6
It will be really funny, and I mean reeeeeaaaaalllllyyyy funny, if the N32 7800 XT ends up just a few % off the 7900 XT and the N31 die gets refreshed next year.

At that point I'd have to wonder out loud to myself why the actual **** AMD wouldn't just lead with the N32 die and then launch N31/7900 XTX later once it's sorted out. They basically did it with Tahiti/Hawaii for the first go at the 7xxx series; they really should have done it again (if N32 isn't also a complete disaster).

At this rate AMD is not only going to manage to get people to buy NV's top end, but NV's last-gen stack too (since naturally one would wonder whether the 7800 XT would even catch a 6800 XT at this point, and at what price).

I mean, in this environment, with NV dominating the market and Intel dipping their toes in (for realsies this time!), AMD is honestly not in a good position to handle a flubbed launch.

Rofl, this is exactly what's gonna happen.
 

insertcarehere

Senior member
Jan 17, 2013
639
607
136
It's 5-10% faster than the 4080 outside of RT, not "barely on par", and it's $200 cheaper.
Yes, it is not the groundbreaking product many hoped for, but at that price it's not bad at all considering the market and what the competition has. Please stop all the whining.

This thing draws nearly as much power as a 1060 at full bore... while playing 1440p YouTube. And given the multi-monitor woes, for a good chunk of consumers it will practically consume more energy than any 4090. That would be embarrassing at $600, let alone $1k.

TPU said:
We measured a shocking power consumption result for multi-monitor and media playback. Here, just the graphics card alone consumes 103 W and 88 W, respectively. This is way too high, RTX 4080 uses only 20-23 W in the same scenario, even the last generation RDNA2 cards were less than half that with 40 W. This can only be some sort of driver bug, because it basically disqualifies the new Radeons for multi-monitor use. Remember, this is idle sitting at the desktop, not gaming. Wasting that much power is simply a big no-no, especially in these times. This also affects YouTube playback in your browser (YT 4K/1440p = ~100 W, 1080p = ~50 W, 720p = ~20 W). AMD has had a long history of drawing a lot of power in these power states, so I'm not 100% convinced this really is so easy to fix. I also find it hard to imagine that nobody at AMD tests multi-monitor power draw, so in some meeting somewhere, someone decided "we will release it like that."
 

leoneazzurro

Senior member
Jul 26, 2016
930
1,465
136
This thing draws nearly as much power as a 1060 at full bore... while playing 1440p YouTube. And given the multi-monitor woes, for a good chunk of consumers it will practically consume more energy than any 4090. That would be embarrassing at $600, let alone $1k.

That may or may not be a problem. For some people here, 450W or 600W high-end cards are not an issue. And I suspect that if Nvidia had a similar issue, a lot of people here would not care.
 

gdansk

Platinum Member
Feb 8, 2011
2,123
2,629
136
That may or may not be a problem. For some people here, 450W or 600W high-end cards are not an issue. And I suspect that if Nvidia had a similar issue, a lot of people here would not care.
Oh no, we'd complain. It's pretty unacceptable. We had a little fit over the RTX 30 series' high multi-monitor HDR power draw, but with Nvidia people expect them to fix it.
And Phoronix suggested that it is not an issue with the Linux drivers, so it seems to be a software problem.
 
Last edited:

gdansk

Platinum Member
Feb 8, 2011
2,123
2,629
136
I did not see many complaining about the 12V cable fiasco, or about a $2000 card having DP 1.4. And these are not SW issues.
It was a big subject until GN said, basically, "you're plugging it in wrong." At least you don't have to wait for Nvidia to make your card work; you can just plug it in carefully anytime.

DP 1.4 does suck. But it isn't a deal breaker.
 
  • Like
Reactions: scineram