Question Speculation: RDNA3 + CDNA2 Architectures Thread


uzzi38

Platinum Member
Oct 16, 2019
2,705
6,427
146

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
29,625
24,616
146
idk about this future proofing... my brother is running an RX 480 8GB card. Was there ever a game it could run at 60Hz that would use the full vram?
That's an outdated mindset. Years ago, debates here about why a card wasn't fast enough to need more VRAM were all the rage.

Times change. Now, more VRAM even on slower cards means you can turn textures up. It's one of the best visual improvements, and it doesn't cost much in performance.
 

Timorous

Golden Member
Oct 27, 2008
1,748
3,240
136

That would be 86% faster than the 6950XT at a similar power draw.

Based on info provided at B3D regarding laptop N32 performance and power draw, which shows an ~80% perf/watt gain, using that number you can get to 10% faster than the 4090 at 350W. Seems a bit far-fetched, as I don't think AMD would have sandbagged their >50% claim that much.

OTOH, 4090 +/- 10% at 350W is probably in the ballpark for 4K raster, so it's a high-end guess but not an absolutely impossible one, I don't think.
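A rough sketch of that arithmetic, using the thread's own figures plus one assumption (a 6950XT baseline of 335W, the ~80% perf/watt number from the B3D laptop N32 info, and a 4090 taken as roughly 70% faster than the 6950XT at 4K raster, which is what the 86%/10% pairing above implies):

```python
# Perf/watt back-of-the-envelope for the post above. All inputs are the
# thread's speculation/assumptions, not measured data.

BASE_POWER_W = 335         # RX 6950 XT reference TBP
NEW_POWER_W = 350          # speculated N31 board power
PERF_PER_WATT_GAIN = 0.80  # ~80% gain inferred from the B3D laptop N32 info

# Spend the whole perf/watt gain on speed at the new board power:
rel_vs_6950xt = (1 + PERF_PER_WATT_GAIN) * (NEW_POWER_W / BASE_POWER_W)
print(f"~{rel_vs_6950xt:.2f}x the 6950XT")   # ~1.88x, i.e. ~86-88% faster

# Assume the 4090 is ~70% faster than the 6950XT at 4K raster (assumption):
rel_vs_4090 = rel_vs_6950xt / 1.70
print(f"~{rel_vs_4090:.2f}x the 4090")       # ~1.11x, i.e. ~10% faster
```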
 
  • Like
Reactions: Tlh97 and Joe NYC

SteveGrabowski

Diamond Member
Oct 20, 2014
7,450
6,172
136
Ha! What do you call NVIDIA going from an MSRP of $600 for the RTX 2080 and 3080 to a $900 card that they "unlaunched" and a $1200 card?

I have already pretty much written Nvidia off, no interest in any of their cards unless the 3060 Ti absolutely crashes in price at some point this month.
 

maddie

Diamond Member
Jul 18, 2010
4,881
4,951
136
If the new shaders do have 1/2 the transistors (1/2 the area on identical node), what does this say about switching power? Why is this ignored by seemingly everyone?
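For context on the switching-power question, the standard first-order CMOS relation has dynamic power scaling with switched capacitance, so halving the transistors doing the work should roughly halve dynamic power at the same voltage and clock, all else equal. A minimal sketch with placeholder numbers (nothing RDNA3-specific):

```python
# First-order CMOS dynamic power: P_dyn ~ alpha * C * V^2 * f
# alpha = activity factor, C = switched capacitance, V = supply voltage, f = clock.
# Values are arbitrary; only the ratio matters here.

def dynamic_power(alpha: float, c: float, v: float, f: float) -> float:
    return alpha * c * v**2 * f

base = dynamic_power(alpha=0.2, c=1.0, v=1.0, f=2.5e9)
halved_area = dynamic_power(alpha=0.2, c=0.5, v=1.0, f=2.5e9)
print(halved_area / base)  # 0.5 -> half the switched capacitance, half the dynamic power
```

Leakage and any voltage/clock changes would move the real number, which is probably why it isn't a simple halving in practice.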
 

eek2121

Diamond Member
Aug 2, 2005
3,100
4,398
136
If the claims of ~4090 performance at 350W are true, it sounds like 3090 performance may be possible at ~175W? If so, that is pretty amazing.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
God I hope we get a nice 1440p card at 150W for ~$300 or so.
I cannot think of a single 1440p card that has been sold for $300. The 5700 (non-XT) gets close at $349.


The first two were released in the mining boom. Their new cards will be releasing when gpu mining is dead and into a trash economy. I hope gamers don't let AMD normalize mining era MSRPs.
The mining craze started after Ampere/Navi2 launched, not before, which is why the MSRPs went up as the life cycle went on.
 
Last edited:
  • Like
Reactions: Mopetar

Thunder 57

Platinum Member
Aug 19, 2007
2,993
4,571
136
I cannot think of a single 1440p card that has been sold for $300. The 5700 (non-XT) gets close at $349.



The mining craze started after Ampere/Navi2 launched, not before, which is why the MSRPs went up as the life cycle went on.

I got a 5700 for $329. I'd gladly do that again. Before that I had an RX 480 8GB that cost $239. I used that for 1440p in Doom 2016 and BF 1. I had to turn down some settings on BF, but it was doable.
 

biostud

Lifer
Feb 27, 2003
18,700
5,434
136
Is that indicating that if you limit the NVIDIA card to the same wattage as the AMD card it will perform better? I don't think it's indicating it beats the NVIDIA card when it is allowed to use its full power.
It is indicating that AMD beats a 450W card using far less power.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I got a 5700 for $329. I'd gladly do that again. Before that I had an RX 480 8GB that cost $239. I used that for 1440p in Doom 2016 and BF 1. I had to turn down some settings on BF, but it was doable.

Was that a new 5700 before the 6K series came out? The AIB cards were mostly in the $380 range. $329 is a screaming deal.

The 480 (I had one also) was sold as a 1080p card. It was not marketed for 1440p. The Fury cards were intended for 1440p and 4K (they came out a year earlier though). And then Vega took over that place in the market a year after Polaris launched.
 
  • Like
Reactions: Leeea

Thunder 57

Platinum Member
Aug 19, 2007
2,993
4,571
136
Was that a new 5700 before the 6K series came out? The AIB cards were mostly in the $380 range. $329 is a screaming deal.

The 480 (I had one also) was sold as a 1080p card. It was not marketed for 1440p. The Fury cards were intended for 1440p and 4K (they came out a year earlier though). And then Vega took over that place in the market a year after Polaris launched.

Yes, before RDNA2. Got one for $329 on 21 May 2020 and it included two games which I never really played. Even if it was a cheaper AIB it looks like I got quite the deal at the time. Just got lucky I suppose.

At 1440p, the 480 did very well in Doom since it used Vulkan. It was also better than the 1060 6GB in BF, which was the other demanding game I was playing at the time. That made it an easy decision.
 
Last edited:

Kaluan

Senior member
Jan 4, 2022
504
1,074
106
Based on info provided at B3D regarding laptop N32 performance and power draw, which shows an ~80% perf/watt gain
Using synthetics such as Fire Strike and Time Spy, the gap between the 145-165W 6800M/6850M XT and the 335W 6950XT suggests more like ~100% perf/watt for the hypothetical mobile N32. From my research, the 6950XT scores roughly 95% higher than those mobile parts' graphics scores (around 11,200 Time Spy Graphics vs 22,000+).

Seems a bit far-fetched, as I don't think AMD would have sandbagged their >50% claim that much.
Yeah, it's not like they said >15% PPC 5 months ago yet we got ~29% on Zen 4 announcement/review day 😅

They got >50% perf/watt on the same node with the previous gen (and the one before that, though I'm not sure if that was RDNA vs 7nm Vega or vanilla), so it would not exactly be a shocker if they achieve quite a bit more with a full node jump and a hugely rearchitected graphics pipeline (maybe the biggest since HD 7000/GCN1).
 
Last edited:

Asterox

Golden Member
May 15, 2012
1,039
1,823
136
For AMD to match the 4090 in raster at 350W it needs around a 62% perf/watt gain.

Not impossible given the 6900XT had a 64% perf/watt gain vs the 5700XT but not easy either.


I don't see that MLID has anything new to offer here.
  • Points 1 and 2 can be taken from that picture of the reference design.
  • Point 3 is new but could just be BS or is just a guess as to the lack of leaks.
  • Point 4 can be taken from the Igor picture.
  • Point 5 can be calculated using the 50% perf/watt gain and a 350W TBP from the 6900XT or 6950XT.
  • The bit about supply could also just be made up BS that will be hard to verify after the fact.
So just looks like a lot of hot air and a restatement of stuff we have already seen.

Keep in mind, there are plenty of games that prefer AMD GPU hardware. In such situations, RDNA3 will be faster, that's for sure. This is obviously one very good example.

 

blckgrffn

Diamond Member
May 1, 2003
9,301
3,442
136
www.teamjuchems.com
This is insane. Could it be because of optimizing for AMD hardware on consoles?

I've been positive that this was the reason - existential, imo - that AMD had to win the console "wars". Their diminutive PC market share just isn't enough, but RDNA2 fixed-configuration boxes in living rooms around the world? Suddenly it's worth optimizing for, and we know how much effort goes into most PC ports. If the Nvidia cards are faster in a brute force kind of way, then they're working well enough to not get the optimization hours.

Was that a new 5700 before the 6K series came out? The AIB cards were mostly in the $380 range. $329 is a screaming deal.

The 480 (I had one also) was sold as a 1080p card. It was not marketed for 1440p. The Fury cards were intended for 1440p and 4K (they came out a year earlier though). And then Vega took over that place in the market a year after Polaris launched.

My dad got a 5700 vanilla at Best Buy for like $280; it was a blower though. I got my 5700XT blower on eBay from Dell.com for about $340 with an eBay coupon. For a while I looked like a genius.

At that time I was struggling to justify upgrading from my 290X and really wanted to hold out for RDNA2, but that 5700 worked so well when I tested it! I was hunting used upgrades, and I got a counter offer from an eBay seller for $98 on a Fury X Nano at that time; he had like 8 to sell. Obviously I should have bought all of those and held them for a year, and then I could have had cheap, very gameable cards for any build throughout the pandemic. Oops ;)

3 years ago!

(screenshot attachment)

Looks like I gave myself too much credit - it was $370 with tax for my 5700XT:

(screenshot attachment)
 
Last edited:

Kaluan

Senior member
Jan 4, 2022
504
1,074
106
Conveniently for my comment, we're on actual games and optimizations now, so I'll say I fully expect AMD to make a big fuss about this game and how well RDNA3 ray traces it during the announcement video:

(Title: The Callisto Protocol Will Support Ray Tracing, Even on Eyeballs)

(along with Forspoken and an unnamed Ubisoft game or announcement)
 
Last edited:
  • Like
Reactions: Leeea and Joe NYC

JujuFish

Lifer
Feb 3, 2005
11,125
826
136
RT performance per dollar and power efficiency are the two main things I'll be looking at for my next upgrade. I've held onto my 1070 for quite a while and I think RT has had a long enough time to bake.
 
  • Like
Reactions: Leeea

Mopetar

Diamond Member
Jan 31, 2011
8,114
6,770
136
No real answer for dlss 3.

Honestly, I hope AMD doesn't waste their time developing anything comparable.

It is indicating that AMD beats a 450W card using far less power.

Has anyone looked at how much performance a 4090 loses if you drop it down to 350W? If Nvidia is only getting the last ~5% of performance at the cost of an extra 100W, then it's not too surprising. We've seen AMD get better CPU performance for less power because pushing a chip to its limits just ends up drawing a disproportionate amount of power for the extra performance you get.

RT performance per dollar and power efficiency are the two main things I'll be looking at for my next upgrade. I've held onto my 1070 for quite a while and I think RT has had a long enough time to bake.

I think it'll be at least another generation, probably more likely two before RT is actually ready and a reasonable option for most consumers. It's only with the 4090 that you can actually run a game with it on and get acceptable frame rates so that you don't need to use a fancy upscaling technique to compensate for the performance hit.

Of course, by then we might see a new craze (8K displays seem possible) and people will have largely quit caring about RT, or it gets relegated to a niche like VR.
 

JujuFish

Lifer
Feb 3, 2005
11,125
826
136
I think it'll be at least another generation, probably more likely two before RT is actually ready and a reasonable option for most consumers. It's only with the 4090 that you can actually run a game with it on and get acceptable frame rates so that you don't need to use a fancy upscaling technique to compensate for the performance hit.
If you're playing at 4K, perhaps. My monitor is 1440p, so considerably less demanding.
 
  • Like
Reactions: Leeea

DDH

Member
May 30, 2015
168
168
111
For AMD to match the 4090 in raster at 350W it needs around a 62% perf/watt gain.

Not impossible given the 6900XT had a 64% perf/watt gain vs the 5700XT but not easy either.

I don't see that MLID has anything new to offer here.
  • Points 1 and 2 can be taken from that picture of the reference design.
  • Point 3 is new but could just be BS or is just a guess as to the lack of leaks.
  • Point 4 can be taken from the Igor picture.
  • Point 5 can be calculated using the 50% perf/watt gain and a 350W TBP from the 6900XT or 6950XT.
  • The bit about supply could also just be made up BS that will be hard to verify after the fact.
So just looks like a lot of hot air and a restatement of stuff we have already seen.

No it doesn't. A 50% performance per watt improvement will increase performance up to 4090 levels at 4K at 350W.
Where are you getting 64% from?

100% / 300W × 1.5 × 350W = 175% of the 6900XT's speed at 4K. According to TechPowerUp's 4K charts, the 4090 is 75% faster than the 6900XT.
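A minimal sketch of that calculation, using the poster's own figures (6900XT at 300W as the reference, AMD's >50% perf/watt claim taken as a flat 1.5x, and a 350W target board power):

```python
# Back-of-the-envelope for the post above; inputs are the poster's assumptions.

REF_POWER_W = 300      # RX 6900 XT total board power
TARGET_POWER_W = 350   # speculated RDNA3 board power
PERF_PER_WATT = 1.5    # ">50% perf/watt" claim taken at face value

rel_perf = PERF_PER_WATT * TARGET_POWER_W / REF_POWER_W
print(f"{rel_perf:.0%} of the 6900XT at 4K")  # 175%

# Per the post, TechPowerUp's 4K chart has the 4090 ~75% faster than the 6900XT,
# so a +50% perf/watt part at 350W lands roughly at 4090-level raster.
```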
 

Kaluan

Senior member
Jan 4, 2022
504
1,074
106
ComputerBase's review had it at 3% slower on average at 4K, but up to 8% slower. Dropping it to 300W increases the average drop to 10%.
Performance vs stock/450W takes a much bigger dive in RT and Tensor-using gaming scenarios tho.

Most outlets don't really test power usage with RT-enabled settings in games, let alone with stuff like DLSS or RTX Voice turned on, do they...

Not to mention, power usage at various framerate caps is probably THE most important aspect to measure. From what I've seen, RDNA2 did even better here than in the raw (and not very useful) frames/watt or the typical and peak power measurements where we see it beat Ampere.

So just going by SOME data doesn't give you a complete picture. At all.
 
  • Like
Reactions: Tlh97 and Leeea