Question Speculation: RDNA3 + CDNA2 Architectures Thread

Page 93

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I got a 5700 for $329. I'd gladly do that again. Before that I had an RX 480 8GB that cost $239. I used that for 1440p in Doom 2016 and BF 1. I had to turn down some settings on BF, but it was doable.

Was that a new 5700 before the 6K series came out? The AIB cards were mostly in the $380 range. $329 is a screaming deal.

The 480 (I had one also) was sold as a 1080p card. It was not marketed for 1440p. The Fury cards were intended for 1440p and 4K (they came out a year earlier, though). And then Vega took over that place in the market a year after Polaris launched.
 
  • Like
Reactions: Leeea

Thunder 57

Platinum Member
Aug 19, 2007
2,675
3,801
136
Was that a new 5700 before the 6K series came out? The AIB cards were mostly in the $380 range. $329 is a screaming deal.

The 480 (I had one also) was sold as a 1080p card. It was not marketed for 1440p. The Fury cards were intended for 1440p and 4K (they came out a year earlier, though). And then Vega took over that place in the market a year after Polaris launched.

Yes, before RDNA2. Got one for $329 on 21 May 2020 and it included two games which I never really played. Even if it was a cheaper AIB it looks like I got quite the deal at the time. Just got lucky I suppose.

At 1440p, the 480 did very well in Doom since it used Vulkan. It was also better than the 1060 6GB in BF, which was the other demanding game I was playing at the time. That made it an easy decision.
 

Kaluan

Senior member
Jan 4, 2022
500
1,071
96
Based on info provided at B3D regarding laptop N32 performance and power draw that shows an 80% perf/watt gain
Using synthetics such as Fire Strike and Time Spy, the gap between the 145-165W 6800M/6850M XT and the 335W 6950XT points to more like a ~100% perf/watt gain for the hypothetical mobile N32. From my research, the 6950XT scores roughly 95% more than those mobile parts' graphics scores (around 11,200 Time Spy Graphics for them vs 22,000+ for the 6950XT).
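Back-of-envelope, using only the scores and board powers quoted above (leaked/estimated numbers, so treat this as a sanity check rather than a measurement):

```python
# Rough check of the ~100% perf/watt figure, using the numbers from the post.
mobile_6850m = {"score": 11_200, "watts": 165}   # 6850M XT, ~145-165W class
desktop_6950 = {"score": 22_000, "watts": 335}   # reference 6950XT

# Hypothetical mobile N32: 6950XT-class score at 6850M XT-class power
hypothetical_n32 = {"score": desktop_6950["score"], "watts": mobile_6850m["watts"]}

def perf_per_watt(card):
    return card["score"] / card["watts"]

gain = perf_per_watt(hypothetical_n32) / perf_per_watt(mobile_6850m) - 1
print(f"Implied perf/watt gain vs 6850M XT: {gain:.0%}")   # ~96%, i.e. roughly double
```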

Seems a bit far-fetched, as I don't think AMD would have sandbagged their >50% claim that much.
Yeah, it's not like they said >15% PPC five months ago and then we got ~29% on Zen 4 announcement/review day 😅

They got >50% perf/watt on the same node with the previous gen (and the one before that, though I'm not sure if that comparison was RDNA vs 7nm Vega or vs vanilla Vega). It would not exactly be a shocker if they achieve quite a bit more with a full node jump and a hugely rearchitected graphics pipeline (maybe the biggest change since HD 7000/GCN1).
 

Asterox

Golden Member
May 15, 2012
1,026
1,775
136
For AMD to match the 4090 in raster at 350W it needs around a 62% perf/watt gain.

Not impossible given the 6900XT had a 64% perf/watt gain vs the 5700XT but not easy either.


I don't see that MLID has anything new to offer here.
  • Points 1 and 2 can be taken from that picture of the reference design.
  • Point 3 is new but could just be BS or is just a guess as to the lack of leaks.
  • Point 4 can be taken from the Igor picture.
  • Point 5 can be calculated using the 50% perf/watt gain and a 350W TBP from the 6900XT or 6950XT.
  • The bit about supply could also just be made up BS that will be hard to verify after the fact.
So just looks like a lot of hot air and a restatement of stuff we have already seen.

Keep in mind, there are plenty of games that prefer AMD GPU hardware. In such situations, RDNA3 will be faster, that's for sure. This is obviously one very good example of that.

 

blckgrffn

Diamond Member
May 1, 2003
9,128
3,069
136
www.teamjuchems.com
This is insane. Could it be because of optimizing for AMD hardware on consoles?

I've been positive that this was the reason - existential, imo - that AMD had to win the console "wars". Their diminutive PC market share just isn't enough, but RDNA2 fixed-configuration boxes in living rooms around the world? Suddenly it's worth optimizing for, and we know how much effort goes into most PC ports. If the Nvidia cards are faster in a brute-force kind of way, then they're working well enough to not get the optimization hours.

Was that a new 5700 before the 6K series came out? The AIB cards were mostly in the $380 range. $329 is a screaming deal.

The 480 (I had one also) was sold as a 1080p card. It was not marketed for 1440p. The Fury cards were intended for 1440p and 4K (they came out a year earlier, though). And then Vega took over that place in the market a year after Polaris launched.

My dad got a 5700 vanilla at Best Buy for like $280, though it was a blower. I got my 5700XT blower on eBay from Dell.com for about $340 with an eBay coupon. For a while I looked like a genius.

At that time I was struggling to justify upgrading from my 290X and really wanted to hold out for RDNA2, but that 5700 worked so well when I tested it! I was hunting used upgrades, and I got a counter offer from an eBay seller for $98 on a Fury X Nano at that time; he had like 8 to sell. Obviously I should have bought all of those and held them for a year, and then I could have had cheap, very gameable cards for any build throughout the pandemic. Oops ;)

3 years ago!

(attached screenshot)

Looks like I gave myself too much credit - it was $370 with tax for my 5700XT:

(attached screenshot)
 

Kaluan

Senior member
Jan 4, 2022
500
1,071
96
Since, conveniently for my comment, we're on actual games and optimizations now, I'll say I fully expect AMD to make a big fuss during the announcement video about this game and how well RDNA3 ray traces it:

(Title: The Callisto Protocol Will Support Ray Tracing, Even on Eyeballs)

(along with Forspoken and an unnamed Ubisoft game or announcement)
 
  • Like
Reactions: Leeea and Joe NYC

JujuFish

Lifer
Feb 3, 2005
11,004
735
136
RT performance per dollar and power efficiency are the two main things I'll be looking at for my next upgrade. I've held onto my 1070 for quite a while and I think RT has had a long enough time to bake.
 
  • Like
Reactions: Leeea

Mopetar

Diamond Member
Jan 31, 2011
7,848
6,015
136
No real answer for DLSS 3.

Honestly, I hope AMD doesn't waste time developing anything comparable.

It is indicating that AMD beats a 450W card using far less power

Has anyone looked at how much performance a 4090 loses if you drop it down to 350W? If Nvidia is only getting the last ~5% of performance at the cost of an extra 100W, then it's not too surprising. We've seen AMD get better CPU performance for less power because pushing a chip to its limits just ends up drawing a disproportionate amount of power for the extra performance you get.

RT performance per dollar and power efficiency are the two main things I'll be looking at for my next upgrade. I've held onto my 1070 for quite a while and I think RT has had a long enough time to bake.

I think it'll be at least another generation, probably more likely two, before RT is actually ready and a reasonable option for most consumers. It's only with the 4090 that you can actually run a game with RT on and get acceptable frame rates without needing a fancy upscaling technique to compensate for the performance hit.

Of course, by then we might see a new craze (8K displays seem possible) and people will have largely quit caring about RT, or it will get relegated to a niche like VR.
 

JujuFish

Lifer
Feb 3, 2005
11,004
735
136
I think it'll be at least another generation, probably more likely two, before RT is actually ready and a reasonable option for most consumers. It's only with the 4090 that you can actually run a game with RT on and get acceptable frame rates without needing a fancy upscaling technique to compensate for the performance hit.
If you're playing at 4K, perhaps. My monitor is 1440p, so considerably less demanding.
 
  • Like
Reactions: Leeea

DDH

Member
May 30, 2015
168
168
111
For AMD to match the 4090 in raster at 350W it needs around a 62% perf/watt gain.

Not impossible given the 6900XT had a 64% perf/watt gain vs the 5700XT but not easy either.

I don't see that MLID has anything new to offer here.
  • Points 1 and 2 can be taken from that picture of the reference design.
  • Point 3 is new but could just be BS or is just a guess as to the lack of leaks.
  • Point 4 can be taken from the Igor picture.
  • Point 5 can be calculated using the 50% perf/watt gain and a 350W TBP from the 6900XT or 6950XT.
  • The bit about supply could also just be made up BS that will be hard to verify after the fact.
So just looks like a lot of hot air and a restatement of stuff we have already seen.

No it doesn't. A 50% performance-per-watt improvement will increase performance up to 4090 levels at 4K and 350W.
Where are you getting 64% from?

100% / 300W x 1.5 x 350W = 175% of the 6900XT's speed at 4K. According to TechPowerUp's 4K charts, the 4090 is 75% faster than the 6900XT.
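The same scaling written out (the 300W is the 6900XT's reference TBP, the 1.5x is the claimed perf/watt gain taken at face value, and 350W is an assumed board power for the new card):

```python
# Back-of-envelope scaling from the numbers quoted above.
base_perf = 1.00      # 6900XT raster performance at 4K (normalized)
base_power = 300      # 6900XT reference TBP in watts
ppw_gain = 1.50       # the claimed >50% perf/watt improvement, taken at exactly 1.5x
target_power = 350    # assumed board power for the new card, in watts

projected = base_perf / base_power * ppw_gain * target_power
print(f"Projected performance vs 6900XT: {projected:.2f}x")  # 1.75x, i.e. +75%

# TechPowerUp's 4K charts put the 4090 at roughly +75% over the 6900XT,
# which is why +50% perf/watt at 350W would land at about 4090 level.
```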
 

Kaluan

Senior member
Jan 4, 2022
500
1,071
96
ComputerBase's review had it at 3% slower on average at 4K, but up to 8% slower in the worst case. Dropping it to 300W increases the average drop to 10%.
Performance vs stock/450W takes a much bigger dive in RT and Tensor-using gaming scenarios tho.

Most outlets don't really test power usage with RT-enabled settings in games, let alone with stuff like 'DLSS' or 'RTX Voice' turned on, do they...

Not to mention power usage at various framerate caps is probably THE most important aspect to measure. From what I've seen, RDNA2 did even better here than in the raw (and not very useful) frames-per-watt or typical/peak power measurements where we already see it beat Ampere.

So just going by SOME data doesn't give you a complete picture. At all.
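For what it's worth, here's what those ComputerBase figures imply for efficiency if the stock point is taken as the full 450W limit (an assumption; average gaming draw is lower, as Igor's numbers further down show):

```python
# Implied perf/watt change when power-limiting a 4090, using the
# ComputerBase-style figures quoted above. The 450W stock baseline is an
# assumption; real average gaming draw is lower (see Igor's measurements).
stock = {"perf": 1.00, "watts": 450}
limited = {
    "350W": {"perf": 0.97, "watts": 350},  # ~3% slower on average at 4K
    "300W": {"perf": 0.90, "watts": 300},  # ~10% slower on average
}

def perf_per_watt(point):
    return point["perf"] / point["watts"]

for name, point in limited.items():
    gain = perf_per_watt(point) / perf_per_watt(stock) - 1
    print(f"{name}: ~{gain:.0%} better perf/watt than stock")   # ~25% and ~35%
```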
 
  • Like
Reactions: Tlh97 and Leeea

Karnak

Senior member
Jan 5, 2017
399
767
136
It is indicating that AMD beats a 450W card using far less power
Tbf the 4090 is not a "450W" card. On paper, yes, but while gaming it's actually more like a 3xxW GPU: 371W on average in 4K according to Igor's review (11 games total).

You have to compare actual power consumption and not the TDP number alone.
 

Glo.

Diamond Member
Apr 25, 2015
5,711
4,559
136
Keep in mind, there are plenty of games that prefer AMD GPU hardware. In such situations, RDNA3 will be faster, that's for sure. This is obviously one very good example of that.

The 4090 performs just like we would expect. It's the previous generations of Nvidia GPUs that lack optimization. Nothing to see here, just the usual Nvidia driver shenanigans of giving previous generations lower priority with game-ready drivers.

If any priority...
 

Karnak

Senior member
Jan 5, 2017
399
767
136
Is top end Navi 31 any faster than RTX 4070 will be?
I mean the 4070 prob won't be faster than a 3090(Ti) so somewhere around 6950XT in raster. But I get what you mean (N21 = 3070 rumors).

But I hope we'll see some good RT numbers tomorrow. Raster is great but we are almost in 2023 and the 4090 is now good enough for RT even in 4K (ok, besides some CP overdrive nonsense lol).
 
  • Like
Reactions: Leeea

Timorous

Golden Member
Oct 27, 2008
1,625
2,792
136
No it doesn't. A 50% performance-per-watt improvement will increase performance up to 4090 levels at 4K and 350W.
Where are you getting 64% from?

100% / 300W x 1.5 x 350W = 175% of the 6900XT's speed at 4K. According to TechPowerUp's 4K charts, the 4090 is 75% faster than the 6900XT.

1) I used 6950XT perf as a baseline, not the 6900XT.
2) I used Techspot's 4K numbers because they sit between TPU's low end and ComputerBase's high end. TPU's use of a 5800X holds back the 4090 even at 4K.
3) The reference 6950XT has a 335W TBP, so 350W is about 4.4% extra, and the 4090 is about 1.69x faster than the 6950XT at Techspot.
4) 64% was what AMD claimed the 6900XT's perf/watt improvement over the 5700XT was.
5) I estimated the 62% improvement required based on points 1, 2 and 3, as 1.044 x 1.62 is approximately 1.69 in round numbers.

So yes it does when using numbers that are not being held back by CPU choice.
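Working that estimate through with the same figures (Techspot's 4K scaling, the reference 6950XT's 335W TBP, and an assumed 350W board power for the RDNA3 part):

```python
# Sketch of the ~62% perf/watt estimate above, using the figures as quoted.
perf_4090_vs_6950xt = 1.69   # 4090 is ~1.69x the 6950XT at 4K (Techspot)
tbp_6950xt = 335             # reference 6950XT total board power, watts
tbp_target = 350             # assumed board power for the RDNA3 part, watts

power_scaling = tbp_target / tbp_6950xt            # ~1.044, i.e. +4.4% power
ppw_gain_needed = perf_4090_vs_6950xt / power_scaling
print(f"Required perf/watt gain to match the 4090: {ppw_gain_needed - 1:.0%}")  # ~62%
```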
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I was hunting used upgrades, and I got a counter offer from an eBay seller for $98 on a Fury X Nano at that time; he had like 8 to sell. Obviously I should have bought all of those and held them for a year, and then I could have had cheap, very gameable cards for any build throughout the pandemic. Oops ;)

The Nano was such a cool card.

Tbf the 4090 is not a "450W" card. On paper, yes, but while gaming it's actually more like a 3xxW GPU: 371W on average in 4K according to Igor's review (11 games total).

You have to compare actual power consumption and not the TDP number alone.

The 4090 is a ~470W card when gaming at 4K with RT enabled and without DLSS.
If you use DLSS the power drops a bunch (as it's rendering at a much lower resolution), as seen in this chart for the 4090 FE (not an AIB):
(attached chart: 4090 FE power consumption)
 

Karnak

Senior member
Jan 5, 2017
399
767
136
The 4090 is a ~470W card when gaming at 4K with RT enabled and without DLSS.
If you use DLSS the power drops a bunch (as it's rendering at a much lower resolution), as seen in this chart for the 4090 FE (not an AIB):
(attached chart: 4090 FE power consumption)
How do you get almost 480W? Even my Gaming OC with its 600W BIOS only gets above 450W if I raise the PL manually. So I don't know how TPU got their numbers. Probably OC, which then is pretty pointless (or is it peak power consumption? Peaks could go over the default 450W without raising the PL).

Igor measured the average power consumption across a total of 11 games, and tbh I think his equipment is more "trustworthy" than anyone else's. No offense though @TPU.

(attached chart: Igor's Lab average 4K gaming power draw)
 
  • Like
Reactions: Tlh97 and Saylick