Question Speculation: RDNA3 + CDNA2 Architectures Thread


Ajay

Lifer
Jan 8, 2001
15,468
7,873
136
Another downright nutty figure is the TDP. In a few generations (maybe just one?) it'll be measured in kilowatts.
And the VideoCardz article says 'passive cooling' - lol. No, they are tower coolers with massive airflow coming from the front. I wouldn't want to forget my earplugs when a rack of those servers starts up :eek:
 
  • Like
Reactions: Thibsie

Saylick

Diamond Member
Sep 10, 2012
3,172
6,410
136
Just nuts if this is true. It's almost like, what do we need this for? 8K gaming? 240 FPS at 4K (we don't even have an interface that can drive that, IIRC)?

3x the performance (on paper) of N21 is looking super legit. The move to Infinity Cache is just genius in hindsight, especially since a wider memory bus + more power-hungry GDDR6X is just not feasible if you want to scale to higher performance targets. AD102 is supposed to use a 384-bit G6X memory bus just like GA102, but I don't see how they can get 2x the bandwidth to feed 2x the execution units without also using some form of on-die cache to relieve pressure on the memory bus. 512MB of Infinity Cache is going to provide so much more effective bandwidth than a traditional approach. That's 4x the IC for 3x the compute units, so the bandwidth/CU is still increased over RDNA2.
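The arithmetic behind that last ratio is easy to sanity-check. A back-of-the-envelope sketch in Python, using the rumored figures (the 512MB cache and ~3x CU count are rumor-mill assumptions, not confirmed specs):

```python
# Back-of-the-envelope check of the "4x cache for 3x CUs" claim.
# All next-gen figures are rumors/assumptions, not confirmed specs.

n21_cache_mb = 128       # Navi 21 Infinity Cache
n21_cus = 80             # Navi 21 compute units

rumored_cache_mb = 512   # rumored next-gen Infinity Cache (4x N21)
rumored_cus = 240        # assuming ~3x N21's CU count

cache_per_cu_n21 = n21_cache_mb / n21_cus            # 1.60 MB/CU
cache_per_cu_next = rumored_cache_mb / rumored_cus   # ~2.13 MB/CU

print(f"N21:     {cache_per_cu_n21:.2f} MB of IC per CU")
print(f"Rumored: {cache_per_cu_next:.2f} MB of IC per CU "
      f"(~{cache_per_cu_next / cache_per_cu_n21:.2f}x N21)")
```

Cache per CU is only a proxy for effective bandwidth per CU (hit rates depend on working set and resolution), but that 4/3 ratio is where the "still increased over RDNA2" conclusion comes from.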
 

biostud

Lifer
Feb 27, 2003
18,251
4,765
136

Just nuts if this is true. It's almost like, what do we need this for? 8K gaming? 240 FPS at 4K (we don't even have an interface that can drive that, IIRC)?

DP 1.4 with DSC can do 5120x1440 @ 240 Hz, which the Odyssey G9 can push, and that's close to 4K.

But otherwise I would think DP 2.0/HDMI 2.1 is a given for next-gen cards.
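A rough sanity check of why that mode needs DSC on DP 1.4, sketched in Python (simplified: the blanking allowance and ~3:1 DSC ratio here are assumptions, and real timings vary):

```python
# Why 5120x1440 @ 240 Hz needs DSC on DP 1.4 (approximate math).

h, v, hz = 5120, 1440, 240
bpp_uncompressed = 24        # 8-bit RGB
bpp_dsc = 8                  # typical DSC target, ~3:1 compression (assumption)
blanking_overhead = 1.12     # rough allowance for blanking intervals (assumption)

dp14_effective_gbps = 25.92  # HBR3 x4 lanes after 8b/10b coding

def required_gbps(bpp):
    # Pixel rate times bits per pixel, padded for blanking.
    return h * v * hz * bpp * blanking_overhead / 1e9

print(f"Uncompressed: {required_gbps(bpp_uncompressed):.1f} Gbps "
      f"vs. {dp14_effective_gbps} Gbps link -> does not fit")
print(f"With DSC:     {required_gbps(bpp_dsc):.1f} Gbps -> fits easily")
```

The same math is why the next-gen links matter: by this estimate 4K @ 240 Hz needs roughly 50+ Gbps uncompressed, which is beyond even HDMI 2.1's ~42 Gbps effective rate but within DP 2.0's ~77 Gbps.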
 
  • Like
Reactions: Tlh97 and Ajay

biostud

Lifer
Feb 27, 2003
18,251
4,765
136
3x the performance (on paper) of N21 is looking super legit. The move to Infinity Cache is just genius in hindsight, especially since a wider memory bus + more power-hungry GDDR6X is just not feasible if you want to scale to higher performance targets. AD102 is supposed to use a 384-bit G6X memory bus just like GA102, but I don't see how they can get 2x the bandwidth to feed 2x the execution units without also using some form of on-die cache to relieve pressure on the memory bus. 512MB of Infinity Cache is going to provide so much more effective bandwidth than a traditional approach. That's 4x the IC for 3x the compute units, so the bandwidth/CU is still increased over RDNA2.

I wonder if HBM2e or HBM3 is going to appear on a gaming card.
 

Justinus

Diamond Member
Oct 10, 2005
3,175
1,518
136

Just nuts if this is true. It's almost like, what do we need this for? 8K gaming? 240 FPS at 4K (we don't even have an interface that can drive that, IIRC)?

I've got a 4K 160 Hz display and a 6900 XT, and more games than I'd like run well below 100 FPS. Shadow of the Tomb Raider only runs at 50 FPS with ray tracing enabled. Even tripling the performance wouldn't hit 160 Hz on my setup now.

More realistically though, most games at 4K ultra are running 70-90 FPS, with a few special games running much faster (Doom Eternal, FH4, etc.). That config would let me actually approach or maintain 160 Hz in most games.
 

Ajay

Lifer
Jan 8, 2001
15,468
7,873
136
I've got a 4K 160 Hz display and a 6900 XT, and more games than I'd like run well below 100 FPS. Shadow of the Tomb Raider only runs at 50 FPS with ray tracing enabled. Even tripling the performance wouldn't hit 160 Hz on my setup now.

More realistically though, most games at 4K ultra are running 70-90 FPS, with a few special games running much faster (Doom Eternal, FH4, etc.). That config would let me actually approach or maintain 160 Hz in most games.
Well, I didn't think about that - I never run Ultra settings. Most of the games I play are fast-paced shooters, so pretty scenery just doesn't matter to me. Plus, I haven't bought any frame-killer games since I'm still on a GTX 1070. My 'skip every other generation' preference got blown out of the water with the current crazy prices.
 
Last edited:
  • Like
Reactions: Tlh97 and biostud

Saylick

Diamond Member
Sep 10, 2012
3,172
6,410
136
Well, I didn't think about that - I never run Ultra settings. Most of the games I play are fast-paced shooters, so pretty scenery just doesn't matter to me. Plus, I haven't bought any frame-killer games since I'm still on a GTX 1070. My 'skip every other generation' preference got blown out of the water with the current crazy prices.
At the rate we're going, that statement might end up being true, except it won't be in reference to graphics card generations but genealogical generations. Your grandkids, if you have any, better start saving up now if they want to afford a graphics card down the road. :p

I can picture it now... children in primary school will brag to their friends that their grandpops once owned an RTX 3080. All their friends will go "wooooah" in astonishment.
 

eek2121

Platinum Member
Aug 2, 2005
2,930
4,026
136
DP 1.4 with DSC can do 5120x1440 @ 240 Hz, which the Odyssey G9 can push, and that's close to 4K.

But otherwise I would think DP 2.0/HDMI 2.1 is a given for next-gen cards.
Fun fact: The G9 itself couldn't handle 240 Hz (factory defect; it interlaces the second the monitor is pushed somewhat hard).
Well, I didn't think about that - I never run Ultra settings. Most of the games I play are fast-paced shooters, so pretty scenery just doesn't matter to me. Plus, I haven't bought any frame-killer games since I'm still on a GTX 1070. My 'skip every other generation' preference got blown out of the water with the current crazy prices.
The 3090 handles ultra settings like a champ, but STILL struggles with RT stuff.

Games will also get more demanding as time goes on.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,826
7,190
136
At the rate we're going, that statement might end up being true, except it won't be in reference to graphics card generations but genealogical generations. Your grandkids, if you have any, better start saving up now if they want to afford a graphics card down the road. :p

I can picture it now... children in primary school will brag to their friends that their grandpops once owned an RTX 3080. All their friends will go "wooooah" in astonishment.

-Whole new meaning to "Legacy" Graphics Cards and Drivers...
 

eek2121

Platinum Member
Aug 2, 2005
2,930
4,026
136
Yeah, my idea of "I will get a 5700XT to hold me over for now" turned into "Looks like I will be running this until it dies".

When the RTX 30 series came out, I made the controversial decision to upgrade from my 1080 Ti to a 3090, something I never do (I would usually skip at least one more generation, or not buy high-end at all). I bit the bullet, and I am glad every day that I did. Now I have a 3090 and a 1080 Ti, both of which I purchased for MSRP.

I would love to get a 6900xt for my Linux machine, however.

Bonus points: I am on EVGA's queue system for a 3080 and a 3090. I was going to cancel, but I might buy them and use them for trades instead.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,826
7,190
136
Frankly any card purchased for MSRP is a good deal in this day and age.

-Depends on the MSRP though.

Reading about people getting their MSI or Gigabyte 3080s for "$1300 MSRP" when the Founders Edition was $800 makes my head spin a bit.

I weep for the MSRP of even Founders Edition-type cards for this next round.
 

Jwilliams01207

Junior Member
Dec 6, 2013
24
2
71
The Steam hardware survey has its issues, but even as unscientific as it is, it's still unlikely to be wrong by such a margin that 4K displays are really 10% of the market.

People on tech forums get nice new shiny toys a lot more frequently than the average person. 4K TVs are really cheap if you don't give a damn about picture quality, but most people don't even know they can be used as a monitor. To them a TV is a TV and a monitor is a monitor.
 

Jwilliams01207

Junior Member
Dec 6, 2013
24
2
71
The Steam hardware survey has its issues, but even as unscientific as it is, it's still unlikely to be wrong by such a margin that 4K displays are really 10% of the market.

People on tech forums get nice new shiny toys a lot more frequently than the average person. 4K TVs are really cheap if you don't give a damn about picture quality, but most people don't even know they can be used as a monitor. To them a TV is a TV and a monitor is a monitor.
 

Mopetar

Diamond Member
Jan 31, 2011
7,848
6,015
136
The Steam hardware survey has its issues, but even as unscientific as it is, it's still unlikely to be wrong by such a margin that 4K displays are really 10% of the market.

People on tech forums get nice new shiny toys a lot more frequently than the average person. 4K TVs are really cheap if you don't give a damn about picture quality, but most people don't even know they can be used as a monitor. To them a TV is a TV and a monitor is a monitor.

You've got something going on because this is the third thread I've seen you have a double post in.
 

thecoolnessrune

Diamond Member
Jun 8, 2005
9,672
578
126
Another downright nutty figure is the TDP. In a few generations (maybe just one?) it'll be measured in kilowatts.
In these applications, only perf/W matters in the end. Even if a single package now has a 560W max power draw, it matters little if it has more than 2x the performance of a part with a 300W draw. The PC environment is different.

For the past year or so we've been getting briefed by our partners indicating ~300W CPU / ~400W GPU configurations air cooled, and 400W CPU / 600W GPU configurations liquid cooled. This is why companies like Cisco released the UCS-X Chassis. 2023 is supposed to see higher-wattage GPU configurations than that. 1kW is not far off.
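To put a number on the perf/W point above, a minimal sketch (both performance figures are hypothetical, purely for illustration):

```python
# Perf/W illustration: a hotter part can still be the more efficient one.
# Both "perf" values are hypothetical, for illustration only.

old_part = {"power_w": 300, "perf": 1.0}   # baseline accelerator
new_part = {"power_w": 560, "perf": 2.2}   # >2x performance at higher draw

def perf_per_watt(part):
    return part["perf"] / part["power_w"]

gain = perf_per_watt(new_part) / perf_per_watt(old_part)
print(f"perf/W gain: {gain:.2f}x")  # ~1.18x better despite ~87% more power
```

In a datacenter, that ratio (plus rack density and facility cooling limits) is what decides the purchase, not the absolute wattage of one package.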
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
For the past year or so we've been getting briefed by our partners indicating ~300W CPU / ~400W GPU configurations air cooled, and 400W CPU / 600W GPU configurations liquid cooled. This is why companies like Cisco released the UCS-X Chassis. 2023 is supposed to see higher-wattage GPU configurations than that. 1kW is not far off.

12th gen Intel chips are already at the 300W level, which is absolutely nuts. NVIDIA is already at 400W for GPUs, so it would seem that even higher draws are possible. But even with liquid coolers, 1000W (CPU/GPU combined) is going to be tough to reliably cool. Even my current PC, which is way under that, heats my room up. I would certainly not want a 1000W space heater in here.
 

thecoolnessrune

Diamond Member
Jun 8, 2005
9,672
578
126
12th gen Intel chips are already at the 300W level, which is absolutely nuts. NVIDIA is already at 400W for GPUs, so it would seem that even higher draws are possible. But even with liquid coolers, 1000W (CPU/GPU combined) is going to be tough to reliably cool. Even my current PC, which is way under that, heats my room up. I would certainly not want a 1000W space heater in here.

Yep, the big thing here is that these partners are scoping 300W air cooling for 1U rack servers in dual-CPU configurations: 600W total for two CPUs, while still leaving space for the massive amount of RAM modern servers can contain. A couple of years ago, when partners were getting ready to introduce their new Xeon Scalable lines, they lamented that it was incredibly difficult to arrange the system in a way that could hold the massive sockets as well as the increased RAM channels. That's one of the reasons why we saw almost all the vendors get rid of things like mirrored SD card readers and settle on M.2 card modules only. After all that, they released a platform that they were pleased with getting 205W TDPs out of.

As TDPs increased across Xeon Scalable, most have adapted by lowering the required ambient temps (for instance, requiring 75-80 degree ambients instead of 90-95 degree ambients). With 3rd gen Xeon Scalable, we're at 270W TDPs on the flagships, or 540W across two CPUs, and most partners feel we're simply at the limit of what we can cool with air in a CPU socket in a 1U form factor. Hence we expect to see 300W TDPs on air next gen (likely Sapphire Rapids), though the CPU itself will likely be capable of 400W TDPs (800W across two sockets). Next-gen servers are where I really expect closed-loop and facility-level liquid cooling to start taking off, because it will no longer be a feature relegated to ultra-density, but rather a feature required to get the biggest return on investment from the silicon purchased.
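For a sense of scale on why that forces liquid cooling, a rough rack-level sketch (the per-socket TDP follows the post above; the per-node overhead and fully populated 42U rack are assumptions):

```python
# Rack-level power for 1U dual-socket nodes at next-gen TDPs.
# CPU TDP follows the discussion above; the rest are assumptions.

cpu_tdp_w = 300          # per-socket, air-cooled 1U target
sockets_per_node = 2
other_w = 200            # assumed RAM/NICs/drives/fans per node (hypothetical)
nodes_per_rack = 42      # assumed fully populated 42U rack of 1U nodes

node_w = cpu_tdp_w * sockets_per_node + other_w
rack_kw = node_w * nodes_per_rack / 1000

print(f"Per node: {node_w} W -> per rack: {rack_kw:.1f} kW")
# ~33.6 kW per rack of CPU-only nodes, before any GPUs -- far beyond what
# most air-cooled facilities provision per rack.
```

Add 400-600W GPUs to those nodes and the case for facility-level liquid cooling makes itself.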
 
Last edited:
  • Like
Reactions: Tlh97 and Stuka87