Speculation: RDNA3 + CDNA2 Architectures Thread


eek2121

Platinum Member
Aug 2, 2005
2,904
3,906
136
If you're willing to trust the Steam hardware survey

I don’t. Among other things:
  1. Countries such as China skew the results. There is a LOT of older hardware out there (a toy illustration of this below).
  2. I have 2 machines that have never received the Steam survey prompt. Both machines have had Steam installed forever (years in one case). If a user upgrades to a 4K display, Steam won't pick it up.
  3. Not everyone uses Steam, especially in modern times with Game Pass.
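
To put toy numbers on that first point (entirely made-up figures, just to show the mechanism): one very large pool of older hardware drags the blended share far below what the rest of the market looks like.

```python
# Toy illustration of survey skew -- every number here is hypothetical.
# A single large pool of older hardware dilutes the blended 4K share.
pools = {
    # pool: (machines surveyed, fraction with 4K displays)
    "older-hardware-heavy regions": (60_000_000, 0.005),
    "rest of the market":           (40_000_000, 0.060),
}

total = sum(n for n, _ in pools.values())
blended = sum(n * share for n, share in pools.values()) / total
print(f"blended 4K share: {blended:.1%}")  # ~2.7%, vs 6.0% in the smaller pool
```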
 
Reactions (Like): Mopetar and psolord

Mopetar

Diamond Member
Jan 31, 2011
7,797
5,899
136
The Steam hardware survey has its issues, but even as unscientific as it is, it's still unlikely to be wrong by such a margin that 4K displays are really 10% of the market.

People on tech forums get nice new shiny toys a lot more frequently than the average person. 4K TVs are really cheap if you don't give a damn about picture quality, but most people don't even know they can be used as a monitor. To them a TV is a TV and a monitor is a monitor.
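
As a rough sanity check (a toy calculation that assumes a simple random sample, which the Steam survey is not): at survey scale, random noise is negligible, so a miss as big as "3% reported vs. 10% real" could only come from systematic bias, not bad luck.

```python
# Crude sampling-error estimate for a survey proportion.
# n and p are hypothetical; the point is that random error is tiny
# at this scale, so any large miss must be systematic bias.
import math

n = 1_000_000   # hypothetical number of survey responses
p = 0.03        # hypothetical observed 4K share
se = math.sqrt(p * (1 - p) / n)
print(f"95% CI: {p:.2%} +/- {1.96 * se:.3%}")  # +/- ~0.03 percentage points
```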
 

blckgrffn

Diamond Member
May 1, 2003
9,110
3,028
136
www.teamjuchems.com
The Steam hardware survey has its issues, but even as unscientific as it is, it's still unlikely to be wrong by such a margin that 4K displays are really 10% of the market.

People on tech forums get nice new shiny toys a lot more frequently than the average person. 4K TVs are really cheap if you don't give a damn about picture quality, but most people don't even know they can be used as a monitor. To them a TV is a TV and a monitor is a monitor.

I agree. And some people I know who went "4K" for gaming are retreating now, because if you are playing online, a 1440p high-refresh-rate monitor is much more of a sweet spot. You can get decent DPI at 32" for desktop use and still get 120 Hz+ very reasonably.

That, and driving an AAA game at 4K native and getting 60 FPS is just a stupidly expensive pursuit right now. A good buddy of mine finally got a 3080 and was still having to use DLSS on his 40" 4K TV to hit a solid 60+ FPS in CoD, and he was like: what's the point of upscaling wizardry if I bought 4K to get all the pixels? To each their own, but he has a bit of a point. He had just dropped $1,200 on a GPU and wanted extreme settings and 4K and 60 FPS without compromise. I can't blame him, really, even if I knew in advance that was unlikely.

In any case, you can drive a 1440p screen with lots of GPUs around $1k street price, and that seems to be an upper limit for the folks in my circles, notwithstanding the one 3090 moonshot :D
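
For anyone who wants the back-of-envelope version of why 4K native is so brutal: 4K pushes 2.25x the pixels of 1440p, so in a purely GPU-bound scenario (a rough approximation, not a benchmark) you'd expect frame rate to fall by about the same factor.

```python
# Back-of-envelope pixel math: 4K vs. 1440p rendering cost.
# Assumes a purely GPU/fill-bound workload (no CPU limit), which is
# a rough approximation rather than a measured result.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = resolutions["1440p"][0] * resolutions["1440p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} px ({pixels / base:.2f}x 1440p)")
# 4K is 2.25x the pixels of 1440p, so ~135 fps at 1440p maps to ~60 fps at 4K.
```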
 

Leeea

Diamond Member
Apr 3, 2020
3,599
5,340
106
I agree. And some people I know who went "4K" for gaming are retreating now, because if you are playing online, a 1440p high-refresh-rate monitor is much more of a sweet spot. You can get decent DPI at 32" for desktop use and still get 120 Hz+ very reasonably.

That, and driving an AAA game at 4K native and getting 60 FPS is just a stupidly expensive pursuit right now. A good buddy of mine finally got a 3080 and was still having to use DLSS on his 40" 4K TV to hit a solid 60+ FPS in CoD, and he was like: what's the point of upscaling wizardry if I bought 4K to get all the pixels? To each their own, but he has a bit of a point. He had just dropped $1,200 on a GPU and wanted extreme settings and 4K and 60 FPS without compromise. I can't blame him, really, even if I knew in advance that was unlikely.

In any case, you can drive a 1440p screen with lots of GPUs around $1k street price, and that seems to be an upper limit for the folks in my circles, notwithstanding the one 3090 moonshot :D

I think you hit on a very important point: even the latest and greatest cards typically won't push 4K past 80 fps in many titles.

In that scenario, it seems better to just get more frames at 1440p.
 

exquisitechar

Senior member
Apr 18, 2017
655
862
136
Navi 33 being faster than the 6900 XT still seems sketchy to me. That'd be like 80-90% faster than the 6600 XT on a minor shrink. Are they sure it's not also on 5 nm?
Why compare it to Navi 23? Navi 33 is apparently quite a bit larger than Navi 22, actually. It's not all that shocking for it to be slightly faster than the 6900 XT with the rumored changes.
 

Mopetar

Diamond Member
Jan 31, 2011
7,797
5,899
136
Low-end is probably getting further cannibalized by APUs, especially now that we're starting to see ones based on current technology come to market.

Shortages or rising wafer costs will also push them out, since they're probably among the least profitable products.
 

scineram

Senior member
Nov 1, 2020
361
283
106
Low-end is probably getting further cannibalized by APUs, especially now that we're starting to see ones based on current technology come to market.

Shortages or rising wafer costs will also push them out, since they're probably among the least profitable products.
The gap is still huge between future APU and 6900XT perf. Something is missing.
 
Mar 11, 2004
23,031
5,495
146
The gap is still huge between future APU and 6900XT perf. Something is missing.

So I guess you're ignoring the 6800, 6700, 6600? Yeah, you said low end, but then why are you asking about a gap between APU and the highest end card? You'd think you would've said 6600.

Most likely, the low-end stuff is gonna be previous-gen mid-range (if not low-end) until the chip shortages become less of an issue. And the future low end is probably going to be APUs as they push for more compact devices and chiplets become the norm. Or it'll continue to be the previous-gen stuff if, for some dumb reason, they have to shove a dGPU with incredibly mediocre performance in there so they can upcharge a few hundred dollars.
 
Mar 11, 2004
23,031
5,495
146
Low-end is probably getting further cannibalized by APUs, especially now that we're starting to see ones based on current technology come to market.

Shortages or rising wafer costs will also push them out, since they're probably among the least profitable products.

I actually would guess it's one of the more profitable products, as when they shoehorn even older (and mediocre when it came out) dGPUs into laptops, they seem to upcharge a few hundred bucks (granted, they accomplish the same by putting APU-only stuff in thinner, lighter laptops, so it's a bit of a wash). But for the low-end "gaming" laptops they'll just shove in the lower previous-gen stuff (like they've been doing with the 1650s, for instance). And then APUs with a GPU chiplet might finally make that moot, but we'll see. Seems like OEMs don't want to rock the boat, and AMD and Intel are appeasing them.
 
Reactions (Like): Tlh97 and Thibsie

jpiniero

Lifer
Oct 1, 2010
14,510
5,159
136
Why compare it to Navi 23? Navi 33 is apparently quite a bit larger than Navi 22, actually. It's not all that shocking for it to be slightly faster than the 6900 XT with the rumored changes.

I guess I am kind of assuming that this would take the 6600 XT's real price point ($300). If this is going to be $700 even in a world where Ethereum collapses, then I guess it doesn't matter how big it is.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,717
7,008
136
Where are the low end cards then?

- I strongly suspect we'll see N22 and downward still fill that niche, likely as a rebrand or as a respin on 6nm.

I suspect the plan for a lot of companies going forward will be to do a warm hand-off between generations, where the old line is still being manufactured/sold on an older/cheaper node or process while the new line comes in on top and slowly supplants the old one. Plenty of historical precedent for lower-range stuff to be a gen or two old.

If everything is moving off the shelf, might as well keep those older production lines warm and moving.
 
Reactions (Like): Tlh97 and blckgrffn

blckgrffn

Diamond Member
May 1, 2003
9,110
3,028
136
www.teamjuchems.com
- I strongly suspect we'll see N22 and downward still fill that niche, likely as a rebrand or as a respin on 6nm.

I suspect the plan for a lot of companies going forward will be to do a warm hand-off between generations, where the old line is still being manufactured/sold on an older/cheaper node or process while the new line comes in on top and slowly supplants the old one. Plenty of historical precedent for lower-range stuff to be a gen or two old.

If everything is moving off the shelf, might as well keep those older production lines warm and moving.

Given RDNA2 is feature-complete over and above what the consoles will be wielding for 3-4 years, and is DX12U compliant, it really doesn't need to be refreshed for some time. Might as well only make the mega SKUs on the cutting-edge processes and let the rest age.

Haha, the $300 1050 Ti at Micro Center says "Hi!" Sadly, that card would be amazing for many Fortnite-and-similar gamers right now if it were available around $100, like it maybe should be.
 
Reactions (Like): Tlh97

eek2121

Platinum Member
Aug 2, 2005
2,904
3,906
136
I agree. And some people I know who went "4K" for gaming are retreating now, because if you are playing online, a 1440p high-refresh-rate monitor is much more of a sweet spot. You can get decent DPI at 32" for desktop use and still get 120 Hz+ very reasonably.

That, and driving an AAA game at 4K native and getting 60 FPS is just a stupidly expensive pursuit right now. A good buddy of mine finally got a 3080 and was still having to use DLSS on his 40" 4K TV to hit a solid 60+ FPS in CoD, and he was like: what's the point of upscaling wizardry if I bought 4K to get all the pixels? To each their own, but he has a bit of a point. He had just dropped $1,200 on a GPU and wanted extreme settings and 4K and 60 FPS without compromise. I can't blame him, really, even if I knew in advance that was unlikely.

In any case, you can drive a 1440p screen with lots of GPUs around $1k street price, and that seems to be an upper limit for the folks in my circles, notwithstanding the one 3090 moonshot :D

I don't. To be clear, I'm not claiming 4K has even close to a majority market share (unless you include consoles, in which case I bet it wins hands down), but I follow this scene pretty closely and have worked with, or am friends with, a lot of people who game. Furthermore, a bit of digging into the Steam hardware survey will tell you why those stats are WAY off. Look at the multi-monitor stats and tell me if you see the problem. HINT: Ultrawide.
 

Kepler_L2

Senior member
Sep 6, 2020
308
977
106

So Navi31 is taped out. It may point to a late 2022 release, but the increased packaging complexity may affect this.
I don't know why people keep thinking complex = delayed. MI200 was the first MCM GPU from AMD and it was not delayed at all.
 

leoneazzurro

Senior member
Jul 26, 2016
905
1,430
136
I don't know why people keep thinking complex = delayed. MI200 was the first MCM GPU from AMD and it was not delayed at all.

I am not saying it will be delayed; I am wondering whether the increased package complexity will add validation time, thus requiring a longer-than-usual gap between tape-out and release to market (rumors so far seem to indicate that MI200 and Navi31 have different package structures). This is why I wrote "may" and not "will". I myself would like to see such a beast running ASAP lol.