Speculation: RDNA2 + CDNA Architectures thread


uzzi38

Platinum Member
Oct 16, 2019
All die sizes are within 5mm^2. The poster here has been right about some things in the past afaik, and to his credit was the first to say 505mm^2 for Navi21, which other people have backed up. Even still, take the following with a pinch of salt.

Navi21 - 505mm^2

Navi22 - 340mm^2

Navi23 - 240mm^2

Source is the following post: https://www.ptt.cc/bbs/PC_Shopping/M.1588075782.A.C1E.html
 

Glo.

Diamond Member
Apr 25, 2015
A side note.

Based on what I am hearing, AMD's marketing team underplayed the efficiency of RDNA2 GPUs, again.

The 250W and 300W TBP figures are supposedly for when the GPUs are pushed absolutely to the wall.

In normal circumstances we should see 6900XT averaging around 275W of power.

6800XT will be more like 250-260W of power.
Just reposting this, once again ;).
 

Ajay

Lifer
Jan 8, 2001
Whaaa? 2532MHz avg. clock with 208W avg. GPU (only) power consumption?

Also, "only" 269W max. GPU power consumption at 2553MHz -> RX 6800 non-XT?

These are great results... if true
I don't like that a bunch of stuff is being blurred out in some recent GPU leaks. Also, there is no system information, which makes it hard to accept these leaks at face value.
I imagine that they are true - but in what context?
 

Stuka87

Diamond Member
Dec 10, 2010
I don't like that a bunch of stuff is being blurred out in some recent GPU leaks. Also, there is no system information, which makes it hard to accept these leaks at face value.
I imagine that they are true - but in what context?

The blurs are typically there to protect whoever has the card. AMD knows who has cards, and which cards those people have.
 

Glo.

Diamond Member
Apr 25, 2015
What matters is the performance.
If the performance is so-so, then even with all this it isn't worth that much in practice. The "dream" is beating Ampere while hoping that AMD's solution for image upscaling works well.
Who cares about Upscaling tech in a world in which AMD delivers faster AND more efficient GPUs than Nvidia, both, at the same time?

Why has upscaling tech suddenly become the "make or break" reason for buying a product?
 

Veradun

Senior member
Jul 29, 2016
Who cares about Upscaling tech in a world in which AMD delivers faster AND more efficient GPUs than Nvidia, both, at the same time?

Why has upscaling tech suddenly become the "make or break" reason for buying a product?
Because people are unable to use the non-automated version that's called "moving the graphics settings' sliders".
 

Mopetar

Diamond Member
Jan 31, 2011
Why has upscaling tech suddenly become the "make or break" reason for buying a product?

If you were someone who was used to trumpeting the superior performance or the superior efficiency of NVidia GPUs and suddenly found yourself in a world where you could do neither, then what else is there for you to talk about, especially when tessellation is so last decade?

So you take comfort in the fact that you can upscale a 1440p image with ray tracing better than anyone else and silently pray that your 10 GB will last you until the next generation.
 

Veradun

Senior member
Jul 29, 2016
If you were someone who was used to trumpeting the superior performance or the superior efficiency of NVidia GPUs and suddenly found yourself in a world where you could do neither, then what else is there for you to talk about, especially when tessellation is so last decade?

So you take comfort in the fact that you can upscale a 1440p image with ray tracing better than anyone else and silently pray that your 10 GB will last you until the next generation.
What DLSS does is what everyone does manually:

move sliders down, or disable invisible fps-killing effects, to obtain what is basically the same quality but at a higher framerate.

AMD will probably enable something similar at some point for the attached marketing, but all this stuff does is automate what you can already do yourself.
 

Guru

Senior member
May 5, 2017
Who cares about Upscaling tech in a world in which AMD delivers faster AND more efficient GPUs than Nvidia, both, at the same time?

Why has upscaling tech suddenly become the "make or break" reason for buying a product?
Because having garbage-quality graphics at 720p with the illusion of 4K is all the rage for kids these days. Nvidia said so themselves, so it must be true!

I mean, look at Watch Dogs: Legion and DLSS 2.0: the image quality is SIGNIFICANTLY worse than native no matter what mode they used! Guru3D has the review! The quality isn't even close, it's like night and day. DLSS 2.0 is like playing at medium settings; it still looks kinda good, but it's nowhere near the higher quality settings!

Everyone jizzed their pants for DLSS 2.0 and Control when Nvidia had a full freaking year of running AI upscaling, and the image quality was only reasonably similar to native + TAA (and they worsened TAA on purpose to make DLSS 2.0 look better). In a full year Nvidia could have hand-drawn the environments in Control and still had a better product than DLSS 2.0!

Realistically, on release day, like we see with Watch Dogs: Legion, DLSS 2.0 is crap in terms of image quality. You are essentially playing 720p upscaled to 4K with ordinary image quality. Why? Why?

I'd rather come down a notch from 4K and play at 1440p with MAX quality at 100fps than play at 720p upscaled to 4K at medium quality (at best) and 80fps!

What is the point of 4K if you are going to be upscaling from 720p or 640p and have garbage textures?
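For reference, here's a minimal sketch of the internal render resolutions these modes imply, assuming the commonly quoted DLSS 2.0 per-axis scale factors (the Balanced figure in particular is approximate, and individual titles can deviate):

```python
# Rough sketch: internal render resolution for a given output resolution,
# using the commonly quoted DLSS 2.0 per-axis scale factors (assumed values).
DLSS_SCALE = {
    "Quality": 2 / 3,            # ~67% of output resolution per axis
    "Balanced": 0.58,            # ~58% (approximate)
    "Performance": 1 / 2,        # 50%
    "Ultra Performance": 1 / 3,  # ~33%
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the (width, height) the game renders at before DLSS upscales it."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K output, {mode}: renders at {w}x{h}")

# Quality -> 2560x1440, Performance -> 1920x1080, Ultra Performance -> 1280x720,
# so "720p upscaled to 4K" corresponds to Ultra Performance mode.
```

At a 1440p output, Performance mode also works out to 1280x720 internally, which may be where some of the 720p comparisons come from.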
 

gdansk

Platinum Member
Feb 8, 2011
I'm not as negative toward DLSS as you guys, it seems. I still prefer raw rasterization performance, but it is nice to have the option. Some form of upscaler trained on high-resolution versions of the game could be beneficial. And I think AMD should invest in some competing technology, preferably a more open and/or standardized one.

On the other hand maybe my optimism comes from ignorance. I have a 2080 Ti and have never once tried DLSS. None of the games I play support it...
 

uzzi38

Platinum Member
Oct 16, 2019
I was a lot more positive about DLSS before, but Watch Dogs shows pretty clearly that it's a hit-and-miss technique. It can be excellent in some games, but subpar in others.

I mean, that's not really surprising given the nature of image reconstruction, but it's eye-opening. It just means that one should keep their eyes open for it, and that DLSS shouldn't be taken as the holy grail of graphics: just another technique that works amazingly in some games but falls flat on its face in others.
 

Tup3x

Senior member
Dec 31, 2016
Personally, I care that driver v-sync works, and works great in combination with adaptive sync. Also that the anisotropic filtering override works. Well-working post-process AA (driver FXAA is better than game implementations) is also nice. It's unfortunate that AMD's drivers suck feature-wise and they just keep removing stuff (video settings, for example: there's only a freaking brightness slider for Renoir and RDNA).
 

Leeea

Diamond Member
Apr 3, 2020
Personally, I care that driver v-sync works, and works great in combination with adaptive sync. Also that the anisotropic filtering override works. Well-working post-process AA (driver FXAA is better than game implementations) is also nice. It's unfortunate that AMD's drivers suck feature-wise and they just keep removing stuff (video settings, for example: there's only a freaking brightness slider for Renoir and RDNA).


I am using driver version 20.9.1 released 2020-09-09.

For video, if I go to settings -> video -> video profile -> custom, I can adjust:
video sharpness
color vibrance
steady video
fluid motion video
custom brightness

If you're looking to update the DisplayPort/HDMI profiles, that was moved to settings -> display, with both custom resolutions and overrides available, including AMD FreeSync, virtual super resolution, GPU scaling, integer scaling, color, custom color, etc.

For anisotropic filtering: settings -> graphics -> pick the game you want (or global) -> advanced:
there are overrides for:
anisotropic filtering, anti-aliasing, tessellation*, pixel format, shader cache, BHCC (caching policy)
along with
anti-lag, chill, boost, image sharpening, enhanced sync, and vsync

*Manually setting tessellation is a great way to play Nvidia GameWorks games at full performance.

--------------------

From my perspective, your post appears to be misinformation?
 

Martimus

Diamond Member
Apr 24, 2007
I am using driver version 20.9.1 released 2020-09-09.

For video, if I go to settings -> video -> video profile -> custom, I can adjust:
video sharpness
color vibrance
steady video
fluid motion video
custom brightness

If you're looking to update the DisplayPort/HDMI profiles, that was moved to settings -> display, with both custom resolutions and overrides available, including AMD FreeSync, virtual super resolution, GPU scaling, integer scaling, color, custom color, etc.

For anisotropic filtering: settings -> graphics -> pick the game you want (or global) -> advanced:
there are overrides for:
anisotropic filtering, anti-aliasing, tessellation*, pixel format, shader cache, BHCC (caching policy)
along with
anti-lag, chill, boost, image sharpening, enhanced sync, and vsync

*Manually setting tessellation is a great way to play Nvidia GameWorks games at full performance.

--------------------

From my perspective, your post appears to be misinformation?
These settings have been in AMD's drivers at least since February, which is when I last heard the claim that they weren't there and checked. When did these people have these problems?