
Question Speculation: RDNA2 + CDNA Architectures thread

A side note.

Based on what I am hearing, AMD's marketing team underplayed the efficiency of RDNA2 GPUs, again.

Those 250 W and 300 W TBP figures are supposedly for when the GPUs are pushed absolutely to the wall.

Under normal circumstances we should see the 6900 XT averaging around 275 W.

The 6800 XT will be more like 250-260 W.
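The distinction being drawn here, between an average figure and a TBP-style worst case, is just summary statistics over logged power readings. A minimal sketch with made-up numbers (all samples hypothetical, not real measurements):

```python
# Hypothetical board-power samples (watts), logged once per second during a run.
samples = [248, 301, 276, 263, 289, 270, 255, 298]

avg_power = sum(samples) / len(samples)   # what you'd see "on average"
peak_power = max(samples)                 # the worst case a TBP rating bounds

print(f"average: {avg_power:.0f} W, peak: {peak_power} W")
```

With these fabricated samples the average lands near 275 W while the peak brushes 300 W, which is exactly the gap the post is describing.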
Just reposting this, once again 😉.
 
Whaaa? 2532MHz avg. clock with 208W avg. GPU (only) power consumption?

Also, "only" 269W max. GPU power consumption at 2553MHz -> RX 6800 non-XT?

These are great results... if true
I don't like that a bunch of things are being blurred out in some recent GPU leaks. Also, there is no system information, which makes it hard to accept these leaks at face value.
I imagine that they are true - but in what context?
 
I don't like that a bunch of things are being blurred out in some recent GPU leaks. Also, there is no system information, which makes it hard to accept these leaks at face value.
I imagine that they are true - but in what context?

The blurs are typically there to protect whoever has the card. AMD knows who has cards, and which cards those people have.
 
What matters is the performance.
If the performance is so-so, then even with all this it isn't worth that much in practice. The "dream" is beating Ampere while hoping that AMD's solution for image upscaling works well.
Who cares about upscaling tech in a world in which AMD delivers faster AND more efficient GPUs than Nvidia, both at the same time?

Why has upscaling tech suddenly become the "make or break" reason for buying a product?
 
Why has upscaling tech suddenly become the "make or break" reason for buying a product?

If you were someone who was used to trumpeting the superior performance or the superior efficiency of Nvidia GPUs and suddenly found yourself in a world where you could do neither, then what else is there for you to talk about, especially when tessellation is so last decade?

So you take comfort in the fact that you can upscale a 1440p image with ray tracing better than anyone else and silently pray that your 10 GB will last you until the next generation.
 
If you were someone who was used to trumpeting the superior performance or the superior efficiency of Nvidia GPUs and suddenly found yourself in a world where you could do neither, then what else is there for you to talk about, especially when tessellation is so last decade?

So you take comfort in the fact that you can upscale a 1440p image with ray tracing better than anyone else and silently pray that your 10 GB will last you until the next generation.
What DLSS does is what everyone does manually:

move sliders down, or disable invisible FPS-killing effects, to obtain what is basically the same quality at a higher framerate.

AMD will probably enable something similar at some point for the attached marketing, but this is just automating what you can already do yourself.
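As a toy illustration of the "do it yourself" baseline being described (render fewer pixels, then blow the frame back up), here is a minimal nearest-neighbour upscale over a frame represented as nested lists. Real temporal upscalers like DLSS are vastly more sophisticated than this; this is only the naive baseline they compete against:

```python
def upscale_nearest(frame, factor):
    """Upscale a 2D list of pixel values by an integer factor,
    duplicating each source pixel into a factor x factor block."""
    out = []
    for row in frame:
        # Widen the row: each pixel repeated `factor` times horizontally.
        wide = [px for px in row for _ in range(factor)]
        # Then repeat the widened row `factor` times vertically.
        for _ in range(factor):
            out.append(list(wide))
    return out

low_res = [[1, 2],
           [3, 4]]
high_res = upscale_nearest(low_res, 2)
print(high_res)  # each source pixel became a 2x2 block
```

The output is a 4x4 frame where every source pixel has simply been duplicated, which is why naive upscaling looks blocky and why smarter reconstruction exists at all.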
 
Who cares about Upscaling tech in a world in which AMD delivers faster AND more efficient GPUs than Nvidia, both, at the same time?

Why has upscaling tech suddenly become the "make or break" reason for buying a product?
Because having garbage-quality graphics at 720p with the illusion of 4K is all the rage for kids these days. Nvidia said so themselves, so it must be true!

I mean, look at Watch Dogs Legion and DLSS 2.0: the image quality is SIGNIFICANTLY worse than native no matter what mode they used! Guru3d has the review! The quality isn't even close, it's like night and day. DLSS 2.0 is like playing at medium settings: it still looks kinda good, but it's not the three quality levels above that!

Everyone lost their minds over DLSS 2.0 and Control, when Nvidia had a full freaking year of running AI upscaling on it and the image quality was merely reasonably similar to native + TAA (and they worsened TAA on purpose to make DLSS 2.0 look better). In a full year Nvidia could have hand-drawn the environments in Control and still had a better product than DLSS 2.0!

Realistically, on release day, as we see with Watch Dogs Legion, DLSS 2.0 is crap in terms of image quality. You are essentially playing 720p upscaled to 4K. Why? Why?

I'd rather come down a notch from 4K and play at 1440p with MAX quality at 100 fps than play at 720p upscaled to 4K at, at best, medium quality at 80 fps!

What is the point of 4K if you are going to upscale from 720p or 640p and have garbage textures?
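For scale, the pixel counts behind the 720p-vs-1440p-vs-4K trade-off in these posts are easy to check:

```python
# How much of a "4K" frame is actually rendered at each internal resolution?
res_4k   = 3840 * 2160   # 8,294,400 px
res_1440 = 2560 * 1440   # 3,686,400 px
res_720  = 1280 * 720    #   921,600 px

print(res_4k / res_720)    # 9.0  -> a 720p input covers only 1/9 of 4K output
print(res_4k / res_1440)   # 2.25 -> native 1440p renders nearly half the pixels
```

So an upscaler going from 720p to 4K has to invent eight out of every nine output pixels, whereas simply dropping to native 1440p only cuts the rendered pixel count by a factor of 2.25, which is the trade-off the poster above is arguing for.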
 
I'm not as negative toward DLSS as you guys, it seems. I still prefer raw rasterization performance, but it is nice to have the option. Some form of upscaler trained on high-resolution versions of the game could be beneficial, and I think AMD should invest in a competing technology, preferably a more open and/or standardized one.

On the other hand maybe my optimism comes from ignorance. I have a 2080 Ti and have never once tried DLSS. None of the games I play support it...
 
I was a lot more positive about DLSS before, but Watch Dogs does show pretty clearly that it's a hit-and-miss technique. It can be excellent in some games, but subpar in others.

I mean, that's not really surprising given the nature of image reconstruction, but it's eye-opening. It just means that one should keep their eyes open and that DLSS shouldn't be taken as the holy grail of graphics. It's just another technique that works amazingly in some games but falls flat on its face in others.
 
Personally I care that driver v-sync works, and works great in combination with adaptive sync. Also that the anisotropic filtering override works. Well-working post-process AA (driver FXAA is better than game implementations) is also nice. It's unfortunate that AMD's drivers suck feature-wise and they just keep removing stuff (video settings, for example: there's only a freaking brightness slider for Renoir and RDNA).
 
Personally I care that driver v-sync works, and works great in combination with adaptive sync. Also that the anisotropic filtering override works. Well-working post-process AA (driver FXAA is better than game implementations) is also nice. It's unfortunate that AMD's drivers suck feature-wise and they just keep removing stuff (video settings, for example: there's only a freaking brightness slider for Renoir and RDNA).


I am using driver version 20.9.1 released 2020-09-09.

For the video, if I go to settings -> video -> video profile -> custom, I can adjust:
video sharpness
color vibrance
steady video
fluid motion video
custom brightness

If you're looking to update the DisplayPort/HDMI profiles, that was moved to settings -> display, with both custom resolutions and overrides available, including AMD FreeSync, virtual super resolution, GPU scaling, integer scaling, color, custom color, etc.

For anisotropic filtering: settings -> graphics -> pick the game you want (or global) -> advanced:
there are overrides for:
anisotropic filtering, anti-aliasing, tessellation*, pixel format, shader cache, BHCC (caching policy)
along with
anti-lag, chill, boost, image sharpening, enhanced sync, and vsync

*Manually capping tessellation is a great way to play Nvidia GameWorks games at full performance.
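Conceptually, a driver-side tessellation override just clamps the factor a game requests to a user-chosen maximum. A hypothetical sketch of that logic (illustrative names only, not an actual driver API):

```python
def effective_tess_factor(requested, user_cap=None):
    """Return the tessellation factor the GPU actually uses.

    requested: the factor the game asks for.
    user_cap:  the user's override maximum, or None if the override
               is disabled (honour the game's request as-is).
    """
    if user_cap is None:
        return requested
    return min(requested, user_cap)

# A GameWorks title requesting 64x tessellation, capped to 16x by the user:
print(effective_tess_factor(64, 16))   # 16
print(effective_tess_factor(8, 16))    # 8  (requests under the cap pass through)
print(effective_tess_factor(64, None)) # 64 (override disabled)
```

The performance win comes from the clamp being invisible in most scenes: geometry that was already below the cap is untouched, and only extreme factors get reduced.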

--------------------

From my perspective, your post appears to be misinformation.
 
These settings have been in AMD's drivers at least since February, when I last heard the claim that they weren't there and checked. When did these people have these problems?
 