
Question Speculation: RDNA2 + CDNA Architectures thread


MrTeal

Platinum Member
Dec 7, 2003
2,939
596
136
I thought it was supposed to be compatible, but I may be mistaken about that. I'm fairly sure that Micron and the other manufacturers would like to sell to AMD at some point as well.
Nvidia has said that GA104's memory controller supports GDDR6X as well, but that doesn't mean AMD's will. GDDR6X uses PAM4, so the memory controller needs to be able to decode four distinct voltage levels instead of just two. I don't know if the GDDR6X chips might have some kind of NRZ fallback, but if Navi's memory controller doesn't support the different signaling, I'm not sure it would be compatible.
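The PAM4 point is just that each symbol carries two bits (four voltage levels) where NRZ carries one bit (two levels). A toy encoder/decoder sketch to illustrate the idea, not how a real GDDR6X PHY works:

```python
# Toy illustration of PAM4 symbol encoding (NOT a real GDDR6X PHY).
# NRZ: 1 bit per symbol (2 levels). PAM4: 2 bits per symbol (4 levels 0..3).

def pam4_encode(bits):
    """Pack pairs of bits into 4-level symbols (MSB first)."""
    assert len(bits) % 2 == 0
    return [(bits[i] << 1) | bits[i + 1] for i in range(0, len(bits), 2)]

def pam4_decode(symbols):
    """Recover the bit stream from 4-level symbols."""
    out = []
    for s in symbols:
        out += [(s >> 1) & 1, s & 1]
    return out

bits = [1, 0, 1, 1, 0, 0, 1, 0]
syms = pam4_encode(bits)  # half as many symbols as bits, same data
assert pam4_decode(syms) == bits
print(syms)  # [2, 3, 0, 2]
```

Halving the symbol count per bit is why PAM4 doubles effective data rate at the same signaling rate, and also why a controller built only for two-level NRZ can't simply read it back.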
 
  • Like
Reactions: Tlh97 and Mopetar

PhoBoChai

Member
Oct 10, 2017
114
363
106
You are only showing 1080p and 1440p because you know very well that at 4K the RX 6800XT is not as fast as at lower resolutions.
In my opinion, it's because of the low hit rate of the Infinity Cache at 4K compared to lower resolutions.
Are people seriously arguing over a 5% delta at 4K as if it matters?

A SINGLE new game that skews heavily to NV or AMD will change that picture entirely. It's not as if the difference is massive to start with, folks.

There's a custom 6800XT shown to OC on the air cooler to 2.65GHz, and with great perf scaling from core OC (not memory bandwidth limited per Computerbase.de tests), a 6800XT OC will match or beat a 3080 OC even at 4K and extend the lead at 1440p and 1080p.

As for RT, this is RDNA2 RT performance when devs optimize for it, instead of being forced to run RTX titles from NV's long-standing GimpWorks program, which have clearly shown themselves to be detrimental to AMD perf.



RX 6800XT actually loses less performance with RT on vs off here.

So much for inferior RT hardware claims by the clueless masses.
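For what "loses less with RT on" means, the comparison is just relative cost, independent of absolute frame rate. A quick sketch; the FPS numbers below are made-up placeholders, not from any benchmark:

```python
# Relative RT cost: the fraction of performance lost by turning RT on.
# FPS numbers are hypothetical placeholders for illustration only.

def rt_cost(fps_off, fps_on):
    return 1 - fps_on / fps_off

loss_a = rt_cost(100, 70)  # loses 30% with RT on
loss_b = rt_cost(90, 68)   # loses ~24%: "better" RT scaling despite lower raw FPS
print(f"{loss_a:.0%} vs {loss_b:.0%}")
```

By this metric a card can have lower absolute RT frame rates yet still "lose less" when the feature is toggled, which is the claim being made here.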
 

TESKATLIPOKA

Senior member
May 1, 2020
265
298
96
RDNA2 wins in 1080p, draws in 1440p, loses in 4K.

It's the more power-efficient and more cost-effective GPU.

Yes. You are missing those factors. Adding all this up tells you that in the most important performance factors it's a draw, while in two other PRODUCT factors AMD wins.

So effectively, the 6800 and 6800 XT are the superior products over Nvidia's counterparts.

Was it THAT difficult to see this context?
Yes, it was difficult, because you were replying to vissarix, who was talking only about gaming performance. So naturally I assumed it was only about performance and not the rest.
The most important gaming resolution is 4K, and that's a loss.
BTW RT performance is a big loss for RDNA2; maybe it will change with future patches, we will see, but for now it's still a big loss.
Both AMD and Nvidia have their advantages and disadvantages, so I wouldn't say one is clearly superior!
I can honestly say I am pretty happy with both of their products.
 

TESKATLIPOKA

Senior member
May 1, 2020
265
298
96
Are people seriously arguing over a 5% delta at 4K as if it matters?

A SINGLE new game that skews heavily to NV or AMD will change that picture entirely. It's not as if the difference is massive to start with, folks.

There's a custom 6800XT shown to OC on the air cooler to 2.65GHz, and with great perf scaling from core OC (not memory bandwidth limited per Computerbase.de tests), a 6800XT OC will match or beat a 3080 OC even at 4K and extend the lead at 1440p and 1080p.

As for RT, this is RDNA2 RT performance when devs optimize for it, instead of being forced to run RTX titles from NV's long-standing GimpWorks program, which have clearly shown themselves to be detrimental to AMD perf.



RX 6800XT actually loses less performance with RT on vs off here.

So much for inferior RT hardware claims by the clueless masses.
Yep, because it's still faster even if it's a small difference.
You are absolutely right that the average performance could be heavily influenced by a single game in favor of one of the competitors; Dirt 5 and Valhalla are a very good example in favor of AMD, as was shown in the graph provided by Glo.
Those "clueless" masses check a review at best, and in it the RT performance is a lot worse in most of the games. It's not their fault the game makers didn't release a patch fixing it, and it's still questionable if it's possible to fix it.
BTW you mentioned Nvidia's anticompetitive behavior, so my question is: Big Navi performs significantly better in Dirt 5 and Valhalla, does that mean AMD made changes in those games which are clearly detrimental to the competition? Or does only Nvidia do that? ;) I love reading forum posts about how when AMD is losing in a game it's Nvidia's fault, but when Nvidia loses then it's clearly because AMD's architecture is superior.:p
 
  • Like
Reactions: insertcarehere

TESKATLIPOKA

Senior member
May 1, 2020
265
298
96
According to who?

I say ultrawide 1440p is the most important gaming resolution.

So there.....
According to AMD.:D At the presentation the strongest card, the 6900XT, was shown only at 4K.
But now seriously, if you want to eliminate the CPU limitation, you use the highest resolution, right? 4K is a higher resolution than 1440p, with playable FPS, unlike 8K.
 

PhoBoChai

Member
Oct 10, 2017
114
363
106
Yep, because it's still faster even if it's a small difference.
You are absolutely right that the average performance could be heavily influenced by a single game in favor of one of the competitors; Dirt 5 and Valhalla are a very good example in favor of AMD, as was shown in the graph provided by Glo.
Those "clueless" masses check a review at best, and in it the RT performance is a lot worse in most of the games. It's not their fault the game makers didn't release a patch fixing it, and it's still questionable if it's possible to fix it.
BTW you mentioned Nvidia's anticompetitive behavior, so my question is: Big Navi performs significantly better in Dirt 5 and Valhalla, does that mean AMD made changes in those games which are clearly detrimental to the competition? Or does only Nvidia do that? ;) I love reading forum posts about how when AMD is losing in a game it's Nvidia's fault, but when Nvidia loses then it's clearly because AMD's architecture is superior.:p
Both companies do it. When they work with devs, they optimize for their own architectures, and where the architectures differ, that's going to hurt the competitor.

I would never go to the absurd length of pretending AMD good, NV bad. Both are for-profit companies.

I just point out how absurd some people are when they drag out a 5% delta as if it's a major difference, at one resolution that hardly any gamers even play at. They then ignore OC potential, because RTX Ampere already ships at its limits. Worse is the RT argument, pretending it's somehow indicative of RDNA2 RT perf when it's being tested in NV-sponsored games designed for RTX. Any reviewer worth their salt knows these feature sponsorships gimp the competition.
 

PhoBoChai

Member
Oct 10, 2017
114
363
106
According to AMD.:D At the presentation the strongest card, the 6900XT, was shown only at 4K.
But now seriously, if you want to eliminate the CPU limitation, you use the highest resolution, right? 4K is a higher resolution than 1440p, with playable FPS, unlike 8K.
There is no indication that 1440p is CPU limited, at all. Even 1080p is questionable, given that reviewers testing at 720p show many games still gain nice perf moving down to those resolutions.

The fact is that the 2x FP32 design on RTX Ampere is less utilized at lower resolutions. Accept it and know that it only shines at 4K (or in very heavy async compute games).
 
  • Like
Reactions: Tlh97 and Yosar

TESKATLIPOKA

Senior member
May 1, 2020
265
298
96
OK, let's say 1440p is not CPU bound; does that mean it should be the main indicator of GPU performance? 4K is still much more demanding on GPU hardware, and that's why I prefer it over 1440p.

I don't disagree that Ampere has problems with utilization of its FP32 (CUDA cores) at lower resolutions; it's a problem on every Ampere-based GPU (RTX 3070, 3080, 3090).
 
Last edited:

TESKATLIPOKA

Senior member
May 1, 2020
265
298
96
Now let's move to something much more interesting.
I asked an owner of an RX 6800XT named HEAD from another forum (pctforum.tyden.cz) to test his card at 1800MHz and at the default clockspeed; here are his findings.

Everything is an average value, and the tested game was Control:
Frequency: 1810MHz (100%) vs 2230MHz (123%)
Performance: 94.9FPS (100%) vs 105.9FPS (111.6%)
Power consumption: 170W (100%) vs 255W (150%)
Voltage was set to 0.8V; he said it can't be set any lower. I am not sure if the power consumption was for the whole card, I hope it was.
It looks like AMD didn't really lie in the graph about power efficiency at lower clockspeeds.
It looks like AMD could finally have a very good lineup in laptops!

P.S. For comparison, his undervolting gave him 108FPS with 213W power consumption, and the average clockspeed was 2251MHz.
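Running those numbers as FPS-per-watt makes the efficiency claim concrete. A quick sketch; the labels are mine, only the FPS and wattage figures come from the post above:

```python
# Perf-per-watt from the quoted RX 6800XT Control runs.
# Only the fps/watts values are from the post; labels and structure are mine.
results = {
    "1810MHz (clock-limited)": {"fps": 94.9, "watts": 170},
    "2230MHz (stock)":         {"fps": 105.9, "watts": 255},
    "2251MHz (undervolted)":   {"fps": 108.0, "watts": 213},
}

for name, r in results.items():
    eff = r["fps"] / r["watts"]  # frames per second per watt
    print(f"{name}: {eff:.3f} FPS/W")
```

The clock-limited run works out to roughly 0.56 FPS/W versus about 0.42 at stock, i.e. around a third better efficiency for an 11% performance loss, which is consistent with AMD's perf-per-watt-at-lower-clocks pitch.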
 
Last edited:

Shivansps

Diamond Member
Sep 11, 2013
3,110
786
136
Are people seriously arguing over a 5% delta at 4K as if it matters?

A SINGLE new game that skews heavily to NV or AMD will change that picture entirely. It's not as if the difference is massive to start with, folks.

There's a custom 6800XT shown to OC on the air cooler to 2.65GHz, and with great perf scaling from core OC (not memory bandwidth limited per Computerbase.de tests), a 6800XT OC will match or beat a 3080 OC even at 4K and extend the lead at 1440p and 1080p.

As for RT, this is RDNA2 RT performance when devs optimize for it, instead of being forced to run RTX titles from NV's long-standing GimpWorks program, which have clearly shown themselves to be detrimental to AMD perf.



RX 6800XT actually loses less performance with RT on vs off here.

So much for inferior RT hardware claims by the clueless masses.
Are people still arguing about who's better when AMD and Nvidia aren't even fighting anymore? Let's be honest here: AMD launched a slightly slower GPU than the 3080 at a slightly lower price, and a GPU that was faster than the 3070 at a higher price... and as things stand right now, when the 6700XT launches it will be slower and cheaper than a 3070.
 

TESKATLIPOKA

Senior member
May 1, 2020
265
298
96
If they are not fighting, what are they doing? Performance/price also got significantly better compared to Turing.
The RX 6700 with only 40CU should be slower than the 3070 based on Big Navi's performance; even clocking it at 2.5GHz won't make up the difference.
If it turns out it's pretty close, then Big Navi most likely has the same problem as Ampere in effectively using a high number of CUs or SMs.

Does anyone know if N22 will be shown this year or only at the beginning of the next?
 
Last edited:

uzzi38

Golden Member
Oct 16, 2019
1,220
2,250
96
Are people still arguing about who's better when AMD and Nvidia aren't even fighting anymore? Let's be honest here: AMD launched a slightly slower GPU than the 3080 at a slightly lower price, and a GPU that was faster than the 3070 at a higher price... and as things stand right now, when the 6700XT launches it will be slower and cheaper than a 3070.
You should be counting your lucky stars there is even a GA102-based 3080 in the first place, as opposed to a GA104-based 3080 with GA102 restricted to the 3090 at the same $1500 price tag.
 

Panino Manino

Senior member
Jan 28, 2017
279
327
106
Die shots of the new console APUs are taking so long!
There's some "heated discussion" right now about the Series X supposedly underperforming, but the possibility that some amount of Infinity Cache on the PS5 is making its presence felt is still open. Like on desktop, it can't help much at 4K, but at lower resolutions it works wonders.

I'm curious if AMD will also put some amount of IC along with RDNA2 inside the next mobile APUs. If they do this they could nullify Intel's challenge with Xe.
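The "helps less at 4K" intuition fits a simple effective-bandwidth model: the bigger the working set relative to the cache, the lower the hit rate, and the more traffic falls through to DRAM. A sketch with illustrative numbers; AMD's RDNA2 marketing quoted a 4K hit rate of roughly 58% for the 128MB cache, while the other hit rates and bandwidth figures here are ballpark assumptions of mine:

```python
# Illustrative effective-bandwidth model for a large cache in front of DRAM.
#   effective_bw = hit_rate * cache_bw + (1 - hit_rate) * dram_bw
# Bandwidth values below are rough assumptions, not measured figures.

def effective_bandwidth(hit_rate, cache_bw_gbs, dram_bw_gbs):
    return hit_rate * cache_bw_gbs + (1 - hit_rate) * dram_bw_gbs

DRAM_BW = 512    # GB/s, e.g. a 256-bit GDDR6 bus at 16Gbps
CACHE_BW = 1940  # GB/s, rough on-die cache bandwidth assumption

# Hit rate drops as the frame's working set grows with resolution.
for res, hit in [("1080p", 0.80), ("1440p", 0.70), ("4K", 0.58)]:
    bw = effective_bandwidth(hit, CACHE_BW, DRAM_BW)
    print(f"{res}: hit rate {hit:.0%} -> ~{bw:.0f} GB/s effective")
```

Under these assumptions effective bandwidth falls by several hundred GB/s going from 1080p to 4K, which is exactly the "IC helps less at 4K" pattern being discussed.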
 
  • Like
Reactions: Olikan

TESKATLIPOKA

Senior member
May 1, 2020
265
298
96
Die shots of the new console APUs are taking so long!
There's some "heated discussion" right now about the Series X supposedly underperforming, but the possibility that some amount of Infinity Cache on the PS5 is making its presence felt is still open. Like on desktop, it can't help much at 4K, but at lower resolutions it works wonders.

I'm curious if AMD will also put some amount of IC along with RDNA2 inside the next mobile APUs. If they do this they could nullify Intel's challenge with Xe.
Die shots of both the Xbox Series S and X are out; only the PS5 is missing.
There were some leaks about Van Gogh and Rembrandt die sizes and they were quite big, so I believe there will be IC in them, probably only 16MB, but even that is something.
 

Shivansps

Diamond Member
Sep 11, 2013
3,110
786
136
You should be counting your lucky stars there is even a GA102 based 3080 in the first place as opposed to a GA104 based 3080 with GA102 being restricted to the 3090 at the same $1500 price tag.
If the RX 6800 was performing 15% over a GA104 3080, you can be sure it would not have launched at $580. This is why the whole discussion is pointless: AMD and Nvidia aren't in a price war, they are complementing each other's products and avoiding the same price points, at least for now. And I don't expect this to change with a 40CU 6700XT.
 

CakeMonster

Senior member
Nov 22, 2012
973
72
91
I do think the 4K benchmarks are worth more than the 1080p benchmarks. 4K will tell you more about future needs with regard to geometry and bandwidth, as 1440p will get that kind of load with future games. It's not the best approximation, but it is one of the better ones.
 
  • Like
Reactions: amenx

amenx

Platinum Member
Dec 17, 2004
2,740
494
126
I do think the 4K benchmarks are worth more than the 1080p benchmarks. 4K will tell you more about future needs with regard to geometry and bandwidth, as 1440p will get that kind of load with future games. It's not the best approximation, but it is one of the better ones.
Yep, it's the only thing I look at. Can't go back to less than 4K, mainly because it allows you to have much larger screens than 1080p.
 

Mopetar

Diamond Member
Jan 31, 2011
5,049
1,552
136
I'd drop down from 4K if the game needs it for a good frame rate. It doesn't matter how sharp or good the visuals look when I'm just watching a slideshow. This is actually an area where I think Nvidia's DLSS has a legitimate use, but I'm not going to use it as a crutch for crippled RT performance.
 
  • Like
Reactions: kurosaki

uzzi38

Golden Member
Oct 16, 2019
1,220
2,250
96
If the RX 6800 was performing 15% over a GA104 3080, you can be sure it would not have launched at $580. This is why the whole discussion is pointless: AMD and Nvidia aren't in a price war, they are complementing each other's products and avoiding the same price points, at least for now. And I don't expect this to change with a 40CU 6700XT.
Because a price war is the stupidest move to make when you can't supply the market?
 

Glo.

Diamond Member
Apr 25, 2015
4,589
3,190
136
Yes, it was difficult, because you were replying to vissarix, who was talking only about gaming performance. So naturally I assumed it was only about performance and not the rest.
The most important gaming resolution is 4K, and that's a loss.
BTW RT performance is a big loss for RDNA2; maybe it will change with future patches, we will see, but for now it's still a big loss.
Both AMD and Nvidia have their advantages and disadvantages, so I wouldn't say one is clearly superior!
I can honestly say I am pretty happy with both of their products.
Vissarix doesn't have the will to be factual, only to troll.

So I took the opportunity to show that his agenda has no place, because factually AMD's product is superior to Nvidia's, taking the WHOLE package into account.
 

TESKATLIPOKA

Senior member
May 1, 2020
265
298
96
Vissarix doesn't have the will to be factual, only to troll.

So I took the opportunity to show that his agenda has no place, because factually AMD's product is superior to Nvidia's, taking the WHOLE package into account.
Here are some facts.
No DLSS alternative so far.
Much worse RT performance so far.
A bit worse 4K performance.
This can't be called a superior product, but I won't call Ampere a superior product either. If things change in the next half a year, I will change my opinion.
 
Last edited:

Mopetar

Diamond Member
Jan 31, 2011
5,049
1,552
136
Vissarix doesn't have the will to be factual, only to troll.

So I took the opportunity to show that his agenda has no place, because factually AMD's product is superior to Nvidia's, taking the WHOLE package into account.
You're falling into the troll's trap, where you feel the need not just to refute the claims but to counter-argue them. It's pretty easy to laugh off the idea that Ampere is superior to Navi 21. Why bother arguing in the opposite direction when it's similarly pointless to argue that AMD has a superior product? You're just going to get into stupid and pointless arguments about what "superior" means or whether some performance metric should be considered.

Arguing whether RDNA2 or Ampere is superior is a fool's errand. At the high end they're within spitting distance of each other, and the reason to prefer one over the other is likely to come down to whatever game you primarily play rather than anything else. They're both brand new, and we haven't even begun to see updates and performance tweaks for either yet. Quite frankly, anyone should be happy to get their hands on either of the cards.
 
