AMD 6000 reviews thread


BFG10K

Lifer
Aug 14, 2000
22,674
2,824
126


Wow, even the number 3 card has 16GB VRAM and is faster than the 2080TI. And the $1000 6900XT matches the $1500 3090 in performance.

The 3000 parts don't look so hot now.

Post reviews edit:
It's astonishing what AMD have managed to achieve with both the Ryzen 5000 and the Radeon 6000, especially given the absolutely minuscule R&D budget and resources compared to nVidia/Intel. Lisa Su is definitely the "Steve Jobs" of AMD with such a remarkable turnaround.

6900XT:
(It's absolutely amazing to see AMD compete with the 3090)


 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,806
5,769
136
Even with consoles supporting ray tracing, we are already seeing the performance hit. Lots of games are 30fps with ray tracing or need to run at 1080p/1440p to get 60fps. I don't know if we will see a huge jump in games using ray tracing or not; it's kind of early for that. However, ray tracing performance will be a definite concern going forward. Leaked benchmarks put the RT performance of the 6800XT about 18% behind the 3080. That's not a very good start. However, those same leaks put performance without ray tracing above the 3080. So maybe the raw performance can help make up for the hit ray tracing has? Not sure. Have to see what actual game benchmarks look like.

Like you mention though, some type of high-quality supersampling technique is almost necessary to get the needed performance out of ray-tracing-enabled games. Hopefully the technique AMD is working on, mentioned above, comes soon and works well.

I do think the floodgates have opened for raytracing by now. It's in Battlefield. It's in Call of Duty. It's going to be in Cyberpunk. It's coming in the Spiderman remaster. It's in Spiderman Miles Morales. It's in the Demon's Souls remake. It's in the upcoming Ratchet & Clank. It'll be in Unreal Engine. I honestly wasn't that excited about raytracing until I saw how clean DLSS-2.0 looked and how much headroom it gave even the RTX 2060. I imagine I'll mostly turn it off on PS5 though unless AMD can do something as impressive as DLSS-2.0 to make 1440p60 or 1800p60 possible with raytracing on the system.
 

thilanliyan

Lifer
Jun 21, 2005
11,848
2,051
126
Is there a "go to" AMD gpu manufacturer for their customer service and quality builds ? For example - EVGA only makes Nvidia boards and is always my go to for the customer service and quality builds such as the FTW series. I have never had a bad EVGA card. I remember when XFX made Nvidia cards. Now they specialize in AMD. Is there one company that's better than the other?
At least in my case, I try to stick with XFX and MSI as they are somewhat friendly to people replacing thermal paste, etc. last I checked.

Personally I don't think there is enough difference between the 6800XT and 6900XT to justify such a large increase in price. I'll probably pick up a 6800XT.

And I REALLY miss HBM lol... sooo much easier to cool/watercool than GDDR cards. The 5700XT I have is a nightmare to cool compared to my Vega because the GDDR6 VRAM runs so hot.
 
  • Like
Reactions: Kirito

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I do think the floodgates have opened for raytracing by now. It's in Battlefield. It's in Call of Duty. It's going to be in Cyberpunk. It's coming in the Spiderman remaster. It's in Spiderman Miles Morales. It's in the Demon's Souls remake. It's in the upcoming Ratchet & Clank. It'll be in Unreal Engine. I honestly wasn't that excited about raytracing until I saw how clean DLSS-2.0 looked and how much headroom it gave even the RTX 2060. I imagine I'll mostly turn it off on PS5 though unless AMD can do something as impressive as DLSS-2.0 to make 1440p60 or 1800p60 possible with raytracing on the system.

I think that's where their real focus with this upscaling is. It's likely something they have been working on in partnership with Xbox and PlayStation to get acceptable performance with ray tracing in console games. 30fps is not that great, and while I've been somewhat used to it on a console for a while, we all know 60fps and higher just looks and plays better.
 

ModEl4

Member
Oct 14, 2019
71
33
61
Judging from AMD's presentation (and the message AMD is trying to send, that it has a card that can compete with the 3090), as far as I can tell we would have at 4K:
(I assume Rage + SAM brings around +5% more performance on average. This certainly doesn't hurt AMD, because if Rage + SAM were worth more, say 10%, it would mean that on a 10900K without Rage the 6900XT is only at 3080 level...)
3090 = 6900XT with Rage & SAM (Zen 3 with Rage)
3090 = +5% 6900XT (10900K without Rage)
3080 = -2.5% 6800XT with Rage & SAM (Zen 3 with Rage)
3080 = +2.5% 6800XT (10900K without Rage)
3070 = -13% 6800 with Rage & SAM (Zen 3 with Rage) (a 13% margin is a ~15% markup, which correlates well with Coreteks' claim of +15% perf vs the 2080Ti in select AMD-optimized games)
3070 = -8.5% 6800 (10900K without Rage)
The thing is that AMD's 3080 results in Borderlands and Gears 5 are too low; for example, in Gears TechPowerUp achieved 84.4fps at the Ultra preset with an i9-9900K @ 5.0GHz. Also, the game suite AMD used is not exactly in Nvidia's favor. With more games tested I think the averages will shift at least 2.5% in Nvidia's favor, and we will have:
3090 = +2.5% 6900XT with Rage & SAM (Zen 3 with Rage)
3090 = +7.5% 6900XT (10900K without Rage)
3080 = 6800XT with Rage & SAM (Zen 3 with Rage)
3080 = +5% 6800XT (10900K without Rage)
3070 = -11% 6800 with Rage & SAM (Zen 3 with Rage)
3070 = -6% 6800 (10900K without Rage)
I think AMD is expecting Nvidia to announce (tomorrow, 17 November?) a 3080Ti 12GB (312TC, $999?, -2% from the 3090?) and a 3070Ti 10GB (232TC, $599?, -11% from the 3080?), and since the 3070Ti would have better perf/$ than the 3080, in contrast with the 6800 vs the 6800XT, I think AMD will drop the 6800's price and possibly even the 6900XT's (is that why the peculiar $579 price tag? lol, I don't like it at all). Probably the 6800 will go to $549, competitive with the 3070Ti in performance/$ to the same degree as the 6800XT vs the 3080.
And possibly the 6900XT to $899: on a 10900K without Rage, the 3080Ti should be +5% vs the 6900XT, and since 12GB is enough, AMD will have nothing to offer if it stays at $999. Of course, these are all assumptions; I don't have anything to back this up. Maybe I'm overthinking it, I don't know; we will see.
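As a rough sanity check of the arithmetic above, here is a minimal sketch in Python. Everything in it is an assumption restated from this post (the AMD slide deltas plus the guessed +5% combined Rage + SAM uplift); none of it is measured data.
[CODE]
# Back-of-envelope check of the standings above. All inputs are assumptions
# from this post (AMD slide deltas + an assumed +5% combined Rage+SAM uplift);
# nothing here is measured data.

RAGE_SAM_UPLIFT = 1.05  # assumed combined gain from Rage Mode + SAM on Zen 3

# Estimated Radeon performance relative to its Nvidia counterpart at 4K,
# WITH Rage + SAM enabled (1.00 = parity).
with_rage_sam = {
    ("6900XT", "3090"): 1.00,   # parity with the 3090
    ("6800XT", "3080"): 1.026,  # ~2.5% ahead of the 3080
    ("6800",   "3070"): 1.15,   # ~15% ahead of the 3070
}

# Divide out the assumed uplift to estimate the standing on a 10900K
# (no SAM, Rage off).
for (radeon, geforce), ratio in with_rage_sam.items():
    plain = ratio / RAGE_SAM_UPLIFT
    print(f"{radeon} vs {geforce}: {ratio - 1:+.1%} with Rage+SAM, "
          f"{plain - 1:+.1%} without (10900K)")
[/CODE]
That lands within rounding of the -5% / -2.5% / -8.5% 10900K figures listed above.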
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
Pleasantly surprised by SAM. I thought it was another stupid vendor-specific feature that wasn't going to be supported, but no, it works in every game and is actually a really cool removal of a traditional bottleneck on PCs.
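For anyone curious what SAM actually toggles: it is AMD's branding for PCIe resizable BAR, i.e. exposing the whole VRAM to the CPU instead of a 256MB window. A quick Linux-only sketch (my own, not an official AMD tool) to see the largest BAR your GPU currently exposes:
[CODE]
# Linux-only sketch: report the largest PCI BAR each display adapter exposes.
# With resizable BAR / SAM active this is roughly the full VRAM size;
# without it, it is typically capped at 256 MiB.
from pathlib import Path

for dev in Path("/sys/bus/pci/devices").iterdir():
    pci_class = (dev / "class").read_text().strip()
    if not pci_class.startswith("0x03"):        # 0x03xxxx = display controller
        continue
    sizes = []
    for line in (dev / "resource").read_text().splitlines():
        start, end = (int(x, 16) for x in line.split()[:2])
        if end > start:                         # skip unused (all-zero) regions
            sizes.append(end - start + 1)
    largest_mib = max(sizes, default=0) // 2**20
    print(f"{dev.name}: largest BAR ~{largest_mib} MiB")
[/CODE]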

It's also interesting that AMD is now more efficient than Nvidia. Hopefully they still have some overclocking headroom; AMD seemed to be content not to raise the TDP beyond 300 watts and go full retard like Nvidia did, so I'm hopeful.

As for RT and DLSS, I won't comment without more information. I'm actually really annoyed AMD is being coy with this stuff; even if the 6800XT is 30% slower than the 3080 at ray tracing alone, that's actually not that bad. So I don't understand the silence. And no release date for the DLSS competitor is dumb. 😔
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
They can't actually do a 'real' DLSS competitor. DLSS only works because of all the die space NV have devoted to the tensor cores to speed up neural-net processing.

Otherwise it wouldn't speed anything up.

The tensor cores were originally a deep-learning feature that, in Turing at least, was a very definite hindrance to the gaming side.

NV are getting some major payback now of course but it took a lot of work from their software people.
 

vissarix

Senior member
Jun 12, 2015
297
96
101
Pretty disappointing launch from AMD, as I expected. They show the RX 6900XT being on par with the RTX 3090 in their own biased benchmarks, probably a best-case scenario. It's safe to say that once reviewers get these cards it will be 10% slower, so they price them lower as always... It's a hard sell since they lack proper ray tracing and DLSS; I would rather spend a little more and have those features...

Just remember when they showed the Fury X and Radeon VII: in both cases, according to AMD's benchmarks, they were faster than the GTX 980Ti and RTX 2080, but in reality they were quite a bit slower and both failed miserably.
 

Timorous

Golden Member
Oct 27, 2008
1,538
2,538
136
The thing is that AMD's 3080 results in Borderlands and Gears 5 are too low; for example, in Gears TechPowerUp achieved 84.4fps at the Ultra preset with an i9-9900K @ 5.0GHz. Also, the game suite AMD used is not exactly in Nvidia's favor. With more games tested I think the averages will shift at least 2.5% in Nvidia's favor, and we will have:

TPU do not test Borderlands with Badass mode, and they use the same API for both cards. AMD said they used the best API for each card.

Cross-comparing FPS numbers from different setups is folly; don't do it. There are far too many variables to account for.

Just remember when they showed Fury X and Radeon VII, in both cases according to AMD benchmarks they were faster than the GTX 980TI and RTX 2080 but in reality they were quite slower and they both failed miserably

They also showed the 5700XT as being faster than the 2070, and then when the real benchmarks came out the 5700XT was ahead by a few percent more than AMD claimed in their event.

On top of that, they avoided games like Death Stranding, F1 2020, and Horizon Zero Dawn, where the 5700XT is faster than the 2070S at 4K, so they could have cherry-picked a lot harder if they had wanted to.
 
Jul 27, 2020
15,760
9,829
106
Pretty disappointing launch from AMD, as I expected. They show the RX 6900XT being on par with the RTX 3090 in their own biased benchmarks, probably a best-case scenario. It's safe to say that once reviewers get these cards it will be 10% slower, so they price them lower as always... It's a hard sell since they lack proper ray tracing and DLSS; I would rather spend a little more and have those features...

Just remember when they showed the Fury X and Radeon VII: in both cases, according to AMD's benchmarks, they were faster than the GTX 980Ti and RTX 2080, but in reality they were quite a bit slower and both failed miserably.
AMD is not stupid, hopefully. They are probably holding back some driver optimizations etc. in case nVidia is hiding an ace up their sleeve. They badly need to win against nVidia in this round of the GPU wars.
 
  • Like
Reactions: lightmanek

dzoni2k2

Member
Sep 30, 2009
153
198
116
Pretty disappointing launch from AMD, as I expected. They show the RX 6900XT being on par with the RTX 3090 in their own biased benchmarks, probably a best-case scenario. It's safe to say that once reviewers get these cards it will be 10% slower, so they price them lower as always... It's a hard sell since they lack proper ray tracing and DLSS; I would rather spend a little more and have those features...

Just remember when they showed the Fury X and Radeon VII: in both cases, according to AMD's benchmarks, they were faster than the GTX 980Ti and RTX 2080, but in reality they were quite a bit slower and both failed miserably.

You were shitting on Zen 3 and looked like a fool, and now you're doing the same with RDNA2. If that's your way of coping, then so be it.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
AMD is not stupid, hopefully. They are probably holding back some driver optimizations etc. in case nVidia is hiding an ace up their sleeve. They badly need to win against nVidia in this round of the GPU wars.
Come on :) AMD don't badly need anything!
(Unless you could magic up an extra fab or two for TSMC.)

AMD are printing money left and right with Zen3, selling an enormous number of console chips etc.

In some ways they're still not even really quite turning up for this round, starting months behind with a slow rollout of the smaller chips etc.

Definitely an improvement though :)
 

biostud

Lifer
Feb 27, 2003
18,194
4,675
136
$80 extra is for the additional 8GB of VRAM. They need to make a few bucks too, unless nVidia declares an all-out price war.

They can only expect a certain percentage of dies to be perfect, so they use those in the 6900XT and put a price premium on that card, limiting demand for it. Most dies can be used for the 6800XT, so they price it higher than the 6800, which makes the 6800XT the best card in price/performance and gives AMD some good margins. To prevent the 6800 from being the best card in price/performance, they put a high price on it, so they don't need to sell too many and lower their margins.
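Purely to illustrate that segmentation logic, a tiny sketch with made-up yield numbers (the bin shares below are hypothetical, only the launch MSRPs are real):
[CODE]
# Hypothetical illustration of the binning/pricing argument above.
# The bin shares are invented for the example; only the launch MSRPs are real.

DIES = 1000  # pretend batch of Navi 21 dies

bins = {
    "6900XT": {"share": 0.15, "price": 999},  # fully enabled dies are rare -> premium
    "6800XT": {"share": 0.55, "price": 649},  # most dies qualify -> the volume SKU
    "6800":   {"share": 0.30, "price": 579},  # further cut-down dies
}

for sku, b in bins.items():
    count = round(DIES * b["share"])
    print(f"{sku}: ~{count} dies at ${b['price']} -> ${count * b['price']:,}")
[/CODE]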
 

AtenRa

Lifer
Feb 2, 2009
14,000
3,357
136
If NVIDIA had 15-20% higher performance and double the VRAM against the competition, they would have priced their card $200 higher and everyone would agree with that price due to the higher performance and double the VRAM.
Now AMD does that with the 6800 vs the 3070 and it's overpriced for just $70 more??
 

Head1985

Golden Member
Jul 8, 2014
1,863
685
136
They can only expect a certain percentage of dies to be perfect, so they use those in the 6900XT and put a price premium on that card, limiting demand for it. Most dies can be used for the 6800XT, so they price it higher than the 6800, which makes the 6800XT the best card in price/performance and gives AMD some good margins. To prevent the 6800 from being the best card in price/performance, they put a high price on it, so they don't need to sell too many and lower their margins.
Same with the 5700/5700XT. They even wanted to sell the 250mm² 5700XT for $450 before the NV Super lineup; then they lowered the price to $400. Anyway, the 5700 non-XT was pretty bad value (it was even OC-locked) and most sales were the 5700XT.
 
  • Like
Reactions: Kirito

Head1985

Golden Member
Jul 8, 2014
1,863
685
136
If NVIDIA had 15-20% higher performance and double the VRAM against the competition, they would have priced their card $200 higher and everyone would agree with that price due to the higher performance and double the VRAM.
Now AMD does that with the 6800 vs the 3070 and it's overpriced for just $70 more??
15-20% where? They said 18% average with SAM enabled, and SAM works only on Ryzen 5000 + B550/X570 and adds another 5-11% performance. So it will be more like 8-10% in a wide game test without it.
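A quick back-of-envelope check of that (the 18% figure is AMD's claimed average for the 6800 vs the 3070 with SAM on, and 5-11% is the SAM uplift range quoted above; the derived numbers are just arithmetic, not measurements):
[CODE]
# Back out the 6800's lead over the 3070 without SAM from the numbers above.
# 18% is AMD's claimed average with SAM; 5-11% is the quoted SAM uplift range.

claimed_lead_with_sam = 1.18       # 6800 = 1.18x the 3070, SAM enabled (AMD claim)
sam_uplift_range = (1.05, 1.11)    # SAM worth +5% to +11% per the post above

for uplift in sam_uplift_range:
    lead_without_sam = claimed_lead_with_sam / uplift
    print(f"+{uplift - 1:.0%} SAM uplift -> ~+{lead_without_sam - 1:.0%} lead without SAM")
[/CODE]
That gives roughly +6% to +12%, which brackets the 8-10% estimate.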
 

vissarix

Senior member
Jun 12, 2015
297
96
101
What a load of bollocks, have the character to admit it was a decent launch. Not long ago you were 100% sure the previous demo GPU was their flagship. Turns out it wasn't, the AMD of today is clearly adapting past your expectations.
I expect the RX 6900XT to be on par with or slightly faster than the RTX 3080 once REAL reviews come out, perhaps 1 or 2% faster; the RTX 3080 is 10% slower than the RTX 3090.
 

gk1951

Member
Jul 7, 2019
170
150
116
The words hate, wipe out, humiliate, smash, demolish, etc. get so overused when comparing CPUs and now GPUs.

I think what is obvious from AMD's presentation of its Zen 3 CPUs and now its RDNA2 Radeon 6000 series GPUs is that it is back in high-end competition with both Intel and Nvidia.

I remember "upgrading" my Gateway 2000 386/33 Intel chip with an AMD 386/40. Oh, those were the days.

Neither Intel nor Nvidia will sit back and allow themselves to be overtaken. The good news for consumers is that neither will AMD anymore.

What I find refreshing from the Radeon GPU division is the elimination of the bravado boasts of the past.

Dr. Lisa Su has instilled a "Detective Joe Friday from Dragnet" philosophy at AMD. Just the facts, Ma'am, just the facts.

Intel has a boatload of money and some very smart engineers. It will compete just fine.

Nvidia has incredible marketing power and superb engineering, and it will do just fine.

For me, the 5900X/6800XT combo will really leverage the potential of my MSI X570 Unify motherboard.