AMD 6000 reviews thread


BFG10K

Lifer
Aug 14, 2000
22,709
2,956
126


Wow, even the number 3 card has 16GB VRAM and is faster than the 2080TI. And the $1000 6900XT matches the $1500 3090 in performance.

The 3000 parts don't look so hot now.

Post reviews edit:
It's astonishing what AMD have managed to achieve with both the Ryzen 5000 and the Radeon 6000, especially given the absolutely minuscule R&D budget and resources compared to nVidia/Intel. Lisa Su is definitely the "Steve Jobs" of AMD with such a remarkable turnaround.

6900XT:
(It's absolutely amazing to see AMD compete with the 3090)


 

CastleBravo

Member
Dec 6, 2019
119
271
96
Devs really want to do a single code path for all platforms (and GPUs) with minimal per-game work. They also want "really good high quality imaging", "really good scaling" and "no performance hit".

LOL, and I want AMD to pay me money to use their GPU that has supreme performance and generates electricity rather than consuming it.
 

CastleBravo

Member
Dec 6, 2019
119
271
96
There is variance in the reviews depending on scene, CPU, RAM, etc. There are titles where the 6800XT matches or even beats the 3090 at 4K (e.g. Dirt 5, AC Valhalla), so stating "3080 wins at 4K" is more a narrative you want to believe than a fact. From where I'm looking, the 6800XT is clearly faster than the 3080 at 1080p, which is the e-sports resolution and super important for the Twitch crowd, slightly beats it at 1440p, and slightly loses at 4K.

DLSS and whatever AMD brings to the table are, to me, more a way to get extra frames in competitive shooters. Judging by Control and the awful sharpening artifacts I see on flat surfaces like posters and walls, it is not something I will be using in single-player titles unless they introduce a slider for the sharpening effect these features add.

As for drivers, I really wouldn't take the discussion there, especially after the very poor showing of the Ampere launch drivers. Once PS5/XSX-focused cross-platform titles start pouring in, it won't look nice at all for Ampere. At least this time they have RT as a strong point, else it would be Kepler vs. GCN all over again.

Do we know if AC Valhalla and/or Dirt 5 are using VRS?
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,780
7,107
136
I think AIBs are going to have a field day with Navi 21 given some of the OC headroom without even being able to overvolt the card or OC the memory. Looks like overclocks also translate pretty well into performance increases, unlike Navi 10 where OCing didn't appear to move the actual performance needle much.

All said and done, I am feeling good for AMD. This is a hell of an accomplishment for AMD and puts them in a better competitive position than they've been in since Tahiti (even Hawaii had some serious issues at launch that needed AIBs and "Fine Wine" to sort out).

I strongly suspect that the most disappointing element, the RT performance, has a lot of room for improvement as devs structure their rendering around it and as AMD refines performance through drivers.
 

Leeea

Diamond Member
Apr 3, 2020
3,617
5,363
136
All said and done, I am feeling good for AMD.

It is a straight-up manufacturing race now. Whichever company can make the most cards will win this generation.

Both have advantages and disadvantages going into this:
RX 6000:
smaller die
commodity memory
TSMC superior manufacturing process
but: production is split with the Ryzen 5000, Xbox Series X, and PS5

Nvidia 3000:
nearly the sole focus for Nvidia
main focus of Samsung's production line
two month lead out the gate
but: custom memory, Samsung is having issues, larger die

Nvidia has focus, AMD has easier design.
It will be a shoot out over the coming months.
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,763
783
126
So no 6000's in stock, no 5 series ryzens in stock, no geforces in stock...TSMC/Samsung need to ramp up more. Feel like they are leaving money on the table here.

But I guess compared to phones, laptops etc desktop parts don't sell in volume to the same degree.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
So no 6000's in stock, no 5 series ryzens in stock, no geforces in stock...TSMC/Samsung need to ramp up more. Feel like they are leaving money on the table here.

But I guess compared to phones, laptops etc desktop parts don't sell in volume to the same degree.

Has more to do with the fact that 3/4 of the world is stuck inside their houses and wants to play video games. Things are being built as fast as they can. nVidia launched 4 months early to get out ahead of AMD, so their stock sucks because Samsung won't be in volume until Q1 of next year.

And GPUs and CPUs always sell out in minutes on day one, even in a regular year.
 

gdansk

Platinum Member
Feb 8, 2011
2,078
2,559
136
So no 6000's in stock, no 5 series ryzens in stock, no geforces in stock...TSMC/Samsung need to ramp up more. Feel like they are leaving money on the table here.

But I guess compared to phones, laptops etc desktop parts don't sell in volume to the same degree.
I'm guessing Samsung's and TSMC's capacity plans for 8nm/7nm didn't anticipate a surge in demand in 2020. Not that they decreased capacity, but they focused more on bringing their respective 5nm processes up.

It's bad business and wasteful to build multi-billion-dollar factories unless you're confident they'll find customers for years and years. Plus, I'm pretty sure ASML et al. have been selling all the machines they can make, in advance, for years now.

Even if TSMC increased capacity, the semiconductor packaging companies in Taiwan and Malaysia are far behind schedule, with increased lead times because of this year's surge.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,780
7,107
136
So no 6000's in stock, no 5 series ryzens in stock, no geforces in stock...TSMC/Samsung need to ramp up more. Feel like they are leaving money on the table here.

But I guess compared to phones, laptops etc desktop parts don't sell in volume to the same degree.

-DIY market is getting blue balled hard. Likely most manufacturing is going into fulfilling OEM orders and the like.
 

Guru

Senior member
May 5, 2017
830
361
106
I mean, supply is low, but only if you look at it through the prism of COVID-19 and people staying at home wanting to game. I don't think over half a million units available on launch day is low supply.

I don't think Nvidia has low supply either; they might have had fewer than half a million units at release date, my educated guess is somewhere around 250k, but we are in unprecedented times. Most people upgraded when the Pascal and Polaris launches happened; it was a big move from 28nm (or whatever) to 16nm with a huge performance boost, and they've been sitting on those cards till now, since the 2000 series from Nvidia was crap and AMD only released mid- and low-range cards last gen, so there was no real upgrade path.

Now these new cards bring a significant boost in performance, and people who were sitting on 1070s, 1080s, 1080 Tis, Vega 56s and 64s, and older graphics cards all want to upgrade! We have over 80 million gamers worldwide, it's a COVID-19 year with government lockdowns and that sort of redacted, so people are really trigger-happy to buy a new graphics card. There is easily demand for 2 million higher-end graphics cards right now!

Profanity isn't allowed in the tech forums.

AT Mod Usandthem
 

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
I mean supply is low, but only if you look from the prism of covid19 and people staying at home and wanting to game, I don't think over half a million units available at launch day is low supply.

The big local online shop doesn't have a date for when they will get supply, nor have they even set a price yet. That tells me everything about the supply situation. For Ryzen there was at least some supply at launch, but here there is just nothing.
 

Leeea

Diamond Member
Apr 3, 2020
3,617
5,363
136
Things are being built as fast as they can. nVidia launched 4 months early to get out ahead of AMD, so their stock sucks because Samsung wont be in volume until Q1 of next year.

And GPU's and CPU's sell out in minutes day one always, even in regular year.

If Nvidia cannot get their act together before next year, they are going to lose a lot of market share. They released the RTX 3000 back in September; four-plus months is a long time for the hype train to stay airborne.

AMD and TSMC may be in a crunch right now, but that is a proven manufacturing line with the bugs worked out. It is a sure thing they will turn out a massive quantity of product.

Most buyers are only going to buy one of the following: Xbox, PS5, RX 6000, or RTX 3000. Nvidia is only making money with one of those.


Sure, Nvidia will be fine. But people will realize AMD drivers are not crap, and going into RDNA3/next gen, brand loyalty to Nvidia is likely to be frayed.
 

DisEnchantment

Golden Member
Mar 3, 2017
1,601
5,779
136
A very long stream, but surprisingly informative considering how high up the people being interviewed are. Nice to see some rumors being confirmed, too.


  • Mentions why they chose the Infinity Cache and how they got the clock-speed increases (architects from the CPU side were/are involved)
  • Talks about RT perf (developers had 2 years with only RTX cards, Herkelman believes things will improve with new titles)
  • Super Resolution - Game devs, Microsoft and Sony essentially begged them not to make a proprietary API, but something that could be used everywhere on all hardware (including Intel and Nvidia). Devs really want to do a single code path for all platforms (and GPUs) with minimal per-game work. They also want "really good high quality imaging", "really good scaling" and "no performance hit".
  • Why SAM isn't just a PCIe 4.0 BAR switch (well, it is, but a lot of firmware and BIOS work needed to be done for it to get the performance it does without regressions elsewhere; Nvidia will face similar issues)
  • Supply (they are shipping daily to partners for AIB cards) explains why they always release
And plenty of other stuff I missed.

IMO it's really interesting to get some tidbits straight from the horse's mouth rather than via endless speculators.
They also mentioned professional parts to be announced shortly.

I did not want to get one of the 6000 series because the 5700XT has been very unstable for me on Linux. It could also be that my 5700XT is one of the early stepping chips. My 5700XT is doing nothing; most of my work is on an RX480, which, surprise, has ROCm support while the 5700XT does not.
But Phoronix has a good review of the RX6800XT on Linux with ROCm, soundly beating the competition's cards.
So I am a bit swayed. Waiting for the Pro parts to come out, hopefully something with HBM.
Thermals and noise are awesome, and will be leaps and bounds above my 5700XT.
 

Gideon

Golden Member
Nov 27, 2007
1,619
3,645
136
They also mentioned professional parts to be announced shortly.

I did not want to get one of the 6000 series because the 5700XT has been very unstable for me on Linux. It could also be that my 5700XT is one of the early stepping chips. My 5700XT is doing nothing; most of my work is on an RX480, which, surprise, has ROCm support while the 5700XT does not.
But Phoronix has a good review of the RX6800XT on Linux with ROCm, soundly beating the competition's cards.
So I am a bit swayed. Waiting for the Pro parts to come out, hopefully something with HBM.
Thermals and noise are awesome, and will be leaps and bounds above my 5700XT.
Yeah, it probably doesn't make sense to upgrade too often. But regarding Linux performance, Wendell seems to really like it:

According to him, the last time things were this stable at launch was maybe around Polaris (and he even throws out comparisons to 20-year-old Matrox cards). PCIe reset etc. seem to be working well out of the box (very unlike the 5700 XT), and the latest ROCm supports Big Navi well.
 

H T C

Senior member
Nov 7, 2018
549
395
136
They also mentioned professional parts to be announced shortly.

I did not want to get one of the 6000 series because the 5700XT has been very unstable for me on Linux. It could also be that my 5700XT is one of the early stepping chips. My 5700XT is doing nothing; most of my work is on an RX480, which, surprise, has ROCm support while the 5700XT does not.
But Phoronix has a good review of the RX6800XT on Linux with ROCm, soundly beating the competition's cards.
So I am a bit swayed. Waiting for the Pro parts to come out, hopefully something with HBM.
Thermals and noise are awesome, and will be leaps and bounds above my 5700XT.

According to Wendell from Level1Techs, its Linux support is awesome:


Gideon beat me to it ...
 

RickITA

Junior Member
Nov 8, 2020
3
0
11
I have not read Anandtech reviews of the Nvidia 3XXX and Radeon 6XXX cards. If Anandtech has decided not to review GPUs that are not actually available to the general public, that is commendable, but you should at least post an editorial explaining why you are doing that.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,325
10,034
126
Regarding mining on the RX 6800(XT) card, as shown in the video - rather disappointing, TBH. Then again, RedPandaMining had some estimates last week, and they weren't far off, if a tiny bit high.

I just last week bought 2x RX 5700XT cards, Asus Dual (posted in Hot Deals) for $360 + tax ea. I don't feel so bad about "missing out" on the RX 6800-series launch (for mining), because I'm getting 99MH/sec for 355W at the wall in a G4560 Intel mining "shell" rig, for $720 + tax for the GPUs.

If you think about it, it makes sense: two more-or-less independent cards EACH have a 256-bit memory bus. How can a single GPU with a larger VRAM size but the same bus width hope to compete on a memory-bandwidth-dependent benchmark like ETH mining? It's effectively a 512-bit bus when mining ETH on both cards together.
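The back-of-the-envelope reasoning above can be sketched out. This is a rough sketch, not from the post: the 14 Gbps GDDR6 figure for the RX 5700 XT and the ~8 KB of DAG reads per Ethash hash are my assumptions.

```python
# Peak memory bandwidth: bus width (bits) x per-pin data rate (Gbps) / 8 bits per byte
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8

# Ethash is bandwidth-bound: each hash reads roughly 8 KB of DAG,
# so memory bandwidth puts a hard ceiling on the hashrate.
def max_hashrate_mhs(bw_gbs: float, bytes_per_hash: int = 8192) -> float:
    return bw_gbs * 1e9 / bytes_per_hash / 1e6

per_card = bandwidth_gbs(256, 14)   # RX 5700 XT: 448 GB/s
two_cards = 2 * per_card            # 896 GB/s aggregate, like one 512-bit card
print(max_hashrate_mhs(two_cards))  # ~109 MH/s theoretical ceiling for the pair
```

At roughly 90% bandwidth efficiency, that two-card ceiling lands right around the 99 MH/s reported above.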

For gaming, especially on my 40" 4K UHD TV, sure, I'd prefer an RX 6800XT. Maybe I'll pick one up in the future.

It would indeed be a hoot, if as someone commented, that AMD is intentionally sandbagging mining in the drivers, until six months after product launch. :p

Edit: BitsBeTripping YT purchase / testing vid of an RX 6800 non-XT (cliffs: 65MH/sec too, nearly same as RX 6800XT).
Skip to 7:45 to see mining results.
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,399
2,436
146
Keep in mind it may not just be the drivers being immature for mining; the mining apps, such as PM or Claymore, may also need to be updated for better performance with new cards.
 

Veradun

Senior member
Jul 29, 2016
564
780
136
4K is a cursed resolution...
Ampere doubled FP32 and has massive bandwidth... not very impressive results.

RDNA2 doubled everything, plus 0.5 GHz on top... reee... even worse.
That's why Nvidia started the DLSS train, soon to welcome AMD aboard.

People want to see "ultra" in the settings, so you conjure a tool that runs the game at "high" while you set it to "ultra", and you are good to go.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
@VirtualLarry
Keep in mind it may not just be the drivers being immature for mining, but also possibly the mining apps, such as PM or claymore, need to be updated for better performance with new cards.

Why do you guys think that will be true? I said mining performance would be mediocre because of the 256-bit bus. At 65 MH/s it's already taking advantage of the 128MB cache, since 65 × 8 = 520 GB/s, which is actually higher than the memory bandwidth of the card, and you have to take off about 10% since it won't be 100% efficient.
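The bandwidth arithmetic in that post can be sanity-checked with the same rule of thumb. A sketch only: the 16 Gbps GDDR6 figure for the 6800 XT and the ~8 KB-per-hash Ethash cost are my assumptions, not from the post.

```python
# Bandwidth Ethash needs at a given hashrate, assuming ~8 KB of DAG reads per hash.
# MH/s x KB/hash gives GB/s directly (1e6 hashes x 1e3 bytes = 1e9 bytes).
def required_bandwidth_gbs(hashrate_mhs: float, kb_per_hash: float = 8) -> float:
    return hashrate_mhs * kb_per_hash

needed = required_bandwidth_gbs(65)  # 520 GB/s needed to sustain 65 MH/s
card_bw = 256 * 16 / 8               # 256-bit bus @ 16 Gbps = 512 GB/s raw GDDR6
print(needed > card_bw)              # True: raw memory bandwidth alone can't feed it
```

If the required bandwidth exceeds what the GDDR6 can deliver, some of the DAG traffic must be hitting the Infinity Cache, which is exactly the point being made.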

As for RT performance, a few sites are showing worse image quality, so even the lower performance is overrepresented. Tom's Hardware results show the 6800's RT is blurry. I wouldn't use it at all if I were considering RT, as even Turing is superior when you count the IQ.

It's a very good effort. Let's see RDNA3.