AMD 6000 reviews thread


BFG10K

Lifer
Aug 14, 2000
22,672
2,817
126


Wow, even the number 3 card has 16GB VRAM and is faster than the 2080TI. And the $1000 6900XT matches the $1500 3090 in performance.

The 3000 parts don't look so hot now.

Post reviews edit:
It's astonishing what AMD have managed to achieve with both the Ryzen 5000 and the Radeon 6000, especially given the absolutely minuscule R&D budget and resources compared to nVidia/Intel. Lisa Su is definitely the "Steve Jobs" of AMD with such a remarkable turnaround.

6900XT:
(It's absolutely amazing to see AMD compete with the 3090)


 
Last edited:

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,381
2,415
146
@VirtualLarry


Why do you guys think that it'll be true? I said mining performance would be mediocre because of the 256-bit bus. At 65 MH/s it's already taking advantage of the 128 MB cache, since 65 x 8 = 520 GB/s, which is actually higher than the card's memory bandwidth, and on top of that you have to take off ~10% since it won't be 100% efficient.

As for RT performance, a few sites are showing worse image quality, so even the lower performance numbers are flattering. Tom's Hardware's results show the 6800's RT is blurry. I wouldn't use it at all if I were considering RT, as even Turing is superior once you factor in the IQ.

It's a very good effort. Let's see RDNA3.
You may be right about the bus; it is likely somewhat limiting for ETH. That said, optimized miners/settings can help a lot, and we have seen this before with other cards. I am not saying it will be a better miner than Ampere, just that I expect some improvements.
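As a rough back-of-envelope sketch of the bandwidth math above (assuming, as the quote does, that Ethash needs about 8 GB/s of memory traffic per MH/s and loses ~10% to inefficiency; the figures are illustrative, not measurements):

```python
# Back-of-envelope check of the hashrate-vs-bandwidth argument above.
# Assumptions (from the discussion, not measurements): Ethash needs roughly
# 8 GB/s of memory traffic per MH/s, and real-world efficiency is ~90%.

GDDR6_BANDWIDTH_GBPS = 512   # 256-bit bus @ 16 Gbps on the reference 6800 cards
EFFICIENCY = 0.90            # assume ~10% lost to overhead
GB_PER_MHS = 8               # approximate bandwidth cost per MH/s

def required_bandwidth(hashrate_mhs: float) -> float:
    """GB/s of memory traffic needed to sustain a given Ethash hashrate."""
    return hashrate_mhs * GB_PER_MHS

def bandwidth_limited_hashrate() -> float:
    """MH/s the GDDR6 alone could sustain at the assumed efficiency."""
    return GDDR6_BANDWIDTH_GBPS * EFFICIENCY / GB_PER_MHS

print(required_bandwidth(65))        # 520 GB/s, above the 512 GB/s the GDDR6 provides
print(bandwidth_limited_hashrate())  # ~57.6 MH/s without help from the Infinity Cache
```

If those assumptions are in the right ballpark, anything much above ~58 MH/s is already leaning on the Infinity Cache, which is the point being made above.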

Gaming performance looks pretty good. I will likely get a 6800 XT when I can; it is my first choice, with the RTX 3080 as second.

Finally, if AMD releases some RDNA2 card with HBM2 or something, maybe a pro card, I suspect that would be a beast for mining.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
Hardware Unboxed RX 6800 review. Makes the 3070 look like an entry-level card.

Maybe you should just change your profile photo to the AMD logo.

If you changed the manufacturer name on the cards, you would be touting the superior "cost per frame" of the 3070, as was AMD's position last generation.

They are both good cards. Even before testing, it was obvious that the 6800 would easily outperform the 3070 at a higher price point, so no real surprise. But even the 3070 is over my price limit, which only pushes the 6800 further away.

I am more interested in the 3060/3060 Ti vs 6600/6700 battle. Though I am actually thinking of shifting my Build funds to an M1 Mac. In a year of many cool new computer products (Zen 3, RDNA 2, Ampere), the M1 Macs impress me the most, and I never owned an Apple product in my life before...
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
But can you even game on an M1 Mac?

Obviously not even close to a modern Windows PC. Though it will likely be better than my 12-year-old Windows PC, so it won't be a step backwards for me. CodeWeavers (CrossOver) has already demonstrated Windows games running in emulation better than my old PC could handle them.
 

AtenRa

Lifer
Feb 2, 2009
14,000
3,357
136
Maybe you should just change your profile photo to the AMD logo.

Maybe you should refrain from making it personal ??

If you changed the manufacturer name on the cards, you would be touting the superior "Cost per Frames" of the 3070, as was AMDs previous generation position.

"superior" is a strong word for just 1% lower Cost per Frame at 1440p


They are both good cards.

With just 8 GB of VRAM, the 3070 is not a good card for a $500 product in late 2020/early 2021.
 

Whitestar127

Senior member
Dec 2, 2011
397
24
81
I've seen the reviews. Who the heck buys a 3080 to play at 1080p?
It's not as outrageous as you might think. I'm using strobing (v-sync on) on my monitor to remove blur, so wherever possible I will try to use anywhere from a 75 to 120 Hz refresh rate (pretty much as high as I can go), because 60 Hz is a bit flickery and dark (and laggy). And I need the fps to stay above the refresh rate, otherwise I get stutters/judders.

If you look at HUB's benchmark of Valhalla, for example, even at 1080p Ultra the 1% low is well below 60 for the 3080. Other new games seem to be quite performance-heavy as well, and don't get me started on MS Flight Simulator. :grinning:

So yeah, I'll definitely use 1080p for some titles.
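To make that constraint concrete, here is a tiny illustrative helper that picks the highest strobed refresh rate a game's 1% lows can still clear (the supported rates and the example figures are placeholders, not benchmark data):

```python
# Pick the highest refresh rate the monitor supports that a game's 1% low
# fps can still stay above, so strobing doesn't stutter or judder.
# Supported rates and the example 1% lows below are placeholders.

def best_strobe_rate(one_percent_low_fps, supported_hz=(75, 100, 120)):
    usable = [hz for hz in supported_hz if one_percent_low_fps >= hz]
    return max(usable) if usable else None

print(best_strobe_rate(112))  # -> 100: a 120 Hz strobe would dip below refresh
print(best_strobe_rate(58))   # -> None: even 75 Hz strobing would judder
```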

EDIT: For the record I'm still on the fence between the 3080 and the 6800XT
 
Last edited:
  • Like
Reactions: Tlh97 and Elfear

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
There is variance in the reviews depending on scene, CPU, RAM, etc. There are titles where the 6800 XT matches or even beats the 3090 at 4K (e.g. Dirt 5, AC Valhalla), so stating "the 3080 wins at 4K" is more of a narrative you want to believe than a fact.
Multi-game averages from multiple sites show it ahead. Pointing out a few specific titles is 'creating a narrative you want to believe.'

Once PS5/XSX-focused cross-platform titles start pouring in, it won't look nice at all for Ampere. At least this time they have RT as a strong point, else it would have been Kepler vs GCN again.
Maybe, maybe not. AMD has had the consoles for a long time now, and you'd think you would already see amazing optimizations on current hardware. I bought AMD before on that premise and it didn't really materialize. We'll see. I'm evaluating it on how things stand now. Honestly, both are going to see improvements through driver optimizations. It's early days for the Ampere drivers as well.

It's not as outrageous as you might think. I'm using strobing (v-sync on) on my monitor to remove blur. So wherever possible I will try to use anywhere from 75 to 120Hz refresh rate (pretty much as high as I can go), because 60Hz is a bit flickery and dark (and laggy). And I need the fps to stay above refresh rate otherwise stutters/judders.
Sure, I have a ULMB monitor as well and know all about it. Still, they both tend to be more than fast enough at 1440p in the titles that matter for strobing displays to work well. I'm less concerned about single-player adventure-type games with it; adaptive sync is generally more important for those.

Anyhow, there are always reasons why you might pick one over the other. Neither card is a bad choice by any means. I was going to buy a 6800 XT if I could get my hands on one, so I could do an A/B comparison with my 3080 and keep the one I liked better. I expected more OC headroom out of the 6800 XT prior to launch; that would have pushed it over the edge. Who knows, maybe the AIB models will do better in a couple of weeks. I'll probably try to grab one then and try both, since I know I'd be able to sell either for at least what I paid.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
Everyone was clowning on the 6800, but it really does look damn good vs the 3070, like I said it would. Just goes to prove I'm smarter than everyone else, as expected.

Anyway, these look like decent AMD GPUs for once, so I'll probably pick one up, whichever I can find next week; it doesn't matter which (it might even be an Nvidia GPU)... I just need an upgrade.
 

Sonikku

Lifer
Jun 23, 2005
15,749
4,558
136
As has been said many times already.

Let's wait for the AIBs' inventory.

AMD may simply have had very, very small inventory of REFERENCE designs, and much, much larger inventory of AIB GPUs.

I don't really understand why we lose our minds about reference inventory. IF, and that is a big IF, AIB inventory is just as bad, then that is the real problem.
But the AIB cards won't be the value that the reference cards are. They'll inflate the price.
 

AtenRa

Lifer
Feb 2, 2009
14,000
3,357
136
Who the heck buys a 3080 to play at 1080p?

Gamers buying those monitors below

1080p @ 360Hz




 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,839
3,174
126
Sigh, $1,400 on an RX 6800 XT.... lmao...

And on top of that, some guy even had 4 he was selling for $1,600 each.
I guess AMD did not put any form of anti-bot protection in their checkout.

Didn't think they could botch the launch more than Nvidia, but it seems like they both took a leap of death off the Grand Canyon together with this launch and went straight downhill.
 

vissarix

Senior member
Jun 12, 2015
297
96
101
Heheheheh, look what happened once real reviews came out... Unlike the ones shown by AMD, these are different... Too bad when I said this a month ago everyone downvoted me.
(Attached chart: relative-performance_3840-2160.png)