AMD 6000 reviews thread


BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126


Wow, even the number 3 card has 16GB VRAM and is faster than the 2080 Ti. And the $1000 6900XT matches the $1500 3090 in performance.

The 3000 parts don't look so hot now.

Post reviews edit:
It's astonishing what AMD have managed to achieve with both the Ryzen 5000 and the Radeon 6000, especially given the absolutely minuscule R&D budget and resources compared to nVidia/Intel. Lisa Su is definitely the "Steve Jobs" of AMD with such a remarkable turnaround.

6900XT:
(It's absolutely amazing to see AMD compete with the 3090)


 

gdansk

Platinum Member
Feb 8, 2011
2,078
2,559
136
So for average gaming power usage, one site reported a delta of ~100W between the 3080 and the 6800XT, in favor of the 6800XT using less wattage. That's an average across the benchmarks they ran. How are the other sites reporting the power usage difference?
ComputerBase reported figures in line with the advertised board power, i.e. a difference of about 20-25W in favor of the 6800XT. They report this as giving RDNA2 a slightly higher performance per watt.
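For a rough sense of scale (assumed round numbers, not ComputerBase's exact figures): if both cards deliver the same average fps and the 6800XT draws ~300W while the 3080 draws ~320W, then

perf/W (6800XT) ÷ perf/W (3080) = (fps/300) ÷ (fps/320) = 320/300 ≈ 1.07

i.e. roughly a 7% efficiency edge, which is about what "slightly higher performance per watt" implies. A ~100W delta on a ~320W card would instead mean a ~45% edge, which no other outlet is showing.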
 

blckgrffn

Diamond Member
May 1, 2003
9,123
3,057
136
www.teamjuchems.com
At launch, the Radeon 5700XT was behind the 2070. Now, after one year, it's ahead of the 2070 and almost on par with the 2070 Super, so the 6800XT will surpass the 3080 at 1440p within a year.

I was impressed @ Tom's by how well the 5700xt was hanging in there as they re-ran a bunch of benches. In BL3, the game that is nearest and dearest to my heart and basically the reason I bought the darn thing, the 5700xt hangs with a 2080S.


Spoiler alert, it mostly gets stomped by the 2070S in the other games in their test suite, but I play zero of those so I don't care.

Given no current RT enhancements (sounds like the goal is 60 fps lock @ 4k on Series X and maybe even PS5) I doubt there is anything inbound either.

Honestly, I turn down one thing (volumetric fog) to low and get really solid FPS. I have it set to 100 fps max to give the card a rest during scenes that allow for crazy FPS.

I don't know if I agree that the current cards will get "better", but I do think that it's important to rebench cards with current drivers and builds of games on the regular to get a real feel for things. Heck, the games get some *really* major patches these days.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
There is just so much this now applies to - almost all of the 'launches' are imaginary, where supply is such that it takes winning a lottery to get anything:
- Nvidia 3000 series
- Zen 3
- PS5 and XBOX X
- AMD 6000

Has there been anything in the last 2 months that actually launched?
Obviously the market is bigger now - 1 PC per household isn't enough when everyone is at home.

Also, hardware that isn't a fail was always hard to get at launch day.

That said, AMD probably could use all of the TSMC 7nm allocation and it wouldn't be enough for all the products they are pumping out.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,949
504
126
it would be quite corking to see a random bar for a GTX 960 pop into the current benchmarks, wouldn't it? Just, sitting right there for some reason. Never mentioned in the article at all. Just...there it is, in all the charts...
Thumbs up for using corking in a sentence.
Yep, that's a pretty tone deaf tweet.
I don't get it, how come?
 

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126

[Charts: HU-18-1080p.png and HU-18-1440p.png]

I've seen the reviews. Who the heck buys a 3080 to play at 1080p? They're about equal at 1440p. Close enough that the overall advantage depends on which games they tested and whether they included SAM in the 6800XT numbers. At 4k the 3080 has a clear win.

Then you have ray tracing, where the 6800XT gets creamed. DLSS, where there isn't even an answer. Nvidia has a better history with drivers. Then you can add in less gaming-oriented things like their broadcast tools, or CUDA.

Like I said, at the resolutions a normal person would buy this card to play on, it's a win for the 3080 unless 10GB turns out to be an issue.
 

Hitman928

Diamond Member
Apr 15, 2012
5,244
7,793
136
  1. I believe it's the man who bet $10 the AMD/RTG launch would go better than Ampere
  2. An order nearly half an hour after launch? Unusual

1) He bet $10 that Big Navi wouldn't be a paper launch. There is no formal definition of a paper launch, but he later followed up basically saying he just meant they would have some unspecified amount of supply. However, he was replying to someone who was complaining about Ampere supply, calling it a paper launch, and betting Big Navi would be one too. In that context, the original poster was clearly frustrated that Ampere sold out instantly. Frank 'taking the bet' would only make sense if Big Navi didn't sell out instantly, but it did. So while Frank is correct that it wasn't a literal paper launch (no supply at all), it was tone deaf to the situation people are upset about.

2) Him showing a single order making it through doesn't mean anything. Ampere had orders go through too. Again, it's tone deaf to the situation: hardly anybody who tried was actually able to purchase one, and cards were gone basically instantly after coming into stock across multiple retailers.
 
  • Like
Reactions: lightmanek

Glo.

Diamond Member
Apr 25, 2015
5,705
4,549
136
As has been said many times already.

Let's wait for AIB inventory.

AMD may simply have had very, very small inventory of REFERENCE designs, and much, much larger inventory of AIB GPUs.

I don't really understand why we lose our minds over reference inventory. IF, and that is a big IF, AIB inventory is as bad - that is the real problem.
 
  • Like
Reactions: Tlh97 and KompuKare

Lonyo

Lifer
Aug 10, 2002
21,939
6
81
I've seen the reviews. Who the heck buys a 3080 to play at 1080p? They're about equal at 1440p. Close enough that the overall advantage depends on which games they tested and whether they included SAM in the 6800XT numbers. At 4k the 3080 has a clear win.

Then you have ray tracing, where the 6800XT gets creamed. DLSS, where there isn't even an answer. Nvidia has a better history with drivers. Then you can add in less gaming-oriented things like their broadcast tools, or CUDA.

Like I said, at the resolutions a normal person would buy this card to play on, it's a win for the 3080 unless 10GB turns out to be an issue.


There are two game consoles which just came out, both using AMD's architecture. Sure, developers might do some extra work to support NV's implementation of features, but they will be doing the work on cross-platform games, which works for the AMD GPUs in both consoles and also on the PC hardware.

Remember hardware PhysX support? Having a feature is only meaningful if developers implement it, and the best way to get widespread adoption is for it to be supported by the key hardware. Which will be consoles. Otherwise it's NV working with developers to encourage them to use their own side of things.
 
  • Like
Reactions: Tlh97 and KompuKare

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,980
136
Problem is doubling isn't enough. We have to quadruple everything.

But that's just doubling the doubling if you stop to think about it.

So for average gaming power usage, one site reported a delta of ~100W between the 3080 and the 6800XT, in favor of the 6800XT using less wattage. That's an average across the benchmarks they ran. How are the other sites reporting the power usage difference?

You're probably talking about the TPU results. It was one of the first reviews posted here, and the results aren't in line with anything else that's been reported, so something must have gone wrong with their testing.

The tweet was half an hour after launch; the order would have been some time in between.

He probably had an advantage since he's closer to AMD's servers and his packets would get there much faster! What a cheater.
 
  • Like
Reactions: guachi and Tlh97

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
There are two game consoles which just came out, both using AMD's architecture. Sure, developers might do some extra work to support NV's implementation of features, but they will be doing the work on cross-platform games, which works for the AMD GPUs in both consoles, and also on the PC hardware.

Remember hardware PhysX support? Having a feature is only meaningful if developers implement it, and the best way to get widespread adoption is for it to be supported by the key hardware. Which will be consoles. Otherwise it's NV working with developers to encourage them to use their own side of things.

Sure, but ray tracing is a DX standard. I expect both to get supported either way. It's a matter of optimization. DLSS seems to be supported in most newer titles where it really matters. We can speculate all day about what optimization might look like in a year or two, but I'm just saying what reviews are showing right now. The general speculation I've heard is that it'll end up near the 2000 series in terms of RTX power, minus all the weird dips it is seeing now.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
I've seen the reviews. Who the heck buys a 3080 to play at 1080p? They're about equal at 1440p. Close enough that the overall advantage depends on which games they tested and whether they included SAM in the 6800XT numbers. At 4k the 3080 has a clear win.

Then you have ray tracing, where the 6800XT gets creamed. DLSS, where there isn't even an answer. Nvidia has a better history with drivers. Then you can add in less gaming-oriented things like their broadcast tools, or CUDA.

Like I said, at the resolutions a normal person would buy this card to play on, it's a win for the 3080 unless 10GB turns out to be an issue.

1- Well, considering everyone was pointing at the 2080 Ti at 1080p to justify Intel CPUs, maybe a lot? (half joking)

2- RT performance is about the same as, or better than, what people were getting just a month ago in the same 3-5 games. In a game like Dirt 5, which was optimized for AMD, RT performance was considerably better.

As for DLSS, AMD claims it is working on its own super resolution version. We'll see, but I doubt RT will be mainstream if it requires AI to come up with a solution to the performance loss. And what happens when you are one architecture behind? Will they keep spending resources to keep your "obsolete" hardware up to snuff?

Some of the streaming stuff is quite cool, and so is RTX Voice.

Anyway, as AMD starts to rake in more money, expect them to be more aggressive in sponsoring games - and we've seen that AMD-sponsored games favor AMD just as strongly as Nvidia-sponsored games favor Nvidia.
 
  • Like
Reactions: Tlh97

Gideon

Golden Member
Nov 27, 2007
1,625
3,650
136
A very long stream, but surprisingly informative considering how high up the people being interviewed are. Nice to see some rumors also being confirmed.


  • Mentions why they chose the Infinity Cache and how they got the clock-speed increases (architects from the CPU side were/are involved)
  • Talks about RT perf (developers had 2 years with only RTX cards, Herkelman believes things will improve with new titles)
  • Super Resolution - game devs, Microsoft and Sony essentially begged them not to make a proprietary API, but something that could be used everywhere on all hardware (also Intel and Nvidia). Devs really want a single code path for all platforms (and GPUs) with minimal per-game work. They also want "really good high quality imaging", "really good scaling" and "no performance hit".
  • Why SAM isn't just a PCIe 4.0 BAR switch (well, it is, but a lot of firmware and BIOS work needed to be done for it to get the performance it does without regressions in other places; Nvidia will face similar issues) - a rough way to check for the BAR capability on your own system is sketched after this list
  • Supply (they are shipping daily to partners for AIB cards) explains why they always release
And plenty of other stuff I missed.
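Side note on the SAM/BAR point above: if you're curious whether your own GPU even exposes the resizable BAR capability (a prerequisite for SAM), here's a rough sketch for Linux. It assumes lspci from pciutils is installed and that your lspci version labels the capability "Physical Resizable BAR"; run it as root or the extended capability dump may be hidden. This only checks that the PCIe capability exists, not whether the BIOS has actually resized the BAR to cover all of VRAM.

```python
# Rough sketch: look for the PCIe "Physical Resizable BAR" capability on GPUs.
# Assumes Linux + lspci (pciutils); run as root so extended capabilities show.
import subprocess

out = subprocess.run(["lspci", "-vvv"], capture_output=True, text=True).stdout

# lspci separates devices with blank lines
for block in out.strip().split("\n\n"):
    first_line = block.splitlines()[0]
    if "VGA compatible controller" in first_line or "3D controller" in first_line:
        has_rebar = "Physical Resizable BAR" in block
        print(first_line)
        print("  resizable BAR capability:", "present" if has_rebar else "not reported")
```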

IMO it's really interesting to get some tidbits straight from the horse's mouth rather than via endless speculators.
 

Kuiva maa

Member
May 1, 2014
181
232
116
I've seen the reviews. Who the heck buys a 3080 to play at 1080p? They're about equal at 1440p. Close enough that the overall advantage depends on which games they tested and whether they included SAM in the 6800XT numbers. At 4k the 3080 has a clear win.

Then you have ray tracing, where the 6800XT gets creamed. DLSS, where there isn't even an answer. Nvidia has a better history with drivers. Then you can add in less gaming-oriented things like their broadcast tools, or CUDA.

Like I said, at the resolutions a normal person would buy this card to play on, it's a win for the 3080 unless 10GB turns out to be an issue.

There is variance in the reviews depending on scene, CPU, RAM etc. There are titles where the 6800XT matches or even beats the 3090 at 4k (e.g. Dirt 5, AC Valhalla), so stating "3080 wins at 4k" is more of a narrative you want to believe than a fact. From where I'm looking, the 6800XT is clearly faster than the 3080 at 1080p, which is the e-sports resolution and super important for the Twitch crowd, slightly beats it at 1440p, and slightly loses at 4k.

DLSS and whatever AMD brings to the table are more of a way to get extra frames in competitive shooters to me; judging by Control and the awful sharpening artifacts I see on flat surfaces like posters and walls, it is not something I will be using in single-player titles unless they introduce a slider for the sharpening effect these features add.

As for drivers, I really wouldn't take the discussion there, especially after the very poor showing of the Ampere launch drivers. Once PS5/XSX-focused cross-platform titles start pouring in, it won't look nice at all for Ampere. At least this time they have RT as a strong point, else it would have been Kepler vs GCN all over again.