
AMD 6000 reviews thread

Page 32

Mopetar

Diamond Member
Jan 31, 2011
5,236
1,850
136
Why are you moving goalposts? We were talking about the validity of your assumption that old titles always run better than new titles. Why are you now talking about modern titles?

It's about expectations. If we run with your claim that old titles always run better on new hardware through sheer brute-force, regardless of optimization, then upon finding out that a new RX 6800XT is even slower than a last gen RTX 2080 Super, we end up with a situation that runs contrary to your claim. For you that's fine, but for me and many others it is not.
I don't think you're reading what I actually wrote, or even bothering to understand the point I'm trying to argue, so you just distort it and then argue against something I've not said.

Here's my point illustrated perfectly: https://www.tomshardware.com/reviews/crysis-10-year-anniversary-benchmarks,5329.html

It's doubtful that AMD or Nvidia spent much, if any, serious effort optimizing drivers for Crysis as the years passed, yet the more modern cards run it just fine. It looks like Nvidia has better performance, but are the AMD cards getting unacceptable frame rates? No, because the cards all became powerful enough to brute-force their way past any lack of driver optimization.

People have backlogs of games; they wait until they have the requisite hardware to play games at the fidelity levels they want, by which time a new game becomes "old". It happens all the time in my circle of friends who play games. I'd take my experience over your improbable claims any day of the week.
So what's the problem? If you've got an AMD card, wait a few more years if that's what it takes. There are probably other games you can play that don't have the same problem, never mind plenty of new games that aren't even graphically intensive. If you're already willing to wait until you can run the game at the graphical fidelity levels you want, just wait until the card you have is powerful enough.

Another justification for what AMD can/will/won't do. You could just accept that it is a flaw instead of rationalizing it.
Okay fine, you win. It's a flaw. Congratulations, here's your internet points. Now why don't you go back to your bridge and enjoy them and stop crapping up this thread.
 

DaaQ

Senior member
Dec 8, 2018
295
166
86
Wait, he didn't address my post.
 

Leeea

Senior member
Apr 3, 2020
327
321
96
Wait, he didn't address my post.
He most likely feels he already answered your claim in this thread:

( note: my repost of his answer is purely informational, and should not be considered as agreement. )

. . .

My "hot" take:
Nvidia designed a video card for dx11. Nvidia's dx11 implementation is superior to AMD's. If dx11 is all you care about, you should consider Nvidia. Unless you're planning on modding the game to use more than 10 GB of VRAM.

Moreover, AMD will not be spending more resources on dx11, and nobody expects them to. Nobody developing games is doing it for dx11. New consoles do not support it. Game creation tools (Unreal) are all dx12. It is old tech that nobody developing widely played games cares about any more.


In short, the last few pages of this thread have been extremely relevant to the year 2016.
 
Last edited:

CP5670

Diamond Member
Jun 24, 2004
4,607
107
106
Crysis actually still struggles in some areas on modern systems, but that's because it's completely CPU-bottlenecked by a single thread. The VTOL level still drops to 30-40 fps on modern systems.
 

Gideon

Golden Member
Nov 27, 2007
1,163
2,106
136
I really wanted a 6800 XT, but seeing how impossible these are to get at a fair price I'm reaaaaaaly tempted
I almost immediately decided to go for the 6800, and after what felt like eons in transit I've finally received it!

I have to say the reference design is considerably higher quality than I expected (and the card is seriously heavy for the compact style); it's also very quiet.
I'm really impressed by what it can pull off @1440p (Doom Eternal runs maxed out at around 200 FPS; Metro Exodus is actually playable even with RT, and rock solid over 100 FPS without it; etc.).

...and it arrived just in time for Cyberpunk. Many thanks to @lightmanek!
 

Stuka87

Diamond Member
Dec 10, 2010
5,294
1,075
136
SAM is very impressive in some titles. Not just 1-2%, but a significant jump.

Something I noticed in the TPU review (again) is just how bad the frame times are on Ampere cards. nVidia at one point made a huge deal about frame times, and even gave their in-house tool to reviewers to push that. But now the tables have turned, and the 3090 especially is all over the place.

In this graph, you can see the dark green line where the majority of the 6900 XT's times come in at.
 

repoman0

Diamond Member
Jun 17, 2010
3,195
1,463
136
So 6900XT is worth 7% performance uplift vs the 6800XT and has otherwise identical specs (VRAM etc) for >50% price increase. I do believe they’ve managed to somehow take the “worst value” crown from the 3090, or at least tie it. That would be an impossible sell if the supply situation wasn’t so ridiculous. At least the 3090 has extra VRAM for its outrageous price, which a lot of gamers pretend they need and a lot of professionals actually need.
 

Stuka87

Diamond Member
Dec 10, 2010
5,294
1,075
136
So 6900XT is worth 7% performance uplift vs the 6800XT and has otherwise identical specs (VRAM etc) for >50% price increase. I do believe they’ve managed to somehow take the “worst value” crown from the 3090, or at least tie it. That would be an impossible sell if the supply situation wasn’t so ridiculous. At least the 3090 has extra VRAM for its outrageous price, which a lot of gamers pretend they need and a lot of professionals actually need.
I think the 3090 still wins the worst value title. There are times where it is only 5-10% faster than a 3080 for an additional $800.
 

repoman0

Diamond Member
Jun 17, 2010
3,195
1,463
136
SAM is very impressive in some titles. Not just 1-2%, but a significant jump.

Something I noticed in the TPU review (again) is just how bad the frame times are on Ampere cards. nVidia at one point made a huge deal about frame times, and even gave their in-house tool to reviewers to push that. But now the tables have turned, and especially the 3090 is all over the place.

In this graph, you can see the dark green line where the majority of the 6900 XT's times come in at.
I noticed that too, but it's just BF5. The rest of the Ampere graphs are about as consistent as RDNA2's; not sure I agree that anything inherent in the architecture or drivers makes frame times go "all over the place".
 

Stuka87

Diamond Member
Dec 10, 2010
5,294
1,075
136
I noticed that too, but it's just BF5. The rest of the Ampere graphs are about as consistent as RDNA2's; not sure I agree that anything inherent in the architecture or drivers makes frame times go "all over the place".
Yeah, I may have exaggerated a bit. Many games are very tight for both, but Metro has a similar issue.
 

Mopetar

Diamond Member
Jan 31, 2011
5,236
1,850
136
I do believe they’ve managed to somehow take the “worst value” crown from the 3090, or at least tie it. That would be an impossible sell if the supply situation wasn’t so ridiculous.
That isn't even true if you're trying to make the comparison between the full-die cards and the cut-down versions, which is a rather silly way to go about comparing the cards anyway.

At 4K the 6900XT costs 40% more per frame than the 6800XT. The 3090 costs 93% more per frame than the 3080. If you compare the 3090 to the 6900XT directly, the 3090 has a 40% greater cost per frame. The 3090 just has such a disproportionate cost that the only thing that can possibly have a worse cost per frame is the 2080 Ti.

If you're concerned with cost per frame at 4K, neither the 3090 nor the 6900XT is a good value when the 6800XT and 3080 offer some of the best value you can expect to get. The 3070 and 6800 are both slightly better, but not by much, and the frame rates aren't nearly as good.
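If anyone wants to rerun the cost-per-frame arithmetic themselves, here's a minimal sketch. The prices are the launch MSRPs; the 4K average frame rates are placeholder values for illustration only, so substitute numbers from whichever review you trust:

```python
# Cost-per-frame comparison. Prices are launch MSRPs (USD); the 4K
# average frame rates are illustrative placeholders, not review data.
def cost_per_frame(price_usd: float, avg_fps: float) -> float:
    """Dollars paid per average frame per second."""
    return price_usd / avg_fps

cards = {
    # name: (launch MSRP, assumed average 4K fps -- illustrative)
    "RX 6800 XT": (649, 100),
    "RX 6900 XT": (999, 107),
    "RTX 3080": (699, 103),
    "RTX 3090": (1499, 112),
}

baseline = cost_per_frame(*cards["RX 6800 XT"])  # 6800 XT as the reference
for name, (price, fps) in cards.items():
    cpf = cost_per_frame(price, fps)
    print(f"{name}: ${cpf:.2f}/fps ({cpf / baseline - 1:+.0%} vs 6800 XT)")
```

Plug in the fps numbers from your preferred review and the rankings fall out the same way: the 3090's price is so far out of line that no plausible frame rate rescues its dollars-per-frame.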
 

repoman0

Diamond Member
Jun 17, 2010
3,195
1,463
136
That isn't even true if you're trying to make the comparison between the full-die cards and the cut-down versions, which is a rather silly way to go about comparing the cards anyway.

At 4K the 6900XT costs 40% more per frame than the 6800XT. The 3090 costs 93% more per frame than the 3080. If you compare the 3090 to the 6900XT directly, the 3090 has a 40% greater cost per frame. The 3090 just has such a disproportionate cost that the only thing that can possibly have a worse cost per frame is the 2080 Ti.

If you're concerned with cost per frame at 4K, neither the 3090 nor the 6900XT is a good value when the 6800XT and 3080 offer some of the best value you can expect to get. The 3070 and 6800 are both slightly better, but not by much, and the frame rates aren't nearly as good.
Yeah, I’m well aware that the 3090 is actually the worst value per frame for gaming, and the comment was tongue in cheek, but there is also some truth to it. The 6900XT has no real advantage over the 6800XT aside from ~7% gaming and compute performance — no product differentiation aside from that. On the other hand there are AI/ML workloads at work that I can run on a 3090/Titan class GPU that I can’t on a 3080. Arguably that’s the intended use case (along with trying to keep the overall performance crown for marketing purposes) even though plenty of gamers buy them anyway, and for the intended use case that is a LOT of value.

The good news is patient gamers can save a bunch of money and just buy the 6800XT, and not miss out on anything.
 

tviceman

Diamond Member
Mar 25, 2008
6,733
510
126
www.facebook.com
I'm incredibly impressed with how much AMD has fought back and gained vs. Nvidia, but overall I'm disappointed with both camps this round.

Ray tracing on AMD performs exactly like Turing, a feature everyone rightfully called a tech demo back then. I imagine AMD will work with developers to optimize (aka compromise) ray tracing effects so as not to horribly tank performance, but RT is already unplayable with AMD in most games that offer it.

On the flip side, Nvidia skimped on VRAM. 8 GB cards will be fine at 1440p for probably 3 more years, but 4K is going to hit a wall fast. However, if DLSS continues to gain traction, that will lessen the VRAM conundrum.

Overall, both companies are releasing products today that will age faster at 4k than many past halo cards at their respective top-tier resolutions for their time.
 

mohit9206

Golden Member
Jul 2, 2013
1,341
471
136
Now that the review is out, it's obvious that 6900XT is not worth the money. 6800XT is $350 cheaper and that's the card to get. Or 3080 depending on the preference.
 

Mopetar

Diamond Member
Jan 31, 2011
5,236
1,850
136
Yeah, I’m well aware that the 3090 is actually the worst value per frame for gaming, and the comment was tongue in cheek, but there is also some truth to it. The 6900XT has no real advantage over the 6800XT aside from ~7% gaming and compute performance — no product differentiation aside from that. On the other hand there are AI/ML workloads at work that I can run on a 3090/Titan class GPU that I can’t on a 3080.
The 3090 isn't a Titan and doesn't have Titan drivers. The only use case for it over the 3080 is one where you really do need 24 GB of VRAM. At that point you probably want a professional card with the better drivers. Both the 6900XT and 3090 are for people with more money than sense.

Now that the review is out, it's obvious that 6900XT is not worth the money. 6800XT is $350 cheaper and that's the card to get. Or 3080 depending on the preference.
Well, if you managed to snag one at MSRP right away, it arguably was, considering that a 6800XT costs around $1000 at the moment anyhow. :cry:

I wonder what these will end up being scalped for.
 

repoman0

Diamond Member
Jun 17, 2010
3,195
1,463
136
The 3090 isn't a Titan and doesn't have Titan drivers. The only use case for it over the 3080 is one where you really do need 24 GB of VRAM. At that point you probably want a professional card with the better drivers.
There are many professional and research use cases that don't require or care about Quadro or Titan drivers and do need the VRAM. Universities and small labs would much rather spend $1,500 than $4-6k when they just need a fast chip with a lot of RAM for CUDA. My group at work will be buying plenty for that reason. My point was that AMD didn't do enough to differentiate the 6900XT in comparison.

Both companies will continue to sell every chip they can make, so it doesn’t really matter.
 

Sonikku

Lifer
Jun 23, 2005
15,558
4,015
136
Anybody have any tips on getting a 6800xt? I've heard some people using bots, not so much to scoop them up automatically, but to alert them when stock is available.
 

GodisanAtheist

Platinum Member
Nov 16, 2006
2,679
1,170
136
Again I'm left wondering what people were expecting from the 6900XT that it didn't deliver on in its "Founders Edition" form.

We've known for a while that it is a 6800XT with 8 extra CUs and virtually everything else the same (same RAM, same clocks, same TMUs, same ROPs). So that's 11% extra CUs, which translated into 5-7% additional performance depending on resolution. Not too shabby given how bottlenecked AMD's previous designs were, where virtually all of the performance difference between the top two cards was the result of clock speed differences.

The general feeling here was that the 6900XT would land somewhere between the 3080 and the 3090, which is exactly what it did.

Next up, we wait and see how the AIBs are able to play with the card and what they can wring out of it. My money is on an AIB 6800XT + 5%-7% additional performance, which should put it just over a stock 3090.
 

KompuKare

Senior member
Jul 28, 2009
641
158
116
So 6900XT is worth 7% performance uplift vs the 6800XT and has otherwise identical specs (VRAM etc) for >50% price increase. I do believe they’ve managed to somehow take the “worst value” crown from the 3090, or at least tie it. That would be an impossible sell if the supply situation wasn’t so ridiculous. At least the 3090 has extra VRAM for its outrageous price, which a lot of gamers pretend they need and a lot of professionals actually need.
Don't give them ideas! The only reason it looks like that is because all three (6800, 6800XT, and 6900XT) have 16GB of VRAM. The obvious tactic (which Nvidia and Intel use in abundance) would be to segment the cards by VRAM: a hypothetical 12GB 6800 and 14GB 6800XT would make a 16GB 6900XT look like much better value.
 

Mopetar

Diamond Member
Jan 31, 2011
5,236
1,850
136
Don't give them ideas! The only reason it looks like that is because all three (6800, 6800XT, and 6900XT) have 16GB of VRAM. The obvious tactic (which Nvidia and Intel use in abundance) would be to segment the cards by VRAM: a hypothetical 12GB 6800 and 14GB 6800XT would make a 16GB 6900XT look like much better value.
The flip side is that it makes the lower products look like a better value. Basically you get ~93% of a 6900XT for 65% of the cost!

Apparently supplies are so low that there won't be many people buying one anyway. One figure I heard suggested the total number of 6900XT cards for the UK was 200. That's 200 split across all retailers.

I think AMD is going to need to go from allocating a wafer per month to wafers per month for their GPUs if they want to even make a dent in the high-end space.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,917
411
126
I find it odd how quickly attitudes have changed. Before Big Navi hit, it was generally considered unthinkable that AMD would be able to hang with the 3090, or any high-end Nvidia card. The 6900 is an amazing piece of hardware, as is the 3090, but I can't get excited about either since I can't ****ing buy one.
 
