
Question - Speculation: RDNA2 + CDNA Architectures thread

Page 189 - AnandTech community

kurosaki

Senior member
Feb 7, 2019
I don't really understand your reasoning. It's clearly better than reducing the resolution manually. Turning down other quality settings causes a higher drop in image quality.
I'd rather play in 1080p native than "4K" DLSS, any day of the week.
Spit the blue pill out while there's still hope left!
 

kurosaki

Senior member
Feb 7, 2019
Just because a product is available doesn't mean it is mainstream. Mainstream represents the largest install base, and that is far and away 1080p, by a gigantic margin. For a lot of people, $300 on a monitor is too much when they can get a similar-sized one for $120. They don't know the difference between 1440p and 1080p.

And as long as games keep getting more demanding, GPUs will also have to keep getting more powerful. Technically we had 4K gaming cards five years ago, and they will run games from five years ago, but definitely not today's games. So it should not be a surprise that you need an upper-end card to play on a higher-resolution display. A high-end card may be twice as fast as a low-end card, but a high-end display (4K) has quadruple the pixels of a mainstream display (1080p).
I know. I think I'm just disappointed that games are always five years ahead of graphics cards, and it gets worse the more powerful a setup you have. With a small 1080p screen, I would probably manage five years with relatively cheap cards. If I go up just a notch to 1440p, the demands on the card grow, the price for the same settings gets higher, and on top of that, the card ages faster. 4K, with the pricing of cards today and the urge to buy next gen before the current one has even arrived in the mail? No thank you.
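The pixel-count math behind the "quadruple the pixels" point above is easy to check; a quick sketch, assuming the standard 16:9 resolutions:

```python
# Standard 16:9 resolutions and their pixel counts relative to 1080p.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080  # 2,073,600 pixels

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
# 4K is exactly 4x 1080p, while 1440p is roughly 1.78x,
# which is why the GPU cost of each step up is so steep.
```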
 

PhoBoChai

Member
Oct 10, 2017
AMD and Nvidia implementations differ quite a bit, so if someone was expecting RTX-optimized games (or tech demos) to run well on the 6800 series, they are seriously naive. I'm not saying that Dirt 5 is representative of what we can expect, but as in all things, the truth is in the middle. And btw, expect ray tracing to be the "next optimization war".
Why did ppl even expect AMD GPUs to run RTX + NV optimized games well in the first place?

Have they not learnt anything in the past decade?

NV has shown they go out of their way to gimp AMD GPUs whenever they sponsor a game, and even game engines.

I don't expect this to change, or for Jensen to suddenly become a nice guy.
 

Elfear

Diamond Member
May 30, 2004

Mopetar

Diamond Member
Jan 31, 2011
It's also interesting to look at all of the different results like that in one place. It really makes a few things jump out.

Tom's found the 6800 XT to be 89% of the 3080 in Strange Brigade, but TPU found it was 107%, which is a pretty big difference. I'm kind of curious what's going on that we see such a different result. Before someone says "'cause TPU!", there are plenty of their entries that are in line with other publishers' results.
 

Hans Gruber

Senior member
Dec 23, 2006
It's also interesting to look at all of the different results like that in one place. It really makes a few things jump out.

Tom's found the 6800 XT to be 89% of the 3080 in Strange Brigade, but TPU found it was 107%, which is a pretty big difference. I'm kind of curious what's going on that we see such a different result. Before someone says "'cause TPU!", there are plenty of their entries that are in line with other publishers' results.
What CPU were they using? What RAM were they using, and at what timings? These are the mysteries of the unknown. Probably a lot of different setups, which yield different results.
 

kurosaki

Senior member
Feb 7, 2019
It's also interesting to look at all of the different results like that in one place. It really makes a few things jump out.

Tom's found the 6800 XT to be 89% of the 3080 in Strange Brigade, but TPU found it was 107%, which is a pretty big difference. I'm kind of curious what's going on that we see such a different result. Before someone says "'cause TPU!", there are plenty of their entries that are in line with other publishers' results.
DX12 and Vulkan differ hugely in some titles. Might they be testing with different APIs?
 

Mopetar

Diamond Member
Jan 31, 2011
What CPU were they using? What RAM were they using, and at what timings? These are the mysteries of the unknown. Probably a lot of different setups, which yield different results.
Tom's was using both a 5900X and a 10900K, with 4x8GB of DDR4-4000 memory for the AMD CPU and DDR4-3600 for the Intel. TPU used a 9900K with 2x8GB of DDR4-4000 memory. Tom's doesn't list the timings for the memory, but at 4K the CPU shouldn't matter, as you can practically use a Celeron and get the same results as the top Intel gaming CPU.

It's not as though a difference is completely unexpected. SotTR was tested in 9 reviews. 4 of them had the 6800 XT at 91-93% of the 3080, 2 reviews had it at 97-98%, and the remaining 3 had the 6800 XT at 101-102% of the 3080. However, those aren't too far apart and could be chalked up to which part of the game is being tested.

The swing in Strange Brigade is much larger (and even a little larger than you would expect, since it's not as simple as just taking the difference of the two numbers), and once again it's difficult to imagine that any minor differences in CPU (they all test using similarly high-end Intel or AMD CPUs anyway) would create that result, since those differences are basically negated at 4K.
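The parenthetical about the swing being larger than just the difference of the two numbers comes from the results being ratios: going from 89% in one review to 107% in another is a relative swing of 107/89, about 20%, not the 18 percentage points a naive subtraction suggests. A quick sketch:

```python
# 6800 XT as a percentage of the 3080 in Strange Brigade, per the two reviews.
toms, tpu = 89.0, 107.0

gap_points = tpu - toms  # naive gap: 18 percentage points
swing = tpu / toms       # actual relative swing between the two reviews' results

print(f"gap: {gap_points:.0f} points, swing: {(swing - 1) * 100:.1f}%")
# gap: 18 points, swing: 20.2%
```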
 

vissarix

Senior member
Jun 12, 2015
Another overhyped train derailed? Some wise posters over here were telling us how Nvidia rushed the RTX 30 series because they had an inferior product :tearsofjoy:, and how the RX 6800 XT was going to be much faster than the RTX 3080, but as always, reality shows up ;)
Screenshot_1.png
 

psolord

Golden Member
Sep 16, 2009
^Dude, can you quit with the trolling already? How old are you, seriously?

Having spent the last 6 years on Nvidia cards, I am certainly not an avid AMD lover, but it seems you forgot other important graphs from that review.
Screenshot_4.png

They are selling you a similarly performing card (within 5%) at $50 less and with 60% more VRAM. WTF more do you want?

And who said that the 6800XT was going to be faster than the 3080?
 

vissarix

Senior member
Jun 12, 2015
^Dude, can you quit with the trolling already? How old are you, seriously?

Having spent the last 6 years on Nvidia cards, I am certainly not an avid AMD lover, but it seems you forgot other important graphs from that review.

They are selling you a similarly performing card (within 5%) at $50 less and with 60% more VRAM. WTF more do you want?

And who said that the 6800XT was going to be faster than the 3080?
For someone who is not an "AMD lover", you got seriously butthurt and came back with a weak response and personal attacks. So you would rather buy a card that is 3% more efficient or 5% faster? Guess what? No one on earth is going to consider that 3% of extra efficiency in this case.
 

Antey

Member
Jul 4, 2019
Another overhyped train derailed? Some wise posters over here were telling us how Nvidia rushed the RTX 30 series because they had an inferior product :tearsofjoy:, and how the RX 6800 XT was going to be much faster than the RTX 3080, but as always, reality shows up ;)
Uhm, vissarix, I think you are confused; wccftech would be that way

-->
 

Qwertilot

Golden Member
Nov 28, 2013
There were some fairly silly things dreamed of, I think :) In both directions! The cards are priced quite well against each other as you'd expect given the supply/demand situation.

The situation with notebooks could be rather interesting. It looks quite balanced overall (the 3080/3090 do give up some perf/watt to pack in huge compute), which is a huge advance for AMD but not going to turn the market over.
 

Glo.

Diamond Member
Apr 25, 2015
Another overhyped train derailed? Some wise posters over here were telling us how Nvidia rushed the RTX 30 series because they had an inferior product :tearsofjoy:, and how the RX 6800 XT was going to be much faster than the RTX 3080, but as always, reality shows up ;)
View attachment 34214
Why don't you also include power draw figures, and the 1080p and 1440p charts?

Simply because AMD has superior products, and you do not want that to disprove your agenda.
 

Leadbox

Senior member
Oct 25, 2010
Why don't you also include power draw figures, and the 1080p and 1440p charts?

Simply because AMD has superior products, and you do not want that to disprove your agenda.
Like all the other rabid Nvidia fanboys, he's hiding behind the 4K and RT results now. Those are the only metrics that matter to them.
 

CastleBravo

Member
Dec 6, 2019
Why don't you also include power draw figures, and the 1080p and 1440p charts?

Simply because AMD has superior products, and you do not want that to disprove your agenda.
I wouldn't get too carried away. IMO, AMD and NV have both "released" excellent GPUs this generation. I am still leaning towards a 3080, but if CP2077 ends up working decently well with the 6800 XT, and the 6800 XT is the only one available for purchase, I will go team red.
 

leoneazzurro

Senior member
Jul 26, 2016
Maybe you should also include Computerbase's tests on new titles, which were not included in that average but have a separate table. Should I?

FPS differences, RX 6000 vs. RTX 3000

Game (3840 × 2160)            6800 XT vs. 3080    6800 vs. 3070
Assassin's Creed: Valhalla    106 %               114 %
COD Black Ops: Cold War       99 %                113 %
Dirt 5                        120 %               124 %
Mafia: Definitive Edition     83 %                99 %
Serious Sam 4                 103 %               113 %
Star Wars: Squadrons          106 %               117 %
Watch Dogs: Legion            105 %               115 %
Wins, AMD/Nvidia              5/2                 6/1

3080 and 3070 normalized to 100 percent
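The wins tally in the last row of the table follows directly from the per-game percentages: anything over 100 % means the AMD card is ahead of the normalized Nvidia card. A quick sketch using the numbers above:

```python
# Computerbase 4K results from the table: (6800 XT vs. 3080, 6800 vs. 3070),
# with the Nvidia card normalized to 100 in each column.
results = {
    "Assassin's Creed: Valhalla": (106, 114),
    "COD Black Ops: Cold War": (99, 113),
    "Dirt 5": (120, 124),
    "Mafia: Definitive Edition": (83, 99),
    "Serious Sam 4": (103, 113),
    "Star Wars: Squadrons": (106, 117),
    "Watch Dogs: Legion": (105, 115),
}

# Count the games where each AMD card comes out ahead (> 100 %).
xt_wins = sum(1 for xt, _ in results.values() if xt > 100)
nonxt_wins = sum(1 for _, nx in results.values() if nx > 100)

print(f"6800 XT vs 3080: {xt_wins}/{len(results) - xt_wins}")    # 5/2
print(f"6800 vs 3070:    {nonxt_wins}/{len(results) - nonxt_wins}")  # 6/1
```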

 
