
AMD 6000 reviews thread


Gloomy

Golden Member
Oct 12, 2010
1,468
16
81
Those results really explain why Nvidia is rumored to be releasing a 3080 Ti and 3070 Ti. They make the 3070 look downright pathetic.

I'll probably do a full build this Christmas, going to do the big stupid and go full high end on everything, then settle for a 6700 XT if it lines up in performance.

The idea being to hold out until the next refresh, when ray tracing performance is more reliable at 1440p. I really just need to get off my R9 290 -- it's been very good to me since I bought it at launch, but it's time to move on I think.
 

blckgrffn

Diamond Member
May 1, 2003
7,690
917
126
www.teamjuchems.com
Those results really explain why Nvidia is rumored to be releasing a 3080 Ti and 3070 Ti. They make the 3070 look downright pathetic.

I'll probably do a full build this Christmas, going to do the big stupid and go full high end on everything, then settle for a 6700 XT if it lines up in performance.

The idea being to hold out until the next refresh, when ray tracing performance is more reliable at 1440p. I really just need to get off my R9 290 -- it's been very good to me since I bought it at launch, but it's time to move on I think.
I hear you. I felt the need to pull the trigger a year ago and moved to a 5700 XT from my 290X, but the longer you can wait, the better the jump will be! Given the 6700 XT should have more grunt than the PS5, it seems some RT should be available even with that card to tide you over for another year or 18 months :)
 

alexruiz

Platinum Member
Sep 21, 2001
2,751
393
126
I agree with the 6800 XT being the better buy.

However, if my eyes were set on the 6800, an $80 or $50 price difference would not stop me. That is one night of not going out for drinks. And due to COVID and no one going out, that money is already saved and waiting to be used :p
You are totally right.
It should be fairly clear that AMD does NOT want to sell RX 6800s.
They cut it more than expected (60 CUs actual vs. 64 CUs speculated), it is clocked much lower, AND it is priced too close to the 6800 XT.
They just didn't have a card to cover the ~$500 market, so they got out the saw and cut the Navi 21 die even more.

I personally speculate that the 6800 XT performed better than expected, so they were able to make it a 3080 competitor.
That obviously left them without a 3070 competitor, so they had to rush one.

Unless they have a 6800 v2 down the road based on a 60 CU die, they don't want to sell RX 6800s.
 
  • Like
Reactions: Tlh97 and Makaveli

Head1985

Golden Member
Jul 8, 2014
1,853
666
136
I am sure they don't want to sell the RX 6800 for the same price as the 3070 because they want to increase prices for the rest of the lineup... so they overpriced the RX 6800, and now they can sell the 6700 XT for up to $450-499.
They wanted to sell a tiny 250mm2 die as the 5700 XT for $450, but that didn't work out well.
Anyway, at $580 they didn't kill the 3070 the same way they killed the 3080 with the 6800 XT. HUGE mistake.

I am sure the 3070 will sell far better because of its much lower price tag and feature set, like DLSS 2.0 and probably better ray tracing performance. If AMD had priced the RX 6800 at $499, they would have killed the 3070 for sure.
 
Last edited:

Hitman928

Diamond Member
Apr 15, 2012
3,478
3,723
136
I am sure they don't want to sell the RX 6800 for the same price as the 3070 because they want to increase prices for the rest of the lineup... so they overpriced the RX 6800, and now they can sell the 6700 XT for up to $450-499.
They wanted to sell a tiny 250mm2 die as the 5700 XT for $450, but that didn't work out well.
Anyway, at $580 they didn't kill the 3070 the same way they killed the 3080 with the 6800 XT. HUGE mistake.

I am sure the 3070 will sell far better because of its much lower price tag and feature set, like DLSS 2.0 and probably better ray tracing performance. If AMD had priced the RX 6800 at $499, they would have killed the 3070 for sure.
They'd probably kill their own profitability as well, having to cut down so many good dies to meet the volume needed at the $500 price point. AMD will most likely sell every GPU they make for the next few months; there's no reason for them to reduce margins further on what is already their lowest margin 7nm product line when they can't make enough to meet demand already.

AMD will come out with a mid-range card that will probably almost reach a 3070, but at a decently higher perf/$. It just won't be right away. Probably sometime early next year, as console demand decreases and they have more supply for additional GPUs.
 

alexruiz

Platinum Member
Sep 21, 2001
2,751
393
126
I am sure they don't want to sell the RX 6800 for the same price as the 3070 because they want to increase prices for the rest of the lineup... so they overpriced the RX 6800, and now they can sell the 6700 XT for up to $450-499.
They wanted to sell a tiny 250mm2 die as the 5700 XT for $450, but that didn't work out well.
Anyway, at $580 they didn't kill the 3070 the same way they killed the 3080 with the 6800 XT. HUGE mistake.

I am sure the 3070 will sell far better because of its much lower price tag and feature set, like DLSS 2.0 and probably better ray tracing performance. If AMD had priced the RX 6800 at $499, they would have killed the 3070 for sure.
AMD is happy competing at the top this round, so not killing the RTX 3070 is not a mistake.
The supply of the RTX 3070 is so low that if the supply of RDNA2 GPUs is better, a lot of people will buy them simply because there are no green GPUs to get.
Hence, AMD would like them to get Navi 21 XT instead of Navi 21 XL.
 
  • Like
Reactions: Tlh97 and blckgrffn

Z15CAM

Platinum Member
Nov 20, 2010
2,127
47
91
www.flickr.com
AMD has apparently killed the nVidia RTX 3070 launch for $50 more with their RX 6800, while at the same time killing the RTX 2080 Ti. It's all about profit, and I'm sure Wall Street will endorse AMD over Intel and nVidia this time around.

It's about time for competition, and hopefully GPU prices will drop in line for the average consumer - that's providing COVID doesn't kill me first in the meantime ;o)

Still, after all this, GPU prices for the consumer are ridiculously unreal.
 
Last edited:

coercitiv

Diamond Member
Jan 24, 2014
4,289
5,459
136
Still, after all this, GPU prices for the consumer are ridiculously unreal.
Maybe it's time for people to acknowledge we're going through a period similar to the mining boom in terms of the supply & demand ratio.

When miners were involved, the market was opportunistic and chip makers could not predict the shifts in demand, so they set their MSRPs according to their (seemingly generous) supply and previous consumer data. The result was that cards were nowhere to be found near MSRP. Nowadays demand is mostly gaming based and can be gauged more easily, but both chip makers are obviously unable to meet even their own demand estimates. That's why we're seeing high prices out of the gate: they know everything will sell fast. The only reason we're not seeing even higher prices is competition and the tension of a probable price shift once supply improves.

My take on all of this is that H1 2021 will see far better price/perf in the GPU arena, and anyone smart enough to play games now and buy cards next year will have the most fun at the best price.
 

Qwertilot

Golden Member
Nov 28, 2013
1,586
243
106
Maybe it's time for people to acknowledge we're going through a period similar to the mining boom in terms of the supply & demand ratio.

When miners were involved, the market was opportunistic and chip makers could not predict the shifts in demand, so they set their MSRPs according to their (seemingly generous) supply and previous consumer data. The result was that cards were nowhere to be found near MSRP. Nowadays demand is mostly gaming based and can be gauged more easily, but both chip makers are obviously unable to meet even their own demand estimates. That's why we're seeing high prices out of the gate: they know everything will sell fast. The only reason we're not seeing even higher prices is competition and the tension of a probable price shift once supply improves.

My take on all of this is that H1 2021 will see far better price/perf in the GPU arena, and anyone smart enough to play games now and buy cards next year will have the most fun at the best price.
Very sensible indeed :) What's a bit scary is that this is happening with NV having a fairly mature Samsung process nearly to themselves and AMD a decent chunk of 7nm.

I guess one issue for them is that it isn't clear whether overall demand will actually be up much over the predicted two-year lifespan of these cards. There's good reason to think it can't be up that much.

If all that's happening is a massive front-loading of demand into preorders, what could they possibly do?
 
  • Like
Reactions: Tlh97 and coercitiv

Mopetar

Diamond Member
Jan 31, 2011
5,741
2,492
136
I think demand will be a bit higher just because a lot of it is pent up between the mining boom making even the lower end consumer cards unbelievably expensive and Turing being such a massive increase in price that a lot of people held off. AMD offered no real competition to NVidia for a large part of that time so there weren't a lot of options.

The high end demand seems crazy right now because the high end prices have come back to previous levels and if you compare those to Turing they seem like an insanely good deal.

Once the $200-$350 cards start to drop we'll probably see an even bigger surge in demand as people start to replace Polaris and Pascal cards they've been hanging onto since before the mining boom or even something predating that generation.

Maybe the new consoles will blunt that slightly, since both will offer performance equivalent to the upper end of that category. Hell, the new Xbox GPU is basically a slightly more cut-down 6800 without the Infinity Cache and no boost clock. Sure, that card has a bad price, but it does suggest that we aren't going to see $300 cards that blow out the consoles coming next spring.
 

Tup3x

Senior member
Dec 31, 2016
504
375
136
The RX 6800 XT would be a very compelling upgrade and pair well with my Ryzen 9 5900X build (can't wait to grab it - I have the other stuff ready), but unfortunately not being able to force anisotropic filtering, and not being able to cap frame rate with v-sync when using adaptive sync, are deal breakers for me. There are way too many games with broken in-game anisotropic filtering; forcing it is just something I can't live without.

Uhh... I hate this situation. NVIDIA has the features, but the VRAM size and power consumption (though it remains to be seen where custom RX 6800 XT cards actually land) are a big turn-off. If AMD would actually offer feature parity in software they would make things very easy for me, but they don't. They just keep removing features... Officially they support forcing AF for DX9 games only, but apparently even that seems to be broken, or at least varies greatly from game to game. Their video settings offer only a brightness setting (I mean seriously... come on). Also, LFC doesn't work perfectly, at least on my Renoir laptop, and in certain cases brief tearing happens.

I don't have any plans to upgrade to an overpriced 10GB card when there are comparable cards for less with almost double the VRAM.

Just when it looks like they finally have a winner, it's the little things that make the obvious choice not so obvious.
 
  • Like
Reactions: Tlh97 and CP5670

CP5670

Diamond Member
Jun 24, 2004
4,788
203
106
The RX 6800 XT would be a very compelling upgrade and pair well with my Ryzen 9 5900X build (can't wait to grab it - I have the other stuff ready), but unfortunately not being able to force anisotropic filtering, and not being able to cap frame rate with v-sync when using adaptive sync, are deal breakers for me. There are way too many games with broken in-game anisotropic filtering; forcing it is just something I can't live without.

Uhh... I hate this situation. NVIDIA has the features, but the VRAM size and power consumption (though it remains to be seen where custom RX 6800 XT cards actually land) are a big turn-off. If AMD would actually offer feature parity in software they would make things very easy for me, but they don't. They just keep removing features... Officially they support forcing AF for DX9 games only, but apparently even that seems to be broken, or at least varies greatly from game to game. Their video settings offer only a brightness setting (I mean seriously... come on). Also, LFC doesn't work perfectly, at least on my Renoir laptop, and in certain cases brief tearing happens.

I don't have any plans to upgrade to an overpriced 10GB card when there are comparable cards for less with almost double the VRAM.

Just when it looks like they finally have a winner, it's the little things that make the obvious choice not so obvious.
I hadn't heard about this. Issues like this are more important to me than slight differences in performance. I've used forced AF across all games for many years and definitely want to keep that. Is there any equivalent of Nvidia Inspector for AMD? I used RadeonPro back in the day but it doesn't expose quite as many options. Nvidia's ecosystem of third-party tools is more developed at this point, and I have already spent the effort setting up profiles and getting my entire existing library of games (going back 20-30 years) working with it. I migrate those profiles whenever I get a new driver or card.

Nvidia does have their own problems, like framerate caps broken in DX8 games and broken MSAA in some old games. There are workarounds but they take time and effort to discover. I haven't used an AMD card in almost 10 years now but liked the ones I had in the past and am open to going with AMD, but this AF issue would be a dealbreaker.
 
  • Like
Reactions: Tup3x

GodisanAtheist

Platinum Member
Nov 16, 2006
2,958
1,454
136
I think demand will be a bit higher just because a lot of it is pent up between the mining boom making even the lower end consumer cards unbelievably expensive and Turing being such a massive increase in price that a lot of people held off. AMD offered no real competition to NVidia for a large part of that time so there weren't a lot of options.

The high end demand seems crazy right now because the high end prices have come back to previous levels and if you compare those to Turing they seem like an insanely good deal.

Once the $200-$350 cards start to drop we'll probably see an even bigger surge in demand as people start to replace Polaris and Pascal cards they've been hanging onto since before the mining boom or even something predating that generation.

Maybe the new consoles will blunt that slightly, since both will offer performance equivalent to the upper end of that category. Hell, the new Xbox GPU is basically a slightly more cut-down 6800 without the Infinity Cache and no boost clock. Sure, that card has a bad price, but it does suggest that we aren't going to see $300 cards that blow out the consoles coming next spring.
-My hope and prayer is that we see something like the first mining boom: a totally unpredictable scenario that hijacked the demand algorithms and resulted in a glut of unsold inventory that had to be heavily discounted to move once the boom went bust.

In a way, The Rona is similar in that it's something no demand algo is going to be able to properly account for, and it doesn't have a clearly defined end date.

As a result, AMD and NV crank up production to the hilt so as not to lose or discourage potential buyers, but then the demand evaporates as either the vaccine starts making its rounds and things open up or (God forbid) we go into a full-blown depression.

Card prices crash downward and anyone patient enough to wait walks away with a killer deal (I picked up my HD 7950 post-boom for $200 when it was going for double or triple that mid-boom).

One can dream anyway.
 

Mopetar

Diamond Member
Jan 31, 2011
5,741
2,492
136
Well AMD probably can't ramp production that much since TSMC doesn't have additional capacity and they'd have to stop making as many Zen chiplets or console SoCs. It's unlikely they would be able to overproduce right now.

Even if they did, AMD could just swap to more Zen production while their GPU inventory levels out instead of having to cut prices if they think it's just a temporary sales glut.

NVidia will only cut prices if they face competition that's selling at lower prices themselves and AMD doesn't seem interested in undercutting them or at least not by any significant amount.
 

Ajay

Diamond Member
Jan 8, 2001
8,409
3,250
136
NVidia will only cut prices if they face competition that's selling at lower prices themselves and AMD doesn't seem interested in undercutting them or at least not by any significant amount.
No sense for AMD to start a price war with NV. They need more money to expand their businesses, not less. Plus, they need to keep their stock price as high as they can, like any publicly traded company. Heck, it was a high stock price that allowed them to buy Xilinx. Lisa Su is a smart cookie; she has no intention of letting off the gas - she wants to make AMD a blue-chip semiconductor business.
 

Mopetar

Diamond Member
Jan 31, 2011
5,741
2,492
136
Honestly, the prices this generation are basically back to where they were during the Kepler/Maxwell eras where the flagship card comes in around $700.

The only difference is that AMD has cards that can go blow for blow with NVidia. They aren't just relegated to being the bargain value brand that's limited to mid-range products or tickling the bottom of the high end. They'll price like NVidia now as well.
 

Geranium

Member
Apr 22, 2020
78
98
61
I am sure they don't want to sell the RX 6800 for the same price as the 3070 because they want to increase prices for the rest of the lineup... so they overpriced the RX 6800, and now they can sell the 6700 XT for up to $450-499.
They wanted to sell a tiny 250mm2 die as the 5700 XT for $450, but that didn't work out well.
Anyway, at $580 they didn't kill the 3070 the same way they killed the 3080 with the 6800 XT. HUGE mistake.

I am sure the 3070 will sell far better because of its much lower price tag and feature set, like DLSS 2.0 and probably better ray tracing performance. If AMD had priced the RX 6800 at $499, they would have killed the 3070 for sure.
Did the $200 RX 480 kill the GTX 1060, which was priced at $250 (partner) / $300 (Founders Edition)? The answer is no. The RX 6800 will sell less either way, whether AMD prices it at $500 or $580 with 16GB of VRAM.
 

Geranium

Member
Apr 22, 2020
78
98
61
The RX 6800 XT would be a very compelling upgrade and pair well with my Ryzen 9 5900X build (can't wait to grab it - I have the other stuff ready), but unfortunately not being able to force anisotropic filtering, and not being able to cap frame rate with v-sync when using adaptive sync, are deal breakers for me. There are way too many games with broken in-game anisotropic filtering; forcing it is just something I can't live without.

Uhh... I hate this situation. NVIDIA has the features, but the VRAM size and power consumption (though it remains to be seen where custom RX 6800 XT cards actually land) are a big turn-off. If AMD would actually offer feature parity in software they would make things very easy for me, but they don't. They just keep removing features... Officially they support forcing AF for DX9 games only, but apparently even that seems to be broken, or at least varies greatly from game to game. Their video settings offer only a brightness setting (I mean seriously... come on). Also, LFC doesn't work perfectly, at least on my Renoir laptop, and in certain cases brief tearing happens.

I don't have any plans to upgrade to an overpriced 10GB card when there are comparable cards for less with almost double the VRAM.

Just when it looks like they finally have a winner, it's the little things that make the obvious choice not so obvious.
LFC will only work well if the display's highest refresh rate is at least 2x its lowest. For example, if your highest refresh rate is 120Hz and your lowest is 60Hz, then LFC can work. Otherwise LFC will not engage and you will see VRR-related problems. Most laptop screens, besides gaming ones, usually have a 40-60Hz refresh range, which doesn't support LFC.

AMD's driver panel does give you options for controlling AF, AA, MSAA, texture filtering quality, and tessellation mode.
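The 2x rule above can be sketched as a quick check. This is a minimal illustration with hypothetical helper names, not how AMD's driver actually implements it; the real LFC heuristics are more involved:

```python
def supports_lfc(min_hz: float, max_hz: float) -> bool:
    """LFC needs the panel's max refresh rate to be at least twice the min,
    so frames below min_hz can be repeated into the VRR window."""
    return max_hz >= 2 * min_hz


def lfc_multiplier(fps: float, min_hz: float, max_hz: float) -> int:
    """Smallest integer frame-repeat count that lifts a low frame rate
    back inside the [min_hz, max_hz] window (1 means no repeat needed)."""
    n = 1
    while fps * n < min_hz and fps * (n + 1) <= max_hz:
        n += 1
    return n


# A 48-144Hz gaming panel qualifies; a typical 40-60Hz laptop panel does not.
print(supports_lfc(48, 144))        # True  (144 >= 2 * 48)
print(supports_lfc(40, 60))         # False (60 < 2 * 40)
print(lfc_multiplier(30, 48, 144))  # 2: 30fps content is presented at 60Hz
```

On the 40-60Hz laptop panel, a 30fps frame can't be doubled (60Hz is the ceiling but 2x the floor is 80Hz), which matches the tearing described above.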
 

Geranium

Member
Apr 22, 2020
78
98
61
So RDNA2 can use the ray tracing in Shadow of the Tomb Raider? I had thought that was only for Nvidia RTX cards. So can RDNA2 run ray tracing in all of the other games too, such as Control and Metro Exodus?
RDNA2 can do ray tracing if the game uses/supports Microsoft's DXR 1.1 API. If Control and Metro Exodus use DXR and not the Nvidia proprietary path, then RDNA2 can run them.
 
