AMD 6000 reviews thread


BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126


Wow, even the number 3 card has 16GB VRAM and is faster than the 2080TI. And the $1000 6900XT matches the $1500 3090 in performance.

The 3000 parts don't look so hot now.

Post reviews edit:
It's astonishing what AMD have managed to achieve with both the Ryzen 5000 and the Radeon 6000, especially given the absolutely minuscule R&D budget and resources compared to nVidia/Intel. Lisa Su is definitely the "Steve Jobs" of AMD with such a remarkable turnaround.

6900XT:
(It's absolutely amazing to see AMD compete with the 3090)


 
Last edited:

Saylick

Diamond Member
Sep 10, 2012
3,125
6,296
136
Aside from the lack of motion in the scene, the noise caused by the low number of rays is terribly distracting, even with their attempt to blur everything outside of 1-2 models to mask it. I think this new synthetic test shows why the hybrid approach is necessary right now and, honestly, for quite the foreseeable future.
Agreed. If the benchmark is just rendering static images then it's not an apples-to-apples comparison or benchmark for games, where things have to be rendered in real-time. We'll just have to wait for legit reviews to see the true "usable" RT performance in real games to know how RDNA2 stacks up to Ampere. It does make me wonder if AMD/Nvidia could make an RT-only GPU and sell it to the likes of Pixar for their render farms, however.
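To put a number on the noise point: Monte Carlo estimates converge as 1/sqrt(N), so a pixel traced with one ray is roughly 8x noisier than one traced with 64 rays. A toy sketch of that (plain Python with made-up numbers, not anything a GPU actually runs):

```python
import random
import statistics

def pixel_noise(true_brightness, rays, trials=2000):
    """Monte Carlo estimate of a pixel: each ray is a coin flip that
    contributes full brightness with probability true_brightness.
    Returns the standard deviation of the estimate across many trials."""
    estimates = []
    for _ in range(trials):
        hits = sum(1 for _ in range(rays) if random.random() < true_brightness)
        estimates.append(hits / rays)
    return statistics.pstdev(estimates)

random.seed(42)
noise_1 = pixel_noise(0.5, rays=1)
noise_64 = pixel_noise(0.5, rays=64)
# Noise shrinks as 1/sqrt(rays): 64x the rays gives ~8x less noise.
print(noise_1, noise_64, noise_1 / noise_64)
```

That 1/sqrt(N) wall is why denoisers and hybrid rasterization have to carry so much of the load at real-time ray budgets.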
 

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,980
136
The 2080 Ti was the most expensive flagship until the 3090. Both ridiculously bad values.

This is a side point, but I think NVidia has said the 3080 is their "flagship" GPU. The term itself is kind of nebulous and usually excludes the Titan cards. The 3090 obviously isn't a Titan, but it's priced a bit like one even though it doesn't have the drivers or the usual performance uplift to go along with it. Whether it's the real flagship or not is just fodder for endless argument. It is a bad value though.


Cards like the 5850 comfortably beat last gen flagships for $250. The 7870 came close to the 6970 and beat the 5870.

Cards like the 5850 had the advantage of replacing dies that were only a few hundred square millimeters. RV770 was ~250 mm^2 and Cypress was ~350 mm^2, which makes that kind of leap much easier to pull off.

The biggest Turing dies were pretty much at the reticle limit. The big dies are so big now that you can't expect performance gains from going even bigger. Similarly, we've hit the limits of the PCIe power specifications, because the TDP budget has been used up as well.

When you aren't up against those walls, it's easy to release a card that has 80% more shaders at a higher clock speed thanks to a new node to play with. Neither AMD nor NVidia has that kind of headroom anymore to demolish their own previous generation chips like that.
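The die-size point can be made concrete with a classic Poisson yield model; the defect density here is an illustrative assumption, not a foundry figure:

```python
import math

def die_yield(area_mm2, defects_per_mm2=0.001):
    """Poisson yield model: probability that a die of the given area
    has zero defects. defects_per_mm2 is an illustrative assumption."""
    return math.exp(-defects_per_mm2 * area_mm2)

# An RV770-class die vs a near-reticle-limit die:
small, big = 250.0, 750.0  # mm^2
print(die_yield(small))  # ~0.78
print(die_yield(big))    # ~0.47
```

Under that assumed defect density, tripling the area cuts yield from ~78% to ~47%, so the cost per good die grows much faster than the area does.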
 

DiogoDX

Senior member
Oct 11, 2012
746
277
136
UL released a new raytracing test for 3DMark. This one exclusively uses raytracing (unlike Port Royal, which was hybrid rendering, like all games except Quake 2 RTX):


My guess is that AMD's hybrid performance was too good, so Nvidia lobbied for this bench to show its GPUs in a better light.

Looking at the video it's clear that this is 100% a synthetic test (like the PCIe 4.0 bandwidth one):

And not a demo-like test like Port Royal (which is more similar to actual games using RT, say the latest Watch Dogs):
3DMark is a pure joke at this point. Since they limited async compute in their DX12 test to avoid putting Pascal in a bad light, it should have been renamed NvidiaMark.
 

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
Last time I checked the 6800 is starting at $579. The 3070 is starting at $499.

The 2080 Ti was the most expensive flagship until the 3090. Both ridiculously bad values.

Cards like the 5850 comfortably beat last gen flagships for $250. The 7870 came close to the 6970 and beat the 5870.

GTX 660 came close to GTX 580 performance, and that was in a generation where NVIDIA decided to call the 104 die a flagship instead of a midrange part.

Even as recently as 2016 cards like the 1060 and 480 competed well with the 980 and 980ti for $250.

Mid range performance has been going down compared to previous flagships while the price has been going up.

All these 3070/3080 and 6800XT/6800 are $100-200 overpriced.
You mention that the 5850 comfortably beat last gen flagships for $250. It was actually $260, and it was 17% faster than the 4890 ($249) but 16% slower than the 4870x2 ($429). So it was slower than the previous generation's flagship.

The 7870 was $350 MSRP, 6970 was $369 MSRP, and the 5870 was $379 MSRP. The 7870 beat the 6970 by 10% and the 5870 by 20%. But the previous generation's flagship wasn't a 6970 or 5870, it was the 6990 or 5970, which beat the 7870 by 30-35% and 10% respectively.

1060 6GB AIBs were at the cheapest $250 but ranged up to $330 and were 15-20% slower than the 980 Ti, which MSRP'd at $649. But remember, it wasn't the top end card, the Titan X at $999 was the top card, and the 1060 6GBs were 25-30% slower than the Titan X. The 1060 6GBs were about on par with the 980, MSRP $549. In other words, a mid-range card from the 10xx generation was on par with a high-mid-range (or low-high-end) card from the 9xx generation.

If we look at Vega -> Navi and compare the 5600XT ($280) mid-range, it fell 25-30% shy of the previous generation's flagship $699 Radeon VII, but it's about on par with the Vega 56/64.

I am having a hard time seeing why there is so much gnashing of teeth over prices. Do people forget that the 6990 and GTX 590 were $699 in 2011 ($820 in today's $$) and the original Titan X was $999 ($1100 in today's $$)? Even the 5970 MSRP'd at $580, the GTX 580 at $500 (>$600 in today's $$).

The 3070/3080 and 6800/6800XT are only overpriced if they are mid-range cards, which I'd argue they aren't. Let's wait for the 6600XT before we start comparing current generation mid-range prices and performance to previous generations.

FWIW, I agree the 6800 is over-priced, but I feel $499-529 is a reasonable price for it.

6800XT at $649 is about right. And historically, so is the halo price of the 6900XT at $999.

This really has been par for the course since 2012 or so. If you look back all the way to at least the 600 series GeForce cards, it's been the case that you have an expensive halo card, and two top end cards priced really high just below it:

600 series - $400 for 670, $500 for 680, and $1000 for 690.
700 series - $649 for 780, $699 for 780 Ti, and $999 for Titan.
900 series - $549 for 980, $649 for 980 Ti, $999 for Titan X.
1000 series - $599 for 1080, $699 for 1080 Ti, $1200 for TITAN X

Same with AMD:
HD7000 series - $450 for 7950, $550 for 7970, $1000 for 7990
RX 200 series - $399 for 290, $549 for 290X, $1499 for 295X2
RX 300 series - $549 for R9 Fury, $649 for R9 Fury X, $1499 for the Pro Duo
RX 400 series - (aberration, the top card, the 480, was $239 but was slower than a 290X from three years prior)
RX 500 series - (same, RX 590 at $279 was slower than an R9 Fury X from three years prior)
Vega series - $399 for Vega 56, $499 for Vega 64, $699 for VII
RX 5000 series - $279 for 5600XT, $349 for 5700, $399 for 5700XT (again somewhat of an aberration without a halo card or the top tier performance to justify higher prices against the RTX 2000 series)

So what does this say?

For the last ~8 years, the script is the same - an expensive halo card and 1-2 high end cards right below it. This is the case with RDNA2 with the 6800 (~GeForce xx80), 6800XT (~GeForce xx80 Ti), and 6900XT (~GeForce Titan). We also see continuous progress in the mid-range (which I'll leave you to review in detail on your own), but in the past there was not the kind of leaps-and-bounds improvement you claim (mid-range beating the flagship from the previous generation). At best, the mid-range is about on par with the 2nd or 3rd best card from the previous generation.
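For anyone who wants to redo the inflation math above, it's just a CPI multiplier; the factors below are rough illustrative values, not official BLS numbers:

```python
# Rough CPI multipliers to 2020 dollars (illustrative assumptions,
# not official BLS figures).
CPI_TO_2020 = {2011: 1.17, 2015: 1.10}

def in_2020_dollars(msrp, year):
    """Convert a launch MSRP to approximate 2020 dollars."""
    return msrp * CPI_TO_2020[year]

# The $699 GTX 590 / HD 6990 from 2011 land near $820 today,
# and the $999 original Titan X from 2015 near $1100:
print(round(in_2020_dollars(699, 2011)))
print(round(in_2020_dollars(999, 2015)))
```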
 
Last edited:

GodisanAtheist

Diamond Member
Nov 16, 2006
6,783
7,115
136
With the price creep that has been going on in GPUs for however long now, my feeling is that the sub-$400 market is best served by the used market, as people playing in the $500-$1500 bracket unload their old cards for the new hotness. The market should soon be flush with 5700XT, 2060, 2070 and 2080 series cards at reasonable prices. I know as soon as I can get my hands on either a 6800XT or 6900XT, my 2080ti is on the way out the door.

- I made a spectacularly bad prediction much like this one in one of the speculation threads prior to the 3xxx card release. The X factor we're all missing is the supply.

Back when Pascal released it definitively beat the prior generation (1080/1070 vs 980Ti/980/970) with better characteristics all around. It was like Kepler 2.0. I was able to pick up my 980ti for $330 shipped used, and it has been the best $/performance deal I have ever gotten on a card.

Trick was, there was plenty of supply on account of the smaller die sizes, and less demand on account of there being no pandemic.

Right now if I look on ebay or even my local craigslist, used cards are going for $100-$200 off their new price at best. Plenty of 5700XTs are moving for only $50 off their list price.

As the weather goes sour and people turn to more digital entertainment (than they already were), they're snatching up any kind of deal they can get their hands on.

It's wild, I've never seen anything like it.

NV and AMD likely do not have the supply to meet this market at the moment and have made the choice to pick up as much margin as they can on the "Whales" of PC gaming by dropping their biggest chips first and hopefully get some upsell while people are desperate and spending against their best interests.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
Wall of text where it says dual GPU cards were the flagship
SLI or CF solutions were always worse than single fast cards and often wouldn't work. I know at the time people weren't making charts with frame time variance but there is a reason SLI and CF were phased out.

The last 3-5 years have been extremely slow for GPU performance.
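The frame-time-variance point is easy to demonstrate: two hypothetical traces with the same average FPS can have very different 99th-percentile frame times, which is exactly how SLI/CF micro-stutter hides inside an FPS average. A sketch with made-up frame times:

```python
import statistics

def percentile(times_ms, pct):
    """Frame time (ms) at the given percentile, via a sorted index."""
    ordered = sorted(times_ms)
    idx = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return ordered[idx]

# Hypothetical traces: identical average FPS, very different smoothness.
single_gpu = [16.7] * 100        # steady ~60 FPS
cf_stutter = [8.0, 25.4] * 50    # alternating fast/slow frames

for trace in (single_gpu, cf_stutter):
    avg_fps = 1000 / statistics.mean(trace)
    print(round(avg_fps), percentile(trace, 99))
```

Both traces average ~60 FPS, but the alternating trace spends its worst frames above 25 ms, which is the stutter an FPS counter never shows.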
 
  • Like
Reactions: KompuKare

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
SLI or CF solutions were always worse than single fast cards and often wouldn't work. I know at the time people weren't making charts with frame time variance but there is a reason SLI and CF were phased out.

The last 3-5 years have been extremely slow for GPU performance.
I'll simplify it since you seem to have forgotten all the arguments you've made in previous posts.

1) The 6800 isn't a mid-range card.

2) Mid-range performance advances at iso-price are larger than top-tier performance advances regardless of price:

Top-tier:
1080 Ti - early 2017 - $699
3090 - late 2020 - $1500
The 3090 is twice as fast as the 1080 Ti for roughly twice the inflation-adjusted price, over a span of about 3.5 years
* this assessment will change with the release of the 6800XT, which looks to double 5700XT performance for less than double the price in 1.5 years

Mid-range:
1060 6GB - mid 2016 - $250
5600XT - early 2020 - $280
The 5600XT is twice as fast as the 1060 6GB for the same inflation-adjusted price, over a 3.5 year period

3) The last 3-5 years have brought a doubling of 4K/1440p performance on the GPU side; for both the mid-range and the top end, that doubling occurred over roughly 3.5 years. For the mid-range, that averages 20-25% gains per year at the same inflation-adjusted price. For reference, on the CPU side, gains are about 15% per year at iso-price from AMD, and a little less than that from Intel.
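The per-year figures fall out of a simple compound-growth calculation (a doubling over N years implies an annual gain of 2^(1/N) - 1):

```python
def annual_gain(total_speedup, years):
    """Compound annual growth rate implied by a total speedup over `years`."""
    return total_speedup ** (1.0 / years) - 1.0

# Doubling over ~3.5 years (the GPU mid-range case above):
print(round(annual_gain(2.0, 3.5) * 100, 1))  # ~21.9% per year
# A ~15%/year pace, like the CPU side, corresponds to doubling over ~5 years:
print(round(annual_gain(2.0, 5.0) * 100, 1))  # ~14.9% per year
```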
 
  • Like
Reactions: Tlh97

trinibwoy

Senior member
Apr 29, 2005
317
3
81
It does make me wonder if AMD/Nvidia could make an RT-only GPU and sell it to the likes of Pixar for their render farms, however.

It won’t happen but it would be interesting. You could toss the rasterizer, ROPs and tessellation hardware but would still need the shaders and texture units. Might even want to keep tensors for denoising one day.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
It won’t happen but it would be interesting. You could toss the rasterizer, ROPs and tessellation hardware but would still need the shaders and texture units. Might even want to keep tensors for denoising one day.

With raytracing you also skip the whole T&L stage (vertex shading, geometry shading, and projection), since you are shooting rays in world space. But since T&L is done with (unified) shaders these days, there is not much to toss here on the HW side.
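A minimal sketch of what "shooting rays in world space" means: intersection is solved directly against world-space geometry, with no vertex transform or projection step (toy Python, not how any driver actually does it):

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Ray-sphere intersection directly in world space: solve
    |o + t*d - c|^2 = r^2 for t. No vertex transform, no projection."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t >= 0 else None

# A ray from the origin along +z hits a unit sphere centered at z=5 at t=4.
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```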
 
Last edited:

Steltek

Diamond Member
Mar 29, 2001
3,042
753
136
  • Like
Reactions: spursindonesia

GodisanAtheist

Diamond Member
Nov 16, 2006
6,783
7,115
136

Disappointing, but not the end of the world. It is obviously more important that they get it slow-but-right rather than fast and messed up.

- If it's like most things AMD, where it arrives a little later and works a bit less well, but works on everything for everybody since it's built off a bunch of "open" or "universal" standards, it will be worth it, full stop.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
I'll simplify it since you seem to have forgotten all the arguments you've made in previous posts.

1) The 6800 isn't a mid-range card.
Never said the 6800 was a midrange card.
I said midrange cards starting at $400+ is ridiculous. I even replied to you noting that the 3070 and the 6800 start at $499 and $579.

Obviously if the lower high-end is at $500+ while only barely beating the last gen flagships, the midrange is going to be at least $350 and/or probably a good chunk slower than the last gen flagship. The 1080 Ti is still quite a fast card in comparison to some of these cards.
 

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
I said midrange cards starting at $400+ is ridiculous.
What midrange cards are starting at $400+?

Obviously if the lower high-end is at $500+ only barely beating the last gen flagships
This is by no means characteristic of how the lower end of the top tier has performed in the past (I'll exclude the halo cards and instead use the xx80 Ti as top-top, xx80 as mid-top, and xx70 as low-top):
- the 3070 barely beats the 2080 Ti
- the 2070 barely loses to the 1080 Ti
- the 1070 handily beats the 980 Ti
- the 970 loses to the 780 Ti
- even the 780 handily loses to the 690
- the 670 loses to the 590

Simple fact of the matter is that even looking back 8 years, the lower high-end has never consistently beaten the last-gen flagship -- not even close if you include the Titan cards.

the midrange is going to be at least $350 and/or probably be a good chunk slower than last gen flagship. The 1080Ti is still quite a fast card in comparision to some of these cards.
Gonna need some backup on the "Obviously [...] the midrange is going to be at least $350 and/or probably be a good chunk slower than last gen flagship."

First of all, you are 100% ignorant (like all of us) about the pricing of the midrange on Nvidia's 3000 series and AMD's 6000 series because they haven't been revealed yet. The pricing of the 3 released cards falls in line with previous releases dating back at least 8 years, so I see no reason to make any assumption that mid-range pricing will change either.

Second, why do you think that the midrange should be faster than last generation's flagship? That hasn't been the case for at least 8 years. As described above, even the low end of the top tier doesn't beat last gen's flagships on a consistent basis. You're putting artificial expectations out there without any historical basis to back it up. There is no rule or reasonable historical basis for stating that the mid-range should beat the last gen's flagship.
 
  • Like
Reactions: Tlh97 and Martimus

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
This is by no means characteristic of how the lower end of the top tier has performed in the past (I'll exclude the halo cards and instead use the xx80 Ti as top-top, xx80 as mid-top, and xx70 as low-top):
- the 3070 barely beats the 2080 Ti
- the 2070 barely loses to the 1080 Ti
- the 1070 handily beats the 980 Ti
- the 970 loses to the 780 Ti
- even the 780 handily loses to the 690
- the 670 loses to the 590

The 970 lost in some games initially to the 780Ti, but the 780Ti aged very poorly, and as time went on, it lost in more games than it won.

The 690 was a dual GPU card, not a good comparison. The 780 easily beat the 680.

The 590 was also a dual GPU card, so wrong comparison.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
4850 = 8800 Ultra
5770 = 4870/4890
GTX460 = gtx280/285
7870/GTX660/GTX760 = 6970/GTX580

GTX1060/RX480/RX580 = 980

But yeah, there was a shift when the x04 became the flagship instead of the midrange. Although NVIDIA brings the x02 when it is either under threat or wants over $1000 cards.

Maybe if AMD becomes competitive again for a few generations, we will see the return of the x02/x00 as the flagship.
 

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
The 970 lost in some games initially to the 780Ti, but the 780Ti aged very poorly, and as time went on, it lost in more games than it won.
At the time of release of the 1060, the 780 Ti and 970 were roughly even at 4K. But I don't want to get into that habit - because as cards age, focus on performance and driver updates wanes, and it would be silly to penalize the 680 simply because it's old.

The 690 was a dual GPU card, not a good comparison. The 780 easily beat the 680.

The 590 was also a dual GPU card, so wrong comparison.
That's silly. You can't toss out "flagship cards" and then say, "well, not that flagship card." And it only applies to the 590 and 690 in any case, so all the rest of that discussion stands. And we are also not going to compare whether there was an improvement generation-to-generation at the same tier - his statement was that the low end of the top tier doesn't even beat the flagship card - as if historically that had been something that had occurred with any kind of frequency - which it hasn't, even if you want to compare 770 to 680 (even) and 670 to 580 (670 wins).

The comparison wasn't "did they improve year to year" but rather "did the low end of the top tier beat out last gen's flagship."

It seems like the goalposts keep moving.
 
  • Like
Reactions: Tlh97

PJVol

Senior member
May 25, 2020
533
446
106
For me personally, what concerns me the most is not the performance (+/- 5% from the official data is fine) but the price. I just saw an MSI 3090 in stock at a local retailer for ~$2000-2100, which doesn't bode well for the 6800's eventual street price.
 

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,698
136
At the time of release of the 1060, the 780 Ti and 970 were roughly even at 4K. But I don't want to get into that habit - because as cards age, focus on performance and driver updates wanes, and it would be silly to penalize the 680 simply because it's old.


That's silly. You can't toss out "flagship cards" and then say, "well, not that flagship card." And it only applies to the 590 and 690 in any case, so all the rest of that discussion stands. And we are also not going to compare whether there was an improvement generation-to-generation at the same tier - his statement was that the low end of the top tier doesn't even beat the flagship card - as if historically that had been something that had occurred with any kind of frequency - which it hasn't, even if you want to compare 770 to 680 (even) and 670 to 580 (670 wins).

The comparison wasn't "did they improve year to year" but rather "did the low end of the top tier beat out last gen's flagship."

It seems like the goalposts keep moving.
Comparing the 600 and 700 series is kind of pointless anyway, since they're all Kepler. The 770 is the 680, and big Kepler could have just as easily been released as a 680 Ti, 690 and Titan.
 
  • Like
Reactions: spursindonesia

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
Comparing the 600 and 700 series is kind of pointless anyway, since they're all Kepler. The 770 is the 680, and big Kepler could have just as easily been released as a 680 Ti, 690 and Titan.
Sure. That's entirely missing the forest for the trees.

Lower end of top tier has not consistently beaten the flagship from the prior generation. And certainly the mid-range have not beaten the flagships. That's the only point I'm making with any of these examples.
 
  • Like
Reactions: spursindonesia

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
At the time of release of the 1060, the 780 Ti and 970 were roughly even at 4K. But I don't want to get into that habit - because as cards age, focus on performance and driver updates wanes, and it would be silly to penalize the 680 simply because it's old.


That's silly. You can't toss out "flagship cards" and then say, "well, not that flagship card." And it only applies to the 590 and 690 in any case, so all the rest of that discussion stands. And we are also not going to compare whether there was an improvement generation-to-generation at the same tier - his statement was that the low end of the top tier doesn't even beat the flagship card - as if historically that had been something that had occurred with any kind of frequency - which it hasn't, even if you want to compare 770 to 680 (even) and 670 to 580 (670 wins).

The comparison wasn't "did they improve year to year" but rather "did the low end of the top tier beat out last gen's flagship."

It seems like the goalposts keep moving.

The 590 and 690 were actually TWO flagship cards on a single board. That's why they should not be compared to single GPU cards. Not to mention both GPUs could only be used in some games, and the 99th percentile frame times were typically horrible because of micro-stutter.

770 and 680 are literally the exact same GPU. The whole 700 series (minus the GK110 cards) was just a rename.
 
  • Like
Reactions: dr1337

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,698
136
The 590 and 690 were actually TWO flagship cards on a single board. That's why they should not be compared to single GPU cards. Not to mention both GPUs could only be used in some games, and the 99th percentile frame times were typically horrible because of micro-stutter.

770 and 680 are literally the exact same GPU. The whole 700 series (minus the GK110 cards) was just a rename.
And minus the 750 Ti and 750, which were mainstream Maxwell-based cards.
 

obidamnkenobi

Golden Member
Sep 16, 2010
1,407
423
136
UL released a new raytracing test for 3DMark. This one exclusively uses raytracing (unlike Port Royal, which was hybrid rendering, like all games except Quake 2 RTX):


My guess is that AMD's hybrid performance was too good, so Nvidia lobbied for this bench to show its GPUs in a better light.

Looking at the video it's clear that this is 100% a synthetic test (like the PCIe 4.0 bandwidth one):

And not a demo-like test like Port Royal (which is more similar to actual games using RT, say the latest Watch Dogs):

The first one looked like shit, and the second was pretty lame. Just "really good" at showing sunlight/floodlights cast on stuff - is that it?! And we're expected to pay $1000 for that? Don't care, seems pretty meh. Uninteresting even at half that price!