Question Curious if anybody else is bothered by the RX6800 naming.

Page 2 - AnandTech Forums

EliteRetard

Diamond Member
Mar 6, 2006
6,490
1,021
136
I know the name won't change at this point, but if it could, do you think RX 6700XT makes more sense?

The 6800 suffers a much larger performance drop than the 6800XT does from the 6900XT.

AMD also positions their x69/x68 against the RTX 3090/3080, so why not an x67 vs the 3070?

-

I also feel like the MSRP needs to be reduced to $550, but that's a separate topic.
 
Jul 27, 2020
19,823
13,590
146
What's beyond me is how safe the marketing folks at these companies play it when naming stuff. What about:

Radeon InfiniDream?

Geforce Giant?

Intel Innova?

Ok, so all of the above are bad. But they need to come up with better names and model number schemes. I love that the hardware is new and shiny but the names are just rehashes of what we have seen/heard in the past. Get Original, marketing people. Hey, that could be a new slogan :p
 

scineram

Senior member
Nov 1, 2020
361
283
106
Personally I wanted the 6800 to be 64 CUs. I DO NOT think AMD has a natural competitor to the RTX 3070, hence why they cut an additional 4 CUs from Navi 21.
Obviously AMD wants to maximize binning potential for lower SKUs. Therefore it makes no sense to them to have those configs at a multiple-of-16 CU count; that would drastically limit the number of potentially enabled configs.
 

DaaQ

Golden Member
Dec 8, 2018
1,443
1,041
136
Die size doesn't matter, don't know how that relates to what I was discussing.

Historically, high-end GPUs were $500, with a few blips around $600.
The second best were typically $350, sometimes $400.
Third tier was $200-250.

Then they started playing name games, and especially with the RTX generation straight up raised prices.

Ignoring the actual Titan class GPUs (separate die from the top consumer GPUs)
Top tier GPU is now $1500
Second tier now $700
Third $500

So how much is the (realistically x50) fourth tier gonna cost? $375?
For a GPU that would be in line with the GT 450-750 class, at around $120.

Even the GTX 950 and 1050 Ti, which seemed inflated at $150+, are amazing in comparison.

The x30-class GPUs (1660, a 2660 next?) are $230-280! That's what the 3070 should cost (as a 3060).

The IGP-class garbage like the GT 1030 is now holding down the sub-$200 market.

If they hadn't played these stupid name games, I would have been fine with a small price climb over time.

Apparently stupid wishful thinking:

3090 (really 3080) $700
3080 (really 3070) $525
3070 (really 3060) $350
3060 (really 3050) $250
2660 (really 3030) $175
2650 (really 3010) $125
1030 class GPU eliminate

Realistically I would have actually liked to see a price drop from the RTX 2000 even with the gains.

As they are named:

3090 $1000 (2080ti was "$1000" but realistically $1200)
3080 $600
3070 $400
3060 $300

You have made my point.

You have to go back to pre-Kepler days, i.e. the GTX 580 and before.
Kepler was the game changer: GK104 as the flagship die.
This was when the fanboys, performance enthusiasts, or flat-out more-money-than-sense people started with the mantra of "who cares about die sizes, IF the performance is there, they can charge FLAGSHIP MONEY."
This is also where x80/x70/x60 stopped mapping to die codenames.

GK104 released as the flagship. Then Hawaii came along. Right before its release, they rebranded Kepler: GK100, i.e. the GTX 780, a cut-down big die. Then the GK110 TITAN, and when Hawaii competed we got the GTX 780 Ti with half the Titan's RAM.
Rest is history.

Also, I may have my codenames slightly mixed up, as in GK100/GK110, etc. But what did they do afterwards? TITAN BLACK. Let's not forget the monstrosity of dual GK110 chips. What was that Titan called, the $3K card?

EDIT: Don't get me wrong, we are not in complete disagreement.
I don't know your age, but I remember the GeForce MX2, the 4600Ti, etc.
 

EliteRetard

Diamond Member
Mar 6, 2006
6,490
1,021
136
*snip*
I don't know your age, but I remember the GeForce MX2, the 4600Ti, etc.

My first PC was a 1983 Kaypro 10.

The first discrete GPU I owned that I can recall the name of was the Diamond Viper V770
I also owned a ti4600 etc. Still have an Athlon XP Barton 2500+ / 9800Pro system.

I had to retire my amazing Trinitron CRT back in like 2018, due to losing housing.
I think the last system I was able to build for myself was during the AMD 64 era.
After that I "retired" and no longer had the money to keep up with buying hardware.

I eventually acquired a C2D E6600 (first gen) system from the friend I built it for (7900GTO).
After that I acquired an AMD PhenomII - 4850 system the same way from the same person.
They are still using the last system I built for them (4790k/GTX970) so I'm still on the Phenom 2.
I did eventually upgrade each time a component hit $100, X6 1100T, 6950, then an RX470 4GB.
I would've stayed on the 6950, but I passed it to ■ when *redacted* needed a GPU replacement.
 
  • Like
Reactions: DaaQ

Ricky T

Member
Oct 31, 2020
48
22
41
I am only bothered by the silly price. Who would be crazy enough not to spend the 70 bucks more and get the 6800xt if you are already spending 579 bucks?
 

lixlax

Member
Nov 6, 2014
187
162
116
I am only bothered by the silly price. Who would be crazy enough not to spend the 70 bucks more and get the 6800xt if you are already spending 579 bucks?
Going simply by the price and performance of the 3 cards announced, the 6800 should be $500 (and that was the price I was hoping for; now I'm probably going to wait and see what Navi 22 offers).
From AMD's standpoint it makes much more sense:
-6900XT is the halo product, there to make the best impression on the charts. It is a fully enabled die with seemingly the best clock/power characteristics, so the fewest dies are suitable for that card, and they priced it where demand isn't going to be too high.
-6800XT is the card they want you to buy; it's very competitive at the high end for $650, which means they'll still be getting decent margins. Most of the working dies should be suitable for this card.
-6800 is mainly here so there are more than 2 cards on launch day and to make the card they want you to buy even more compelling; as a bonus it looks quite good against Nvidia's 3070 as well (a little faster + 2x the VRAM). To make 6800s AMD needs to cut 25% of the compute resources, and I don't think yields are that bad, so they'd rather put those dies in more expensive cards.

Edit: Even in AMD's own graphs published on their website, the 6800 has worse perf/price than the XT. Usually cheaper products offer more bang for the buck. As it stands now I wouldn't be surprised if the 6700XT launches at $400+ ($429-449 is my guess).

Just my thoughts.
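The perf/price comparison above can be sanity-checked in a few lines. The MSRPs are the ones discussed in the thread; the relative-performance figures are placeholders assumed purely for illustration, not benchmark results:

```python
# Perf-per-dollar sanity check for the RDNA2 launch lineup.
# Prices are launch MSRPs; the rel_perf numbers are made-up placeholders
# chosen only to illustrate the comparison, not real benchmark data.
cards = {
    "RX 6800":    {"price": 579, "rel_perf": 100},  # baseline (assumed)
    "RX 6800 XT": {"price": 649, "rel_perf": 118},  # assumed
    "RX 6900 XT": {"price": 999, "rel_perf": 127},  # assumed
}

def perf_per_dollar(card):
    return card["rel_perf"] / card["price"]

# Rank the cards from best to worst perf/price.
for name, card in sorted(cards.items(), key=lambda kv: -perf_per_dollar(kv[1])):
    print(f"{name}: {perf_per_dollar(card):.3f} rel-perf per dollar")
```

With any plausible numbers in this ballpark, the 6800 XT comes out ahead of the cheaper 6800 on perf per dollar, which is the unusual ordering being pointed out.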
 
Last edited:
Jul 27, 2020
19,823
13,590
146
I still think the 5700 series is getting DXR support; otherwise it doesn't make much sense for AMD to have no sub-$500 DXR-compatible card while nVidia has the 2060 for "poor" gamers.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
They need the hardware support for that to make any sense at all. Hopefully, and with good reason to expect it, they'll roll out new-gen lower-end parts once the pressure of the console launches and so on has eased.
 
Jul 27, 2020
19,823
13,590
146
If nVidia can do it with their 1000 series, AMD should be able to do the same. It would be slow, emulated DXR compatibility, and if the game is good visually and gameplay-wise, gamers may still consider anything above 15 FPS within the threshold of playability. It's either that or waiting a long time to save up enough for a new card.
 

YBS1

Golden Member
May 14, 2000
1,945
129
106
-6800 is mainly here so there are more than 2 cards on launch day and to make the card they want you to buy even more compelling; as a bonus it looks quite good against Nvidia's 3070 as well (a little faster + 2x the VRAM). To make 6800s AMD needs to cut 25% of the compute resources, and I don't think yields are that bad, so they'd rather put those dies in more expensive cards.

Edit: Even in AMD's own graphs published on their website, the 6800 has worse perf/price than the XT. Usually cheaper products offer more bang for the buck. As it stands now I wouldn't be surprised if the 6700XT launches at $400+ ($429-449 is my guess).

Just my thoughts.
It appears pretty obvious they don't really want anyone to buy the 6800. I'm assuming it's in the lineup right now as their only alternative to the 3070/3070 Ti, and as a catch-all product for dies that can't make the grade as 6800 XT/6900 XTs. I wouldn't be surprised if it evaporates in terms of actual availability once the 6700 series comes online, and is scarce even before that. It's not that it's priced poorly against the 3070; I'm pretty sure I'd pay the upcharge if I was shopping in that performance bracket. It's that at that point I'd also be willing to pay the tiny bit more for the 6800 XT, so the 6800 sits in no man's land. It really boils down to the 6800 XT (and honestly the 3080, to a lesser degree) making the cards sandwiching them look like poor values.
 

trinibwoy

Senior member
Apr 29, 2005
317
3
81
Personally I wanted the 6800 to be 64 CUs. I DO NOT think AMD has a natural competitor to the RTX 3070, hence why they cut an additional 4 CUs from Navi 21.

They didn’t cut 4 additional CUs. Each shader engine has 20 CUs and the 6800 likely has an entire engine disabled. Makes perfect sense.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
7,150
7,645
136
They didn’t cut 4 additional CUs. Each shader engine has 20 CUs and the 6800 likely has an entire engine disabled. Makes perfect sense.

-No, the real reason is that after 64 CUs in Fury, 64 CUs in Vega, and then 64 CUs in Vega 2, 64 is a cursed number of CUs for AMD and they're avoiding it at all costs.

The 64 CU RDNA2 was mysteriously performing like an old RTX 2080, and cutting the last 4 CUs got them back up to 2080 Ti +10% :p

True story, a small birdie told me...
 
  • Haha
Reactions: Mopetar

DaaQ

Golden Member
Dec 8, 2018
1,443
1,041
136
My first PC was a 1983 Kaypro 10.

The first discrete GPU I owned that I can recall the name of was the Diamond Viper V770
I also owned a ti4600 etc. Still have an Athlon XP Barton 2500+ / 9800Pro system.

I had to retire my amazing Trinitron CRT back in like 2018, due to losing housing.
I think the last system I was able to build for myself was during the AMD 64 era.
After that I "retired" and no longer had the money to keep up with buying hardware.

I eventually acquired a C2D E6600 (first gen) system from the friend I built it for (7900GTO).
After that I acquired an AMD PhenomII - 4850 system the same way from the same person.
They are still using the last system I built for them (4790k/GTX970) so I'm still on the Phenom 2.
I did eventually upgrade each time a component hit $100, X6 1100T, 6950, then an RX470 4GB.
I would've stayed on the 6950, but I passed it to ■ when *redacted* needed a GPU replacement.


I think we are pretty much in agreement. My original post was to the other person regarding die sizes; I then used your quote to reiterate that a majority of people stopped "caring", for lack of a better word, about die sizes.

I explicitly remember people/fanboys saying die size does NOT matter as long as the performance is there. Similar to your comment about being abused, etc.

The above brings back memories.
My first actual "computer" was the TRS-80. :p Cassette tape drive and all. Wish I could remember the text game I played on that. I want to say Montezuma's Revenge or something of the sort. (Never beat it.)

My first dedicated GPU would have been a Matrox Millenia (Millennium??), I can't remember the exact name. It came in a Dell prebuilt I bought for like $3500 in '97 or '98. It had a Pentium 266MHz with MMX Technology!! lol
But hey, it played Diablo 1.

First self-built had a Pentium 3, maybe 866MHz?? Right around the time Rambus was making a stink and DDR was breaking onto the scene. IIRC it had a VIA chipset motherboard (ASUS) in order to use DDR memory instead of Rambus. I forget whether the northbridge ran at 66MHz or 100MHz.

Then I built a P4 Northwood, with a 133 or was it 266MHz FSB. I wasn't able to get the fabled P4 @ 2.4GHz that overclocked like crazy. Think I got the 2.8, and was still able to get it to like 3.4 or 3.5 on the overclock.

Ahh the good old days.
 

scineram

Senior member
Nov 1, 2020
361
283
106
They didn’t cut 4 additional CUs. Each shader engine has 20 CUs and the 6800 likely has an entire engine disabled. Makes perfect sense.
Indeed, I was wrong about this config. And it is quite weird. They need 3 fully enabled shader engines. How does that help with die harvesting?
 

EliteRetard

Diamond Member
Mar 6, 2006
6,490
1,021
136
I think we are pretty much in agreement. My original post was to the other person regarding die sizes; I then used your quote to reiterate that a majority of people stopped "caring", for lack of a better word, about die sizes.

I explicitly remember people/fanboys saying die size does NOT matter as long as the performance is there. Similar to your comment about being abused, etc.

The above brings back memories.
My first actual "computer" was the TRS-80. :p Cassette tape drive and all. Wish I could remember the text game I played on that. I want to say Montezuma's Revenge or something of the sort. (Never beat it.)

My first dedicated GPU would have been a Matrox Millenia (Millennium??), I can't remember the exact name. It came in a Dell prebuilt I bought for like $3500 in '97 or '98. It had a Pentium 266MHz with MMX Technology!! lol
But hey, it played Diablo 1.

First self-built had a Pentium 3, maybe 866MHz?? Right around the time Rambus was making a stink and DDR was breaking onto the scene. IIRC it had a VIA chipset motherboard (ASUS) in order to use DDR memory instead of Rambus. I forget whether the northbridge ran at 66MHz or 100MHz.

Then I built a P4 Northwood, with a 133 or was it 266MHz FSB. I wasn't able to get the fabled P4 @ 2.4GHz that overclocked like crazy. Think I got the 2.8, and was still able to get it to like 3.4 or 3.5 on the overclock.

Ahh the good old days.

Ah yes, the Trash 80.
Odd, I can't seem to recall anything else about it...

Like I still know what a punch card was...
Any other memory does not seem to exist.

Actually a ton of old stuff I can barely recall now.

~~~

I remember buying an Asus A7N8X-E Deluxe, then changing my mind last minute.
I ended up getting that P4 2.4 Northwood, on an Asus P4C800-E Deluxe w/ 9700Pro
That system went to my mom, and I swear she sold it for like $250 just a few years ago.
She could sell a broken VCR to Goodwill for $40. I have no idea how she gets away with it.

The XP2500+ system I have now, was built later on for the nostalgia of those systems.
Hard to recall now, but it seems like that was the era when everything was amazing.

I actually had a huge collection of old computers and parts...
It all got "recycled" a few years back when I lost housing.
I was only able to hang on to what I could fit in my car.
 

trinibwoy

Senior member
Apr 29, 2005
317
3
81
Indeed, I was wrong about this config. And it is quite weird. They need 3 fully enabled shader engines. How does that help with die harvesting?

It helps when there are defects in the fixed-function units. The 6800 XT and 6900 XT require 4 functioning engines and only help with salvaging defective CUs.

I’m just guessing that the 6800 has only 3 engines enabled though. It could be the case that it has 4 engines and just disables 20 random CUs across the chip.
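The harvesting question above can be made concrete with a toy model. Navi 21 is 4 shader engines x 20 CUs; everything else below (the per-SKU rules) is pure assumption based on this discussion, not AMD's actual binning policy:

```python
# Toy binning model for Navi 21 (4 shader engines x 20 CUs = 80 CUs).
# SKU rules are hypothetical: 6900 XT = flawless die, 6800 XT = up to
# 2 bad CUs per engine (72 CUs enabled), 6800 = 3 flawless engines with
# the worst engine shut off entirely (60 CUs enabled).
ENGINES, CUS_PER_ENGINE = 4, 20

def bin_die(defects_per_engine):
    """Best SKU for a die, given defective-CU counts per shader engine."""
    assert len(defects_per_engine) == ENGINES
    if all(d == 0 for d in defects_per_engine):
        return "6900 XT"
    if all(d <= 2 for d in defects_per_engine):
        return "6800 XT"
    if sorted(defects_per_engine)[:3] == [0, 0, 0]:
        return "6800"
    return "scrap / lower SKU"

print(bin_die([0, 0, 0, 0]))  # 6900 XT
print(bin_die([1, 2, 0, 1]))  # 6800 XT
print(bin_die([0, 0, 0, 7]))  # 6800: defects all cluster in one engine
print(bin_die([3, 1, 0, 0]))  # scrap / lower SKU: spread-out defects fail
```

Under the whole-engine-disable rule, a die only lands in the 6800 bin when its defects cluster in a single engine, which is why it helps less with harvesting than a disable-any-20-CUs rule would.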
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,650
218
106
It appears pretty obvious they don't really want anyone to buy the 6800. I'm assuming it's in the lineup right now as their only alternative to the 3070/3070 Ti, and as a catch-all product for dies that can't make the grade as 6800 XT/6900 XTs. I wouldn't be surprised if it evaporates in terms of actual availability once the 6700 series comes online, and is scarce even before that. It's not that it's priced poorly against the 3070; I'm pretty sure I'd pay the upcharge if I was shopping in that performance bracket. It's that at that point I'd also be willing to pay the tiny bit more for the 6800 XT, so the 6800 sits in no man's land. It really boils down to the 6800 XT (and honestly the 3080, to a lesser degree) making the cards sandwiching them look like poor values.

From a revenue-per-mm²-of-silicon standpoint, AMD probably only wants to sell Epycs.

I think AMD just inflated their prices because NVIDIA cards will rarely sell at MSRP, and so they have a decent margin to drop in response to any NVIDIA moves.
 

EliteRetard

Diamond Member
Mar 6, 2006
6,490
1,021
136
Don't care about naming. People make too much of an issue about naming.

Price and Performance is all.

I can somewhat agree with this...

However, this practice of dishonest marketing and name games has clearly influenced prices.

There is movement in price/performance at the high end, but little in the $350/under market.

Gaining price/performance but having to pay 20% more each year to get it is not enjoyable.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,334
5,452
136
I can somewhat agree with this...

However, this practice of dishonest marketing and name games has clearly influenced prices.

There is movement in price/performance at the high end, but little in the $350/under market.

Gaining price/performance but having to pay 20% more each year to get it is not enjoyable.

Same or different names won't change that.
 
  • Like
Reactions: Mopetar

DaaQ

Golden Member
Dec 8, 2018
1,443
1,041
136
Ah yes, the Trash 80.
Odd, I can't seem to recall anything else about it...

Like I still know what a punch card was...
Any other memory does not seem to exist.

Actually a ton of old stuff I can barely recall now.

~~~

I remember buying an Asus A7N8X-E Deluxe, then changing my mind last minute.
I ended up getting that P4 2.4 Northwood, on an Asus P4C800-E Deluxe w/ 9700Pro
That system went to my mom, and I swear she sold it for like $250 just a few years ago.
She could sell a broken VCR to Goodwill for $40. I have no idea how she gets away with it.

The XP2500+ system I have now, was built later on for the nostalgia of those systems.
Hard to recall now, but it seems like that was the era when everything was amazing.

I actually had a huge collection of old computers and parts...
It all got "recycled" a few years back when I lost housing.
I was only able to hang on to what I could fit in my car.


I hear you there, had that issue once before myself.

Anyway, that was the motherboard I had in bold. Good old P4C800-E Deluxe. Amazing board. Unfortunately I handed it down to one of the kids and picked up a Stryker?? Striker Extreme, paired with a QX6600 and SLI 8800 GTXs. The board failed. I think that had an Nvidia nForce 4 chipset.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,334
5,452
136
It does though.

By manipulating customers you can change their buying habits.
After that you can justify any unreasonable thing.

Not sure how, or where you see this. People will pay for the performance they want, at a price in their budget. Changing the card name won't make them pay more.
 
  • Like
Reactions: Mopetar

EliteRetard

Diamond Member
Mar 6, 2006
6,490
1,021
136
Not sure how, or where you see this. People will pay for the performance they want, at a price in their budget. Changing the card name won't make them pay more.

I really don't mean to be rude, but you're incredibly naive.

"Clearly they would never manipulate me! They're such benevolent people, they only do what's good for us."

Do a little research on advertising/marketing.

Most consumers are completely clueless, and rarely have unlimited money to get what they want.