Discussion Radeon 6500XT and 6400


Shivansps

Diamond Member
Sep 11, 2013
3,625
1,306
136
The RX 6500XT is here to compete against the GTX1650; it is not replacing the RX 5500XT.
If prices were normal the 6500XT would cost $129, with the NVIDIA RTX3050 at $179.
If it's not replacing the RX5500XT they should have named it the RX 6400XT. With that name and sharing the RX5500XT 8GB launch MSRP it is its successor; there is nothing to argue here.
 

Shivansps

Diamond Member
Sep 11, 2013
3,625
1,306
136
Doom Eternal has always been the poster child for PCIe bandwidth; we can see reviews for the 6600XT highlighting this.
You can't blame a game for using modern texture streaming tech just because AMD wants to sell a GT1030-class laptop GPU for $200+ at a huge profit.

The games that have the most issues are the ones that use texture streaming in order to run/look better on GPUs with 2-4GB VRAM, and use higher quality textures on 8GB ones. The ones that run better are the ones that do it "the old way". This is the first GPU of 2022 and it can't handle modern techniques. The games are not at fault here.
 

AtenRa

Lifer
Feb 2, 2009
13,782
2,954
136
You can't blame a game for using modern texture streaming tech just because AMD wants to sell a GT1030-class laptop GPU for $200+ at a huge profit.
Exaggerating much? The card is way faster than the GTX1650 and reaches GTX1650 Super/GTX1660 territory (when on PCIe Gen 4.0).
OK, we got it: it has problems with games that need more than 4GB VRAM when used on PCIe Gen 3.0, but that doesn't make the 6500XT GT1030 class.
 

blckgrffn

Diamond Member
May 1, 2003
8,471
1,800
136
www.teamjuchems.com
You can't blame a game for using modern texture streaming tech just because AMD wants to sell a GT1030-class laptop GPU for $200+ at a huge profit.

The games that have the most issues are the ones that use texture streaming in order to run/look better on GPUs with 2-4GB VRAM, and use higher quality textures on 8GB ones. The ones that run better are the ones that do it "the old way". This is the first GPU of 2022 and it can't handle modern techniques. The games are not at fault here.
I think we already discussed the profit side of this into the ground. It's probably a lot thinner than we think, and it's almost certainly true that AMD would prefer to sell higher-margin cards. So... whatever on that front. This fills a market need, which is something better than an iGPU and, you know, available. Playing just about everything at 60+ FPS at medium settings? Seems to check the box to me.

This card competes with GTX 1050 Ti and 1650 Super. That's really the end of it. If you want Doom Eternal on PC at full tilt you want a better card. Noted.

So, as long as those cards are priced at $300+ we can expect the even loosely comparable card from AMD to cost the same, right? And right now it's often the card you can find, not the card you want, when you are 13, 33 or 63 and want to get online with your buddies and just play.

Most of the better used cards out there now have probably been mining for the last couple of years; the odds are super high. If you care about a warranty on your overpriced $300 card, then this is the better way to go.
 

Shivansps

Diamond Member
Sep 11, 2013
3,625
1,306
136
Exaggerating much? The card is way faster than the GTX1650 and reaches GTX1650 Super/GTX1660 territory (when on PCIe Gen 4.0).
OK, we got it: it has problems with games that need more than 4GB VRAM when used on PCIe Gen 3.0, but that doesn't make the 6500XT GT1030 class.
The lack of encoders, the crippled media block, PCI-E x4, and only two video outputs make it GT1030 class. Even the RX550 had better features than that. As an RX550 replacement this GPU would already be a compromise due to having fewer video outputs and no encoders, but the better 3D performance would be more than worth it. As an RX5500XT replacement it is the PR disaster we are seeing right now.
 

blckgrffn

Diamond Member
May 1, 2003
8,471
1,800
136
www.teamjuchems.com
The lack of encoders, the crippled media block, PCI-E x4, and only two video outputs make it GT1030 class. Even the RX550 had better features than that. As an RX550 replacement this GPU would already be a compromise due to having fewer video outputs and no encoders, but the better 3D performance would be more than worth it. As an RX5500XT replacement it is the PR disaster we are seeing right now.
It's a disaster? We are getting really, really excited over what would have been a very quiet launch back in the day. This is like a "MOBA" card, if the last two years hadn't happened.

Even Tom's gave it a three star shrug, and they usually snub AMD in my view.


Good info there on PCIe 3 vs 4 and choosing between Medium and Ultra settings. The penalty at Medium settings is like 10% on average, vs. more like 50% (on the lows) at Ultra (yikes).
 
Last edited:

Insert_Nickname

Diamond Member
May 6, 2012
4,687
1,278
136
As an RX550 replacement this GPU would already be a compromise due to having fewer video outputs and no encoders, but the better 3D performance would be more than worth it.
I would argue the lack of an AV1 decoder is a more pressing concern than the lack of encoders. AV1 is starting to get some traction, and you can always run encodes on the CPU, where it doesn't really matter if a Blu-ray rip transcode takes all night.
 
  • Like
Reactions: Tlh97 and coercitiv

guidryp

Platinum Member
Apr 3, 2006
2,203
2,474
136
Even Tom's gave it a three star shrug, and they usually snub AMD in my view.
And HWUB is usually easy on AMD, and Steve called it the worst video card release in decades.

I think we already discussed the profit side of this into the ground. It's probably a lot thinner than we think and it's almost certainly true that AMD would prefer to sell higher margin cards.
Margins are almost certainly fat. This is a card that should be selling for $99. It's ridiculously cost-reduced/crippled in every way (some cuts never seen before, like the 4-lane PCIe bus), and they set the MSRP at $200. It's a very fat-margin card.

Defending this steaming pile of dung makes no sense.

This card is an insult to consumers, and a poster child for corporate greed.
 

jpiniero

Lifer
Oct 1, 2010
11,829
3,350
136
Margins are almost certainly fat. This is a card that should be selling for $99. It's ridiculously cost-reduced/crippled in every way (some cuts never seen before, like the 4-lane PCIe bus), and they set the MSRP at $200. It's a very fat-margin card.
This is a product that wouldn't have gotten a desktop release if it wasn't for those fat margins. Also, the MX line has typically been 4 lanes, but the MX570 and MX590 might end up being 8.
 

Stuka87

Diamond Member
Dec 10, 2010
5,748
1,619
136
It's not a nonsense metric. The performance is pretty bad for a 101W power consumption.
But it is, because it doesn't mean anything. Power consumption matters when you have to design around it, such as with the high-end nVidia cards that use 400-600W of power. But on that graph, these cards look favorable. And performance obviously matters. But performance per watt is something nVidia shoved down people's throats when they had a clear advantage in that area. Now that they regularly lose that comparison, they have not mentioned it at all.

And it's all relative. If you compare this card to old cards with the same performance (such as Polaris), they had significantly higher power consumption. So then this card looks fine.
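As a rough sanity check on that relative comparison, here is a tiny perf-per-watt sketch. The relative-performance values are illustrative placeholders (both set to 1.0, assuming the rough parity claimed above, not benchmark data); the 101 W figure is the one quoted in this thread, and 150 W is the RX 570's board-power spec:

```python
# Hypothetical perf-per-watt comparison: relative performance divided by
# board power. The performance numbers are placeholders, not benchmarks.
def perf_per_watt(relative_perf: float, watts: float) -> float:
    return relative_perf / watts

rx6500xt = perf_per_watt(1.0, 101.0)  # ~101 W measured, per this thread
rx570 = perf_per_watt(1.0, 150.0)     # 150 W board-power spec (Polaris)
print(f"6500 XT delivers ~{rx6500xt / rx570:.2f}x the perf/W of an RX 570")
```

Even with performance only at parity, the lower board power alone works out to roughly a 1.5x efficiency edge over same-performance Polaris.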
 
  • Like
Reactions: Tlh97 and scineram

jpiniero

Lifer
Oct 1, 2010
11,829
3,350
136
But it is, because it doesn't mean anything. Power consumption matters when you have to design around it, such as with the high-end nVidia cards that use 400-600W of power. But on that graph, these cards look favorable. And performance obviously matters. But performance per watt is something nVidia shoved down people's throats when they had a clear advantage in that area. Now that they regularly lose that comparison, they have not mentioned it at all.

And it's all relative. If you compare this card to old cards with the same performance (such as Polaris), they had significantly higher power consumption. So then this card looks fine.
Kinda wonder why this card isn't 75 W.
 

Shivansps

Diamond Member
Sep 11, 2013
3,625
1,306
136
I would argue the lack of an AV1 decoder is a more pressing concern than the lack of encoders. AV1 is starting to get some traction, and you can always run encodes on the CPU, where it doesn't really matter if a Blu-ray rip transcode takes all night.
I completely agree; I would rather have the AV1 decode. BTW, has anyone confirmed that it at least has VP9 decode? I'm not seeing that info anywhere.

But the encoder is also somewhat important: a lot of people want to stream or record games for fun, and anyone with fewer than 8 cores (which is a lot of people) can only do it using the GPU encoder. That the encoders aren't present on old entry-level GPUs like the GT1030 is understandable, but they just can't be missing on a $200 GPU in 2022, plain and simple. Even an Atom APU has an encoder.

I really don't understand who designed this thing. They scrapped the encoder and AV1 decode because those were to be handled by the APU in a notebook, but AMD APUs all have PCI-E 3.0!!!!! They designed a GPU that should only be paired with Intel CPUs. Scrapping the encoder is also an awful idea for a notebook: it means doing a memory copy if you try to stream or record a game running on the dGPU with the encoder on the iGPU, which is terrible for battery use. And terrible for that PCIe link too.
 
Last edited:

Mopetar

Diamond Member
Jan 31, 2011
6,776
3,870
136
If it's not replacing the RX5500XT they should have named it the RX 6400XT. With that name and sharing the RX5500XT 8GB launch MSRP it is its successor; there is nothing to argue here.
I agree from a certain perspective, but suppose that they had called this a 6400 XT and absolutely nothing else had changed. Same 4 GB, same 4x PCIe, same $200 MSRP (for what that's worth), same everything else. What's actually changed? Is the card any better? Are the limitations any more okay? Is it a better value?

If there's any part of you that wants to say yes, then would it be even better for AMD to just call it a 6300 XT? Frankly they needed to go a lot farther than just the name in order to manage expectations. Prior cards like this had a heavy emphasis placed on their use as entry-level cards for eSports and similar games that aren't as demanding. The street price still wouldn't be good, and that's not something AMD can control, but they could have been more up front about the capabilities of the card.

Some people just want to crap on this card (and it isn't as though the arguments aren't valid in many cases), but it's also pretty clear that the Infinity Cache is still carrying this thing. The bus saw a 50% reduction in width (though only a ~40% reduction in bandwidth, accounting for the faster VRAM) and a ~30% reduction in CUs. Obviously the base clock being ~50% higher is going to help offset that (and the 6500 XT does have more FLOPS on paper as a result), but even so it achieves these gains at ~80% of the TDP of the 5500 XT.
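The bandwidth arithmetic can be checked quickly from the publicly listed GDDR6 configurations (128-bit at 14 Gbps for the 5500 XT, 64-bit at 18 Gbps for the 6500 XT):

```python
# GDDR6 bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8 bits-per-byte.
def vram_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits * gbps_per_pin / 8  # GB/s

rx5500xt = vram_bandwidth_gbs(128, 14.0)  # 224 GB/s
rx6500xt = vram_bandwidth_gbs(64, 18.0)   # 144 GB/s
print(f"{rx5500xt:.0f} GB/s -> {rx6500xt:.0f} GB/s "
      f"({1 - rx6500xt / rx5500xt:.0%} reduction)")
```

That lands at roughly a 36% reduction (224 GB/s down to 144 GB/s), in the same ballpark as the ~40% figure above.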

Not that any of this matters. It's still going to sell every unit and the MSRP will wind up being every bit as much of a fiction as it is with other cards. This card isn't as good as I had hoped it would be, but I don't think it's as bad as people want to think it is either. The silly part is that calling it a 6300 XT probably would have shut everyone up even though nothing else had changed.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
4,127
2,763
136
Ah, a fresh AMD GPU release. Blood in the water!

Kind of have mixed feelings on this release. Yeah, the card is a turd in terms of features, RAM, performance, price, power... Wait, I thought I had something positive to say...

On the other hand, I was never going to buy this thing. Weaker than my 980 Ti in every meaningful way and less RAM to boot. BUT, unless the unthinkable happens, I'm not really going to be buying any release this gen. With that perspective, more cards is a good thing. Flood the zone, get anything and everything out there. At some point supply will level out and exceed demand, or a big bust somewhere will put a glut of used cards onto the market.

The funniest thing about this launch is watching the law of unintended consequences writ large. It would be truly ironic if, by going bottom-barrel on this release and proclaiming themselves the anti-mining gamer saviors, AMD actually cost themselves more money thanks to the reputation hit this thing delivers.

For the people in the know, yes, but imagine you're some guy who is just trying to complete a build or buy a pre-built, and this thing craps out or crashes when attempting to turn on RT, or doesn't perform any better than a $300 card from 6 years ago, or gets plugged into one of the vast number of PCI-E 3.0 systems out there...

It's not even some low-volume part that will end up in 500 computers; this is supposed to be the high-volume piece that makes it into the most PCs.

I was starting to wonder where AMD would shoot themselves in the foot this gen, which has otherwise been an excellent showing on their part (no major driver issues, performance parity with NV, near feature parity with NV; it was going so well).
 
  • Like
Reactions: Mopetar

blckgrffn

Diamond Member
May 1, 2003
8,471
1,800
136
www.teamjuchems.com
And HWUB is usually easy on AMD, and Steve called it the worst video card release in decades.



Margins are almost certainly fat. This is a card that should be selling for $99. It's ridiculously cost-reduced/crippled in every way (some cuts never seen before, like the 4-lane PCIe bus), and they set the MSRP at $200. It's a very fat-margin card.

Defending this steaming pile of dung makes no sense.

This card is an insult to consumers, and a poster child for corporate greed.
I just don't get this fat-margin thing. AMD is launching a card in 2022 to hit a $200 MSRP, which the chip shortage and global logistics woes are all but making impossible. Should they have built a card to $250? $300? It's likely this bastardized card was aimed nearly solely at the OEM market (like the cut-down 5500-derived OEM parts, which we aren't mentioning), and due to the logistics/roadmap issues we all get the OEM special.

[Attached image: 1642653674228.png — a BoM cost/markup comparison posted earlier in the thread]

That's from this very thread.

If true at all, this card has the lowest percentage markup of those shown, and terrible $$$ per unit margins. It's a terrible margin card and likely by far the least profitable per unit that AMD will be selling.

An 8GB card would increase the price by a minimum of $60 right now, if that BoM is to be believed? And another $40 (lol, probably way too low) for the miner premium that entails? So a $300 MSRP for a card with basically the same performance in a PCIe 4 system? It seems like that would just be worse...

The lowest-end cards tend to ride the tightest margins; it's just how merchandising works.

Should AMD sell this at a huge loss? If they had invested more, its MSRP would have been the same as a 6600's, which is a joke too. Seems like a no-win situation, but OEMs probably pressed them for something to sell, as right now AMD has essentially nothing.

So we get "this". Which is very meh but has a shot to be purchasable.

It will be funny to see how Nvidia dances around the "MSRP" of a 3050 being the same as the "MSRP" of a 3060 if it really is that much better than the 6500XT - or will it be another card launched without an MSRP?

In any case, vote with your dollars.
 

majord

Senior member
Jul 26, 2015
420
481
136
Because they had to "Clock it to the Moon" (tm) in order for it to at least match a GTX 1050 Ti 4GB card. :p
Except it's the PCIe bandwidth at 3.0, not clock speed / core rasterization performance, that causes it to be anywhere near a 1050 Ti in certain circumstances.
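For reference, the raw link numbers behind that point can be sketched from the PCIe spec rates (8 GT/s per lane for 3.0, 16 GT/s for 4.0, both 128b/130b encoded; real-world throughput is a bit lower after protocol overhead):

```python
# Peak PCIe bandwidth per direction, before protocol overhead.
def pcie_bandwidth_gbs(gt_per_s: float, lanes: int) -> float:
    encoding = 128 / 130  # 128b/130b line coding (PCIe 3.0 and later)
    return gt_per_s * encoding * lanes / 8  # GB/s

gen3_x4 = pcie_bandwidth_gbs(8.0, 4)   # ~3.9 GB/s
gen4_x4 = pcie_bandwidth_gbs(16.0, 4)  # ~7.9 GB/s
print(f"PCIe 3.0 x4: {gen3_x4:.2f} GB/s, PCIe 4.0 x4: {gen4_x4:.2f} GB/s")
```

So on a Gen 3 board the x4 link has roughly the bandwidth of a Gen 2 x8 slot, which is why the 4GB spillover cases hurt so much.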
 

SteveGrabowski

Diamond Member
Oct 20, 2014
4,640
3,176
136
I just don't get this fat-margin thing. AMD is launching a card in 2022 to hit a $200 MSRP, which the chip shortage and global logistics woes are all but making impossible.
You think this will actually be a $200 card? I have seen this story over and over this gen: you get a few MSRP cards at launch and then never see MSRP again.
 

blckgrffn

Diamond Member
May 1, 2003
8,471
1,800
136
www.teamjuchems.com
You think this will actually be a $200 card? I have seen this story over and over this gen: you get a few MSRP cards at launch and then never see MSRP again.
What retailers charge for it doesn't necessarily impact the margins AMD brings home, though.

The x-factor is really the wholesale price to the AIBs, then the AIB markup to the retailers, then the retailer markup. AMD only profits from the first margin there, and I would *love* to see that price sheet! That's the one that matters with regard to AMD making the "big bucks" on these tiny cut-down GPUs.

If AMD's pricing to the AIBs is unchanging (presumably contracts for some number of units would be, to allow for some stability), then it's the AIBs and retailers making the extra margin.

I get why that's frustrating, but I don't understand how it's AMD's "fault" when there is no end to the demand for GPUs.
 

RnR_au

Senior member
Jun 6, 2021
458
1,101
96
Yeah, the card is a turd in terms of features, RAM, performance, price, power...
It's a turd in terms of the future, too.

It's really strange for AMD to give some of its cards PCIe x4/x8 connections given its experience with the new-gen Xbox and PlayStation consoles, where textures and poly data stream over PCIe from storage directly to RAM. That's the major change to how games will be structured, data-wise, in the future.

I just don't get this fat-margin thing. AMD is launching a card in 2022 to hit a $200 MSRP, which the chip shortage and global logistics woes are all but making impossible. Should they have built a card to $250? $300?
They could have given the card more PCIe lanes. It wouldn't have raised the price significantly.
 
  • Like
Reactions: guidryp

beginner99

Diamond Member
Jun 2, 2009
5,042
1,349
136
Who knows, you might actually be able to find this one in stock.
Stock right now isn't the problem here (Europe) anymore, because the shops have taken the place of scalpers. A 6700 XT is $1100 at the shops. The best deal I have seen so far is a used 6600 XT for $700. A 3060 Ti is also around $1000. But you can buy them, if you are willing to spend that much.
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
24,192
7,036
146
Defending this steaming pile of dung makes no sense.
That is accurate. Because this discussion, like most, becomes a trial in the court of public opinion.

Team Green, the prosecution, wants a kangaroo court, so they can get right to the hanging.

Team Red is using the Chewbacca defense.




As to the corporate greed comment: Finally! It's about time AMD started following the Rules of Acquisition. Intel and Nvidia have spent well over a decade moving from one anti-consumer practice, or scandal, to the next, yet they both continue to rake in the cash. Can't beat them? Join them. And make certain you are holding Nvidia's feet to the fire the same way; otherwise, it reflects poorly on you and leads to most dismissing your opinions offhand. ;)

The 6500XT will be the perfect snapshot of this era: an era when the potato 710 & 730 series and the also largely pointless 1030 and 1050 Ti were somehow still being manufactured.
 
