Discussion Radeon 6500XT and 6400

GodisanAtheist

Diamond Member
Nov 16, 2006
6,829
7,193
136
Just getting a thread up for this. Navi 24 will ride... Q1 2022... ish.


I figure the 6500XT lands in ~5500XT territory for ~$200.
 
  • Like
Reactions: Tlh97

kognak

Junior Member
May 2, 2021
21
44
61
If it had been pitched as a 1650(S) competitor at $150 tops, I don't think many would have minded its shortcomings. Asking $199 for it is already tough to swallow. $300+ is just obscene.
That $150 GTX1650 is priced in imaginary land, far, far away. The actual real price is around $350. It is indeed pitched as a 1650 competitor, and it's cheaper and a significant upgrade.
And any 1650 owner can sell their card for $300 on eBay. Essentially a 50% upgrade for free.

Expecting the price of any new card not to align relatively with other cards in the market is foolish. Why would anyone in the chain leave money on the table? End users are going to pay the market price one way or the other; at minimum, scalpers will take care of it if distribution and retail don't. The 6500XT was always going to be half the price of the 6600XT, whatever that is.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,501
20,626
146
I say they would have used this GPU to replace the RX550 if things were normal. Now it ended up replacing the RX5500XT.
Now that I can agree with. But things have not been normal for going on 2yrs now. When does everyone stop whinging about change for the worse, and deal with how things actually are? It's starting to sound like old people talking about how much a loaf of bread cost when they were kids, and how the bread was better back then <insert okay grandma, let's get you to bed! meme here>

2yrs is like dog years in the tech space. When did DIYers turn into a bunch of Karens? It's manufactured outrage. AMD manufactured it, and now there's outrage. :p

I'll show myself out...
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
Saw on Reddit that PCMRs are getting the ASRock Phantom Gaming model for $199 at Micro Center. At that price, it is untouchable for a 12th gen budget gaming build.
Buying this card is very much not PCMR, but hey. Also Newegg had the Pulse for $199 earlier today.
It would actually be nice if AMD is able to keep pushing out just enough supply of cards sold at MSRP so that scalpers end up eating a bit of a loss for once.

They may not be able to keep them in stock at all times, but if there's a weekly drop then the more people will hold out and avoid eBay.

It seems like they're able to keep 6500XT's in stock @ pretty much MSRP here in Denmark. Even if you have to get in line to actually buy one, and you can't be picky about the specific model. I think that's pretty reasonable all-round.

If everyone can avoid fleeceBay, so much the better.

There seemed to be ZERO forward thinking, or consideration for other use cases, so there may be more real HW limitations on this as well.

My own personal gripe with the 6400 and 6500XT isn't even the performance. It's the crippled media block. Would have been decent if that had been fully featured. Why AMD? Just why?
 

Shivansps

Diamond Member
Sep 11, 2013
3,855
1,518
136
OK, I know mining sent everything to hell, but we're getting the same perf (best-case scenario) as an RX480/580, SIX YEARS LATER, limited to the half-memory version of that card, at the same MSRP (which is not going to be respected anyway). Plus they removed the encoder block (wut?), and there's still no AV1 decode, so the decoder block is the same as before (maybe VP9 decode? I don't remember if Polaris supported that). Plus the PCI-E interface limitation is just going to kill performance on most PCs, which means the performance they are showing may be nowhere near what the end user will get without PCI-E 4.0, and AMD didn't add any warning about that...

This is just not acceptable, in any way. And I've been warning about this since way before mining sent everything to hell, going all the way back to the RX5700XT launch. We knew it was going to be called RX690 because it was right there in the photos; they didn't have time to photoshop it out. At the last minute they changed the nomenclature to RX5700XT so it wouldn't look like an RX580/RX590 replacement with an 80% price premium... EVERYTHING went downhill from there, except the prices.

I'm sorry, but the RX6500XT is a GPU that belongs in the RX550 tier. NO, WAIT, even the RX550 has encoders! So this is GT1030/GT710 tier, then. Everyone let the RX690-turned-RX5700XT slip through the cracks, but this needs to stop or it's going to get a lot worse. And mining is just an excuse; mining is not the reason this card is limited to x4 lanes or lacks an encoder block. Mining is the reason you can't get anything at MSRP.
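For anyone wondering how much the x4 link actually costs, here's a back-of-the-envelope sketch of theoretical per-direction PCIe bandwidth (128b/130b encoding overhead included; real-world throughput is lower, so treat these as upper bounds):

```python
def pcie_gbps(gen_gts, lanes):
    """Approx. theoretical one-way PCIe bandwidth in GB/s.

    gen_gts: raw signaling rate in GT/s (Gen3 = 8, Gen4 = 16).
    Both generations use 128b/130b encoding, so ~1.5% is overhead.
    """
    return gen_gts * (128 / 130) / 8 * lanes  # bits -> bytes, times lane count

print(round(pcie_gbps(8, 4), 2))    # Gen3 x4  ~ 3.94 GB/s (the 6500XT on an older board)
print(round(pcie_gbps(16, 4), 2))   # Gen4 x4  ~ 7.88 GB/s
print(round(pcie_gbps(8, 16), 2))   # Gen3 x16 ~ 15.75 GB/s (a typical full-width card)
```

So on a Gen3 board the card gets roughly a quarter of the bandwidth a normal x16 card would have, which lines up with the stutter people see once the 4GB VRAM spills over the bus.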
 
Last edited:

blckgrffn

Diamond Member
May 1, 2003
9,128
3,069
136
www.teamjuchems.com
Come on guys. The answer is obvious.

Don't spend less than $500 on a GPU.

Problem solved.

Doom Eternal has always been the poster child for PCIe bandwidth; we can see reviews for the 6600XT highlighting this.

A year from now all new builds - especially the HP Omens and iBuypower and all of that ilk - are going to be PCIe 4 only. This is a win for system integrators, who can finally have a DX12 Ultimate (lol) card for their ~$1K or lower PCs.

I still don't understand how this is "AMD's" fault. If Nvidia is pricing the 1050 Ti at $300 (they are) and the 1650 Super at $350 to $400 (also... they are), then why would we expect this card, which competes with them even given the bad benches shown overall, to be priced so low?

If we want it cheaper then we need some market pressure - outside of a massive shortage of chips and huge costs of import shipping - to make it happen.

So let's see it. The 8GB 3050 is a non-starter because it won't be available to normal people. So, the 4GB 3050. Let's see it at $300 or less on the street, nvidia.

Haha, like that will happen.
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
You can't blame a game for using modern texture streaming tech just because AMD wants to sell a GT1030-class laptop GPU for $200+ for huge profit.

Exaggerating much??? The card is way faster than the GTX1650 and reaches GTX1650 Super/GTX1660 territory (when on PCIe Gen 4.0).
OK, we got it, it has problems with games that need more than 4GB VRAM when used on PCIe Gen 3.0; that doesn't make the 6500XT a GT1030-class card though.
 

jpiniero

Lifer
Oct 1, 2010
14,631
5,249
136
I guess the reality check is close enough, Nvidia is poised to launch the RTX 3050:
  • $249 MSRP
  • 8GB VRAM
  • 5th gen Decoder and 7th gen Encoder, basically everything it needs
Reviews on the 26th, one day before launch.

The 3050 is not a comparable product. The real price difference is going to be way more than $50. You know that, I'm not even sure why you are mentioning it.
 

Mopetar

Diamond Member
Jan 31, 2011
7,848
6,015
136
Buying this card is very much not PCMR, but hey. Also Newegg had the Pulse for $199 earlier today.

It would actually be nice if AMD is able to keep pushing out just enough supply of cards sold at MSRP so that scalpers end up eating a bit of a loss for once.

They may not be able to keep them in stock at all times, but if there's a weekly drop then the more people will hold out and avoid eBay.
 

blckgrffn

Diamond Member
May 1, 2003
9,128
3,069
136
www.teamjuchems.com
I can only point out the same thing I did for @AtenRa: if someone asks you for build advice today, will you also mention waiting for the 3050 launch? Because if you do, then you're just as intellectually conflicted as the people criticizing the 6500XT, some of whom would still buy it given no other choice. A product is good or bad regardless of whether someone is metaphorically pointing a gun at your head to make you buy it.

Let me make the razor even sharper. What would you recommend to Joe today:
  1. 6500XT @ $199 /w 4GB VRAM
  2. 6500XT @ $249 /w 8GB VRAM
Why aren't we getting the option to choose? Because of the miners?! Heck, put both products on the market: let's see the 8GB variant's price blow up because of crypto, and AMD get grateful nods from gamers happy to have a 4GB version just for them. Or could it just be that the review comparison would be problematic for the 4GB variant, while the $249 MSRP of the 8GB SKU would look bad in upcoming reviews? (Even if MSRP isn't representative of real market pricing anymore.)

Here's a mainstream reviewer expressing her... "feeling" for the immediate future:

View attachment 56402

They don't want to sell more of these.

It's likely that any other product they sell has a better margin. Container space doesn't care about CPUs vs GPUs. This has got to be in the running for the lowest value-density product they sell (in terms of dollars per unit of logistics displacement).

I'll stick to my observation that this was supposed to be an OEM-level part, only for new (i.e., PCIe 4) builds that are likely overwhelmingly Intel-based (because that's how it is), because OEMs wanted something, anything to sell, and this was conceived in the darkest days of the silicon (substrate, etc.) shortages. I realize they said laptops, but in reality this seems a better fit for any OEM desktop.

Dell will make this a $100-$150 stepping stone from the 6400 and a bunch of people will click the button because they don't want the slowest one, they want the +1. That's just an educated hunch.

Due to the ongoing s-show that is current global economies, mining and logistics we all get to enjoy this.

An 8GB version would likely show minimal improvements in a PCIe 4 system, probably super similar to the 8GB vs 4GB RX580, and looking back there are pages and pages and pages of discussion about how wasteful it was to spend money on the extra RAM. I know that's a discussion that goes back to like 2015 and the 8GB 290s, but... yeah.

I buy every 1060 3GB and GTX 970 I can lay my hands on sub-$200, and they are used, dusty, and sometimes smokey. I'd love to have new cards with warranties to include in builds, just because.

¯\_(ツ)_/¯

Maybe we'll get a little window of availability now with crypto looking tough. I see lots of NIB cards on my local CL for... ~50% MSRP markup, which is a bit of a reprieve.

I still think the 6600/3060 is a starting spot for AAA/FPS "serious" gamers going forward, but man you should have seen how excited this 13 year old was three days ago to buy a $600 computer from me with an 8600K and a GTX 970. Kid was shaking a little bit and this was all his money from reffing soccer. Everything he wanted to play with his older brother was suddenly on the table.

1080p and medium details is still plenty to "just play".

Sorry for the ramble here.
 
Last edited:

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,501
20,626
146
Ignoring price
no-i-dont-think.gif


No one ignores price when buying a vid card. The rest is fantasy league stats that mean next to nothing. Check out that vid I posted of it paired with a 10100F; 3.0 is not the deal breaker big reviews led me to think it was. In fact, YouTube is chockablock with 3.0 testing where it is still beating the competition like the 1650, handily. I will stop picking on the 1050 Ti for a moment, because it always sucked. Furthermore, it is an even better example of how bad things have gotten than the 6500XT is. Every gamer that buys the 6500XT instead of the other 2 turds available around $300 wins.

The consensus is that it will age like milk. Well, the cards you can buy around its price already have. No speculation required about it. Dying Light 2 and Sifu just came out, and the 6500XT runs those fine.

I have also made note of how reviewers refer to the same 2 or 3 games when bashing it. Cyberpunk'd is not going to be representative of many games in the next few years, yet that is always the one they point out. Personally, I'd rather not be able to play a few overpriced, hot-garbage AAA games in the next few years than be forced to spend more money to get decent performance in them.
 

Shivansps

Diamond Member
Sep 11, 2013
3,855
1,518
136
When AMD designed Navi 24, they probably weren't even thinking about a desktop model.

AMD does not have enough marketshare in notebooks for that, and in fact one of the first things they did with this GPU was launch an OEM-only version and a FirePRO version. Nah, as I said many times, this is the kind of GPU that was designed to target the very low end in all markets, the kind of GPU they would use to replace the RX550 and target the same markets as the GT1030/RX550 and the low-end 1050 cores (in notebooks) that Nvidia currently doesn't have a replacement for.
The fact that Navi 24 has the same PCIe lane count as its Nvidia counterparts is no coincidence.

To me, the only thing that was not planned was launching a >75W $200 version.
 

Shivansps

Diamond Member
Sep 11, 2013
3,855
1,518
136
OK, here we go. I don't have any new AAA titles to test, so this does not represent modern AAA gaming in any way. Also, avg FPS does not tell the whole story: with the 10100F (and PCIe 3.0) I was having issues in some games with FPS dropping to single digits for a short time every now and then, which is a clear PCIe bandwidth issue. For example, in RDR2, when Arthur is running out of the store and shoots the first guard, FPS drops to nothing at that moment, every time. That does not happen on the same system with an 11400. And I know it is the PCIe and not the CPU because it also happens with the 4700S, where I did some tests with the 6500XT a few weeks ago. Assassin's Creed Origins is another game that is severely affected by PCIe bandwidth, even though VRAM should be enough.

hra.png


I3-10100F/16GB/MSI RX 6400/ASUS H510M-E

AC: Origins 1080p Very High
vFKNlEp.png


Cyberpunk 2077 1080p Medium
PqMsBZG.png


FarCry 5 1080p Ultra
auGUbxa.png


RDR2 1080p High (I didn't enable the advanced graphics settings)
Ii1n199.png

9QWKAKp.png


Total War: Warhammer 1080p ultra
BtSBPsH.png


Witcher 3 1080p Ultra Hairworks off
9Y0yXGx.png

Average framerate : 61.7 FPS
Minimum framerate : 54.6 FPS
Maximum framerate : 67.1 FPS
1% low framerate : 43.2 FPS
0.1% low framerate : 9.9 FPS

Shadow of the Tomb Raider 1080p Highest
zMcrMdn.png


Metro Exodus 1080p High
tLq6ePU.png


GTA V 1080p Ultra MSAA x4
Average framerate : 65.0 FPS
Minimum framerate : 40.9 FPS
Maximum framerate : 94.5 FPS
1% low framerate : 41.0 FPS
0.1% low framerate : 40.4 FPS

I5-11400/16GB/MSI RX 6400/ASUS H510M-E

AC: Origins 1080p Very High
72J5cHt.png


Cyberpunk 2077 1080p Medium
vbFYCRb.png


FarCry 5 1080p Ultra
Wa6hxtD.png


RDR2 1080p High (I didn't enable the advanced graphics settings)
WBIgjWe.png

J5TMP9Q.png


Total War: Warhammer 1080p ultra
MvZn2Uo.png


Witcher 3 1080p Ultra Hairworks off
6owgatc.png

Average framerate : 65.1 FPS
Minimum framerate : 62.8 FPS
Maximum framerate : 68.6 FPS
1% low framerate : 54.6 FPS
0.1% low framerate : 47.8 FPS

Shadow of the Tomb Raider 1080p Highest
zPowpBM.png


Metro Exodus 1080p High
lcuuex4.png


GTA V 1080p Ultra MSAA x4
vndUgTo.png

Average framerate : 65.1 FPS
Minimum framerate : 40.8 FPS
Maximum framerate : 99.2 FPS
1% low framerate : 40.9 FPS
0.1% low framerate : 40.5 FPS
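Since the 1% and 0.1% lows in these runs carry most of the story, here's a quick sketch of one common way those figures are computed from raw frame times. Capture tools differ in their exact definitions (some average the worst slice, some take a percentile), so this is illustrative only:

```python
def low_fps(frametimes_ms, fraction):
    """Average the worst `fraction` of frames and convert to FPS.

    This is one common definition of "1% low" (fraction=0.01) or
    "0.1% low" (fraction=0.001); tools like CapFrameX/OCAT may differ.
    """
    worst = sorted(frametimes_ms, reverse=True)        # slowest frames first
    n = max(1, int(len(worst) * fraction))             # at least one frame
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# 990 frames at ~60 fps plus 10 stutter frames at ~30 fps:
frames = [16.7] * 990 + [33.3] * 10
print(round(low_fps(frames, 0.01), 1))  # -> 30.0, driven entirely by the stutters
```

This is why a run can average 65 FPS and still feel awful: a handful of long frames (like the 9.9 FPS 0.1% low in the Witcher 3 run above) dominate the low-percentile numbers.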

Is there any way for me to be sure this thing has the full 16MB of Infinity Cache?
 
Last edited:


blckgrffn

Diamond Member
May 1, 2003
9,128
3,069
136
www.teamjuchems.com
And HWUB is usually easy on AMD, and Steve called it the worst video card release in decades.



Margins are almost certainly fat. This is a card that should be selling for $99. It's ridiculously cost-reduced/crippled in every way (some ways never seen before, like the 4-lane PCIe bus), and they set the MSRP at $200. It's a very fat-margin card.

Defending this steaming pile of dung makes no sense.

This card is an insult to consumers, and a poster child for corporate greed.

I just don't get this fat-margin thing. AMD is launching a card in 2022 to hit a $200 MSRP, which the chip shortage and global logistics woes are all but making impossible. Should they have built a card to $250? $300? It's likely this bastardized card was aimed almost solely at the OEM market (like the cut-down 5500-derived OEM parts, which we aren't mentioning), and due to the logistics/roadmap issues we all get the OEM special.

1642653674228.png

That's from this very thread.

If true at all, this card has the lowest percentage markup of those shown, and terrible dollars-per-unit margins. It's a terrible-margin card and likely by far the least profitable per unit that AMD will be selling.

An 8GB card would increase the price by a minimum of $60 right now, if that BoM is to be believed. And another $40 (lol, probably way too low) for the miner premium that entails. So a $300 MSRP for a card with basically the same performance in a PCIe 4 system? It seems like that would just be worse...

The lowest end cards tend to ride the tightest margins, it's just how merchandising works.

Should AMD sell this at a huge loss? If they had invested more, its MSRP would have been the same as a 6600, which is a joke too. It seems like a no-win situation, but OEMs probably pressed them for something to sell, as right now AMD has essentially nothing for them.

So we get "this". Which is very meh but has a shot to be purchasable.

It will be funny to see how Nvidia dances around the "MSRP" of a 3050 being the same as the "MSRP" of a 3060 if it really is that much better than the 6500XT. Or will it be another card launched without an MSRP?

In any case, vote with your dollars.
 

maddie

Diamond Member
Jul 18, 2010
4,749
4,691
136
Love the hate. Hope it continues. Only way to keep prices lower in this market. Planning to get one for a new parts build priced as low as possible and still relevant.

In any case, is there a mandate that I missed, to buy this card?
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,501
20,626
146
I can only point out the same thing I did for @AtenRa: if someone asks you for build advice today, will you also mention waiting until the 3050 launch?
No, I would not. No, no, and again, no. The fact that a card many (including myself) would consider compelling at $350 is being called budget is another sign we are in the dark tech times. Given the 6500XT was just released and is popping up at $200, that is the card much more appropriate for a thrifty budget build. $150 is a lot of money in the low-budget build space, period.

Anecdotally: I frequently see posts on Reddit subs from DIYers doing AMD APU builds while they search for a card without getting fleeced. Be it the shuffle, the EVGA queue, a BB or MC drop, or what have you. Instead of going with that 5700G, they can now get a 6500XT and an i3 for about the same money as the APU costs. It will be a superior budget solution too.

And let me be explicit, since responses keep going into strawmen I have no interest in: I don't care about how this makes AMD look. I don't care about what would happen if they had released an 8GB version, or any of that other fantasy league stuff. Or we can call them hypotheticals, if strawman feels offensive or off the mark. I don't hold AMD stock. I haven't used one of their discrete cards in a build in years. The only thing I care about is what I can add to cart for a low-cost build, because that's my bag, baby. And I would spec the 6500XT for the build under any circumstance other than the 3050 being under $300 and readily available. Which feels like more fantasy league talk, imo. I will eat that plate of crow if wrong.

Speaking of which, and off topic. I would like to publicly say the crow I am eating is not delicious, but I am eating it. @blckgrffn took the position that the 5600x would not get a significant price cut when the budget 12th gen were available. I thought that it would. He was right, I was wrong.

 

DeathReborn

Platinum Member
Oct 11, 2005
2,746
741
136
HWUB tests low-end GPUs old and new (including the 6500XT) at the lowest possible 1080p settings. I just assumed the 6500XT would have no problem at potato settings, but that didn't hold for Cyberpunk 2077:

I suspect this could be because Wendell at Level1Techs (among a couple of other people) made 2 videos papering over the 6500XT issues. I do think the 5600G flattered it a little; it would probably choke harder if paired with a 4770K or similar-class CPU.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,903
5,836
136
Why is there so much hate for the 6500xt?

Because it's a budget card that tanks on PCIe 3.0, and most people on tight budgets are likely still on PCIe 3.0. Anyone with a B450 board for Ryzen or a 10th-gen-or-earlier Intel system is on PCIe 3.0, and this card is terrible for them. And it's not a $200 card, it's a $280 card. Just because the rest of the market sucks doesn't make this crappy card a winner; better to just buy nothing and hope Ethereum goes proof-of-stake in 4 months.
 

blckgrffn

Diamond Member
May 1, 2003
9,128
3,069
136
www.teamjuchems.com
Haha, so can we just agree that the lowest-priced, new, gamer-marketed GPU is the only one available in its price bracket and is regrettably not faster?

Seriously, when has this not been the case? Now the threshold is somewhere around $200 instead of $100. The lowest-end SKUs are always cut too close to the bone.

The Polaris glut was an aberration we will likely only see again if crypto mining dies when the current cards are even still relevant.

This is like a thread from three years ago belaboring the existence of the 1030 or something.

If that was what you could afford then, that’s what you got.

All the rest is noise. You want a capable card (high settings, min 60 fps at 1080p) for now and the near term? Budget $500 for a new GPU. If you score something cheaper, kudos to you.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,501
20,626
146
@maddie

Good stuff.

There is no more revenue to be made from stomping it like the printer in Office Space. And I could dig up numerous vids by Steve from before the dark times where he explained to his viewers the potential perils of buying used cards, while reviewing the latest offerings from either IHV. Besides, they are going to start looking like they are part of the Nvidia marketing human caterpillar if they go any harder.

The newest release, Dying Light 2, is feasting on cards like zombies on brains. The 6500XT is beating not just the vanilla 1650 but the 1650 Super, and it nips at the heels of the 1070 at 1080p HQ settings and beats it at low settings. In an Nvidia-sponsored game, no less.

dying-light-1080p-high-quality.png

dying-light-1080p-low-quality-768x768.png

Those charts are from KitGuru, BTW.
 
Last edited:

ryanjagtap

Member
Sep 25, 2021
108
127
96
Techpowerup's review is up: the RX6400 matches GTX1650 performance with PCIe 4.0.
I would really like to see a low-profile RX6400 vs GTX1650 LP review on a refurbished SFF system with PCIe 3.0.
Well, this video from ETA Prime has the RX 6400 LP tested on two systems, first an i9-12900K and then a Lenovo system with a Ryzen 3 4300G, as well as a comparison with the 1650 LP. The video has timestamps so you can seek ahead to the parts you want to watch. The Lenovo system testing starts at 06:30 :)
 

jpiniero

Lifer
Oct 1, 2010
14,631
5,249
136