Discussion Radeon 6500XT and 6400


GodisanAtheist

Diamond Member
Nov 16, 2006
6,776
7,102
136
Just getting a thread up for this. Navi 24 will ride... Q1 2022... ish.


I figure the 6500XT lands in ~5500XT territory for ~$200.
 
  • Like
Reactions: Tlh97

blckgrffn

Diamond Member
May 1, 2003
9,120
3,048
136
www.teamjuchems.com
Profitability is a mix of base price, efficiency, hash rate, and crypto pricing. If these were cheap and efficient enough to offset a terrible hash rate in an online calculator, it seems really likely there would just be stacks and stacks of these hanging off of mining rigs.
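For context, here's a back-of-the-envelope sketch of how those online calculators weigh the inputs; every figure below is an illustrative placeholder, not a real Navi 24 number:

```python
# Back-of-the-envelope mining profitability, the kind an online calculator runs.
# Every figure below is an illustrative placeholder, not a real Navi 24 number.
def daily_profit_usd(hash_mh: float, watts: float,
                     usd_per_mh_day: float, usd_per_kwh: float) -> float:
    revenue = hash_mh * usd_per_mh_day             # earnings scale with hash rate
    power = (watts / 1000.0) * 24.0 * usd_per_kwh  # electricity cost per day
    return revenue - power

profit = daily_profit_usd(hash_mh=14, watts=60, usd_per_mh_day=0.05, usd_per_kwh=0.10)
street_price = 300.0  # assumed street price, not MSRP
payback_days = street_price / profit if profit > 0 else float("inf")
print(f"${profit:.2f}/day -> payback in {payback_days:.0f} days")
```

If the price and wattage are low enough, even a poor hash rate eventually pays back.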
 

DisEnchantment

Golden Member
Mar 3, 2017
1,599
5,767
136
With prices being set at €300+ for the 6500XT, this video is like rubbing salt into a gaping wound, then opening another and adding more salt. The video shows a 5500XT cut down to PCIe x4 lanes at both 4.0 & 3.0, in 8GB & 4GB models.
The 5500XT is not a good indicator. The Infinity Cache is going to help a bit; by how much, I don't know, we have to wait.
I have a 6600XT; it creams my 5700XT with half the bus width, fewer CUs, less power, and less heat. But the 6500XT might suck on PCIe 3.0😊
 
Last edited:

DeathReborn

Platinum Member
Oct 11, 2005
2,745
738
136
I keep seeing "Infinity Cache will save it", but that's going to do diddly squat about VRAM capacity & PCIe bandwidth. Eight PCIe lanes take up a handful of mm² on a die; dropping half of them is the ultimate in cost cutting & I do think it will bite this card on the backside, & hard.
 
  • Like
Reactions: Ranulf

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,226
136
I'll give AMD a little bit of the benefit of the doubt on this and assume that potential issues due to the limited memory bus size or the smaller PCIe bus aren't as much of a problem thanks to Infinity Cache. I think the bigger problem is the 4 GB of VRAM, which really limits this card because there are already several games where that simply isn't enough, even at 1080p. Of course it's a bit of a double-edged sword, because it also limits the desirability of the card for certain categories of miners, and if the 4 GB version of the card sees a lot of markup, you can be sure that the 8 GB version will be scalped far, far worse.

The real reason that this chip seems to be squeezed six ways from Sunday is precisely because AMD needs to be able to get as many of these on a wafer as possible. It's not just that it helps them in terms of profitability, but because they don't have nearly as many wafers as they need to be able to meet demand right now. It also doesn't matter if they aren't great for gaming because someone will scoop them up for mining, even if it's not as profitable as ETH.

They get no points from me for making the crappiest, most cost-reduced card imaginable.

They are making something as crappy and cheap as possible to maximize profit.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
Until the T400 gets sniffed out. I was buying "cheap" Maxwell Quadros until about 12 months ago, then they all doubled in price. I don't know when the switch on this stuff gets flipped, but I am considering buying a T400 just to have one for when it goes nuts.

Having only 2GB worth of memory should make it unpalatable for modern gaming and any sort of mining. We hope.

But yeah. Best to stock up while you can.
 

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
This really just seems like a terrible card from every angle. Tiny 64-bit memory bus. Crippled media encode/decode section. A 4-lane PCIe bus.

I can't imagine they would have built something this bad to sell for more than $100 in normal times. Going for $300-400, it's a sad joke.

Exactly. No bad products, just bad pricing. At $100 this would be OK, but at $300 it is basically on par with a 290X (a chip from 2013) at its 2015 pricing ($250): similar performance, just less power and newer video decode.
 

Shivansps

Diamond Member
Sep 11, 2013
3,848
1,518
136
The PCIe bandwidth hit depends on the game; "Shadow of the Tomb Raider" will probably give you the worst-case results, as that game is very PCIe/VRAM dependent.

But outside of that, I'm starting to think the perf hit may not be that bad, as long as you can keep the game under 3GB of VRAM (remember that Windows and other background apps use a significant amount of VRAM on a 4GB card, and with that PCIe link you can't afford data swapping at runtime). The worst part is limiting quality this much on a GPU that may end up at $400-$500... in a $100 GPU this is fine, but this is not a $100 GPU, not even at the imaginary MSRP.
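To put rough numbers on that ~3GB budget (a crude sketch; the overhead figures are assumptions for illustration, not measurements):

```python
# Crude VRAM budget for a 4GB card on a narrow PCIe link.
# The overhead figures are assumptions for illustration, not measurements.
TOTAL_VRAM_MB = 4096
os_overhead_mb = 800      # Windows DWM + background apps (commonly ~0.5-1GB)
driver_reserve_mb = 256   # driver/display reserve (assumed)

game_budget_mb = TOTAL_VRAM_MB - os_overhead_mb - driver_reserve_mb
print(f"Usable by the game: ~{game_budget_mb} MB")  # ~3040 MB, i.e. the ~3GB rule of thumb

# Anything beyond this budget spills over the PCIe link; on 3.0 x4 that
# spill traffic moves at roughly 3.9 GB/s, hence the big frame-time spikes.
```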

But to be honest, even the RX 550 had encoders and a better PCIe link; this is a GT 1030-tier GPU (PCIe x4/no encoders). I hope it at least has VP9 decode, because the GT 1030 has that.
 

Ranulf

Platinum Member
Jul 18, 2001
2,345
1,164
136
I know that on my RX 570 4GB card, Windows was regularly using 500MB-1GB of VRAM while gaming. RDR2 claimed it was using 2.5GB, with 1GB for the system and 500MB free, on my FX 8350 system a year or so ago.
 

Leeea

Diamond Member
Apr 3, 2020
3,617
5,362
136
With prices being set at €300+ for the 6500XT, this video is like rubbing salt into a gaping wound, then opening another and adding more salt. The video shows a 5500XT cut down to PCIe x4 lanes at both 4.0 & 3.0, in 8GB & 4GB models.

That is exaggerating things a bit. AMD's MSRP is $200 USD, and odds are this thing will not hold its over-inflated price.

This thing is useless to miners with 4 gigs of RAM.


^

This really just seems like a terrible card from every angle. Tiny 64-bit memory bus. Crippled media encode/decode section. A 4-lane PCIe bus.

I can't imagine they would have built something this bad to sell for more than $100 in normal times. Going for $300-400, it's a sad joke.

I'd just stick with an iGPU to ride out these times, even if it meant not playing any games from the last decade, rather than buy this thing.

Actually, no, it is not terrible from every angle.

The memory bus pins and logic require quite a lot of die space.
Same thing with the PCIe lanes.

This allows them to reallocate that space to either Infinity Cache or just die shrinking. Either way it allows them to increase yields (defects in cache SRAM are frequently ignorable thanks to redundancy) and reduce die size (more dies per wafer = more production).
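For a sense of scale, here's a toy dies-per-wafer and yield calculation using the classic approximations; the defect density is illustrative, ~107 mm² is Navi 24's reported die size, and the 150 mm² figure is a made-up stand-in for a version that kept x8 PCIe and full media blocks:

```python
import math

# Toy dies-per-wafer and yield model (classic approximations).
# Defect density is illustrative; ~107 mm^2 is Navi 24's reported die size,
# 150 mm^2 stands in for a hypothetical version keeping x8 PCIe + full media.
def dies_per_wafer(die_mm2: float, wafer_d_mm: float = 300.0) -> int:
    r = wafer_d_mm / 2
    # Area term minus an edge-loss correction.
    return int(math.pi * r**2 / die_mm2 - math.pi * wafer_d_mm / math.sqrt(2 * die_mm2))

def poisson_yield(die_mm2: float, d0: float = 0.001) -> float:
    # Fraction of dies with zero random defects; SRAM has spare rows/columns,
    # so cache-heavy area tolerates defects better than this simple model says.
    return math.exp(-die_mm2 * d0)

for die in (107.0, 150.0):
    good = dies_per_wafer(die) * poisson_yield(die)
    print(f"{die:.0f} mm^2 -> ~{good:.0f} good dies per wafer")
```

Even with these toy numbers, the smaller die nets on the order of 50% more good dies per wafer, which is the whole game when wafers are the bottleneck.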


At its price point it likely will be very competitive on PCIe 4.0 systems.

This thing will likely have 3x to 6x the performance of your iGPU.


I keep seeing "Infinity Cache will save it", but that's going to do diddly squat about VRAM capacity & PCIe bandwidth. Eight PCIe lanes take up a handful of mm² on a die; dropping half of them is the ultimate in cost cutting & I do think it will bite this card on the backside, & hard.
Depends.

On PCIe 4.0 it is not going to matter.

On PCIe 3.0 it is going to hurt, but compare it to its market competitors: it will likely be the best desperation video card that's easily available.
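For concreteness, the raw link math behind that 3.0 vs 4.0 gap on an x4 card (theoretical peak after 128b/130b line encoding; real transfers lose a bit more to protocol overhead):

```python
# Theoretical PCIe link bandwidth after 128b/130b line encoding
# (real transfers lose a bit more to packet/protocol overhead).
def pcie_gbps(gt_per_s: float, lanes: int) -> float:
    return gt_per_s * lanes * (128 / 130) / 8  # GT/s -> GB/s

for name, gt, lanes in [("PCIe 3.0 x4", 8.0, 4),    # ~3.9 GB/s
                        ("PCIe 4.0 x4", 16.0, 4),   # ~7.9 GB/s
                        ("PCIe 3.0 x16", 8.0, 16)]: # ~15.8 GB/s, what older cards enjoy
    print(f"{name}: ~{pcie_gbps(gt, lanes):.1f} GB/s")
```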
 
Last edited:
  • Like
Reactions: Tlh97

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,226
136
Actually, no, it is not terrible from every angle.

The memory bus pins and logic require quite a lot of die space.
Same thing with the PCIe lanes.

This allows them to reallocate that space to either Infinity Cache or just die shrinking. Either way it allows them to increase yields (defects in cache SRAM are frequently ignorable thanks to redundancy) and reduce die size (more dies per wafer = more production).
Stop making excuses for manufacturer greed.

They made it as cheap as possible (even compromising the media capability); that's only good from one perspective: AMD's profit margin.

From a consumer perspective, it's a cost-reduced piece of garbage.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,763
3,585
136
Depends.

On PCIe 4.0 it is not going to matter.

On PCIe 3.0 it is going to hurt, but compare it to its market competitors: it will likely be the best desperation video card that's easily available.
The target customers for this card are those with B450/B350 or equivalent 200/300-series motherboards from AMD and Intel respectively, and those are all PCIe 3.0. Even those who couldn't get a GPU but are holding out with an APU on a PCIe 4.0 board are going to be slighted if they get this GPU, because the APUs are PCIe 3.0 only. So in retrospect, "buy an APU to hold you over till you can afford a GPU" as a temporary measure was terrible advice for budget gamers.

Then there's the fact that it's a 4 GB card, which is absolutely inadequate these days for anything above medium textures at 1080p. What's even more comical is AMD going the extra mile to 'hide' the fact that this is a terrible card with things like this -


And finally, its competition is not entry-level GPUs like the GT 1030, but rather used cards like the GTX 1060/RX 570/RX 580, which are on eBay for $200-300 right now.

"There are no bad products, only bad prices" - yes, this card is an exemplar of that dictum. It should have been $150 tops with all its shortcomings, but now it will carry a premium above its $200 MSRP, as evidenced by ASUS already setting the price to $300.
 
  • Like
Reactions: Leeea

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
And finally, its competition is not entry-level GPUs like the GT 1030, but rather used cards like the GTX 1060/RX 570/RX 580, which are on eBay for $200-300 right now.

"There are no bad products, only bad prices" - yes, this card is an exemplar of that dictum. It should have been $150 tops with all its shortcomings, but now it will carry a premium above its $200 MSRP, as evidenced by ASUS already setting the price to $300.

If it had been pitched as a 1650(S) competitor at $150 tops, I don't think many would have minded its shortcomings. Asking $199 for it is already tough to swallow. $300+ is just obscene.
 

kognak

Junior Member
May 2, 2021
21
44
61
If it had been pitched as a 1650(S) competitor at $150 tops, I don't think many would have minded its shortcomings. Asking $199 for it is already tough to swallow. $300+ is just obscene.
That $150 GTX 1650 is a price from imaginary land, far far away; the actual real price is around $350. Indeed it is pitched as a 1650 competitor, and it's cheaper and a significant upgrade. Any 1650 owner can sell their card for $300 on eBay: essentially a 50% upgrade for free.

Expecting the price of any new card not to align with the other cards in the market is foolish. Why would anyone in the chain leave money on the table? End users are going to pay the market price one way or the other; at minimum, scalpers will take care of it if distribution and retail don't. The 6500XT was always going to be half the price of the 6600XT, whatever that is.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,226
136
Scathing review. AMD pulled out all the stops to cut it down...


Worst GPU: Radeon RX 6500 XT Review, Corner Cutting Edition
Steve says this is the worst GPU release EVER (in his memory).


They even cut it down to two video outputs.
 
Last edited:
  • Like
Reactions: RnR_au and Mopetar

Shivansps

Diamond Member
Sep 11, 2013
3,848
1,518
136


And this is where we need to remember that AMD artificially limited revised versions of B450 motherboards and the A520 to PCIe 3.0. And H510 has 4.0 provided you use an 11th-gen CPU.
 

coercitiv

Diamond Member
Jan 24, 2014
6,176
11,808
136
This thing is useless to miners with 4 gigs of RAM.
This card is a really bad "hill to die on". I defended AMD's pricing decisions on good products with limited availability, but this is a really sad product for the price.

There's an old joke in my country about a stupid man who gets captured by an indigenous tribe in the wilderness. He finds himself among a few other captives, and one of them tells him they will be put to work and eventually be sacrificed, their skin used to make shoes and clothing. The next day the tribe members find the man stabbing his entire body with a crude needle, yelling at them that his skin will be useless for their fancy clothes and shoes.

Stabbing the card all over so that the miners wouldn't care for it: only a poorly prepped exec would think to use this sad excuse in front of a knowledgeable audience:
We have really optimized this one to be gaming first… You can see that with the way we’ve configured the part. Even with the four gigs of frame buffer, that’s a really nice frame buffer size for the majority of triple-A games, but it’s not particularly attractive if you’re doing blockchain-type or mining activities.



Just like I said about Nvidia's LHR marketing stunt, this will not affect miners one bit. They will find a use for these cards with alternative coins; they will vacuum up the market.

Raja's DNA is still in the company.
 
Last edited:

Shivansps

Diamond Member
Sep 11, 2013
3,848
1,518
136
Actually, LHR has been successful in pissing off miners, and no one has been able to fully unlock it yet, aside from the 1st-gen 3060s, which is very, very surprising. The problem is that they are going to keep buying the LHR cards unless they have something else to buy. The CMP cards' availability was a joke; for instance, the CMP 30HX (a GTX 1660 Super) that mines at 32 MH/s was available only once at the start of LHR, then never again. Launching LHR to try to divide mining hardware from gaming hardware, and then having no mining hardware to sell, is stupid.

AMD limiting this card to 4GB does nothing to stop miners; you can still use it to mine altcoins, and if there is nothing else to buy, they are going to buy it anyway.