Discussion Radeon 6500XT and 6400

GodisanAtheist

Diamond Member
Nov 16, 2006
6,826
7,190
136
Just getting a thread up for this. Navi 24 will ride... Q1 2022... ish.


I figure the 6500XT lands in ~5500XT territory for ~$200.
 
  • Like
Reactions: Tlh97

Heartbreaker

Diamond Member
Apr 3, 2006
4,228
5,228
136
I would find it very hard to believe they designed this GPU only for low end laptops, and low end Rembrandt (RMB) laptops at that, due to the PCIe 4.0 x4 link; that's a very specific market to target. Considering that low end laptops are going to be using Barcelo/Cezanne with PCIe 3.0, that's a very weird decision to make.

Nah, this is just a low end GPU, simple as that. They designed this as a replacement for the RX550 Polaris GPU: notebook, desktop, workstation, whatever. And if I consider it as that, it's not bad; even with the compromises it is still good.

But you just can't charge $200 for this and try to pass it off as the RX 5500 XT replacement.

Two things seal that it MUST have been made for low end laptops only.

The 4-lane PCIe bus. This cripples it significantly, for minuscule savings. You don't do this unless you are certain you will never need it. Which would apply for low end laptops with 4 lanes to spare for the discrete GPU.

The decimated media core. One of the use cases often mentioned for low end GPUs is a media PC, so again this is the kind of crippling that makes no sense unless you are certain you are never going to need it. That again applies if you are going to use it only with APUs that already have their own media cores.

If you were designing a low end part for laptops and discrete cards, you would have a more reasonable PCIe bus and a more reasonable media section. Penny pinching here makes no sense if discrete cards were planned from the start.
 

Panino Manino

Senior member
Jan 28, 2017
821
1,022
136
It's bad because it's being sold as something that it isn't.
This shouldn't be sold to "gamers" who expect much more. The biggest reason for the 65xx number seems to be the price. They couldn't justify $200 for a 64xx or 63xx.

I know that RTG lived through very dire times in recent years, but by now they should have recovered enough to do better. Instead of cutting essential things, why not use RDNA1 and not waste transistors on useless RT? Do a refresh of RDNA1 (the missing 5400) for people who just want to get a cheap VGA today, any modern VGA as long as it's modern and cheap. RTG could have designed an even smaller GPU with all the decode and encode blocks and sold it for $150 or less as what it really is, just a survival VGA for desperate times. It would make money for AMD and would also earn AMD goodwill points with consumers.
 

blckgrffn

Diamond Member
May 1, 2003
9,128
3,069
136
www.teamjuchems.com
It's bad because it's being sold as something that it isn't.
This shouldn't be sold to "gamers" who expect much more. The biggest reason for the 65xx number seems to be the price. They couldn't justify $200 for a 64xx or 63xx.

I know that RTG lived through very dire times in recent years, but by now they should have recovered enough to do better. Instead of cutting essential things, why not use RDNA1 and not waste transistors on useless RT? Do a refresh of RDNA1 (the missing 5400) for people who just want to get a cheap VGA today, any modern VGA as long as it's modern and cheap. RTG could have designed an even smaller GPU with all the decode and encode blocks and sold it for $150 or less as what it really is, just a survival VGA for desperate times. It would make money for AMD and would also earn AMD goodwill points with consumers.

Likely because that would be one of the worst ways they could invest N7 wafers from a profit standpoint.

RDNA v1 was just a stopgap, imo, not something they wanted to keep around for long.
 

Glo.

Diamond Member
Apr 25, 2015
5,711
4,559
136
As someone said, this GPU should have been called RX 650 XT, instead of 6500 XT.

Would it be priced accordingly? Not really. But the naming scheme would be perfectly fitting.
 

coercitiv

Diamond Member
Jan 24, 2014
6,213
11,956
136
My point was that the estimate spreadsheet you quoted looks very weird when attempting to fit the BoM for such a device. Even if the real cost amounts to more than the retail price, which is possible in this crazy market, that's still a huge amount of hardware for $300+ when compared with the spartan 6500XT: bigger PCB, much bigger chip, lots more VRAM, case, cooling, PSU, WiFi/BT card, 512GB SSD, controller. Take a look for yourself; it's quite a piece of engineering.

I'm not part of the crowd that's angered by AMD's decision to cash in. My problem with AMD in this case is that they have neutered this product from so many angles that it cannot be anything but bad. In fact, that is why people play the $99 price argument; they're simply voicing frustration with a badly engineered product. We all know prices would be huge even on a "budget" card, but at least present the market with a balanced product. It's 4GB of VRAM or a lower PCIe lane count, but not both at the same time.
 

Shivansps

Diamond Member
Sep 11, 2013
3,855
1,518
136
Two things seal that it MUST have been made for low end laptops only.

The 4-lane PCIe bus. This cripples it significantly, for minuscule savings. You don't do this unless you are certain you will never need it. Which would apply for low end laptops with 4 lanes to spare for the discrete GPU.

The decimated media core. One of the use cases often mentioned for low end GPUs is a media PC, so again this is the kind of crippling that makes no sense unless you are certain you are never going to need it. That again applies if you are going to use it only with APUs that already have their own media cores.

If you were designing a low end part for laptops and discrete cards, you would have a more reasonable PCIe bus and a more reasonable media section. Penny pinching here makes no sense if discrete cards were planned from the start.

It is x4 because it was designed to be an entry level GPU, for all markets. This is the same thing as the GT1030 (GP108) on desktop being the same core used in the MX330 in laptops. Just like the GT1030, it also lacks encoders and has two video outputs. So yeah, this is pretty much a 1:1 copy of the GT1030/MX330 feature set. Don't believe for one second that they designed Navi 24 to be laptop only. This is the GPU that would have replaced the RX550 at $100 (and lower for the 12CU version) if prices were normal; this is the only product they have that can replace that old Polaris GPU.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,332
4,926
136
A whole lot of "Meh" other than for power efficiency. 4GB of VRAM is just not enough (as even AMD has previously pointed out), and the 64-bit memory bus plus 4 lanes of PCIe is crippling.

As a pipe cleaner for 6nm it makes sense... that's about it.
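To put a rough number on what the 64-bit bus gives up, here's a back-of-the-envelope sketch (the figures are the publicly reported specs, treated as assumptions: 64-bit / 18 Gbps GDDR6 for the 6500 XT, 128-bit / 14 Gbps for the 5500 XT; the 16 MB Infinity Cache claws some of this back, which the raw numbers don't show):

```python
# Back-of-the-envelope GDDR6 bandwidth comparison.
# Figures are the publicly reported specs, treated here as assumptions.
def gddr6_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

cards = {
    "RX 6500 XT (64-bit, 18 Gbps)": (64, 18.0),
    "RX 5500 XT (128-bit, 14 Gbps)": (128, 14.0),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {gddr6_bandwidth_gb_s(bus, rate):.0f} GB/s")

# Prints roughly:
#   RX 6500 XT (64-bit, 18 Gbps): 144 GB/s
#   RX 5500 XT (128-bit, 14 Gbps): 224 GB/s
```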
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,228
5,228
136
It is x4 because it was designed to be an entry level GPU, for all markets. This is the same thing as the GT1030 (GP108) on desktop being the same core used in the MX330 in laptops. Just like the GT1030, it also lacks encoders and has two video outputs. So yeah, this is pretty much a 1:1 copy of the GT1030/MX330 feature set. Don't believe for one second that they designed Navi 24 to be laptop only. This is the GPU that would have replaced the RX550 at $100 (and lower for the 12CU version) if prices were normal; this is the only product they have that can replace that old Polaris GPU.

The GT1030 was a $70 garbage card from 2017, and it was the last NVidia chip like that to make it into a discrete card, and it's so slow that PCIe x4 doesn't matter.

If "6500xt" was actually planned from day 1, to be a discrete GPU card, as crippled as a 2017, $70 garbage, then that's even worse than if they changed there mind on a low end laptop part.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,353
10,050
126
A whole lot of "Meh" other than for power efficiency. 4GB of VRAM is just not enough (as even AMD has previously pointed out), and the 64-bit memory bus plus 4 lanes of PCIe is crippling.

As a pipe cleaner for 6nm it makes sense... that's about it.
Does it fit inside a Cracker Jack box? What about a cereal box?
 

Ranulf

Platinum Member
Jul 18, 2001
2,357
1,177
136
A whole lot of "Meh" other than for power efficiency. 4GB of VRAM is just not enough (as even AMD has previously pointed out), and the 64-bit memory bus plus 4 lanes of PCIe is crippling.

As a pipe cleaner for 6nm it makes sense... that's about it.

Meh, it's not that efficient power-wise. My RX 570 4GB in the real world varies between 115W and 150W.
 

Shivansps

Diamond Member
Sep 11, 2013
3,855
1,518
136
The GT1030 was a $70 garbage card from 2017, and it was the last NVidia chip like that to make it into a discrete card, and it's so slow that PCIe x4 doesn't matter.

If the "6500XT" was actually planned from day 1 to be a discrete GPU card, as crippled as 2017 $70 garbage, then that's even worse than if they had changed their mind on a low end laptop part.

I think it is just a GPU designed from day 1 as entry level, for all platforms; that's it. I wouldn't understand why they are releasing it for workstations too if it wasn't. It is the GPU you use in laptops against the Nvidia options like the MX350/MX330, and on desktop to replace the RX550.

The 4GB and x4 PCIe hurt performance, but as an entry level GPU, what would you expect? That's fine for an RX550 replacement.
It's actually the lack of AV1 decode and encoders that hurts it more in that market. Considering it still seems to have a full decode block for other formats, the lack of AV1 decode may be a hardware bug.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,228
5,228
136
It's actually the lack of AV1 decode and encoders that hurts it more in that market. Considering it still seems to have a full decode block for other formats, the lack of AV1 decode may be a hardware bug.

When everything looks like penny pinching in the extreme, I'm not sure why you would think the lack of AV1 decode is a bug. It sounds right in line with the extreme penny pinching. AV1 decode would have cost a few pennies' worth of extra transistors (they are probably using the older previous-gen media decode block, which uses fewer transistors).
 

gruffi

Member
Nov 28, 2014
35
117
106
I don't know why some people mock the card. The card is fine as it is. It's an ENTRY card for OEM systems or casual gamers. Why should it have 8 GB of VRAM or encoders you actually wouldn't use with such a card anyway? If you want that, then go with the 6600 series. The RX 6500 XT / 6400 are replacing the RX 5300, which had only 3 GB of VRAM and half the performance of the RX 6500 XT. For me the only downside of the RX 6500 XT is the TDP. It should have been possible to do such a card within 75W and without an additional power connector. Even if it lost 5-10% performance I wouldn't care. It would have been a worthy upgrade for my current 75W card.
 

Shivansps

Diamond Member
Sep 11, 2013
3,855
1,518
136
When everything looks like penny pinching in the extreme, I'm not sure why you would think the lack of AV1 decode is a bug. It sounds right in line with the extreme penny pinching. AV1 decode would have cost a few pennies' worth of extra transistors (they are probably using the older previous-gen media decode block, which uses fewer transistors).

I would like to think that no one sane would have designed an RDNA 2 GPU for a 2021/2022 launch without AV1 decode.

But I don't know anymore; nothing about this GPU makes sense to me. As an entry level product, something you use to replace the already out-of-production RX 550 on desktop and in workstations against the Nvidia T400 and T600 offerings, the encoder and AV1 decode are far more important than the PCIe bandwidth and the 4GB of VRAM.

In laptops, especially low end laptops, limiting it to x4 4.0 makes no sense; it means this GPU should not be used with Cezanne/Barcelo. And as for Rembrandt (RMB), a low end RMB does not exist, unless you would call the 6600H low end; I'm pretty sure it is still a premium SKU.

Then, if the card is not connected to the monitor, it means that when gaming everything has to go through the already saturated PCIe x4, and I'm really not sure that is going to be any faster than the 12 CU RMB alone, which is likely cheaper and lower power than a 6600H + Navi 24. So this card does not seem to be designed to be used with AMD APUs, just Intel ones.

If the GPU is connected to the monitor, it means AV1 decode and all encode work have to be done on the IGP, which means having two GPUs partially active, not one, doing constant memcopy; that is just terrible for thermals and battery.

¯\_(ツ)_/¯
 

jpiniero

Lifer
Oct 1, 2010
14,629
5,247
136
I guess the reality check is close enough. Nvidia is poised to launch the RTX 3050:
  • $249 MSRP
  • 8GB VRAM
  • 5th gen Decoder and 7th gen Encoder, basically everything it needs
Reviews on the 26th, one day before launch.

The 3050 is not a comparable product. The real price difference is going to be way more than $50. You know that; I'm not even sure why you are mentioning it.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,498
20,622
146
I guess the reality check is close enough. Nvidia is poised to launch the RTX 3050:
  • $249 MSRP
  • 8GB VRAM
  • 5th gen Decoder and 7th gen Encoder, basically everything it needs
Reviews on the 26th, one day before launch.
MSRP? We're still playing make-believe? It has 8GB of RAM; scalpers and miners will drive the prices up to $400+.
 
  • Like
Reactions: Tlh97

Heartbreaker

Diamond Member
Apr 3, 2006
4,228
5,228
136
The 3050 is not a comparable product. The real price difference is going to be way more than $50. You know that; I'm not even sure why you are mentioning it.

Of course it's not comparable. The 3050 is not a penny-pinched piece of garbage.

So yeah, it's completely unfair to compare the 6500XT to something that isn't garbage. :D
 
  • Haha
Reactions: DeathReborn

Heartbreaker

Diamond Member
Apr 3, 2006
4,228
5,228
136
I would like to think that no one sane would have designed an RDNA 2 GPU for a 2021/2022 launch without AV1 decode.

But I don't know anymore; nothing about this GPU makes sense to me. As an entry level product, something you use to replace the already out-of-production RX 550 on desktop and in workstations against the Nvidia T400 and T600 offerings, the encoder and AV1 decode are far more important than the PCIe bandwidth and the 4GB of VRAM.

In laptops, especially low end laptops, limiting it to x4 4.0 makes no sense; it means this GPU should not be used with Cezanne/Barcelo. And as for Rembrandt (RMB), a low end RMB does not exist, unless you would call the 6600H low end; I'm pretty sure it is still a premium SKU.

Then, if the card is not connected to the monitor, it means that when gaming everything has to go through the already saturated PCIe x4, and I'm really not sure that is going to be any faster than the 12 CU RMB alone, which is likely cheaper and lower power than a 6600H + Navi 24. So this card does not seem to be designed to be used with AMD APUs, just Intel ones.

If the GPU is connected to the monitor, it means AV1 decode and all encode work have to be done on the IGP, which means having two GPUs partially active, not one, doing constant memcopy; that is just terrible for thermals and battery.

It was designed to pair with Rembrandt APUs, which have full media blocks and PCIe 4.0. Also bear in mind that they will likely drop the clocks and performance a lot for laptops to get the power down for lower end laptop usage, and then x4 PCIe really is less of an issue. Maybe even in PCIe 3.0 systems.


It's almost like it was designed only for laptops, and then they saw how high it could clock and decided to make a discrete card out of it. But then it becomes problematic with the weak media block and only x4 PCIe.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,899
5,835
136
I don't know why some people mock the card. The card is fine as it is. It's an ENTRY card for OEM systems or casual gamers. Why should it have 8 GB of VRAM or encoders you actually wouldn't use with such a card anyway? If you want that, then go with the 6600 series. The RX 6500 XT / 6400 are replacing the RX 5300, which had only 3 GB of VRAM and half the performance of the RX 6500 XT. For me the only downside of the RX 6500 XT is the TDP. It should have been possible to do such a card within 75W and without an additional power connector. Even if it lost 5-10% performance I wouldn't care. It would have been a worthy upgrade for my current 75W card.

Because it's supposed to be a budget card but is enormously handicapped when running on a budget user's platform (e.g. one with PCIe 3.0).
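To put a number on that, here's a minimal sketch of the theoretical per-direction link bandwidth (the transfer rates and 128b/130b encoding are just the standard PCIe spec values, not anything measured on this card):

```python
# Theoretical per-direction bandwidth of an x4 PCIe link.
# 8 GT/s (Gen3) and 16 GT/s (Gen4) per lane with 128b/130b line encoding;
# real-world throughput is lower still due to protocol overhead.
GT_PER_LANE = {"3.0": 8.0, "4.0": 16.0}  # giga-transfers per second
ENCODING = 128 / 130                      # 128b/130b efficiency

def x4_link_gb_s(gen: str, lanes: int = 4) -> float:
    return GT_PER_LANE[gen] * ENCODING * lanes / 8  # GB/s

for gen in ("4.0", "3.0"):
    print(f"PCIe {gen} x4: ~{x4_link_gb_s(gen):.1f} GB/s")

# Prints roughly:
#   PCIe 4.0 x4: ~7.9 GB/s
#   PCIe 3.0 x4: ~3.9 GB/s
```

So on a PCIe 3.0 board the already narrow link is cut in half again, and that bites hardest exactly when the 4GB of VRAM overflows and traffic spills onto the bus.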
 

maddie

Diamond Member
Jul 18, 2010
4,749
4,691
136
Love the hate. Hope it continues. Only way to keep prices lower in this market. Planning to get one for a new parts build priced as low as possible and still relevant.

In any case, is there a mandate that I missed to buy this card?
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,498
20,622
146
But then it becomes problematic with the weak media block.
For DIYers that is nothing more than marketing dept. propaganda. E.g., name a CPU so weak that it cannot handle all the media duties a card like the GTX 1650 can. And when you pick one, let's go a step further and see how well it handles games. Because if it can't do media duty, it ain't likely to be well-rounded for modern gaming either. And we are discussing a card for 1080p, a CPU-heavy resolution.
If it sells for $400 that would be about $50 more than the 6500XT is selling for on eBay tbh.
You can buy a 6500XT new for $270 right now. eBay, because of fees, will drive the price of the 3050 even higher. Literally whatever the market will bear, and then some.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,498
20,622
146
Because it's supposed to be a budget card but is enormously handicapped when running on a budget user's platform (e.g. one with PCIe 3.0).
Show me 5 games, just 5, where it isn't faster than a 1650.

And I could build a BNIB budget gamer with it, a 12100F, and a B660 board, and match the price of a 5600G alone.