Discussion Radeon 6500XT and 6400


GodisanAtheist

Diamond Member
Nov 16, 2006
6,719
7,016
136
Just getting a thread up for this. Navi 24 will ride... Q1 2022... ish.


I figure the 6500XT lands in ~5500XT territory for ~$200.
 
  • Like
Reactions: Tlh97

AtenRa

Lifer
Feb 2, 2009
14,000
3,357
136

Jason from PC Builder did a "Boost My Build" video; see his reaction to builds that include an RX 6500.


He swapped the RX 6500 XT for an RX 580 4GB at the same price; that is the worst advice he could give.

6500XT vs RX580 @ PCIe 3.0 = equal performance
6500XT vs RX580 @ PCIe 3.0 = half the power
6500XT vs RX580 @ PCIe 3.0 = newer design, so it will be supported longer.

 
  • Like
Reactions: DAPUNISHER

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
He swapped the RX 6500 XT for an RX 580 4GB at the same price; that is the worst advice he could give.

6500XT vs RX580 @ PCIe 3.0 = equal performance
6500XT vs RX580 @ PCIe 3.0 = half the power
6500XT vs RX580 @ PCIe 3.0 = newer design, so it will be supported longer.


The problem with the RX 6400/RX 6500 XT over PCIe is that there are some situations, especially due to texture streaming, where Navi 24 just dies on PCIe 3.0. In the RDR2 benchmark, when Arthur runs out of the store and shoots the first guard on the street, the GPU has to completely swap all the assets on screen, and the FPS drops to nothing there on 3.0. This will only get worse moving forward.

That said, getting a Polaris card today is also a very bad idea.

I would love to see some x16 3.0 to x8 4.0 adapters, but I guess they would be expensive.
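
As a rough sketch of why the x4 link hurts so much here (my own back-of-the-envelope numbers, assuming the standard per-lane transfer rates and only accounting for 128b/130b line coding, not packet overhead):

```python
# Rough PCIe host-link bandwidth estimate (one direction, after 128b/130b line coding).
# Assumes standard per-lane transfer rates; ignores packet/protocol overhead.

PER_LANE_GT_S = {"3.0": 8.0, "4.0": 16.0}  # giga-transfers per second per lane
ENCODING = 128 / 130                        # 128b/130b line coding (Gen 3 and up)

def link_bandwidth_gb_s(gen: str, lanes: int) -> float:
    """Approximate one-direction bandwidth in GB/s for a PCIe link."""
    return PER_LANE_GT_S[gen] * ENCODING * lanes / 8  # bits -> bytes

for gen, lanes in [("3.0", 4), ("4.0", 4), ("3.0", 16)]:
    print(f"PCIe {gen} x{lanes}: {link_bandwidth_gb_s(gen, lanes):.1f} GB/s")

# PCIe 3.0 x4:  ~3.9 GB/s   <- what Navi 24 gets on an older board
# PCIe 4.0 x4:  ~7.9 GB/s   <- what it was designed around
# PCIe 3.0 x16: ~15.8 GB/s  <- plenty of upstream bandwidth for such an adapter
```

So once textures spill over the 4GB buffer, the card on a 3.0 board is streaming them through half the host bandwidth it was designed for, which is exactly when those FPS cliffs show up.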
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,000
3,357
136
The problem with the RX 6400/RX 6500 XT over PCIe is that there are some situations, especially due to texture streaming, where Navi 24 just dies on PCIe 3.0. In the RDR2 benchmark, when Arthur runs out of the store and shoots the first guard on the street, the GPU has to completely swap all the assets on screen, and the FPS drops to nothing there on 3.0. This will only get worse moving forward.

That said, getting a Polaris card today is also a very bad idea.

I would love to see some x16 3.0 to x8 4.0 adapters, but I guess they would be expensive.

Where did you see that?
From the video below I don't see that much of an FPS drop at that particular point in the benchmark.

Even so, in the few games that exhibit FPS drops in specific areas of the game, just change the settings so the game doesn't need more than 4GB.

 
  • Like
Reactions: DAPUNISHER

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
Where did you see that?
From the video below I don't see that much of an FPS drop at that particular point in the benchmark.

Even so, in the few games that exhibit FPS drops in specific areas of the game, just change the settings so the game doesn't need more than 4GB.


From my own testing: both the RX 6400 and the RX 6500 XT did that in PCIe 3.0 mode.
 

Glo.

Diamond Member
Apr 25, 2015
5,661
4,419
136
Core i3-12100F, 8 GB 3200 MHz CL16, RX 6400 4 GB, Pop!_OS 22.04 LTS:

Overwatch 1080p: Medium preset, 200 FPS average; High preset, 180 FPS average.

I'm surprised. I was expecting much, much less from this GPU.
 
  • Like
Reactions: DAPUNISHER

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
What hardware did you use?

An H510 build with 16GB RAM, an NVMe drive, and a 10100F/11400. In both cases the 10100F had that issue, and in both cases swapping the CPU for the 11400 fixed the problem.

I also tested the 6500XT on a 4700S board and the problem was way worse.
 

AtenRa

Lifer
Feb 2, 2009
14,000
3,357
136
An H510 build with 16GB RAM, an NVMe drive, and a 10100F/11400. In both cases the 10100F had that issue, and in both cases swapping the CPU for the 11400 fixed the problem.

I also tested the 6500XT on a 4700S board and the problem was way worse.

Perhaps the combination of the Core i3-10100F with H510 is the problem, because I can see similar behavior in the video below with a 10100F + GTX 1650 on an H410.

1080p High settings

 
Last edited:
  • Like
Reactions: DAPUNISHER
Jul 27, 2020
15,759
9,822
106
Core i3-12100F, 8 GB 3200 MHz CL16, RX 6400 4 GB, Pop!_OS 22.04 LTS:

Overwatch 1080p: Medium preset, 200 FPS average; High preset, 180 FPS average.

I'm surprised. I was expecting much, much less from this GPU.
That's the CPU's great ST performance helping your GPU, a lot, since it's 1080p. I'm guessing your CPU gets at least a 1600 Geekbench ST score.
 
  • Like
Reactions: psolord

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,274
19,922
146
Here is a realistic budget build with the 6500XT. Using a 12100 and Quick Sync, he is able to play and stream perfectly fine. Even Halo Infinite is a good time on it. Stop being part of the groupthink lynch mob, take a minute to spec a proper build, and the card crushes the 1650 and 1050 Ti it is favorably priced against.


Big reviewers did a hit job on it. Aussie Steve was shameless enough to pull the "I have to film the monitor because it can't do video capture hurr durr!" routine. Maybe if Steve played something other than Fortnite on high-end hardware, he'd know you can record with Afterburner etc. :p My favorite hit-job talking point from reviews is intentionally exceeding the 4GB frame buffer to show how bad the bandwidth is on 3.0. "Look guyz!1! even the 1050ti is doing better in this game. Derp."
 

DeathReborn

Platinum Member
Oct 11, 2005
2,743
734
136
Here is a realistic budget build with the 6500XT. Using a 12100 and Quick Sync, he is able to play and stream perfectly fine. Even Halo Infinite is a good time on it. Stop being part of the groupthink lynch mob, take a minute to spec a proper build, and the card crushes the 1650 and 1050 Ti it is favorably priced against.


Big reviewers did a hit job on it. Aussie Steve was shameless enough to pull the "I have to film the monitor because it can't do video capture hurr durr!" routine. Maybe if Steve played something other than Fortnite on high-end hardware, he'd know you can record with Afterburner etc. :p My favorite hit-job talking point from reviews is intentionally exceeding the 4GB frame buffer to show how bad the bandwidth is on 3.0. "Look guyz!1! even the 1050ti is doing better in this game. Derp."

(UK prices) They could save a decent amount of money buying a used 1070, 580 8GB, or 5500XT 8GB, all for 20-30% less than the 6500XT; a used 1660S for 10-15% less; or a used 2060/1080 for the same price. In the UK you can easily get a cheaper used card that is either faster or doesn't have the handicaps of the 6500XT.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,274
19,922
146
(UK prices) They could save a decent amount of money buying a used 1070, 580 8GB, or 5500XT 8GB, all for 20-30% less than the 6500XT; a used 1660S for 10-15% less; or a used 2060/1080 for the same price. In the UK you can easily get a cheaper used card that is either faster or doesn't have the handicaps of the 6500XT.
Haters gonna hate.

Used vs. new is not apples to apples. It is, however, the standard response to the 6500XT by reviewers: moving the goalposts instead of acknowledging it wins against the Nvidia alternatives. The 1650 and 1050 Ti are among the most popular cards on the Steam survey. The 6500XT merks them. But they are still selling better new, because gamers continue being gaslit.

I accept your info on the U.K. market; I do not have personal experience with it. And I know you have sellers that even give 2-year warranties on used cards. Very cool.

That said, I am well versed in the market here in the U.S. The 5500XT 8GB is the only one of the cards you mentioned not on the wrong end of the bathtub curve. The other 8GB cards may have been ragged out not just during this mining boom, but the last one too. That's how old they are. If it fails in weeks or months, that cash-strapped gamer, you know, the one who can ill afford to lose money on a deal, now has to start all over scraping together cash for another card. Only this time they are out $100-150 that could have gone toward buying a 6500XT with a warranty, and the peace of mind that goes with it.
 

Glo.

Diamond Member
Apr 25, 2015
5,661
4,419
136
I have a PC with i3-12100F and RX 6400, and 8 GB RAM.

And it's the best computing experience I have ever had.

Mainly because of the high-refresh-rate monitor, but still... :p I really do not understand the criticism of N24 GPUs. They are good for what they are.
 
  • Like
Reactions: DAPUNISHER

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
I really do not understand the criticism of N24 GPUs. They are good for what they are.

That right there^^

So long as you keep your expectations realistic, it's not a bad GPU. I just wish they'd kept the AV1 decoder and made an 8GB version. But 4GB GDDR6 chips are probably too expensive for this segment. If nothing else, that'd have silenced the critics quickly.
 

gdansk

Golden Member
Feb 8, 2011
1,986
2,356
136
I have a PC with i3-12100F and RX 6400, and 8 GB RAM.

And it's the best computing experience I have ever had.

Mainly because of the high-refresh-rate monitor, but still... :p I really do not understand the criticism of N24 GPUs. They are good for what they are.
I am a bit disappointed that it is completely locked down. I cannot reduce clocks or anything in the AMD control panel.

Between that, the PCIe x4 link, and the missing media blocks, it is definitely not AMD's best effort at making a good low-end card.

It is probably the best buy at specific price points in certain markets, but it easily could have been better.
 

Glo.

Diamond Member
Apr 25, 2015
5,661
4,419
136
I am a bit disappointed that it is completely locked down. I cannot reduce clocks or anything in the AMD control panel.

Between that, the PCIe x4 link, and the missing media blocks, it is definitely not AMD's best effort at making a good low-end card.

It is probably the best buy at specific price points in certain markets, but it easily could have been better.
PCIe x4 would be enough IF... the GPU had a 96-bit bus and a 6 GB VRAM buffer.

That, alongside 20 CUs (1280 ALUs), would make this a far, far better product. Especially clocked lower, to fit into a sub-75W, no-6-pin-connector envelope.
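
For a sense of scale, a quick back-of-the-envelope sketch of what that wider bus would buy (assuming the 6500 XT's 18 Gbps GDDR6; the 96-bit figure is for the hypothetical variant, not a real SKU):

```python
# Rough GDDR6 bandwidth: bus width (bits) x data rate (Gbps per pin) / 8 bits-per-byte.
# Assumes 18 Gbps memory as on the 6500 XT (the RX 6400 ships with 16 Gbps).

def mem_bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8

print(f"64-bit @ 18 Gbps: {mem_bandwidth_gb_s(64, 18):.0f} GB/s")  # ~144 GB/s, Navi 24 as shipped
print(f"96-bit @ 18 Gbps: {mem_bandwidth_gb_s(96, 18):.0f} GB/s")  # ~216 GB/s, hypothetical 6 GB variant
```

A 50% wider bus and a 6 GB buffer would also mean fewer spills over PCIe in the first place, which is where the x4 link gets exposed.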
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,274
19,922
146
The complaints are understandable, but I don't agree with them.

Let's use the 5800X3D as an example. It is a gaming CPU; if you buy it for productivity you will be disappointed, since it will underperform in most tasks compared to even its non-3D brother.

The 6400 and 6500 are similar, in that they have a target market. The 6500, as is well known, is repurposed mobile hardware. For the desktop, it was meant to unseat the 1650 and 1050 Ti as budget gaming kings. It did just that, regardless of the mis- and disinformation-based mudslinging campaigns maligning it, which always omit this important fact. Like the 3D, choose it for gaming, not other tasks. Do a smart build as linked above, and you can stream too. Afterburner and others, as noted, can record the gameplay, countering the talking point about that. As if it not having AMD's solution means there isn't one. And if your CPU can't handle AV1 decode, it ain't the card for you anyways. :D

The 6400 has models that are the fastest single-slot, low-profile, slot-powered cards out now = mission accomplished. The LP 1650 is still vastly overpriced, making it a non-starter.

And if there had been an 8GB model when it released, it would have been price gouged and scalped.