Article [PCGH via Neowin] PCIe 3.0 could be crippling AMD's RX 5500 XT performance

Hitman928

Diamond Member
Apr 15, 2012
5,313
7,961
136


PCGH is the source of the benchmarks, but since it's not in English, I'm linking a write-up on the findings as well.

This story popped up in my feed, not too sure what to think of it yet and don't have much free time to really dig into it but very interesting if true. Possibly could be a driver issue at work? I don't know, it seems like PCIe3 8x shouldn't be this restricting but maybe it is.
 

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136


PCGH is the source of the benchmarks, but since it's not in English, I'm linking a write-up on the findings as well.

This story popped up in my feed, not too sure what to think of it yet and don't have much free time to really dig into it but very interesting if true. Possibly could be a driver issue at work? I don't know, it seems like PCIe3 8x shouldn't be this restricting but maybe it is.
Really interesting. The idea that you could increase performance 40-100% by going with an 8GB variant on PCIe4 over a 4GB variant on PCIe3 is remarkable for a card that theoretically shouldn't be constrained by PCIe3 bandwidth. After all, a 2080 Ti can push out twice as many frames on PCIe3. So why not a 5500XT?

Seems like a probable driver issue or some other problem extrinsic to the Navi chip.
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
Really interesting. The idea that you could increase performance 40-100% by going with an 8GB variant on PCIe4 over a 4GB variant on PCIe3 is remarkable for a card that theoretically shouldn't be constrained by PCIe3 bandwidth. After all, a 2080 Ti can push out twice as many frames on PCIe3. So why not a 5500XT?

Seems like a probable driver issue or some other problem extrinsic to the Navi chip.

it's limited to x8, while the 2080 (and most cards) is x16

and as you can see it's not just about the card being faster, the 4GB version is far more affected because it has to load stuff from ram more often than a card with more vram
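As a rough sanity check on the numbers being thrown around, here's a small sketch (my own, not from PCGH) of the theoretical one-way link bandwidth per PCIe generation and lane count, using the transfer rates and line encodings from the PCIe specs:

```python
# Rough PCIe link-bandwidth calculator (theoretical maxima,
# accounting only for line-encoding overhead).
#   Gen2: 5 GT/s,  8b/10b encoding   -> 0.5 GB/s per lane
#   Gen3: 8 GT/s,  128b/130b encoding -> ~0.985 GB/s per lane
#   Gen4: 16 GT/s, 128b/130b encoding -> ~1.97 GB/s per lane
GENS = {
    2: (5.0, 8 / 10),
    3: (8.0, 128 / 130),
    4: (16.0, 128 / 130),
}

def link_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Usable one-direction bandwidth in GB/s for a PCIe link."""
    gt_per_s, encoding = GENS[gen]
    # GT/s * encoding efficiency = Gbit/s per lane; divide by 8 for GB/s
    return gt_per_s * encoding / 8 * lanes

for gen, lanes in [(3, 8), (3, 16), (4, 8), (2, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: {link_bandwidth_gbps(gen, lanes):.2f} GB/s")
```

So the 5500 XT's PCIe 3.0 x8 link (~7.9 GB/s) has roughly the bandwidth of an old PCIe 2.0 x16 slot, while the same x8 link on PCIe 4.0 doubles that to ~15.8 GB/s.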
 

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
it's limited to x8, while the 2080 (and most cards) is x16

and as you can see it's not just about the card being faster, the 4GB version is far more affected because it has to load stuff from ram more often than a card with more vram
That makes sense, I totally missed the statement about x8. That's interesting - is that an OEM request carryover that they just kept the same on the retail cards?
 

DeathReborn

Platinum Member
Oct 11, 2005
2,746
741
136
The 5700XT sees a negligible (1-2%) performance difference between PCIe 4.0 x16 & PCIe 2.0 x16 so I doubt that the PCIe 3.0 x8 link is a limitation for performance at all.

 
  • Like
Reactions: Mopetar

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
The 5700XT sees a negligible (1-2%) performance difference between PCIe 4.0 x16 & PCIe 2.0 x16 so I doubt that the PCIe 3.0 x8 link is a limitation for performance at all.

The 5700XT does have twice the bandwidth, so maybe it's true? Maybe PCIe scaling should be tested on lower end cards rather than higher end ones.

A simple way to test is by seeing how it scales when VRAM clocks are changed.
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
they both can be right; the problem with the TPU one is that they seem to use their regular test suite, while the other test was more focused
 

VirtualLarry

No Lifer
Aug 25, 2001
56,348
10,048
126
How much of this issue is "real world" (affecting games and/or compute apps), versus theoretical ("power-virus-type benchmarks")?

And why is the RX 5500 (XT) affected, but not the RX 5700 (XT)?
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
How much of this issue is "real world" (affecting games and/or compute apps), versus theoretical ("power-virus-type benchmarks")?

And why is the RX 5500 (XT) affected, but not the RX 5700 (XT)?

Probably because the 5500XT only has an x8 PCI-e link and 4GB of memory.
It seems that when the game needs more than 4GB of memory and has to access main system RAM is when we see the biggest gains from PCI-e Gen 4.
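A back-of-the-envelope sketch of why that spillover hurts so much (the per-frame overflow figure below is a made-up assumption for illustration, not a measured number):

```python
# Frame-time cost of streaming spilled assets over PCIe when a game
# overflows the 4GB VRAM. Bandwidth figures are approximate usable
# one-way maxima; the overflow traffic per frame is hypothetical.
PCIE3_X8_GBPS = 7.88   # ~PCIe 3.0 x8
PCIE4_X8_GBPS = 15.75  # ~PCIe 4.0 x8

def transfer_ms(megabytes: float, link_gbps: float) -> float:
    """Milliseconds to move `megabytes` over a link of `link_gbps` GB/s."""
    return megabytes / 1024 / link_gbps * 1000

overflow_mb = 200  # hypothetical spilled data per frame
for name, bw in [("PCIe 3.0 x8", PCIE3_X8_GBPS), ("PCIe 4.0 x8", PCIE4_X8_GBPS)]:
    print(f"{name}: {transfer_ms(overflow_mb, bw):.1f} ms per frame")
```

With numbers like these the bus transfer alone eats more than a 60 fps frame budget (16.7 ms) on PCIe 3.0 x8, and PCIe 4.0 halves that cost, which would fit the large gains PCGH saw on the 4GB card.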
 

PhonakV30

Senior member
Oct 26, 2009
987
378
136
Probably because the 5500XT only has an x8 PCI-e link and 4GB of memory.
It seems that when the game needs more than 4GB of memory and has to access main system RAM is when we see the biggest gains from PCI-e Gen 4.

I think he meant the 5500XT 8GB vs the 5700. It's odd that the 8GB card indeed benefits from PCIe 4 while the 5700 doesn't.
 

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
Hardware Unboxed did a little video covering it.


Not exactly conclusive I think, although it does paint AMD in a bad light both for the x8 choice and no cheaper PCIe 4.0 motherboards. Bean counters overruling engineers maybe?

So basically AMD cheaped out, and that cheaping out cripples the performance in some games on their own platform, whereas it works just fine on Intel's PCIe 3.0 platform.*


* albeit the AMD test was done on X570 set to PCIe 3.0. Maybe that doesn't work as nicely? I think this needs to be repeated on 400 and 300 series AMD boards.
 
  • Like
Reactions: Ranulf

Campy

Senior member
Jun 25, 2010
785
171
116
Should only be a problem when 4GB VRAM is not sufficient and the card needs to swap out what's in VRAM a lot. I think the problem can be avoided by lowering settings to get under 4GB VRAM usage.
 
  • Like
Reactions: guachi

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
Should only be a problem when 4GB VRAM is not sufficient and the card needs to swap out what's in VRAM a lot. I think the problem can be avoided by lowering settings to get under 4GB VRAM usage.

BF5 also shows a lot of improvement with the 8GB card. Seems like it simply needs more bandwidth than other games.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
So basically AMD cheaped out, and that cheaping out cripples the performance in some games on their own platform, whereas it works just fine on Intel's PCIe 3.0 platform.*

Well it seems to me that whatever deficiency the AMD platform has in gaming is covered by using PCIe 4.0. At least for the 5500XT.

How does it work between two platforms on higher end cards?
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,819
7,177
136
This whole thing comes across as a really weird excuse for the 5500xt's performance.

I mean, isn't it performing exactly where AMD said it would (20-30% faster than reg 1650)?

Best argument against this theory is why would AMD not market some mountain of hidden performance in this thing or run their tests on a PCI-e 4.0 system?
 
  • Like
Reactions: maddie

joesiv

Member
Mar 21, 2019
75
24
41
Seems like AMD is leaving a lot of price flexibility in the RX 5500. AMD tends to drop prices over time, so I guess that's good. Hopefully there is a small price adjustment when the RX 5600 comes out.
 

H T C

Senior member
Nov 7, 2018
555
396
136
Copying a reply i gave in another topic:

There is no weirdness.

What happens is that in ALL PCIe 3.0 boards, the 5500XT works @ x8 PCIe 3.0, while with X570 boards it works @ x8 PCIe 4.0. The difference stems from the fact that boards with PCIe 3.0 have HALF the bandwidth of X570 boards, which have PCIe 4.0.

Effectively, when you place ANY PCIe 4.0 card in a non-PCIe 4.0 board, you're bottlenecking the card's true performance. In the case of the 5700(XT) card(s) you don't notice it much because of the VRAM present on the card, but you may notice it in SOME games if you use a 5500(XT) card.

@uzzi38 Continuing the other topic discussion.

But it shouldn't have seen one at all. See the problem?

In games where there was a noticeable amount of CPU-GPU communication, it saw a difference even though it shouldn't have.

All the PCIe bandwidth tests (that i know of) have been done with cards that have 8+GB VRAM, such as the 5700XT, the 1080Ti and 2080Ti.


It's quite likely Navi 12 is aimed at workstations in an mGPU setup. Komachi doesn't think it's a mobile part, so considering the low clocks and HBM2 it makes sense.

If that's what it's for, surely you can see how such an errata would need to be fixed, right?

No idea about this.
 

NTMBK

Lifer
Nov 14, 2011
10,239
5,025
136
When AMD planned this product several years ago, they probably assumed that Intel wouldn't completely screw up their roadmap, and that we would all be installing Navi in PCIe 4 motherboards. It's not their fault that we're stuck with the same architecture that Polaris was targeting!
 
  • Like
Reactions: lightmanek

DeathReborn

Platinum Member
Oct 11, 2005
2,746
741
136
When AMD planned this product several years ago, they probably assumed that Intel wouldn't completely screw up their roadmap, and that we would all be installing Navi in PCIe 4 motherboards. It's not their fault that we're stuck with the same architecture that Polaris was targeting!

It's not like AMD has enabled PCIe 4.0 on sub-570 motherboards; they also designed a card that would most likely be bought by those upgrading from an older card and an older system that doesn't have PCIe 4.0.
 

Bouowmx

Golden Member
Nov 13, 2016
1,138
550
146
I recently noticed my GeForce GT 730 is PCIe 2.0 x8.

When these PCIe x8 GPUs get put on an expansion card, the fingers are still x16 length. [A real PCIE x8 card] So they don't gain motherboard compatibility. :confused: Are board partners lazy? Do buyers think an x8-length card is inferior?
 

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
When AMD planned this product several years ago, they probably assumed that Intel wouldn't completely screw up their roadmap, and that we would all be installing Navi in PCIe 4 motherboards. It's not their fault that we're stuck with the same architecture that Polaris was targeting!

Nope. Can't blame Intel on this one. The video linked in this thread shows that it in fact performs just as well on a 9900K as on a 3700X with PCIe 4.0. The issue happens when running X570 in PCIe 3.0 mode. However, the video does not also test X470 or X370. Maybe it works just fine on those.