Question Will performance decrease on ASUS RTX 4070 TI Super PCIE 4.0 X8 Bus?

Orodruin

Member
Sep 30, 2020
37
4
71
Hello everyone;

My system's main components are:
AMD Ryzen 9 7900X3D
Asus ROG Strix B650E-E Gaming
Asus TUF RTX 4070 Ti Super

There are a total of 4 NVMe M.2 slots on my motherboard. Two of them support PCIe 5.0, and when you insert an SSD into the second M.2 slot, the GPU slot drops from x16 to x8 (PCIe Gen 5). How much performance will I lose as a result of this bus dropping from x16 to x8? That is what I want to find out.
My graphics card supports PCIe Gen 4; the motherboard is Gen 5.
When the bus drops from x16 to x8, that corresponds to the same bandwidth as x16 Gen 3.
I know that most graphics cards do not even use full x16 Gen 4 bandwidth, and that even Gen 3 x16 is more than enough.
But I would still be happy if those who know could respond.
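The bandwidth equivalence mentioned above can be checked with quick arithmetic: each PCIe generation doubles the per-lane transfer rate, so halving the lane count while going up one generation leaves the total link bandwidth unchanged. A small sketch (the helper name and structure are mine, not from the thread; figures use the standard 128b/130b encoding overhead for Gen 3 and up):

```python
# Effective per-lane throughput in GB/s: raw GT/s * 128/130 encoding / 8 bits per byte.
PER_LANE_GBPS = {
    3: 8 * (128 / 130) / 8,    # ~0.985 GB/s per lane
    4: 16 * (128 / 130) / 8,   # ~1.969 GB/s per lane
    5: 32 * (128 / 130) / 8,   # ~3.938 GB/s per lane
}

def link_bandwidth(gen: int, lanes: int) -> float:
    """One-direction effective bandwidth of a PCIe link, in GB/s (hypothetical helper)."""
    return PER_LANE_GBPS[gen] * lanes

print(f"Gen4 x16: {link_bandwidth(4, 16):.1f} GB/s")  # ~31.5 GB/s
print(f"Gen4 x8:  {link_bandwidth(4, 8):.1f} GB/s")   # ~15.8 GB/s
print(f"Gen3 x16: {link_bandwidth(3, 16):.1f} GB/s")  # ~15.8 GB/s, same as Gen4 x8
```

So a Gen 4 card forced to x8 on a Gen 4/5 board has exactly the bandwidth of a Gen 3 x16 link, which is the comparison the replies below lean on.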
 

BurnItDwn

Lifer
Oct 10, 1999
26,211
1,692
126
This is several years old, but the same principles/ideas are likely to hold here unless you find a more comprehensive/modern answer.

Likely you will take a small performance hit; my guess would be around 2%.

 
  • Like
Reactions: DAPUNISHER

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,885
2,862
146
Wow that is incredible and surprising, apparently reducing the PCIe bandwidth to 25% only loses 6-8% with the 4090!
 

ToTTenTranz

Senior member
Feb 4, 2021
249
462
106
Wow that is incredible and surprising, apparently reducing the PCIe bandwidth to 25% only loses 6-8% with the 4090!

I'd say a GPU with a very large framebuffer like the RTX 4090 is better able to mitigate the reduced PCIe bandwidth, as it can cache more data and needs fewer assets transferred more than once.

GPUs with a small framebuffer and reduced PCIe bandwidth are hurt the most, because they need to keep sending assets from RAM to VRAM as the latter is scrubbed constantly.

That's what made the 6500 XT 4GB pretty much useless when it released: it uses a narrow x4 PCIe bus that needs to constantly refresh the contents of a small 4GB VRAM. Not only did it often perform worse than its predecessor, the 5500 XT, but when paired with an older PCIe 3.0 motherboard (which is common for lower-end GPUs) it could lose another 35-50% of its performance.


[Attached charts: F1.png, Doom.png]
I get why AMD felt immense pressure to release a 6500 XT at the time. GPUs were still selling at rocket-high prices as the market still hadn't (still hasn't?) recovered from the crypto craze, and AMD stood to gain ridiculous margins from the tiny Navi 24 chip.
I also get that Navi 24 was probably designed with low-end laptops using newer PCIe 4.0 chipsets in mind, which is why they only implemented 4 PCIe lanes in the chip.


What I don't get is the decision to limit a $200 product to 4GB VRAM, when the similarly priced RX 480 brought 8GB VRAM six years earlier. In the end AMD launched a product no one bought and once again only damaged their brand, which is the most stupid thing to do when your market share is diving below 20%, as it was at the time.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
30,210
26,488
146
The 6400 and 6500 XT are certainly the exception, not the rule, when it comes to PCIe scaling from 4.0 to 3.0 having a big impact. Also, AMD sold a bunch of them, as HP and Lenovo offered them in desktops. The kicker is both usually configured them with Cezanne 🤣 The 6400 single-slot LP models have done okay in retail, because they occupy a niche few cards do, particularly in the $150-and-under market.

Reputation damage? Adding a new flavor of Haterade is all it really amounted to. ;) The rest of us understood any more vram and they would have been scooped up by bots and scalped like everything else with enough for Ethereum. 1050ti's were being manufactured new again, yet no one raises hell over that POS selling for as much or more than the 6500XT. Or talks about reputation damage due to it. Why is that I wonder? /s If anyone responds to that sarcastic rhetorical question anyways. Understand nothing you write will convince me the 1050ti pricing was not more egregious than the 6500XT.

That Doom Eternal bigger-bar-better chart highlights the hijinks. Neither the 6500 XT on 3.0 nor the 1050 Ti hits 60fps, but it's used to show the worst-case scenario anyways. You have to work to find a scenario where the 1050 Ti competes with it, and when you do it'll be a pyrrhic victory. Never you mind that the game is one of very few that still looks great at the next-lower texture setting; even low settings in that game look better than older games on high. But reviews bashing and trashing products get the most hits the fastest, so off we go. Reviewers were hurting bad back then, and complaining about no monies due to lack of an audience, because no one gave a crap about stuff they could not buy. So they were flogging the drama even harder than usual to get those clicks. Fortunately, a couple of reviewers took the high road and reviewed the cards through the lens of the time in which they were released, AKA the dark times.

It's always fun to watch how easily the narrative is controlled and mindshare harvested. But I was there, Gandalf, 3000 years ago, and things were much messier then than now. In today's market only the tiny 6400s still have a use case; the rest should no longer exist. At the time, they made more sense than the similarly priced 1050 Ti, particularly for Zen 2/3 (aside from Cezanne, of course) with a 500-series board, or i3 and lower-end i5 11th and 12th gen builds.

On topic: In the OP's case the 4070ti Super (what a dumb name TIE SUPER is) will do great with 3.0 x16 bandwidth.
 
  • Like
Reactions: Orodruin

Orodruin

Member
Sep 30, 2020
37
4
71
TechPowerUp did PCIe scaling tests with the 4090: https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-pci-express-scaling/29.html

Gaming experience should not change at all, i.e. you'd need benchmarks or FPS counters to even know there is a difference.
I've been using it in x8 mode for about a week and haven't noticed any performance drop. The FPS may have dropped to something like 145-148 in a game where I was already getting 150 FPS.
I don't even feel the 1-2% decrease.
 

Orodruin

Member
Sep 30, 2020
37
4
71
This is several years old, but the same principles/ideas are likely to hold here unless you find a more comprehensive/modern answer.

Likely you will take a small performance hit; my guess would be around 2%.

Yes, I have been testing it for a week; there is around 1-2% performance loss, which is very insignificant. It is not felt at all in games. This is very good. PCIe Gen 4 x8 is more than enough.
 
  • Like
Reactions: BurnItDwn

Orodruin

Member
Sep 30, 2020
37
4
71
Wow that is incredible and surprising, apparently reducing the PCIe bandwidth to 25% only loses 6-8% with the 4090!
The RTX 4090 suffers an average 2-3% performance loss at PCIe Gen 4 x8. In the TechPowerUp review, except for a few games, the general loss looked like that.