GodisanAtheist
> 9070XT is solidly a side-grade from the 3090Ti. If you highly value NV features, a 10% improvement in performance is not nearly enough to offset the lack of DLSS etc., and could reasonably be construed as a downgrade.

Does cost factor in? I considered the 2080 a side grade from my 1080Ti at the same MSRP, but I probably would've bought it at 1/3 the cost.
> FSR4 while good is not nearly as broadly adopted as DLSS.
View attachment 131800
Again: that chart is fake news for old vs. new. It has the Arc A380 as 20% slower than a GTX 780 Ti. The A380 has current driver support, full DX12 support, hardware ray tracing and upscaling, twice the VRAM, and much better media encode/decode. Yet an old card with 3 GB of VRAM and DX12 feature level 11_1 is supposed to be 20%, i.e. a whole tier, faster. It's laughable, really.
> FSR4 while good is not nearly as broadly adopted as DLSS.

Toggle FSR4 on in the driver app, and most games with FSR 3.1 will use it. It can also be injected into some games that have DLSS using OptiScaler, but that is no bueno for anti-cheat or Vulkan.
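For readers unfamiliar with the injection step: OptiScaler is typically dropped into a game's binary folder under the name of a DLL the game already loads, so its DLSS calls get redirected. The sketch below only simulates that file shuffle in a scratch directory; the game path, the empty stand-in file, and the choice of dxgi.dll as the proxy name are all assumptions for illustration, so check OptiScaler's own documentation for the real procedure.

```python
# Illustrative sketch only: simulate the OptiScaler drop-in workflow
# in a temp directory so this runs as-is. No real game is involved.
from pathlib import Path
import tempfile

game_dir = Path(tempfile.mkdtemp()) / "SomeGame" / "bin"  # hypothetical install path
game_dir.mkdir(parents=True)

# Step 1: copy the OptiScaler release files next to the game executable
# (faked here with an empty file).
(game_dir / "OptiScaler.dll").touch()

# Step 2: rename it to a proxy DLL name the game already loads;
# dxgi.dll is one commonly used choice (an assumption here).
(game_dir / "OptiScaler.dll").rename(game_dir / "dxgi.dll")

print(sorted(p.name for p in game_dir.iterdir()))  # ['dxgi.dll']
```

As the post notes, this kind of DLL proxying is a bad idea in anti-cheat-protected games and does not cover Vulkan titles.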
Though the 30 series isn't nearly as old for a compare and contrast, driver optimizations for it are starting to fade, which fits the pattern of how Nvidia has handled drivers over the years. I read complaints on Reddit subs (not the Nvidia sub, though; according to affected owners, posts there get deleted) from owners showing that they have to use older drivers or certain titles they play run up to 20% slower than on the newest ones.
Hardware Lab and Iceberg Tech did a good video debunking relative performance a couple of years back. As the years go by, you can see the 290X start to spank the 780 Ti; in Assetto Corsa it's almost double the performance. Yet they sit next to each other on the TPU chart. The GTX even fails to render games like Shadow of the Tomb Raider and Red Dead Redemption 2 correctly.
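To make the objection concrete, here is a minimal sketch of how a single "relative performance" number is usually built (a geometric mean of per-game FPS ratios) and how it can report a near-tie while individual games diverge wildly. Every FPS figure below is invented for illustration, not a measurement:

```python
# Hypothetical FPS numbers (invented, not benchmarks) for an R9 290X
# vs. a GTX 780 Ti across three titles.
fps = {
    "Assetto Corsa": {"290X": 95.0, "780 Ti": 50.0},   # 290X nearly 2x
    "older title A": {"290X": 45.0, "780 Ti": 62.0},   # 780 Ti well ahead
    "older title B": {"290X": 46.0, "780 Ti": 64.0},   # 780 Ti well ahead
}

def geomean(values):
    """Geometric mean, the usual way per-game ratios get aggregated."""
    product = 1.0
    for v in values:
        product *= v
    return product ** (1.0 / len(values))

ratios = {name: g["290X"] / g["780 Ti"] for name, g in fps.items()}
overall = geomean(list(ratios.values()))

# The aggregate comes out near 1.0x ("tie") even though no single
# game in this invented set is actually close.
print(f"aggregate: {overall:.2f}x")
for name, r in ratios.items():
    print(f"  {name}: {r:.2f}x")
```

The point is not these particular numbers; it's that a chart position encodes an average over a game list chosen years ago, which says little about the games you actually play, or about titles the older card no longer renders correctly at all.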
> you got a 3090Ti for $2500 back in 2021

If you check the dates, it was spring 2022 as I recall.
> That's all well and good

It's also the straight dope. The sooner people stop throwing that fake news chart around, the better, especially tech tubers like Daniel-san; he is among the worst offenders. He can test whatever hardware he wants to base his content around; leaning on that chart is just lazy and does his audience a disservice.
> And I would say the 5070 Ti and RX 9070 XT at MSRP (seemingly prices have fallen to where they should be) are the best "value" parts since the RTX 3080.

That's a good analogy IMO.
> Yeah and don't get one looking for a huge performance bump if you already have a 3090ti...

Compared to a similarly priced GA102 chip, it is quite an upgrade.
But if you have a 3090 Ti, neither AMD nor Nvidia has anything at 1/3rd the price that's much of an upgrade.
It's a weird comparison because there is only one part available that's an actual significant upgrade. AMD is not making a 5090 competitor; they can't justify making chips that big. But they have had parts generally faster than the 3090 Ti since December 2022, mere months after its release...
> If you have a 3090 TI from 2022 then you're not in the market for a $600-$750 part... the 5090 is your upgrade path and you've demonstrated that you are willing to drop $2000+ on a card.

No, he asked why AMD doesn't have a 5090 competitor. That's a fair question, but the answer should be obvious from the requisite die size and market share... a part that big is never making a return on investment.
This was all kicked off thanks to @Stg-Flame saying his 9070xt feels like a downgrade from a 3090ti.
The other part was that the 9070 XT looks like a downgrade compared to his 3090 Ti. It is not, except in memory capacity, and AMD did it at a much lower cost.
> The only significant upgrade from a 3090 Ti is the 5090 and it costs 5090 money. AMD isn't alone here. There is no logical upgrade from anyone if you are not willing to spend 5090 money.

Right, it's a side grade, and it doesn't make sense to go from a 3090ti to a 9070xt or 5070ti.
It only makes sense if you can somehow sell your 3090ti for more money than the newer parts cost, which is feasible with how people value used NV parts.
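The arithmetic behind that: the real cost of a side-grade is the new card's price minus what the old card fetches used. The prices below are made up purely for illustration:

```python
# Hypothetical numbers for a 3090 Ti -> 9070 XT / 5070 Ti swap once the
# old card is sold. All prices invented; none are market quotes.
def net_upgrade_cost(new_price, resale_value):
    """Out-of-pocket cost of the swap; negative means you pocket money."""
    return new_price - resale_value

# If used NV parts hold value well, the side-grade can be free or better.
print(net_upgrade_cost(new_price=700, resale_value=750))   # -50
print(net_upgrade_cost(new_price=700, resale_value=550))   # 150
```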
Hope none of you were waiting on those
Buy whatever you can... whatever is available...
AMD & NVIDIA Could Kill Off Budget GPUs as Memory Shortages Drive Costs Up, Leaving Entry-Level Gamers With Little Options
Muhammad Zuhair, Nov 18, 2025 at 11:50am EST
AMD/NVIDIA Could Halt Budget GPU Production As Memory Shortages Force Them to Reallocate Capacity
It appears that DRAM shortages are going to affect gamers on a much larger scale than just RAM supply. According to a report by the Korean media outlet Hankyung, AMD and NVIDIA are rumored to be looking to "discontinue" budget-oriented GPUs, since their BOM (bill of materials) has risen dramatically with higher GDDR module prices. The report doesn't specifically mention which SKUs could be affected by this move.
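A toy bill-of-materials model shows why rising memory prices would hit budget SKUs first: VRAM is a bigger share of a cheap card's cost, so the same per-GB increase wipes out a bigger share of its margin. Every number below is invented for illustration; real BOM figures are not public:

```python
# Invented BOM model: how a GDDR price jump squeezes gross margin at
# two price points. No figure here is a real BOM number.
def margins(msrp, non_memory_bom, vram_gb, per_gb_before, per_gb_after):
    """Gross margin fraction before and after a memory price spike."""
    before = (msrp - (non_memory_bom + vram_gb * per_gb_before)) / msrp
    after = (msrp - (non_memory_bom + vram_gb * per_gb_after)) / msrp
    return before, after

budget = margins(msrp=250, non_memory_bom=140, vram_gb=8,
                 per_gb_before=4, per_gb_after=10)
high_end = margins(msrp=1000, non_memory_bom=600, vram_gb=16,
                   per_gb_before=4, per_gb_after=10)

# With these invented inputs, the budget card loses roughly twice as
# many margin points as the high-end one.
print(f"budget:   {budget[0]:.0%} -> {budget[1]:.0%}")
print(f"high end: {high_end[0]:.0%} -> {high_end[1]:.0%}")
```

On numbers like these, the cheap card's margin can go from thin to untenable, which is consistent with the rumor that entry-level SKUs would be the first to be cut.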
AMD & NVIDIA Might Kill Off Budget GPUs Soon, And Gamers Won’t Like the Reason
This isn't good news for PC gamers at all, as a new report suggests that GPU manufacturers may halt production of budget GPUs. (wccftech.com)
- I'm so, so tired boss...
> So help me.. I'm debating buying a 9060 16gb or even a 9070.

I was in the same situation, so I decided to get an RX 9070 before the new year. Who knows what happens in 2027 with RDNA5/Rubin prices and memory.
> Maybe people will stop complaining about 8 GB cards if there are none.

Not the cards, the prices. Current pricing will seem downright reasonable with the direction things are heading.
> So help me.. I'm debating buying a 9060 16gb or even a 9070.

If the dark times return yet again, you might end up getting back all of your money, or even more than you paid, for your present GPU by selling it next year, helping defray or cover the cost of a new 90-series card. Next week could have the best prices we will see for a long while. Is DDR pricing the first domino?
