Question 4070ti vs 7900xt


Cableman

Member
Dec 6, 2017
78
73
91
It's time for me to upgrade my aging GTX1070 and I am trying to decide between the 4070ti and the 7900xt. The card will be bought through my employer ($900 is my hard limit) so those are my two options. I have a 4k 144hz monitor that I haven't been able to drive properly with the 1070.

I read and watched all the reviews I could find. The cards are basically the same price. The 7900xt beats the 4070ti in raster in pretty much every game. The weakness of the 7900xt seems to be RT, but it doesn't seem to be as bad as I expected. There are even games where it beats the 4070ti with RT on. The two worst cases I saw were Control and Cyberpunk 2077. The one variable I'm not sure about is DLSS vs FSR - do we expect FSR to gain similar traction to DLSS? Is the 7900xt missing any features?

So overall, the two are priced the same (in practice), the 7900xt is faster without RT and just a bit slower with RT on. Is there a reason to consider the 4070ti over the 7900xt?
 
  • Like
Reactions: Leeea and adamge

Ranulf

Platinum Member
Jul 18, 2001
2,348
1,165
136
The 7900 XT has another reason beyond Nvidia why its pricing makes no sense: the 7900 XTX.

The XT is only $100 less than the XTX, and has significantly worse perf/$ than the XTX.

You really need biased, clouded thinking to conclude that the XT's price makes any sense.

The XT needs a $100 price cut to compete with the 4070 Ti and to make any sense compared to the XTX.

Well yes, this is all obvious, but by your own argument the XTX also blows the 4070ti out of the water. If one is willing to spend $800+ on the new midrange Nvidia card, why not buy the XTX, which has double the RAM and is an actual 4k card?

In the end, ignore AMD GPUs (most everyone else does... rimshot) and just start comparing Ada cards to each other. Then toss in last gen. Even with inflation arguments (which truly matter for the first time in years), and ignoring crypto going away or BoM/shipping costs going up, Nvidia has a tough sell with a 192-bit, 12GB card at $800 or $700. It's a 1440p card. If the Green Marketing Machine were truthful and not high on RT Overdrive/DLSS 3 pixie-doubler dust, they would pull out the old 5600 XT's marketing line from 3 years ago and call it the "ultimate 1440p card".
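
To put rough numbers on the perf/$ point - a minimal sketch in Python, using launch list prices and placeholder relative-performance numbers (swap in figures from whichever review you trust):

Code:
# Rough perf-per-dollar sketch. Prices are launch MSRPs; the performance
# index values are placeholders, not measured results.
cards = {
    # name: (price_usd, relative_4k_raster_index)
    "RTX 4070 Ti": (799, 100),   # baseline (placeholder)
    "RX 7900 XT":  (899, 110),   # placeholder
    "RX 7900 XTX": (999, 130),   # placeholder
}

for name, (price, perf) in cards.items():
    print(f"{name}: {perf / price * 100:.1f} perf points per $100")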
 
  • Like
Reactions: Saylick and Khanan

Ranulf

Platinum Member
Jul 18, 2001
2,348
1,165
136
Hah, hilarious. They're pushing Darktide a lot, with its RT effects. Game looks good but as usual from FatShark Devs, it needs optimization. $800 to run a $40 game at 120fps at 1440p resolution.
 
  • Like
Reactions: Khanan

Aapje

Golden Member
Mar 21, 2022
1,376
1,853
106
I think DLSS 3 is for mid-range GPU users and lower; you surely don't need it at 1440p or below, which is probably the typical resolution for the 4070 Ti. 4090 and 4080 users don't need it either - there's so much performance at hand, why would they, or why should they, use it?

No, DLSS 3 is actually more useful for the 4080 and 4090, because DLSS 3 reduces the CPU bottleneck. Since the fake frames are generated entirely by the GPU, the CPU doesn't have to do all the processing for those frames.

Still, it's only useful in some games.
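
A back-of-the-envelope illustration of that (made-up numbers, just to show the shape of the argument): if the CPU can only prepare ~70 real frames per second, frame generation can roughly double what's shown on screen, since the interpolated frames skip the CPU entirely.

Code:
# Illustrative only - numbers are invented, and the 2x is an upper bound
# that ignores the cost of generating the extra frames.
cpu_capped_fps = 70      # assume the CPU can only prepare ~70 frames/s
gpu_capable_fps = 160    # assume the GPU alone could render ~160 frames/s

rendered_fps = min(cpu_capped_fps, gpu_capable_fps)   # without frame generation
shown_fps = rendered_fps * 2                          # one interpolated frame per rendered frame
print(f"CPU-bound without frame generation: ~{rendered_fps} fps")
print(f"With frame generation: up to ~{shown_fps} fps shown, still ~{rendered_fps} real updates/s")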
 

leoneazzurro

Senior member
Jul 26, 2016
920
1,450
136
No, DLSS 3 is actually more useful for the 4080 and 4090, because DLSS 3 reduces the CPU bottleneck. Since the fake frames are generated entirely by the GPU, the CPU doesn't have to do all the processing for those frames.

Still, it's only useful in some games.

The problem is that these fake frames are not related at all to the game engine (which runs on the CPU), so the input lag at best does not increase, and at worst it is bigger than without DLSS 3. So in all the games where a high frame rate is normally desirable because it improves responsiveness, you get the opposite effect. So yes, it is useful only in games where input lag is not important.
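
To sketch the latency side with the same kind of made-up numbers: interpolation has to hold the newest real frame until the following one arrives before it can show the in-between frame, so it adds roughly one real-frame interval on top of the normal pipeline.

Code:
# Illustrative latency arithmetic for frame interpolation (invented numbers).
rendered_fps = 70                      # assumed real (CPU-limited) frame rate
frame_time_ms = 1000 / rendered_fps    # ~14.3 ms per real frame

# Holding the latest real frame until the next one arrives adds roughly one
# real-frame interval of latency, even though twice as many frames are shown.
added_latency_ms = frame_time_ms
print(f"Extra latency from buffering one frame: ~{added_latency_ms:.1f} ms")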
 

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
I have posted that BOTH AMD's and Nvidia's prices suck in this generation, where mining is no longer a factor.
Mining never really was such a big factor, but I almost got banned for saying this here. Now mining is dead and prices are still high, so it has all but been confirmed that mining never really was the issue.
 

Aapje

Golden Member
Mar 21, 2022
1,376
1,853
106
Lots of people play single player games or multiplayer games where slightly higher input lag is no big deal. So far, the main issue I've seen is that the artifacting can be really bad.
 

Khanan

Senior member
Aug 27, 2017
203
91
111
No, DLSS 3 is actually more useful for the 4080 and 4090, because DLSS 3 reduces the CPU bottleneck. Since the fake frames are generated entirely by the GPU, the CPU doesn't have to do all the processing for those frames.

Still, it's only useful in some games.
If you have a serious bottleneck that makes the 4090 unusable to the point where you need suboptimal tech like DLSS 3, which lowers image quality, you have other problems, like a weak CPU for example. Or the wrong settings.
 

blckgrffn

Diamond Member
May 1, 2003
9,123
3,056
136
www.teamjuchems.com
Hah, hilarious. They're pushing Darktide a lot, with its RT effects. Game looks good but as usual from FatShark Devs, it needs optimization. $800 to run a $40 game at 120fps at 1440p resolution.

If only they were pushing a good game, rather than a game that is being widely panned. The number of games where this "RT Performance" gap matters is still... small. Extremely tiny.

And yeah, put me in the "staying on Vermintide 2 until they make it not suck" crowd. That's plenty of fun on the cheap, thanks. And I think the 40K universe could be way cooler than the Warhammer Fantasy one, too.
 
  • Like
Reactions: Khanan and Ranulf

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,698
136
In the end, ignore AMD GPUs (most everyone else does... rimshot) and just start comparing Ada cards to each other. Then toss in last gen. Even with inflation arguments (which truly matter for the first time in years), and ignoring crypto going away or BoM/shipping costs going up, Nvidia has a tough sell with a 192-bit, 12GB card at $800 or $700. It's a 1440p card. If the Green Marketing Machine were truthful and not high on RT Overdrive/DLSS 3 pixie-doubler dust, they would pull out the old 5600 XT's marketing line from 3 years ago and call it the "ultimate 1440p card".
They certainly don't want you comparing it to last gen, because then those of us who bought 3080s two years ago will be looking at spending around 15% more money on a 4070 Ti vs. 3080 MSRP for a whopping ~15% increase in performance at 4K. There's a reason they're always comparing it to the joke of a 3090 Ti.
 

Aapje

Golden Member
Mar 21, 2022
1,376
1,853
106
If you have a serious bottleneck that makes the 4090 unusable to the point where you need suboptimal tech like DLSS 3, which lowers image quality, you have other problems, like a weak CPU for example. Or the wrong settings.

The 4090 is bottlenecked by every CPU in quite a few games. There are games that are extremely CPU-heavy as well, like MSFS.
 

Aapje

Golden Member
Mar 21, 2022
1,376
1,853
106
They certainly don't want you comparing it to last gen, because then those of us who bought 3080s two years ago will be looking at spending around 15% more money on a 4070 Ti vs. 3080 MSRP for a whopping ~15% increase in performance at 4K. There's a reason they're always comparing it to the joke of a 3090 Ti.

Anyone who has got a 3080 should just stick with it for now, unless they are willing to greatly increase their budget.
 

Khanan

Senior member
Aug 27, 2017
203
91
111
The 4090 is bottlenecked by every CPU in quite a few games. There are games that are extremely CPU-heavy as well, like MSFS.
At 4K? Very rarely (or never). And even then, you're already at high enough FPS.

I don't think lower resolutions are relevant with the 4090.
 

Cableman

Member
Dec 6, 2017
78
73
91
I don't think any GPU is CPU-bottlenecked at 4k. My aging i7-8700 won't be bottlenecking the 7900xt because I'm at 4k too (I might lose 2-3 frames). I'm sure we'll get there, but not this generation.
 
Last edited:
  • Like
Reactions: Mopetar

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,980
136
Maybe something like Factorio, but that's probably going to be CPU bound at any resolution.

A few years ago I had looked at CPU benchmarks in 4K and in over half of the titles the $80 Celeron was as good as the top Intel CPU.

Most games have certainly changed to take advantage of more cores to the point where it's unlikely that a 2-core CPU would do as well today, but it wouldn't take a whole lot more.
 
  • Like
Reactions: Cableman

Khanan

Senior member
Aug 27, 2017
203
91
111
Maybe something like Factorio, but that's probably going to be CPU bound at any resolution.

A few years ago I had looked at CPU benchmarks in 4K and in over half of the titles the $80 Celeron was as good as the top Intel CPU.

Most games have certainly changed to take advantage of more cores to the point where it's unlikely that a 2-core CPU would do as well today, but it wouldn't take a whole lot more.
You also have to think about the fact that today's GPUs simply need way more input at 4K; it's not comparable to the times you mentioned. So you now need quite a good CPU to feed a high-end GPU at 4K - something like a Ryzen 1600 or a 4700K won't cut it anymore. The 8700 he mentioned is right at the edge.
 

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,980
136
Older architectures tend to fall off faster than fewer cores. Here're some recent results from TPU for 4K gaming. I'll grant it's not using a 4090, but I wouldn't expect wildly different results.

[Chart: TechPowerUp relative CPU gaming performance at 4K]

The i3 10100 which only has 4 cores and a max boost of 4.3 GHz is able to offer ~93% of the performance of a 13900K which is the top scoring chip in the results. That's an 8-core CPU with a max boost of 5.8 GHz.

Maybe you can't quite go as bottom of the barrel as you used to be able to, but when 2-core Celerons were still a viable pick, Intel was only just starting to move past 4-cores in their desktop parts.

The times haven't changed as much as you might think.
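
For anyone reading that chart the same way, the percentage is just each CPU's averaged fps divided by the fastest chip's. The fps values below are placeholders standing in for TPU's averages, not the real data:

Code:
# How a "~93% of the top chip" figure falls out of a relative-performance
# chart. The fps values are placeholders, not TPU's actual averages.
avg_fps_4k = {
    "i9-13900K": 120.0,   # placeholder: fastest chip in the chart
    "i3-10100":  111.5,   # placeholder
}

fastest = max(avg_fps_4k.values())
for cpu, fps in avg_fps_4k.items():
    print(f"{cpu}: {fps / fastest:.1%} of the fastest CPU at 4K")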
 

Khanan

Senior member
Aug 27, 2017
203
91
111
Older architectures tend to fall off faster than fewer cores. Here're some recent results from TPU for 4K gaming. I'll grant it's not using a 4090, but I wouldn't expect wildly different results.

[Chart: TechPowerUp relative CPU gaming performance at 4K]

The i3 10100 which only has 4 cores and a max boost of 4.3 GHz is able to offer ~93% of the performance of a 13900K which is the top scoring chip in the results. That's an 8-core CPU with a max boost of 5.8 GHz.

Maybe you can't quite go as bottom of the barrel as you used to be able to, but when 2-core Celerons were still a viable pick, Intel was only just starting to move past 4-cores in their desktop parts.

The times haven't changed as much as you might think.
Yeah, exactly like I said: the Ryzen 2600 doesn't cut it anymore, with a steep 18% perf loss that will be even higher with a 4090. It's nearly the same CPU as the 1600 I mentioned earlier.

Then again, there's not much point to this discussion. Anyone who can afford something like a 4090, or anything remotely close, will not use a low-end CPU; they sometimes use older high-end CPUs like the 8700 or so. And then they upgrade anyway.
 

Aapje

Golden Member
Mar 21, 2022
1,376
1,853
106
Games typically have a very demanding main thread, so a single fast core tends to be really important, plus a couple of cores to offload the work that doesn't have to be on that same core.

So 4 fast cores tend to beat 20 billion slow cores.
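
A crude sketch of why (all numbers invented): if the main-thread work is the serial part of a frame, extra slow cores only shrink the parallel part, so a few fast cores still win.

Code:
# Crude Amdahl-style sketch: serial main-thread work + work split across cores.
# All numbers are invented for illustration.
def frame_time_ms(main_thread_ms, parallel_ms, cores, core_speed=1.0):
    """Approximate frame time: serial portion plus parallel portion split over cores."""
    return main_thread_ms / core_speed + parallel_ms / (cores * core_speed)

# 4 fast cores vs 16 slow cores on the same hypothetical frame workload
print(frame_time_ms(8.0, 8.0, cores=4,  core_speed=1.3))   # ~7.7 ms  -> ~130 fps
print(frame_time_ms(8.0, 8.0, cores=16, core_speed=0.8))   # ~10.6 ms -> ~94 fps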
 
  • Like
Reactions: CP5670

CP5670

Diamond Member
Jun 24, 2004
5,510
588
126
It's very game-dependent. Some older games slow down in specific areas on any video card, and possibly on any CPU. One infamous case is the VTOL map in Crysis. I saw it recently in The Outer Worlds in the Byzantium hub (but not on any other map). Other games, and anything with RT, are GPU-bottlenecked, so an older CPU is fine.
 

blckgrffn

Diamond Member
May 1, 2003
9,123
3,056
136
www.teamjuchems.com
Older architectures tend to fall off faster than fewer cores. Here're some recent results from TPU for 4K gaming. I'll grant it's not using a 4090, but I wouldn't expect wildly different results.

Thank you for sharing that graph - for me the biggest takeaway - off topic of course - is the 11th Gen regression across the board compared to 10th Gen Intel, to the point where an 11900K is behind a 10700K. Ouch. Probably burning 20%-30% more juice too.

Makes me feel great about that 12100F build as well, hah.
 

Khanan

Senior member
Aug 27, 2017
203
91
111
Thank you for sharing that graph - for me the biggest takeaway - off topic of course - is the 11th Gen regression across the board compared to 10th Gen Intel, to the point where an 11900K is behind a 10700K. Ouch. Probably burning 20%-30% more juice too.

Makes me feel great about that 12100F build as well, hah.
It isn't, though. The 11900K has higher IPC but less cache than the 10900K, so while it's often slower, it's also sometimes faster; the architecture is more modern. The 11700K has the same amount of cache and higher IPC than the 10700K, and the same core count as well - so it can't be slower, and it isn't. Same for the 11900K.
 

blckgrffn

Diamond Member
May 1, 2003
9,123
3,056
136
www.teamjuchems.com
It isn't, though. The 11900K has higher IPC but less cache than the 10900K, so while it's often slower, it's also sometimes faster; the architecture is more modern. The 11700K has the same amount of cache and higher IPC than the 10700K, and the same core count as well - so it can't be slower, and it isn't. Same for the 11900K.

I am just referring to the graph used above that has every 11-series CPU behind its 10-series counterpart. I hear what you are saying, but the memory controller was bad and it leaked way too much power, which probably restricted its ability to boost as high or as long as the 10-series (lmao, forever boosting at P2 :D). The 11-series was just a bad Intel release that I actually had really high hopes for, which I recounted many a time here until it launched with a thud.

In any case, any modern CPU should be able to get a lot of value out of any of these nice, high end GPUs.

And if a 10100 can hang, so can an 8700/9700/9900 - with the caveat for all of them that you might get some low 1% FPS numbers.
 

Khanan

Senior member
Aug 27, 2017
203
91
111
I am just referring to the graph used above that has every 11-series CPU behind its 10-series counterpart. I hear what you are saying, but the memory controller was bad and it leaked way too much power, which probably restricted its ability to boost as high or as long as the 10-series (lmao, forever boosting at P2 :D). The 11-series was just a bad Intel release that I actually had really high hopes for, which I recounted many a time here until it launched with a thud.

In any case, any modern CPU should be able to get a lot of value out of any of these nice, high end GPUs.

And if a 10100 can hang, so can an 8700/9700/9900 - with the caveat for all of them that you might get some low 1% FPS numbers.
You should look at the 1080p data if you want CPU vs. CPU comparisons (which I recited from memory); this chart isn't good for that, because 4K puts the pressure elsewhere.