
Discussion: RTX 4090 reviews

amenx

Diamond Member

Comprehensive roundup:
 
1219 points in Octanebench - very nice. My 3090 and 2080 Ti do 650 and 350 respectively. Back in 2016 I had 2x 1080s, which got a combined score of 300 :-D That was about 5x more than the GTX 580 I had been using until that point.
 
Man that's a large jump from 1 gen to the next.

Makes the current GPUs from both Nvidia and AMD look overpriced. Sorry for those who bought an expensive card recently.
 
Power consumption numbers are confusing...

[Image: power-gaming.png]

Why the humongous cooler if it uses less power than a 3080 Ti?
 

To justify a higher price. They pay 20 bucks for a bigger cooler, you pay 200 more for the entire card, 180 in their pocket - because look, bigger cooler!
If that's not the case with this FE, it is with those massive AIB cards for sure.
 
Pretty awesome and tempting from a hardware perspective ... basically 2x faster than my 3080 at 4K, even at equal power levels. Wish I needed one for work, because I can't justify it just for a few not-very-compelling video games.
 
That's likely because GamersNexus used FurMark. I'm not sure exactly what TechPowerUp used while logging their numbers as it just says "Gaming".

You need to click on "power consumption testing details"
  • Gaming: Cyberpunk 2077 is running at 2560x1440 with Ultra settings and ray tracing disabled. We ensure the card is heated up properly, which ensures a steady-state result instead of short-term numbers that won't hold up in long-term usage.
 

At 1440p with no RT, it's going to be a bit CPU-limited, which will drop the power consumption. Running 4K, I would expect the power consumption to increase, though it's unlikely to hit the GN numbers. Just speculation, but for maxed-out 4K numbers it's probably about in the middle, around 400 W. der8auer showed 422 W in Time Spy Extreme, but average gaming will probably be slightly lower than that.
 
That sounds reasonable. By setting that power limit, do you mean the slider in Afterburner? Would that be enough - no need for any underclocking shenanigans and whatnot?
Correct.

Looks like this approach isn't universally applicable, however. Andreas did the same kind of study but across a few different games. In some games, more power did lead to higher performance.
 
The performance is tempting, but I think the card in the power and price envelope I can justify will not be as impressive. That is, the 4090 has about 100% more TFLOPS than the 3090, while the 4080 16GB is closer to 50% more TFLOPS than the 3080.
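For context, raw FP32 TFLOPS can be estimated from shader count and boost clock. A quick sketch using approximate public spec-sheet values (my numbers, not figures from this thread - real boost clocks vary by board and load, so the exact ratios depend on which clocks you plug in):

```python
# Rough FP32 throughput: shaders * boost clock (GHz) * 2 FMA ops / 1000 = TFLOPS.
# Shader counts and boost clocks below are approximate spec-sheet values.
specs = {
    "RTX 3080":      (8704,  1.71),
    "RTX 3090":      (10496, 1.70),
    "RTX 4080 16GB": (9728,  2.51),
    "RTX 4090":      (16384, 2.52),
}

def tflops(shaders, boost_ghz):
    return shaders * boost_ghz * 2 / 1000

for name, s in specs.items():
    print(f"{name}: {tflops(*s):.1f} TFLOPS")
```

With spec-sheet clocks the gaps come out somewhat larger than the rough figures quoted above, but the relation holds: the 4090's TFLOPS uplift over the 3090 is roughly double the 4080 16GB's uplift over the 3080.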

But anyone who bought a 3090 or 3090 Ti this year? Well, maybe should have waited...
 
From TPU
[Image: TPU relative-performance chart]

The 6700 XT has half the core count of the 6900 XT at a 14% higher boost clock, with 75% of the memory bandwidth and cache, and it managed 63% of the bigger chip's score in TPU's test suite. If the 4080 12GB shows similar scaling (47% cores, 67% cache, 50% memory bandwidth) compared to the 4090, you'd expect it to end up closer to the 3090 than the 3090 Ti.
Have to wait and see how it scales.
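The cut-down-chip comparison above can be sketched as a back-of-envelope ratio calculation. This is purely illustrative - the "smaller chips scale better than raw compute" fudge factor is my assumption derived from the 6700 XT data point quoted above, not a real model, and I assume the 4080 12GB clocks similarly to the 4090:

```python
# Back-of-envelope scaling estimate. Observed data point from the post:
# 6700 XT = 50% cores at 1.14x clock of the 6900 XT, scoring 63% in TPU's suite.

def naive_compute_ratio(core_ratio, clock_ratio):
    return core_ratio * clock_ratio

# RDNA2 sanity check: 6700 XT vs 6900 XT
compute_6700 = naive_compute_ratio(0.5, 1.14)   # 0.57 of the big chip's compute
observed_6700 = 0.63                            # TPU relative score, per the post
uplift = observed_6700 / compute_6700           # ~1.11: lands above raw compute

# Apply the same fudge factor to 4080 12GB vs 4090
# (47% cores; clock ratio of 1.0 is an assumption)
compute_4080_12 = naive_compute_ratio(0.47, 1.0)
estimate = compute_4080_12 * uplift
print(f"estimated 4080 12GB relative perf: {estimate:.2f}")  # ~0.52 of a 4090
```

That rough ~52% of a 4090 ignores the bandwidth and cache cuts entirely, so treat it as an upper-bound-flavored guess, not a prediction.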
 

That is what the 3 benchmarks NV showed at their event indicated. The 4080 12GB is around 3080 Ti / 3090 performance, give or take.

Also, this is one of the worst showings for the 4090, given it is 'only' 50% ahead of the 3090 Ti in this suite. ComputerBase has it 65% ahead of the 3090 Ti in their 4K suite for pure raster.

Skimming through the RT numbers at TPU, it loses less performance than the 3090 Ti when RT is enabled, and combined with the huge raster advantage there are good gains overall. But the difference in the % FPS reduction itself is not all that much - eyeballing it, about 5%.

ComputerBase also showed RT 69% faster than the 3090 Ti, so an improvement, but not a huge one.

Power consumption looks great though; it seems the TDP is more like the max it can consume at stock if you are running a heavy scene with RT turned on (TPU showed some numbers in Cyberpunk at various settings).

Ultimately though, a 420W TBP 7950XT with a 50% perf/watt gain would be 2.1x the 6900 XT, which would put it at 119.7 in the above graph; vs the 6950 XT it would be a 1.9x gain, which would put it at 115.9.

AMD might actually take the raster performance crown, and if they can match Ampere for the RT fps drop, that 16-20% increase in raster will be enough for it to outright win there too.

EDIT: With the computerbase.de numbers, 2.1x the 6900 XT would be 109.2 on their graph in 4K raster.

Edit 2: just checked techspot/HUB. They have it 58% faster at 4K than the 3090 Ti. Will be interesting to see how their 50-game suite goes. Doing the maths for the 7950 here, we get it 11-12% ahead of the 4090 using the 6900 XT and 6950 XT as the baseline (for the 6900 XT it is 420/300 * 1.5 * 53, and for the 6950 it is 420/330 * 1.5 * 59).
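The 7950 projection arithmetic above can be replicated directly. All the inputs are the poster's assumptions - a hypothetical 420 W TBP, the claimed +50% perf/watt, and the quoted relative scores with the 4090 at 100:

```python
# Hypothetical RDNA3 projection, reproducing the arithmetic in the post:
# projected score = (new TBP / old TBP) * perf-per-watt gain * baseline score.

def project(tbp_new, tbp_old, perf_per_watt_gain, baseline_score):
    return tbp_new / tbp_old * perf_per_watt_gain * baseline_score

# Baselines from the post (techspot/HUB-style chart, 4090 = 100)
p_6900 = project(420, 300, 1.5, 53)   # from the 6900 XT baseline
p_6950 = project(420, 330, 1.5, 59)   # from the 6950 XT baseline
print(round(p_6900, 1), round(p_6950, 1))  # ~111.3 and ~112.6
```

Both baselines land the projection 11-13% ahead of the 4090's 100 - consistent with the 11-12% figure in the edit, and just as speculative as its inputs.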
 