Discussion RTX 4090 reviews

amenx

Diamond Member
Dec 17, 2004
Comprehensive roundup:
 

Timmah!

Golden Member
Jul 24, 2010
1219 points in Octanebench - very nice. My 3090 and 2080Ti do 650 and 350 respectively. Back in 2016 I had 2x 1080s, which got a combined score of 300 :-D That was about 5x more than the GTX 580 I had been using till that point.
 

Jimzz

Diamond Member
Oct 23, 2012
Man, that's a large jump from one gen to the next.

Makes the current GPUs from both Nvidia and AMD look overpriced. Sorry for those who bought an expensive card recently.
 

Timmah!

Golden Member
Jul 24, 2010
Power consumption numbers are confusing...

Why the humongous cooler if it uses less power than a 3080 Ti??
To justify the higher price. They pay 20 bucks for a bigger cooler, you pay 200 more for the entire card, 180 in their pocket, because look, bigger cooler!
If that's not the case with this FE, it is with those massive AIBs for sure.
 

repoman0

Diamond Member
Jun 17, 2010
Pretty awesome and tempting from a hardware perspective ... basically 2x faster than my 3080 at 4k, even at equal power levels. Wish I needed one for work because I can't justify it just for a few not very compelling video games.
 

Aikouka

Lifer
Nov 27, 2001
GamersNexus got very different power consumption: 450 W at stock, 666 W with a 33% OC.
That's likely because GamersNexus used FurMark. I'm not sure exactly what TechPowerUp used while logging their numbers as it just says "Gaming".
 

DiogoDX

Senior member
Oct 11, 2012
You need to click on "power consumption testing details"
  • Gaming: Cyberpunk 2077 is running at 2560x1440 with Ultra settings and ray tracing disabled. We ensure the card is heated up properly, which ensures a steady-state result instead of short-term numbers that won't hold up in long-term usage.
 

Hitman928

Diamond Member
Apr 15, 2012
At 1440p with no RT, it's going to be a bit CPU limited which will drop the power consumption. Running 4K, I would expect the power consumption to increase though it's unlikely to hit the GN numbers. Just speculation, but for maxed out 4K numbers, it's probably just about in the middle, meaning around 400W. Derbauer showed 422W in Time Spy Extreme but average gaming will probably be slightly lower than that.
 

Timmah!

Golden Member
Jul 24, 2010
Derbauer tested the performance vs. power for the 4090. Seems like everyone who buys one should set a power limit of 70-80% and call it a day.
That sounds reasonable. By setting that power limit you mean the slider in Afterburner? Would that be enough, no need for any underclocking shenanigans and whatnot?
 

Saylick

Platinum Member
Sep 10, 2012
Correct.

Looks like this approach isn't universally applicable, however. Andreas did the same kind of study but across a few different games. In some games, more power did lead to higher performance.
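For anyone wanting to try the power-limit approach from the command line instead of the Afterburner slider, here is a minimal sketch. It assumes the 4090 FE's 450 W default board power (check your card's actual default with `nvidia-smi -q -d POWER`); the cap is then applied with `nvidia-smi -pl <watts>`, which needs admin rights:

```python
# Compute power caps at 70-80% of the 4090's assumed 450 W default TDP,
# then print the nvidia-smi command that would apply each cap.
DEFAULT_TDP_W = 450

for fraction in (0.70, 0.75, 0.80):
    cap_w = round(DEFAULT_TDP_W * fraction)
    print(f"{fraction:.0%} limit: sudo nvidia-smi -pl {cap_w}")
```

Note the caveat from the post above: per-game results vary, so some titles may still benefit from the full power budget.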
 

AdamK47

Lifer
Oct 9, 1999
The 4090 provides the same average framerate in Assassin's Creed Valhalla at 4K as the 3090 Ti does at 1080p: 116 FPS.

 

gdansk

Golden Member
Feb 8, 2011
The performance is tempting, but I think the card in the power and price envelope I can justify will not be as impressive. That is, the 4090 has about 100% more TFLOPS than the 3090, while the 4080 16GB is closer to 50% more TFLOPS than the 3080.

But anyone who bought a 3090 or 3090 Ti this year? Well, maybe they should have waited...
 

MrTeal

Diamond Member
Dec 7, 2003
From TPU

The 6700 XT has half the core count of the 6900 XT at a 14% higher boost clock, though with 75% of the memory bandwidth and cache, and it scored 63% of the bigger chip in TPU's test suite. If the 4080 12GB shows similar scaling (47% of the cores, 67% of the cache, 50% of the memory bandwidth) compared to the 4090, you'd expect it to end up closer to the 3090 than the 3090 Ti.
Have to wait and see how it scales.
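The scaling argument above can be sketched as a back-of-envelope calculation. This is purely illustrative, not a review result: it assumes performance tracks cores × clock, then applies the "observed vs. naive" ratio from the 6700 XT to the 4080 12GB (whose clocks are assumed roughly equal to the 4090's):

```python
# 6700 XT: 50% of the 6900 XT's cores at a 14% higher boost clock.
naive_6700xt = 0.50 * 1.14      # cores * relative clock = 0.57
observed_6700xt = 0.63          # relative score in TPU's suite

# How much better than linear-in-cores the small chip actually did.
fudge = observed_6700xt / naive_6700xt

# Apply the same fudge factor to the 4080 12GB's 47% of the 4090's cores.
naive_4080_12gb = 0.47
guess = naive_4080_12gb * fudge
print(f"crude 4080 12GB estimate: {guess:.0%} of the 4090")
```

This ignores the cache and bandwidth ratios entirely, so as the post says, we have to wait and see how it actually scales.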
 

Timorous

Golden Member
Oct 27, 2008
That is what the three benchmarks NV showed at their event suggested: the 4080 12GB is around 3080 Ti / 3090 performance, give or take.

Also, this is one of the worst showings for the 4090, given it is 'only' 50% ahead of the 3090 Ti in this suite. ComputerBase have it 65% ahead of the 3090 Ti in their 4K suite for pure raster.

Skimming through the RT numbers at TPU, it loses less performance than the 3090 Ti, and with the huge raster advantage there are good gains overall, but the difference from a % FPS reduction viewpoint is not all that much; eyeballing it, it was about 5%.

Computerbase also showed that RT was 69% faster than the 3090Ti so an improvement but not huge.

Power consumption looks great though; it seems like the TDP is more like the max it can consume at stock if you are running a heavy scene with RT turned on (TPU showed some numbers in Cyberpunk with various settings).

Ultimately though, a 420W TBP 7950XT with a 50% perf/watt gain would be 2.1x the 6900XT, which would put it at 119.7 in the above graph; vs the 6950XT it would be a 1.9x gain, which would put it at 115.9 in the above graphic.

AMD might actually take the raster performance crown, and if they can match Ampere for the RT fps drop, that 16-20% increase in raster will be enough for it to outright win there too.

EDIT: With the computerbase.de numbers, 2.1x the 6900XT would be 109.2 on their graph in 4K raster.

Edit 2: Just checked techspot/HUB. They have it 58% faster at 4K than the 3090 Ti. It will be interesting how their 50 game suite goes. Doing the maths for the 7950 here, we get it as 11%-12% ahead of the 4090, using the 6900XT and 6950XT as the baseline (so for the 6900XT it is 420/300 × 1.5 × 53, and for the 6950XT it is 420/330 × 1.5 × 59).
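The projection in that edit can be written out as a small sketch. The inputs are the post's own numbers: relative scores of 53 (6900XT) and 59 (6950XT) on a 4090 = 100 scale, their 300 W and 330 W TBPs, a hypothetical 420 W TBP for the 7950, and the claimed 50% perf/watt uplift:

```python
# Scale a card's relative score (4090 = 100) by the power increase to a
# 420 W TBP and by the claimed 50% perf/watt gain, as in the post above.
def projected_score(base_score, base_tbp_w, new_tbp_w=420, perf_per_watt_gain=1.5):
    return base_score * (new_tbp_w / base_tbp_w) * perf_per_watt_gain

from_6900xt = projected_score(53, 300)   # 420/300 * 1.5 * 53 = 111.3
from_6950xt = projected_score(59, 330)   # 420/330 * 1.5 * 59 ≈ 112.6
print(round(from_6900xt, 1), round(from_6950xt, 1))
```

Both baselines land at roughly 111-113, i.e. the 11%-12% lead over the 4090 quoted in the post.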
 
