
nVidia 3090Ti reviews thread

Page 4 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

BFG10K

Lifer


$2000 "MSRP" and 400W+ power for barely any performance gain. Time to pucker up your wallet and power supply orifices, folks.

The only good news is prices are dropping overall and the next gen is close.
 
[benchmark screenshots]
Using stable clocks.
 
Nah, I'm just kidding. April Fools', SUCKERS!

...

I was gonna say, because if you seriously decided to do SLI on a non-HEDT platform with 2 x 3090 Tis, I have absolutely no comment.....
Anyone who builds a system with both in SLI on a non-HEDT platform, for obvious reasons, should not be custom building a PC, period.

...

Adam, I think I just heard your breaker box popping....

I'm gonna grab me some 4090s, but I will do a big fat pass on a 3090 Ti.
Hopefully next-gen HEDT platforms will roll around by then too, as I'm itching to upgrade my current HEDT.
 
What's the power consumption like during those tests?
Stock FTW Ultra BIOS at +6% power in Afterburner puts it at a peak of 480W.

The 500W BIOS I had on the MSI 3090 Gaming X Trio was hotter, louder, and of course with a bit less performance.

The 3090 Ti runs cooler with less noise by comparison.
 
Awesome game room @AdamK47 !

Just got my first 3090 Ti on Friday (snagged it on NewEgg):

[photos of the card]

With the power slider maxed out - I'm only seeing around 447W being used - not 480W. Seems rather strange. Do I have to be on the "OC" BIOS? My switch is on the middle BIOS (forgot the name).
 
450W is default. I set mine to the OC BIOS right away. From what I've read from others, the setting only adjusts the fan curve. +6% in Afterburner bumps it from 450W to 480W. Try looping Port Royal at 4K. You'll easily see it there.
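A quick sanity check on that slider math (a rough sketch; the 450 W default and the +6% setting come from the posts above, and the vBIOS may round the actual enforced limit differently):

```python
# Afterburner's power slider is a percentage applied to the board power limit.
# The 450 W default and +6% figure are taken from the posts above; the rest
# is plain arithmetic.
default_limit_w = 450
slider_pct = 6  # +6% in Afterburner

new_limit_w = default_limit_w * (1 + slider_pct / 100)
print(round(new_limit_w))  # ~477 W, in line with the ~480 W peaks reported
```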
 
Interesting - I set my own custom fan curve so the OC BIOS is useless then? I will loop Port Royal and see.

This GPU sure is thicc!
 
I'm surprised no one has asked yet.. but out of curiosity *getting my lambda driver force field up* what's the hashrate for these guys? (there, I said it for you @VirtualLarry )
 
I suspect basically the same as a 3090 given equivalent clocks. I doubt the extra cores make a difference, but I am not certain.
 
This GPU is terrible, absolutely terrible, for mining. No dirty miner should buy one.
Really? How so? I would like details if possible 😛

My guess is too much power draw or something?
 
Your guess is incorrect.

Details: No one should be mining on a GPU.
I am confused...do you have any evidence, or any insight into this? Or are you just making stuff up?
 

It would be funny if NVidia changed their drivers around so that there was a minimum power setting that made it unfeasible for miners. Basically it will always pull at least XXX watts under load so that the economics of mining go to crap without having to resort to crippleware drivers.

Gamers or anyone using the card for productivity would never notice, since they'd be running above that lower limit anyway. If anyone is overly concerned, just release completely unlocked drivers that remove the power floor when the next generation of cards comes out and makes them unattractive to miners for another reason.
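The economics being proposed can be sketched numerically. This is a hypothetical illustration of the idea, not anything NVIDIA actually shipped; every figure below (daily revenue, tuned wattage, floor wattage, electricity price) is invented for the example.

```python
# Hypothetical sketch of the "power floor" idea: if the driver enforced a
# minimum draw under load, a miner's tuned-down efficiency disappears, while
# a gamer already running above the floor sees no change. All numbers here
# (revenue, wattages, electricity price) are made up for illustration.
def daily_profit(revenue_per_day, tuned_draw_w, floor_w, price_per_kwh):
    draw_w = max(tuned_draw_w, floor_w)  # driver enforces the floor
    energy_cost = draw_w / 1000 * 24 * price_per_kwh
    return revenue_per_day - energy_cost

# Miner power-tuned to 250 W, $0.25/kWh electricity, $2.00/day revenue:
print(daily_profit(2.00, 250, 0, 0.25))    # no floor: small positive margin
print(daily_profit(2.00, 250, 450, 0.25))  # 450 W floor: margin goes negative
```

The point of the sketch: the floor never changes what a gamer's card draws (they are above it anyway), but it erases exactly the efficiency tuning that makes mining pay.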
 
I am pretty sure these are non-LHR.

You triggered a (gamer) auto-immune response from @AdamK47. I suggest you back away slowly.

Well, gamers shouldn't be using 3090s unless you want 8K gaming.
The 3090 series is the Titan replacement, and hence should be left for content creators.
But the price gap between a 3080 Ti and a 3090 is so utterly small that it makes no sense to get a 3080 Ti with half the RAM and fewer cores for that 15-20% price difference.

The 3090 Ti is just straight up stupid... sorry Adam, but I really would have just waited for a 4090. But again, it's you we're talking about, so you'll probably grab a 4090 anyhow, which is the road I'm intending to go down, unless AMD's new HEDT platform has SAM support, in which case a 7950X may be my next route if the numbers fall in line.

It all comes down to what the next HEDT platform will be, and how much benefit I'll see from it.
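For what it's worth, the price-gap argument above can be checked against the launch MSRPs and published specs of the two cards (the figures below are the widely published launch numbers; street prices at the time were higher and varied, which is presumably where the 15-20% estimate comes from):

```python
# Launch MSRPs and published specs for the two cards being compared.
# Street prices in 2022 varied a lot, so treat this as illustrative of the
# "small price gap, double the VRAM" point, not a market snapshot.
cards = {
    "RTX 3080 Ti": {"msrp_usd": 1199, "vram_gb": 12, "cuda_cores": 10240},
    "RTX 3090":    {"msrp_usd": 1499, "vram_gb": 24, "cuda_cores": 10496},
}

ti, ninety = cards["RTX 3080 Ti"], cards["RTX 3090"]
gap_pct = (ninety["msrp_usd"] - ti["msrp_usd"]) / ti["msrp_usd"] * 100
print(f"MSRP gap: {gap_pct:.0f}%")                        # ~25% at MSRP
print(f"VRAM ratio: {ninety['vram_gb'] / ti['vram_gb']:.0f}x")  # 2x the memory
```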
 
Do you mean 4K gaming? There are quite a few AAA games where even a 3090 Ti does not hit 100 fps at 4K with everything maxed.

While this is true, a faster video card won't necessarily fix it. Most games have occasional parts that get bottlenecked by a single CPU thread, and won't hit a constant 120/144fps at all times on any hardware.
 
I actually agree with @AdamK47 . It would be a shame to send such a nice video card to the mines. I haven't sent my 3060ti MSI triple-fan to the mines, either, for the same reason.

Keep running the broke-back GTX 1660 Super / Ti cards, and the occasional Navi10-based cards. Oh, and I have a veritable fleet of RX 6600 cards, most of them out of commission as of this posting due to platform issues (my not-so-successful attempts at implementing riser-less mobos).
 