Nvidia 3090 Ti reviews thread


BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126


$2000 "MSRP" and 400W+ power for barely any performance gain. Time to pucker up your wallet and power supply orifices, folks.

The only good news is prices are dropping overall and the next gen is close.
 

CP5670

Diamond Member
Jun 24, 2004
5,512
588
126
I use a USB DAC/amp, which is good enough for a smaller 2.1 setup. For a full surround setup, HDMI passthrough to a receiver is the best approach.

Is the surround setup worth it for games? Last I checked, most PC games didn't really support it properly, and it only seemed to be worth it for movies. Maybe it's different now.
 

KentState

Diamond Member
Oct 19, 2001
8,397
393
126
I use a USB DAC/amp, which is good enough for a smaller 2.1 setup. For a full surround setup, HDMI passthrough to a receiver is the best approach.

Is the surround setup worth it for games? Last I checked, most PC games didn't really support it properly, and it only seemed to be worth it for movies. Maybe it's different now.

I guess it depends on what you play, but the vast majority of AAA games support it. Even ones like Tomb Raider and Halo Infinite support Atmos. Lately I've been playing Elden Ring and the positional audio is pretty good, especially when the music creeps in from the rear speakers as you enter a tomb.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
Both of our stores here have dozens. Not sure why people think they can scalp these on eBay, but I guess where there's a will, there's a way.

It's likely because they can just return the card if it doesn't sell. Just buy the video card using a credit card, so you aren't on the hook for that amount until the next bill anyway.

My buddy's 3080 that I helped him source required putting all new fans in his case, and then it hotboxed his office (a repurposed bedroom) during the ~2 hour sessions he was playing Warzone. But hey, it looked glorious!

Progress ;)

One of my bestest buddies is a bachelor with his own house - he put a split unit in his first level office :D So energy costs for the card, then the room cooler, never mind the cost of the unit and the pro install :D

I ended up having to put in an AC upstairs just to handle putting two computers (3080, 3080 Ti) into the same room. I didn't really want to do that, but I don't have another option. (That room has a ceiling fan too.) Anyway, that touches on an interesting problem that I've noticed as we've gained more and more technology. Technology tends to put off a decent amount of heat -- albeit there is a trend toward low power too -- and in the US, we have a heavy prevalence of whole-house, duct-based HVAC systems. These systems really only work well when the cooling load doesn't vary much from room to room. I've been really tempted to completely redo my HVAC setup (even though I've replaced the outdoor unit within the past 5 years) to use mini-splits in the areas that the central HVAC doesn't service well, which includes that computer room.

The one issue that I ran into with the window AC is that it never sealed well, and it served as a good way for stinkbugs and ladybugs to make their way in. It's a nice unit (Midea U-shaped), as it's rather quiet, but its provided mount doesn't seal well around the runners.

That's the way I did it until I got a C9. You have to plug directly into the TV for G-Sync to work. Same with the G1. It's also the best route for the least input latency. eARC can also output multichannel 7.1 PCM using passthrough from the TV.

I have a C1, and I can use G-Sync with it just fine with an AVR in between. To note, I do have to tell the Nvidia Control Panel to enable G-Sync for the display (the checkbox at the bottom). I'm guessing it assumes the display isn't valid because the AVR provides the EDID, which allows for VRR, but Nvidia likely doesn't recognize the Denon receiver as G-Sync Compatible. I noticed a lot of tearing in Guardians of the Galaxy before enabling it, and after turning it on, I saw no tearing, so I'm assuming it's working.

To note, I'm using the cheaper Denon HDMI 2.1 receiver that Costco sells. I went with it because it's the only Denon receiver with more than one HDMI 2.1 port; the higher-end units require the HDMI 2.1 switch that Denon sells. It was a bit of a letdown because it was quite obviously weaker than my prior AVR-X4100 when it came to driving my speakers.
 

KentState

Diamond Member
Oct 19, 2001
8,397
393
126
It's likely because they can just return the card if it doesn't sell. Just buy the video card using a credit card, so you aren't on the hook for that amount until the next bill anyway.

Still seems like a hassle to make at most $100 per card after fees. They don't even let you buy more than one.

I ended up having to put in an AC upstairs just to handle putting two computers (3080, 3080 Ti) into the same room. I didn't really want to do that, but I don't have another option. (That room has a ceiling fan too.) Anyway, that touches on an interesting problem that I've noticed as we've gained more and more technology. Technology tends to put off a decent amount of heat -- albeit there is a trend toward low power too -- and in the US, we have a heavy prevalence of whole-house, duct-based HVAC systems. These systems really only work well when the cooling load doesn't vary much from room to room. I've been really tempted to completely redo my HVAC setup (even though I've replaced the outdoor unit within the past 5 years) to use mini-splits in the areas that the central HVAC doesn't service well, which includes that computer room.

The one issue that I ran into with the window AC is that it never sealed well, and it served as a good way for stinkbugs and ladybugs to make their way in. It's a nice unit (Midea U-shaped), as it's rather quiet, but its provided mount doesn't seal well around the runners.

I have been toying with the idea of putting my desktop in the basement and using fiber-optic DisplayPort/HDMI for the monitors and fiber-optic USB for peripherals. Not sure what your setup is like, but relocating them to another part of the house is something to look into.

Noobs,
I only do 20 amp lines

Probably the safest way to go. I had 8 dedicated 20 amp outlets installed in my home theater equipment/server room in the basement. Probably overkill, but the three server chassis each have dual 900w+ power supplies.
 

CP5670

Diamond Member
Jun 24, 2004
5,512
588
126
I have my PC/TV setup in the living room in a two bedroom apartment. It's a large open area (about 25'x15') but still heats up noticeably if I game for a few hours.

I guess it depends on what you play, but the vast majority of AAA games support it. Even ones like Tomb Raider and Halo Infinite support Atmos. Lately I've been playing Elden Ring and the positional audio is pretty good, especially when the music creeps in from the rear speakers as you enter a tomb.

I might look into this again then. I play a mix of recent titles and older games. Although my current 2.1 setup is probably more suitable for an apartment where I can't turn up the volume beyond a point.
 

Mopetar

Diamond Member
Jan 31, 2011
7,842
5,994
136
Takes up 4 slots. Weighs about as much as two dozen bananas.

If it gets any bigger or heavier when people talk about benching the card we might think they're going to use it as part of a workout routine.

LOL playing Cyberpunk on this card will feel like jamming four more people into your bedroom.

Honestly it's not a bad deal in the winter. If your toes start getting cold just crank the clocks a bit more.
 
Feb 4, 2009
34,577
15,794
136
Still seems like a hassle to make at most $100 per card after fees. They don't even let you buy more than one.



I have been toying with the idea of putting my desktop in the basement and using fiber-optic DisplayPort/HDMI for the monitors and fiber-optic USB for peripherals. Not sure what your setup is like, but relocating them to another part of the house is something to look into.



Probably the safest way to go. I had 8 dedicated 20 amp outlets installed in my home theater equipment/server room in the basement. Probably overkill, but the three server chassis each have dual 900w+ power supplies.
Our house is odd but nice.
The house was flipped before we bought it, and whoever did the new electrical work added a 20-amp wall plug in all but one room.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
So, to make sure I'm not too off-topic with the discussion, are there any thoughts on the power trend of the 3090 Ti? The entire 30-series has had me a bit squeamish this whole time wondering if Nvidia has forgotten about Fermi after all these years. The top-end 20-series card pulled 250W, and even ignoring this latest card, the previously top-end 30-series card pulls 100W over that. I know that there has been a bunch of rumbling that you can drop the voltage and only see a relatively minor drop in performance (5-10%) for a large drop in power usage (50-100W), and while I'd feel better presenting more concrete numbers, the idea of that doesn't sit too well with me.
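To put the undervolting rumors in perspective, here's a back-of-the-envelope perf-per-watt check using the ranges quoted above (5-10% performance for 50-100W). The 450W stock figure and the normalized "100 fps" baseline are assumptions for illustration, not measurements:

```python
# Rough perf-per-watt estimate for the rumored undervolt numbers.
# Stock power and the fps baseline are assumed for illustration.

def perf_per_watt(fps, watts):
    return fps / watts

stock = perf_per_watt(100.0, 450.0)      # stock 3090 Ti, normalized perf
undervolt = perf_per_watt(92.5, 375.0)   # midpoint: -7.5% perf, -75 W

gain = undervolt / stock - 1
print(f"Efficiency gain from undervolting: {gain:.0%}")  # about 11%
```

Even at the midpoint of those rumored ranges, efficiency improves by roughly a tenth, which says a lot about how far past the sweet spot the stock tuning sits.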

Still seems like a hassle to make at most $100 per card after fees. They don't even let you buy more than one.

It most likely is. The only explanation I can come up with is that they're willing to "test the waters" since returning the card is only a minor inconvenience.

I have been toying with the idea of putting my desktop in the basement and using fiber-optic DisplayPort/HDMI for the monitors and fiber-optic USB for peripherals. Not sure what your setup is like, but relocating them to another part of the house is something to look into.

I've seen some of the Linus Tech Tips videos involving things like that, or multi-user setups involving VMs, and my biggest concern is that it adds extra complexity that could end up problematic. One thing Linus mentioned (I believe in a recent house renovation update video) was that his wife had been having issues with the whole remote Thunderbolt-based computing setup, and they were only running it from the adjacent room.

In my case, the upstairs rooms, which include the computer room, are all pretty bad when it comes to heating and cooling. This is mostly due to poor insulation between the garage and the other rooms, and also the lack of an air return for the upstairs area. As much as I'd rather not spend thousands on another HVAC upgrade, it would heavily benefit not just the computer room but every other room too. For example, I'm currently using an AC Infinity powered vent cover in my room just to help pull more air in.

Noobs,

I only do 20 amp lines

Depending on the age of your house, you will likely have #12 wire and 20A breakers on all of your room circuits. If you really wanted, you could simply swap the 15A outlet for a 20A outlet, so long as you do have that setup. The reason it's done that way is that people typically wire multiple 15A outlets onto a single circuit, and the 20A breaker allows more headroom when multiple outlets are in use. (That's especially true in places like a kitchen.)
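The gauge/breaker pairings above boil down to a tiny lookup. These are the common simplified ampacities for copper branch-circuit wire; the real NEC rules also depend on insulation type, run length, and derating:

```python
# Simplified copper branch-circuit ampacities (AWG gauge -> max breaker amps).
# Illustrative only; actual NEC rules depend on insulation and derating.
AMPACITY = {14: 15, 12: 20, 10: 30}

def max_breaker(gauge_awg):
    """Largest breaker normally allowed on a given copper wire gauge."""
    return AMPACITY[gauge_awg]

print(max_breaker(14))  # 15 -- a 20A breaker on #14 wire exceeds its rating
```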

My house is kind of old, so the existing wiring is mostly #14, which is a bit scary when you consider that someone decided to replace the breakers with 20A breakers at some point. (14-gauge wiring is only rated for 15A and is usually used for lighting circuits, if anything.)
 

CP5670

Diamond Member
Jun 24, 2004
5,512
588
126
So, to make sure I'm not too off-topic with the discussion, are there any thoughts on the power trend of the 3090 Ti? The entire 30-series has had me a bit squeamish this whole time wondering if Nvidia has forgotten about Fermi after all these years. The top-end 20-series card pulled 250W, and even ignoring this latest card, the previously top-end 30-series card pulls 100W over that. I know that there has been a bunch of rumbling that you can drop the voltage and only see a relatively minor drop in performance (5-10%) for a large drop in power usage (50-100W), and while I'd feel better presenting more concrete numbers, the idea of that doesn't sit too well with me.

Maybe it's like a dry run for the 4090. If Nvidia thinks customers are fine with the power/heat output of the 3090 Ti, they might be more open to 600W or whatever that card will use.
 

Ranulf

Platinum Member
Jul 18, 2001
2,353
1,172
136
Anthony at LTT's review. It's cued to about 8m20s to 9m, with a mea culpa about Linus's influence on the 8K video they did on the 3090 last year. He cites interesting numbers from the Steam user survey regarding the high-end RTX 3000 cards. LTT's power-draw testing had it at about even for watts per fps gained, percentage-wise: 12% more power for 13% more fps.

 

Aapje

Golden Member
Mar 21, 2022
1,384
1,865
106
Are mobo slots designed to take the abuse from this monster?
There are actually two improvements to help with this. A major one is that the I/O plate (the one with the video ports) is now connected to the cooler, which means the back of the case takes much more of the card's weight rather than the mobo slot. There is also an (optional) cable, with a matching slot in the card, that can be used to hang the card from the top of the case.

I would actually be way more worried about the 3090 than the 3090 Ti when it comes to stress on the mobo, because the former lacks these improvements.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
Anthony at LTT's review. It's cued to about 8m20s to 9m, with a mea culpa about Linus's influence on the 8K video they did on the 3090 last year. He cites interesting numbers from the Steam user survey regarding the high-end RTX 3000 cards. LTT's power-draw testing had it at about even for watts per fps gained, percentage-wise: 12% more power for 13% more fps.


Back during one of my stints in line at Best Buy waiting for a card, the guy in front of me wanted a 3090 just for future-proofing. I think that was largely because he accepted that unless he had camped out since the previous night, there was no way he was getting a 3080 (and he was right about that), so the 3090 was likely the next best bet. I ended up getting a 3080 Ti from that trip -- the person two spots ahead of me got the last 3070 -- but I wish I had gone with a 3090 instead. Although, that was mostly because I had a 10% birthday coupon, which worked on the 3090 but not the 3080 Ti, and the 3090 has no LHR while all 3080 Ti cards do. I have no desire to mine, but I see it as a benefit if I ever decide to sell.

In the end, I guess I wouldn't be surprised if some people took that "future-proofing" approach or went with the 3090 for its capability with commercial applications. To a degree, I wonder if this could be partially related to the lower VRAM amounts on every card below the 3090. The other cards might be more popular if Nvidia had put 16GB of VRAM on cards like the 3080, as AMD did.
 

CP5670

Diamond Member
Jun 24, 2004
5,512
588
126
The extra memory is actually important in VR, and games definitely take advantage of it there. The gap between the 3090 and 3080 increases in high resolution headsets like the G2.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
The headline is correct. "these are dumb"

and all GPUs are still way too expensive.

I loved it when people rejoiced because 3070 Tis were finally under $1000. I was just so confused by that; who thinks $900 is acceptable for an 8GB card in 2022? And the 3090 Ti? It's the same die as the $700 3080 and performs very similarly. Nvidia can price the same component at wildly different levels simply to take advantage of the fact that some people are willing to pay more for the same thing. Nice!
 

Saylick

Diamond Member
Sep 10, 2012
3,170
6,403
136
I loved it when people rejoiced because 3070 Tis were finally under $1000. I was just so confused by that; who thinks $900 is acceptable for an 8GB card in 2022? And the 3090 Ti? It's the same die as the $700 3080 and performs very similarly. Nvidia can price the same component at wildly different levels simply to take advantage of the fact that some people are willing to pay more for the same thing. Nice!
I believe the practice is called price skimming; it's a way of capturing the consumer surplus. Basically, instead of selling the product at the market equilibrium price, the seller extracts maximum revenue by starting at a higher price and slowly decreasing it toward equilibrium. That way, everyone who was willing to buy at above-equilibrium pricing ends up paying more. That's basically what happened, albeit as a side consequence of the supply shortage and crypto boom.
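A toy sketch of the idea, with made-up willingness-to-pay numbers (four hypothetical buyers, one at each price point):

```python
# Toy price-skimming illustration; all numbers are hypothetical.
willingness = [2000, 1500, 1000, 700]  # each buyer's max price

# Single equilibrium price: everyone pays the lowest clearing price.
single_price_revenue = min(willingness) * len(willingness)

# Skimming: start high and walk the price down, capturing each buyer
# at the most they were willing to pay.
skimming_revenue = sum(willingness)

print(single_price_revenue, skimming_revenue)  # 2800 5200
```

Same four buyers, nearly double the revenue, which is exactly why a seller with a supply-constrained halo product has no incentive to price at equilibrium.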
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
I believe the practice is called price skimming; it's a way of capturing the consumer surplus. Basically, instead of selling the product at the market equilibrium price, the seller extracts maximum revenue by starting at a higher price and slowly decreasing it toward equilibrium. That way, everyone who was willing to buy at above-equilibrium pricing ends up paying more. That's basically what happened, albeit as a side consequence of the supply shortage and crypto boom.

Well I've been waiting for my equilibrium point for 2 years now and it's nowhere in sight! I don't care much about flat gaming these days, at least not enough to pay top dollar for it. I'm a lot more excited about advancements in stand-alone VR headsets.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,846
3,190
126
This card is just utterly stupid.
The performance/power percentages don't even merit it.

It's like 20% more power draw than my 3090 at a 6% net gain... that's stupidly excessive diminished returns.
At least on a good note, you can probably get this card at near MSRP: we know all the true gamers are looking for a 3080, miners used to look for the 3060 or 3090, and everyone else in the middle who could get a video card was torn between a 3070 Ti and a 3080 Ti at the original 3090 MSRP.

Noobs,
I only do 20 amp lines

Go dedicated NEMA 14-50 240V or go home... ;)
You can also use that for your EV. :p
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
I'm sick and tired of just complaining all the time about prices and pretending like I don't just want a 3090Ti. None of us have but a few decades left to live, so I caved and went back to my roots, which is high-end SLI. Check my sig and call me a traitor, or say congratulations. Either way I don't care! The dopamine rush is HEAVY boys!
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,819
7,180
136
I'm sick and tired of just complaining all the time about prices and pretending like I don't just want a 3090Ti. None of us have but a few decades left to live, so I caved and went back to my roots, which is high-end SLI. Check my sig and call me a traitor, or say congratulations. Either way I don't care! The dopamine rush is HEAVY boys!

- Wait till all the regret pours in when the post nut clarity kicks in...
 