Question Convince me not to buy a RTX 3090 Ti

Page 3 - AnandTech Forums

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I haven't upgraded in a very long time and I now have the itch. I will be doing a full platform upgrade later this year with either Zen 4 or Raptor Lake. Ideally I would love to pair an RTX 4000 series card with my new platform, but I'm guessing the scalpers and miners will suck up all the initial inventory and I will probably have to wait till next year to get the GPU I want.

Also full disclosure, I've always had a strong preference for "full fat" GPUs and not the scaled down versions.

That said, the mark downs on the elite RTX 3090 Ti cards are becoming very tempting. An aftermarket model will play every game I want with maximum detail and high performance. Only a few upcoming games interest me, like Baldurs Gate 3, Jedi Survivor, Diablo IV, Stalker and several others and I don't foresee any of them being problematic at 1440p, though I'm thinking about upgrading to a 4K ips monitor as well to be honest.

Would I be making a big mistake if I bought an RTX 3090 Ti for $1200?
 
Jul 27, 2020
15,738
9,790
106
And now OLED has close to perfect contrast, great response times, true HDR but can suffer from burn in........
Don't worry that much about it. I have the LG C8 OLED. As long as you view mixed content (games and movies) on the OLED and don't do non-stop marathon sessions of games with fixed HUD elements on the screen, you won't cause any burn-in. Even if you have to play a game with a fixed HUD and you're worried, pause the game every 15 to 20 minutes, pick up the remote, and switch to the home screen of the LG OS so the screen gets a break from displaying fixed elements constantly. There might be temporary image retention after a while, which you can fix by running the built-in pixel scrubber. One important thing to remember: leave the TV plugged in every day and don't cut its power at the mains. During downtime, the TV will automatically run a routine refresh of the pixels.

I've had my TV since 2019 and I've yet to notice any issue with the panel so far. I was also lucky to get the C8, since it does 1000 nits; the C9 and some of the later models are capped at 700-800 nits. Maybe the colors aren't as deep as on the C9 and later, but I really enjoy the bright highlights that make the images pop like anything and make watching beautiful scenes pure joy. Pull the trigger. And seriously look into whether the C2 is worth the price difference compared to the C1 (compare both on rtings.com).
 

Leeea

Diamond Member
Apr 3, 2020
3,599
5,340
106
And now OLED has close to perfect contrast, great response times, true HDR but can suffer from burn in........

You just can't win :persevere:
OLED also handles ambient light poorly. It is the most reflective of any display I have. Black scenes will reflect a well-lit room, so I arranged things so no direct light sources reflect off the TV toward me.

On the flip side the blacks are just better. A lot better. I am of the opinion good blacks are not achievable without the reflection issue.
 
Last edited:
  • Like
Reactions: Tlh97 and Carfax83
Jul 27, 2020
15,738
9,790
106
OLED also handles ambient light poorly.
I solved it through bias lighting. It's bright enough that I don't need to turn on the overhead light in the room. OLEDs are best for dark room/low light environments. In a bright room, the highlights are not that visible. I think that for a bright room, we really need 4000 nits to truly enjoy HDR content.
 
  • Like
Reactions: Tlh97 and Leeea

CP5670

Diamond Member
Jun 24, 2004
5,508
586
126
I find a 3090 good enough for 4K (at least 90fps consistently) at a mix of high/ultra settings, but not with RT or VR. Anything RT at 4K needs the next generation of cards.

I have used the LG OLEDs for some years now with no burn in, and they blow away any regular PC monitor for games. The key is to keep the brightness down when you're not gaming. You can set different profiles and switch between them quickly. They are definitely more suited for a darker room though. Right now there are some great deals on the 2021 models at 30-40% off.

I've had my TV since 2019 and I've yet to notice any issue with the panel so far. I was also lucky to get the C8, since it does 1000 nits; the C9 and some of the later models are capped at 700-800 nits. Maybe the colors aren't as deep as on the C9 and later, but I really enjoy the bright highlights that make the images pop like anything and make watching beautiful scenes pure joy. Pull the trigger. And seriously look into whether the C2 is worth the price difference compared to the C1 (compare both on rtings.com).

I used to have an E8 and it looked great, but it only did 60Hz, which also resulted in more motion blur. The G1/C2 should achieve the same max brightness but with modern features.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Well, I devoured multiple hours of content the past few days concerning whether to get a native OLED panel like the LG C2 for PC desktop and gaming purposes, and I've come to my conclusion.

I've decided that a native OLED panel just wouldn't be a good solution for me for a number of reasons. First off, I think 42 inches is too large for how close I usually have my monitor positioned on my desk. I would either have to modify my desk or even get a new one just to accommodate that massive increase in screen real estate.

Second, I don't like all the caveats associated with using an OLED panel for a desktop PC. Having to do this and that to make sure burn-in isn't accelerated rubs me the wrong way, whereas with a true PC monitor I've never even had to care about such things.

Third, it seems that some of the recent PC monitors are getting quite close to OLED black level performance, without having the weaknesses of OLED like burn in and lower brightness. I've been watching and reading tons of reviews and feedback and so far, I'm leaning towards the Samsung Neo G8 4K 32 inch 240Hz monitor with 1196 local dimming zones and high brightness.

It has a superior contrast ratio to any IPS panel, including the elite-level IPS monitors with FALD technology that cost $2500-3000, because it's a VA panel, and VA has always had higher native contrast than IPS; even more so now with FALD.
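For a rough sense of the contrast math being argued here, a minimal sketch (the figures below are typical illustrative numbers, not measured specs of any specific monitor): VA panels commonly sit around 3000:1 native contrast versus roughly 1000:1 for IPS, and full-array local dimming multiplies the effective ratio by lowering the backlight behind dark zones.

```python
# Illustrative contrast arithmetic with assumed typical figures.
# Native contrast = white luminance / black luminance at a fixed backlight level.

def effective_contrast(native_ratio: int, dimming_factor: int) -> int:
    """FALD cuts zone backlight behind dark areas, so the effective
    contrast is roughly native contrast times the dimming factor."""
    return native_ratio * dimming_factor

va_native = 3000    # assumed typical VA panel (~3000:1)
ips_native = 1000   # assumed typical IPS panel (~1000:1)

# If local dimming can drop a dark zone's backlight to 1/10th:
print(effective_contrast(va_native, 10))   # 30000
print(effective_contrast(ips_native, 10))  # 10000
```

This is why FALD narrows the gap for IPS but a VA panel with the same dimming still starts from a higher native ratio.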

It would be hilarious if I do end up getting that monitor because way back in the day I had a 30 inch Samsung 305T 2560x1600 16:10 aspect ratio monitor that I loved to death. Unfortunately it didn't love me back and I had to RMA it twice. I vowed never to get another Samsung monitor ever again from my experiences with how faulty that monitor was, but I may just go back on my word.

I'll just make sure to also purchase a third-party extended warranty if I do get it.
 

itsmydamnation

Platinum Member
Feb 6, 2011
2,743
3,069
136
I haven't upgraded in a very long time and I now have the itch. I will be doing a full platform upgrade later this year with either Zen 4 or Raptor Lake. Ideally I would love to pair an RTX 4000 series card with my new platform, but I'm guessing the scalpers and miners will suck up all the initial inventory and I will probably have to wait till next year to get the GPU I want.

Also full disclosure, I've always had a strong preference for "full fat" GPUs and not the scaled down versions.

That said, the mark downs on the elite RTX 3090 Ti cards are becoming very tempting. An aftermarket model will play every game I want with maximum detail and high performance. Only a few upcoming games interest me, like Baldurs Gate 3, Jedi Survivor, Diablo IV, Stalker and several others and I don't foresee any of them being problematic at 1440p, though I'm thinking about upgrading to a 4K ips monitor as well to be honest.

Would I be making a big mistake if I bought an RTX 3090 Ti for $1200?

...ladies and gentlemen of this supposed forum, I have one final thing I want you to consider.
Ladies and gentlemen, this is Chewbacca. Chewbacca is a Wookiee from the planet Kashyyyk.
But Chewbacca lives on the planet Endor. Now think about it; that does not make sense!
Why would a Wookiee, an 8-foot-tall Wookiee, want to live on Endor, with a bunch of 2-foot-tall Ewoks?
That does not make sense! But more important, you have to ask yourself: What does this have to do with this GFX card? Nothing.
Ladies and gentlemen, it has nothing to do with this GFX card! It does not make sense! Look at me. I'm a computer nerd on an internet forum, and I'm talkin' about Chewbacca!
Does that make sense? Ladies and gentlemen, I am not making any sense! None of this makes sense!
And so you have to remember, when you're in that checkout deliberatin' and conjugatin' the Emancipation Proclamation, does it make sense? No! Ladies and gentlemen of this supposed forum, it does not make sense!
If Chewbacca lives on Endor, you must not buy the GFX card!
 

Furious_Styles

Senior member
Jan 17, 2019
492
228
116
Well, I devoured multiple hours of content the past few days concerning whether to get a native OLED panel like the LG C2 for PC desktop and gaming purposes, and I've come to my conclusion.

I've decided that a native OLED panel just wouldn't be a good solution for me for a number of reasons. First off, I think 42 inches is too large for how close I usually have my monitor positioned on my desk. I would either have to modify my desk or even get a new one just to accommodate that massive increase in screen real estate.

Second, I don't like all the caveats associated with using an OLED panel for a desktop PC. Having to do this and that to make sure burn-in isn't accelerated rubs me the wrong way, whereas with a true PC monitor I've never even had to care about such things.

Third, it seems that some of the recent PC monitors are getting quite close to OLED black level performance, without having the weaknesses of OLED like burn in and lower brightness. I've been watching and reading tons of reviews and feedback and so far, I'm leaning towards the Samsung Neo G8 4K 32 inch 240Hz monitor with 1196 local dimming zones and high brightness.

It has a superior contrast ratio to any IPS panel, including the elite-level IPS monitors with FALD technology that cost $2500-3000, because it's a VA panel, and VA has always had higher native contrast than IPS; even more so now with FALD.

It would be hilarious if I do end up getting that monitor because way back in the day I had a 30 inch Samsung 305T 2560x1600 16:10 aspect ratio monitor that I loved to death. Unfortunately it didn't love me back and I had to RMA it twice. I vowed never to get another Samsung monitor ever again from my experiences with how faulty that monitor was, but I may just go back on my word.

I'll just make sure to also purchase a third-party extended warranty if I do get it.
Yeah, I'm personally waiting for a 4K OLED that's 27-30" and 144Hz. That would work perfectly for my desk. Hoping to see it in the next year. I'd also rather not spend over $1K on it. I already spent $800 on my current monitor, and as it turns out it was not really much of an upgrade over my 1440p 144Hz Acer IPS.
 
  • Like
Reactions: Tlh97 and Leeea

Fallen Kell

Diamond Member
Oct 9, 1999
6,009
417
126
Yes, buy a 3090 Ti please, so you can skip getting a 4090, leaving me one less person to fight in trying to secure one.
Hahahahaa....

That I have to say is a great post :D

I will probably personally be looking for a 4080 and/or 4070 (depending on the power usage and length of the card). I might be picking up a 3070 (I tried once already but had to RMA it due to DOA, and at this point I suspect prices should fall further before I try a 3070 again). This is for a gaming HTPC build (in which I'm probably going to upgrade the CPU from the 5600X I have now to a 5800X3D).
 
  • Like
Reactions: Tlh97 and Leeea

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,839
3,171
126
I might be picking up a 3070 (I tried once already but had to RMA it due to DOA, and at this point I suspect prices should fall further before I try a 3070 again)

I would not even consider an Nvidia card unless it was a Ti.
The 6750 XT is a beast, especially with a Ryzen CPU and SAM activated.
The 6700 XT can even be found dirt cheap at prices below a 3060 Ti.

The 3070 honestly feels boring to me unless you're a miner, and even then I hear miners are moving over to the 6700 XT...
I'd see how far a 3080 will fall, or get a 6750 XT and call it a day.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
More than half that budget goes for the 4090Ti then, probably $2500 or something.

If it does go that high, I won't be buying it. My personal cap for GPUs is around $1500. I don't ever see myself spending more than that, even though I can afford it, because GPUs are the fastest-progressing technology in a PC, so investing heavily in them is nonsensical to a point.

I used to do that though not going to lie. My most shameless streak was GTX 470 SLI to GTX 480 SLI to GTX 580 SLI to GTX 670 SLI to GTX 770 SLI to GTX 970 SLI to GTX 980 SLI to a single 980 Ti (SLI fever finally broke) to single GTX 1080 to current Titan Xp.

With mining GPUs flooding the market, it's harder to offload old hardware these days to recover some of the losses incurred by constantly upgrading your GPU, so I'm far pickier and more selective than I used to be.

The reason I considered getting an RTX 3090 Ti is that, technically, it's a large jump from a Pascal-based Titan Xp due to things like DLSS and ray tracing support.

The other expensive item in my build will be the monitor.
 
Last edited:

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
I expect the 4090 to be at the very least the same price as the 3090, so $1500. The Ti variant will obviously cost more, probably $2000 at the very least to maintain the price of the 3090Ti. I was on the SLI bandwagon since the beginning when Nvidia stole it from 3DFX and all the way to the single 1080Ti. It's fun to recall, so let's see if I can:

6800 GT SLI
7800 GTX SLI
8800 GT SLI
GTX 260 core 216 SLI
GTX 570 SLI
GTX 670 SLI
GTX 980 Ti SLI

And now Nvidia wants QUAD SLI money for a single enthusiast card. They can suck it.
 
  • Like
Reactions: Carfax83 and Ranulf

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
I expect the 4090 to be at the very least the same price as the 3090, so $1500. The Ti variant will obviously cost more, probably $2000 at the very least to maintain the price of the 3090Ti. I was on the SLI bandwagon since the beginning when Nvidia stole it from 3DFX and all the way to the single 1080Ti. It's fun to recall, so let's see if I can:

6800 GT SLI
7800 GTX SLI
8800 GT SLI
GTX 260 core 216 SLI
GTX 570 SLI
GTX 670 SLI
GTX 980 Ti SLI

And now Nvidia wants QUAD SLI money for a single enthusiast card. They can suck it.
oooo fun SLI history!

for me

8800GTS 640mb SLI
GTX 260 core 216 Tri-SLI
GTX 480 SLI
GTX 670 SLI
GTX 780 SLI
GTX 980 Ti SLI
RTX 2080 Ti SLI
 

VirtualLarry

No Lifer
Aug 25, 2001
56,226
9,990
126
I had SLI GTX 460 1GB cards. Those were such good cards BITD, and so (relatively) affordable compared to today. Imagine if the x60 cards could still be SLI'ed, or the x70 cards.
 
  • Like
Reactions: Carfax83