Question Convince me not to buy a RTX 3090 Ti


Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I haven't upgraded in a very long time and I now have the itch. I will be doing a full platform upgrade later this year with either Zen 4 or Raptor Lake. Ideally I would love to pair an RTX 4000 series card with my new platform, but I'm guessing the scalpers and miners will suck up all the initial inventory and I will probably have to wait till next year to get the GPU I want.

Also full disclosure, I've always had a strong preference for "full fat" GPUs and not the scaled down versions.

That said, the mark downs on the elite RTX 3090 Ti cards are becoming very tempting. An aftermarket model will play every game I want with maximum detail and high performance. Only a few upcoming games interest me, like Baldurs Gate 3, Jedi Survivor, Diablo IV, Stalker and several others and I don't foresee any of them being problematic at 1440p, though I'm thinking about upgrading to a 4K ips monitor as well to be honest.

Would I be making a big mistake if I bought an RTX 3090 Ti for $1200?
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
Looking at the used marketplace now, 3080 cards are going for $600-$1000...not that much cheaper than EVGA has them on sale for now.

Heck, used RTX 2070 cards are often selling for their MSRP or more.
He might lose a bit on resale, but not that much.

I'm trying to predict what the used market would look like if a 4070 was announced at $499 and performs like a 3090. These folks can try to get a grand for their used 3080s then. I look forward to having a good laugh.
 

Leeea

Diamond Member
Apr 3, 2020
3,625
5,368
136
Would I be making a big mistake if I bought an RTX 3090 Ti for $1200?
I am a contrary person, so I will tell you the opposite.

You need to buy a 3090 Ti right now! Immediately! Right This Second!

Why you ask?:
1. Eth prices are climbing again, mining demand is going up, and these cards are likely as cheap as they will ever be.

2. Eth is supposedly going to transition to PoS, but is that really going to happen? They have said several times before that they were transitioning to PoS. I am spreading rumors that PoS is not going to happen this time either.

3. The 3090 Ti works with current generation power supplies. The upcoming 4000 series is going to require a whole new generation of PSUs, which are not even on the market yet! Even if you can get an RTX 4000 series card, you will never get your hands on the power supply for one.

4. The upcoming RTX 4000 series is the most hyped launch ever! But it is also going to be a paper launch. The simple truth is Nvidia needs to sell off their 3000 series inventory, so RTX 4000 series cards will likely be unobtainium for months to come.

5. The RTX 3090 Ti is the one to get! It fixes the horrible memory subsystem defect from the original-release RTX 3090 and RTX 3080.

6. What else are you going to use $1200 for? A tank of gas?

7. The upcoming RTX 4000 series will be far more watt-hungry than the 3090. Do you even want to deal with an 800 watt heater in your computer case? Does your case have the ventilation for that? The room for a quad-slot card that is 16 inches long?

8. In this time of soaring outdoor temperatures, can your air conditioner handle an RTX 4000 series industrial heater? What is that going to do to your electric bill? Truth is, buy the 3090; it is the affordable choice.


And that is what it ultimately comes down to. The RTX 3090 Ti is the affordable choice right now. Just buy it. Sell your kidney and make it happen.
 

BoomerD

No Lifer
Feb 26, 2006
62,899
11,289
136
I'm trying to predict what the used market would look like if a 4070 was announced at $499 and performs like a 3090. These folks can try to get a grand for their used 3080s then. I look forward to having a good laugh.

I suppose it's POSSIBLE it would be released at $499...but with crypto starting to come back up...I doubt it. The 4000 series MIGHT have reasonable MSRP attached to them...but between miners and scalpers...and video card companies wanting to take advantage of those inflated prices...I really expect them to be much higher. I've seen speculation that the 4090 will be priced north of $2000. I don't have one of these:
[image: Carnac the Magnificent]


So I freely admit they could be released MUCH cheaper than expected...but I won't hold my breath.
 
  • Like
Reactions: Tlh97 and Leeea

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,407
2,440
146
A good 3090 Ti for $1200 isn't bad, but a good 3080 12GB or 6900 XT is also a good suggestion. They all have their merits, so it's up to you. If you were thinking of getting a next-gen card anyway, and have the money, it might be fine to get a 3080 12GB or 6900 XT now and upgrade to next gen later if desired. With either of those cards, you wouldn't be spending as much money now, and could have more left for the next-gen cards if they are really good. And even if you don't upgrade to next gen, the 3080 12GB or 6900 XT would still be good for quite some time, depending on resolution and such. What res do you play at?
 

MoragaBlue

Member
Jul 17, 2022
38
24
36
Ah, though I probably wouldn't buy it at this price, there is a point where I would, just not now. Having said that, it's not a horrible, nose-bleeding price, and I can appreciate the temptation given the contrast to prior inflated pricing.

I'm of the opinion that if you have another graphics card, I'd probably wait until closer to the end of Q4, when we'll have more visibility on the 4000 series' performance and pricing, AMD's 7000 series specs and asking price, and how the secondary market's pricing will unfold. Right now, I'm puzzled by some of the asking prices these used-card owners think they'll get, especially when new ones from reputable retailers are at, near, or below used prices in some cases.

But, if you need a GPU now, can't get a PS5 (like me, not even with invites) and have some games you just wanna play, then, I suppose, a 3090Ti isn't horrible--the 3080s are the sweet spot, in my view.
 
  • Like
Reactions: Tlh97 and Carfax83

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
3. The 3090 Ti works with current generation power supplies. The upcoming 4000 series is going to require a whole new generation of PSUs, which are not even on the market yet! Even if you can get an RTX 4000 series card, you will never get your hands on the power supply for one.

Is this true? I just bought a 1600w platinum rated PSU. The price was so good I just couldn't pass it up. I had heard that the next generation of GPUs would require new connectors, but I assumed they would come with adapters for non ATX 3.0 PSUs.

Having to purchase a new PSU specifically for a new GPU is unreasonable.
 
  • Like
Reactions: Tlh97 and Leeea

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
What res do you play at?

Right now I play at 1440p, but I will certainly be upgrading to 4K with the new platform. I already have the financial means to purchase an LG C2 42 inch OLED HDTV.

From my research, it seems that's the best option for 4K HDR gaming on PC these days.
 

Leeea

Diamond Member
Apr 3, 2020
3,625
5,368
136
Is this true? I just bought a 1600w platinum rated PSU. The price was so good I just couldn't pass it up. I had heard that the next generation of GPUs would require new connectors, but I assumed they would come with adapters for non ATX 3.0 PSUs.

Having to purchase a new PSU specifically for a new GPU is unreasonable.

-blinks-

I do not know if it is true.

They are going to be putting capacitors on the cables to deal with transients, which have become a big issue. That is why we are getting new connectors.

I would not be surprised if an Nvidia card needs the "new"* design while AMD works with the old. AMD typically uses less power, and Nvidia already skimped on the onboard power delivery of the first-gen RTX 3080 and RTX 3090. The Ti fixes this; buy the Ti or the later-gen RTX 3080 12GB.


I would hope they would make an add-on connector or adapter for current PSU cables that lets them be used with the new standard.


*PSU cable caps are actually an old thing:
https://linustechtips.com/topic/550306-flipping-psu-cables-with-capacitors/
but I suspect this next gen will be a bit larger.
 
Last edited:
  • Like
Reactions: Tlh97 and Carfax83

Furious_Styles

Senior member
Jan 17, 2019
492
228
116
Is this true? I just bought a 1600w platinum rated PSU. The price was so good I just couldn't pass it up. I had heard that the next generation of GPUs would require new connectors, but I assumed they would come with adapters for non ATX 3.0 PSUs.

Having to purchase a new PSU specifically for a new GPU is unreasonable.

No guarantee on the adapters being included but the ATX 3.0 PSUs will certainly be the new standard and are starting to come out. So AMD will likely be changing over before long as well.
 
  • Like
Reactions: Tlh97 and Carfax83

Leeea

Diamond Member
Apr 3, 2020
3,625
5,368
136
Right now I play at 1440p, but I will certainly be upgrading to 4K with the new platform. I already have the financial means to purchase an LG C2 42 inch OLED HDTV.

From my research, it seems that's the best option for 4K HDR gaming on PC these days.

ok, bad news time.

The RTX 3090 is not enough to run 4K, sorry.

You're looking at 60 fps on a good day. That C2 you're looking at is a 120 Hz display. You're going to notice it, especially with the OLED. 9 out of 10 chance you will turn it down to 1440p and upscale, like everyone else is doing right now.

Both the RTX 3090 and RX 6900 XT are 1440p cards.

If you're wanting to play at 4K, wait for both Nvidia and AMD to release their next gen, and make an educated choice.



Lastly, I have an LG C1 OLED, and the hype is real.
 
Last edited:

randomhero

Member
Apr 28, 2020
181
247
86
Look, Nvidia has just posted a preliminary financial report. Revenue is down $1.5 billion from expectations, with $1.5 billion of obligations (read: inventory and paid-for production). All that means those cards are not worth the money they are asking. Don't buy; you'd be wasting your money. That 3090 Ti is a $500 card, not a penny more.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
Yeah, if playing at 4K then a 3090 Ti is definitely not much better than a 3080. The small percentage increase in performance translates to a literal 5 FPS when you're already in the mud at around 50 FPS on demanding titles. You need huge generational leaps to make any meaningful difference at 4K. If you play a demanding title that gets 40 FPS with a 3080, then even if a 3090 Ti is 20% faster, that's still just 48 FPS. Between the highs and lows, that 8 FPS is completely lost in the dynamics of it all. You need like TWICE the performance to really mean something. If spending $1200, I'd definitely wait for a 4080 or 4090. Then again, a 4090 will probably cost ten thousand dollars.
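The point above is simple arithmetic, and a short sketch makes it concrete (`uplifted_fps` is a hypothetical helper for illustration, not a benchmark from the thread): a 20% uplift on a low base framerate adds only a handful of FPS, while a genuine generational doubling actually changes the experience.

```python
# Minimal sketch of the framerate arithmetic in the post above.
# Numbers are the post's hypotheticals for a demanding title at 4K.

def uplifted_fps(base_fps: float, uplift_pct: float) -> float:
    """Apply a percentage performance uplift to a base framerate."""
    return base_fps * (1 + uplift_pct / 100)

base = 40.0  # 3080 in a demanding 4K title, per the post
print(uplifted_fps(base, 20))   # a 3090 Ti-class +20% uplift: 48.0 FPS
print(uplifted_fps(base, 100))  # a generational doubling: 80.0 FPS
```

Eight extra frames at the bottom of the range disappears into frame-time variance; forty extra frames does not.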
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,039
431
126
Look, Nvidia has just posted a preliminary financial report. Revenue is down $1.5 billion from expectations, with $1.5 billion of obligations (read: inventory and paid-for production). All that means those cards are not worth the money they are asking. Don't buy; you'd be wasting your money. That 3090 Ti is a $500 card, not a penny more.
I would say it is probably closer to a $650-700 card due to the memory. If it only had 12GB or 16GB, then I would agree with you on it being a $500 card at the moment, given the expected performance of the next-gen cards. The memory amount and performance will keep it useful for CUDA programming, and as such, it will hold a higher value.
 

BoomerD

No Lifer
Feb 26, 2006
62,899
11,289
136
I would say it is probably closer to a $650-700 card due to the memory. If it only had 12GB or 16GB, then I would agree with you on it being a $500 card at the moment, given the expected performance of the next-gen cards. The memory amount and performance will keep it useful for CUDA programming, and as such, it will hold a higher value.

I gotta say...I like the world you guys live in where the 3090Ti is a $500-$700 card...I'm pleasantly surprised that the 3090's are down to about $1000 atm. (don't WANT one, don't NEED one...but they're nice as hell for those who can benefit from them.)
 
  • Like
Reactions: Tlh97 and Carfax83

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
ok, bad news time.

The RTX 3090 is not enough to run 4K, sorry.

That's not considering DLSS though. Native 4K isn't worth the price of admission, I agree.

You're looking at 60 fps on a good day. That C2 you're looking at is a 120 Hz display. You're going to notice it, especially with the OLED. 9 out of 10 chance you will turn it down to 1440p and upscale, like everyone else is doing right now.

Are you able to down rez a C2 to 1440p natively, or are you talking specifically about DLSS?

Lastly, I have a lg c1 oled, and the hype is real.

It's a good thing you mentioned this because I have been researching the heck out of this lately. Have you noticed any burn in problems since you've owned yours?

And how does your desktop look? Does it look blurry or washed out at all?
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Look, Nvidia has just posted a preliminary financial report. Revenue is down $1.5 billion from expectations, with $1.5 billion of obligations (read: inventory and paid-for production). All that means those cards are not worth the money they are asking. Don't buy; you'd be wasting your money. That 3090 Ti is a $500 card, not a penny more.

This is the money post right here. I saw that financial report on Videocardz, and it just reinforced to me how desperate Nvidia and its partners are to get rid of their RTX 30xx inventory.

I actually wouldn't be surprised if they delayed the launch and availability of the RTX 40xx to give them time to do so. In light of this, I have decided not to purchase the RTX 3090 Ti.

My current setup is nowhere near powerful enough to push it properly anyway, so I should wait a few months until I can purchase my new platform. By then, RTX 30xx prices should be even cheaper than they are now if I still want one, or I can get an RTX 40xx card if they launch on time.

Thanks for all the advice dudes! ;)
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
So if a 4080 is 50% faster than a 3080, and a 3080 had an MSRP of $700, then we can certainly expect a 50% increase in MSRP to match that performance. The 4080 should be $1050, but they might throw us a bone and charge a nice $999.00 for the 80 class this time. After AIBs get ahold of it, prices should be about $1200-1500 for the better coolers.
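The tongue-in-cheek pricing logic above can be sketched in one line; the $700 MSRP and 50% uplift are the post's hypotheticals, not announced figures, and `linear_msrp` is an illustrative helper, not a real pricing model.

```python
# Sketch of the post's assumption that MSRP scales linearly with the
# performance gain. Purely illustrative numbers from the post above.

def linear_msrp(old_msrp: float, perf_gain_pct: float) -> float:
    """Scale an old MSRP by the same percentage as the performance gain."""
    return old_msrp * (1 + perf_gain_pct / 100)

print(linear_msrp(700, 50))  # 3080 -> 4080 at +50% performance: 1050.0
```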
 
  • Like
Reactions: Tlh97 and Ranulf

Fallen Kell

Diamond Member
Oct 9, 1999
6,039
431
126
I gotta say...I like the world you guys live in where the 3090Ti is a $500-$700 card...I'm pleasantly surprised that the 3090's are down to about $1000 atm. (don't WANT one, don't NEED one...but they're nice as hell for those who can benefit from them.)
I'm basing the value on the very high probability of the leaked performance of the new cards that will be available within 6-8 months. For something like the 3090 Ti, whose main "value" is simply that it is the biggest/baddest/fastest card on the market, that "value" has a VERY limited lifespan at this moment in time. It can no longer command the premium it once held, since its window for measuring the epeen is too short. It has some value to places that will use it for its CUDA capability and do not want to bite the bullet and go to workstation-class or enterprise-class cards, but even for those customers it still needs to make sense based on the market. Most of them won't worry about waiting the 6 months it might take for the next-gen cards to be out, and will either buy a next-gen card that is 50-80% faster for the same money, or drop down a level (or two) and buy a card that is roughly comparable or even 10-15% faster for only ~$700-800.

The only redeeming feature of the 3090 Ti is its 24GB of memory, which is the main thing that will keep the card relevant at all. But the only people who really need that are the ones running CUDA code that requires a lot of memory. Most people or entities don't need that much at this moment, and of the ones that do, most are currently not economical to run, since they generate something you have to hope there is a sucker...I mean buyer, for the coin (but lots of people have now seen that they are the suckers left holding a bag of nothingness, having given away their real money for fake money with no backing behind it).
 
Last edited:
  • Like
Reactions: Tlh97 and Ranulf

Leeea

Diamond Member
Apr 3, 2020
3,625
5,368
136
It's a good thing you mentioned this because I have been researching the heck out of this lately. Have you noticed any burn in problems since you've owned yours?

And how does your desktop look? Does it look blurry or washed out at all?
I am sorry, mine is a 55" and I only use it for couch games (Elden Ring) and media consumption (Disney+, YouTube). No burn-in, but a different use case.


Looking real hard at the same 42" monitor you are for desktop use, but the price tag is making me blink twice. LG discounts aggressively though, so in several months that same monitor will be around $1200. I am not worried about the burn-in at all. Or, more accurately, I am willing to accept burn-in. I do plan on using the monitor's anti-burn-in features and dropping the brightness to 80%.


My RX 6900 XT is not a 4K card though, so I am thinking I will wait until next gen too. It has no problem pushing Elden Ring at 4K, but Elden Ring is capped at 60 fps. If I jumped into a first-person shooter it would be too slow.


The OLED is just better though. Not a little better, but massively better. It is not just the deep blacks, the contrast, or the color. The picture in game is just clearer. The motion is just easier to see. It makes the game easier.
 
Last edited:

Leeea

Diamond Member
Apr 3, 2020
3,625
5,368
136
Are you able to down rez a C2 to 1440p natively, or are you talking specifically about DLSS?
None of the games I play support DLSS; not that it matters, as I have an AMD card.

Both manufacturers have long had features in their drivers that let a person scale up any game if they want to. On AMD it is "Enable GPU Scaling" under the display settings. On Nvidia it might be under "Adjust Desktop Size and Position".


When I play at 1440p I use a 1440p monitor. I have been playing Total War: Warhammer II quite a bit lately, and that has all been on the 1440p monitor.


I only use the 4K TV for couch games, which all use a game controller (mine is an Xbox 360 controller for PC). For me that is the Dark Souls series, all of which are capped internally at 60 fps. As long as the frame-time pacing is good it is OK, but I would not be able to play an FPS on that same monitor at those frame rates.
 
Last edited:
  • Like
Reactions: Tlh97 and Carfax83

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Looking real hard at the same 42" monitor you are for desktop use, but the price tag is making me blink twice. LG discounts aggressively though, so in several months that same monitor will be around $1200. I am not worried about the burn-in at all. Or, more accurately, I am willing to accept burn-in. I do plan on using the monitor's anti-burn-in features and dropping the brightness to 80%.

Yeah the reviews I've been watching for using it as a desktop monitor are all very positive for the most part.

If I do buy it I'll likely get it from Best Buy, as I have their store card where I can get 0% interest for 24 months.

If you're in the U.S., Black Friday and Cyber Monday sales should be pretty good, I wager.

The OLED is just better though. Not a little better, but massively better. It is not just the deep blacks, the contrast, or the color. The picture in game is just clearer. The motion is just easier to see. It makes the game easier.

Yeah, it definitely seems that way. To be honest, I've always hated shopping and researching monitors. There's always some drawback associated with the various panel types, and none of them really excel in all areas.

IPS has good viewing angles, colors, and response times, but mediocre contrast. PVA monitors have comparatively good contrast, but poor response times and bad viewing angles. TN panels have very quick response times, but suck at everything else.

And now OLED has close-to-perfect contrast, great response times, and true HDR, but can suffer from burn-in...

You just can't win :persevere: