Question: Wait for 3080 Ti or not?


ZowieR

Member
Apr 4, 2020
39
1
16
I don't know...

It's only a little stronger than the 3080. The Ti is usually the "flagship version", but that doesn't seem to be the case here, since it's only slightly faster.

I'm in a dilemma because I don't know whether I'll play at 1440p or 4K... Some people say the 10GB of VRAM isn't bad and will be good for years, while others say differently, that the 20GB on the Ti is very important for the future.

I want to play at the highest settings for the next 3-4 years.

Or would buying a 6800 XT right now be the better decision?



What do you think? Isn't it a stupid decision by Nvidia to release a better version only a few months later?
 

Mopetar

Diamond Member
Jan 31, 2011
7,797
5,899
136
It's probably good enough for RT as well, since you'll need to run DLSS to get playable frame rates and the extra space that RT requires is offset by the lower resolution. If Microsoft gets DirectStorage working reasonably well in the next year or so, I think it alleviates a lot of problems that the 3080 might start running into down the line. If you can start pulling everything in fast enough from the SSD (and there's no reason to think anyone who will buy a 3080 right now isn't going to get an NVMe drive if it would help), then the memory capacity becomes less of an issue. There's obviously a limit to how much of a juggling act you can do with that, but it probably gets you functionally closer to something like a 12-14 GB card.
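To make the "juggling act" concrete, here's a toy sketch of the idea (my own illustration with made-up names and sizes, not the actual DirectStorage API): keep only what the scene currently needs resident in VRAM, stream the rest in from the SSD on demand, and evict the least-recently-used assets when a fixed budget is exceeded.

Code:
from collections import OrderedDict

class TextureStreamer:
    """Toy LRU residency manager: stays under a fixed VRAM budget by
    loading textures on demand and evicting the least-recently-used.
    A real engine would do the SSD -> VRAM copies asynchronously."""

    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # name -> size in MB, ordered by last use

    def request(self, name, size_mb):
        if name in self.resident:                # already in VRAM: mark as used
            self.resident.move_to_end(name)
            return
        while sum(self.resident.values()) + size_mb > self.budget_mb:
            evicted, freed = self.resident.popitem(last=False)
            print(f"evict {evicted} ({freed} MB)")
        print(f"stream in {name} ({size_mb} MB)")  # stand-in for the SSD read
        self.resident[name] = size_mb

streamer = TextureStreamer(budget_mb=10_000)     # pretend 10 GB card
streamer.request("city_block_07", 3_000)
streamer.request("character_pack", 4_000)
streamer.request("terrain_far", 4_000)           # forces an eviction

The faster that "stream in" step gets, the smaller a budget you can get away with, which is the sense in which fast I/O makes a 10 GB card behave like a bigger one.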
 

Borealis7

Platinum Member
Oct 19, 2006
2,914
205
106
No reason to wait, in my opinion. Try to get a 3080, and if you don't manage to get one by the time the Ti comes out, see how it impacts the market and decide then.
The performance gap between the 3080 and 3090 doesn't leave much room for a card in the middle, so I say get a 3080 as soon as you can and be done with it.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
I got my pennies in order for an upgrade but there isn't anything available. If you need something now and can find something, just get it. It's looking like the middle of next year for me based on supply.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,107
1,260
126
I expect we're talking a 5-10% difference in performance; it comes down to the VRAM, if you care about that. I'd get whichever you can actually find in stock. Stock is awful on these cards.
 

sze5003

Lifer
Aug 18, 2012
14,177
622
126
At this point that's what I've started doing. I do want more VRAM so I can keep the card longer, but who knows when one will be available. By the time I'm actually able to buy one, I'm pretty sure both will be out and just as unobtainable.
 

Rakehellion

Lifer
Jan 15, 2013
12,182
35
91
I went out and bought the RTX 3080 after seeing the RTX 3090's performance. I presume the 3080 Ti will be slightly worse than the 3090 and cost at least $1000, unless Nvidia is extremely aggressive on pricing. But it'll be what, 10-15% faster than the 3080? Basically, I doubt the pricing will match its performance.

10GB of memory is pathetic, though.
 
  • Like
Reactions: guachi

Guru

Senior member
May 5, 2017
830
361
106
Mopetar said:
It's probably good enough for RT as well, since you'll need to run DLSS to get playable frame rates...
I've never been able to understand this. You buy a very expensive GPU, $700 or more, and want to max everything out for the best possible quality, so obviously you want to enable ray tracing for the better effects. But it tanks performance so badly that the experience becomes console-level. So instead of skipping ray tracing and its small benefits and playing at max settings at 4K and 60fps, you use ray tracing with DLSS, which actually gives lower graphical quality, and you're still not gaming at 60fps, but at, say, 45fps.

So instead of gaming at the best visual quality at 4K and 60fps, you enable DLSS for slightly better traced shadows, which lowers image quality and still can't deliver a 60fps experience. You get worse graphics, worse performance, and realistically 1440p upscaled, so you're not even truly experiencing 4K; it's just an illusion of it!

It's one thing if the game you want ray tracing in runs at 100fps: even with ray tracing tanking performance by, say, 30%, you still get 70fps, so you get the best quality at playable frame rates. But if you have to enable DLSS to get playable frame rates, you're sacrificing visual quality and resolution just to have ray tracing effects, which in most games so far have ranged from worse than rasterization to barely noticeable!
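The frame-rate arithmetic in that argument is easy to sanity-check (illustrative numbers only, matching the ones above):

Code:
def fps_with_rt(base_fps, rt_cost=0.30):
    """Frame rate after ray tracing eats a fraction of the frame budget."""
    return base_fps * (1 - rt_cost)

print(fps_with_rt(100))  # 70.0 -> still comfortably playable without DLSS
print(fps_with_rt(60))   # 42.0 -> now DLSS is needed to claw frames back

For reference on the "1440p upscaled" point: DLSS Quality at a 4K output renders internally at 2560x1440, and Performance mode at 1920x1080.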
 
  • Like
Reactions: Leeea and guachi

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
Guru said:
I've never been able to understand this. You buy a very expensive GPU, $700 or more, and want to max everything out for the best possible quality...

All that might have been true with DLSS 1, but not so much with DLSS 2. It actually provides a nice anti-aliasing effect. There's still some weirdness in places, but on Balanced or above there's evidence it might be preferable to native resolution overall. If you haven't, go check out the Gamers Nexus Cyberpunk DLSS coverage. Also for single...
 

CP5670

Diamond Member
Jun 24, 2004
5,508
586
126
But there is not all that much difference between the 3080 and 3090 besides the VRAM anyway? 12% or so? Seems a bit meh.

The 3090 does pull away a bit at 4K, but yeah, they're fairly close. The 3080 Ti vs. the 3090 might be more like a 2% difference, going by the leaked specs. Although if the cards remain this hard to find, I think we'll have a choice of either paying a big markup on a 3080 Ti or a small markup on a 3090.
 

Muadib

Lifer
May 30, 2000
17,914
838
126
Does it really matter? Between bots and Covid, getting one is next to impossible. 3090s have been relatively easy to get, but nothing else in the 3000 series has been. I don't see that changing with the 3080 Ti, but I would love to be proven wrong.
 

CP5670

Diamond Member
Jun 24, 2004
5,508
586
126
That's what I meant. I expect the 3080 Ti to cost well over MSRP in practice, to the extent that you may as well just buy a 3090. The 3090 is often $1700-1800, but it does actually stay in stock at that price for at least a few hours.
 

sze5003

Lifer
Aug 18, 2012
14,177
622
126
I plan on using a 3080 Ti or 3080 with my 34-inch ultrawide monitor. I don't plan on getting a 4K monitor unless I can find one the same size as what I have now, and I don't see myself ever using a TV as a monitor on my desk. While that 48-inch LG CX is nice, I can't see having it on my desk. Now that I've been working from home, I like having space on my desk.
 

CP5670

Diamond Member
Jun 24, 2004
5,508
586
126
I replaced my PC desk with a mobile TV stand (and repurposed that as a work from home desk), so I could use whatever size display I wanted.

If you're considering an OLED TV, I would recommend getting that first before any of these overpriced video cards. It offers a far bigger improvement in your gaming experience. You can always reduce settings in games or use 2560x1440, which upscales pretty well to 4K.
 

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
CP5670 said:
...You can always reduce settings in games or use 2560x1440, which upscales pretty well to 4K.

FWIW, most games have a render scaling option that lets you render the 3D world at whatever resolution you want while still rendering the UI at native resolution. That's a much better solution than just dropping the monitor resolution if you need to render at a lower one.
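A quick sketch of the difference (my own illustration): with render scaling, only the 3D scene resolution drops, while the UI still composites at native.

Code:
def internal_resolution(native_w, native_h, render_scale):
    """Size of the 3D render target for a given render-scale setting.
    The HUD/UI still draws at (native_w, native_h) over the upscaled scene."""
    return int(native_w * render_scale), int(native_h * render_scale)

print(internal_resolution(3840, 2160, 0.75))  # (2880, 1620) scene, 4K-sharp UI
# vs. dropping the monitor mode to 2560x1440, which blurs the UI as well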
 

gdansk

Golden Member
Feb 8, 2011
1,975
2,354
136
Spending $800 to play 1440p is gross.
Oh, I think the majority of 6800 XT and 3080 owners are at that resolution. I'm not, and from my own experience I can say that 4K gameplay is seldom worth it. Strategy games probably benefit the most. I only have a 4K monitor because I stare at text 8 hours a day for work.

I reiterate my opinion: GPUs are not long-term products. That's why I don't see either AMD's middling ray-tracing performance or Nvidia's gimped memory as worth spending excessively to avoid this generation. Unless Nvidia prices the 3080 Ti extremely aggressively, it won't have relevant price-to-performance.
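To put "relevant price to performance" into numbers (hypothetical price and uplift, purely to illustrate the point):

Code:
def perf_per_dollar(relative_perf, price_usd):
    return relative_perf / price_usd

rtx_3080    = perf_per_dollar(1.00, 699)  # MSRP, baseline performance
rtx_3080_ti = perf_per_dollar(1.10, 999)  # guessed +10% perf at a guessed $999
print(f"{rtx_3080:.5f} vs {rtx_3080_ti:.5f}")  # ~0.00143 vs ~0.00110

On those guesses the Ti delivers noticeably less performance per dollar, which is why the pricing would have to be aggressive to matter.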
 
  • Like
Reactions: Leeea

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
Big problem with 4K is the lack of HDMI 2.1 or DP 2.0 monitors. Hopefully we will see more at CES this year. Going back to 60Hz isn't something any serious gamer would contemplate.
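Rough back-of-envelope on why HDMI 2.1 matters for this (my own arithmetic, ignoring blanking intervals):

Code:
# Uncompressed pixel data for 4K @ 120 Hz with 10-bit RGB (30 bits/pixel)
gbps = 3840 * 2160 * 120 * 30 / 1e9
print(f"{gbps:.1f} Gbit/s")  # ~29.9 Gbit/s of pixel data alone

HDMI 2.0 carries roughly 14.4 Gbit/s of payload, while HDMI 2.1's FRL signaling carries up to about 42.7 Gbit/s, so 4K120 without chroma subsampling or DSC really does need a 2.1-class link.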
 
  • Like
Reactions: Leeea

CP5670

Diamond Member
Jun 24, 2004
5,508
586
126
The current OLED/QLED TVs do support HDMI 2.1 and let you use 4K at 120Hz without any compression or latency. For me, it's the main reason to buy one of the new cards.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,839
3,174
126
Spending $800 to play 1440p is gross.


:X

I guess I'm rancid and probably stink up the room, because both of these cards are being used on 1440p monitors. Well, ultrawide 1440p, which is one step up from what you call gross.
 
  • Like
Reactions: ozzy702

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
If you have a high-refresh monitor at 1440p, it's not likely you're "wasting" frames anyway. If you like to play with ULMB at max settings, it's really your best option.