Nvidia RTX 2080 Ti, 2080 (2070 review is now live!) information thread: reviews and prices


PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
It's not the next big thing in graphics, and it will never become viable on current transistor-based silicon processors.

Currently it cuts frame rates by 66% and doesn't even look that good.

Moore's law has basically stalled on die shrinks; we're now at the point of five-year cycles for a real node shrink. GPUs are becoming massive, with costs to match.

So if a £1500 GPU produces 33% of the required performance on the current node, hitting full speed today would take £4500 of GPU. If the next node gives 25% more performance at the same power and price, that £4500 becomes £3600 on 7nm, and in five years' time, with the same improvement again, about £2880. In other words, two node shrinks compound to about 56% more ray-tracing frames (1.25 × 1.25 ≈ 1.56), which only takes you from 33% to about 52% of today's normal frame rate (see the sketch below).

The beauty of GPU silicon is that it scales very well and performance is pretty linear, so unless a miracle new architecture comes out, explain to me how this is going anywhere. No technology has ever been sustained in the PC space by elite hardware alone; if it doesn't go mainstream, it dies.
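A quick sketch of the compounding in the quoted argument. All the inputs here are the quoted poster's assumptions, not measured data: 33% of the target ray-tracing frame rate today, £4500 of GPU for full speed, and a 25% gain per node shrink.

```python
# Back-of-the-envelope sketch of the node-scaling argument quoted above.
# All inputs are the poster's assumptions, not measured data.

rt_share = 0.33         # fraction of the target RT frame rate today
gain_per_node = 1.25    # assumed performance gain per node shrink
full_speed_cost = 4500  # £ of GPU needed for full RT speed today

for node in (1, 2):
    scale = gain_per_node ** node
    print(f"after {node} node shrink(s): "
          f"~£{full_speed_cost / scale:.0f} for full speed, "
          f"or ~{rt_share * scale:.0%} of target at today's price")
```

This prints ~£3600 / ~41% after one shrink and ~£2880 / ~52% after two, which is where the figures above come from.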

Tech will start to focus on things like research into new materials and, more importantly, lowering manufacturing costs. You can get a long way on lower manufacturing costs, because then you can start simply adding more GPUs and working on rendering that splits the load between many GPUs a lot better than, say, SLI does.

It's almost always paradigm shifts that allow tech to continue advancing. People pick the easy route, which is doing more of the same but better and faster, until that gravy train runs out; then they invest in R&D to do something fundamentally new.

RT is tough on cards, but something like a third of the GPU is dedicated to it right now. Future chips will have more transistors, and I'd be willing to bet those end up dedicated to something like RT rather than rasterization. If you double the transistor count going down to 7nm, you're not necessarily going to keep the ratio of RT cores to FP32 the same; you'll slowly shift the ratio so more and more is dedicated to RT. The 2080 can already slay raster games at 4K 60fps, no problem, even the AAA titles. When Nvidia makes a 3080, or whatever next gen is called, I doubt many more of those transistors are going on FP32.
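A toy model of that ratio-shifting idea. The starting point is roughly TU102's 18.6 billion transistors, but the one-third RT share, the doubling per generation, and the shift rate are illustrative guesses, not Nvidia's actual die breakdown:

```python
# Toy model: double the transistor budget each generation and shift a
# growing share of it to RT. The 1/3 starting share and the +10% shift
# per generation are illustrative guesses, not Nvidia's die layout.

total = 18.6e9    # ~TU102 transistor count
rt_share = 1 / 3  # assumed fraction of the die spent on RT today

for gen in range(3):
    rt = total * rt_share
    print(f"gen {gen}: {total / 1e9:5.1f}B transistors, "
          f"{rt / 1e9:5.1f}B on RT ({rt_share:.0%})")
    total *= 2                              # assume a node shrink doubles the budget
    rt_share = min(2 / 3, rt_share + 0.10)  # shift the ratio toward RT
```

Under those assumptions, the RT portion grows from ~6.2B to ~39.7B transistors in two generations while the raster portion still grows, just more slowly.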

Part of this argument, I'd say, is about flipping the question upside down: what would you spend all that power on if you dedicated a 2080 or 2080 Ti chip to just rasterization? You're talking probably 2x the speed they are now in total. Running 4K games at 120Hz instead of 60? There's a niche market if there ever was one. 8K won't be even remotely mainstream for another 5+ years. Multi-monitor is also really niche.

You could argue for spending it on better effects and lighting and whatnot, but we've really reached the point where faking it further gives diminishing returns; if we want better quality, we need a better rendering method. So RT was inevitable, and it's just a case of when it makes the most sense to make the leap. I think now makes a lot of sense, despite it being a painful one. If Nvidia had dedicated an entire 2080 to just rasterization, do you know what all the reviews would read right now? They'd read: there's literally no point in anyone with a GPU from the last 3-4 years getting one of these cards, unless you're one of the exceptionally few people who run at 4K+ (multi-monitor setups with a combined resolution above 4K) or someone who wants 4K at 100+ fps, both infinitesimally small markets.
 

jpiniero

Lifer
Oct 1, 2010
14,584
5,206
136
I do think 4K144 is what people want, or at least those who buy $1200 video cards do. The real problem, of course, is that since we don't know whether the consoles will have RT hardware, it's tough to say if developers will really bother with much more than superficial usage.
 
  • Like
Reactions: ozzy702

Elfear

Diamond Member
May 30, 2004
7,097
644
126
Tech will start to focus on things like research into new materials and, more importantly, lowering manufacturing costs... [PrincessFrosty's full post, quoted above]

At this point, I'd rather have 4K/144Hz than RTX. Maybe you should start a poll to see how many people on the forum feel the same way.
 
  • Like
Reactions: maddogmcgee

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
At least for me, I'm largely satisfied at 1440p. Really love 144Hz. IMO, I'd rather see the performance hit of going to 4K spent within the engine on non-resolution image-quality features such as real-time ray tracing. I'd rather have 1440p/144Hz and better-looking "Ultra" settings than go to 4K most of the time, as long as the IQ improvement is actually noticeable and not one of those things you can only tell apart in a screenshot.

I actually enjoyed 5760x1080 a lot more than 4k having tried both. The ultra-ultra-wide gave a great sense of immersion, whereas 4k was just crisper.
 

ZGR

Platinum Member
Oct 26, 2012
2,052
656
136
That's why I am hoping wider availability of 2080 Tis helps drive 4K144Hz adoption. I probably play much older games than you, @Headfoot, so the performance hit will not be very noticeable. And I guess 1080p @ 480Hz and 3440x1440 @ 200Hz are on the horizon? Jeez...

At this point, waiting for HDMI 2.1 and/or DP 1.5 may be more important though...
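For context on the link-bandwidth point, here's a rough uncompressed-bandwidth check. Blanking overhead is ignored, so real requirements run somewhat higher; the comparison figures in the comment are the standards' effective data rates:

```python
# Rough uncompressed video bandwidth, ignoring blanking overhead.
# Effective link rates for comparison: HDMI 2.0 ~14.4 Gbit/s,
# DP 1.4 ~25.9 Gbit/s, HDMI 2.1 ~42.7 Gbit/s.

def gbits_per_s(w, h, hz, bpp=24):  # bpp=24 -> 8-bit RGB
    return w * h * hz * bpp / 1e9

for name, (w, h, hz) in {
    "4K @ 60Hz": (3840, 2160, 60),
    "4K @ 144Hz": (3840, 2160, 144),
    "3440x1440 @ 200Hz": (3440, 1440, 200),
}.items():
    print(f"{name}: ~{gbits_per_s(w, h, hz):.1f} Gbit/s")
```

Even before blanking, 4K144 at 8-bit RGB (~28.7 Gbit/s) already exceeds DP 1.4's effective rate, which is why HDMI 2.1, DSC, or chroma subsampling keeps coming up.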
 

CP5670

Diamond Member
Jun 24, 2004
5,510
588
126
I was thinking about a 2080 but decided to pass and got a used 1080 Ti instead from the FS/T forum. I keep my cards for a while and only upgrade every 4-5 years (I don't game as much as I used to, but still play some things), but the 2080 is way overpriced for how it performs in current games, and it doesn't look like the line will actually do RT well even when more games support it. I got a 4K OLED TV, and my old 980 struggles in enough games at this point to warrant an upgrade, although it's still good for 1080p. The TV is worth it over a regular monitor just for the big size and perfect black levels. I would definitely have liked 4K at 120Hz or 144Hz, but I get a choice between 1080p at 120Hz and 4K at 60Hz, depending on the game.
 

Sonikku

Lifer
Jun 23, 2005
15,749
4,558
136
Yeah the 2080 isn't worth it. I am shocked Nvidia won't budge on the price given the cryptocurrency bust and the gargantuan overstock of hardware.
 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
Yeah the 2080 isn't worth it. I am shocked Nvidia won't budge on the price given the cryptocurrency bust and the gargantuan overstock of hardware.
No reason to drop the price. Customers buy either a 1080 Ti or a 2080; either way, Nvidia gets the money.
 

amenx

Diamond Member
Dec 17, 2004
3,892
2,103
136
I think Nvidia not reducing the price is due to them having low stocks of RTX to begin with. It was too risky a product to introduce into the market at its high price, so they did not make too many; that's why the 2080 Ti is/was hard to find at or near MSRP. With the overstock of Pascal, the last thing they need is large, unsold stocks of expensive RTX cards sitting around.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Right now, NV probably wants to get rid of the glut of 1000 series chips.

As soon as those get low enough, NV will let loose with the 2000 cards.

Right now, NV is sandbagging, I think. AMD is not much of a threat at the moment, so NV can sandbag for a while.
 

Jaskalas

Lifer
Jun 23, 2004
33,425
7,485
136
At this point, I'd rather have 4K/144Hz than RTX. Maybe you should start a poll to see how many people on the forum feel the same way.

I have not personally seen a game running beyond 1080p, so... not sure how I'd weigh RTX over screen resolution.
 

Sonikku

Lifer
Jun 23, 2005
15,749
4,558
136
No reason to drop the price. Customers buy either a 1080 Ti or a 2080; either way, Nvidia gets the money.
But the 1080 Ti is no longer in production and, because the 2080 is such a poor value in comparison, it is largely sold out. If you want a 1080 Ti now, you're looking at buying used, and Nvidia doesn't make any money there.
 

CP5670

Diamond Member
Jun 24, 2004
5,510
588
126
Even the used ones are getting expensive at this point and cost much more than they did a few months ago due to low supply. I think in another month or two, it may no longer make sense to get a 1080Ti.

4K looks great when the screen size is large (32"+) and in games that have high quality textures and/or long outdoor viewing distances. It's less useful for games that are more fast paced or online/competitive, where I would rather have a lower resolution with a higher refresh rate and fps.
 
  • Like
Reactions: happy medium

Elfear

Diamond Member
May 30, 2004
7,097
644
126
I have not personally seen a game running beyond 1080p, so... not sure how I'd weigh RTX over screen resolution.

I've used a monitor at 4K/60Hz and another at 1440p/144Hz. A high PPI makes for some gorgeous graphics if the game is built with high-res textures in mind (e.g. Skyrim with mods looks amazing, while Team Fortress/Fortnite don't look much better at 4K). High-refresh gaming is also amazing and feels much more fluid than gaming at 60Hz. Being able to combine the two technologies would be awesome.

I'm looking forward to a good ray-tracing implementation; I just don't think the current hardware is capable enough right now, and I wish the Turing transistor budget had been used for more traditional rasterization improvements.
 
  • Like
Reactions: ZGR and ozzy702

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
At this point, I'd rather have 4K/144Hz than RTX. Maybe you should start a poll to see how many people on the forum feel the same way.

Absolutely. I'm on 1440p 144Hz now and the difference is huge. I went from 1080p 60Hz to 1440p 60Hz, and that was a big jump, but wow, the high refresh rate and adaptive sync make such a difference.

Looking forward to 4K 144Hz HDR with RTX. That should be fantastic, but we're a long way out from that being reality.
 

Sonikku

Lifer
Jun 23, 2005
15,749
4,558
136
How GPU-bound is 144fps? I play WoW at 4K @ 60Hz because it shifts the burden from the CPU to the GPU. If I were to try the inverse and chase 144fps at 1440p, I don't think I could do it, as it would be more CPU-limited. I can barely keep a constant 60fps in this game as it is.
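One way to reason about this is in per-frame time budgets; a minimal sketch, with made-up millisecond figures rather than WoW measurements:

```python
# Frame-time view of CPU vs GPU limits. The millisecond figures are
# hypothetical, not WoW measurements. Whichever side needs more time
# per frame sets the frame rate; lowering resolution only shrinks the
# GPU side, so a CPU-bound game won't get faster.

cpu_ms = 16.0  # assumed CPU time per frame (mostly resolution-independent)
for label, gpu_ms in [("4K", 15.0), ("1440p", 7.0)]:
    frame_ms = max(cpu_ms, gpu_ms)
    side = "CPU" if cpu_ms >= gpu_ms else "GPU"
    print(f"{label}: ~{1000 / frame_ms:.0f} fps ({side}-bound)")
# Hitting 144 fps needs BOTH sides under ~6.9 ms (1000/144) per frame.
```

With these numbers, both resolutions land at ~62 fps because the CPU sets the pace, which matches the intuition in the post: dropping to 1440p wouldn't get a CPU-limited game anywhere near 144fps.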
 

lupi

Lifer
Apr 8, 2001
32,539
260
126
The thing I truly hate about this farce of a release is that 1080 Tis were getting "affordable" until the reviews started leaking, and then they went out of stock everywhere. What happened to the chip glut we were told was about to happen with the 1080/1080 Ti?

Sigh, I should have bought that second one when I had the chance.
 

dlerious

Golden Member
Mar 4, 2004
1,784
723
136
The thing I truly hate about this farce of a release is that 1080 Tis were getting "affordable" until the reviews started leaking, and then they went out of stock everywhere. What happened to the chip glut we were told was about to happen with the 1080/1080 Ti?

Sigh, I should have bought that second one when I had the chance.
I saw them for a week or two at close to MSRP. Maybe the glut was the lower-end chips; I see there's a 1060 6GB with GDDR5X coming out. The Founders Edition is supposed to cost $300. Looks like they want to start gouging the lower end now as well.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
The thing I truly hate about this farce of a release is that 1080 Tis were getting "affordable" until the reviews started leaking, and then they went out of stock everywhere. What happened to the chip glut we were told was about to happen with the 1080/1080 Ti?

Sigh, I should have bought that second one when I had the chance.
Well, they are making 1060s with 1080 GPUs...
 

Pandamonia

Senior member
Jun 13, 2013
433
49
91
Titan RTX, hahahaha.

Look at the laughable spec difference.

What a shit show this is.

I'd be dumping this stock.
 

sze5003

Lifer
Aug 18, 2012
14,182
625
126
I'm happy I stuck with my 1080 Ti. Now, knock on wood, it doesn't go kaput anytime soon; I wouldn't know what to do if it did, other than search for a used one.
 

Ottonomous

Senior member
May 15, 2014
559
292
136
Honestly, I don't understand the disdain and mocking here. Just look at what a massive bargain the slightly cut TU102 on the Ti is next to the Titan RTX: literally half the price, and you're all complaining.