Nvidia RTX 2080 Ti, 2080, 2070 information thread. Reviews and prices September 14.


ub4ty

Senior member
Jun 21, 2017
https://www.anandtech.com/show/13249/nvidia-announces-geforce-rtx-20-series-rtx-2080-ti-2080-2070
It's official: NVLink is not supported on the 2070 or lower. So you'll have to pony up $800 for the privilege of NVLink support on the 2080, plus another $80 for the actual bridge. That's another big reason the $599 2070 is merely the "more affordable" card. The clear segmentation has arrived: an $800 sticker-shock price for full access to the new platform's features, with no word on what else is gimped, access-wise, compared to the eye-watering $2,300 entry-level Quadro. Catch everyone in 2019.
 

tviceman

Diamond Member
Mar 25, 2008
I'm in the same boat as everyone else. I was looking forward to an RTX 2070, but since actual performance figures weren't released... the gains for current-gen games are going to be really small. The writing is on the wall. I might just see if I can get a good used GTX 1080 and call it a day until 7nm.
 

Ken g6

Programming Moderator, Elite Member
Moderator
Dec 11, 1999
Inb4 Nvidia tries to sell ray tracing off as a speed tweak for bitcoin mining...

then we'll never see these cards drop below $1,000.
Or maybe they'll develop a new coin, "RayCoin". I didn't check whether that one exists yet.
 

sze5003

Lifer
Aug 18, 2012
I went back and watched some of the video from the reveal. He talked up shadows so much. Now maybe I don't have a lot of experience with shadows, but I didn't really notice a drastic difference when they were turning RTX on and off in the Tomb Raider party scene.

Sure, it looked nicer with RTX on, but I didn't find it horrible with it off either. He kept saying no, it's supposed to look like this, it's supposed to have RTX, that's how it should be.

OK, but I didn't see the selling point. The Battlefield trailer looked very nice though. Tomb Raider also looked good, but it will look good on a 1080 Ti as well.

Most people turn down shadows anyway to get higher FPS if you are playing a shooter or playing online. So now they are just trying to sell us better shadows; that's what I took away from it. I'm sure it will look much better later, but for now I'll wait for real performance results.

I also love how he shows the graph at the end saying this is how much more powerful the RTX family is, and on the x-axis it says RTX ops. What about regular ops, show us regular ops!

Did anyone count how many times he said RTX and shadows? It's like he was trying so hard to tell us we need this. Marketing at its finest to hype sales. People only really applauded when he said starting at $499.
 

PeterScott

Platinum Member
Jul 7, 2017
Looks too shiny to me. Like the entire world got a polish.

It might be more interesting to look at it as they switch it on and off at various points in the presentation:
https://www.youtube.com/watch?v=8xM1AcnS5Wo

It isn't that RT makes the world shiny; it's that they chose shiny stuff to demo. The section talking about the guns shows that ray tracing works for everything, not just the smooth shiny bits- you also get flame reflections on the wood.

It's damn impressive tech IMO. Sure, it's too bloody expensive, but it is a very sweet overall package and I don't see this failing like PhysX.
 

Ranulf

Platinum Member
Jul 18, 2001
Wow, no NVLink on the 2070. How much will the new "entry level" cards be now? $200 for a 2050 and $330 for a 2060?
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
His point is that everyone wrote off AMD and now they're very competitive with an arguably superior product for the price.
Sorry, that was my point; it just didn't seem like that was what he was saying.

We agree!
 

BenSkywalker

Diamond Member
Oct 9, 1999
This thread is a great example of how terrible these forums have become.

We finally get real time global illumination in gaming with the hardware to push it- something we have been begging for since the dawn of real time 3D, it's finally here- and people are dropping into their hive mind fanboy idiocy to champion/vilify their respective party. Pathetic.

First off- everyone bashing the expected performance- please make sure you go on record right now stating how much slower the 2070 is going to be compared to the Vega 56- or even the Vega 64 if we want to use the inflated FE pricing. Please make sure to state for the record, clearly, how these cards are vastly inferior in terms of price versus performance compared to the competition.

Very important- if you are bashing these parts for their cost versus performance, go on record stating how much slower it is going to be compared to the competition.

For many months this forum turned into a cesspool of gushing over async compute and how it was going to change the industry- something that offers a very small performance improvement in certain situations, with driver overhead. Months and months we saw people going off over this. Something nobody even tried to claim would give us any visual benefit whatsoever.

If we have people in this thread that believe that performance, only performance, and always performance is the only thing that matters I'm going to go ahead and save you a ton of cash. 1024x768 all settings on lowest- if you have anything over a 270 you should be good to go for almost every game for years to come. What's more- it isn't like we are seeing big improvements in gameplay lately, so just go ahead and play through all the best games of the last twenty years and stop even coming to threads discussing PC graphics hardware. You save money and people interested in technology advancing can actually have a reasonable discussion without wading through the ignorant crap being spewed by the insane Luddite mentality.

OK, so now we should have everyone complaining about the performance, which we haven't seen yet, on record with how much slower the 2070 is going to be compared to a Vega, and we have the people who hate improved IQ content, so which groups do we have left?

The team green boys- these new features are taking up huge chunks of the die. These inflated costs are entirely due to the fact that they are offering real time global illumination. The fact they are doing it in as small of a space as they are is mind blowing- but it is a *huge* chunk of space. These parts are going to offer a very small performance increase over the prior generation compared to what you are used to seeing. If you people are going to try and defend it from that angle- make your calls now on performance and be prepared to be *very* disappointed.

"PC gaming is going to die because people are getting priced out" idiots- seriously, put the crack pipe down. Pull up the Steam user charts and look at what games people are playing and on what monitors- a 1050 is going to keep the masses happy for years- the people who are even thinking about these new parts are a minuscule subset of the market and for us, is the price really going to push us out? 1998 V2 SLI was all the rage, adjusted for inflation that would ring in between $900 and $1000. Really all that different? Really?

This won't see broad scale adoption...... if anyone is actually claiming this, you don't understand what is being discussed. All of the other options people are comparing this to- GameWorks, PhysX, Tessellation- all require extra work from developers. Some of it is cut and paste from libraries, some of it is quite a bit more involved, but it is extra work. Global illumination isn't. If your engine is set up to utilize it (which all of the major engines will be) you literally just turn it on. That's it. This is *LESS* complex than supporting multiple resolutions. Seriously. If you don't have to worry about legacy parts this is *SIGNIFICANTLY* less work than *CURRENT* solutions. Looks much better- much less work. Devs won't support this..... why?

The 'this has been done before' crowd- not even remotely close to true. All of the prior attempts were for a ray-traced render engine setup- that is, handling all of your rendering through ray tracing. That has some huge drawbacks and is simply way too slow and limited to work properly given computational limits. This is adding ray-traced lighting to a rasterized rendering pipeline. It gives you the benefits of global illumination without removing the massive benefits of rasterization. No, nothing like this has ever been done in hardware before- and it isn't quite 'as done' as some of you are thinking. Yes, there have been engines that used shaders to ray trace certain elements; this is full global illumination using ray tracing, a very, very different thing.
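To spell the distinction out, here's a minimal Python sketch of the hybrid structure being described- the stage names and fake data are mine for illustration, not any real engine's API:

```python
import numpy as np

H, W = 180, 320  # toy framebuffer size

def rasterize_gbuffer(rng):
    """Raster pass, unchanged from today: per-pixel albedo/normal/depth."""
    return {"albedo": rng.random((H, W, 3)),
            "normal": rng.random((H, W, 3)),
            "depth":  rng.random((H, W))}

def trace_lighting(gbuffer, rng, rays_per_pixel=2):
    """RT pass: a few rays per pixel estimate incident light at each
    rasterized surface point (stand-in: averaged random samples)."""
    samples = rng.random((rays_per_pixel, H, W))
    return samples.mean(axis=0)

def shade(gbuffer, light):
    """Composite: rasterized surfaces lit by the ray-traced estimate."""
    return gbuffer["albedo"] * light[..., None]

rng = np.random.default_rng(0)
gbuf = rasterize_gbuffer(rng)      # rasterization still draws the scene
light = trace_lighting(gbuf, rng)  # rays only handle the lighting
frame = shade(gbuf, light)         # back into the normal raster pipeline
print(frame.shape)                 # (180, 320, 3)
```

Rays augment the raster pipeline instead of replacing it- that's the whole point.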

The AI hardware on these parts is used to reconstruct the actual lighting from the very loose approximations the ray calculations produce. The effectiveness of this method is shockingly good for real-time purposes, and it's the only way it could reasonably be done.
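As a heavily hedged toy of that reconstruction step- the real thing is a trained neural denoiser running on the tensor cores; a plain box filter here just shows the principle of recovering clean lighting from a very sparse, noisy ray estimate:

```python
import numpy as np

def noisy_ray_estimate(truth, spp, rng):
    # Few rays per pixel -> high-variance Monte Carlo estimate of the light.
    noise = rng.standard_normal(truth.shape) * 0.5 / np.sqrt(spp)
    return np.clip(truth + noise, 0.0, 1.0)

def denoise(img, k=5):
    # Box filter as a crude stand-in for the tensor-core neural denoiser.
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

rng = np.random.default_rng(42)
truth = np.linspace(0.0, 1.0, 64 * 64).reshape(64, 64)  # smooth "true" light
raw = noisy_ray_estimate(truth, spp=1, rng=rng)
clean = denoise(raw)
print(np.abs(raw - truth).mean())    # large error at 1 sample per pixel
print(np.abs(clean - truth).mean())  # much smaller after reconstruction
```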

This is a big point some people may not want to hear- if AMD doesn't follow within the next couple of years they are out of the graphics business.

In computer graphics this is the biggest game changer we have seen since the Voodoo 1.

That isn't hyperbolic. Pull yourself out of fanboy team red/green muck for an hour and go check out what people who work with visualization are saying. This is *HUGE* and AMD is going to follow suit or cease to be a factor. Does anyone really think Microsoft is going to launch their next console without this? We saw it with the last generation, Sony forcing AMD to change some hardware around that worked out very well for all involved- it will happen again with the next gen of consoles. They aren't shipping without this technology.

The question is when we are going to see AMD's response, and I think we know: probably not until 7nm. No matter which 'side' you are on, trust me when I say you want to see this technology succeed. For those truly rabid for team red- push them to pull a stunt like nVidia did with tessellation. They were late to the game and then smoked AMD; that is the best option for them going forward.

This *IS* the future so many of us have been waiting for, for decades. The idiocy involved in this thread would be hysterical to read through if it wasn't so sad.
 

tamz_msc

Diamond Member
Jan 5, 2017
We finally get real time global illumination in gaming with the hardware to push it- something we have been begging for since the dawn of real time 3D [...] This *IS* the future so many of us have been waiting for, for decades.
Nobody's going to bother with ray-traced reflections in a game like Battlefield V if turning it on means you lose half your FPS. Did you even see the video of the Battlefield V demo? They forgot to turn on shadows in certain close-ups in order to focus solely on the shiny stuff. Until this stuff can be done at a much larger scale with minimal performance loss compared to rasterization, all people are going to talk about is the ridiculous pricing and the absolute lack of hard performance numbers in existing games.
 

BenSkywalker

Diamond Member
Oct 9, 1999
Nobody's going to bother with ray-traced reflections in a game like Battlefield V, if turning it on means that you lose half the FPS.

So you don't think they should offer 4K resolution either, right?
 

Head1985

Golden Member
Jul 8, 2014
Wow, no NVLink on the 2070. How much will the new "entry level" cards be now? $200 for a 2050 and $330 for a 2060?
The 1060 cost $300, so the 2060 will probably be $400.
2080 Ti - $1,200
2080 - $800
2070 - $600
2060 - $400
2050 Ti - $250??
 

Shivansps

Diamond Member
Sep 11, 2013
The main problem with this ray tracing is that until it gets to mainstream GPUs it is not going to be widely used- maybe a few games here and there... just like when AMD paid devs to use Mantle in like 3 titles...

This is a good thing, BUT not for this generation. Once it gets to mainstream it will be a must-have- probably the generation after this one.
 
  • Like
Reactions: ozzy702

Ken g6

Programming Moderator, Elite Member
Moderator
Dec 11, 1999
Very important- if you are bashing these parts for their cost versus performance, go on record stating how much slower it is going to be compared to the competition.
Let's be clear, the competition isn't AMD right now. It's Nvidia's own 1000 series. A 2070's shader specs are worse than a 1080's! The only thing that appears to save it, based on what we can see now, is the GDDR6 bandwidth. I'd estimate a 2070 will be about the same performance as a 1080 in games, and worse in mining and distributed computing.
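The bandwidth gap is easy to put numbers on. A quick back-of-envelope in Python, assuming the announced 14 Gbps GDDR6 on a 256-bit bus for the 2070 against the 1080's 10 Gbps GDDR5X on the same bus width:

```python
def peak_bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # Peak memory bandwidth: (bus width / 8 bits per byte) * per-pin rate.
    return bus_width_bits / 8 * data_rate_gbps

gtx_1080 = peak_bandwidth_gb_s(256, 10)  # GDDR5X -> 320.0 GB/s
rtx_2070 = peak_bandwidth_gb_s(256, 14)  # GDDR6  -> 448.0 GB/s
print(gtx_1080, rtx_2070, f"{rtx_2070 / gtx_1080:.0%}")  # 320.0 448.0 140%
```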

There is one wild card, though, and that's the claim that the 2000 series will be able to do integer and floating-point math simultaneously. This sounds a lot like hyper-threading. Gains from that are hard to predict; if it works like they claim, I'd roughly guess a 10-50% improvement in games, but I don't know how games use the shader hardware. On mining and DC, gains will be either 0% or anything up to ~70%; probably somewhere in between, and very dependent on the application. And if it doesn't work like they claim, gains will be just 0%.
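To put rough bounds on that guess, here's a back-of-envelope model; the instruction mixes below are hypothetical, and it assumes the dual-issue claim works exactly as advertised:

```python
def concurrent_issue_speedup(fp_share, int_share):
    # Before: INT and FP contend for one pipe -> time ~ fp + int.
    # After:  separate pipes run them side by side -> time ~ max(fp, int).
    return (fp_share + int_share) / max(fp_share, int_share)

# Hypothetical instruction mixes -- nobody outside Nvidia knows the real ones.
for int_share in (0.10, 0.25, 0.35, 0.50):
    s = concurrent_issue_speedup(1.0 - int_share, int_share)
    print(f"{int_share:.0%} integer work -> {s:.2f}x best case")
# 10% -> 1.11x, 25% -> 1.33x, 35% -> 1.54x, 50% -> 2.00x
```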
 

tamz_msc

Diamond Member
Jan 5, 2017
So you don't think they should offer 4K resolution either, right?
If the 2080 Ti can run Battlefield V at 100 FPS at 4K without RTX, and turning shine on drops it to 60 FPS, then nobody is going to bother turning it on in the first place. In fact, a fast-paced shooter like Battlefield V is the worst place to include these effects; they make more sense in single-player games.
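Put in frame-time terms the tradeoff looks even worse; a quick check (the 100/60 FPS figures are my hypothetical above, not benchmarks):

```python
def frame_time_ms(fps):
    return 1000.0 / fps

base_fps, rt_fps = 100, 60  # hypothetical FPS with RTX off/on
rt_cost = frame_time_ms(rt_fps) - frame_time_ms(base_fps)
print(f"RT costs {rt_cost:.1f} ms on a {frame_time_ms(base_fps):.1f} ms frame")
# RT costs 6.7 ms on a 10.0 ms frame -- two thirds of the budget again
```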
 

BenSkywalker

Diamond Member
Oct 9, 1999
If the 2080 Ti can run Battlefield V at 100 FPS at 4K without RTX, and turning shine on drops it to 60 FPS, then nobody is going to bother turning it on in the first place.

Someone is firing around the corner from you; with RT on you can tell whether they are prone, crouched, or standing from a quick muzzle flash, without it you can't.

You'd rather have 4K with old lighting than that? Reflections are a shiny-looking byproduct they pull out to show off; believe me when I say it isn't the coolest part of the technology.

For the record- HDR or bust with this tech, seriously. The dynamic range of SDR is going to be a *HUGE* drawback. BTW- your concerns I find valid and absolutely worth discussing, decidedly different than a lot of the others :)

Let's be clear, the competition isn't AMD right now. It's Nvidia's own 1000 series. A 2070's shader specs are worse than a 1080's!

Do we not want higher performance to enable better visuals? Is that not the reason for it? I'm just trying to establish baselines of what discussion I'm having :)
 

tamz_msc

Diamond Member
Jan 5, 2017
Someone is firing around the corner from you; with RT on you can tell whether they are prone, crouched, or standing from a quick muzzle flash, without it you can't.
I get that you neither know what the game looks like while playing, nor how people play it at a high level. Have a look at this and tell me just ONE instance where RTX would help.
Also, that statement is ridiculous - why would a multiplayer game have a graphical feature whose side effect is giving the players who turn it on an advantage over those who don't?
 

maddie

Diamond Member
Jul 18, 2010
Someone is firing around the corner from you; with RT on you can tell whether they are prone, crouched, or standing from a quick muzzle flash, without it you can't.

You'd rather have 4K with old lighting than that? Reflections are a shiny-looking byproduct they pull out to show off; believe me when I say it isn't the coolest part of the technology.

For the record- HDR or bust with this tech, seriously. The dynamic range of SDR is going to be a *HUGE* drawback. BTW- your concerns I find valid and absolutely worth discussing, decidedly different than a lot of the others :)



Do we not want higher performance to enable better visuals? Is that not the reason for it? I'm just trying to establish baselines of what discussion I'm having :)
As far as I've seen, nobody has criticized the technology. This is an issue only in your imagination. I keep thinking of Don Quixote here.

The main discontent is about the pricing, and I don't see you even mentioning that. So no, we are looking forward to RT and its uses, just not at the prices Nvidia wants.
 

beginner99

Diamond Member
Jun 2, 2009
Actually, counting the additional month until actual release, it's 28 months for a 20% increase in performance at a 50% increase in cost...

And let's consider that more than 3.5 years back you could get a 290X for $250 or lower. That's roughly half the performance of an RTX 2070 for less than half the price. I.e. even 3.5 years later we get worse performance/$. This is even worse than what happened with Intel and CPUs.
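Running those numbers through a quick perf-per-dollar check (the 0.5x figure for the 290X is my rough estimate, not a benchmark):

```python
cards = {
    # name: (performance relative to RTX 2070, price in USD) -- the rough
    # figures from above, not measured results
    "R9 290X (~2015 street price)": (0.5, 250),
    "RTX 2070 (2018 launch)":       (1.0, 599),
}
for name, (perf, usd) in cards.items():
    print(f"{name}: {perf / usd * 1000:.2f} relative perf per $1000")
# R9 290X: 2.00 per $1000 vs RTX 2070: 1.67 -- the 3.5-year-old card wins
```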

And who falls for this ray tracing crap, especially in BF5? Best case, you won't notice it in action. Worst case, it's actually distracting, or they wasted time and resources on it instead of on better gameplay, especially hit detection, netcode, and physics. It's no surprise that many still think Bad Company 2 is the best (albeit the hand-grenade physics there are very bad).
 

BenSkywalker

Diamond Member
Oct 9, 1999
0:30-0:31 - shadow improperly rendered as the guy was falling down the cliff; RT would have let the guy ahead see him coming earlier.
0:40-0:41 - cresting the hill, literally everything lighting-related rendered wrong; positioning was impossible to determine. Can't tell from the angle of light whether it would have benefited the players- maybe.
1:24-1:25 - shadow missing when he jumps down, giving the player a decided advantage.
1:34-1:35 - missing muzzle flashes hid his location from oncoming players.
I could keep going with this- is it really necessary, though? Now you can say that he wouldn't notice if you turned it on right now, because he's trained not to watch for it. Once you have it, it's really easy to miss it when it's gone.

Also, that statement is ridiculous - why would a multiplayer game have a graphical feature whose side effect is giving the players who turn it on an advantage over those who don't?

3D positional audio, anyone? I could list a ton; hell, even in games like Overwatch there are absolutely graphics settings that give people an advantage. Nothing new (higher resolution was big back in the day; AA; shadows, period; etc.).

As far as I've seen, nobody has criticized the technology.

Uhhh

who falls for this ray tracing crap

I could pull up an awful lot of those from this thread.
 

Tup3x

Golden Member
Dec 31, 2016
It's official: NVLink is not supported on the 2070 or lower. So you'll have to pony up $800 for the privilege of NVLink support on the 2080, plus another $80 for the actual bridge. [...]
Why would anyone buy 2x 2070 when there's the 2080 Ti?
 

tamz_msc

Diamond Member
Jan 5, 2017
0:30-0:31 - shadow improperly rendered as the guy was falling down the cliff [...] Once you have it, it's really easy to miss it when it's gone.
Yup, I'm right in assuming that you've never played a competitive FPS. Things like long shadow maps, AO, extra particles, reflections, temporal AA instead of MSAA, etc. are all distractions in a multiplayer FPS. There are genuine reasons why people turn down settings: they impact performance, and running at settings that balance high FPS against visuals is the compromise people make when playing these kinds of games.
3D positional audio, anyone? I could list a ton; hell, even in games like Overwatch there are absolutely graphics settings that give people an advantage. Nothing new (higher resolution was big back in the day; AA; shadows, period; etc.).
The only graphics settings which give people an advantage are settings that LOWER visual quality, not INCREASE it. That's why people turn off AA in PUBG: temporal AA in UE4 makes everything look blurry. A game like Battlefield, whose sound design has arguably gotten worse as they've moved toward a more casual audience, is the last place that'll take advantage of positional audio. Besides, even without ray tracing, sound design has improved a lot in competitive FPS- the introduction of HRTF in CS:GO, for example.
 