
Nvidia RTX 2080 Ti, 2080, 2070 information thread. Reviews and prices September 14.

Status
Not open for further replies.
When it comes down to it, AMD has more interest in stopping that than Nvidia does, because NN training will create advantages on Nvidia hardware even for AMD-optimized titles.

What I should have asked also was if Nvidia allows entry per title or on the company level? Probably the latter because once you're a partner you're in regardless of who you optimize for. Regardless, you make a good point.
 
Based on Tom Peterson's podcast and the performance leak (a 2080 at 2025 MHz), I think we should assume a large part of Turing's 35-45% performance increase comes simply from benchmarking the 2080/2080 Ti at 2-2.1 GHz against thermally limited Founders Pascal cards and declaring victory.

Add the asynchronous compute improvements from Volta and mostly bench DX12 titles, and there's the majority of the improvement.

The good news for overclockers is that 2.1 GHz appears to be "average", so we can perhaps expect 2.2 GHz from golden samples. The bad news is that those holding out for 2.5 GHz clocks are going to be disappointed.
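A quick back-of-the-envelope in Python to see how far clocks alone get you. The 1750 MHz sustained clock for a thermally limited Pascal Founders card is my assumption; 2025 MHz is the leaked figure mentioned above:

```python
# How much of a 35-45% gain could come from clocks alone?
# 1750 MHz is an ASSUMED sustained clock for a thermally limited Pascal
# Founders card; 2025 MHz is the clock from the leak discussed above.
pascal_sustained_mhz = 1750
turing_bench_mhz = 2025

clock_gain = turing_bench_mhz / pascal_sustained_mhz - 1   # ~15.7%
total_gain = 0.40                                          # midpoint of the claimed 35-45%

# Assuming performance scaled linearly with core clock, the rest of the
# uplift would have to come from architecture (async compute, SM changes).
arch_gain = (1 + total_gain) / (1 + clock_gain) - 1        # ~21.0%
print(f"clocks alone: {clock_gain:.1%}, implied arch/IPC: {arch_gain:.1%}")
```

Under those assumptions, clocks account for roughly 16% and architecture for roughly 21% of a 40% uplift, so the clock bump alone wouldn't explain the whole slide.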
 
There are reviewer kits out and about right now, with the drivers included. Perhaps he got his hands on a reviewer kit? Plus almost all the benchmarks are in line with my predictions so I want it to be real. 🙂
 
Who knows could be. Only a week or so more to go. I just want to know if it's safe for me to go to 4k now finally. If so I'll probably want a gsync 4k display anyway.
 

More like two weeks :/ 14 September, mate. And my favorite sites likely won't get all the cards from Nvidia, which annoys me, much like last time: one gets a 2080, another a Ti or the 2070. There are very few sites whose reviews I trust.
 
Gamers Nexus and Hardware Unboxed come to mind for me.

Most of them end up with essentially the same results, so I don't think there are any conspiracies to lie to you, and if a couple did, they would stand out from the crowd with strange results.

That being said, Gamers Nexus is my favorite tech YouTube channel. They are extremely thorough, tear stuff apart a lot, and I haven't detected even a slight hint of bias. Plus Steve has a cool/mellow sense of humor and a good, clear presentation style.
 
HotHardware did a video conference with Nvidia's Tom Peterson and talked about a slew of topics.

Most interesting to me were the comments about performance. At around the 25:30 mark, Tom is asked directly about the performance uplift in traditional games with Turing. At first he responds that 35-45% faster by tier (i.e. 2080 over 1080 and 2080 Ti over 1080 Ti) is what we can expect across a variety of games without DLSS. Later, one of the interviewers asks about the charts Nvidia produced that show a 50% uptick in performance, and Tom responds that that is probably correct and that there will be cases where the 2080 beats the 1080 Ti.

I'd lean towards 35-45% faster on average as that would align with the 2080 beating the 1080 Ti on occasion and Tom's initial comments.
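A quick sanity check of that reading in Python. The ~30% average lead of the 1080 Ti over the 1080 is a rough, commonly quoted figure I'm assuming here, not something from the interview:

```python
# Does a 35-45% tier uplift square with "the 2080 beats the 1080 Ti
# on occasion"? ASSUMES the 1080 Ti averages ~30% faster than the 1080.
ti_over_1080 = 1.30
low, high = 1.35, 1.45  # claimed 2080-over-1080 range

# Roughly +4% to +12%: ahead in some games, behind in others, which fits
# "on occasion" better than a flat 50% uplift would.
print(f"2080 vs 1080 Ti: {low / ti_over_1080 - 1:+.1%} to {high / ti_over_1080 - 1:+.1%}")
```

That narrow spread over the 1080 Ti is exactly the kind of result where the 2080 wins some benchmarks and loses others.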

Credit to nEo717 over on [H] for linking the video and pointing out some more interesting discussion points.

 


Posted a couple of pages (and days) back by sze5003. Credit to him since he found it days ago, right after it went live.
https://forums.anandtech.com/thread...es-september-14.2552804/page-53#post-39554947

While everyone latches onto the 35-45% stuff, Tom appears to me to just be going along with the released performance slide, while adding that it will depend on the specific game.

The more interesting stuff for me was:

DLSS: Mainly that it will be trained for each game, and maybe for each setting/level within a particular game, and that Nvidia will do the DLSS training for free on their network as part of developer relations.

TPU: Will be accessible through a CUDA interface; developers can use them for whatever they want in games, and he expects all kinds of new uses developed outside Nvidia.

NVLink: Nothing great; he shot down the memory-pooling idea that many had. Basically the new NVLink is just a faster SLI bridge. SLI still requires just as much developer work as it ever did, same old SLI modes, all working the same way. At least for now.
 
Apologies. Didn't realize it had been posted before.

I wasn't chastising, just pointing out that if we are doling out credit, it was discovered here much sooner, and that there is all kinds of interesting info in that video beyond a confirmation of the performance slide already shown.
 
Who do you trust?

AnandTech (obviously ha ha)
OC3D (Tom is straight up, non scripted)
Guru3D
Digital Foundry

I kinda like GN, but he was extremely biased, and wrongfully so, with the EVGA AIO. But please let's not turn this remark into a trustworthy-or-not debate.

Edit:

I'm worried that with my 1080 @ 1440p I'll have to make a lot of compromises in the following titles:

Cyberpunk 2077
Metro Exodus
Shadow of the Tomb Raider
 
Guru3D is good and I also like Digital Foundry. Not sure about Tomb Raider; you may be able to get away without too much AA and still play on mostly high settings.

Metro and Cyberpunk will require more horsepower, although the last Metro I played didn't seem to have issues. I can't recall if it was the first one or Last Light that I finished.

Cyberpunk will definitely make use of a 1080 Ti, and most likely that won't be enough for all maxed settings, but it should be close. If you don't end up with a 20-series card, you could always get a 1080 Ti for cheaper.

Even if the 35-45% increase over my card is true, I'll probably get a 2080 Ti at some point, but I know I'll feel kind of crappy spending $1200 for features that I won't use, except for DLSS. Being able to play at 1440p with RTX on would be nice, but that won't be the case.

But then at least I should be able to max stuff out easily if I stick to 1440p with it, so that won't be so bad, even though I'd like to do 4K.

I'm going to wait on playing Tomb Raider until I get one of the new cards. Definitely getting Cyberpunk and the Resident Evil 2 remake when that is out. So I'll hold off on playing any PC games until I have the new Ti, assuming it shows good results. But at least I'll have Red Dead Redemption 2 to occupy my time until then.

Judging by some of the prices of the speculated AIB models, they say Asus is planning to sell a 2080 Ti for $1500?!

I don't think we will see any cards lower than $1200, as that's the real MSRP thanks to the FE models. If third parties are going to charge so much, I may just end up getting a regular model, since clocks won't be much different.

https://www.pcgamesn.com/nvidia-rtx-20-series-pros-and-cons

Also, AMD is said to launch its first 7nm cards this year. I'm betting Nvidia will have something too at some point.

https://wccftech.com/amd-confirms-new-7nm-radeon-graphics-cards-launching-in-2018/
 
There's something seriously wrong with those numbers for Witcher 3.

They claim a 44 fps average at 4K max settings with a 1080 Ti -- that is barely above a 980 Ti, which averages around 37 fps at 4K max.

In fact, a 1080 Ti should be averaging around 65 fps with everything maxed out, and a Titan V should be getting around 85 fps.
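Checking the deltas those figures imply (all fps numbers are the ones quoted in-thread, not measurements of mine):

```python
# Witcher 3 @ 4K max, fps figures as cited above (quoted, not verified).
fps_980ti = 37            # typical 980 Ti average
fps_leak_1080ti = 44      # the claimed 1080 Ti result
fps_expected_1080ti = 65  # what a 1080 Ti is said to normally average

leak_gain = fps_leak_1080ti / fps_980ti - 1          # ~+19% over a 980 Ti
expected_gain = fps_expected_1080ti / fps_980ti - 1  # ~+76% over a 980 Ti
print(f"leaked: +{leak_gain:.0%}, expected: +{expected_gain:.0%}")
```

A 19% lead over a 980 Ti where ~76% is expected is the kind of gap that usually points to a broken run or early drivers rather than the card itself.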

Early driver issues. Pascal, Vega, and Polaris all had wonky benchmarks from game to game like that at release... and with Vega especially, for a very long time after, lol.

IIRC, a Vega 56 and 64 could come (or did very early on) within something like 5% of a 1080 Ti in games like Doom (not even on Vulkan), but something like 30% or more behind in GTA V. I also recall that performance at 1440p and 4K doesn't really scale from card to card across the same suite of games. But I haven't really paid attention to updated benchmarks in forever.
 

By the time Cyberpunk 2077 is released we will be on cards at least a generation beyond the 2080 Ti, so it's not even an issue to me. Can't look at games that are years out with no release date; gotta look at what you are playing now and what is coming in the next few months to a year.
 
Probably late 2019 or early 2020, as they have said in this article here. That could be when they finish it, and then they may need more time. I was in Poland last year and met some people at a friend's wedding who worked at CD Projekt Red. It's a pretty small company, and he said they rely on government grants to get their work done. The grant should end in June 2019, but that doesn't necessarily mean the game will be released right away. It's a possibility, though, and I'd be really happy if they did.

https://www.google.com/amp/s/www.in...yberpunk-2077-release-date-everything-we-know
 

For some reason I thought we had an early 2019 release date, must have mixed it up with another game. So I'm no longer bothered by that one as 7nm will be either in sight or released by then.

That leaves just metro and tomb raider as demanding games. Others on my list are most definitely fine with the 1080 (FH4, Odyssey, rage, doom eternal etc.).
 
Disappointing if the 12,800 graphics score in 3DMark is legit for the 2080 Ti, since an overclocked Titan V can hit 14,000.
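The gap those scores imply, for scale (both are leaked/claimed figures from the thread, not verified results):

```python
# 3DMark graphics scores as quoted above (leaked/claimed, not verified).
ti_score = 12800     # rumored 2080 Ti result
titan_v_oc = 14000   # overclocked Titan V

print(f"OC Titan V ahead by {titan_v_oc / ti_score - 1:.1%}")
```

So the claimed score would leave the 2080 Ti roughly 9% behind an overclocked Titan V in that one benchmark.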
 

One more thing regarding SLI: how about the possibility of the primary display card doing classic rasterization while the other computes the ray-traced parts of the scene, then sends the data over NVLink so it's all composited into the final image? Don't you think that might work? Or no way?
 
This would be very cool/interesting if NVLink worked like that. It would take the load off the primary card and thus allow RTX to run at playable fps at resolutions higher than 1080p. I wouldn't spend this much on two cards just to have that, though, even if it were possible.
 