Nvidia RTX 2080 Ti / 2080 (2070 review is now live!) information thread: reviews and prices


ub4ty

Senior member
Jun 21, 2017
749
898
96
I have a few benchmarks comparing the MSI 2080 Gaming Trio vs. the 1080 Ti Gaming X, so both AIB cards run at roughly the same clock, ~1920 MHz.
[Benchmark screenshots attached: Crysis 3, Deus Ex, The Division, Far Cry, Middle-earth, The Witcher 3, Tomb Raider]

OC results in Crysis 3:
2080: 91 fps
1080 Ti: 97 fps
2080 = 1080 Ti, for $800.
Sadly, this is even without ray tracing enabled.
Performance will tank even further on the 2080 when it is.
 

Timmah!

Golden Member
Jul 24, 2010
1,572
935
136
Slow-moving scenes... low BVH variance... huge planar surfaces... low poly count...
Yep, I'm starting to finalize the last bit of analysis on the shortcomings of these cards.

RTX will be a boon for developers and offline renderers, in terms of 2-6x speedups. However, it is still not a solid feature for real-time ray tracing. Furthermore, this feature belongs on a separate die with a much more substantial amount of resources. Trying to cram it onto a single die while holding to 1080 Ti performance standards results in capability being cut and ridiculous costs. Maybe 7nm gives them more room, but this has all the signs of a crude version 1.0 and is anything but future-proofed. Developers are OK with this because the bleeding edge has a business case and profits associated with it. Consumers are getting rolled, which some are fine with if it allows them to say they have the latest and greatest.

Do I assume correctly that the BVH changes with what you see on the screen, so for rendering static images with offline renderers it's not going to cause a bottleneck? I want to buy these cards for archviz Octane rendering.
Other question @ everyone: do you think a 750W PSU (Seasonic X-series Gold) is going to be enough for 2x 2080 Ti used strictly for Octane (which is otherwise light on the CPU)? Made a topic about this in the PSU section: https://forums.anandtech.com/threads/seasonic-x-series-750w-gold-and-dual-2080ti.2554339/
 

maddie

Diamond Member
Jul 18, 2010
5,191
5,589
136
Do I assume correctly that the BVH changes with what you see on the screen, so for rendering static images with offline renderers it's not going to cause a bottleneck? I want to buy these cards for archviz Octane rendering.
Other question @ everyone: do you think a 750W PSU (Seasonic X-series Gold) is going to be enough for 2x 2080 Ti used strictly for Octane (which is otherwise light on the CPU)? Made a topic about this in the PSU section: https://forums.anandtech.com/threads/seasonic-x-series-750w-gold-and-dual-2080ti.2554339/
Are you only ever going to use this for CPU-light software? In other words, when spending this sort of money, why run the power supply so close to its limit, in case you later decide to use it in a combined high-CPU & GPU scenario?
 

maddie

Diamond Member
Jul 18, 2010
5,191
5,589
136
So the 2080 and 1080 Ti each win two, and the rest are the same within margin of error, but the 2080 costs $150 more (Newegg: approx. $650 vs. $800), has 3 GB less memory, and a lot fewer CUDA cores.

2080 = 1080 Ti, for $800.
Sadly, this is even without ray tracing enabled.
Performance will tank even further on the 2080 when it is.

Don't worry, I'm sure a few promoters will soon tell us how great this situation is, and how lucky we are to be alive now.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
So the 2080 and 1080 Ti each win two, and the rest are the same within margin of error, but the 2080 costs $150 more (Newegg: approx. $650 vs. $800), has 3 GB less memory, and a lot fewer CUDA cores.
And you haven't even seen what happens to FPS when you enable ray tracing, making the supposed 'new feature' even less valuable when your FPS gets cut by a further 20-30%.
Failure @ launch.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
Are you only ever going to use this for CPU-light software? In other words, when spending this sort of money, why run the power supply so close to its limit, in case you later decide to use it in a combined high-CPU & GPU scenario?
Agreed. If he has enough money for two 2080 Tis, the least he can do is bump that PSU to 1000 watts. $2,400 on GPUs and a $60 PSU o_O
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
And you haven't even seen what happens to FPS when you enable ray tracing, making the supposed 'new feature' even less valuable when your FPS gets cut by a further 20-30%.
Failure @ launch.
I guess the idea was to implement DLSS alongside ray tracing and have no performance loss and no loss of graphics quality.
If true, that sounds OK to me.
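The idea being described (shade fewer pixels, let DLSS reconstruct the rest) can be put in rough numbers. A back-of-envelope sketch; the 1440p internal resolution is an assumption for illustration, not a confirmed figure for any particular DLSS mode:

```python
# Rough shading-cost arithmetic for DLSS-style upscaling.
# Assumption: the GPU shades at 2560x1440 internally and the network
# reconstructs a 3840x2160 output (illustrative numbers only).
def pixels(width, height):
    return width * height

native = pixels(3840, 2160)    # 4K output: 8,294,400 pixels
internal = pixels(2560, 1440)  # assumed internal render: 3,686,400 pixels

saving = 1 - internal / native
print(f"Shading work saved: {saving:.0%}")  # -> Shading work saved: 56%
```

If the reconstruction quality holds up, that saved shading budget is roughly what could absorb the cost of enabling ray tracing.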
 

Timmah!

Golden Member
Jul 24, 2010
1,572
935
136
Are you only ever going to use this for CPU-light software? In other words, when spending this sort of money, why run the power supply so close to its limit, in case you later decide to use it in a combined high-CPU & GPU scenario?

Yes, both cards only for Octane, which is light on the CPU (CoreTemp reports about 80W on the CPU package during rendering; Cinebench for comparison is 250W :-D). I don't game much, and if I do I'll certainly use only a single card, since Octane doesn't need or like SLI anyway. I honestly don't see any other possible "combined" scenario.
I realize it would be the safer choice to buy a bigger PSU, but for something quality like a Seasonic rated at more than 750W I'd be looking at possibly 300 or 400 euros, and I'm really not ready to spend that much right now. The cards are already uber-expensive themselves.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126

GeForce RTX 2080 DLSS Performance Analysis.

Good info here.

Power usage, minimum fps, easy implementation, and Nvidia does the work for free for game developers.
 
Last edited:

ub4ty

Senior member
Jun 21, 2017
749
898
96
Do I assume correctly that the BVH changes with what you see on the screen, so for rendering static images with offline renderers it's not going to cause a bottleneck? I want to buy these cards for archviz Octane rendering.
Other question @ everyone: do you think a 750W PSU (Seasonic X-series Gold) is going to be enough for 2x 2080 Ti used strictly for Octane (which is otherwise light on the CPU)? Made a topic about this in the PSU section: https://forums.anandtech.com/threads/seasonic-x-series-750w-gold-and-dual-2080ti.2554339/
Yes, as you shift around on the screen the BVH changes. A simple summary: there is the initial BVH build, which is costly, and then there are updates, which are less costly. An update can be done when an object moves in a simple manner across the scene... say, a car. There are tons of other ways to speed this up; nonetheless, nothing can proceed until the BVH is stable and computed. This isn't something you should worry about outside of real-time ray tracing. It's a portion of the pipeline, but nothing substantial. When you start pushing high frame rates, though, and force completion within narrow windows, you can see how it starts to occupy a bigger and bigger portion of the render time.
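The build-once/update-cheaply distinction above can be sketched with a toy example. This is a hedged illustration, not Nvidia's actual BVH implementation: when a moving object's bounding box changes, only the boxes on its path to the root need re-unioning (a "refit", O(depth)), instead of rebuilding the whole tree.

```python
# Toy 1-D BVH: each node stores an interval [lo, hi] covering its children.
class Node:
    def __init__(self, lo, hi, left=None, right=None, parent=None):
        self.lo, self.hi = lo, hi
        self.left, self.right, self.parent = left, right, parent

def refit_upward(leaf):
    """After a leaf's bounds change, re-union boxes up to the root: O(depth),
    versus a full rebuild, which touches every primitive."""
    node = leaf.parent
    while node is not None:
        node.lo = min(node.left.lo, node.right.lo)
        node.hi = max(node.left.hi, node.right.hi)
        node = node.parent

# Two leaves under one root.
a = Node(0.0, 1.0)            # the "car"
b = Node(2.0, 3.0)            # static geometry
root = Node(0.0, 3.0, left=a, right=b)
a.parent = b.parent = root

a.lo, a.hi = 5.0, 6.0         # the car moves
refit_upward(a)
print(root.lo, root.hi)       # -> 2.0 6.0
```

The trade-off the post hints at: repeated refits degrade tree quality as objects drift, so real renderers periodically rebuild anyway.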

For offline rendering, I would most definitely grab a 2080 Ti, and it is priced as such. The boost in ray-tracing performance vs. non-HW-accelerated cards is pretty substantial... it's just not ready, IMO, for real-time ray tracing. For offline ray tracing, this is the new standard, IMO. For the PSU, I'd get a 1000W Gold at a minimum. These consume about 280W each once you consider NVLink being driven, I'd think. You're at 560W there... add 100 watts or more for the CPU and now you're at 660...

The GeForce RTX 2080 Ti registers ~277W through our stress test and almost 279W in our gaming loop


A little too close for comfort, IMO. That being said, you can always get a watt meter and do some testing to see how much your system is drawing. However, if you're dropping $2,400 on video cards, it'd be nice to upgrade your PSU too.
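The headroom math in the posts above can be made explicit. A sketch using the posts' own rough estimates; the 50W allowance for the rest of the system is my assumption:

```python
# Power-budget arithmetic for a dual-2080 Ti Octane box.
GPU_DRAW_W = 280   # approx. per RTX 2080 Ti under load (from the post)
CPU_DRAW_W = 100   # generous allowance for a CPU that's light under Octane
OTHER_W = 50       # fans, drives, motherboard (assumed)

def system_draw(n_gpus):
    return n_gpus * GPU_DRAW_W + CPU_DRAW_W + OTHER_W

def headroom(psu_watts, n_gpus):
    return psu_watts - system_draw(n_gpus)

print(system_draw(2))      # -> 710 (estimated watts at the wall-ish)
print(headroom(750, 2))    # -> 40  (thin margin on a 750W unit)
print(headroom(1000, 2))   # -> 290 (comfortable on a 1kW unit)
```

Worth noting that PSUs are also most efficient well below full load, which is another argument for the bigger unit.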
 
Last edited:
  • Like
Reactions: Timmah!

ub4ty

Senior member
Jun 21, 2017
749
898
96
Yes, both cards only for Octane, which is light on the CPU (CoreTemp reports about 80W on the CPU package during rendering; Cinebench for comparison is 250W :-D). I don't game much, and if I do I'll certainly use only a single card, since Octane doesn't need or like SLI anyway. I honestly don't see any other possible "combined" scenario.
I realize it would be the safer choice to buy a bigger PSU, but for something quality like a Seasonic rated at more than 750W I'd be looking at possibly 300 or 400 euros, and I'm really not ready to spend that much right now. The cards are already uber-expensive themselves.
Where in the world do you live? o_O
Jesus Europe..
A 1kW Corsair Gold PSU rated 9.8/10 by JonnyGuru runs about $110 on sale in the States.
 
  • Like
Reactions: Timmah!

happy medium

Lifer
Jun 8, 2003
14,387
480
126
https://www.nvidia.com/en-us/geforce/new...mber-2018/

Newly-Announced DLSS Titles:

Darksiders III from Gunfire Games / THQ Nordic
Deliver Us The Moon: Fortuna from KeokeN Interactive
Fear The Wolves from Vostok Games / Focus Home Interactive
Hellblade: Senua's Sacrifice from Ninja Theory
KINETIK from Hero Machine Studios
Outpost Zero from Symmetric Games / tinyBuild Games
Overkill's The Walking Dead from Overkill Software / Starbreeze Studios
SCUM from Gamepires / Devolver Digital
Stormdivers from Housemarque


Other Titles Implementing DLSS:

Ark: Survival Evolved from Studio Wildcard
Atomic Heart from Mundfish
Dauntless from Phoenix Labs
Final Fantasy XV: Windows Edition from Square Enix
Fractured Lands from Unbroken Studios
Hitman 2 from IO Interactive / Warner Bros.
Islands of Nyne from Define Human Studios
Justice from NetEase
JX3 from Kingsoft
Mechwarrior 5: Mercenaries from Piranha Games
PlayerUnknown’s Battlegrounds from PUBG Corp.
Remnant: From The Ashes from Arc Games
Serious Sam 4: Planet Badass from Croteam / Devolver Digital
Shadow of the Tomb Raider from Square Enix / Eidos-Montréal / Crystal Dynamics / Nixxes
The Forge Arena from Freezing Raccoon Studios
We Happy Few from Compulsion Games / Gearbox
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
This game's release will cause me to upgrade my gaming GPU:
Not a moment sooner.

Geforce20 is great for developers and people focused on offline rendering.
It is a great first step in moving the ball forward in graphics.
For consumers, it's far too expensive.
No one has any fundamental issue with the card beyond price.
So the broader community, it seems, will revisit this feature set down the road when it's more affordable. We should have a much greater array of games by then, too... The drivers will have matured and there will be actual content to exploit the features. Hopefully these new games and content come with bold new ideas and experiences, like Cyberpunk 2077.
 
Last edited:

sze5003

Lifer
Aug 18, 2012
14,320
683
126
Very much looking forward to Cyberpunk 2077.

I'm starting to think I'll just get a new monitor sometime soon and keep my 1080ti. Looking at the Alienware ultrawide 3440x1440p along with a new desk lol.

The extra frames with the 2080 Ti are nice, but it's pretty much 20+ or a bit more in most games I play, even at ultrawide resolutions.

I have no problem keeping my 2080 Ti order; the price just rubs me the wrong way, not really making me feel like I'm getting something that much greater than what I have at the resolution I'm looking to move to.
 
  • Like
Reactions: ub4ty

Fallen Kell

Diamond Member
Oct 9, 1999
6,240
555
126
2080 = 1080 Ti, for $800.
Sadly, this is even without ray tracing enabled.
Performance will tank even further on the 2080 when it is.
The statement that performance will tank further fails to take into account that ray tracing has its own cores. They do share some work with the other execution units within the SM (I believe the SMs essentially drive the work on the RT core, sending it data, etc.). But time will tell once more sites get their reviews and data out there.
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,240
555
126
Very much looking forward to Cyberpunk 2077.

I'm starting to think I'll just get a new monitor sometime soon and keep my 1080ti. Looking at the Alienware ultrawide 3440x1440p along with a new desk lol.

The extra frames with the 2080 Ti are nice, but it's pretty much 20+ or a bit more in most games I play, even at ultrawide resolutions.

I have no problem keeping my 2080 Ti order; the price just rubs me the wrong way, not really making me feel like I'm getting something that much greater than what I have at the resolution I'm looking to move to.
And that is probably the position most gamers will find themselves in. Without a 4K monitor, a 1080 Ti or 1080 is all that's needed. Sure, if you want ray tracing you will need a 2080 or 2080 Ti, but there are really not many games out there yet that can take advantage of it (we do, however, need the hardware to exist in order for software to see a reason to implement it). Don't get me wrong, I am not knocking the RTX lineup. It can definitely show a benefit, but it is most definitely early-adopter tech, and something that probably won't really show its use for 2-4 years (by which time there will be second- or even third-generation hardware out there, assuming there is enough support on the game developers'/publishers' side).
 
Last edited:
  • Like
Reactions: ub4ty

happy medium

Lifer
Jun 8, 2003
14,387
480
126
This game's release will cause me to upgrade my gaming GPU :
Not a moment sooner.

Geforce20 is great for developers and people focused on offline rendering.
It is a great first step in moving the ball forward in graphics.
For consumers, it's far too expensive.
No one has any fundamental issue with the card beyond price.
So the broader community, it seems, will revisit this feature set down the road when it's more affordable. We should have a much greater array of games by then, too... The drivers will have matured and there will be actual content to exploit the features. Hopefully these new games and content come with bold new ideas and experiences, like Cyberpunk 2077.
Cyberpunk 2077 will have HairWorks and ray-tracing support.
I'm sure it will have DLSS too, but I can't confirm that yet. Nvidia seems heavily invested in this title.
It looks like it won't be released until some time next year at the earliest; we might have 7nm next-gen Nvidia GPUs by then. This title will have beastly system requirements, though.
 

zinfamous

No Lifer
Jul 12, 2006
111,947
31,484
146
This game's release will cause me to upgrade my gaming GPU :
Not a moment sooner.

Geforce20 is great for developers and people focused on offline rendering.
It is a great first step in moving the ball forward in graphics.
For consumers, it's far too expensive.
No one has any fundamental issue with the card beyond price.
So the broader community, it seems, will revisit this feature set down the road when it's more affordable. We should have a much greater array of games by then, too... The drivers will have matured and there will be actual content to exploit the features. Hopefully these new games and content come with bold new ideas and experiences, like Cyberpunk 2077.

Look.

It's the only game I care about, and, to be quite honest: if the player-character genitals are made better with the addition of 10x Gigarays, then I will purchase an nVidia crapbox 2xxx series or 3xxx series at the time.

It will happen. But not at these prices.
 
  • Like
Reactions: ub4ty