Nvidia RTX 2080 Ti, 2080 (2070 review is now live!) information thread. Reviews and prices


Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
That's why I was referring to x86/AMD as the conservative approach.
On the other hand, there are a few things to consider:
1) Backwards compatibility is a non-issue, since with emulation technology you easily achieve current-gen CPU performance. In fact I consider the lack of performance of the current-gen CPUs a good opportunity for a move to ARM, with emulation for backwards compatibility.
2) Games are sold side-by-side, but they are still completely different builds. A different CPU architecture would not get much in the way.
3) Not sure what R&D costs you are referring to.

Emulating x86 code on ARM is very expensive. Any time you go from CISC to RISC (or vice versa) there is a big performance hit. Emulating x86 on x86 works fine, and emulating a slow ARM chip on x86 is doable.
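To make the overhead concrete, here's a toy sketch: a made-up three-instruction guest ISA interpreted in Python, nothing like a real emulator's dynamic translation, just an illustration of why every emulated guest instruction costs many host operations (fetch, decode, dispatch, and flag bookkeeping all happen in software).

```python
# Toy illustration only: a made-up "guest" ISA interpreted in software.
# Real emulators JIT-translate code and are far smarter, but the
# per-instruction bookkeeping below is the kind of overhead that makes
# cross-ISA emulation expensive compared to running code natively.

def run(program, regs):
    flags = {"zero": False}          # x86-style flags modelled explicitly
    pc = 0
    while pc < len(program):
        op, *args = program[pc]      # software fetch + decode every iteration
        if op == "mov":              # dispatch is a chain of compares, not silicon
            regs[args[0]] = args[1]
        elif op == "add":
            regs[args[0]] += regs[args[1]]
            flags["zero"] = regs[args[0]] == 0
        elif op == "jnz":
            if not flags["zero"]:
                pc = args[0]
                continue
        pc += 1
    return regs

# Tight countdown loop: native code would retire a couple of instructions per
# iteration, while the interpreter burns dozens of host operations on each one.
prog = [("mov", "r0", -1), ("add", "r1", "r0"), ("jnz", 1)]
print(run(prog, {"r0": 0, "r1": 100_000}))   # -> {'r0': -1, 'r1': 0}
```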

With Ryzen being as awesome as it is, I would be surprised if either console maker goes with something other than that. BUT, anything is possible!
 

zinfamous

No Lifer
Jul 12, 2006
111,947
31,484
146
I'm contemplating the Alienware 34 ultrawide, but the truth is any of these monitors will take up 11-13 inches of my 24-inch desk. Since I also use it for flying in VR, I keep the keyboard pushed up so I can use the mouse and have room for the rest of my gear.

I looked at monitor arms, but this desk is like 6 inches thick in the back with metal. What a PITA; too bad there are no ultrawides with low-profile stands like my Dell's. Maybe this is a sign I should just forget it and stick with what I've got.

[attached image: photo of the desk in question]

Dude, what you've got there is a writing desk from World Market or Ikea. The two drawers at that depth, and how shallow the surface is: it's a writing desk... and that ain't meant for what you want to do with it.

Just go into the dining section of Ikea or wherever and pick up a cheap breakfast table for a hundred bucks or less. All you need is a deep and long flat surface, with a shallow underside. If you need to raise the monitor off of that desk (if you go ultrawide), then cobble together a simple 6"-12" stand from 3 spare wood parts.
 

zinfamous

No Lifer
Jul 12, 2006
111,947
31,484
146
I upgraded my desk for my ultrawide also. Way worth it.

Here's what I got: https://www.amazon.com/Bush-Business-Furniture-Slate-Spectrum/dp/B000W8JO0W

And here's what it looks like compared to my screen. The desk was about $250 and it's pretty solid. I would have a better pic, but this is an old one I took to show off that wall sign; you can still see the desk with the monitor on it. That's a 34" ultrawide. Crap, sorry for the huge pic.

[attached image: the desk with the 34" ultrawide on it]

Dude, your vaporizer is intimidating. I don't think mine can burn the weed at whatever temps yours can.

I bet I have the same Costco chair, though! :D
 
  • Like
Reactions: moonbogg

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
The AnandTech real-world power use is interesting to me, and not in a good way:

https://www.anandtech.com/show/1334...x-2080-ti-and-2080-founders-edition-review/16

The "215 watt TDP" 2080 actually draws 10 watts more than the 1080 ti FE in Battlefield 1, despite the claimed 35 watt lower TDP. The 2080 FE draws +20 watts, the 2080 ti is +61 watts(!) and the FE is +66(!)

So like intel's 7700K* they're running outside their sweet spot for power efficiency and like it they're "factory overclocked" even for the non-FE models.

ArsTechnica also had power draw issues using their cards in VR. If you only have a 650 watt PSU you should probably upgrade to a 750 watt.

( * or was it the 6700K? intel was boring for so many years before Ryzen made them release Coffee lake the generations blend together in my mind.... )
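Rough arithmetic on those deltas (a quick sketch; the 250 W figure is the 1080 Ti FE's official board power, used here only as the reference point):

```python
# Back-of-the-envelope check on the Battlefield 1 numbers quoted above.
# Reference: 1080 Ti FE at its official 250 W board power; the 2080's
# "215 W TDP" is therefore 35 W lower on paper.
TDP_1080TI_FE = 250
TDP_2080 = 215

measured_delta_w = {          # measured draw relative to the 1080 Ti FE (W)
    "RTX 2080": +10,
    "RTX 2080 FE": +20,
    "RTX 2080 Ti": +61,
    "RTX 2080 Ti FE": +66,
}

expected = TDP_2080 - TDP_1080TI_FE             # -35 W, what the spec sheets imply
gap = measured_delta_w["RTX 2080"] - expected   # how far reality misses the spec
print(f"2080: spec implies {expected:+} W vs the 1080 Ti FE, "
      f"measured {measured_delta_w['RTX 2080']:+} W -> {gap} W worse than expected")
```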
 
  • Like
Reactions: prtskg

sze5003

Lifer
Aug 18, 2012
14,320
683
126
I upgraded my desk for my ultrawide also. Way worth it.

Here's what I got: https://www.amazon.com/Bush-Business-Furniture-Slate-Spectrum/dp/B000W8JO0W

And here's what it looks like compared to my screen. The desk was about $250 and it's pretty solid. I would have a better pic, but this is an old one I took to show off that wall sign; you can still see the desk with the monitor on it. That's a 34" ultrawide. Crap, sorry for the huge pic.

[attached image: the desk with the 34" ultrawide on it]
Very nice! You guys have convinced me that ultrawide looks very nice. I took a look at that one on Amazon. 60 inches refers to the length, right?

The one I have is 52 inches long, which leaves room in the window area for my case and other stuff.

I could even make do with the 48-inch version of your model, as long as the width is big enough for such a monitor.

I wanted the drawers to put crap in and a color that matches my PC case. Guess I'll be looking for a new desk sometime soon. Good thing Ikea is a few minutes away.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
The AnandTech real-world power use is interesting to me, and not in a good way:

https://www.anandtech.com/show/1334...x-2080-ti-and-2080-founders-edition-review/16

The "215 watt TDP" 2080 actually draws 10 watts more than the 1080 ti FE in Battlefield 1, despite the claimed 35 watt lower TDP. The 2080 FE draws +20 watts, the 2080 ti is +61 watts(!) and the FE is +66(!)

So like intel's 7700K* they're running outside their sweet spot for power efficiency and like it they're "factory overclocked" even for the non-FE models.

ArsTechnica also had power draw issues using their cards in VR. If you only have a 650 watt PSU you should probably upgrade to a 750 watt.

( * or was it the 6700K? intel was boring for so many years before Ryzen made them release Coffee lake the generations blend together in my mind.... )
Tom's found only about 30 watts more for the 2080 Ti over the 1080 Ti under gaming and torture tests, or about 20 W over the claimed TDP.

The GeForce RTX 2080 Ti registers ~277W through our stress test and almost 279W in our gaming loop (nearly 20W higher than Nvidia's official TDP rating).
 

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,730
136
That's why I was referring to x86/AMD as the conservative approach.
On the other hand, there are a few things to consider:
1) Backwards compatibility is a non-issue, since with emulation technology you easily achieve current-gen CPU performance. In fact I consider the lack of performance of the current-gen CPUs a good opportunity for a move to ARM, with emulation for backwards compatibility.
2) Games are sold side-by-side, but they are still completely different builds. A different CPU architecture would not get much in the way.
3) Not sure what R&D costs you are referring to.
You're grossly overestimating what standard ARM cores can do. Until the time comes when custom ARM designs achieve parity with x86 CPUs (like Apple's cores do in the low-power space), it would be foolish to create a home console based on them.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
Emulating x86 code on ARM is very expensive. Any time you go from CISC to RISC (or vice versa) there is a big performance hit. Emulating x86 on x86 works fine, and emulating a slow ARM chip on x86 is doable.

For backwards compatibility you do not have to be faster than the original console. The question is whether emulation can fulfill this requirement, and the answer is: yes, it can.

With Ryzen being as awesome as it is, I would be surprised if either console maker goes with something other than that. BUT, anything is possible!

The thing is, a Cortex-A76 or its successor is even more awesome. In its Cortex-A76 review, AnandTech expects a Geekbench score of 3900 at 3 GHz while consuming 750 mW per core.

Regarding predictions, I am with you in expecting a Zen/Vega (Navi) architecture, but I would like to see an ARM/Nvidia one; I am just arguing that the latter is feasible. With the area/power savings gained from both ARM and Nvidia, it should be possible to add RT cores without sacrificing traditional render performance.
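To put those A76 figures in console terms, here's a quick sketch; the 8-core count and the 180 W board budget are my assumptions (chosen to mirror current console-class designs), not anything announced:

```python
# CPU power budget using the Cortex-A76 figures quoted above
# (~750 mW per core at 3 GHz, per the AnandTech numbers cited).
CORES = 8                    # assumption: mirrors the 8-core Jaguar setups in current consoles
POWER_PER_CORE_W = 0.75

cluster_power = CORES * POWER_PER_CORE_W
print(f"{CORES}x A76 @ 3 GHz ~= {cluster_power:.1f} W for the whole CPU cluster")

BOARD_BUDGET_W = 180         # assumption: roughly console-class total board power
print(f"That is about {100 * cluster_power / BOARD_BUDGET_W:.0f}% of a "
      f"{BOARD_BUDGET_W} W budget, leaving nearly everything for the GPU")
```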
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
 Power figure represents Graphics Card TDP only. The use of the VirtualLink™/USB Type-C™ connector requires up to an additional 35 W of power that is not represented in this power figure.

White paper.
 

SMOGZINN

Lifer
Jun 17, 2005
14,359
4,640
136
Any idea as to the amount of data needed for this to work? I assume it will have to be added to the drivers over time, or will it be added to game files? How many titles will have this? Not a pressing issue, just curious.

A better question is: what will Nvidia demand for doing this? We have seen them trending towards anti-competitive tactics recently, so what demands are they going to make on devs who want to use their supercomputer to do the DLSS deep-learning AI pass?

The only downside to the 34-inch is the stupid base legs that will eat up desk space. But I'll definitely go check them out this weekend.

I have not looked at that specific monitor, but almost all of these have VESA mounts, so just get yourself a desk or wall mount like this. It makes a world of difference when your monitor is not sitting on the desk, and it is nice to be able to quickly move it around to different positions when you want to do other things with the space.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
I've spent a long time mulling this over, because I have a large savings pot and could easily afford the 2080 Ti. But the fairly harsh Gamers Nexus review kinda nailed it for me. I think it's right to look at traditional performance right now: the 2080 is pointless because it's right on par with the 1080 Ti but way more expensive, and the 2080 Ti is just bonkers expensive for the extra frames.

I'm running a GTX 1080 right now, and the 2080 Ti is not all that far off being 2x faster at 4K, which is what I'm gaming at. I'd say 99% of my Steam library runs fine in 4K with a 1080, for the most part maxed out, or in some games with 1-2 settings turned down a bit. So even for 4K gamers it's a hard sell.

I think what frustrates me most here is that Nvidia has made an insanely large, expensive chip, which is fine, but then dedicated a huge chunk, something like a third of the compute transistors, to DL/RT hardware that's not really available for raster ops. What they ought to do, and probably what AMD could do in the meantime, is make a similar chip but ignore the RT stuff for now and throw all that power at rasterization. Then you'd have an utter beast of a video card, something like 2.5x faster than a 1080. Or opt for a smaller, cheaper chip that matches the 2080 Ti in speed but isn't a ludicrous price. So I'm probably going to play the waiting game and see what AMD are up to.

RTX does look nice, but it's not something that'll run at a resolution greater than 1080p, and I just can't take a step back from 4K. I had to in Kingdom Come: Deliverance; I couldn't get that game above 30 fps in 4K even with everything turned off, and even at 2560x1440 it's just not the same. It feels like Nvidia is one generation off with this: they should have conquered 4K in modern games first, brought that to the masses now that 4K panels are cheap, then moved on to ray tracing after that.
 
  • Like
Reactions: krumme

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
I think it's right to look at traditional performance right now: the 2080 is pointless because it's right on par with the 1080 Ti but way more expensive.

$800 for a 2080 vs. $700 for a 1080 Ti (checking Newegg). Is $100, about 15%, "way more expensive"?

Sure, if you are comparing a used 1080 Ti against a new 2080, then the 2080 is way more expensive, but when has used vs. new ever been a fair comparison?

If I were looking at spending $700 on a 1080 Ti, I would certainly consider spending the extra $100 for the same performance plus a bunch of forward-looking features.
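For reference, the math behind that ~15% figure (a trivial sketch using the Newegg prices above):

```python
# Price gap between the Newegg prices quoted above.
price_1080ti = 700
price_2080 = 800

premium = price_2080 - price_1080ti
print(f"${premium} premium = {100 * premium / price_1080ti:.1f}% over the 1080 Ti")
# -> $100 premium = 14.3% over the 1080 Ti, i.e. roughly the 15% quoted above
```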
 

Paratus

Lifer
Jun 4, 2004
17,728
16,030
146
I've spent a long time mulling this over, because I have a large savings pot and could easily afford the 2080 Ti. But the fairly harsh Gamers Nexus review kinda nailed it for me. I think it's right to look at traditional performance right now: the 2080 is pointless because it's right on par with the 1080 Ti but way more expensive, and the 2080 Ti is just bonkers expensive for the extra frames.

I'm running a GTX 1080 right now, and the 2080 Ti is not all that far off being 2x faster at 4K, which is what I'm gaming at. I'd say 99% of my Steam library runs fine in 4K with a 1080, for the most part maxed out, or in some games with 1-2 settings turned down a bit. So even for 4K gamers it's a hard sell.

I think what frustrates me most here is that Nvidia has made an insanely large, expensive chip, which is fine, but then dedicated a huge chunk, something like a third of the compute transistors, to DL/RT hardware that's not really available for raster ops. What they ought to do, and probably what AMD could do in the meantime, is make a similar chip but ignore the RT stuff for now and throw all that power at rasterization. Then you'd have an utter beast of a video card, something like 2.5x faster than a 1080. Or opt for a smaller, cheaper chip that matches the 2080 Ti in speed but isn't a ludicrous price. So I'm probably going to play the waiting game and see what AMD are up to.

RTX does look nice, but it's not something that'll run at a resolution greater than 1080p, and I just can't take a step back from 4K. I had to in Kingdom Come: Deliverance; I couldn't get that game above 30 fps in 4K even with everything turned off, and even at 2560x1440 it's just not the same. It feels like Nvidia is one generation off with this: they should have conquered 4K in modern games first, brought that to the masses now that 4K panels are cheap, then moved on to ray tracing after that.

You make some good points that basically explain why this launch was so different from previous launches:
  • 2080 & 2080 Ti launched together
  • Virtually no early performance data
  • FEs are now OC'd compared to the 'baseline'
  • Emphasis on new features
  • Emphasis on new NV-developed performance measurements
  • Improved FE cooler
  • Ti priced like a Titan
The reason, obviously, is that the baseline 2080 was slower than the 1080 Ti, unlike the 1080 vs. the 980 Ti or the 980 vs. the 780 Ti.

The new cooler allows the FE model to be OC'd from the factory, making it faster than the 1080 Ti FE and on par with AIB 1080 Tis.

The lack of normal performance data and the emphasis on new features and their performance hide the lack of a regular performance improvement.

Finally, without the 2080 Ti the launch would have been a complete flop. 12+ months later and the new top of the line is $100 more and no faster than the old top of the line?

So why is it like this?
  • AMD dropped the ball: there is no high-end competition except previous-gen Pascal.
  • Being stuck on the same process again means larger, more power-hungry chips, or no performance or feature improvements. (Similar issues as on 28nm.)
  • Focusing on vendor features rather than performance means locking high-end PC gaming to NV. Devs can start NOW on learning to code for RTX features, which will help NV down the road when/if AMD's performance catches back up.
So if you are waiting on an improved process, this seems like the best time to drop a feature-rich but only marginally faster, larger, hotter, more power-hungry, more expensive GPU to try and pivot developers to benefit you down the line. Because hey, what are you going to do for a new high-end GPU, get a 1440p Vega 64?
 
  • Like
Reactions: Ajay

Hitman928

Diamond Member
Apr 15, 2012
6,750
12,479
136
It ends up being more like a $140-$160 difference (after rebates) in the cost comparison between the 1080 Ti and the 2080.

Without any game support available at launch, it comes down to how much you believe RTX will be supported by devs going forward and whether those features are worth the extra cost to you.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,375
16,217
136
It ends up being more like a $140-$160 difference (after rebates) in the cost comparison between the 1080 Ti and the 2080.

Without any game support available at launch, it comes down to how much you believe RTX will be supported by devs going forward and whether those features are worth the extra cost to you.
The only thing I see (aside from benchmarks) is only 8 GB for the 2080 versus 11 GB for the 1080 Ti. Also CUDA cores: only 2944 vs. 3584. In many apps not yet benchmarked, both of those make a difference, so for $140 less I will take the 1080 Ti in a heartbeat.
 
  • Like
Reactions: kawi6rr and psolord

realibrad

Lifer
Oct 18, 2013
12,337
898
126
It really does feel like Nvidia did this as a way to deal with the 1080/Ti stock. These cards do not compete with those cards in any real way. I expect the next cards to come out very quickly once the 1080/Ti stock goes down and prices for the 2080/Ti drop too much.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
The only thing I see (aside from benchmarks) is only 8 GB for the 2080 versus 11 GB for the 1080 Ti. Also CUDA cores: only 2944 vs. 3584. In many apps not yet benchmarked, both of those make a difference, so for $140 less I will take the 1080 Ti in a heartbeat.

But the 2080 has more efficient CUDA cores, so it is really hard to tell which way untested applications will swing:
[AnandTech benchmark chart comparing the two cards]
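One way to frame that against the raw specs (a quick sketch using only the core counts quoted above): the 1080 Ti has roughly 22% more CUDA cores, so in any workload that doesn't benefit from Turing's per-core improvements or the new hardware blocks, the 2080 needs that much more per-core throughput just to break even.

```python
# Per-core efficiency the 2080 needs to offset its core-count deficit,
# using the counts quoted above (2944 vs 3584 CUDA cores).
cores_2080 = 2944
cores_1080ti = 3584

ratio = cores_1080ti / cores_2080
print(f"1080 Ti has {100 * (ratio - 1):.0f}% more CUDA cores; the 2080 must be "
      f"~{100 * (ratio - 1):.0f}% faster per core just to match it")
```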


Then there is the new frontier of Tensor AI applications that will open up...

I bet you won't make it much more than 6 months before you get an RTX card to play with.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
It really does feel like Nvidia did this as a way to deal with the 1080/Ti stock. These cards do not compete with those cards in any real way. I expect the next cards to come out very quickly once the 1080/Ti stock goes down and prices for the 2080/Ti drop too much.

What "This" are you referring to?

Pricing? Pricing is up because of the massive dies, more expensive GDDR 6 memory, that both cost more money, and surprise they pass these increases on to the consumer.

If we were living in an alternate universe with more competitive AMD, you still wouldn't be getting these cards cheaper, you would be getting different cards without RT, and smaller dies cheaper.