Nvidia RTX 2080 Ti, 2080, 2070 information thread. Reviews and prices September 14.

Jul 24, 2017
93
25
61

Yeah, and if you look at the number of games on that list that are already out, then it kinda proves my point: that ray tracing will just be a "cherry on top" feature that makes the game look a little cooler but won't be essential. When I say "developed with ray tracing as a core feature" I'm talking about a hypothetical game where the core experience is significantly worse with RT features off. I don't think that will come for quite a few years yet.
 
Mar 11, 2004
23,444
5,852
146
There's no way prices are going to climb above these already insane MSRPs.
I'm maybe thinking about the 2070, but with no performance details that's still questionable. If what I hear is correct and the 2070 has no NVLink, you can just about forget it. $800 is ridiculous for an NVLink'd 2080 that I'm still unsure about in terms of hardware-access gimping. $1200 for a 1080ti is outright comedy o_O.

This is what got Intel into trouble, and I'm sure come 2019 Nvidia is going to pay a steep price.
12nm... being phased out by 7nm in a year, and this is how you launch a product you critically need people to adopt? KEK

I'd have bought one if I thought they'd be unavailable, prices would rise, and I'd be able to charge some pleb double MSRP on eBay... I don't think that's even possible or profitable after fees.

No way prices go above MSRP. The 2070 is the only reasonable card. However, if the 2070 has no NVLink, it's DOA.

The 2080/2080 Ti are still in stock for pre-order, as one would expect at these prices, after hours sitting on Nvidia's website.

I'm not sure why you're so focused on NVLink. These are consumer gaming cards.

I also think you're not seeing things from Nvidia's perspective. These cards are not going to push ray-tracing to critical mass. This is just the first step, and it will be a longer-term thing. They're not thinking "oh, we need these out ASAP so that we can get ray-tracing in games everywhere!" Nvidia might be greedy, but they're not stupid; they know ray tracing won't take off until mass-market products have it.

The only reason they're selling these consumer cards is that they brought out the Quadros, and they need to produce enough chips to justify the cost of developing them; consumer volume lets them do that, whereas Quadros alone probably wouldn't. It also gives them something to tout (after Volta no-showed in anything resembling a non-commercial product last year). They needed new Quadros (and the ray-tracing hardware gave them a compelling reason to release them), but the cost of producing them is high without some volume, and even then it's expensive (as seen in the pricing). Because of the ray-tracing hardware, they can somewhat justify the extra cost in consumer cards, and it gets the ball rolling on developers utilizing the feature, so that when they implement it in more affordable products, they can point to games already using it.
 

Ranulf

Platinum Member
Jul 18, 2001
2,920
2,601
136
There's no way prices are going to climb above these already insane MSRPs.
I'm maybe thinking about the 2070, but with no performance details that's still questionable. If what I hear is correct and the 2070 has no NVLink, you can just about forget it. $800 is ridiculous for an NVLink'd 2080 that I'm still unsure about in terms of hardware-access gimping. $1200 for a 1080ti is outright comedy o_O.

This is what got Intel into trouble, and I'm sure come 2019 Nvidia is going to pay a steep price.
12nm... being phased out by 7nm in a year, and this is how you launch a product you critically need people to adopt? KEK

I'd have bought one if I thought they'd be unavailable, prices would rise, and I'd be able to charge some pleb double MSRP on eBay... I don't think that's even possible or profitable after fees.

No way prices go above MSRP. The 2070 is the only reasonable card. However, if the 2070 has no NVLink, it's DOA.

The 2080/2080 Ti are still in stock for pre-order, as one would expect at these prices, after hours sitting on Nvidia's website.

Eh, the hype train will get them over MSRP for at least the first go round after release. NE already has the Asus blower card going for $1200.

https://www.newegg.com/Product/Prod...1&cm_re=PPSSGZSCRAVQVC-_-14-126-248-_-Product
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Yeah, and if you look at the number of games on that list that are already out, then it kinda proves my point: that ray tracing will just be a "cherry on top" feature that makes the game look a little cooler but won't be essential. When I say "developed with ray tracing as a core feature" I'm talking about a hypothetical game where the core experience is significantly worse with RT features off. I don't think that will come for quite a few years yet.

Yep, hardware PhysX all over again. With PhysX, for years, adding a second GPU to handle it meant you got some minor extra eye candy like blowing leaves.

And half the time it felt like the extra bits were intentionally held back from the software version just to earn those co-marketing dollars and other support from nvidia.

Maybe it will be different this time but I'm not going to hold my breath.
 

MrTeal

Diamond Member
Dec 7, 2003
3,919
2,708
136
Without knowing the performance vs. the current gen, I can't see why anyone would want to pre-order a 2080. The Ti makes sense; it's going to be faster than the 1080 Ti, and if you really want one you might as well put in a preorder. The 2080 is a couple hundred more than 1080 Tis have been going for, though; dropping that kind of cash without knowing the performance delta seems ill-advised.
 

Guru

Senior member
May 5, 2017
830
361
106
Ray tracing is years off, at least 5-6 years, from being rendered in real time efficiently and at high fps. Thing is, they will have some half-baked implementation that is just a gimmick, like PhysX, in order to justify the massive price increases on these new cards.

Essentially the absurd 'hey, the ONE flag in our game now moves, only with PhysX'.

Ray tracing is amazing when done fully and properly, but again, we are years off from a consumer graphics card that can actually process it in real time with great efficiency.
 

alcoholbob

Diamond Member
May 24, 2005
6,390
470
126
Without knowing the performance vs. the current gen, I can't see why anyone would want to pre-order a 2080. The Ti makes sense; it's going to be faster than the 1080 Ti, and if you really want one you might as well put in a preorder. The 2080 is a couple hundred more than 1080 Tis have been going for, though; dropping that kind of cash without knowing the performance delta seems ill-advised.

If the clock speeds are disappointing, you might even be better off buying a used Titan V... they're dropping down to the $1600 range used.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Ray tracing is years off, at least 5-6 years, from being rendered in real time efficiently and at high fps. Thing is, they will have some half-baked implementation that is just a gimmick, like PhysX, in order to justify the massive price increases on these new cards.

Essentially the absurd 'hey, the ONE flag in our game now moves, only with PhysX'.

Ray tracing is amazing when done fully and properly, but again, we are years off from a consumer graphics card that can actually process it in real time with great efficiency.

Says the guy who was telling us he was certain we were not getting RTX cards but plain old GTX cards:
OMG this is so silly. The RTX brand is reserved for the already-announced Quadro cards, which, even though they were announced a few days ago, will be coming Q4 2018....

So, not to burst your bubble, but all of the "hints" in the video are for the RTX Quadro announcement. It's going to be the GTX 2000 series for desktop, with the memory and price point that I've already posted a few times (you can check my post history), coming late Q4 for the solo GTX 2080 and then the rest of the series in 2019.

I don't think your guesswork is worth much. RTX HW is what NVidia is delivering, and they have been working with devs behind the scenes to get support into major game engines, and to get updates for already shipping games.
 

24601

Golden Member
Jun 10, 2007
1,683
40
86
Finally, we can go back to the DX9 era of real rendering instead of 100 layers of photoshop filters.
 

pj-

Senior member
May 5, 2015
505
279
136
Says the guy who was telling us he was certain we were not getting RTX cards but plain old GTX cards:


I don't think your guesswork is worth much. RTX HW is what NVidia is delivering, and they have been working with devs behind the scenes to get support into major game engines, and to get updates for already shipping games.

I can't wait to see how ugly and disjointed RTX looks in PUBG.
 

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
Ray tracing is years off, at least 5-6 years, from being rendered in real time efficiently and at high fps. Thing is, they will have some half-baked implementation that is just a gimmick, like PhysX, in order to justify the massive price increases on these new cards.

I agree. JHH talked a lot about full-screen realtime ray tracing (RTRT? RRT? we've got another acronym, folks), but the reality is that only three effects are in play for this generation. I think this stuff is cool from a tech perspective and I'm always happy to see things advance, but this is not the game changer that salesman #1 JHH presented it as. None of the demos were in a live game, and I think that is telling. Slowing stuff down and pointing out the difference is one thing; playing in realtime is another. I would not be gawking at the effects in BF5 when people are shooting at me, either. I can see it now: 70% of the BF5 gamers eliminated per match will be knifed while watching the l33t GigaRays.
 

Paul98

Diamond Member
Jan 31, 2010
3,732
199
106
This will be quite interesting when actual benchmarks come out, and I really like the sound of not having to fake it to get good dynamic lighting and shadows. Also, how do these RT cores work with the rest of the GPU? If you are using the RT cores for these things, does it mean it will free up GPU power that would normally be used to fake these effects on the non-RT cores? Or is this just marketing BS?
 

sze5003

Lifer
Aug 18, 2012
14,320
683
126
In order for ray tracing to go mainstream, the consoles have to have a chip capable of doing it; then it will become common amongst games. I see this going the way PhysX did.

Waiting on 2080ti vs 1080ti performance results without RTX enabled before I consider it.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
Not much of a worry for NVidia, since they haven't sold GPUs to Apple for years. That is a much bigger worry for AMD, because Apple is probably one of their most significant GPU buyers.



The DX12 ray tracing announcement was made at GDC in March, together with NVidia RTX. All the demos were NVidia demos. The Microsoft raytracing dev kit HW was NVidia Volta. This is pretty much a Microsoft/NVidia partnership, and NVidia would have been driving the APIs in directions favorable to the actual HW it was building.

NVidia has a huge lead in real-time Raytracing.
https://gpuopen.com/announcing-real-time-ray-tracing/
https://www.guru3d.com/news-story/amd-also-announces-support-for-real-time-ray-tracing.html
https://www.tomshardware.com/news/amd-radeon-rays-raytracing-software,36702.html

It was all a meme. I used to read word up magazine.

For both AMD and Nvidia, it's all a meme until there are sound benchmarks and details provided.
I don't like the fact that proprietary DirectX 12 is all over this technology, and I'm still parsing materials to understand the clear demarcation between Microsoft's stack and Nvidia's/AMD's. Altogether this seems like a beta feature that is still being worked out. Launching the world's new game-changing technology and providing no benchmarks or elaboration beyond marketing slides isn't confidence-inspiring. I'm not impressed by either AMD or Nvidia, especially when I've learned another company put this into a functional card over 10 years ago. It's just vector math processors, fundamentally.

[Image: R2500 ray tracing card]
 

yottabit

Golden Member
Jun 5, 2008
1,672
874
146
I only caught part of the presentation

Did they detail out the functionality of the "RT Cores" and how many there are?

I am assuming they are physical hardware/die space? Separate from the regular shader/CUDA cores?

Can they be used for anything else when not doing real-time ray tracing, or do they just sit idle?
 

Paul98

Diamond Member
Jan 31, 2010
3,732
199
106
If this really does allow realistic lighting, from both GI and small lights, that looks good without extra work, it will be amazing.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
I only caught part of the presentation

Did they detail out the functionality of the "RT Cores" and how many there are?

I am assuming they are physical hardware/die space? Separate from the regular shader/CUDA cores?

Can they be used for anything else when not doing real-time ray tracing, or do they just sit idle?
Nope, it wasn't specified. This is what I'm waiting for. They're scant on all of the details that matter, which inspires little to no confidence from me. I'm currently trying to confirm whether or not NVLink is missing from the 2070 and lower. If it is, good god, they really are going for a screw job. If you search around, you'll find one of the companies that had ray tracing working in hardware some years ago. They fully detail the hardware pipeline and how it interfaces with the broader GPU pipeline. From what Nvidia has detailed, it's essentially the same. My understanding of the processing is that it's essentially just straightforward vector math... a bunch of ALUs, which is why it can process so many rays at once.
[Image: ray tracing hardware pipeline block diagram]
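To illustrate what I mean by "straightforward vector math": a single ray-triangle test, the kind of operation this hardware would be batching in parallel, boils down to a handful of cross and dot products. A rough Python sketch of the textbook Moller-Trumbore test, purely illustrative and obviously not Nvidia's actual RT core implementation:

Code:
# Illustrative sketch only: a standard Moller-Trumbore ray/triangle test,
# showing that a single ray intersection is just cross/dot products.
import numpy as np

def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-8):
    """Return distance t along the ray to the hit, or None if it misses."""
    edge1 = v1 - v0
    edge2 = v2 - v0
    pvec = np.cross(direction, edge2)
    det = np.dot(edge1, pvec)
    if abs(det) < eps:                     # ray is parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    tvec = origin - v0
    u = np.dot(tvec, pvec) * inv_det       # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    qvec = np.cross(tvec, edge1)
    v = np.dot(direction, qvec) * inv_det  # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(edge2, qvec) * inv_det      # hit distance along the ray
    return t if t > eps else None

# One ray shot down +z at a triangle sitting in the z = 5 plane.
hit = ray_triangle_intersect(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]),
                             np.array([-1.0, -1.0, 5.0]), np.array([1.0, -1.0, 5.0]),
                             np.array([0.0, 1.0, 5.0]))
print(hit)  # prints 5.0, the hit distance

Each ray test is independent of the others, so the throughput story is mostly how many of these you can run side by side.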


My bigger question is whether or not this can be broken out to a separate chip without latency issues. Barring latency issues, I see no reason to go the Intel route of huge monolithic dies, which make for insane pricing.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,406
16,255
136
Well, I just got a 1080 Ti FTW3 for $729. I may get another if I see them for that again. Until I see benchmarks, no way would I pay $1200 for a 2080 Ti; that's like 66% more for what I see as possibly a 30% performance improvement.

Edit: directly from EVGA I can get the FTW3 DT (100 MHz slower) for $649 right now. And the regular FTW3 is $729 right now for Elite members.
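For what it's worth, the back-of-the-napkin math, taking my $729 FTW3 price and the $1200 2080 Ti as given and treating the 30% uplift as a pure guess until real benchmarks show up:

Code:
# Rough price/performance check. The 30% performance uplift is a guess,
# not a benchmark; prices are the $729 FTW3 and the $1200 2080 Ti.
price_1080ti = 729.0
price_2080ti = 1200.0
perf_uplift = 1.30   # assumed, unverified

premium = price_2080ti / price_1080ti - 1.0
perf_per_dollar = perf_uplift / (price_2080ti / price_1080ti)

print(f"Price premium: {premium:.0%}")                       # ~65% more money
print(f"Relative perf per dollar: {perf_per_dollar:.2f}x")   # ~0.79x, i.e. worse value

Even if the uplift ends up higher than 30%, it has to hit roughly 65% just to break even on performance per dollar.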
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
I don't suppose I'll be sitting idle looking for gigarays when people are shooting at my head. But this will look awesome in Witcher 3, where you can take your time and enjoy it.

Cyberpunk 2077 even better, since it takes place in high tech world with lots of reflective surfaces and artificial light sources.
 

sze5003

Lifer
Aug 18, 2012
14,320
683
126
Cyberpunk 2077 even better, since it takes place in high tech world with lots of reflective surfaces and artificial light sources.
Yeah, I completely forgot about that game. If done right, it will be nice. Especially when it's raining too.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
Well, I just got a 1080 Ti FTW3 for $729. I may get another if I see them for that again. Until I see benchmarks, no way would I pay $1200 for a 2080 Ti; that's like 66% more for what I see as possibly a 30% performance improvement.

Edit: directly from EVGA I can get the FTW3 DT (100 MHz slower) for $649 right now. And the regular FTW3 is $729 right now for Elite members.
They've got a lot of confidence, launching at these prices while scant on details about a new feature set and devoid of comparative benchmarks. I was expecting them to try to sell me on it, but it was more or less like they were convinced everyone was just going to start buying them, or that they didn't necessarily want to sell volume. This is above the pricing of some Quadros... which I predicted was their eventual goal. The issue comes down to whether or not you get Quadro-level access to the cards, which I doubt. So, a month from actual launch, this is what everyone has to go by. On just about every forum and site on the internet, everyone is shocked by the pricing and lack of details.

This is what got Intel into its current troubles. The same patterns.
 

sze5003

Lifer
Aug 18, 2012
14,320
683
126
They've got a lot of confidence, launching at these prices while scant on details about a new feature set and devoid of comparative benchmarks. I was expecting them to try to sell me on it, but it was more or less like they were convinced everyone was just going to start buying them, or that they didn't necessarily want to sell volume. This is above the pricing of some Quadros... which I predicted was their eventual goal. The issue comes down to whether or not you get Quadro-level access to the cards, which I doubt. So, a month from actual launch, this is what everyone has to go by. On just about every forum and site on the internet, everyone is shocked by the pricing and lack of details.

This is what got Intel into its current troubles. The same patterns.
Had they shown comparison benchmarks with RTX on and off, and against the little brother 1080 Ti, I probably would have pre-ordered one if the performance is as great as they were talking about. They made it sound like "the 2070 is better than the Titan Xp," and by that statement they expected people to simply believe that the 2080 Ti is faster by a whole lot. But they were referring to ray tracing.

The whole presentation felt like a software presentation.
 