[anandtech] NVIDIA SIGGRAPH 2018 Keynote - RTX GPUs

Page 2

crisium

Platinum Member
Aug 19, 2001
So there's a good chance the GTX 2080 will be 3072 shaders? The rumours of it being only 8% faster than the GTX 1080 Ti seem much more viable now.
 

Guru

Senior member
May 5, 2017
Desktop graphics will be the GTX 2000 series, just as I've been saying, and yeah, it will be GTX, not RTX. RTX is for the new professional-market graphics. My bet is still 2019, though, potentially very late Q4 2018 for just the single GTX 2080 as a Founders Edition costing probably around $700, just before the holiday season, riding on that as a gift graphics card.
 

PeterScott

Platinum Member
Jul 7, 2017
Desktop graphics will be the GTX 2000 series, just as I've been saying, and yeah, it will be GTX, not RTX. RTX is for the new professional-market graphics. My bet is still 2019, though, potentially very late Q4 2018 for just the single GTX 2080 as a Founders Edition costing probably around $700, just before the holiday season, riding on that as a gift graphics card.

You keep saying that, but all evidence is to the contrary.

We are 6 days from the RTX 2080 reveal on Aug 20, according to all the easter eggs in NVidia's own video.

This is almost certainly aimed at a September shipping date.
 

crisium

Platinum Member
Aug 19, 2001
RoyTeX
Not_11
Mac-20
Eight Tee
gimme 20
2,0,8,0

At least 5 references for 2080 and 1 for RTX. As for the date, they had GPS coordinates for Cologne as a clue. That's where Gamescom is, in less than a week.

Put it all together.
 

HurleyBird

Platinum Member
Apr 22, 2003
GV100 will still have more NVLink capacity. Looks like Turing v Volta is an HPC vs Workstation card breakdown. I personally think nVidia has enough $$ and potential revenue to customize the same base level architecture for each niche a bit differently. Volta continues to be the hyperscale chip of choice?

I don't think we've heard anything about DP yet either, so that could go either way.

What's throwing me is that if the RTX 5000 is GT104, destined for the consumer RTX 2080/2070, why would it still be full of Tensor cores? I know they will likely disable them, but it seems like a waste of silicon for what will be a volume consumer part, and probably an insignificant volume of Quadro parts.

It's full of tensor cores to accelerate denoising. It's pretty obvious that Nvidia is looking to beat AMD over the head with the raytracing club. If they get sufficient software support and AMD doesn't have a decent raytracing solution, well, it will be like 3DFX being stuck at 16-bit colour again, only much much worse. I'm just crossing my fingers that Navi has something similar up its sleeve.
 

24601

Golden Member
Jun 10, 2007
I don't think we've heard anything about DP yet either, so that could go either way.



It's full of tensor cores to accelerate denoising. It's pretty obvious that Nvidia is looking to beat AMD over the head with the raytracing club. If they get sufficient software support and AMD doesn't have a decent raytracing solution, well, it will be like 3DFX being stuck at 16-bit colour again, only much much worse. I'm just crossing my fingers that Navi has something similar up its sleeve.

AMD doesn't even have anything planned post GCN.

Why do you think Raja Koduri (and his friends) ragequit AMD?

Also, Nvidia doesn't do anything purely to "get one over" on competitors.

Jensen obviously loves making bold moves, as evidenced by basically his entire time at Nvidia.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
RTX is terrible branding. Too close to RX, which is already in use in this industry.
 

Head1985

Golden Member
Jul 8, 2014
So there's a good chance the GTX 2080 will be 3072 shaders? The rumours of it being only 8% faster than the GTX 1080 Ti seem much more viable now.
Let's speculate:

1. TITAN/full die: 4608 SP at 2 GHz boost (1730 MHz base clock) is 18.4 TF. The 1080 Ti has 12.1 TF (average boost 1700 MHz); that's 52% more raw power. Add 10% IPC from the new arch and full big Turing comes out 62% faster than the 1080 Ti (the 2080 Ti will be cut down, so probably around 50-55%).
2. RTX 2080: 3072 SP at 2100 MHz (a small die always boosts higher than a big one; the GTX 1080 boosts at 1850 MHz, the 1080 Ti at 1700 MHz) is 12.9 TF. The 1080 Ti has 12.1 TF (boost at 1700 MHz), so 6% more raw power + 10% IPC = around 15% faster than the 1080 Ti.
3. RTX 2070: 2304 SP at 2100 MHz is 9.6 TF. Raw power 26% below the 1080 Ti + 10% IPC puts it 15% below the 1080 Ti / 15% above the GTX 1080. But I think it will be more like 5-10% slower, because the GTX 1070 averages 25% slower than the GTX 1080, and if the 2080 is 15% faster than the 1080 Ti, then the 2070 will be 10% slower. The cutdown should be the same (1 GPC off), so the same gap between them.
4. GTX 2060: 1536 SP at 2100 MHz is 6.4 TF. The GTX 1070 has 7.1 TF (boost 1850 MHz), 10% more raw power. Add 10% IPC and the 2060 should be around 1070 performance, maybe 1070 Ti if the 2060 gets fast GDDR6.
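All the raw-throughput numbers above come from the standard FP32 formula, TFLOPS = shader count × 2 ops/clock (FMA) × clock. A quick sketch reproducing them; the Turing shader counts and clocks are this post's speculation, not confirmed specs:

```python
# FP32 throughput: shaders * 2 ops/clock (fused multiply-add) * clock in GHz -> TFLOPS
def tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000

# Turing entries are speculated; the Pascal entries use known counts and average boosts.
lineup = {
    "Titan (full die)": (4608, 2.0),
    "RTX 2080":         (3072, 2.1),
    "RTX 2070":         (2304, 2.1),
    "GTX 2060":         (1536, 2.1),
    "GTX 1080 Ti":      (3584, 1.7),
    "GTX 1070":         (1920, 1.85),
}
for name, (sp, clk) in lineup.items():
    print(f"{name}: {tflops(sp, clk):.1f} TF")
```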
 

PeterScott

Platinum Member
Jul 7, 2017
Denoising for ray tracing. I think the number of rays per pixel is pretty low in order to make real-time rendering possible.

Thanks. That makes sense.

But it makes the die-space hit for ray tracing even higher, so it really limits this to 2080/2070-and-above cards.

It also makes more sense, then, that the RTX 5000 could use the same (smaller than RTX 8000) chip as the RTX 2080.

So the RTX 5000 specs might be what the consumer RTX 2080 card gets.
 

PeterScott

Platinum Member
Jul 7, 2017
Let's speculate:

1. TITAN/full die: 4608 SP at 2 GHz boost (1730 MHz base clock) is 18.4 TF. The 1080 Ti has 12.1 TF (average boost 1700 MHz); that's 52% more raw power. Add 10% IPC from the new arch and full big Turing comes out 62% faster than the 1080 Ti (the 2080 Ti will be cut down, so probably around 50-55%).
2. RTX 2080: 3072 SP at 2100 MHz (a small die always boosts higher than a big one; the GTX 1080 boosts at 1850 MHz, the 1080 Ti at 1700 MHz) is 12.9 TF. The 1080 Ti has 12.1 TF (boost at 1700 MHz), so 6% more raw power + 10% IPC = around 15% faster than the 1080 Ti.
3. RTX 2070: 2304 SP at 2100 MHz is 9.6 TF. Raw power 26% below the 1080 Ti + 10% IPC puts it 15% below the 1080 Ti / 15% above the GTX 1080. But I think it will be more like 5-10% slower, because the GTX 1070 averages 25% slower than the GTX 1080, and if the 2080 is 15% faster than the 1080 Ti, then the 2070 will be 10% slower. The cutdown should be the same (1 GPC off), so the same gap between them.
4. GTX 2060: 1536 SP at 2100 MHz is 6.4 TF. The GTX 1070 has 7.1 TF (boost 1850 MHz), 10% more raw power. Add 10% IPC and the 2060 should be around 1070 performance, maybe 1070 Ti if the 2060 gets fast GDDR6.


The problem I have is die size and price.

1: is 754 mm². This would be $2,000-$3,000 like the Titan V, and if they ever made a 2080 Ti out of it, it would probably cost $1,500+.

2: may be 498 mm² (66% of the above). That gives us a die larger than the 1080 Ti's, plus more expensive GDDR6 memory. Expect higher-than-1080 Ti pricing, especially since NVidia will be justifying it with "revolutionary" ray-tracing capabilities. $800+ wouldn't surprise me given the increased production cost and lack of competition.
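The die-size worry can be made concrete with a back-of-the-envelope cost model: candidate dies per 300 mm wafer scale inversely with area, and yield falls roughly exponentially with area under a simple Poisson defect model. The defect density below is an illustrative assumption, not a foundry figure:

```python
import math

# Gross dies per 300 mm wafer (ignoring edge loss) and a Poisson yield model:
# yield = exp(-defect_density * die_area). D0 here is an assumed illustrative value.
WAFER_AREA_MM2 = math.pi * (300 / 2) ** 2
D0 = 0.1  # defects per cm^2 (assumption)

def relative_cost(area_mm2: float) -> float:
    dies = WAFER_AREA_MM2 / area_mm2
    yld = math.exp(-D0 * area_mm2 / 100)  # convert mm^2 to cm^2
    return 1 / (dies * yld)  # cost per good die, in wafer-cost units

big, small = 754, 498
print(f"498 mm2 die is {small/big:.0%} of the 754 mm2 die")
print(f"cost per good die, 754 vs 498: {relative_cost(big)/relative_cost(small):.2f}x")
```

Even this crude model shows the full die costing roughly twice as much per good die as the cut-down one, before memory and board costs.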
 

Timmah!

Golden Member
Jul 24, 2010
The problem I have is die size and price.

1: is 754 mm². This would be $2,000-$3,000 like the Titan V, and if they ever made a 2080 Ti out of it, it would probably cost $1,500+.

2: may be 498 mm² (66% of the above). That gives us a die larger than the 1080 Ti's, plus more expensive GDDR6 memory. Expect higher-than-1080 Ti pricing, especially since NVidia will be justifying it with "revolutionary" ray-tracing capabilities. $800+ wouldn't surprise me given the increased production cost and lack of competition.

Well, if they want to release a card faster than the previous high-end, even if by a meager 10 percent, on pretty much the same node as before, the chip will need to be relatively big. 300 mm² won't cut it this time around as it did with the 1080.

Personally, I have several hopes:

1. The RTX 2080 has got to be the same chip as the Quadro RTX 5000, with no tensor and RT stuff stupidly cut away as they used to do with FP64. I could live with just half the VRAM and no NVLink (although I would not mind having it), but hopefully it's the same full chip with no functionality artificially borked for market-segmentation reasons.
2. Since the RTX stuff is clearly made not just for games but for content creation too, I hope my app of choice, Octane Render, is going to be able to use it. They claimed a 25x speedup against Pascal for ray tracing; if it accelerates Octane by, say, only 5x compared to my current 1080, it's money well spent. I don't care if it's just 10 percent faster than the 1080 Ti for games, but it needs to be that much faster in Octane.
3. Hopefully, the price will be right. I paid 770 euros for the 1080, and I'm not keen to pay even more than that now.

If all of this happens, I am very likely buying one :p Bottom line, this is the most exciting GPU release in years, as far as I am concerned.
 

IntelUser2000

Elite Member
Oct 14, 2003
So more than half of the die is taken up by the tensor units and the ray tracing units.

It would be reasonable to assume consumer parts won't have them. Titan V doesn't really count.
 

Timmah!

Golden Member
Jul 24, 2010
So more than half of the die is taken up by the tensor units and the ray tracing units.

It would be reasonable to assume consumer parts won't have them. Titan V doesn't really count.

It is possible, but then no ray tracing for games.
 

Timmah!

Golden Member
Jul 24, 2010
Thanks, did not know that.

Anyway, 15x for the smaller chip then. I am OK with that too :-D
 

PeterScott

Platinum Member
Jul 7, 2017
So more than half of the die is taken up by the tensor units and the ray tracing units.

It would be reasonable to assume consumer parts won't have them. Titan V doesn't really count.

The RTX 2080 revealed next week will have RT and Tensor units. It's reasonable to assume that parts below the 2080/2070 won't have RT/Tensor.
 

Muhammed

Senior member
Jul 8, 2009
Desktop graphics will be GTX 2000 series,
NVIDIA hinted at RTX 2080 in all of its promotional materials. So it's RTX 2080. Period.



My bet is still 2019 though, potentially very late Q4 2018 for just the single GTX 2080
Nope, launch is this month, the channel is already brewing.
 

Headfoot

Diamond Member
Feb 28, 2008
Agreed that RTX is confusing branding. Bad move. If I were an AMD trademark lawyer this would be a big pain: it's close enough to be worrisome, but far enough that it almost fits in with typical graphics-card naming practice. Not jealous of their position on this at all.
 
Mar 11, 2004
Agreed that RTX is confusing branding. Bad move. If I were an AMD trademark lawyer this would be a big pain: it's close enough to be worrisome, but far enough that it almost fits in with typical graphics-card naming practice. Not jealous of their position on this at all.

If I were AMD I wouldn't say a damned thing about it, as it's to their benefit if anything. Nvidia has much more brand recognition, so if Nvidia dupes people into not realizing that RX cards from RTG are not RTX cards from Nvidia, then hey, it's their own fault. The curious thing would be if Nvidia threw a fit should AMD call their stuff XTR or XRT, or even better ARTX (AMD bought ArtX back in the early 2000s; that's actually what set them on the course for the R300 series).

Can existing games be updated to take advantage of ray tracing?

Probably, but I can't imagine it'll be optimal, and it will likely come at a huge performance cost. I believe some game engines already have a ray-tracing path (I think they're actually being used some for doing effects professionally), so it might be simpler, but it'll still be a big hit I'm sure, and even with specialized hardware I'm not sure it'll be able to offer worthwhile performance for the visual quality.

I would be curious how the pro applications compare to the ray-tracing DX12 stuff Microsoft is working on implementing. From what I've gathered they're basically half-baking the ray tracing into games (approximating it as best they can, which means it's not nearly as intensive). I'm not sure it's as revolutionary as they make it out to be, but maybe I haven't seen what it can do, and it is early, so too early to judge anyway. And maybe it's something that would have a lot bigger benefits for, say, VR, although I feel like that's actually going to double up on the ray-tracing problems (since you're now processing things per eyeball).
 

24601

Golden Member
Jun 10, 2007
Had to post this
"Can it run MineSweeper?"
Considering the GPU is used in computational ground-penetrating radar imaging for oil and natural-gas exploration, yes, it most certainly can play Minesweeper.