Desktop graphics will be the GTX 2000 series, just as I've been saying, and yes, it will be GTX, not RTX. RTX is for the new professional market graphics. My bet is still 2019, though, potentially very late Q4 2018 for just a single GTX 2080 as a Founders Edition costing probably around $700, launched just before the holiday season to ride that out as a gift graphics card.
GV100 will still have more NVLink capacity. Looks like Turing vs. Volta is a workstation vs. HPC card breakdown. I personally think NVIDIA has enough money and potential revenue to customize the same base-level architecture a bit differently for each niche. Volta continues to be the hyperscale chip of choice?
What's throwing me is that if the RTX 5000 is GT104, destined for the consumer RTX 2080/2070, why would it still be full of Tensor cores? I know they will likely disable them, but it seems like a waste of silicon for what will be a volume consumer part and probably an insignificant volume of Quadro parts.
I don't think we've heard anything about DP yet either, so that could go either way.
It's full of tensor cores to accelerate denoising. It's pretty obvious that Nvidia is looking to beat AMD over the head with the raytracing club. If they get sufficient software support and AMD doesn't have a decent raytracing solution, well, it will be like 3DFX being stuck at 16-bit colour again, only much much worse. I'm just crossing my fingers that Navi has something similar up its sleeve.
It's full of tensor cores to accelerate denoising.
Denoising what? 3D graphics in games don't have noise.
Let's speculate:
So there's a good chance the GTX 2080 will have 3072 shaders? The rumours of it only being 8% faster than the GTX 1080 Ti seem much more viable now.
Denoising the ray-traced output. I think the number of rays per pixel is kept pretty low to make real-time rendering possible.
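For the curious, a toy sketch of why that matters: with only a few samples per pixel the estimate is grainy, and even a cheap spatial filter recovers a lot of it. This is just plain numpy with made-up per-sample noise and a box blur, not NVIDIA's actual AI denoiser that runs on the tensor cores.

```python
import numpy as np

# Toy Monte Carlo "renderer": each pixel's true brightness is a smooth gradient,
# but we estimate it from only a few random samples per pixel, which adds noise.
rng = np.random.default_rng(0)
H, W = 64, 64
truth = np.linspace(0.2, 0.8, W)[None, :].repeat(H, axis=0)  # ground-truth image

def render(spp):
    # Each sample is an unbiased but noisy estimate of the pixel value.
    samples = truth[..., None] + rng.normal(0.0, 0.3, size=(H, W, spp))
    return samples.mean(axis=-1)

def box_denoise(img, k=2):
    # Crude spatial denoiser: average each pixel with its (2k+1)x(2k+1) neighbourhood.
    padded = np.pad(img, k, mode="edge")
    out = np.zeros_like(img)
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            out += padded[k + dy:k + dy + H, k + dx:k + dx + W]
    return out / (2 * k + 1) ** 2

for spp in (1, 4, 64):
    noisy = render(spp)
    print(f"{spp:>3} spp  RMSE raw: {np.sqrt(((noisy - truth) ** 2).mean()):.3f}  "
          f"after denoise: {np.sqrt(((box_denoise(noisy) - truth) ** 2).mean()):.3f}")
```

At 1 spp the raw error is huge and the filtered result is already close to the 64 spp render, which is the whole pitch for denoised real-time ray tracing.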
Let's speculate (rough FP32 math; there's a sketch of it after the list):
1: TITAN / full die: 4608 SP at a 2 GHz boost (1730 MHz base clock) is 18.4 TF. The 1080 Ti has 12.1 TF (average boost 1700 MHz), so that's 52% more raw power. Add ~10% IPC from the new architecture and full big Turing comes out around 62% faster than the 1080 Ti (the 2080 Ti will be a cut-down part, so probably around 50-55%).
2: RTX 2080: 3072 SP at 2100 MHz (small dies always boost higher than big ones; the GTX 1080 boosts at 1850 MHz, the 1080 Ti at 1700 MHz) is 12.9 TF. The 1080 Ti has 12.1 TF (boost at 1700 MHz). 6% more raw power than the 1080 Ti + 10% IPC = around 15% faster than the 1080 Ti.
RTX 2070: 2304 SP at 2100 MHz is 9.6 TF. Raw power 26% below the 1080 Ti + 10% IPC puts it around 15% below the 1080 Ti / 15% above the GTX 1080. But I think it will be more like 5-10% slower, because the GTX 1070 is on average 25% slower than the GTX 1080, and if the GTX 2080 ends up 15% faster than the 1080 Ti, then the 2070 will be about 10% slower. The cut-down should be the same (one GPC disabled), so the same gap between them.
3: GTX 2060: 1536 SP at 2100 MHz is 6.4 TF. The GTX 1070 has 7.1 TF (boost 1850 MHz), so the 1070 has 10% more raw power. Add the 10% IPC and the 2060 should land around GTX 1070 performance, maybe 1070 Ti level if the 2060 gets fast GDDR6.
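For anyone who wants to poke at these numbers, here is the back-of-the-envelope math from the list in one place. The shader counts, clocks, and the flat 10% "IPC" uplift are all speculation from the post above, and applying the uplift multiplicatively lands slightly higher than the additive shorthand used there.

```python
# Back-of-the-envelope FP32 math: TFLOPS = shaders * 2 FLOPs/clock (FMA) * clock.
# The +10% "IPC" uplift is pure speculation, applied here as a flat multiplier.

def tflops(shaders, boost_mhz):
    return shaders * 2 * boost_mhz / 1e6

def vs_1080ti(shaders, boost_mhz, ipc_uplift=1.10):
    gtx_1080_ti = tflops(3584, 1700)  # ~12.2 TF at an average 1700 MHz boost
    return tflops(shaders, boost_mhz) * ipc_uplift / gtx_1080_ti

for name, sp, mhz in [("Full Turing (Titan?)", 4608, 2000),
                      ("RTX 2080 (speculated)", 3072, 2100),
                      ("RTX 2070 (speculated)", 2304, 2100),
                      ("GTX 2060 (speculated)", 1536, 2100)]:
    print(f"{name:24} {tflops(sp, mhz):5.1f} TF  ->  {vs_1080ti(sp, mhz):.2f}x of a 1080 Ti")
```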
The problem I have is die size and price.
1: That die is 754 mm². It would be $2000-$3000 like the Titan V, and if they ever made a 2080 Ti out of it, it would probably cost $1500+.
2: may be 498 mm² (66% of the above). That gives us a die larger than the 1080 Ti's, plus more expensive GDDR6 memory. Expect higher than 1080 Ti pricing, especially since NVidia will be justifying it with "revolutionary" Ray Tracing capabilities. $800+ wouldn't surprise me given the increased production cost and lack of competition.
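To put some rough numbers on the die-size worry, here is a quick gross-dies-per-wafer estimate using the usual 300 mm wafer approximation. It ignores defect yield (which also gets worse as the die grows), and GP102's 471 mm² is thrown in for reference; none of this is from NVIDIA, it's just the standard geometry formula.

```python
import math

# Gross (candidate) dies per 300 mm wafer for the die sizes quoted above.
WAFER_DIAMETER_MM = 300

def gross_dies_per_wafer(die_area_mm2):
    d = WAFER_DIAMETER_MM
    return math.pi * (d / 2) ** 2 / die_area_mm2 - math.pi * d / math.sqrt(2 * die_area_mm2)

print(f"754 mm^2: ~{gross_dies_per_wafer(754):.0f} candidate dies per wafer (big Turing)")
print(f"498 mm^2: ~{gross_dies_per_wafer(498):.0f} candidate dies per wafer (speculated ~66% die)")
print(f"471 mm^2: ~{gross_dies_per_wafer(471):.0f} candidate dies per wafer (GP102 / 1080 Ti)")
```

Roughly 1.6x more candidate dies come off a wafer at 498 mm² than at 754 mm², before yield even enters the picture, which is why the big die lands in Titan-class pricing.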
Just FYI: they compared big Turing (RTX 6000) against GP104 (the GTX 1080) in that comparison :/ They claimed a 25x speedup over Pascal for ray tracing...
So more than half of the die is taken up by the tensor units and the ray tracing units.
It would be reasonable to assume consumer parts won't have them. Titan V doesn't really count.
Desktop graphics will be the GTX 2000 series,
NVIDIA hinted at RTX 2080 in all of its promotional materials. So it's RTX 2080. Period.
My bet is still 2019, though, potentially very late Q4 2018 for just a single GTX 2080
Nope, launch is this month; the channel is already brewing.
Agreed that RTX is confusing branding. Bad move. If I were an AMD trademark lawyer this would be a big pain: it's close enough to be worrisome but far enough that it almost fits in with typical graphics card naming practice. Not jealous of their position on this at all.
Can existing games be updated to take advantage of ray tracing?
Had to post this: "Can it run MineSweeper?"
Considering the GPU is used in computational ground-penetrating radar imaging for oil and natural gas exploration, yes, it most certainly can play Minesweeper.
Considering the GPU is used in computational ground-penetrating radar imaging for oil and natural gas exploration, yes, it most certainly can play Minesweeper.
Whew! I was really sweating it there for a moment! Thanks.
