NVIDIA Volta Rumor Thread


TheF34RChannel

Senior member
May 18, 2017
I'm guessing we'll see a $599-649 USD price tag for GTX 2080. If it's only ~10% faster than the GTX 1080 TI, then I think $599 is more likely. If GV204 is able to crush 1080 TI by 20% or more at 4k, then expect $649 or more pricing.

You see, that I could live with, because it's good old normalcy. I also expect (or hope) the GTX 2080 to be quite a bit better than the current Ti: more than 10%, maybe even your suggested 20% (30% at most, though).
 

Glo.

Diamond Member
Apr 25, 2015
The GPUs I am most excited about are the ones without a 6-pin connector: the GTX 2050 Ti and 2050.

If the GTX 2050 Ti delivers GTX 1060-level performance or slightly better - bring it on!
 

TheF34RChannel

Senior member
May 18, 2017
The GPUs I am most excited about are the ones without a 6-pin connector: the GTX 2050 Ti and 2050.

If the GTX 2050 Ti delivers GTX 1060-level performance or slightly better - bring it on!

I just want a GTX 2080 (Ti) with 2x 8-pin connectors so it can draw as much power as it needs. I say this extremely prematurely, because it may be so energy efficient it can do with less, but it's more comforting to me to know they're there, if that makes sense?
 

Qwertilot

Golden Member
Nov 28, 2013
Doesn't matter if it needs it or not, someone will make one :) Heck, there's a decent chance someone will make a 2080 that way, which really will be silly.
(The really 'extreme' xx80 editions are a bit odd; it makes some sort of sense for the Tis.)
 

TheF34RChannel

Senior member
May 18, 2017
Doesn't matter if it needs it or not, someone will make one :) Heck, there's a decent chance someone will make a 2080 that way, which really will be silly.
(The really 'extreme' xx80 editions are a bit odd; it makes some sort of sense for the Tis.)

Someone will; whether that someone is one of my favorite brands remains to be seen - luxury problems, eh. They should just make the 2080 the Ti-specced part and have no Ti, like before; that gives everyone more time to enjoy the best card for longer (which won't happen as long as the Titan is around, since it and the Ti are symbiotic in existence).

Anyway, when do you reckon we'll get some factual information about consumer Volta?
 

tviceman

Diamond Member
Mar 25, 2008
You see, that I could live with, because it's good old normalcy. I also expect (or hope) the GTX 2080 to be quite a bit better than the current Ti: more than 10%, maybe even your suggested 20% (30% at most, though).

Well, it may be, but it depends on how much Nvidia was willing to balloon die sizes. 12nm FFN isn't much of a density improvement over 16nm FF+, but it does boast a 30% perf/W improvement. So if the Volta architecture can squeeze out another 15% perf/W improvement before factoring in node improvements, that's roughly a 50% increase in performance over the GTX 1080, or only about an 11% increase over a 1080 Ti at 4K. Volta needs to bring it to be more of a leap than Maxwell was over Kepler and Pascal was over Maxwell (at least at their initial releases).
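
Just to make that arithmetic explicit, here's a rough back-of-envelope sketch in Python. The 30% and 15% perf/W figures are the assumptions from above; the ~35% gap between a 1080 Ti and a 1080 at 4K is my own ballpark, not an official number.

Code:
# Back-of-envelope version of the estimate above; all inputs are assumptions, not official figures
node_perf_w  = 1.30   # claimed perf/W gain of 12nm FFN over 16nm FF+
arch_perf_w  = 1.15   # hypothetical perf/W gain from the Volta architecture itself
ti_over_1080 = 1.35   # assumed GTX 1080 Ti advantage over a GTX 1080 at 4K

gain_over_1080 = node_perf_w * arch_perf_w      # ~1.50 -> ~50% over a GTX 1080 at equal power
gain_over_ti   = gain_over_1080 / ti_over_1080  # ~1.11 -> ~11% over a GTX 1080 Ti

print(f"{(gain_over_1080 - 1) * 100:.1f}% over GTX 1080, {(gain_over_ti - 1) * 100:.1f}% over 1080 Ti")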
 

zuzu

Junior Member
Jul 1, 2017
Yeah, but that perf/W won't be possible if they add more cores, and they will.
Right now it's 300-400 W for the Ti; add ~30% more cores and, even if you trust the 12nm PR claims, it evens out: 300-400 W at 1.8+ GHz.
 

zuzu

Junior Member
Jul 1, 2017
Yeah, same cores, just more of them, and more cores means a 1-3% clock-for-clock loss from the extra communication overhead.
GP100 vs GV100.
 

TheF34RChannel

Senior member
May 18, 2017
Well, it may be, but it depends on how much Nvidia was willing to balloon die sizes. 12nm FFN isn't much of a density improvement over 16nm FF+, but it does boast a 30% perf/W improvement. So if the Volta architecture can squeeze out another 15% perf/W improvement before factoring in node improvements, that's roughly a 50% increase in performance over the GTX 1080, or only about an 11% increase over a 1080 Ti at 4K. Volta needs to bring it to be more of a leap than Maxwell was over Kepler and Pascal was over Maxwell (at least at their initial releases).

Good points. More information at this point would be welcome, but Nvidia is keeping quiet - presumably because it's still far away. There aren't even any particularly believable leaks, only the usual speculation.
 

Qwertilot

Golden Member
Nov 28, 2013
Well, with how they're dominating the market just now they've got an obvious major reason to stay quiet until genuinely near release :)

iirc quite good at it too - I seem to remember Pascal rather surprising at least some people?
 

Qwertilot

Golden Member
Nov 28, 2013
Well, yes, but that's only one month from start of rumours to release. Much tighter than AMD have been recently.

I can't imagine anyone ever launching anything significant in August :)
 

zlatan

Senior member
Mar 15, 2011
The top gaming Volta will come around February. The base configuration will use 24 GB of GDDR6, so they have to wait for the memory. After that there will be a 48 GB version, for the serious gamers. :)
 

crisium

Platinum Member
Aug 19, 2001
Top gaming Volta in Feb? You can only mean a GV102 Titan Xv.

So you expect a GV104 GTX 2080 even earlier. Since they always come first.

We shall see.
 

TheF34RChannel

Senior member
May 18, 2017
The top gaming Volta will come around February. The base configuration will use 24 GB of GDDR6, so they have to wait for the memory. After that there will be a 48 GB version, for the serious gamers. :)

All this would be great, but I'm skeptical, sorry. I'd also like to see a source for your information - is that possible? Or at least hear what the source is.
 

Bouowmx

Golden Member
Nov 13, 2016
For pioneering models implementing GDDR6, I would assume 1 GB/chip density to make production easy, meaning total 12 GB in NVIDIA GV102 for GeForce, 8 GB in GV104, and 6 GB in GV106. Besides, 4K high-quality gaming today can fit in those buffers. Quadro variants come later, with double memory.
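
As a quick sanity check on those totals, here's the arithmetic as a small Python sketch; the 384/256/192-bit bus widths per die are my assumption based on the usual NVIDIA layout, with one chip per 32-bit channel.

Code:
# capacity = (bus width / 32 bits per channel) * 1 GB per chip
chip_gb = 1  # assumed 8 Gb (1 GB) GDDR6 chips at launch
for die, bus_bits in (("GV102", 384), ("GV104", 256), ("GV106", 192)):
    channels = bus_bits // 32             # one chip per 32-bit channel
    print(die, channels * chip_gb, "GB")  # 12 GB, 8 GB, 6 GB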
 

TheF34RChannel

Senior member
May 18, 2017
For pioneering models implementing GDDR6, I would assume 1 GB/chip density to make production easy, meaning total 12 GB in NVIDIA GV102 for GeForce, 8 GB in GV104, and 6 GB in GV106. Besides, 4K high-quality gaming today can fit in those buffers. Quadro variants come later, with double memory.

That sounds much more realistic to me, nicely guessed sir!
 

Samwell

Senior member
May 10, 2015
Yes, it won't be more than 12 GB at the start. Hynix themselves only had 8 Gigabit GDDR6 in their announcement for a 768 GB/s GPU in Q1 '18. I think we'll need to wait a long time until the next RAM jump.
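
For what it's worth, that 768 GB/s figure lines up if you assume a 384-bit bus and a 16 Gb/s per-pin data rate; both numbers are my assumptions here, not taken from the announcement text itself.

Code:
# bandwidth = bus width (bits) * per-pin data rate (Gb/s) / 8 bits per byte
bus_bits      = 384  # assumed bus width of the big die
pin_rate_gbps = 16   # assumed GDDR6 per-pin data rate
print(bus_bits * pin_rate_gbps / 8, "GB/s")  # 768.0 GB/s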
 

TheF34RChannel

Senior member
May 18, 2017
Yes, it won't be more than 12 GB at the start. Hynix themselves only had 8 Gigabit GDDR6 in their announcement for a 768 GB/s GPU in Q1 '18. I think we'll need to wait a long time until the next RAM jump.

I do suspect that GV104 will receive a little more love in terms of the amount of memory, maybe 10GB?
 

Bouowmx

Golden Member
Nov 13, 2016
Totaling 10 GB using 8 × 32-bit memory channels and 1 GB chips means some channels have more memory attached than others, which creates memory segments like in the GeForce GTX 660 and 970. It's possible, but I think it's unnecessary, and NVIDIA would probably exercise more caution after the 970 situation.
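
Rough illustration of why, under the same 1 GB-per-chip assumption: a uniform fill of a 256-bit (8-channel) bus gives 8 GB; getting to 10 GB means two channels carry double, so only 8 GB stays interleaved across the full bus at full speed.

Code:
# 256-bit bus = 8 x 32-bit channels; capacities in GB per channel (illustrative numbers)
uniform = [1] * 8           # 1 GB per channel -> 8 GB, fully interleaved
uneven  = [2, 2] + [1] * 6  # 10 GB total, but the extra 2 GB hangs off only two channels
fast_segment = 8 * min(uneven)  # portion that can be striped across all eight channels
print(sum(uniform), sum(uneven), fast_segment)  # 8 10 8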
 

xpea

Senior member
Feb 14, 2014
Well, according to Nvidia, Volta has been around for like 4 years now :))
I really hope they put proper DX12/Vulkan hardware support in this GPU.
I mean, they originally presented Volta as a 2014 GPU (DX11), so this is a 2018 GPU, and the whole games industry is moving to hardware Vulkan/DX12.
If it turns out to be just another DX11-style GPU like Pascal... huh, I will slowly start to hate Nvidia.

I just wish RX Vega would beat every GTX GPU to date, just so we can get the best GPU ever with Volta :D
Please stop spreading nonsense and start writing better English; this forum is not a phone messaging app :pensive:
Volta is even more advanced than Vega in DX12 specs. It has finer-grained thread control (hardware based), better geometry performance, and is so far ahead in efficiency that it's not even funny.

I don't know: that sounds pointless to me. Designing 3 new dies for 256 extra cores? Might as well design dies for Volta, featuring 140% the number of Pascal cores, as shown in GV100.
^^This
No more Pascal is planned. Nvidia is going full tilt with Volta from now on.
 

zuzu

Junior Member
Jul 1, 2017
xpea... tests say GP100 vs GV100 core-for-core are the same,
so I don't know what you read, but Volta is Pascal with more cores.
I wish it had DX12 in hardware, but...