
NVIDIA Volta Rumor Thread


Sweepr

Diamond Member
May 12, 2006
5,148
1,129
131
This is interesting:

...e. For example, ≈ 800mm2 is expected to be the maximum possible die size that can be manufactured [18, 48]. For the purpose of this paper we assume that GPUs with greater than 128 SMs are not manufacturable on a monolithic die.

...in this paper we evaluate building a 256 SM GPU out of four GPMs of 64 SMs each

We show that with these optimizations, a 256 SMs MCM-GPU achieves 45.5% speedup over the largest possible monolithic GPU with 128 SMs. Furthermore, it performs 26.8% better than an equally equipped discrete multi-GPU, and its performance is within 10% of that of a hypothetical monolithic GPU that cannot be built based on today’s technology roadmap.
http://research.nvidia.com/sites/default/files/pubs/2017-06_MCM-GPU:-Multi-Chip-Module-GPUs//p320-Arunkumar.pdf

NVIDIA is assuming a 128-SM monolithic die is possible on 7 nm, and up to 256 SMs with their MCM solution. For comparison, GV100 packs 84 SMs in 815 mm² on TSMC's '12 nm FFN'.
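Plugging the paper's numbers into a quick sanity check (the "scaling efficiency" framing below is mine, not the paper's):

```python
# Back-of-the-envelope check of the MCM-GPU paper's quoted numbers.
monolithic_sms = 128   # largest buildable monolithic GPU, per the paper
mcm_sms        = 256   # 4 GPMs x 64 SMs
speedup        = 1.455 # 45.5% over the 128-SM monolithic

# Ideal speedup if performance scaled linearly with SM count:
ideal = mcm_sms / monolithic_sms   # 2.0x

# Fraction of that ideal scaling the MCM design actually delivers (~73%):
efficiency = speedup / ideal
print(f"MCM scaling efficiency vs. ideal: {efficiency:.1%}")

# The paper also says the MCM lands within 10% of a hypothetical 256-SM
# monolithic die; reading that literally, the unbuildable chip would
# itself only reach ~1.62x the 128-SM part:
hypothetical = speedup / 0.90
print(f"Implied hypothetical monolithic speedup: {hypothetical:.2f}x")
```

So even the impossible monolithic 256-SM chip wouldn't scale linearly, which makes the MCM's 45.5% look better than it first sounds.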
 

Samwell

Senior member
May 10, 2015
225
47
101
Could it be a case of needing to take a step back in order to prevail over monolithic chips a few generations later? A strange musing, really. It wouldn't make sense if the new design lost performance compared to the old one, temporarily or not.
You'll never be as efficient as a monolithic chip; going off-die costs power. The paper has a good table showing this:


On-chip you only need 80 femtojoules to move a bit; on-package you already need 6x more. There might be ways to improve this a bit, but it will always be worse than on-chip.
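To put those energy-per-bit numbers in perspective, here's a rough sketch; the inter-GPM bandwidth figure is an assumption for illustration, not a number from the paper:

```python
# Interconnect power from energy-per-bit: P = (bits/s) * (joules/bit).
FJ = 1e-15                                    # femtojoules -> joules

energy_on_chip = 80 * FJ                      # 80 fJ/bit, quoted above
energy_on_pkg  = 6 * energy_on_chip           # ~480 fJ/bit on-package

bandwidth_GBps = 768                          # assumed inter-GPM bandwidth
bits_per_sec   = bandwidth_GBps * 1e9 * 8

watts_on_chip = bits_per_sec * energy_on_chip # ~0.5 W
watts_on_pkg  = bits_per_sec * energy_on_pkg  # ~3 W

print(f"on-chip : {watts_on_chip:.2f} W")
print(f"on-pkg  : {watts_on_pkg:.2f} W")
```

Sub-watt numbers at that bandwidth, so the 6x penalty is real but not fatal; it's the total traffic crossing the package links that decides how much of the power budget the interconnect eats.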

It's more an economic necessity. You don't have another option because the smaller nodes require too much R&D spending. AMD only managed two Polaris chips last year, with no high-end GPU, and this year they will probably bring only the higher end and not refresh the mainstream. They don't do it because they like it, but because they don't have enough money for R&D. On 7nm, R&D costs should be about double those of 16nm. That would make the situation even worse, but by building one die and scaling it into 2- or even 4-die GPUs you save so much money. With Navi I'm pretty sure AMD will have a top-to-bottom lineup if they really use multi-die GPUs. Additionally, you have smaller chips, which means better yields. Nvidia, on the other hand, with single chips will have a power advantage but higher production and R&D costs. But as R&D costs skyrocket at 5nm and below, they will have no other way than to do the same.
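The yield point can be illustrated with a toy Poisson defect model (the defect density here is an assumed value purely for illustration; real foundry numbers aren't public):

```python
import math

# Poisson yield model: fraction of dies with zero defects is exp(-D0 * A).
defect_density = 0.1          # assumed, defects per cm^2

def die_yield(area_mm2, d0=defect_density):
    """Probability a die of the given area comes out defect-free."""
    return math.exp(-d0 * area_mm2 / 100.0)   # area converted to cm^2

big   = die_yield(800)        # one ~reticle-limit monolithic die (~45%)
small = die_yield(200)        # one 200 mm^2 multi-die building block (~82%)

print(f"800 mm^2 monolithic yield: {big:.1%}")
print(f"200 mm^2 small-die yield:  {small:.1%}")

# Silicon burned per 800 mm^2 of *working* logic, assuming the small dies
# can be tested individually before assembly (known-good-die):
silicon_mono = 800 / big
silicon_mcm  = 4 * 200 / small
print(f"silicon cost ratio (mono/multi-die): {silicon_mono / silicon_mcm:.2f}x")
```

Under these assumptions the monolithic part burns nearly twice the silicon per working chip, before you even count the R&D savings from taping out one die instead of a whole lineup.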

@zuzu
Multi-die concepts aren't like SLI. There is enough bandwidth between the dies that they behave like one chip.

But all this MCM stuff is further in the future. Here we need more Volta news.
 

TheF34RChannel

Senior member
May 18, 2017
782
301
106
You'll never be as efficient as a monolithic chip; going off-die costs power. The paper has a good table showing this:


On-chip you only need 80 femtojoules to move a bit; on-package you already need 6x more. There might be ways to improve this a bit, but it will always be worse than on-chip.

It's more an economic necessity. You don't have another option because the smaller nodes require too much R&D spending. AMD only managed two Polaris chips last year, with no high-end GPU, and this year they will probably bring only the higher end and not refresh the mainstream. They don't do it because they like it, but because they don't have enough money for R&D. On 7nm, R&D costs should be about double those of 16nm. That would make the situation even worse, but by building one die and scaling it into 2- or even 4-die GPUs you save so much money. With Navi I'm pretty sure AMD will have a top-to-bottom lineup if they really use multi-die GPUs. Additionally, you have smaller chips, which means better yields. Nvidia, on the other hand, with single chips will have a power advantage but higher production and R&D costs. But as R&D costs skyrocket at 5nm and below, they will have no other way than to do the same.

@zuzu
Multi-die concepts aren't like SLI. There is enough bandwidth between the dies that they behave like one chip.

But all this MCM stuff is further in the future. Here we need more Volta news.

They will need to do something to make up for the loss, unless it doesn't translate into too much of an overall performance drop compared to single-chip designs (for gaming, that is; prosumers and professionals are an entirely different ballgame and might suffer more from MCM losses).
 

Puffnstuff

Lifer
Mar 9, 2005
15,328
4,059
136
I don't think Nvidia are in any rush to release consumer Volta cards.
You're probably right, as they really don't need to while AMD can't present a credible threat. We need something like a Voodoo 2017 Graphinator model to blow the doors off everybody to get things moving along again.
 

GoNavy1776

Member
Jul 7, 2017
52
8
41
Any updates on when Volta is dropping? I was thinking of getting two 1080 Tis next month, but if Volta is right around the corner, why not just wait?
 

Qwertilot

Golden Member
Nov 28, 2013
1,586
243
106
HPC Volta has basically launched, of course. Consumer Volta is anywhere between this autumn and next spring. Given NV's previous behaviour, though, don't expect a like-for-like 1080 Ti replacement from the Volta lineup for another 6-12 months on top of that.

The 1180 will (very likely) be a little faster than a 1080 Ti while drawing less power, but it won't be a big gap in performance terms.
 

zlatan

Senior member
Mar 15, 2011
580
291
136
HPC Volta has basically launched, of course. Consumer Volta is anywhere between this autumn and next spring. Given NV's previous behaviour, though, don't expect a like-for-like 1080 Ti replacement from the Volta lineup for another 6-12 months on top of that.

The 1180 will (very likely) be a little faster than a 1080 Ti while drawing less power, but it won't be a big gap in performance terms.
They need to wait for GDDR6, so March is the target.

We should probably not compare it to Pascal. It has a lot of Vega-like advancements, so Volta will be different from current hardware. Even if this isn't visible in the logical diagram, the hardware is designed to support SM6.x. While Pascal can support it as well, the performance of the important functions will be very limited.

Vega and Volta may not be super fast in present games, but both architectures will wipe the floor with the current ones in the future.
 

GoNavy1776

Member
Jul 7, 2017
52
8
41
They need to wait for GDDR6, so March is the target.

We should probably not compare it to Pascal. It has a lot of Vega-like advancements, so Volta will be different from current hardware. Even if this isn't visible in the logical diagram, the hardware is designed to support SM6.x. While Pascal can support it as well, the performance of the important functions will be very limited.

Vega and Volta may not be super fast in present games, but both architectures will wipe the floor with the current ones in the future.
What game engines do you suppose will run better with Vega and Volta? I don't know, so you're teaching me something here. I'm all about Vega, but everyone is saying it isn't really any good; then again, there's always more to the story than a bunch of people who lack knowledge and just run with others' hype.

If Vega is going to be an investment for future titles (a risky buy?), the same as Volta, then are game titles in 2018 going to shine with both new architectures, or am I confused?
 

Qwertilot

Golden Member
Nov 28, 2013
1,586
243
106
From what I'd seen, the thinking was that they'll only really need GDDR6 for the big chips? They could plausibly (and so might) launch the 1170/1180 earlier using GDDR5X. We'll see. It's definitely all pure speculation at this stage.
 

Trumpstyle

Member
Jul 18, 2015
72
23
81
The cards are almost guaranteed to be called 2070/2080, not 1170/1180. Expect release one year after the 1080 Ti, so Feb-March 2018 :)

Edit: lol, a few of my words were Swedish instead of English, fixed!
 

GoNavy1776

Member
Jul 7, 2017
52
8
41
The cards are almost guaranteed to be called 2070/2080, not 1170/1180. Expect release one year after the 1080 Ti, so Feb-March 2018 :)
So it's still perfectly viable to get a 1080 Ti, or Vega if it pans out. OK, good. I want to replace my aging 980 Ti soon.
 

TheF34RChannel

Senior member
May 18, 2017
782
301
106
They need to wait for GDDR6, so March is the target.
This. GDDR6 is what they are waiting for.

So it's still perfectly viable to get a 1080 Ti, or Vega if it pans out. OK, good. I want to replace my aging 980 Ti soon.
All indications are that Vega is about as good as the 1080, so I'd grab a 1080 Ti personally.

The [2080] will (very likely) be a little faster than a 1080 Ti while drawing less power, but it won't be a big gap in performance terms.
I think we'll be looking at a very nice performance increase for the 2080 in comparison. It has to offer a decent increase over the current Ti to be viable.
 

crisium

Platinum Member
Aug 19, 2001
2,632
593
136
GV104 (2080) could get GDDR5X again, and GDDR6 could be saved for GV102 (the Ti). This would enable an earlier launch. But given the current state of affairs (no competition, and mining selling plenty of cards anyway), I wonder whether launching this year would be a wise business decision.
 

TheF34RChannel

Senior member
May 18, 2017
782
301
106
GV104 (2080) could get GDDR5X again, and GDDR6 could be saved for GV102 (the Ti). This would enable an earlier launch. But given the current state of affairs (no competition, and mining selling plenty of cards anyway), I wonder whether launching this year would be a wise business decision.
Nah, my money is on GDDR6 for all the top cards. And indeed, with the lack of competition, the Ti being relatively fresh and selling well, and the need to wait for GDDR6, there's no incentive to launch this year. March seems the most likely to me.
 

TiSA88

Member
Mar 24, 2017
26
2
11
Nah, my money is on GDDR6 for all the top cards. And indeed, with the lack of competition, the Ti being relatively fresh and selling well, and the need to wait for GDDR6, there's no incentive to launch this year. March seems the most likely to me.
There should be an incentive: the fact that GTA V can't sustain 4K/60 fps at max settings on a 1080 Ti.
It's a 2015 game...
 
