[bitsandchips]: Pascal to not have improved Async Compute over Maxwell


ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
But people don't buy perf/mm2 :)

We have seen it before, AMD having better perf/mm2 and it didn't do anything for them.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
People would buy perf/mm^2 if it were the prime factor in price. Sure, they don't buy it directly, but they might buy its knock-on effects.
 

Kris194

Member
Mar 16, 2016
112
0
0
It may not be that stupid for Nvidia to bet on raw power. A more powerful GPU means you have better performance everywhere, even in DirectX 11 games, not only when the developer wants it.
 

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
If true, then devastating. Not just for NV, but also for the game industry, since game devs can't ignore 80% of the market for the benefit of 20%, even if it makes things a lot better.

I certainly hope this isn't the case, but if it is, AMD's the next purchase for me. There will always be games that will use it, and the boost Async Compute provides is significant.
 

Snarf Snarf

Senior member
Feb 19, 2015
399
327
136
If true, then devastating. Not just for NV, but also for the game industry, since game devs can't ignore 80% of the market for the benefit of 20%, even if it makes things a lot better.

I certainly hope this isn't the case, but if it is, AMD's the next purchase for me. There will always be games that will use it, and the boost Async Compute provides is significant.

They would be ignoring 80% of the PC market... keep in mind the consoles are fully Async Compute compatible, and they hold a much larger market share than the PC does. AMD actually predicted something correctly without giving up too much to preempt the need for async, so kudos to them. Either way, unless 75% of the titles coming out in 2016 and 2017 are DX12 with async, Nvidia still might have the better performers with pure grunt in a lot of games.
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
2016 is going to be one heck of a year as this runs its course. It's gonna be interesting to see the change of tack as the arguments go. If true, and things go to brute force, where does that put power efficiency...?
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
Please refresh your memory. AMD's perf/sq mm lead during HD 4000 and HD 5000 series allowed them to capture dominant market share in discrete mobile and overall GPU market share lead (even if just slightly). :D

http://www.xbitlabs.com/news/graphi..._on_Discrete_GPU_Market_Mercury_Research.html
 

Snarf Snarf

Senior member
Feb 19, 2015
399
327
136
That wasn't due to perf/mm2. But due to perf/watt. ;)

Egg or chicken first? Perf/mm2 would imply that you could reach the performance levels of bigger chips with fewer transistors, and in return use less power, leading to higher perf/watt. Technicalities of the argument aside, efficiency is going to be the name of the game going forward. Anyone who thinks otherwise is ignoring all the writing on the wall.
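To put made-up numbers on that (purely illustrative, not any real chips): if a GPU matches a rival's performance from a 230 mm² die instead of 330 mm² on the same process and clocks, it is doing the same work with roughly 70% of the silicon, and fewer transistors switching generally means less power. So a roughly 1.4x perf/mm² advantage tends to translate into a perf/watt advantage too, all else being equal.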
 

zlatan

Senior member
Mar 15, 2011
580
291
136
I can't say too much on this topic, but Pascal will be an improvement over Maxwell, especially for this feature. But no, it won't have GCN-like capabilities. It will be close to GCN 1.0, but nothing more.
 

MrTeal

Diamond Member
Dec 7, 2003
3,919
2,708
136
Please refresh your memory. AMD's perf/sq mm lead during HD 4000 and HD 5000 series allowed them to capture dominant market share in discrete mobile and overall GPU market share lead (even if just slightly). :D

http://www.xbitlabs.com/news/graphi..._on_Discrete_GPU_Market_Mercury_Research.html

Better perf/mm² might allow them to price their GPUs better and thus drive sales, but that's an outcome of perf/mm². People still don't care about it, and other than a few people here, no one cares whether an IHV giving them performance X at Y watts for Z dollars does it with a 200mm² or 300mm² die.
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
I can't say too much on this topic, but Pascal will be an improvement over Maxwell, especially for this feature. But no, it won't have GCN-like capabilities. It will be close to GCN 1.0, but nothing more.


It's a rumor, but it is sort of far-fetched given how much advance notice Nvidia has had on AC to not put any effort into improving their situation. I would have thought someone at NV would have seen the writing on the wall as AMD won the consoles, the writing being that AMD could now shape the future. Or do we accept the common take that Nvidia couldn't care less about the consoles... if so, lol, hindsight is 20/20, right?

Insert pic of Jackie Chan with hands on head...

How could they not adapt to the unfolding changes in the development space with regard to AC?
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
That's a shame if true :( It's been 2.5 years now since Mantle and over 3 years since the consoles went GCN. NV has known what would happen since perhaps 2 years before that.

Of course Mantle hit harder than some still care to admit. But I presumed all along that Pascal would have true async capabilities.
It's a damn miss. But hey, it makes buying a next-gen GPU pretty simple. It's pretty idiotic to buy a card without it if you intend to play new DX12 games. There are no two ways about it. Get ready for the biggest compute smokescreen in history; it will beat the Maxwell technical marketing several times over.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
I can't say too much on this topic, but Pascal will be an improvement over Maxwell, especially for this feature. But no, it won't have GCN-like capabilities. It will be close to GCN 1.0, but nothing more.
There are 2 ACE units in the 7970 and 8 in the 290, as I recall. Four times as many. And that was back in 2013.
Who in their right mind buys a GPU with capabilities from 2011? :( NV should man up and change it ASAP.
 

Pinstripe

Member
Jun 17, 2014
197
12
81
People are overestimating the advantages of async. Xbox One/PS4 async is already being easily outmatched by Maxwell's raw power. Consoles are still the target platform for developers. And if AMD launches Polaris 10/11 mid-tier cards first while Nvidia launches a faster GP104 at the same time, then raw performance will outclass the elegant and efficient multi-engine design of Polaris anytime. And by the time compute-heavy stuff (VR) becomes mainstream enough, we'll already have Volta and Navi in our hands. I would assume Volta at that point will have multi-engine support, provided Nvidia deems it important enough.

AMD, always first in hardware innovation but unable to capture market share.
 

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
Perf/mm2 isn't going to sell anything. If they want to win, it's perf/watt.

My assumption is that Polaris will have perf/watt parity with Pascal. If that is true, you have to have some metric that marketing hammers on. I just threw out perf/mm2 because, as people have already pointed out, it is a statement about efficiency.
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
People are overestimating the advantages of async. Xbox One/PS4 async is already being easily outmatched by Maxwell's raw power. Consoles are still the target platform for developers. And if AMD launches Polaris 10/11 mid-tier cards first while Nvidia launches a faster GP104 at the same time, then raw performance will outclass the elegant and efficient multi-engine design of Polaris anytime. And by the time compute-heavy stuff (VR) becomes mainstream enough, we'll already have Volta and Navi in our hands. I would assume Volta at that point will have multi-engine support, provided Nvidia deems it important enough.

AMD, always first in hardware innovation but unable to capture market share.


But what if it isn't? :\
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
It will be close to GCN 1.0, but nothing more.

So we can expect results like the 280X, where using async doesn't really give a performance boost, but it doesn't hurt either?

Isn't there some DirectX 12 feature Nvidia supports that GCN doesn't? Is there some way Nvidia can throw some money around to get developers to use that feature to level the playing field?
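For reference on that question: the DX12 extras Maxwell exposes that the GCN cards of this era do not are, as far as I know, the feature level 12_1 items, conservative rasterization and rasterizer-ordered views. Here is a rough, illustrative sketch of how an engine might probe those optional caps at runtime using standard D3D12 calls; it is not code from any actual game.

// Probe the optional D3D12 caps that differ between vendors/generations.
// Build against d3d12.lib; error handling kept minimal for brevity.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter; 11_0 is the minimum D3D12 accepts.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("No D3D12-capable device found.\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));

    // Maxwell 2 reports conservative raster tier 1+ and ROV support;
    // GCN parts of this era report tier 0 and no ROVs.
    std::printf("Conservative rasterization tier: %d\n",
                (int)opts.ConservativeRasterizationTier);
    std::printf("Rasterizer-ordered views: %s\n",
                opts.ROVsSupported ? "yes" : "no");
    std::printf("Resource binding tier: %d\n", (int)opts.ResourceBindingTier);
    return 0;
}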
 

coercitiv

Diamond Member
Jan 24, 2014
7,448
17,754
136
Judging from Nvidia's PR stance on Maxwell's AC capability, my money is on Pascal bringing at least some improvements, enough to tick the AC support box anyway. Otherwise they would have had a different PR approach from the start.
 

Pinstripe

Member
Jun 17, 2014
197
12
81
But what if it isn't? :\

We don't know for certain, but going by the rumors I get the sense Polaris 10 will be a Hawaii replacement while GP104 will be a GM200 replacement. Hawaii + async is still slower than a GTX 980 Ti. Besides, raw performance is always easier for developers to deal with than this pesky and complicated async stuff. I could be wrong though.
 

Actaeon

Diamond Member
Dec 28, 2000
8,657
20
76
Trying to find some silver lining in this...

If Pascal is just a shrunk Maxwell, what are the chances that Maxwell continues to see improvements via drivers and doesn't fall off like Kepler did? This would be similar to what we're seeing with AMD's older GCN cards still getting driver improvements.
 

Pinstripe

Member
Jun 17, 2014
197
12
81
Trying to find some silver lining in this...

If Pascal is just a shrunk Maxwell, what are the chances that Maxwell continues to see improvements via drivers and doesn't fall off like Kepler did? This would be similar to what we're seeing with AMD's older GCN cards still getting driver improvements.

Kepler didn't fall off due to a "driver hijack", but because its architecture is outdated for modern (PS4/XB1-tailored) game engines. Given that Pascal is mostly just a shrunk Maxwell with some extras, Maxwell should hold up pretty well for this console generation, provided people don't go crazy on the game settings.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
If true, then devastating. Not just for NV, but also for the game industry, since game devs can't ignore 80% of the market for the benefit of 20%, even if it makes things a lot better.

I certainly hope this isn't the case, but if it is, AMD's the next purchase for me. There will always be games that will use it, and the boost Async Compute provides is significant.
I have never really understood this line of reasoning. First of all, the 80% market share thing: Nvidia had a couple of uncontested quarters of sales where 4 out of 5 (80%) GPUs sold were GeForces. This does not mean what a lot of people who throw this percentage around think it means; it is not representative of the actual usage out there. Secondly, it's not like games are suddenly not going to work for XX% of the market just because they don't support a certain feature. No one is going to be ignored; they just won't have optimal performance.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
Kepler didn't fall off due to a "driver hijack", but because its architecture is outdated for modern (PS4/XB1-tailored) game engines. Given that Pascal is mostly just a shrunk Maxwell with some extras, Maxwell should hold up pretty well for this console generation, provided people don't go crazy on the game settings.
Maxwell doesn't hold up well with the new batch of DX12 games, and we are just getting started. Even my older 7970 is aging better than my 970. It's pretty obvious. I don't know why it's so difficult to accept.
There is no async capability in Maxwell that is worth anything for gaming. It's tacked-on ARM stuff usable for things like CUDA. End of story.
Either Pascal changes that or it doesn't. It looks like it doesn't. And that's just bad. All the talk about raw power is nonsense.
When a game enters situations where compute is heavy, it will be a stutter fest. FPS might look good on average, but it will just feel like crap.
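For anyone wondering what "async compute" actually means at the API level: in D3D12 the application creates a separate compute queue alongside the graphics queue and keeps both fed, and it is then up to the GPU and driver whether the two streams really overlap (GCN's ACEs) or get quietly serialized, which is the whole Maxwell complaint. Here is a minimal sketch with a made-up SubmitAsync helper; the command lists are assumed to be recorded elsewhere, and a real engine would create the queues once at startup rather than per submit.

// Two queues, one DIRECT (graphics) and one COMPUTE, plus a fence so the
// graphics side only waits where it actually consumes the compute results.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void SubmitAsync(ID3D12Device* device,
                 ID3D12CommandList* gfxList,      // pre-recorded graphics work
                 ID3D12CommandList* computeList)  // pre-recorded compute work
{
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    D3D12_COMMAND_QUEUE_DESC compDesc = {};
    compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;

    ComPtr<ID3D12CommandQueue> gfxQueue, computeQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));
    device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(&computeQueue));

    // Submit to both queues; a GPU with working async compute can run these
    // concurrently, one without it will effectively serialize them.
    gfxQueue->ExecuteCommandLists(1, &gfxList);
    computeQueue->ExecuteCommandLists(1, &computeList);

    // Cross-queue sync: the graphics queue waits on the compute fence value
    // instead of stalling the whole frame.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    computeQueue->Signal(fence.Get(), 1);
    gfxQueue->Wait(fence.Get(), 1);
}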