
Pascal graphics card to launch at Computex 2016 and enter mass shipments in July


kraatus77

Senior member
Aug 26, 2015
266
59
101
Let me be clear. There are many folks around here who prefer AMD/AMD products who are perfectly rational and are absolute joys to have intelligent debates/discussions with. Same goes for NVIDIA, Intel, ASUS, ASRock, whatever.

But then you have those posters who consistently make very far-out claims that don't jibe with reality. For example, I have seen buyers of the GTX 960 mocked for not purchasing a Radeon R9-whatever in its place. These people have been called "sheep."

In fact, NVIDIA buyers in general on these forums seem to be treated as ignorant fools with more money than brains. It doesn't occur to the "GPU elite", who think AMD is the only sensible choice and narrowly focus on raw perf/$, that there are perhaps other considerations important to buyers.

If you want good gaming performance in a system where a high wattage PSU is not realistically an option, something like the GTX 950 or GTX 960 makes a ton of sense, even if it's not the best perf/$ option from an upfront acquisition perspective.

Then let's talk about, say, a GTX 980 Ti buyer. The 980 Ti, by and large, is objectively viewed as the superior ultra-high-end GPU in terms of both raw performance in a wide swath of games and power efficiency. It's also something of an "overclocker's dream."

Yet, what I am finding is that people are now trying to dissuade others from purchasing a high-end NVIDIA card because of the "DX12 boogeyman." The value prop? Forget all of the games you want to play today, the games of "the future" will all magically run better on AMD because...[Async Compute, GCN is the most advanced GPU architecture in the history of ever, AMD first to HBM, ...].

It's getting old.

Don't act like NVIDIA users are saints either; this stuff happens from both sides. The only way to counter those boosters is with facts. If you can't do that, then simply don't respond. (This is not really pointed at you, BTW.)
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81

I'm trying to understand the point of your post.

I don't see where you mention the Pascal graphics card launching at Computex and entering mass shipments...?

And are you one of those posters? o_O

__________________________

On topic, I think the first company to release 14nm GPUs will get my money for a test drive until big Pascal and Vega.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
If you are lucky you may see GP104 with 2 stacks of HBM2. But it will also cost accordingly. Though GDDR5X is still the wild card.

Just as a reference:
2 stacks of 1.4 GHz HBM2 = 360 GB/s
256-bit 12 GHz GDDR5X = 384 GB/s

Without HBM2 or GDDR5X, and limited to 256-bit, it's hard to imagine a performance level worth upgrading to.

Huh? Where is this from? HBM2 at 1.4 GHz?
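For what it's worth, the bandwidth figures quoted above check out under the usual formula (bus width in bits × per-pin data rate ÷ 8 bits per byte); a quick sketch, assuming the standard 1024-bit interface per HBM2 stack:

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth: bus width (bits) * per-pin rate (Gbps) / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

# Two HBM2 stacks at 1.4 Gbps per pin (1024-bit interface per stack, per the HBM spec)
print(bandwidth_gb_s(2 * 1024, 1.4))  # 358.4 GB/s, which rounds to the quoted 360
# A 256-bit GDDR5X bus at 12 Gbps effective
print(bandwidth_gb_s(256, 12))        # 384.0 GB/s, matching the quoted figure
```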
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,143
136
So I guess now,
competition = transition.

Yes, it is. They're obviously talking about NVIDIA, no matter how you twist the facts.

The competing system consumes 140 watts. This is 86 watts. We believe we’re several months ahead of this transition, especially for the notebook and the mainstream market. The competition is talking about chips for cars and stuff, but not the mainstream market.
 

xpea

Senior member
Feb 14, 2014
458
156
116
Back on topic:
http://www.hardwareluxx.de/index.ph...it-pascal-gpu-und-samsung-gddr5-speicher.html

GP106 spotted in revision A1 (which should be the production one) and made in week 13 of 2016 (we are in week 15). Still hot from the oven!

For such a small chip, qualification should be fast, and we can expect to see boards in retail before September.

Then on the Chiphell forum, someone showed a ~300mm2 Pascal with 8GB GDDR5 (the link doesn't work for me, but here it is: https://www.chiphell.com/thread-1563086-1-1.html )

Looks like NVIDIA is going to deploy Pascal from top to bottom at a very fast pace (this is where deep pockets and a big R&D budget help).
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106

256bit on GP106 as well. Bodes well for GP104.
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136


Good spot! :thumbsup:
 

airfathaaaaa

Senior member
Feb 12, 2016
692
12
81

Your first link doesn't say anything about the GP106 being spotted. What they are saying is that the codes on the PX2 follow those of the P100 but are younger, i.e. they were manufactured at the end of 2015 or in January 2016.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
GP106 at 200mm2 and only 128-bit memory??

That would be like a GTX 980 with only 128-bit memory.
 

Head1985

Golden Member
Jul 8, 2014
1,867
699
136
GP106 at 200mm2 and only 128bit memory ??

That would be like a GTX980 with 128bit memory only.

GM206 (GTX 960) is 228mm2 and is 128-bit.
GP104 should be 2x GP106, so 256-bit and 350-400mm2.

BTW, without DP units GP104 should have close to 4K SPs if it's 350-400mm2, and GP106 close to 2048 SPs.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
GM206 is 228mm2 and is 128-bit.
GP104 should be 2x GP106, so 256-bit and 350-400mm2.

GP106 at 200mm2 will have almost the same transistor count as GM204 (GTX 980).

So it will be like having a GTX 980 with 128-bit memory.
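To put rough numbers on that comparison: the GTX 980's 256-bit, 7 Gbps GDDR5 gives 224 GB/s, while a hypothetical 128-bit GP106 would get far less even with faster memory (the 8 Gbps figure below is an assumption, not a confirmed spec):

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # Peak bandwidth = bus width (bits) * per-pin rate (Gbps) / 8 bits per byte
    return bus_width_bits * data_rate_gbps / 8

gtx980 = bandwidth_gb_s(256, 7)       # 224.0 GB/s -- the GTX 980's actual spec
gp106_guess = bandwidth_gb_s(128, 8)  # 128.0 GB/s, assuming hypothetical 8 Gbps GDDR5
print(gp106_guess / gtx980)           # little more than half the GTX 980's bandwidth
```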
 

Adored

Senior member
Mar 24, 2016
256
1
16
Welcome to your new 1080p overlords. Nvidia has been doing the right thing by scaling buses down while AMD does the wrong thing by going upwards.

If Nvidia's full range is shader-heavy and bandwidth-light, they will win all the benchmarks that matter.
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
GM206 (GTX 960) is 228mm2 and is 128-bit.
GP104 should be 2x GP106, so 256-bit and 350-400mm2.

Polaris 10 is meant to be 232mm2 with a 256-bit memory controller on a ~10% denser process, so closer to a 250mm2 card on the process Nvidia is using.

According to what was said by the earlier poster, there is meant to be another ~300mm2 chip released by Nvidia.

It will be interesting, since it might mean AMD barely gets to GTX 1070 performance, with a cut-down Polaris 10 also being used to fight the GTX 1060.
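The die-size translation above is just first-order area scaling; a quick sanity check of the numbers, taking the quoted 10% density difference at face value:

```python
# First-order approximation: the same design on a ~10% less dense process
# occupies ~10% more area.
polaris10_mm2 = 232          # quoted Polaris 10 die size
density_advantage = 1.10     # "10% denser process", per the post above
equivalent_mm2 = polaris10_mm2 * density_advantage
print(round(equivalent_mm2))  # 255 -- in the same ballpark as the ~250mm2 estimate
```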
 

Head1985

Golden Member
Jul 8, 2014
1,867
699
136
GP106 at 200mm2 will have almost the same transistor count as GM204 (GTX 980).

So it will be like having a GTX980 with 128bit memory.
Yeah, GP106 close to 2048 SPs, GP104 close to 4K SPs.
If they use GDDR5X it will be OK.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
GP106 at 200mm2 will have almost the same transistor count as GM204 (GTX 980).

So it will be like having a GTX980 with 128bit memory.

We have to see. I agree it's way bandwidth-starved. But so is Polaris, with its 50% reduction compared to 28nm.

I am not buying a 256-bit GDDR5 card with much more power than my GTX 980, that's for sure. GDDR5X, sure. Or else 2048-bit HBM2.

If these cards from AMD and Nvidia release with such narrow buses, then it's clear they will do a second resell next year as GDDR5X editions. Same chips, new memory.
 

airfathaaaaa

Senior member
Feb 12, 2016
692
12
81
Polaris 10 is meant to be 232mm2 with a 256-bit memory controller on a ~10% denser process, so closer to a 250mm2 card on the process Nvidia is using.

According to what was said by the earlier poster, there is meant to be another ~300mm2 chip released by Nvidia.

It will be interesting, since it might mean AMD barely gets to GTX 1070 performance, with a cut-down Polaris 10 also being used to fight the GTX 1060.

Polaris 10 is the mid-to-high cards; Polaris 11 will be the low-to-mid ones.
Also, I think there is already a small Polaris on CompuBench; I saw it earlier, though the numbers didn't make any sense.
 

Adored

Senior member
Mar 24, 2016
256
1
16
One idea I had is that they both prepare for GDDR5X by using the X bus (GDDR5X requires a modified MC) and go shader-heavy in case GDDR5X was ready; if not, they would be able to use cut-down parts with normal GDDR5 first. This will help a ton with yield, though it means larger GPUs overall.

Like Shintai says, they would then release full-version GPUs with GDDR5X next year.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106

I don't think you will see cut-down parts outside the norm.

Now, we don't know what GP104 will use yet. However, assuming the worst (256-bit GDDR5), I am sure we will see Polaris 10+11, GP107, GP106 and GP104 as re-releases in 2017, just with GDDR5X as the sole change. And then you can add Vega 10+11 and whatever GP as top bins.
 

Adored

Senior member
Mar 24, 2016
256
1
16

There must be a point where they are so bandwidth-starved that cutting down the GPU would barely harm performance. The question is whether yields were so bad that it was worth doing so.

Maybe that could be the case for the larger GPUs. Whatever they do, GDDR5X just won't be ready in time for summer, so either we get something like this, or they've both nailed a new compression technique, or maybe larger caches and register files help to alleviate the worst of it.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Well, let's see. The wild card is both HBM2 and GDDR5X. I doubt GP104 cards will be cheap in any way; 256-bit GDDR5 just doesn't make sense at that price point.

HBM2 wasn't ready either, yet it ships in June on GP100.