[Kitguru] Nvidia's big Pascal GP100 has taped out - Q1 2016 release

Page 8 - AnandTech Forums

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
If they are going full HBM (doubtful), they will increase prices and ditch the lowest tier, since Intel is covering that tier now.

The question is how much they increase them.

Intel really doesn't yet cover anything but the very, very low end right now. The threads that were talking about that were based on a GTA V benchmark with a Broadwell i7 / Iris Pro 6200 vs Athlon 860 quad core at 720p minimum settings that Tom's did. That author did a major disservice because the dGPUs were CPU limited in that game and the low settings favored iGPUs due to lower memory bandwidth needs.

Bottom line is if you want to game at 720p with everything set to minimum, yeah an i7 Iris Pro is fine.

If they wanted to give a dose of reality, they should have also run it at medium settings 1080p with an i7 in all of the machines. I guarantee you the iGPU numbers would have crashed.
 

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
202
106

Interesting.

At first I didn't think it was possible, but, look at this:
GF110, the GTX 580, was manufactured on a 40nm process and was 520mm² in size, with 3000 million transistors.

GM200, the GTX 980 Ti / Titan X, was manufactured on a 28nm process and was 600mm² in size, with 8000 million transistors.

Roughly 2.7x the transistors in a slightly larger area, when they moved from 40nm to 28nm. It looks like 28nm to 16nm will be a bigger jump than 40nm to 28nm was, so we could see a doubling of the transistor count and still end up with a slightly smaller die.
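The density arithmetic behind that comparison can be sketched quickly; the 16nm projection at the end is my own extrapolation from these two data points, not an official figure:

```python
# Density comparison using the die sizes and transistor counts quoted
# above (GF110 and GM200). The 16nm projection assumes the next node
# delivers a similar density jump -- an assumption, not a known number.

def density(transistors_millions: float, area_mm2: float) -> float:
    """Transistors per mm^2, in millions."""
    return transistors_millions / area_mm2

gf110 = density(3000, 520)   # 40nm
gm200 = density(8000, 600)   # 28nm

print(f"GF110 (40nm): {gf110:.1f} Mtransistors/mm^2")
print(f"GM200 (28nm): {gm200:.1f} Mtransistors/mm^2")
print(f"Density gain 40nm -> 28nm: {gm200 / gf110:.2f}x")

# If 16nm delivers a comparable jump, a ~17B-transistor die could land
# near GM200's size:
projected_density = gm200 * (gm200 / gf110)
print(f"Projected area for 17B transistors: {17000 / projected_density:.0f} mm^2")
```

Interestingly, even the conservative "same jump again" assumption puts a 17-billion-transistor die in the same ballpark as GM200's area.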

Very keen to see what both AMD and Nvidia will release next year.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
Whatever Pascal GPU comes first, it won't be the big one. I don't care what anyone says. They will trick and fool us. We will buy it. 9 months to a year later the real chip comes and we all facepalm, AGAIN.
 

Sabrewings

Golden Member
Jun 27, 2015
1,942
35
51
Whatever Pascal GPU comes first, it won't be the big one. I don't care what anyone says. They will trick and fool us. We will buy it. 9 months to a year later the real chip comes and we all facepalm, AGAIN.

Yep, I'll be waiting for a fully unlocked GP200 or equivalent. Though, that'll probably go into the next Titan, and I'd settle for a slightly cut down equivalent like the 980 Ti.

Until then, my 980 Ti will carry me along quite well, I'm sure. It helps that I bought above my needs this gen.
 

CakeMonster

Golden Member
Nov 22, 2012
1,630
809
136
If it's 9 months, and you're a serious gamer, it's worth getting a 16nm part which would be a huge upgrade over what you have now. Even if that part is a cheapo midrange part that NV overcharges for compared to what they _could_ have released. If there is a "680" or a "980" that stays on top for 9 months (except for a Titan), I would seriously get it on day 1. 9 months is a big chunk of your life.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
If it's 9 months, and you're a serious gamer, it's worth getting a 16nm part which would be a huge upgrade over what you have now. Even if that part is a cheapo midrange part that NV overcharges for compared to what they _could_ have released. If there is a "680" or a "980" that stays on top for 9 months (except for a Titan), I would seriously get it on day 1. 9 months is a big chunk of your life.

lol. I suppose it is.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
Logic upon release day: You might get hit by a train tomorrow. Purchase 4 mid range Pascals for $650 each today.
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
8,216
3,130
146
I am hoping Pascal will support a bridgeless SLI solution similar to AMD's bridgeless CrossFire. I may go back to green then :D
 

BHZ-GTR

Member
Aug 16, 2013
89
2
81
Guys,

Friends, please answer my question:

How can Nvidia fit about 17 billion transistors in a die?

What technology makes that possible, and what features would it enable?

Won't heat and power consumption rise?
 
Last edited:

Sabrewings

Golden Member
Jun 27, 2015
1,942
35
51
Guys,

Friends, please answer my question:

How can Nvidia fit about 17 billion transistors in a die?

What technology makes that possible, and what features would it enable?

Won't heat and power consumption rise?

Just going off the node sizes, it shouldn't be very difficult for TSMC to do. You're shrinking to about half the feature size, which should allow nearly four times the transistors per mm². They're only doubling the transistor count, so a smaller die than GM200 holding 17 billion transistors is technically reasonable.

Heat should drop too, as the power per gate should fall substantially on the smaller node.
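The "half the node, four times the density" reasoning above can be put into numbers; note this is the idealized geometric scaling the post assumes, and real 16FF+ density gains were expected to be smaller, so treat it as an upper bound:

```python
# Idealized geometric scaling: halving the feature size quadruples
# transistors per unit area. This is an optimistic assumption used
# for illustration, not a measured 16FF+ figure.

node_shrink = 28 / 16            # linear shrink factor, ~1.75x
density_gain = node_shrink ** 2  # area density scales with the square

gm200_area = 600                 # mm^2, from the figures quoted earlier
gm200_transistors = 8e9

new_density = gm200_transistors / gm200_area * density_gain
pascal_area = 2 * gm200_transistors / new_density  # die for 2x transistors

print(f"Ideal density gain 28nm -> 16nm: {density_gain:.2f}x")
print(f"Die area for 2x GM200's transistors: {pascal_area:.0f} mm^2")
```

Even under this ideal scaling the 16-billion-transistor die comes out around two-thirds of GM200's area, not quite half, which is why the "nearly four times" framing is best read as a ceiling.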
 

jpiniero

Lifer
Oct 1, 2010
16,819
7,259
136
It may or may not be fake, but it's pretty much in line with the percentage increase in transistors that flagship dies have seen when moving to a new node.

It'd be stupidly expensive, and the yield would be so terrible you wouldn't get enough volume even at the prices it would need to sell for.
 

Sabrewings

Golden Member
Jun 27, 2015
1,942
35
51
It'd be stupidly expensive, and the yield would be so terrible you wouldn't get enough volume even at the prices it would need to sell for.

You would only need yields for a die roughly half the size of a GM200 die to cram double the transistors in. If they're going full production, half the area of GM200 shouldn't be out of the question.
 

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
Yep, I'll be waiting for a fully unlocked GP200 or equivalent. Though, that'll probably go into the next Titan, and I'd settle for a slightly cut down equivalent like the 980 Ti.

Until then, my 980 Ti will carry me along quite well, I'm sure. It helps that I bought above my needs this gen.

Although the 980 Ti is a rarity for how much performance you get for the extra price at the top end. You should be so lucky to see that happen twice! (and yes, I am jealous :D).
 

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
Whatever Pascal GPU comes first, it won't be the big one. I don't care what anyone says. They will trick and fool us. We will buy it. 9 months to a year later the real chip comes and we all facepalm, AGAIN.

With the slowdown of newer nodes, GP104 makes much more sense for 2016. Even the smaller version, with a new arch plus a two-node jump and HBM2, will nevertheless be a huge bang. Bring the full-fat GP100 to consumers in 2017, and hope for 10nm in 2018.
If not, 2018 will be 2015 on repeat: new arch on an old node. Still a decent jump.
 

jpiniero

Lifer
Oct 1, 2010
16,819
7,259
136
You would only need yields for a die roughly half the size of a GM200 die to cram double the transistors in. If they're going full production, half the area of GM200 shouldn't be out of the question.

That sounds about right on a size basis. Except you have to remember that the $/transistor improvement at 16 FF is little to none, so even with comparable yield the cost would be substantially more... and that's going to take some time to get that good of yield. Throw in Apple eating up the majority of the wafers and it's just not realistic.
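The $/transistor argument has two moving parts, dies per wafer and yield; a rough sketch using the standard gross-die approximation follows. Every price and yield number below is a placeholder assumption for illustration, not real foundry data:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Standard gross-die-per-wafer approximation, with an edge-loss term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Hypothetical inputs: wafer prices, yields, and the half-size 16FF die.
wafer_cost  = {"28nm": 5000, "16FF+": 8000}   # USD per wafer (assumed)
yield_rate  = {"28nm": 0.80, "16FF+": 0.40}   # mature vs early ramp (assumed)
die_area    = {"28nm": 600,  "16FF+": 300}    # mm^2
transistors = {"28nm": 8e9,  "16FF+": 16e9}

for node in ("28nm", "16FF+"):
    good_dies = dies_per_wafer(die_area[node]) * yield_rate[node]
    cost_per_die = wafer_cost[node] / good_dies
    per_billion = cost_per_die / (transistors[node] / 1e9)
    print(f"{node}: ${cost_per_die:.0f} per good die, "
          f"${per_billion:.2f} per billion transistors")
```

With numbers like these, the halved die nearly doubles gross dies per wafer, but a pricier wafer and early-ramp yields eat much of the density gain, which is the point about $/transistor improving little at first.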
 

Head1985

Golden Member
Jul 8, 2014
1,867
699
136
New info: Pascal GPU on TSMC's 16nm process.

[Image: nvidia-pascal-gpu-201bulfs.jpg]


Pascal microarchitecture.
DirectX 12 feature level 12_1 or higher.
Successor to the GM200 GPU found in the GTX Titan X and GTX 980 Ti.
Built on the 16FF+ manufacturing process from TSMC.
Allegedly has a total of 17 billion transistors, more than twice that of GM200.
Taped out in June 2015.
Will feature four 4-Hi HBM2 stacks, for a total of 16GB of VRAM for the consumer variant and 32GB for the professional variant.
Features a 4096-bit memory interface.
Features NVLink, support for mixed-precision FP16 compute at twice the FP32 rate, and full FP64 support. 2016 release.


Release should be around April 2016.
http://wccftech.com/nvidia-pascal-n...ked-memory-1-tbs-bandwidth-powering-hpc-2016/
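The rumored memory figures in that list are at least internally consistent; here is a quick sanity check (the 2 Gb/s pin speed is the base HBM2 spec rate, and the rest follows from the quoted stack configuration):

```python
# Sanity-checking the rumored HBM2 numbers from the spec list above.

stacks = 4
bus_per_stack = 1024        # bits per HBM stack, per the spec
pin_speed_gbps = 2.0        # Gb/s per pin at the base HBM2 rate

bus_width = stacks * bus_per_stack               # total interface width
bandwidth_gbs = bus_width * pin_speed_gbps / 8   # GB/s

die_gb = 1                  # 8Gb (1GB) DRAM dies
capacity = stacks * 4 * die_gb  # 4-Hi stacks

print(f"Bus width: {bus_width}-bit")
print(f"Peak bandwidth: {bandwidth_gbs:.0f} GB/s (~1 TB/s)")
print(f"Capacity with 4-Hi stacks: {capacity} GB")
```

The 4096-bit interface, ~1 TB/s bandwidth, and 16GB consumer capacity all fall straight out of four 4-Hi stacks; the 32GB professional variant would presumably need 8-Hi stacks or denser dies.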
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
At this point it should be pretty obvious that the first chip released WON'T be the fully fledged chip. It will be the x60 chip "upgraded" to an x80 chip again. They might even come up with a new way to market it. Nobody can act like there was no prior notice this time.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
At this point it should be pretty obvious that the first chip released WON'T be the fully fledged chip. It will be the x60 chip "upgraded" to an x80 chip again. They might even come up with a new way to market it. Nobody can act like there was no prior notice this time.

Makes sense from a business point of view:

1) The 980 Ti is faster than Fury X, so why release a 70-80% faster Pascal GP100 chip for $650-699 and cannibalize massive profits on the 980 Ti when you are under no pressure to? NV can just reuse the same flagship-marketing strategy: $499 GTX 680 2GB / $579 GTX 680 4GB, $650 GTX 780, and $550 GTX 980. There is no way they are going back to selling those cards for $250-300 considering how much profit they have realized with this new strategy.

2) There is likely huge pent-up demand for Tesla cards with DP. NV would be able to sell these cards for $5000+ to these customers vs. maximum $1000-1500 on the PC as a Titan Y? Even less incentive to release a 90-100% unlocked GP100 for $699 right off the bat.

3) Going back to point #1, since the GTX 680 and on, customers as a whole group have shown by voting with their wallets that they no longer care if the chip is mid-range or a true flagship -- they will pay flagship prices for more performance regardless of where the chip lies in NV's architectural designation, even if it's just 15-25% more rather than the historical flagship 50-100% more. NV has heard this loud and clear. NV now has the data to completely abandon the decade(s)-old way of launching graphics cards and move forward with bifurcating a generation, or maybe even splitting it into 3 parts like they did with 680->780->780Ti :awe:. Why would any business deviate from a strategy that's been SO successful and when consumers do not show resentment?

I think it is still possible for us to see a $650-700 GP100 Pascal, but if it does launch in April 2016 as a consumer graphics card, I would bet it'll be something like a cut-down successor in line with a GTX 780, not the full-fledged flagship / direct replacement of the 980 Ti/Titan X for the Pascal generation. I actually foresee a mid-range successor to the 980 at $550-600 and would be pleasantly surprised if NV goes back to the good old days. ^_^

On the whole though, it looks like Pascal is moving along very nicely - since the chip already taped out with such a massive die and 1TB/sec HBM2, things look smooth from their engineering end to the fabs. Sounds like they have the execution down, which means the only things left are figuring out the order of releasing low-end to high-end cards to maximize profits over the next 2 years and ramping up yields. HBM2 and 16nm should provide massive gains in performance just by looking at GTX 580 -> 780 as a point of reference for a full node shrink + new architecture.

Mental note: as of October 2015, the cheapest GTX980 is $480 US. It will be interesting to see how much that level of performance costs by October 2016 with Pascal. I am expecting big improvements in the $450-550 price level by end of 2016. I expect much lower improvements in the $200 segment though where cards like the R9 390 are creeping down to $260-265 and are at least 50% faster. We might see another generation where the best cards to buy are $250+ and everything below is overpriced and under-powered offerings like right now.
 
Last edited:

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
@Russian

Nvidia successfully (and permanently) moved the price of the high end up the ladder, and AMD has followed suit. ~$500 is now the mid-die price from Nvidia and ~$400 is the new mid-die price from AMD. I don't see a $650-700 GP100-based video card for a long time. 16GB of HBM2 memory sounds awfully expensive to me. Since these cards will (likely) be out of my price range and thermal constraints anyway, I will be most interested in GP104 and whatever AMD has to counter GP104 in a similar power envelope. It'll be interesting to see if significantly faster GDDR5 RAM is available (GDDR5X) or if they will be hitting the 8Gbps speed wall and hoping that is enough. I don't see ~260GB/s being enough bandwidth for GP104, but I'm not an engineer, and I'm sure Pascal is implementing more bandwidth-saving features since Nvidia is perf/W and mobile conscious.

EDIT: Also, GTX980 staying above $450 likely shows that AMD's 300 series is having little to no effect on Nvidia's sales. Higher prices are here to stay forever.
 
Last edited:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Seems like both Nvidia and Intel just smack AMD around like it's a toy. Neither Intel nor Nvidia is under any pressure to produce amazing products (by old standards); you can argue they are, in a sense, elongating current product lifecycles (blame it on node difficulties) and basically competing with themselves in their own markets.