Pascal graphics card to launch at Computex 2016 and enter mass shipments in July


xpea

Senior member
Feb 14, 2014
There must be a point where they are so bandwidth starved that cutting down the GPU would barely harm performance. The question would be if yields were so bad that it was worth it to do so.

It maybe could be on the larger of the GPUs. Whatever they do, GDDR5X just won't be ready in time for summer though so either we get something like this or they've both nailed a new compression technique or maybe larger caches and register files help to alleviate the worst of it.
Well, HBM2 also wasn't supposed to be ready before 2017, yet P100 is shipping with it next month.

Except Nvidia and Micron, nobody knows the state of GDDR5X production. Maybe a good surprise? Or maybe the new memory compression requires much less memory bandwidth...
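For reference, peak memory bandwidth is just bus width times per-pin data rate. A quick back-of-the-envelope sketch (the per-pin rates below are the commonly quoted figures for each memory type, not confirmed specs for any upcoming card):

```python
def peak_bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin rate (Gbps) / 8."""
    return bus_bits * gbps_per_pin / 8

# Commonly quoted per-pin rates (assumed for illustration):
print(peak_bandwidth_gbs(256, 7))    # GDDR5,  256-bit @ 7 Gbps   -> 224.0 GB/s
print(peak_bandwidth_gbs(256, 10))   # GDDR5X, 256-bit @ 10 Gbps  -> 320.0 GB/s
print(peak_bandwidth_gbs(4096, 2))   # HBM2, 4 stacks (4096-bit) @ 2 Gbps -> 1024.0 GB/s
```

So GDDR5X on the same 256-bit bus buys roughly 40% more bandwidth than GDDR5, while HBM2 is in a different class entirely.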

edit: beaten by ShintaiDK !
 

Adored

Senior member
Mar 24, 2016
Well, there's "ready" and there's "ready". At $10K+ per card and tiny volume, anything coming out of the fab is usable product.

If Nvidia has been buying up all the very early production from TSMC, Samsung and Micron then AMD is in a world of pain, however I'm pretty sure we'd have seen that in their costs the last two quarters.

Micron has been saying for the past two months that mass production of GDDR5X comes in summer, with only samples shipping in spring. I just don't think they're colluding with Nvidia on that level.
 

Ranulf

Platinum Member
Jul 18, 2001
Let me be clear. There are many folks around here who prefer AMD/AMD products who are perfectly rational and are absolute joys to have intelligent debates/discussions with. Same goes for NVIDIA, Intel, ASUS, ASRock, whatever.

But then you have those posters who consistently make very far-out claims that don't jibe with reality. For example, I have seen buyers of the GTX 960 mocked for not purchasing a Radeon R9-whatever in its place. These people have been called "sheep."

In fact, NVIDIA buyers in general on these forums seem to be treated as ignorant fools with more money than brains. It doesn't occur to the "GPU elite", who think AMD is the only sensible choice and narrowly focus on raw perf/$, that there are perhaps other considerations that are important to buyers.

If you want good gaming performance in a system where a high wattage PSU is not realistically an option, something like the GTX 950 or GTX 960 makes a ton of sense, even if it's not the best perf/$ option from an upfront acquisition perspective.

Then let's talk about say, a GTX 980 Ti buyer. The 980 Ti, by-and-large, is objectively viewed as the superior ultra-high end GPU in terms of both raw performance in a wide swath of games as well as in power efficiency. It's also something of an "overclocker's dream."

Yet, what I am finding is that people are now trying to dissuade others from purchasing a high-end NVIDIA card because of the "DX12 boogeyman." The value prop? Forget all of the games you want to play today, the games of "the future" will all magically run better on AMD because...[Async Compute, GCN is the most advanced GPU architecture in the history of ever, AMD first to HBM, ...].

It's getting old.

Oh give me a break. The 960 has never been a compelling card for the price; it's just what Nvidia put in that perf spot because they could. Nvidia has had no problem telling its customers to put up or pound sand the past two-plus years. If they had been honest that the 960 would be released 3+ months later (and only 2GB), I would have actually bought a 970 when they were released. I waited and was disappointed, to say the least. It made my 290 purchase a wise one.

My 7870 was $150 in December of 2013 and it is still competitive with a 960.

The anti-980 Ti advice right now is wise given what has happened in the last two years with Kepler and the 980 itself.
 

Kris194

Member
Mar 16, 2016
I hope that by reference boards they don't mean Nvidia's shitty reference cooler. Moreover, there is no way that GP104 will be GDDR5-based. One month is more than enough to solder memory chips to the board.
 

tential

Diamond Member
May 13, 2008
Welcome to your new 1080p overlords. Nvidia has been doing the right thing by scaling buses down while AMD does the wrong thing by going upwards.

If Nvidia's full range is shader-heavy and bandwidth-light, they will win all the benchmarks that matter.

This is Nvidia; I always expect them to do well, and I don't expect a poorly run company like AMD to succeed. I think AMD makes great hardware, but in terms of winning the "benchmark battles", it's like Nvidia trains for it when they release a new GPU.

I'm curious, though, which architecture looks better so far for high resolution. That's all I care about, actually. I don't care who wins the war or who releases first; I'm going to get Polaris because I'm forced to, but I want to know which architecture is going to be more suited to high-resolution gaming. 1080p makes me wanna cry; it's all about 4K. This is the last year I touch a 1080p native screen for PC gaming; it's FreeSync 4K from here on out with Polaris and Vega. Don't see why anyone would want to stick with 1080p after this gen, but whatever.

Sent from my C6833 using Tapatalk
 

ShintaiDK

Lifer
Apr 22, 2012
That wasn't really the case; Samsung said they started mass production of HBM2 in January. Micron hasn't started mass production of GDDR5X yet.

Micron started ramping GDDR5X into mass production in February and expects full volume production in summer. It's not impossible to get chips for a higher-end SKU.
 

Adored

Senior member
Mar 24, 2016
Micron started ramping GDDR5X into mass production in February and expects full volume production in summer. It's not impossible to get chips for a higher-end SKU.

You have to look at the context here. What exactly is "full volume production" for GDDR5X when only AMD and Nvidia are using it? In that segment we're talking what, 10 million GPUs in a whole year?

Another rarely discussed factor is cost. I'm quite sure Micron didn't create GDDR5X out of the goodness of their hearts.
 

DooKey

Golden Member
Nov 9, 2005
This is Nvidia; I always expect them to do well, and I don't expect a poorly run company like AMD to succeed. I think AMD makes great hardware, but in terms of winning the "benchmark battles", it's like Nvidia trains for it when they release a new GPU.

I'm curious, though, which architecture looks better so far for high resolution. That's all I care about, actually. I don't care who wins the war or who releases first; I'm going to get Polaris because I'm forced to, but I want to know which architecture is going to be more suited to high-resolution gaming. 1080p makes me wanna cry; it's all about 4K. This is the last year I touch a 1080p native screen for PC gaming; it's FreeSync 4K from here on out with Polaris and Vega. Don't see why anyone would want to stick with 1080p after this gen, but whatever.

Sent from my C6833 using Tapatalk

I don't understand why you care about an NV offering when you admit yourself you are stuck with FreeSync (by your own choice). Personally, I have a G-Sync monitor, but I have no problem buying a FreeSync monitor if AMD has the better card for gaming this generation. Anyway, this gen isn't going to be good enough for 4K in my opinion, because I refuse to upgrade till I can get a monitor and card that will do 144Hz at 4K. I guess I have higher standards than some, but that's how I roll.
 

LTC8K6

Lifer
Mar 10, 2004
You have to look at the context here. What exactly is "full volume production" for GDDR5X when only AMD and Nvidia are using it? In that segment we're talking what, 10 million GPUs in a whole year?

Another rarely discussed factor is cost. I'm quite sure Micron didn't create GDDR5X out of the goodness of their hearts.

They likely created it because a company (or two) expressed interest...and that company will likely be using it soon, imo.
 

96Firebird

Diamond Member
Nov 8, 2010
They likely created it because a company (or two) expressed interest...and that company will likely be using it soon, imo.

That, and they must have seen HBM coming in the near future. GDDR5X may not be hyped as much as HBM(2), but it keeps Micron in the game when it comes to graphics cards.
 

AtenRa

Lifer
Feb 2, 2009
I can see cards released (retail availability) in late Q3/early Q4 with GDDR5X, but not earlier.
 

Magee_MC

Senior member
Jan 18, 2010
Yes, it is. They're obviously talking about NVIDIA, no matter how you twist the facts.

I don't see it that way. I think he's talking about the transition from 28nm to 14nm and that they are ahead of their internal schedule. I think too many people are trying to read into what is said instead of reading what is said.
 

jpiniero

Lifer
Oct 1, 2010
Micron started to ramp GDDR5X into mass production in February.

No, they haven't. This is what the blog on their own website said back in February:

There have been a variety of rumors in the industry about GDDR5X availability timeframes, so I’d like to clear that up from the beginning: Micron’s GDDR5X program is in full swing and first components have already completed manufacturing. We plan to hit mass production this summer.

https://www.micron.com/about/blogs/2016/february/gddr5x-has-arrived

I'd have to think you would need several months of mass-production-level volume to handle consumer discrete GPUs. So unless they do an epic paper launch, the earliest release would have to be September. By comparison, the kind of mass production Micron is talking about is what Samsung started with HBM2 in February.

Or the X70/X80 doesn't have GDDR5X.
 

ShintaiDK

Lifer
Apr 22, 2012
No, they haven't. This is what the blog on their own website said back in February:



https://www.micron.com/about/blogs/2016/february/gddr5x-has-arrived

I'd have to think you would need several months of mass-production-level volume to handle consumer discrete GPUs. So unless they do an epic paper launch, the earliest release would have to be September. By comparison, the kind of mass production Micron is talking about is what Samsung started with HBM2 in February.

Or the X70/X80 doesn't have GDDR5X.

From your link:

Micron is currently ramping GDDR5X to mass production

And this was 2 months ago.

The volume for a single card, for example the GP104 and its price class, wouldn't be that much. Maybe a million cards in all of 2016. If we were talking about GP106, GP107, Polaris 11 and Polaris 10, sure. But we aren't.

Take a look at Fiji and HBM1. Not exactly what you'd call volume.
 

tential

Diamond Member
May 13, 2008
I don't understand why you care about an NV offering when you admit yourself you are stuck with FreeSync (by your own choice). Personally, I have a G-Sync monitor, but I have no problem buying a FreeSync monitor if AMD has the better card for gaming this generation. Anyway, this gen isn't going to be good enough for 4K in my opinion, because I refuse to upgrade till I can get a monitor and card that will do 144Hz at 4K. I guess I have higher standards than some, but that's how I roll.

That's great for you; 4K and high-resolution gaming matter to me, and it's already possible on midrange CrossFire/SLI, e.g. R9 290 CF, so I'm not sure why you wouldn't do the same thing with Polaris.

Pascal matters to me, obviously, because it's a rough indicator of Polaris performance, and it's interesting. Pascal would need a significant high-resolution advantage for me to switch over, so I'll see if that happens.