[SA] News of Nvidia’s Pascal tapeout and silicon is important

Status
Not open for further replies.
Feb 19, 2009
10,457
10
76
No wonder Koduri sounded so confident about being ahead of nvidia by a decent margin.

Good point, there was that recent interview where Raja Koduri said AMD will be significantly ahead of the competition on the new FF nodes at bringing GPUs to market.

You could say he's probably bluffing and it could be one of those "It's an overclocker's dream" things, or that he's confident enough to be willing to say it publicly.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
He sells with sensationalism to those who want to believe what he writes.

Sounds like you're his perfect audience.

This isn't about me. Funny how those who have nothing to refute what is actually in the article want to make it about something besides the article.

I guess you get upset with most marketing, then. They don't often show you real products in marketing. Like I said, who cares? Now I guess I know: those who have a grudge.

I'm not upset at all. He's a liar. It's not the first time. He's a repeat offender at it.

AMD was selling every CPU that it could make back in those days.

Whether that's true or not, I don't know. Nor do I care. Whether or not you just made that up isn't relevant to Intel's actions. In the end, Intel paid AMD.
 
Mar 10, 2006
11,715
2,012
126
Whether that's true or not, I don't know. Nor do I care. Whether or not you just made that up isn't relevant to Intel's actions. In the end, Intel paid AMD.

In the end, Intel wasn't found guilty, and AMD agreed to settle for a small fraction of what it could have gotten had its claims been valid, IMO.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
In the end, Intel wasn't found guilty and AMD agreed to settle for a small fraction of what it could have gotten if AMD's claims were valid, IMO.

They settled out of court. That's not the same as, "Intel wasn't found guilty". How much do you think it would have cost to pursue the case and how long could Intel have drawn it out? You only see what you want to.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I'm not upset at all. He's a liar. It's not the first time. He's a repeat offender at it.

That's like calling an actor in a commercial a liar for drinking water out of an energy-drink can rather than the actual product (at least in that case). He was at a show; he held a mockup and called it a new card. It's not like they showed its performance. They didn't sell that card to anyone. It was just a prop in a show.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
That's like calling an actor a liar (at least in that case). He was at a show, he held a mockup and called it a new card.

So now it's not even real life? It's all some Broadway play, I guess? Make all the excuses you want to. Doesn't change anything.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
So now it's not even real life? It's all some Broadway play, I guess? Make all the excuses you want to. Doesn't change anything.

That is all it was: a show. He held up a mockup card. Just like so many product boxes with Photoshopped pictures. Just like the actor in a commercial drinking water from an energy-drink can. It's just a show for marketing.

If no one had seen a screw, no one would have known, and no one would have been duped out of anything.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
That is all it was: a show. He held up a mockup card. Just like so many product boxes with Photoshopped pictures. Just like the actor in a commercial drinking water from an energy-drink can. It's just a show for marketing.

If no one had seen a screw, no one would have known, and no one would have been duped out of anything.

I think some investors would argue with you.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I think some investors would argue with you.
It seems silly to get upset with someone holding up a fake card at a show. Even an investor. If he had made the card up entirely (it didn't exist), misrepresented what the card can do, or falsified benchmarks, then get upset. Showing what a card will look like, and acting as if it's the card in your hand, is something done over and over in every industry.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
It seems silly to get upset with someone holding up a card that was fake at a show. Even an investor. If he made the card up (it didn't exist), misrepresented what the card can do, or falsified benchmarks, then get upset. Showing what a card is to look like, and act like it's the card in your hand is something done over and over in every industry.

They never said, "This is what the Pascal GPU will look like," but rather, "This, here in my hand, is a Pascal GPU."

This tactic is used in my country by a rich oil-prospecting company to speculate on the stock market. They falsely claim to have found an oil/gas deposit with plentiful reserves. You know what happens to the share price? The next day the company says they really did find large quantities of oil, already barreled, but with ORLEN/BP/etc. labels on them.
They never said it was an underground natural source...
 
Last edited:

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
What about not showing it at all? We all know what a card looks like, so by showing one he wanted to make the audience believe he had a working sample in his hands. That's the issue. If you can't see past that, then I rest my case.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
This is one thing a lot of people don't seem to realize. The mainstream Skylake CPU is a tiny chip - only about 122mm^2. That's nothing by GPU standards; $99 cards like the GTX 750 have a bigger die than this. You have to go back to Sandy Bridge before the mainstream Intel CPU had a die even as big as Pitcairn. (Sandy Bridge-E, with its 8 cores and ginormous L3 cache, was 435mm^2, about the same size as Hawaii. And even the $999 consumer part only had 6 of those 8 cores enabled.)

Incidentally, this fact puts Intel's "process lead" in perspective - they still haven't produced any genuinely big-die chips on 14nm. And the long delay for Broadwell-E points to yield issues as the likely cause.

Competition in the GPU market has forced AMD and Nvidia to give us a lot more transistors for the dollar than Intel does in the CPU arena.

You wrongly assume development, design, etc. are something like 1:1.

GPUs are overpriced 10-25x compared to DRAM/NAND. Why are AMD/Nvidia so greedy? ;)
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
More cores do not result in lower clocks if you actually allow a higher TDP: instead of having the desktop high-end part at 65W or below, bump it to 95W, give it 8 cores, a bigger chip, same price. Faster performance all round.

Intel's focus has been die shrinks = more profit selling a smaller die for higher $.

You are out of touch with the world we live in. It's about performance/watt, not about performance.

And we already know what 8 cores need and will deliver on 14nm: 3.3GHz base, 140W. Remember, you don't just add cores; you need the entire uncore to follow.
 
Feb 19, 2009
10,457
10
76
@ShintaiDK
Intel's record margins are enough to say it loudly; you don't need to defend them selling us expensive gimped chips just because they can.

Mobile perf/w is >>>> that way. On the desktop, enthusiast grade, a 95W CPU is nothing major. Heck, go 125W too if the performance justifies it.

Seriously, if Zen does push Intel, we will see a resurgence of generational leaps in CPUs like we've seen in GPUs, not the laughable progress of late.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
@ShintaiDK
Intel's record margins are enough to say it loudly; you don't need to defend them selling us expensive gimped chips just because they can.

Mobile perf/w is >>>> that way. On the desktop, enthusiast grade, a 95W CPU is nothing major. Heck, go 125W too if the performance justifies it.

Seriously, if Zen does push Intel, we will see a resurgence of generational leaps in CPUs like we've seen in GPUs, not the laughable progress of late.

Right, pro bono companies again? Or companies running in the red like AMD?

125W on the desktop doesn't justify it. We have been over this multiple times. Arachnotronic also explained the differences between GPUs and CPUs. Lack of performance/watt is also one of AMD's reasons for their huge GPU share drop, and why they are more or less nonexistent in the CPU segment. And you wish to follow this course?

Let's see what you are going to pay for small 14/16nm GPU dies. Then you can make a drama over that as well. ;)
 
Feb 19, 2009
10,457
10
76
Lack of perf/W on AMD's stuff is not relevant to Intel, because Intel actually has performance. There are two factors in perf/W, and the first is performance.

Let's get back to the topic though.
 

NTMBK

Lifer
Nov 14, 2011
10,487
5,912
136
You are out of touch with the world we live in. It's about performance/watt, not about performance.

And we already know what 8 cores need and will deliver on 14nm: 3.3GHz base, 140W. Remember, you don't just add cores; you need the entire uncore to follow.

This is the bit that people forget about. The network-on-chip gets increasingly complex and increasingly busy, the more cores you add. Every core needs to be able to communicate with every other core, in order to maintain cache coherency. This communication overhead does not scale linearly with number of cores.

A more complex NoC adds die size, adds power consumption, adds latency to memory fetches. It's not just a matter of bolting on another 4 cores and calling it a day.
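The super-linear growth described above can be sketched with a toy calculation (a deliberate simplification I'm adding for illustration; real fabrics like ring or mesh interconnects avoid full all-to-all wiring, but coherency/snoop traffic still grows faster than core count):

```python
# Toy illustration (not a real interconnect model): if every core must be
# able to exchange coherency traffic with every other core, the number of
# pairwise communication channels grows as n*(n-1)/2 -- quadratically --
# while core count grows only linearly.
def coherency_links(cores: int) -> int:
    """Pairwise links needed for all-to-all core communication."""
    return cores * (cores - 1) // 2

for n in (2, 4, 8, 16):
    print(f"{n:2d} cores -> {coherency_links(n):3d} pairwise links")
# Doubling from 8 to 16 cores more than quadruples the link count
# (28 -> 120), which is why the uncore can't just be "bolted on".
```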
 

FatherMurphy

Senior member
Mar 27, 2014
229
18
81
I think some investors would argue with you.

Perhaps with regard to the Fermi incident. If I recall correctly, in that instance, JHH suggested that Fermi was on track to ship months sooner than it actually did. If I were an investor, yes, I would be upset that the company failed to meet the target and that I was induced into buying/keeping my shares.

With regard to the Drive PX 2, that's simply not the case, and folks here have blown that demonstration out of context. The demonstration wasn't about Pascal, about Geforce, about the status or availability of Pascal as dGPUs. It was about the Drive PX 2, what it would look like, what it would be capable of when released, and its release date. If it turns out that the Drive PX 2 fails to meet JHH's descriptions, let's revisit this argument much later this year.

But let's not pretend that anything JHH said during that demonstration had any bearing on Pascal's availability in Geforce products.

Again, we know so little about Pascal's availability beyond "2016." Indeed, most of what we know (or think we know) about Pascal's availability can only be inferred by tangential news (i.e. the availability of HBM2 and/or GDDR5X).
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Perhaps with regard to the Fermi incident. If I recall correctly, in that instance, JHH suggested that Fermi was on track to ship months sooner than it actually did. If I were an investor, yes, I would be upset that the company failed to meet the target and that I was induced into buying/keeping my shares.

With regard to the Drive PX 2, that's simply not the case, and folks here have blown that demonstration out of context. The demonstration wasn't about Pascal, about Geforce, about the status or availability of Pascal as dGPUs. It was about the Drive PX 2, what it would look like, what it would be capable of when released, and its release date. If it turns out that the Drive PX 2 fails to meet JHH's descriptions, let's revisit this argument much later this year.

But let's not pretend that anything JHH said during that demonstration had any bearing on Pascal's availability in Geforce products.

Again, we know so little about Pascal's availability beyond "2016." Indeed, most of what we know (or think we know) about Pascal's availability can only be inferred by tangential news (i.e. the availability of HBM2 and/or GDDR5X).

You are forgetting that he said, more than once, that the PX2 had Pascal chips on it. But it was proven that this was a lie, as it did not have Pascal chips on it; the chips on it were a year old. Which means the performance numbers he claimed were also a lie, as you cannot performance-test something that does not exist.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
nVidia hasn't "performance test[ed]" anything. Huang showed a PowerPoint slide and a mockup. Nothing was demonstrated on a PX2 module.
 

FatherMurphy

Senior member
Mar 27, 2014
229
18
81
You are forgetting that he said, more than once, that the PX2 had Pascal chips on it. But it was proven that this was a lie, as it did not have Pascal chips on it. The chips on it were a year old. Which means his performance numbers that he claimed were also a lie. As you cannot performance test something that does not exist.

He gave the theoretical performance of the PX2 with unreleased Pascal chips and unreleased Tegra chips (it still gets me that no one seems to be upset that there were mock-up Tegra chips on there, but the GPUs have created a forum frenzy).

The public demonstrations were done with a Titan X (I believe) and NV acknowledged that.

The PX 2 was a mock up, not an actual unit.

You are missing the point, I think. The grievance any investor would have is if the PX 2 misses the deadlines or theoretical performance numbers that JHH gave out.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
They never said, "This is what the Pascal GPU will look like," but rather, "This, here in my hand, is a Pascal GPU."

This tactic is used in my country by a rich oil-prospecting company to speculate on the stock market. They falsely claim to have found an oil/gas deposit with plentiful reserves. You know what happens to the share price? The next day the company says they really did find large quantities of oil, already barreled, but with ORLEN/BP/etc. labels on them.
They never said it was an underground natural source...

I already said he acted like he had it. It's a show, big deal. It's not like a working card didn't exist; he just used a prop on stage, as the actual card was likely in a lab somewhere, still being worked on.
 
Mar 10, 2006
11,715
2,012
126
This is the bit that people forget about. The network-on-chip gets increasingly complex and increasingly busy, the more cores you add. Every core needs to be able to communicate with every other core, in order to maintain cache coherency. This communication overhead does not scale linearly with number of cores.

A more complex NoC adds die size, adds power consumption, adds latency to memory fetches. It's not just a matter of bolting on another 4 cores and calling it a day.

Wait, are you serious? I thought adding cores was a trivial matter of copy & paste!

Are you really suggesting that there are engineering challenges and trade-offs to be made as core counts scale up that might not be optimal for the consumer target market here?

BURN HIM!!!!
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
I already said he acted like he had it. It's a show, big deal. It's not like a working card didn't exist; he just used a prop on stage, as the actual card was likely in a lab somewhere, still being worked on.

Wouldn't he have used production-rejected, non-working Pascal dies then? This fake was more obvious than the wooden screws. If he had any Pascal silicon, even a non-functional die, he would have used it. He didn't, so let's each draw our own conclusions...
 