Nvidia: Not Enough Money in a PS4 GPU for Us to Bother


sontin

Diamond Member
Sep 12, 2011
3,273
149
106
You realize that Bobcat is faster than Atom, and Jaguar is faster than Bobcat? There also aren't many octa core phones running around out there.

What I read about Jaguar is that it will be around 15% faster per clock on average.
The A15 is on par with Jaguar per clock, and nVidia will clock the CPU up to 1.9GHz.

And why do we need 8 cores? How many x86 applications do you know that scale up to 8 threads? How many games? Is there one application in which Bulldozer is twice as fast as an i5?

nVidia announced that they will use their high-performance ARM CPU in Parker in 2015. So 1 1/2 years after the release of the PS4, nVidia will provide more CPU performance to anybody than Sony will. And the lifetime of a console is now 6-7 years.
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
You might wanna read this tidbit...the days of getting "more" out of a process are long gone:
http://forums.anandtech.com/showthread.php?t=2308199

Post #13 and #35 by Idontcare

I know this very well; parametrics don't change as they used to...but that doesn't apply to design layout changes ;)

An example is that the 580 was way better than the 480; nvidia learned that some transistors weren't that good and replaced them...
Same thing for the 7970 to the 7870,
same reason for Bulldozer to Piledriver,
Ivy to Haswell...

....can I keep going?
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
What about the 4890 over the 4870?
Or Cayman over Cypress?

You assume that AMD can do the same thing as nVidia. But every time they've tried it, they've failed.
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
AMD's problem is not the small die. It's the power consumption. TITAN is 25% faster than the 7970GHz and uses less power.

There is no growth potential for AMD on 28nm.

Do you realize that the lower-tier AMD cards are on par with Nvidia's counterparts? Same perf per mm^2 and power consumption.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I know this very well; parametrics don't change as they used to...but that doesn't apply to design layout changes ;)

An example is that the 580 was way better than the 480; nvidia learned that some transistors weren't that good and replaced them...
Same thing for the 7970 to the 7870,
same reason for Bulldozer to Piledriver,
Ivy to Haswell...

....can I keep going?

Yeah, keep failing LOL
Bulldozer to Piledriver...from an AMD-biased source:
http://semiaccurate.com/2012/11/05/amds-bulldozer-core-compared-with-piledriver/#.UUi_P1J7x9M

Ivy Bridge to Haswell:
http://en.wikipedia.org/wiki/Haswell_(microarchitecture)#Confirmed_new_features
A bit more than a "refresh"...it's a tock...a new architecture/core.

Keep going....
 

absolutezero

Junior Member
Jan 11, 2013
21
0
0
What I read about Jaguar is that it will be around 15% faster per clock on average.
The A15 is on par with Jaguar per clock, and nVidia will clock the CPU up to 1.9GHz.

And why do we need 8 cores? How many x86 applications do you know that scale up to 8 threads? How many games? Is there one application in which Bulldozer is twice as fast as an i5?

nVidia announced that they will use their high-performance ARM CPU in Parker in 2015. So 1 1/2 years after the release of the PS4, nVidia will provide more CPU performance to anybody than Sony will. And the lifetime of a console is now 6-7 years.

I am curious: is there proof that the A15 is faster than Brazos (clock for clock)?
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
What about the 4890 over the 4870?
Or Cayman over Cypress?

You assume that AMD can do the same thing as nVidia. But every time they've tried it, they've failed.

The 4890 and 4870 actually have very similar perf/watt :colbert:
Cayman over Cypress...really? The changes were bigger than VLIW-4 to GCN.

And...are you telling me that the massive perf/watt advantage the 7870 has over the 7970 doesn't exist?

Yeah, keep failing LOL
Bulldozer to Piledriver...from an AMD-biased source:
http://semiaccurate.com/2012/11/05/amds-bulldozer-core-compared-with-piledriver/#.UUi_P1J7x9M

Ivy Bridge to Haswell:
http://en.wikipedia.org/wiki/Haswell_(microarchitecture)#Confirmed_new_features
A bit more than a "refresh"...it's a tock...a new architecture/core.

Keep going....

perf/watt is where?
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
The 4890 and 4870 actually have very similar perf/watt :colbert:
Cayman over Cypress...really? The changes were bigger than VLIW-4 to GCN.

And...are you telling me that the massive perf/watt advantage the 7870 has over the 7970 doesn't exist?

perf/watt is where?

To someone who cares...just like sweet spot, monthly WHQL and all the other strange metrics people go apeshit over when performance is lacking.
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
To someone who cares...just like sweet spot, monthly WHQL and all the other strange metrics people go apeshit over when performance is lacking.

I was counter-arguing that Tahiti reached a power wall... not finding excuses for AMD losing the crown.

Even with a respin, a 7970-based card won't reach the performance of nvidia's Titan...
but a 5xx mm² card with the same perf/watt as the 78xx series, which, btw, is similar to the Kepler cards... would.
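
Back-of-envelope for that area-scaling argument (the die sizes here are rough, illustrative assumptions: Pitcairn around 212 mm², Tahiti around 365 mm², a hypothetical big chip around 550 mm²): if perf/mm² and perf/W are held constant, performance scales roughly with die area,

$$\text{perf}_{\text{big}} \approx \text{perf}_{\text{Pitcairn}} \times \frac{A_{\text{big}}}{A_{\text{Pitcairn}}} \approx \frac{550\ \text{mm}^2}{212\ \text{mm}^2} \approx 2.6\times$$

which is well beyond what a respin of a ~365 mm² Tahiti could reach at its current perf/mm².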
 

itsmydamnation

Diamond Member
Feb 6, 2011
3,077
3,908
136
What I read about Jaguar is that it will be around 15% faster per clock on average.
The A15 is on par with Jaguar per clock, and nVidia will clock the CPU up to 1.9GHz.

IPC is up 15%, but IPC is not performance. The core is now twice as wide: Bobcat only had a 64-bit FPU, and that really hurt it on FP-based workloads, while its integer performance was quite solid to begin with. Look at what previous consoles used; it is all vector. Compared to Xenon, FP throughput is going to be around 6 times higher clock-adjusted, because Jaguar's IPC per thread is way higher.
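
As a back-of-envelope for the "twice as wide" point (the issue rates here are simplifying assumptions, not official figures), peak single-precision throughput can be sketched as

$$\text{Peak SP FLOPS} \approx N_{\text{cores}} \times \frac{\text{SIMD width}}{32\ \text{bit}} \times \text{FP ops/cycle} \times f_{\text{clock}}$$

so going from a 64-bit FPU (2 SP lanes) to a 128-bit one (4 SP lanes) doubles the per-core term on its own, before any core-count or clock differences are counted.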
 
Last edited:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I was counter-arguing that Tahiti reached a power wall... not finding excuses for AMD losing the crown.

Even with a respin, a 7970-based card won't reach the performance of nvidia's Titan...
but a 5xx mm² card with the same perf/watt as the 78xx series, which, btw, is similar to the Kepler cards... would.

Keep dreaming...

Why u so mad?

Why do you post irrelevant garbage?
Not big enough to use arguments...so you need to lie? ^^
 

NTMBK

Lifer
Nov 14, 2011
10,448
5,831
136
Vast quantities?
So NVIDIA is bleeding money?

Just ignore the facts okay:
[Attached image: chart of NVIDIA revenue by segment]


Looks to me like their focus is split like this:
GPU (consumer space)
Quadro/Tesla (professional)
And last comes the mobile space.

That's the plot of revenue, not spending, which is what I was talking about. Between Tegra and the ongoing Project Denver saga, NVidia have sunk a lot of R&D into chasing the mobile ARM market, which is notoriously low margin. And as you just handily demonstrated, it isn't getting them much revenue.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Oh, the irony.

Yeah, it's not like he has a fetish about me while at the same time posting nothing but garbage:

Why u so mad?

I don't get what you are arguing lol.

Assumptions?

You would think someone trying to insult someone would use a spellchecker so they don't look like a dumbASS.

This is how I imagine Lonbjerg's reaction:

Image removed due to baiting another member.

Super Moderator BFG10K.

Always playing the victim card.

Just ask Lonbjerg, he'll tell you all about microstuttering in Crossfire/SLi.


I don't quite understand what that link was supposed to convey, whether it was supposed to be guilt by association or to prove he is a fanboy.

If it's guilt by association, then I was going to google your name, but I found your signature, you know, where you accept that you're part of a focus group for nVidia.

If it's supposed to show how he is a fanboy then you and Lonbjerg are both arguably equally as fanatic as him in defending your company.

The most ironic thing is that people bag on you both for being a fanboy and through guilt by association (i.e. your signature), and it's "offensive" to you, but suddenly when it's someone else it's perfectly fine.



Nah I never commented on those two because I know nothing about them, I can't comment on something I don't know anything about.

Probably does (it's his job), but why would you imply he is e-mailing reviewers? Do you have any proof of this?

(Notice the irony there in bold?)

You're a contradiction to your sig.

He has issues...and seems to have some mental attachment to me...so take your irony...and use it on yourself ^^
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
That's the plot of revenue, not spending, which is what I was talking about. Between Tegra and the ongoing Project Denver saga, NVidia have sunk a lot of R&D into chasing the mobile ARM market, which is notoriously low margin. And as you just handily demonstrated, it isn't getting them much revenue.

How much have they sunk in?
 

NTMBK

Lifer
Nov 14, 2011
10,448
5,831
136
What I read about Jaguar is that it will be around 15% faster per clock on average.
The A15 is on par with Jaguar per clock, and nVidia will clock the CPU up to 1.9GHz.

Whereabouts have you seen those clock speeds? I was under the impression that 1.9GHz was the single-core max-turbo speed, and we hadn't heard standard clocks.

And why do we need 8 cores? How many x86 applications do you know that scale up to 8 threads? How many games? Is there one application in which Bulldozer is twice as fast as an i5?

Plenty of modern game engines scale to 8 threads. The most obvious example is Frostbite 2, of Battlefield 3 fame, and in line to power the majority of EA's next-gen titles (Battlefield, MoH, Command & Conquer, Mass Effect, Dragon Age). The latest CryEngine, from Crysis 3, is another. Don't look at FX8 vs. i5, as it's harder to draw parallels across architectures - compare i5 vs. i7, or FX4 vs. FX8.
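
As a toy illustration of the kind of scaling being described (this is not any engine's actual job system; all the names here are made up), a minimal C++ sketch that fans per-frame work out across 8 worker threads looks like this:

// Minimal sketch: split per-frame work into 8 slices, run them on
// 8 threads, and join before the frame ends. Real engines use
// persistent job/task systems rather than spawning threads per frame.
#include <atomic>
#include <thread>
#include <vector>

static void simulate_slice(int slice, std::atomic<long>& work_done) {
    // Stand-in for physics/animation/particle work on one slice of the frame.
    long local = 0;
    for (int i = 0; i < 1000000; ++i) local += (i ^ slice);
    work_done += local;
}

int main() {
    const int kWorkers = 8;            // one slice per hardware thread
    std::atomic<long> work_done{0};
    std::vector<std::thread> workers;
    workers.reserve(kWorkers);
    for (int slice = 0; slice < kWorkers; ++slice)
        workers.emplace_back(simulate_slice, slice, std::ref(work_done));
    for (auto& t : workers) t.join(); // the frame is done when every slice is
    return work_done > 0 ? 0 : 1;
}

With independent slices like this, throughput grows close to linearly with core count up to 8 threads, which is why an i7 (8 threads) or an FX8 pulls ahead of an i5 or FX4 in that kind of engine.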

nVidia announced that they will use their high-performance ARM CPU in Parker in 2015. So 1 1/2 years after the release of the PS4, nVidia will provide more CPU performance to anybody than Sony will. And the lifetime of a console is now 6-7 years.

Given that we know almost nothing about Denver, don't bet too much on it just yet. NVidia has been making promises about it for what, 4 years now?
 

itsmydamnation

Diamond Member
Feb 6, 2011
3,077
3,908
136
How much have they sunk in?


You can get an idea from their 10-K reports: in the last 3 years, somewhere around 1 billion invested for a loss of 500 million. Personally I don't see this as a bad thing; in fact it's good leadership, they have set a direction and they are following through. It was always going to take a few generations of product to start to see results. It's getting to the point now where one would expect to start seeing said results.


edit: this is just the CPB segment, not Denver.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
That's the plot of revenue, not spending, which is what I was talking about. Between Tegra and the ongoing Project Denver saga, NVidia have sunk a lot of R&D into chasing the mobile ARM market, which is notoriously low margin. And as you just handily demonstrated, it isn't getting them much revenue.

50% is more than AMD has. So when we call the ARM market "low margin", how would you describe AMD's whole business model and market? :whistle:

BTW: nVidia is investing $600 million into Tegra this year, after its revenue was $704 million last year.
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
50% is more than AMD has. So when we call the ARM market "low margin", how would you describe AMD's whole business model and market? :whistle:

Everyone is calling that a house falling apart. I don't get why you need to bash AMD to make your point. I don't think you'll find anyone who doesn't see this as a desperate move from AMD to somehow keep its sinking ship with its nose out of the water.

BTW: nVidia is investing $600 million into Tegra this year, after its revenue was $704 million last year.

$764 million revenue -> $157 million operating loss

They blew through those $764 million plus an extra $157 million on top, and they've been doing it for years.

Nvidia has been losing tons of money in the first years of Tegra, but somehow a console maker can't do the same. I really don't know what's happening with Tegra, but this is going to be a really hard year for NV since it's sticking its nose in the ass of companies that dwarf it by orders of magnitude.

As NTMBK said, it's an overpopulated, low-margin market. They're investing money to get positioned, like AMD is doing with consoles.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Nvidia is trying to get away from x86, isn't it obvious?

They want to move away from their dependency on Intel/AMD for platforms for their high-end workstations.

Smartphones/tablets just seem to be a way to pay for the R&D, a means to an end if you will.


Hopefully it goes someplace; at this point, for my needs, the CPU is really the coprocessor.