Nvidia: Not Enough Money in a PS4 GPU for Us to Bother


notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Microsoft is also going with AMD. My point is that each console maker takes its own approach to the final specs of its console; criticism or praise for that isn't on AMD, it's on the console maker. Both consoles will apparently have an AMD APU.
Somehow this turned from Nvidia's comments about not having the design contract into Nvidia being against the technology?
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
The experiences will be pretty similar, but the PC will be brute forcing its way to comparable performance.

This is exactly what people need to understand. Do not compare a desktop HD 7850 with the PS4 GPU. The console part is considerably more capable in practice, and developers will be able to get much more performance out of it given the low overhead of a console OS and a console API with close-to-the-metal programming control.

https://twitter.com/ID_AA_Carmack/status/50277106856370176

"Consoles run 2x or so better than equal PC hardware, but it isn’t just API in the way, focus a single spec also matters"
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
The PS4 fixes all four of these issues. Totally new frameworks built from the ground up for this platform, with buckets of graphics power and the bandwidth to feed it. Not to mention that rumours are pointing towards the next Xbox having essentially the same design.

In terms of raw horsepower, yeah, it's considerably weaker. But this design means they can support whole new features that a standard PC design just can't match. Look at GPGPU physics: PhysX can produce some seriously nice-looking graphical effects (especially with the GPGPU power of GK110), but it can't affect the underlying simulation of the game world. Now imagine if all of those thousands of simulated particles actually impacted the world: knocking players out of place, deflecting bullets, washing bridges away in a torrent of GPGPU fluids, whatever. That's what you could realistically achieve by killing the communication cost between CPU and GPU.

Very well said. True CPU/GPU communication can happen only when the latency between them is low. Until now, all GPU physics effects have been purely graphical, with no effect on gameplay, physics (run on the CPU), or AI. The APU is the facilitator that lets developers push games to the next level of immersion and gameplay.
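
To make the idea concrete, here is a rough C++ sketch (my own illustration, not any console SDK; the structs and numbers are made up). Gameplay code can only react to GPU-simulated particles once it can see the results; on a discrete card that means a readback over PCIe every frame, while on shared memory the check is effectively free:

#include <cstdio>
#include <vector>

// Toy particle state; a real engine would keep this in a GPU buffer.
struct Particle { float x, y, z, vx, vy, vz; };

// Stand-in for a GPU compute dispatch (the real thing would be a
// compute shader; plain C++ here just to show the data flow).
void gpu_simulate(std::vector<Particle>& p, float dt) {
    for (auto& q : p) { q.x += q.vx * dt; q.y += q.vy * dt; q.z += q.vz * dt; }
}

// Gameplay reacting to the simulation: only affordable every frame
// if the CPU can see the GPU's results without an expensive copy.
bool deflects_bullet(const std::vector<Particle>& p) {
    for (const auto& q : p)
        if (q.x > 1.0f) return true;   // made-up gameplay threshold
    return false;
}

int main() {
    std::vector<Particle> debris(10000, Particle{0, 0, 0, 50, 0, 0});
    for (int frame = 0; frame < 3; ++frame) {
        gpu_simulate(debris, 1.0f / 60.0f);
        // Discrete GPU: a readback (e.g. mapping the buffer) would sit
        // here, costing milliseconds; unified memory makes it a no-op.
        std::printf("frame %d: bullet deflected: %d\n",
                    frame, (int)deflects_bullet(debris));
    }
    return 0;
}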
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
This is exactly what people need to understand. Do not compare a desktop HD 7850 with the PS4 GPU. The console part is considerably more capable in practice, and developers will be able to get much more performance out of it given the low overhead of a console OS and a console API with close-to-the-metal programming control.

https://twitter.com/ID_AA_Carmack/status/50277106856370176

"Consoles run 2x or so better than equal PC hardware, but it isn’t just API in the way, focus a single spec also matters"

No, it's not.
Back when the PS3 launched, fans were rabid about the RSX being better than its PC counterpart.

I owned one of those: an AGP (yes, not PCIe) Gainward Bliss 7800 GS+ (full G71, 24 pixel/8 vertex pipes, 512 MB RAM) on a P4 Northwood 2.4 GHz.

I enjoyed better IQ, better FPS, and higher resolutions on the titles that were multiplatform over the PS2.

Despite the API "inefficiency".
 

The Alias

Senior member
Aug 22, 2012
646
58
91
No, it's not.
Back when the PS3 launched, fans were rabid about the RSX being better than its PC counterpart.

I owned one of those: an AGP (yes, not PCIe) Gainward Bliss 7800 GS+ (full G71, 24 pixel/8 vertex pipes, 512 MB RAM) on a P4 Northwood 2.4 GHz.

I enjoyed better IQ, better FPS, and higher resolutions on the titles that were multiplatform over the PS2.

Despite the API "inefficiency".

Cell was horrible to code for. Getting horsepower out of the PS3 was beyond a chore.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
No, it's not.
Back when the PS3 launched, fans were rabid about the RSX being better than its PC counterpart.

I owned one of those: an AGP (yes, not PCIe) Gainward Bliss 7800 GS+ (full G71, 24 pixel/8 vertex pipes, 512 MB RAM) on a P4 Northwood 2.4 GHz.

I enjoyed better IQ, better FPS, and higher resolutions on the titles that were multiplatform over the PS2.

Despite the API "inefficiency".

You mean PS2 titles or PS3 titles? Anyway, if you mean the PS3, can you tell us which games played and looked better on the Pentium 4 PC with the 7800 GS+ than on the PS3?

Also, games like BF3, Crysis 3, and Far Cry 3 run on the PS3. Try those games on a Pentium 4 PC with a 7800 GS+.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
Also, games like BF3, Crysis 3, and Far Cry 3 run on the PS3. Try those games on a Pentium 4 PC with a 7800 GS+.
This. They would run painfully slowly, if at all. Anyway, going by what the devs are saying about the PS4, it's a very exciting bit of hardware. It will be a real boon to PC gaming in general: cross-platform issues will largely become a thing of the past, and even more so for AMD GPUs that share the same basic architecture.

This is a huge win for AMD when you look at it objectively. You can be damn sure Nvidia would have jumped all over the chance to have its hardware in the PS4 if it had been able to supply an x86 APU.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
You mean PS2 titles or PS3 titles? Anyway, if you mean the PS3, can you tell us which games played and looked better on the Pentium 4 PC with the 7800 GS+ than on the PS3?

Also, games like BF3, Crysis 3, and Far Cry 3 run on the PS3. Try those games on a Pentium 4 PC with a 7800 GS+.

You mean they run upscaled, with no AA, no AF, low textures, reduced draw distances, etc.?
I remember doing a side-by-side comparison with Call of Duty 2 back in the day, and some other games... no contest.

And while the console has been stuck at that level of performance ever since... I moved on.
The G80 laughed its ass off looking at the consoles...
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
You mean they run upscaled, with no AA, no AF, low textures, reduced draw distances, etc.?
I remember doing a side-by-side comparison with Call of Duty 2 back in the day, and some other games... no contest.

And while the console has been stuck at that level of performance ever since... I moved on.
The G80 laughed its ass off looking at the consoles...

Oh gee, you think? The first G80-based part was released in December 2006, if memory serves. The Xbox 360 was released in 2005.

If you want to get super technical about it, the Xbox 360 did exceed the capability of PCs at release: it had a unified shader architecture before any PC part did (unified shaders didn't reach the PC until G80 and ATI's R600). ATI's X1800/X1900 series arrived shortly afterward, closed the performance gap, and then exceeded it, putting PCs once again ahead of the 360. But I assure you, the 360 at launch had very impressive specifications and in many ways exceeded PCs. It didn't last long, but you're looking at a $300 piece of hardware vs. a $2,000 PC with a GPU alone costing over $500 in some cases.

There is no need to turn this into a PC-elitist argument; you can easily enjoy both consoles and PCs, as I do. Obviously PCs will scale faster over time, but at the same time a PC costs a lot more. They both have their place. I think the PS4 will be a cool piece of tech at launch, just like the Xbox 360 was. Can you build a $2,500 PC with far more brute force? Sure you can. I am one of those people who build such machines, but that doesn't mean I don't enjoy consoles; PCs and consoles aren't mutually exclusive. Perhaps you should expand your horizons a bit: you can enjoy both for their strengths and weaknesses.
 

itsmydamnation

Diamond Member
Feb 6, 2011
3,114
3,965
136
And while the console has been stuck at that level of performance ever since... I moved on.
The G80 laughed its ass off looking at the consoles...

It also cost just as much as a console, if not more. Here is the trick: explain how you are going to brute-force latency. You're going to start getting complex DX11+ shaders where the CPU's resources can be used for complex critical paths, and you won't be able to do that on a discrete GPU. That will be a significant performance penalty for the PC, on top of the normal benefits that targeting a fixed platform brings.

Incredible exaggeration on your part. The actual difference is minute.
What do you mean by "without the PC's bloated DX11 API"?
Wouldn't a console, with or without the CPU/GPU on the same die, utilize a DX11 API?

The actual difference can be massive, and if people actually did research instead of just jabbering, you would know this. The biggest bottleneck PCs have is draw calls: a seven-year-old console leaves modern PCs for dead, even with major DX updates trying to increase the number of draw calls they can issue.
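
To put rough numbers on that (per-call costs assumed for illustration, not measured), a toy C++ model shows why a fixed per-call CPU cost caps the call count long before the GPU runs out of shading power:

#include <cstdio>

int main() {
    const double frame_budget_ms = 1000.0 / 60.0;  // 60 fps frame budget
    const double pc_cost_ms      = 0.030;  // assumed API+driver cost per call
    const double console_cost_ms = 0.003;  // assumed near-metal cost per call
    // Upper bound: the whole frame spent on submission alone.
    std::printf("PC:      ~%.0f draw calls per frame\n",
                frame_budget_ms / pc_cost_ms);
    std::printf("Console: ~%.0f draw calls per frame\n",
                frame_budget_ms / console_cost_ms);
    return 0;
}

Even if the real costs differ, an order-of-magnitude gap in per-call overhead translates directly into an order-of-magnitude gap in how much distinct geometry each platform can submit per frame.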

Too much selective bias here, not enough objective analysis. G80, for example, had more transistors than Cell and RSX combined, and twice the memory throughput. But more importantly, building a balanced PC around a G80 had you spending four to five times as much money.

GPUs have also hit the power wall hard; you're not going to see performance scaling like we did from R580 > R600 > RV770 > Evergreen. What we are going to see is more efficient use of peak FLOPS, and a big console APU with lots of bandwidth has a big head start.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
Is it really? Both the 360 and the Wii used an ATI GPU. How much good did it do them?
Totally different. The APU in the PS4 (the next Xbox also uses a variant) is a GCN/x86/AMD64 part, essentially the same as what we'll see in AMD's desktop and mobile parts. AMD will benefit from the architectural commonality on the PC side: imagine optimizing on the PS4 and having the same optimizations/visuals/physics pay off on the same PC hardware. This is truly excellent.

And keep in mind this won't only benefit AMD: if you're running an Nvidia GPU, you will still benefit from the x86 commonality, although Nvidia may have to work extra hard to make up for the GCN-specific coding. It remains to be seen how big an issue that might be.
 

gorobei

Diamond Member
Jan 7, 2007
4,097
1,599
136
Is it really? Both the 360 and the Wii used an ATI GPU. How much good did it do them?

Nintendo doesn't really count in the scheme of things; they have always been a generation or two behind in hardware.

PS3: Cell (PowerPC) + NV (whatever shader unit)
Xbox 360: PowerPC + ATI (VLIW5?)
PC: x86 + DX9/10/11 (ATI or NV)

No real overlap in architecture, so cross-platform development is an exercise in "code once, recode three or four times," once per platform.



But now...

PS4: x86 APU + GCN
Xbox 720: x86 CPU(?) + AMD GPU (GCN)
Laptops: x86 APU + GCN
PC: x86 CPU + DX11 (AMD GCN or NV CUDA)

When the consoles and commodity laptops are all running a similar architecture and the same graphics transistor family, you can extract the benefits of direct-to-metal programming and optimization for three times the number of target consumers (turn-key, know-nothing-about-computers shoppers).

If things had gone the other way, with NV in the Xbox 720, PS4, and Wii U, all the game-dev studios would have had to learn to program and optimize for CUDA shaders, and since PhysX would already be included and available on 80% of all the platforms, there would be far less of a penalty for requiring GPGPU physics in a cross-platform game (PC included).
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
It also cost just as much as a console, if not more. Here is the trick: explain how you are going to brute-force latency. You're going to start getting complex DX11+ shaders where the CPU's resources can be used for complex critical paths, and you won't be able to do that on a discrete GPU. That will be a significant performance penalty for the PC, on top of the normal benefits that targeting a fixed platform brings.

Graphics come after the CPU workload.
If there are "complex DX11+ shaders," then all the information will be transferred to the GPU.

The actual difference can be massive, and if people actually did research instead of just jabbering, you would know this. The biggest bottleneck PCs have is draw calls: a seven-year-old console leaves modern PCs for dead, even with major DX updates trying to increase the number of draw calls they can issue.
Sure. Do you have an example of this? Looking at modern console titles, there is no game with groundbreaking KI, physics, or other CPU-based "effects".
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
If things had gone the other way, with NV in the Xbox 720, PS4, and Wii U, all the game-dev studios would have had to learn to program and optimize for CUDA shaders, and since PhysX would already be included and available on 80% of all the platforms, there would be far less of a penalty for requiring GPGPU physics in a cross-platform game (PC included).
On this topic: suppose Nvidia had the CPU and GPU in the PS4 and Xbox 720, and people said, well, that doesn't matter, consoles are too far behind PCs anyway, the hardware-specific tweaks won't matter, it won't help their bottom line, AMD didn't want part of that market anyway, there's no money to be made, no synergy to be realized on the PC side.

Imagine the response.
 

itsmydamnation

Diamond Member
Feb 6, 2011
3,114
3,965
136
Graphics come after the CPU workload.
If there are "complex DX11+ shaders," then all the information will be transferred to the GPU.
Yes, and? They are still connected to the same memory controller. Reread what I said, which was CPU resources: the CPU can be used for what GPUs are bad at (branching, prefetching, and prediction) to ensure that data on the critical path of a complex shader is at the GPU when it is needed.

What would happen on a discrete GPU is that there would be a miss and you would have to wait a while; this is normal GPU behaviour, and the scheduler just schedules something else. But when you have lots of branches in your shader code, your average utilisation drops, because so much time is spent waiting with nothing else to go on with. A console APU will be able to alleviate this problem because the CPU can fetch and predict this data for the GPU far better than the GPU can itself. There is already published research on this kind of methodology.
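
A minimal sketch of what I mean (my own illustration of the methodology, not published code; the data structure is made up): the CPU does the branchy pointer-chasing that GPUs are bad at and compacts just the records on the shader's critical path into a flat buffer the GPU can then stream without misses:

#include <cstdio>
#include <vector>

// Branchy, pointer-chasing structure: cheap for an out-of-order CPU,
// miserable for a GPU.
struct Node { int payload; Node* next; bool needed; };

// CPU-side "prefetch": walk the list and gather only what the GPU's
// critical path will touch, into contiguous memory.
std::vector<int> gather_for_gpu(const Node* head) {
    std::vector<int> flat;
    for (const Node* n = head; n; n = n->next)
        if (n->needed) flat.push_back(n->payload);
    return flat;
}

// Stand-in for the GPU shader: pure streaming over contiguous data,
// no branches, no misses.
long long gpu_consume(const std::vector<int>& flat) {
    long long sum = 0;
    for (int v : flat) sum += v;
    return sum;
}

int main() {
    Node c{3, nullptr, true};
    Node b{2, &c, false};
    Node a{1, &b, true};
    std::vector<int> flat = gather_for_gpu(&a);
    std::printf("GPU sees %zu items, sum %lld\n", flat.size(), gpu_consume(flat));
    return 0;
}

On a discrete card the gathered buffer would still have to cross PCIe; on an APU both sides read the same memory, so the hand-off is just a pointer.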


Sure. Do you have an example of this? Looking at modern console titles, there is no game with groundbreaking KI, physics, or other CPU-based "effects".

What are you asking? What does KI even mean? And what does physics have to do with draw calls?
 

gorobei

Diamond Member
Jan 7, 2007
4,097
1,599
136
On this topic: suppose Nvidia had the CPU and GPU in the PS4 and Xbox 720, and people said, well, that doesn't matter, consoles are too far behind PCs anyway, the hardware-specific tweaks won't matter, it won't help their bottom line, AMD didn't want part of that market anyway, there's no money to be made, no synergy to be realized on the PC side.

Imagine the response.

It was all fine and well when a PowerPC CPU drew a clear dividing line between PC and console, and it was fine when Sony used NV and MS used AMD: it evened out the opportunity costs of any one platform from the developers' view. Pure architectural detente.

MS and Sony on GCN, and even Nintendo on VLIW5/4, makes the developer math very ominous for Nvidia. Devs optimize for whatever gets them the most sales (the largest audience). Previously you could count on a platform-exclusive title to break up the target consumers into favoring one platform, but with this much similarity in hardware it will be very expensive to buy a studio off (look at Bayonetta 2).

The benefit to AMD won't be immediate, but by about the time Kaveri laptops are hitting the middle of their sales lifetime, the hardware inside the consoles won't look too different from an APU. If you can market an AMD APU laptop as able to play your console games at "equal performance," that is a selling point.

I imagine hiring AMD's former head of console relations is no accident. NV will not let AMD have two generations of consoles on GCN. Even if they only get CUDA into one of the PS5 or the xb1080, but not both, the architecture cold war will go back to neutral.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
On this topic: suppose Nvidia had the CPU and GPU in the PS4 and Xbox 720, and people said, well, that doesn't matter, consoles are too far behind PCs anyway, the hardware-specific tweaks won't matter, it won't help their bottom line, AMD didn't want part of that market anyway, there's no money to be made, no synergy to be realized on the PC side.

Imagine the response.


We really don't have to imagine. When AMD got the Apple GPU contracts in 2011, we heard it broadcast as if it were a big deal, the biggest. It didn't matter that Nvidia had held the contract a year prior, nor what was said when that contract was lost. All we read about on these forums was marketing shenanigans by Apple, how consumers are brainwashed into those products, and comparisons to Nvidia's marketing. Apparently there can be no other reason Nvidia is leading in all GPU segments.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Yes, and? They are still connected to the same memory controller. Reread what I said, which was CPU resources: the CPU can be used for what GPUs are bad at (branching, prefetching, and prediction) to ensure that data on the critical path of a complex shader is at the GPU when it is needed.

You don't want to communicate with the CPU off-chip. That will not resolve your problem of higher latency.

What would happen on a discrete GPU is that there would be a miss and you would have to wait a while; this is normal GPU behaviour, and the scheduler just schedules something else. But when you have lots of branches in your shader code, your average utilisation drops, because so much time is spent waiting with nothing else to go on with. A console APU will be able to alleviate this problem because the CPU can fetch and predict this data for the GPU far better than the GPU can itself. There is already published research on this kind of methodology.
Shader decompiling happens while loading the game. There is no need to use the CPU beyond that decompiling work. nVidia even went backward with Kepler on scheduling and offloaded one of the earlier stages to the CPU.

What are you asking? What does KI even mean? And what does physics have to do with draw calls?
I'm asking for an example of that problem.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
We really don't have to imagine. When AMD got the Apple GPU contracts in 2011, we heard it broadcast as if it were a big deal, the biggest. It didn't matter that Nvidia had held the contract a year prior, nor what was said when that contract was lost. All we read about on these forums was marketing shenanigans by Apple, how consumers are brainwashed into those products, and comparisons to Nvidia's marketing. Apparently there can be no other reason Nvidia is leading in all GPU segments.
You really think that is comparable to multiple console contracts that last six to seven years, utilizing a design that will be shared across platforms and devices?
 

Revolution 11

Senior member
Jun 2, 2011
952
79
91
Obviously PCs will scale faster over time, but at the same time a PC costs a lot more. They both have their place. I think the PS4 will be a cool piece of tech at launch, just like the Xbox 360 was. Can you build a $2,500 PC with far more brute force? Sure you can. I am one of those people who build such machines, but that doesn't mean I don't enjoy consoles; PCs and consoles aren't mutually exclusive. Perhaps you should expand your horizons a bit: you can enjoy both for their strengths and weaknesses.
I agree with you that PCs and consoles are not mutually exclusive and that you can enjoy both. But why do you need a $2,500 PC to compete with a console? :confused:
 

itsmydamnation

Diamond Member
Feb 6, 2011
3,114
3,965
136
You don't want to communicate with the CPU off-chip. That will not resolve your problem of higher latency.

Yes, exactly. Right now the GPU stalls instead; on an APU, instead of stalling, the CPU can fetch the data for the GPU.

Shader decompiling happens while loading the game. There is no need to use the CPU beyond that decompiling work. nVidia even went backward with Kepler on scheduling and offloaded one of the earlier stages to the CPU.
What does shader COMPILING have to do with anything? Everything a GPU does is controlled by a driver; guess what controls the driver :colbert:


I'm asking for an example of that problem.

It's a throughput "problem"; it's about sustained utilisation. Here is an awesome thread about it: http://forum.beyond3d.com/showthread.php?t=61567

You have a time budget in which to render a frame. PCs can't issue draw calls as fast as consoles can, and that shortfall is wasted milliseconds.
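
A quick worked example of the wasted milliseconds (all per-call costs assumed for illustration, not measured): take a fixed scene of 2,000 draws against a 33.3 ms (30 fps) budget:

#include <cstdio>

int main() {
    const double budget_ms = 1000.0 / 30.0;  // 30 fps frame budget
    const int    draws     = 2000;           // assumed scene complexity
    const double cost_ms[] = {0.030, 0.003}; // assumed per-call cost: PC, console
    const char*  name[]    = {"PC", "console"};
    for (int i = 0; i < 2; ++i) {
        double submit = draws * cost_ms[i];
        std::printf("%-7s: %5.1f ms submitting draws, %5.1f ms left for the frame\n",
                    name[i], submit, budget_ms - submit);
    }
    return 0;
}

With these made-up costs the PC burns 60 ms just submitting and blows the budget outright, while the console spends 6 ms and keeps 27 ms for everything else; batching and instancing exist on the PC precisely to claw that time back.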