PS4/X1 may already have hit their performance wall - how does it affect PC GPU development?

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
So PC Perspective has a really interesting write-up arguing that we may already have seen the peak of the new consoles.

This would be interesting, since we saw continuous improvements over the last generation. But it would also make sense, since this time the consoles are very similar to each other and are basically PCs under the hood, which de-mystifies the whole process a good amount.

Nevertheless, if this is as good as it gets, how would this affect PC development, i.e. capping new games to the lowest common denominator? Crytek's GPU guru says it is already harder than ever to wow gamers with visual fidelity. He makes the comparison with how Crysis 1 was received and says that today there are no such games (I'd probably argue SC, although probably not a leap like C1).
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
Seems like a hit piece tbh. The situation is no different now than it was in the 360/PS3 era. Game developers adapt and find more efficient ways to use the hardware as time goes on; not sure how that is any different now than it was then.

The new consoles suck all round though, and released nowhere near the level of performance relative to PCs that the 360/PS3 had at their release. That is the result of the PS3 and 360 selling at a loss for years because of hardware costs, and Sony/MS not wanting to repeat that mistake. Is there any cost information available on these new consoles yet? My guess is that they were turning a profit from day 1 on the current consoles. :whistle:

As far as the Crytek developer, he is being disingenuous. Unlike a game like Crysis, which was built from the ground up for the PC, the flagship platform for performance, games today are focused on consoles. When Crysis released there was no hardware that could run it at its best settings and get anywhere over 10-20 fps. If a developer made a game targeted at people running PCs with over $1,000 of GPU power and $300 of CPU power, we would see some amazing visuals again. They don't, because they would take a bath on development costs and go under with that business model. Consoles, and developers focusing on them, are what is holding us back.

Take a look at some of the Star Citizen demos. A game built on the best game engine around, only for PC and for high-end hardware. They will blow you away.

http://www.youtube.com/watch?v=yUvwba4DMLY&list=UUTeLqJq1mXUX5WWoNXLmOIA
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
I just rushed through the article... It seems to be based entirely on AC: Unity, which we know is a turd when it comes to optimization.

Games like Ryse show there is more potential in GCN. There can be a big benefit in coding for the specific hardware. It's all there in the charts, where the 290X beats the 980 and the 270X gives the GTX 770 a run for its money (that's the old HD 7870 vs the GTX 680, FFS).

http://forums.anandtech.com/showthread.php?t=2403455

One could say this is the peak of what can be achieved, or is it only the low-hanging fruit? Will more developers follow suit and optimize heavily for GCN?

Most games are not well optimized yet. But having Crytek push the boundaries with their game engine will probably bring more well-optimized games from other developers using CryEngine. This is why landing both consoles was so important for AMD. Developers optimize for consoles; if both consoles are based on GCN, there is a heavy GCN bias. It will pay off.

As a side note: the Xbox One will get a DX12-style API. That is a clear indication of better performance down the road.

Never mind all the misinformation spread around:
The custom built parts from AMD both feature an 8-core Jaguar x86 architecture and either 768 or 1152 stream processors. The Jaguar CPU cores aren't high performance parts: single-threaded performance of Jaguar is less than the Intel Silvermont/Bay Trail designs by as much as 25%. Bay Trail is powering lots of super low cost tablets today and even the $179 ECS LIVA palm-sized mini-PC we reviewed this week. And the 1152/768 stream processors in the GPU portion of the AMD APU provide some punch, but a Radeon HD 7790 (now called the R7 260X), released in March of 2013, provides more performance than the PS4 and the Radeon R7 250X is faster than what resides in the Xbox One.

If you click through to the 7790 review, the same reviewer lists 1.79 TFLOPS for the 7790, which is less than the PS4's number (1.84 TFLOPS in the OP article), the opposite of what they claim in the OP article. Not to mention the PS4's 2x wider memory bus, more ROPs, etc.
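
A quick back-of-the-envelope check of those numbers (shader counts and clocks from the public spec sheets; treat the exact clocks, especially the Xbox One's post-bump 853 MHz, as assumptions on my part), sketched in Python:

Code:
# Theoretical single-precision throughput = shaders * 2 ops (fused multiply-add) * clock
gpus = {
    "PS4":          (1152, 0.800),  # ~800 MHz GPU clock
    "Xbox One":     (768,  0.853),  # ~853 MHz after the pre-launch clock bump
    "HD 7790/260X": (896,  1.000),  # reference card at ~1 GHz
}
for name, (shaders, ghz) in gpus.items():
    print(f"{name:14s} ~{shaders * 2 * ghz / 1000:.2f} TFLOPS")
# PS4 ~1.84, Xbox One ~1.31, HD 7790 ~1.79: on paper the 7790 sits *below* the PS4,
# not above it as the article implies.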

I don't want to say this, but it looks like a pure clickbait article. There, I said it... Oh well.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
There is room for incredible innovation and leaps that make everyone's jaws go slack. Wait another 5 or 10 years for Oculus to be refined to the point of having a 4K 120 Hz screen for each eye and GPUs that can push that with high fidelity.

I think Oculus and VR are going to reinvigorate the market, and we'll see sales really ramp on high-end hardware to a much larger customer base once VR devices achieve really pristine production and technical quality. Virtual worlds that immerse you will sell themselves to anyone who tries one on.

Anemic consoles will not be able to stand up to a VR system with those sorts of advancements, and people will be comfortable with a $2000 price tag for that kind of experience.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
In addition to what Groover mentioned (these consoles are a lot less powerful relative to contemporary PCs than the PS3/360 were relative to flagship PCs at their launch), there is also the concept of diminishing returns.

Let's just assume the PS4 is 10X faster than the PS3 in terms of graphics capability, and put that into context:

From this level of graphics in 2002 on an FX 5800 Ultra (Voodoo power rating of 4.6)

[screenshot: 2002-era graphics on the FX 5800 Ultra]


to this on GTX 670 SLI in 2012 (Voodoo power rating of roughly a GTX 690 = 383)

[screenshot: 2012-era graphics on GTX 670 SLI]


A move from 4.6 to 383, or a GPU performance increase of roughly 83X.

The problem is not just consoles; the rate of growth in graphics capability is now simply not enough to create a big difference in next-generation graphics.

Just read this quick article on how much more advanced the New Dawn demo is vs. the original Dawn and you realize that to improve graphics enough to WOW us beyond the New Dawn level, we will need a graphics card 50-100X faster than a GTX 980. Until then it'll be small incremental annual steps.
http://www.hardocp.com/article/2012/08/01/new_dawn_dx11_demo_compared_to_old/1#.VFAAAvmUfsd

The point is that the 10X increase the PS4 brought over the PS3 is a drop in the bucket next to what's necessary for a graphics revolution beyond what we have now at the Crysis 3 level.

Having said that, I do not believe the PS4 and XB1 are fully maxed out. We should see better-looking games in the next 2-3 years, at least in the form of Gran Turismo 7, Uncharted and other gems from Sony's first parties, who will optimize better for the console. As usual I expect the PS4 to be 99% tapped out in 2.5-3 years, and then we'll face 3-4 years of stagnation until the PS5 in 2019-2020, unless developers like CDPR or DICE, etc. keep pushing the PC independently of consoles.

If we look at the next wave of games like Far Cry 4, AC Unity and Dragon Age: Inquisition, they look good, but not jaw-dropping the way Unreal 2, Far Cry 1, Crysis 1 or even Crysis 3 were relative to the other games of their generation around their releases. Crysis 1 really wowed me when I first tried playing it on my 8800 GTS 320MB.

Diminishing returns are going to kick in, and the move to 4K and eventually 8K gaming will put even more pressure on GPU developers to increase graphics performance exponentially in the next 10 years. At 4K, any doubling or quadrupling of GPU performance will be wiped out by more pixels, higher-resolution geometry and textures alone. Then there is the problem that adding more polygons is not going to provide a huge difference anymore.
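
To put the resolution part of that in rough numbers (a naive sketch that counts pixels only, ignoring the extra geometry and texture work that scales up alongside them):

Code:
# Pixel counts relative to 1080p: shading cost grows roughly linearly with pixel count
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440),
               "4K": (3840, 2160), "8K": (7680, 4320)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name:6s} {w * h / 1e6:5.1f} MP  ({w * h / base:4.1f}x the pixels of 1080p)")
# 4K is ~4x and 8K ~16x the pixels of 1080p, so a "mere" doubling or quadrupling of
# GPU performance gets swallowed by the resolution jump alone.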

In Ryse: Son of Rome the character models are very detailed at 85,000 polygons; moving to 150,000, while an improvement, is nowhere near 2x more detailed. Simply throwing more polygons at the game isn't going to double or triple the realism of the graphics.

Old build - 150,000 polygons per character
New build - 85,000 polygons per character

You can just barely see the difference now. Diminishing returns!
[image: Ryse old vs. new build character comparison]

http://www.dualshockers.com/2013/09/28/ryse-old-and-new-builds-compared-polygons-vs-shaders/



For the next 5 years, until GPUs catch up, developers should focus on making games more fun and unique, because until TSMC or GloFo get their acts together we will only see a 4-5X increase in graphics performance (if we are lucky) over a GTX 980 in that time, which is nothing compared to what's necessary for a graphics revolution.

*** But we can still gauge game-engine and in-game optimization by, say, comparing how good a game like AC Unity looks relative to its hardware requirements against Crysis 3, the Metro Redux series or Ryse: Son of Rome. I wouldn't exactly use Ubisoft's optimization of AC Unity as the epitome of extracting maximum value out of the PS4, due to the 'console parity' with MS on that game and, generally speaking, Ubisoft's lack of a proven track record of optimizing games well.
 

CakeMonster

Golden Member
Nov 22, 2012
1,630
810
136
I know I should be frustrated about this generation, but I'm not that frustrated... We did get 8GB of RAM (rumors suggest we were very close to not getting it), which helps longevity a lot, as well as multitasking and textures for the future. We also got a heavily parallelized GCN architecture (which we would not have gotten 2 years earlier) that can benefit from DX12/Mantle-ish tweaks in the future. And we got x86 (huge), as well as a GPU based on a real graphics card.

This means that, as of now, the only difference for PC games is scaling (not major hardware features or memory). This is a huge advantage compared to the previous generation.

What I miss, and what I think the consoles will suffer from shortly if not already:
- HDMI 2.0 (for 4K output for UI/menus, 3D (not games, obviously), movies and pictures). The UI will look ugly as a media center in only a couple of years because of the resolution.
- Lossless graphics memory compression (like NV's 900 series); this would have been a cheap and easy way to get more out of less hardware.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I just rushed through the article... It seems to be based entirely on AC: Unity, which we know is a turd when it comes to optimization.

Games like Ryse show there is more potential in GCN. There can be a big benefit in coding for the specific hardware.

Ya, that's a good point. Ubisoft chose to run global illumination on the CPU and failed to take any advantage of the PS4's 50% higher compute performance, when they could have run global illumination via DirectCompute on the GPU's remaining shaders (after all, if the PS4 has 1152 of them and the XB1 has just 768, but their performance is identical, then where did the power of the remaining shaders go? Unused...). It's no wonder 50% of their CPU performance got wiped out. :whistle:

"Half of the CPU compute time is being used to help the rendering engine by unpacking pre-baked lighting models for the global illumination implementation and thus the game is being limited by the 50% remaining performance power the AI, etc."
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
Pffft, pre-baked lighting. If you recommend a 780, at least make it fully dynamic on PC. If more ports turn out like Ryse I may as well buy a console. That game still runs like a dog on this 780 Ti @ 1215 MHz; assuming the GPU is even used beyond 70%, the FPS dips from 60+ to 40+ were enough to make me uninstall it. GCN-optimized, sure, but Nvidia users get rubbish performance.
 

Insomniator

Diamond Member
Oct 23, 2002
6,294
171
106
While I understand the concept of diminishing returns... I'm just not buying it yet. Games still look very sterile to me. Both visually and interactively. HL2 brought the gravity gun 10 years ago and it still seems like plenty of games don't have that kind of lifelike physics. Crysis 3 looks excellent, not as amazing as Crysis did back in the day but still great to me. Why are there no games that can even match Crysis 3 yet?

The bottom line is that games are made for consoles first, and we are wasting our time fiddling with $1000 worth of SLI cards and bad drivers trying to get these games to run at settings they were never designed for. A 4K game with textures pulled from a console version... still looks like a polished turd.

If someone made a game that brought GTX 980 SLI to its knees at a normal resolution, like Crysis did back in the day, the jump would look just as amazing as C1 --> C3.
 

NTMBK

Lifer
Nov 14, 2011
10,452
5,839
136
Ubisoft developer spreading FUD to pre-emptively defend their game... big surprise.

I suspect that we will still see improvements from both consoles, but I would certainly hope that it is easier for developers to extract performance from them. That was one of the key design goals for the PS4, after the disaster of the PS3.
 
Dec 30, 2004
12,553
2
76
While I understand the concept of diminishing returns... I'm just not buying it yet. Games still look very sterile to me. Both visually and interactively. HL2 brought the gravity gun 10 years ago and it still seems like plenty of games don't have that kind of lifelike physics. Crysis 3 looks excellent, not as amazing as Crysis did back in the day but still great to me. Why are there no games that can even match Crysis 3 yet?

The bottom line is that games are made for consoles first, and we are wasting our time fiddling with $1000 worth of SLI cards and bad drivers trying to get these games to run at settings they were never designed for. A 4K game with textures pulled from a console version... still looks like a polished turd.

If someone made a game that brought GTX 980 SLI to its knees at a normal resolution, like Crysis did back in the day, the jump would look just as amazing as C1 --> C3.

I wish more people would use the speech/face algorithm from HL2; I still swear that the G-Man's facial movements are the most realistic of anything I've ever seen.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
The consoles were on the low end to begin with, and they are based on already-known platforms. Black bars, 720p, 900p, 30 FPS and heavily neutered games are already the norm.

It's a race to the bottom.

On top of that, MS and Sony are putting pressure on game devs so the PC versions get sabotaged, in some desperate hope of making the consoles look better than they are.

So while the consoles may improve slightly (like the PC), I wouldn't expect anything like previous generations.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Expect **** PC ports, that is all. I don't expect anything from the console devs anymore. If you really want a console PC port, wait till it is in the $5 or $10 bargain bin on Steam or any other digital retail service.

And make sure your wallet is ready to support devs like CDP, who make the Witcher series.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
Expect **** PC ports, that is all. I don't expect anything from the console devs anymore. If you really want a console PC port, wait till it is in the $5 or $10 bargain bin on Steam or any other digital retail service.

And make sure your wallet is ready to support devs like CDP, who make the Witcher series.

Ports don't seem bad so far... Ryse is pretty decent and so was SoM...
 

SlickR12345

Senior member
Jan 9, 2010
542
44
91
www.clubvalenciacf.com
Well, the PS3's Cell processor is the same power as the new AMD APU in the XOne and PS4; that tells you something.

The graphics are massively improved over the last generation, but there is something called diminishing returns for visuals.

I don't think it translates to PC development though. If anything, I think we've hit a performance wall in PC hardware, and we are getting barely 10-15% improvements per GPU generation, instead of the 50%+ we were getting 5-6 years ago.

So for PCs in the short term it will actually help, as GPU developers will need to deliver much better and cheaper GPUs to properly run unoptimized console ports.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
Most people are forgetting that the lowest common denominator is still the Xbox 360 and PS3.

There aren't really all that many PS4 and Xbox One exclusives yet.
 

DarkKnightDude

Senior member
Mar 10, 2011
981
44
91
The performance isn't just going to magically appear. They can slowly make things more efficient, but the problem is it's not like last gen. The PS3 was very hard to develop for because of the Cell, and only in the last years of the generation did development start becoming more efficient. The Xbox 360 was a bit easier, but had the same problem.

These new consoles, however, are based on the x86 architecture, and I'm not sure how much more they can optimize. They'll probably eventually figure out a way to balance things a little more, like the RAM usage, but it's not like there's a ton of hidden performance ready to go as soon as you hit a button. Though M$ is definitely counting on DX12 to make things easier, and Sony has a company actively looking to increase performance on dev kits.
 

showb1z

Senior member
Dec 30, 2010
462
53
91
Why are people expecting big things from DX12 on console? DX12 is an attempt to bring a low-level console-style API to PC. But it seems highly unlikely that it will be more efficient than what consoles already have now.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Well, the PS3's Cell processor is the same power as the new AMD APU in the XOne and PS4; that tells you something.

You can't be serious. While the PS3's SPEs were strong, they were very limited in their capability. They basically helped out the GPU with rendering and other heavily parallel processing. The Cell CPU itself was weak sauce, and the Jaguar CPUs in the consoles totally outclass and outperform it by a crap load.

As to the topic, the Xbox One and PS4 haven't hit the wall yet. When developers start fully exploiting the compute units, then we will know they can't go any further.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Ya, that's a good point. Ubisoft chose to run global illumination on the CPU and failed to take any advantage of the PS4's 50% higher compute performance, when they could have run global illumination via DirectCompute on the GPU's remaining shaders (after all, if the PS4 has 1152 of them and the XB1 has just 768, but their performance is identical, then where did the power of the remaining shaders go? Unused...). It's no wonder 50% of their CPU performance got wiped out. :whistle:

This accusation is too simplistic to be true. Ubisoft's programmers would have to be complete idiots to use the CPU instead of the compute units for unpacking the lighting models.

What I think is more likely is that because there's so much pre-baked lighting data, they have no choice but to use the CPU to help unpack it all. 25GB of lighting data is a helluva lot.
 

dacostafilipe

Senior member
Oct 10, 2013
805
309
136
Why are people expecting big things from DX12 on console? DX12 is an attempt to bring a low-level console-style API to PC. But it seems highly unlikely that it will be more efficient than what consoles already have now.

Maybe they are talking about the X1's new API version that frees up resources that were bound to Kinect before?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
This accusation is too simplistic to be true. Ubisoft's programmers would have to be complete idiots to use the CPU instead of the compute units for unpacking the lighting models.

Not really, because choosing the GPU for DirectCompute would have meant:

1) Faster performance on the GCN 290X over the 970/980, which conflicts with their GameWorks agreement with NV;

2) A lot of extra optimization, which means more time and money. Since Ubisoft wanted to hit the holiday release window, and their game ran at just 9 fps 9 months ago, they probably had room to optimize more, but management forced them to release it for the holiday 2014 season. There is no way the PS4's 50% extra graphics performance (2x the ROPs, plus 50% more shaders and TMUs; rough numbers sketched below) should have gone completely unused when it could have bought better textures, shading effects and anti-aliasing at the very least. They could have at least used those shaders for 'free' SSAA.
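
Rough ratios between the two GPUs, using the commonly cited unit counts and clocks (treat the clocks as assumptions; this is only a sketch of the on-paper gap):

Code:
# (shaders, TMUs, ROPs, clock in GHz) for each console GPU
ps4 = (1152, 72, 32, 0.800)
xb1 = (768,  48, 16, 0.853)
for label, a, b in zip(("shaders", "TMUs", "ROPs"), ps4, xb1):
    print(f"{label:8s} PS4/XB1 = {a / b:.2f}x")
ps4_tf = ps4[0] * 2 * ps4[3] / 1000
xb1_tf = xb1[0] * 2 * xb1[3] / 1000
print(f"compute  PS4/XB1 = {ps4_tf / xb1_tf:.2f}x  ({ps4_tf:.2f} vs {xb1_tf:.2f} TFLOPS)")
# Shaders and TMUs come out to 1.5x and ROPs to 2x; after the XB1 clock bump the raw
# FLOPS gap is closer to ~1.4x than a flat 50%.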

What I think is more likely is that because there's so much pre-baked lighting data, they have no choice but to use the CPU to help unpack it all. 25GB of lighting data is a helluva lot.

What is the last modern PC game that used pre-baked global illumination vs. dynamic global illumination? Why in the world is Ubisoft using a pre-baked lighting model for a next-generation PC game while at the same time calling for ludicrous specs like a GTX 680 as the minimum!?

The only way Ubisoft can be right that they have "crazily optimized for current gen consoles" and that they have tapped out the full power of the consoles is if no game released for the PS4 over the next 6-7 years blows AC Unity away. Do you actually believe that? :biggrin:

For example, if you just looked at the Dark Souls PC ports in isolation, that Japanese developer could have claimed that they maxed out GTX 780 Ti SLI on the PC. Some Japanese developers actually prefer 30 fps in third-person action games and target that from the beginning.
http://www.playstationlifestyle.net...st-for-action-games-claims-producer/#/slide/1

You will see: once Naughty Dog starts optimizing for the PS4, it'll blow away games like Bloodborne and AC Unity graphically, because they'll start using the PS4's low-level API and using GCN for compute too. Maybe not with Uncharted 4, but in the second half of the PS4's life they will produce superior graphics to any game out on the PS4 now. Of course we won't have a situation where the PS4 looks better than some future PC games, but considering the Xbox One with 2 AC games is $350 and the PS4 is only $400, the graphics they produce for the cost are very good.

The problem with Ubisoft is that no game they ever made even came close to Metro: Last Light or Crysis 3 in graphics, and yet their games require top-of-the-line hardware to run well. There is a huge disconnect, time and time again, with Ubisoft's optimization.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Why all the flak for Ubisoft? Just because they came somewhat clean?

What about all the other devs with 30 FPS, black bars and utterly shitty dumbed-down games filled with uncompressed junk?

The entire console generation is a disaster. And it's not something that will magically turn around, because this time the processing power simply isn't there.