PS4/X1 may already have hit a performance wall - how does it affect PC GPU development?


MeldarthX

Golden Member
May 8, 2010
1,026
0
76
Why all the flak for Ubisoft? Just because they came somewhat clean.

What about all the other devs with 30FPS, black bars and utterly shitty dumbed down games filled with uncompressed junk?

The entire console generation is a disaster. And it's not something that will magically turn around, because this time the processing power simply isn't there.

They've come clean? They were flat-out caught lying about Watch Dogs; after what they did there, you're saying we can trust them?

Ubisoft has for a long time hated PC gamers; they've tried their hardest with the most DRM and it's come back to bite them time after time.

The rest of the developers have come out and said they aren't having issues with the consoles; there is a lot of power still left to be tapped... Yet Ubisoft, *surprise surprise*, is having issues coding.

I have friends in the business; I know who I believe... certainly not Ubisoft.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
They've come clean? They were flat-out caught lying about Watch Dogs; after what they did there, you're saying we can trust them?

Ubisoft has for a long time hated PC gamers; they've tried their hardest with the most DRM and it's come back to bite them time after time.

The rest of the developers have come out and said they aren't having issues with the consoles; there is a lot of power still left to be tapped... Yet Ubisoft, *surprise surprise*, is having issues coding.

I have friends in the business; I know who I believe... certainly not Ubisoft.

The other companies are purely innocent, I guess? Just enjoy the new "features" like the "cinematic effect" then. Because it's certainly not a console limitation, right?

I don't know if you expected companies whose bread and butter is the consoles to come out saying anything other than praise. MS and Sony put a lot of pressure on them, and they have a direct economic interest in not saying anything else. Behind closed doors, however, it's a completely different conversation.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
As a side note, there is part of the article I agree with: they should have released the PS4/XB1 1-1.5 years earlier, or waited 1-1.5 years and launched them in late 2014/early 2015. The timing was all wrong because they knew we would be stuck on 28nm for a while. Also, 99% of developers hadn't even started working on first-party exclusives, judging by the drought we see on this front 1 year later.

Essentially, Sony used a slightly de-tuned version of the HD7970M, which had been available all the way back on May 1, 2012.

The HD8970M refresh, released on May 15, 2013, brought only a 50MHz increase in GPU core clock. That's it.

Shockingly, on January 7, 2014, AMD simply rebadged the HD8970M as the M290X. Performance and feature set were identical to the HD8970M, which itself was just 6% faster than the HD7970M from May 1, 2012.

Essentially, the only benefit of waiting an extra 1.5 years from May 1, 2012 was the lower cost of GDDR5, which allowed the PS4 to go from 4GB to 8GB. One year after the PS4 launched, we now have the M295X, which is probably even slower than the 970M. And the increase in memory bandwidth from the 7970M's 153GB/sec to 176GB/sec was simply a waste of money and power; neither the CPU nor the GPU can benefit from that much bandwidth. It's just marketing.
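
As a rough sanity check on those bandwidth figures, here is a quick sketch (assuming a 256-bit GDDR5 bus for both, with effective data rates of roughly 4.8 Gbps per pin for the 7970M and 5.5 Gbps for the PS4's memory):

```python
# Rough GDDR5 bandwidth check: GB/s = effective rate per pin (Gbps) * bus width (bits) / 8.
# Assumed figures: 256-bit bus for both; ~4.8 Gbps (7970M class) and 5.5 Gbps (PS4).
def gddr5_bandwidth_gb_s(rate_gbps_per_pin: float, bus_width_bits: int) -> float:
    return rate_gbps_per_pin * bus_width_bits / 8

print(gddr5_bandwidth_gb_s(4.8, 256))  # ~153.6 GB/s, 7970M class
print(gddr5_bandwidth_gb_s(5.5, 256))  # 176.0 GB/s, PS4
```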

Also, NotebookCheck claims the M295X has a TDP of 125W, versus 100W for the 7970M/8970M/M290X, which suggests it could have been a problem fitting it into today's PS4.

The biggest bottleneck is the CPU, though. Jaguar is still stuck at around 2GHz for low-power applications, and AMD has not moved past 28nm. They would have had to pair one of their FX-8000/9000 series CPUs, or even the 6000 series, with the M295X. There is no way they would have hit a $399 price with those components, considering how much more power the FX-8000/9000/6000 CPUs use.

PS4 in total peaks at 180W.

A single FX6000/8000 series CPU is already near that level or exceeding it.
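
A back-of-the-envelope sketch makes the point; the 125W number is the FX-8350's rated TDP, and the 85% supply efficiency is only an assumption:

```python
# Rough power budget: what an FX-class desktop CPU would leave for the GPU,
# memory, drives and fans inside a ~180W console envelope.
ps4_peak_wall_power_w = 180   # total system peak cited above
fx8350_tdp_w          = 125   # AMD's rated TDP for the FX-8350
psu_efficiency        = 0.85  # assumed power-supply efficiency

budget_for_components = ps4_peak_wall_power_w * psu_efficiency
left_over = budget_for_components - fx8350_tdp_w
print(f"~{left_over:.0f} W left for GPU, RAM and everything else")  # ~28 W
```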

[Image: power consumption chart]


So no, it was simply not possible to build a much faster console without an Intel CPU or a custom IBM design. A custom IBM design was not really an option considering how expensive the Cell was and how much money Sony lost on the PS3. That leaves a Core i5/i7, because a Core i3 is not much better than 8 Jaguar cores, and an Intel Core i5 is not going to be cheap, forcing Sony well past a $400 MSRP. So there was little Sony could do to improve CPU performance by 50-100%. Adding a faster GPU wouldn't have solved the CPU bottleneck, and going with a much faster AMD CPU plus the M295X would have blown their power budget and pushed the console to $500-600 for sure. The Xbox One is now just $350 with 2 games.

The way to solve this is to update consoles every 5 years, like it was when I was growing up, instead of dragging their lifetime out to 7-8 years as was the case with the PS3/360. I wouldn't mind at all if the PS5 came out in 4 years, but they are probably going to milk this generation for another 7-8 years as well. Game development is too costly now and it takes longer to make games, so they need a large userbase to make development worthwhile. With 7-8 year lifecycles, they can sell games to a 70-80 million userbase by the end of the console's life, with a ton of profit.
 
Last edited:

CakeMonster

Golden Member
Nov 22, 2012
1,630
810
136
The way to solve this is to update consoles every 5 years, like it was when I was growing up, instead of dragging their lifetime out to 7-8 years as was the case with the PS3/360. I wouldn't mind at all if the PS5 came out in 4 years, but they are probably going to milk this generation for another 7-8 years as well. Game development is too costly now and it takes longer to make games, so they need a large userbase to make development worthwhile. With 7-8 year lifecycles, they can sell games to a 70-80 million userbase by the end of the console's life, with a ton of profit.

I'm not a business school graduate, but why do they need such a large user base this time around if they are not losing money on the hardware? By all measures, shouldn't they be able to afford a shorter cycle this generation compared to the last?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
I'm not a business school graduate, but why do they need such a large user base this time around if they are not losing money on the hardware? By all measures, shouldn't they be able to afford a shorter cycle this generation compared to the last?

They still lose money on the hardware. People easily confuse BOM with total cost.

Also, both companies have billions to recover from the old consoles. Plus, they actually want a profit this time around, rather than another money sink.

If there is no profit, then these will be the last consoles. Both divisions are quite unpopular with stockholders.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I'm not a business school graduate, but why do they need such a large user base this time around if they are not losing money on the hardware? By all measures, shouldn't they be able to afford a shorter cycle this generation compared to the last?

It's not only about Sony/MS/Nintendo but about developers. Look at the Wii U: Nintendo decided to release it earlier than MS/Sony, underpowered it, and 3rd-party developers hardly care for it. The cost of developing games for it isn't justified compared to just porting games to the PS4/XB1.

Let's say in 4 years the PS4 has 50 million users and the XB1 35 million. If MS releases an XB2, or Nintendo a Wii 3, too early, most developers will still focus on the 85 million they can already sell to. You now have to be very careful about when you launch a new console, or you'll pull a Sega Dreamcast/Wii U.

Think about it: if the XB1 is already $350 and the PS4 is $400, in 3-4 years these consoles will be $199-249, while a new XB2 or PS5 would launch at $400-500. In 3-4 years hardware will not have advanced 10X, which is what it would take to replace the PS4 at $399; you now have to wait much longer for CPU and GPU hardware to increase 10X in power. Combined with the diminishing returns of graphics and the billions Sony and MS lost on the PS3/360, I can see this generation lasting until 2019-2020.
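
How long a 10X jump takes depends entirely on the yearly rate of improvement; a quick compound-growth sketch with purely illustrative rates:

```python
# Years needed for hardware to get 10x faster at a given yearly improvement rate.
import math

for yearly_gain in (0.20, 0.30, 0.40):  # illustrative rates, not predictions
    years = math.log(10) / math.log(1 + yearly_gain)
    print(f"{yearly_gain:.0%}/year -> ~{years:.0f} years to reach 10x")
# 20%/year -> ~13 years, 30%/year -> ~9 years, 40%/year -> ~7 years
```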

And if Sony/MS are thinking of going with an APU design again, it is highly likely they will consider AMD once more for the XB2/PS5. Look how hot and power-hungry AMD CPUs are, and they only just released a true performance increase over the 2.5 (!) year old HD7970M; the M295X is not even 2X the performance of the HD7970M. It's going to take AMD a long time to increase the PS4's GPU 10X in power at the pace they are moving. Neither Sony nor MS has even shrunk the existing APUs 1 year later.

Look at memory too. We went from 256/512MB to 8GB, a 16X increase. The next console generation will need 64-128GB of VRAM and incredible bandwidth to drive native 4K. We are a long way from the PS5, for technological reasons, because of pressure from Sony/MS shareholders to recoup R&D and make profits, and because developers want to make a lot of money selling to 80-150 million users.
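
The 16X multiplier, carried forward one more generation, is where a figure like 128GB comes from:

```python
# Memory per console generation if the ~16x jump repeats once more.
last_gen_mb = 512                       # PS3/360 class
this_gen_gb = last_gen_mb * 16 / 1024   # = 8 GB (PS4/XB1)
next_gen_gb = this_gen_gb * 16          # = 128 GB, the upper bound quoted above
print(this_gen_gb, next_gen_gb)         # 8.0 128.0
```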

Nintendo is in one of the worst positions, since their console is already unpopular and badly outdated. But even if they released something 2X more powerful than the PS4 in 1-2 years, 3rd-party developers would mostly ignore the extra power, since the smaller userbase wouldn't justify taking advantage of it.
 
Last edited:

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Why all the flak for Ubisoft? Just because they came somewhat clean.

What about all the other devs with 30FPS, black bars and utterly shitty dumbed down games filled with uncompressed junk?

The entire console generation is a disaster. And it's not something that will magically turn around, because this time the processing power simply isn't there.


So you're saying there will never be a game that has more happening on screen, improved graphics, or better AI than the games available for the consoles now? Because that would pretty much have to be the case if the hardware is already maxed out and it isn't just a matter of developers getting used to working with the new consoles, right?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
So you're saying there will never be a game that has more happening on screen, improved graphics, or better AI than the games available for the consoles now? Because that would pretty much have to be the case if the hardware is already maxed out and it isn't just a matter of developers getting used to working with the new consoles, right?

That's not what I said. They will improve slowly, like PC games. But there isn't any large magic untapped potential waiting this time.

They started in the lower end this time, with a known platform.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
What about all the other devs with 30FPS, black bars and utterly shitty dumbed down games filled with uncompressed junk?

The entire console generation is a disaster. And it's not something that will magically turn around, because this time the processing power simply isn't there.

I was going to reply to the junk you said earlier but figured it wasn't worth it. But here you are again, spouting rubbish.

You keep mentioning black bars as if they're the norm. The Evil Within does not represent the next-gen consoles; the game is atrocious. Other games do not have black bars. I can't think of even one.

There are a ton of 60fps games, 1080p games, and 60fps 1080p games on the next gen consoles. Are you not paying any attention whatsoever?

And finally, without taking into account everything else that goes into a game's visual fidelity, final resolution and frame rate are an absolutely lousy metric for measuring how good a game looks. It's possible, and not even uncommon, for a 900p 30fps game to look better than a completely different game at 1080p/60.

And then a lot of these games, such as the PC versions of Ghosts, Titanfall, Watch Dogs and Ryse, do not look a whole lot better on PC than on console, regardless of your rig.

The diminishing returns of increased processing power versus increased visual fidelity were touched on quite thoroughly earlier in the thread. That is why new games aren't blowing you away anymore; it doesn't have anything to do with the consoles.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I was going to reply to the junk you said earlier but figured it wasn't worth it. But here you are again, spouting rubbish.

You keep mentioning black bars as if they're the norm. The Evil Within does not represent the next-gen consoles; the game is atrocious. Other games do not have black bars. I can't think of even one.

There are a ton of 60fps games, 1080p games, and 60fps 1080p games on the next gen consoles. Are you not paying any attention whatsoever?

Pretty much. It's ludicrous to conclude that the PS4 is maxed out based on some of the most unoptimized turds to have come out in recent history:

The Evil Within, Watch Dogs and (most likely) AC Unity.

Let me put it this way: The Evil Within looks worse than Crysis 1 from 2007 and, wait for it, chokes on an i7-3770K @ 4.3GHz and a GTX 980.

"Is it possible to run The Evil Within on PC at a locked 1080p60? Not even an overclocked i7 and the fastest GPU on the planet can manage it. In the video below, you'll find a broad overview of the components we tested and the results gained. In addition to the GTX 750 Ti, 760 and 980, we've also played the game on other enthusiast-level top-line GPUs, including the GTX 780 and the AMD Radeon R9 290X - and unfortunately none of them sustain anything approaching a flawlessly smooth 1080p60 update."

https://www.youtube.com/watch?v=FPfmi-jcAg4
and
http://www.eurogamer.net/articles/d...possible-to-run-the-evil-within-pc-at-1080p60


The game is an absolute joke of an optimization example. The GTX 980 drops as low as 38 FPS at 1080p. Atrocious. The id Tech 5 engine has been trash from day one in every game built on it. As far as Ubisoft goes, maybe Far Cry 3 and Watch Dogs will run smoothly in 2020.
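
For context, the frame-time arithmetic behind those numbers (standard conversion, nothing specific to this game):

```python
# Frame-rate to frame-time: why a dip to 38 fps blows a 60 fps target.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

budget = frame_time_ms(60)   # ~16.7 ms per frame for a locked 60 fps
dip    = frame_time_ms(38)   # ~26.3 ms per frame at the reported low
print(f"{budget:.1f} ms budget vs {dip:.1f} ms ({dip / budget - 1:.0%} over)")
```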
 
Last edited:

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
That's not what I said. They will improve slowly, like PC games. But there isn't any large magic untapped potential waiting this time.

They started in the lower end this time, with a known platform.


I don't believe many games have been made to take advantage of hUMA or coded explicitly for GCN. Not to mention that until fairly recently, even on the PC, seeing a game effectively use more than a couple of cores was somewhat of a rarity. I don't see why there isn't performance left to be had as developers work more with the hardware.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Not really, because if they chose the GPU for DirectCompute it would have meant:

1) Faster performance on the GCN 290X over the 970/980, which conflicts with their GW agreement with NV;

Not really. First off, DirectCompute is no longer dominated by AMD. The GTX 980 and 970 have very strong compute performance now.

Also, you don't need strong DirectCompute performance for something as simple as data decompression/unpacking. Even Kepler should have no performance issues with that using DirectCompute.

2) It required a lot of extra optimization, which means more time and money. Since Ubisoft wanted to hit the holiday release mark, and their game ran at just 9 fps 9 months ago, they probably had room to optimize more, but management forced them to release it for the holiday 2014 season. There is no way the PS4's 50% extra graphics performance (and that means 2x the ROPs, nearly 2x the TMUs and 50% more shaders) went completely unused for even better textures, shading effects and anti-aliasing, at least. They could have at least used those shaders for 'free' SSAA.

This is a more likely scenario. They didn't have the time to fully optimize for the CUs. This game has been in development for 3 years now, which means it started before the PS4's and Xbox One's specs and capabilities were even known.

What is the last modern PC game that used pre-baked global illumination instead of dynamic global illumination? Why in the world is Ubisoft using a pre-baked lighting model for a next-generation PC game while at the same time calling for ludicrous specs like a GTX 680 as the minimum!?

Because pre-baked lighting potentially looks much better, as they don't have to worry about whether the machines can handle it. Doing the kind of global illumination found in AC Unity in real time would require tremendous processing power, certainly out of reach of the PS4 and Xbox One, and probably of all but high-end gaming PCs with SLI/CrossFire.

The only way Ubisoft can be right that they have "crazily optimized for current-gen consoles" and tapped out the full power of the consoles is if no game released for the PS4 in the next 6-7 years blows AC Unity away. Do you actually believe that? :biggrin:

Tapping out the PS4 and Xbox One isn't difficult, especially on the CPU side. Lots of developers have been complaining about being CPU-bottlenecked, so it's not just Ubisoft.

And unless a Ubisoft dev comes forward and says whether or not they are using the compute units, we will never know for sure, so it's just speculation on our part.

You will see: once Naughty Dog starts optimizing for the PS4, it'll blow away games like Bloodborne and AC Unity graphically, because they'll start using the PS4's low-level API and using GCN for compute too.

The PS4 is already using a low-level API, and so is the Xbox One. Using a low-level API isn't magic; the PS4 and Xbox One will still be very limited by shader count, bandwidth and ROPs, just like anything else.

The problem with Ubisoft is that no game they ever made even came close to Metro LL or Crysis 3 in graphics, and yet their games require top-of-the-line hardware to run well. There is a huge disconnect, time and time again, with Ubisoft's optimization.

Comparing large open world games like AC Unity to linear shooters like Metro LL and Crysis 3 isn't really practical. They are completely different.

AC Unity is a far bigger game than any shooter, with lots more to process and render. Personally, I think AC Unity is way more technically advanced than Metro LL or Crysis 3. They aren't even remotely comparable.

As for optimization, Ubisoft's track record is spotty, I agree. AC IV ran very well for me, and so did Far Cry 3. But Watch Dogs was a disaster, which I think was due to how inefficient the engine is, even on the current-gen consoles; the Disrupt engine simply wasn't ready for prime time. To their credit, though, Ubisoft has been releasing patches to improve the game's performance and playability, so at least their support is decent.

Luckily, AC Unity uses a different engine, an updated version of AnvilNext, which has a proven track record, so it should perform much better than the Disrupt engine.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
no one should trust ubisoft at this point. they are doing everything for the $$$...

Yup. Ubisoft is the worst developer out there for PC. Trust nothing they say and buy nothing they peddle imo.
 
Last edited:

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
And finally, without taking into account everything else that goes into a game's visual fidelity, final resolution and frame rate are an absolutely lousy metric for measuring how good a game looks. It's possible, and not even uncommon, for a 900p 30fps game to look better than a completely different game at 1080p/60.
Only if you have a 1600x900 monitor hooked up, where 1080p content would have to be oddly downscaled to 900p, or 720p upscaled. Not using common native display resolutions, in the era of discrete-pixel displays, is a design failure.
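
The pixel counts and per-axis scale factors behind that point, worked out for the standard resolutions:

```python
# Pixel counts and the per-axis factor needed to scale each to a 1080p panel.
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}

for name, (w, h) in resolutions.items():
    megapixels = w * h / 1e6
    scale_to_1080p = 1080 / h
    print(f"{name}: {megapixels:.2f} MP, {scale_to_1080p:.2f}x per axis to 1080p")
# 720p: 0.92 MP, 1.50x ; 900p: 1.44 MP, 1.20x ; 1080p: 2.07 MP, 1.00x
```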
 

TheSlamma

Diamond Member
Sep 6, 2005
7,625
5
81
I saw an article back in January that said PC gaming is now outselling consoles worldwide. With things like the Steam Box and that new Alienware console coming out, I honestly think Sony and MS shot themselves in the foot on this one. People moved to PC as the PS3 and 360 got long in the tooth. Now they are starting new consoles and are already further behind than when the last generation came out.

I could be wrong, but I think in the long run this will play out just fine, if not better, for PC gamers as more and more people move to the platform.

Of course, there is that huge surge of trolls we're seeing now in places like Steam and the PC forums, especially from people who shouldn't be building their own PCs and then complain when the games don't run or crash.
 

el etro

Golden Member
Jul 21, 2013
1,584
14
81
AMD does have better GPUs (but not better CPUs) to put into console hardware. The console makers chose the cat cores (Jaguar) plus Pitcairn because they knew the console's peak TDP would not be too demanding to cool and to power (hungrier components need a better cooling system and a more complex power delivery system to be built).
That's why they targeted both consoles at a ~150W peak power rating.


Going from 7800 GTX die size (smaller than G92) and power consumption to Tahiti die size (close to G80) and power consumption is a considerable difference. Both console makers made the choice that is easier to cool, power, and shrink (PS4 and XB1 slims are already in the works).

[Image: GPU die size comparison chart]
 
Last edited:

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
Has anyone here seen the GCN optimization in Ryse: Son of Rome?

http://gamegpu.ru/action-/-fps-/-tps/ryse-son-of-rome-test-gpu.html

A single 290X beats 2x 780 Ti SLI, which is ridiculous. It even cracks the 980. If this is what future ports look like, I'd like to see Nvidia's response.

Also, AC Unity has likely been in development for years; I will be very surprised if it runs as smoothly on PC as Mordor does. If Unity were being developed now, with the hardware we now have, it would likely have a far better optimised engine.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Has anyone here seen the GCN optimization in Ryse: Son of Rome?

http://gamegpu.ru/action-/-fps-/-tps/ryse-son-of-rome-test-gpu.html

A single 290X beats 2x 780 Ti SLI, which is ridiculous. It even cracks the 980. If this is what future ports look like, I'd like to see Nvidia's response.

Also, AC Unity has likely been in development for years; I will be very surprised if it runs as smoothly on PC as Mordor does. If Unity were being developed now, with the hardware we now have, it would likely have a far better optimised engine.

SLI isn't working. You may have missed that a single 780 Ti performs the same as the SLI setup. Or even better: the 290X outperforms the 295X2, since CrossFire wasn't working either.

And a patch has since increased Nvidia performance.

http://www.gamepur.com/news/16691-n...ow-improves-gpu-performance-nvidia-cards.html
 
Last edited:

MisterLilBig

Senior member
Apr 15, 2014
291
0
76
None of the games coming out are optimized.

The CPU, GPU, RAM, and all the additional hardware, features and new APIs can't possibly be used as well as possible in such a short period of time.

Just because the CPU is x86, the GPU is GCN and the RAM increase is incredible does not mean that developers know how to use it all efficiently.

The first console games are simple ports with added features; it has always been like that, as weird as it might sound to some. The Xbox One didn't even ship with a low-level API. That's why DX12 will be supported on it; that is its low-level API.


X1/PS4 aren't maxed out yet. No way they could be.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126

1. That's projected hardware sales by 2017. We are already at $21 billion in 2014. The growth for the next 3 years is small.

2. It's reasonable to believe that PC hardware sales will exceed console hardware sales. The PS4 costs $400, the XB1 $350, and the PS3, 360 and Wii are all ~$100-200. Modern GPUs now cost $300-700 at launch, and 13-15 million add-in graphics cards are sold per quarter.

3. The more important metric is software sales. If you take away Blizzard and the major MMOs, console sales of AAA games destroy PC sales. Sports games like FIFA 2015 make BF4's sales look like a joke in comparison; in its launch week, FIFA 2015 will sell as much as BF4 did in months across all platforms.