Are next-gen consoles gonna be gimped by their weak CPUs?


futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
Especially when the majority of PC gamers are only rocking dual core systems and Intel HD 3000 graphics.

Most PC users have those computers. Most PC gamers I know have fairly decent systems, ranging from GTX 560s to multi-GPU configurations.
 

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
Most PC users have those computers. Most PC gamers I know have fairly decent systems, ranging from GTX 560s to multi-GPU configurations.

But most PC gamers in general do not, and they hold back everyone else. As always, you program for the lowest common denominator.
 

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
Yet anybody running those dual core HD4000 systems is going to have a hell of a time playing PC games like The Witcher series, Stalker series, Crysis series, Far Cry series, GTA series, etc...

So which lowest common denominator are they programming for?

I think they are doing their best to push the PC end. They are limited by the engines and the CORE MECHANICS of the game play that must fit into the capabilities of the 7 year old hardware of current gen consoles.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
The CPUs in the next-gen consoles are eight tablet-class cores welded to the GPU to form the APU that powers these things. Single-threaded performance is likely to be very low, forcing developers to come up with creative ways to multithread the engine so they can get decent performance out of 8 weak cores.

What do you guys think about that?
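For what it's worth, the kind of restructuring being described looks roughly like this toy sketch: instead of one thread walking every game entity, the frame's work gets chopped into chunks and spread across a pool of workers sized to the core count. Everything here (the entity update, the chunk size, the 8-core count) is just illustrative, not any real engine's code.

```python
from concurrent.futures import ThreadPoolExecutor

NUM_CORES = 8  # the Jaguar core count being discussed

def update_entity(pos, vel, dt):
    """Trivial per-entity integration step (placeholder for real work)."""
    return pos + vel * dt

def update_world(positions, velocities, dt):
    """Update all entities by splitting them into chunks across workers."""
    chunk = max(1, len(positions) // NUM_CORES)

    def worker(start):
        # Each worker handles an independent slice, so no locking is needed.
        return [update_entity(p, v, dt)
                for p, v in zip(positions[start:start + chunk],
                                velocities[start:start + chunk])]

    with ThreadPoolExecutor(max_workers=NUM_CORES) as pool:
        results = list(pool.map(worker, range(0, len(positions), chunk)))
    # Stitch the per-chunk results back into one list, in order.
    return [p for batch in results for p in batch]

positions = [float(i) for i in range(100)]
velocities = [1.0] * 100
new_positions = update_world(positions, velocities, 0.016)
```

The catch, of course, is that real engine work isn't this independent, which is exactly why it takes "creative ways" to multithread it.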



"At the Ubisoft E3 event, the PC version of The Crew was running at 30 frames per second, but the first working compilation of the PS4 codebase wasn't quite so hot, operating at around 10fps."

http://www.eurogamer.net/articles/digitalfoundry-how-the-crew-was-ported-to-playstation-4


30 fps on the PC and 10 fps on the PS4. Sounds like Ubisoft has far deeper problems if the PC version is only running at 30 fps....
 

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
Yet anybody running those dual core HD4000 systems is going to have a hell of a time playing PC games like The Witcher series, Stalker series, Crysis series, Far Cry series, GTA series, etc...

So which lowest common denominator are they programming for?

I think they are doing their best to push the PC end. They are limited by the engines and the CORE MECHANICS of the game play that must fit into the capabilities of the 7 year old hardware of current gen consoles.

They could easily play them at the low resolutions most people play at.
 

purbeast0

No Lifer
Sep 13, 2001
53,038
5,920
126
Racing games like Forza have got to be the easiest type of game to deal with CPU-wise. Almost nothing going on, really; that's a GPU-dependent game through and through.

Which is exactly the point I was making: it's all relative, based on what devs want to do with their games.
 

Malak

Lifer
Dec 4, 2004
14,696
2
0
Right.

I'm wondering about the impact on game design mostly. Things like physics, LOD scaling, AI, environment interactivity, etc. These are all heavily impacted by CPU performance, are they not?

Current gen doesn't have problems with any of that. Next gen has way more memory. Everything will be better because of the memory, not better CPUs. Memory is what gimped current gen.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
This is a troll thread, right? Is this where PC snobs troll about consoles?...yawn....:rolleyes:

If you think this isn't a common occurrence with pre-alpha developer builds, you're reading into things way too much. Besides, the PC version is running at a scant 30 fps. That should tell you how laughably early in development the game is.
 

KentState

Diamond Member
Oct 19, 2001
8,397
393
126
One game in an alpha state does not make the future of console gaming. Even Crysis 3 plays at a higher fps at 1440p at ultra settings on my system. This game in particular didn't blow me away so I doubt it's optimized in any way.
 

Spjut

Senior member
Apr 9, 2011
931
160
106
I think it'll take at least 2-3 years for the studios to begin getting their engines optimized for the PS4/Xbone, but Crysis 3 and BF3 have already been shown to benefit from more than four cores.

Most multiplats will probably still be relatively meh until support for the 360 and PS3 is dropped.

Multiple developers have also said that the CPU overhead on PC is huge, and that's where the consoles' lower-level APIs benefit the most.
Capcom once compared the Xbox 360's CPU to a Pentium D; it's pretty impressive that games like Crysis 3 are even running on those old boxes.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
Why has no one mentioned that power creates heat, and consoles are known for having heat issues as of late? It's not like a PC where you can watercool it or add 20 fans to the case. They still need to be able to operate in people's cabinets.

With today's technology you can only make consoles so fast and tolerant of so much heat. They already learned the issues from last gen.

As stated, these weren't designed for games to maximize all cores; they were designed for multitasking (hell, even PCs are designed that way). They were also designed to be as cheap as possible. Really, I don't see a problem with it (other than the marketing speak that they are as good as top-of-the-line computers). The advantage they have is what has been stated repeatedly (and every gen): they are a single hardware spec that can be optimized for over the life of the console. They don't need to hit a moving target to get the best out of it, which in turn can give them some performance boosts.

I tend to agree. The only other tech AMD could have offered was the Piledriver cores, and they are power hungry. With the next-gen consoles needing multiple OSs, "hypervisors", overlays, background downloading, streaming video, capturing gameplay, and all that other secondary stuff, they had to go with more cores. If you want to hear something really depressing, the devs probably only really see 4 of those Jag cores.
 

Bman123

Diamond Member
Nov 16, 2008
3,221
1
81
As long as they make good use of those cores then all should be good. I still think they should have gone for a higher clock speed, though. These new systems aren't that small, and damn near the whole back of the PS4 is open. A decent fan in there and heat shouldn't be a problem.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
As long as they make good use of those cores then all should be good. I still think they should have gone for a higher clock speed, though. These new systems aren't that small, and damn near the whole back of the PS4 is open. A decent fan in there and heat shouldn't be a problem.

They also wanted a quiet system, so a super high performance fan was not an option. Big and slower is what they went with on the XB1.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
That was a good read filled with some interesting technical tidbits. Since a different team is working on the PS4 version, I'm rather interested to see the PS4 vs. X1 match-up. Oh, and I like this part...

Another key area of the game is its programmable pixel shaders. Reflections' experience suggests that the PlayStation Shader Language (PSSL) is very similar indeed to the HLSL standard in DirectX 11, with just subtle differences that were eliminated for the most part through pre-process macros and what O'Connor calls a "regex search and replace" for more complicated differences.
I like how they make the bolded part sound like some crazy, esoteric thing. I use 'em quite often at work, and I almost always have Microsoft's Visual Studio Regex Reference Page open, as not all implementations share the same syntax. Also, Visual Studio's escape character ('\') is rather lame, as it doesn't work with parentheses. So you're pretty much forced to use :ps and :pe, and since you can't use them in the replace string, you have to encapsulate the parentheses (i.e. put { and } around them) and use the inject feature (\ with a number representing which encapsulation) to put them back. Fix yer #$@%, Microsoft! :|

Besides, the PC version is running at a scant 30 fps. That should tell you how laughably early in development the game is.

The one issue, though, is what "PC" refers to. PCs aren't like consoles, where the hardware is mostly static (storage tends to change). I doubt they're running it on a dog of a PC, but it would be good to know what it took to get 30 FPS.
 

Bman123

Diamond Member
Nov 16, 2008
3,221
1
81
They also wanted a quiet system, so a super high performance fan was not an option. Big and slower is what they went with on the XB1.

The Xbox One design is better in my eyes for sure when it comes to cooling. I want to see a teardown of the systems to see what they did for cooling.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
The Xbox One design is better in my eyes for sure when it comes to cooling. I want to see a teardown of the systems to see what they did for cooling.

I thought there were already some pics of it. It's a big-ass Intel stock cooler, lol.

[image: xbox-one-hardware.jpg]
 

Rakehellion

Lifer
Jan 15, 2013
12,181
35
91
forcing developers to come up with creative ways to multithread the engine

The entire software industry has been multithreaded for the past 15 years. You can't even buy a single-core computer anymore.

1. The processor isn't that slow. It does what it's intended to do.
2. Most processing tasks, like AI and physics, run quite well on the GPU.
3. Consoles drive the market. Call of Duty, Tomb Raider, and even Crysis 3 are designed from the ground up to run on consoles first.
 

zebrax2

Senior member
Nov 18, 2007
974
66
91
Would it be feasible for them to release a console and then slightly upgrade it later on (faster clocks due to shrinks) during its lifetime, similar to what happened with the Nintendo DS?
 

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
202
106
As far as I understand, a 1.6GHz AMD Jaguar is faster in single-threaded performance than the 3.2GHz Xbox 360 CPU (whatever it was called, Xenon or Xenos or whatever).

So, if decent games were playable current gen, and next gen includes not only a bump in single-threaded performance but more than double the cores, much more memory and memory bandwidth, and massively increased GPU power, we should be good.

Also, AMD was probably the frontrunner in terms of cost, but the problem is which CPU to choose. For the same die area and energy consumption, let us assume that AMD could offer the following CPU options:
1. 8 AMD Jaguar cores
2. 1-2 Piledriver modules
3. 2-4 upgraded Stars cores

Of those, AMD Jaguar is the most modern and most efficient. Piledriver is faster in single-threaded performance owing to its clock speed, but in performance per watt it's a lot less efficient. That means increased heat output, which we know MS does not want.

2-4 Stars cores would have been an interesting alternative, but I think the cost in retooling them to get fabbed on a more advanced TSMC process would have been prohibitive. Just not worth the investment versus what Jaguar cores already offer.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
The CPU and GPU inside these consoles were chosen on the basis of cost and heat, not performance. Unlike with the 360 and the PS3, the hardware we see in these consoles is just PC parts packaged up on a custom board, and they are midrange/low-end parts. These don't rival a modern high-end PC, whereas the Xbox 360 and PS3 in their day did. The CPU is kind of weak compared to a PC CPU today (I would call those cores very low end), whereas the GPU is more midrange. That balance of resources doesn't sound too bad in a gaming machine, but on PCs today we have CPU performance issues in games, not because of draw overhead but because the games can't use many cores, and they really eat clock cycles when you are aiming for 60 or 120 fps. At 30 fps I guess CPU clock speed matters a lot less, and this is where most console games will still be targeted, but I feel that is likely because the CPU is weak, not because that was always the plan.

PC games have been trying to use multithreading and so far haven't fared any better with CPU usage than the console ports. The typical Xbox 360 game only uses about two threads' worth of processing. Despite PC games knowing they will mostly have 4 cores on offer, in practice they aren't finding it easy to use them, despite the performance problems they are having with the CPU. The industry is far from working out how to use 4 cores, let alone 6 or 8.

The developers of SimCity, for example, had a developer blog post about the technicalities of trying to multithread the game engine. In the end they determined it was too interconnected, and even though they were using individual agents (a technique for multithreading), in practice they couldn't use threads to do the work.

When I say its difficult to do I might actually mean its impossible. We aren't going to see a massive change in this soon, our tools for multithreading are terrible as are our languages. We need a significant jump in tooling and that is going to require someone with great insight to design a general purpose language that is really good for multithreading. I doubt such a thing would take off in the next 7 years, it would take about that long for the games to be ported from C++ into a new language anyway, especially considering such a language would be parallel by design and hence very different to work with. The hardware might be there but the software and techniques to use it aren't. Only trivial algorithms can currently be multithreaded well, and game engines are not one of the obviously multithreadable computations, they are too complex and varied.