For what purpose does a PS4 need 8 weak cores?


Tuna-Fish (Golden Member, joined Mar 4, 2011)
The Xbox 360 is older than the Core 2 Duo; a high-end PC was a dual-core K8 at roughly 2.4GHz when the Xbox 360 was released, and actually most PC gamers had single-core K8 and NetBurst chips at the time... the Xbox 360 CPU was somewhat impressive when released (late 2005/early 2006).

No. It wasn't. It had impressive numbers, but once you started writing code for it, the effect faded out really fast. It was the dumbest in-order speed-racer design released up to that point, and it lacked basics like store forwarding. This meant that the second you deviated from straight-line code, it got really slow. By Microsoft's own numbers, you could expect about 0.2 IPC per thread in normal game code. In the end, for most tasks the CPU was slower than the typical Pentium 4 available at the time of release. (There were a few things it could do really well, like mixing audio, but the gains were wholly lost on the parts of game code that it just ran terribly.)

At the tail end of the console cycle, advances in compilers, hand-optimization, and changes in data structures made it not suck quite so badly, but at no point during the console cycle was it better than the absolute minimum PC CPU you designed games for.

Same for the PS3 a year later: the Cell SPEs had some respectable numbers, considering it happened before GPGPU was really a thing.

The exact same was true for the Cell. It had impressive numbers but was a total dog in practice.
 

SPBHM (Diamond Member, joined Sep 12, 2012)
No. It wasn't. It had impressive numbers, but once you started writing code for it, the effect faded out really fast. It was the dumbest in-order speed-racer design released up to that point, and it lacked basics like store forwarding. This meant that the second you deviated from straight-line code, it got really slow. By Microsoft's own numbers, you could expect about 0.2 IPC per thread in normal game code. In the end, for most tasks the CPU was slower than the typical Pentium 4 available at the time of release. (There were a few things it could do really well, like mixing audio, but the gains were wholly lost on the parts of game code that it just ran terribly.)

At the tail end of the console cycle, advances in compilers, hand-optimization, and changes in data structures made it not suck quite so badly, but at no point during the console cycle was it better than the absolute minimum PC CPU you designed games for.



The exact same was true for the Cell. It had impressive numbers but was a total dog in practice.

As limited as the 360 CPU might have been, I think that at the end of 2005 it stacked up better against low-to-mid-range PCs than the XO CPU did at the end of 2013.

The late-2005 price equivalent of a late-2013 Haswell i3/FX 6300 was probably a single-core K8 at 2GHz.

As for Cell's potential, it can go as far as matching the current console CPUs for a few tasks, like this cloth simulation:

[image: cloth simulation benchmark chart]


Sure, it's massively slower under other conditions, but we are talking about something from a late-2006 console.


Well, my point was that the C2D and Nehalem didn't exist when the Xbox 360 was launched; gamers were basically using single-core CPUs at that point, and K8 was the king.

As a whole, I remember the 360 being far more impressive at launch than the PS4 was last year...
 

Blitzvogel (Platinum Member, joined Oct 17, 2010)
Cell didn't have a GPU. The PS3 did, but Cell didn't. And it wasn't bleeding edge; it was an Nvidia 7800 GTX with less bandwidth.

I was referring to the 360. And God no, the RSX wasn't bleeding edge by any stretch. Xenos, on the other hand, was for its time.....

From a performance perspective, is it reasonable to conclude that Xenon, with its excess SIMD, was a better gaming CPU than something like a 2005-era Athlon X2? It seemed like the SIMD advantage was what kept Xenon and Cell relevant as PC games began needing quad cores with titles such as GTA4 and BF3. Of course, either console would've died running a 64-player game of BF3, but so did Athlon II X2s and Core 2 Duos. Xenon being much smaller in die size yet reasonably capable of doing what it needed to do is a big plus.

I think the PS3 discussion is interesting, simply because the PS3 was so ahead of its time yet incredibly lacking. The amount of "hardware patching" Sony had to do was insane once they figured out that a dual-Cell system couldn't compare to having a real GPU. The fact that Cell really can keep up with some current PC CPUs goes to show where gaming was going. Sony and Toshiba sort of predicted it with the Emotion Engine (vector processing); it's just that CPU tech was progressing way too fast for Sony and Toshiba to keep up when the EE was put in the PS2.

The PS4 I think is a reasonable evolution of Sony's vector processing mindset. I just hope that such innovations pour into the PC space.
 
(joined Aug 11, 2008)
"In the coming years" seems a little too late.

In hindsight, I've heard it before. I don't want to keep hearing it; I want less talk, fewer promises, and more results, now, not when the consoles are 5 years old. If it takes that long to get results, it's not worth it.

Meanwhile I keep seeing people doing crazy, incredibly amazing things with a hell of an old game engine. Did you check what Egosoft did with X3:AP? That game is running on a 90s game engine that is single-threaded!!!!! It does not use more than 1 core, and I have yet to see something with 1% of that complexity running on a console.

Then I look at what the folks of the HLP community managed to do with the original 1998 Freespace game engine, which is also single-threaded: "Diaspora: Shattered Armistice" is running on that thing, improved by amateurs!

Then I look at the PS4: problems hitting 1080p, problems maintaining 30fps, "30 fps is more cinematic," quality lower than on PC even with the addition of dirty tricks. And I keep hearing how good it is to have a low-level API and lots of tablet CPU cores. Sorry, but no; I want to see results. Talking about how good it is does not cut it for me anymore.

I'm sorry if I sound aggressive, but I'm tired of this already.


Yes, I agree. The "next gen" consoles are now "current gen". It is time for some results, not just vague promises of some great advance in the future. If these things pan out, my apologies to the devs, and they can say "I told you so", but I will remain skeptical until we actually see the results in a playable game.

Edit: I can't help but remember how the consoles were touted as going to be better than a top-end PC. But they have been on the market for a year and can barely do 1080p, while gaming PCs are becoming practical at 1440p, and who knows, in a couple of years, maybe even 4K.
 
(joined Dec 30, 2004)
The Xbox 360 is older than the Core 2 Duo; a high-end PC was a dual-core K8 at roughly 2.4GHz when the Xbox 360 was released, and actually most PC gamers had single-core K8 and NetBurst chips at the time... the Xbox 360 CPU was somewhat impressive when released (late 2005/early 2006).

Same for the PS3 a year later: the Cell SPEs had some respectable numbers, considering it happened before GPGPU was really a thing.

The PS4/XO CPU has been underpowered (compared to mid-range PCs) from the start, unlike the previous gen.
I think he just likes trying to generate discussion.
 
(joined Dec 30, 2004)
Yes, I agree. The "next gen" consoles are now "current gen". It is time for some results, not just vague promises of some great advance in the future. If these things pan out, my apologies to the devs, and they can say "I told you so", but I will remain skeptical until we actually see the results in a playable game.

Edit: I can't help but remember how the consoles were touted as going to be better than a top-end PC. But they have been on the market for a year and can barely do 1080p, while gaming PCs are becoming practical at 1440p, and who knows, in a couple of years, maybe even 4K.

Yeah, 'cause we're all playing 1080p games these days :rolleyes:
 

SlickR12345 (Senior member, joined Jan 9, 2010)
1. Cost. It was/is the cheapest CPU for what they needed.

2. Power. Again, it was the least power-hungry option for the money and for what they had in mind.

3. More cores. It's better to have 8 slower cores than 4 faster cores, since in a console you can use 1 or 2 cores just for OS processing, meaning you can talk with friends, have the internet running in the background, use the streaming feature, etc...

With more cores you can do all that and still have the game use 4 cores to run well, while with only 4 faster cores the game would end up limited to 1 core, and no matter how fast it is, it will be slower.
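For what it's worth, here is a minimal sketch of what "reserving cores for the OS" actually means in practice. It assumes a Linux-style scheduler affinity API (os.sched_setaffinity) and a made-up choice of which two cores to reserve; console SDKs expose this differently, but the idea of pinning work to an explicit core set is the same.

[CODE]
import os

# Illustration only: os.sched_getaffinity / os.sched_setaffinity are Linux-only.
# The choice of which cores to reserve is hypothetical.
ALL_CORES = os.sched_getaffinity(0)      # set of core IDs this process may run on
OS_RESERVED = {0, 1}                     # hypothetical: keep 2 cores for OS/background work
GAME_CORES = ALL_CORES - OS_RESERVED     # whatever is left goes to the "game"

def pin_to_game_cores() -> None:
    """Restrict the current process (standing in for the game) to the non-reserved cores."""
    if GAME_CORES:
        os.sched_setaffinity(0, GAME_CORES)

if __name__ == "__main__":
    pin_to_game_cores()
    print("Game threads now scheduled on cores:", sorted(os.sched_getaffinity(0)))
[/CODE]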
 

Fjodor2001 (Diamond Member, joined Feb 6, 2010)
you can use 1 or 2 cores just for OS processing

Just wondering, what actual tasks do you envision those 1-2 cores would be doing while playing a game? It's not like a normal PC reserves 1-2 cores for running the Windows/Linux OS while using MS Word, playing a game or surfing the web.
 

BSim500 (Golden Member, joined Jun 5, 2013)
With more cores you can do all that and still have the game use 4 cores to run well, while with only 4 faster cores the game would end up limited to 1 core, and no matter how fast it is, it will be slower.

This gets oft-repeated but still isn't true, since if the cores are faster the per-core load will correspondingly be lower. I.e., if something eats up 50% of a slow core, it'll only eat up 17-20% of a faster one that's literally triple the speed and can handle 3x the background tasks per core instead of 1. If something uses 1.5-2 slow Jaguar cores, leaving 6 for games, it's no real difference from something using 0.5-1 fast core, leaving 3-3.5 for gaming. There's more to life than core count, and 15% of a 1.5GHz Jaguar is more like 5% of a 3.4GHz Haswell. And any console which uses 4 of its 8 cores just for "background" stuff plainly and simply needs a much less bloated OS. Even if they did use 4 of 8 (which they don't), common sense alone would solve that issue (i.e., you simply program the OS to look for and silently install updates overnight at 2-5am, then go back into standby when done, freeing up not only daytime CPU cycles but also daytime Internet bandwidth on slower / more rural connections).
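As a back-of-the-envelope check on that arithmetic (the clocks are the ones quoted above, while the per-clock advantage of Haswell over Jaguar is an assumed figure, not a measurement):

[CODE]
# Rough, illustrative arithmetic only: clock speeds are the ones quoted in the post,
# and the per-clock (IPC) advantage of Haswell over Jaguar is an assumed figure.
JAGUAR_CLOCK_GHZ = 1.5
HASWELL_CLOCK_GHZ = 3.4
ASSUMED_IPC_RATIO = 1.5   # assumed Haswell-vs-Jaguar per-clock advantage

def haswell_equivalent_load(jaguar_load_pct: float) -> float:
    """Convert '% of one Jaguar core' into '% of one Haswell core' under the assumptions above."""
    speed_ratio = (HASWELL_CLOCK_GHZ / JAGUAR_CLOCK_GHZ) * ASSUMED_IPC_RATIO
    return jaguar_load_pct / speed_ratio

print(f"15% of a Jaguar core is roughly {haswell_equivalent_load(15):.1f}% of a Haswell core")
[/CODE]

Under those assumptions the 15% Jaguar figure comes out around 4-5% of a Haswell core, which is the point being made.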

Just wondering, what actual tasks do you envision those 1-2 cores would be doing while playing a game? It's not like a normal PC reserves 1-2 cores for running the Windows/Linux OS while using MS Word, playing a game or surfing the web.

Indeed. I'm still bewildered at the "visions" some have of PCs with "fewer faster cores" suddenly idling at 50-75% background CPU usage the day they personally bought a 6-8 "many slow core" CPU / console... I'm not even seeing 5% on the slower i3-based HTPC of my two rigs, with a web browser and 50 tabs open + HTPC software running (NextPVR) + Word + Adobe Reader open, let alone on the main i5 gaming rig. In fact, a while ago I posted a screenshot of taking an i5-3570, disabling 2 cores in the BIOS, forcing the clock speed down to 1.6GHz (essentially mimicking a 1.6GHz Pentium that's half the speed of a G3258), then running a 1080p YouTube clip whilst refreshing a web browser whilst playing an MP3 whilst recording a TV programme to disk from a DTT tuner, and CPU usage was still only 26% (i.e., half of one 1.6GHz Haswell core, or 1/4 of one full-speed Haswell core). So God knows what genuinely "average" background tasks people are doing to rack up 3x Haswell cores (the equivalent of 9x 1.5GHz Jaguar cores before you even load a game) which 2x Jaguar cores will supposedly magically handle...
 
(joined Dec 30, 2004)
YouTube is just sending data to the GPU for decoding.
MP3 takes about 5MHz worth of CPU to decode and play.
Recording the TV programme might be a load worth mentioning (unless you have built-in hardware encoding).
Refreshing the web page is the heaviest load there.

In other words, all of those were possible, and I regularly did them on my AMD Sempron 64 @ 2.3GHz.
 

njdevilsfan87 (Platinum Member, joined Apr 19, 2007)
Russian, I agree with almost everything you said about the PS3 and 360. Them being "more powerful" than PCs at the time? Then why were they running at less than 720p with no filtering, at 30fps max, while PCs were moving on to 1680x1050 and higher resolutions with AA, AF, and better frame rates?

Consoles have for a long time now tried to hide behind low resolution, lower-resolution textures, lower LOD, little to no filtering, and low framerates with "similar" graphics to the PC (when the images were shrunk to hide some of the console ugliness) to try to boast about their power. Now they're finally being exposed for it. But no worries, it's more cinematic that way. :p

It's the result of Sony and MS wanting to turn a profit from day one. The 360 was selling at a loss for quite some time. I'm not sure if the PS3 was, since that came out later and for a little bit more.
 

Blitzvogel (Platinum Member, joined Oct 17, 2010)
Russian, I agree with almost everything you said about the PS3 and 360. Them being "more powerful" than PCs at the time? Then why were they running at less than 720p with no filtering, at 30fps max, while PCs were moving on to 1680x1050 and higher resolutions with AA, AF, and better frame rates?

And it took a while for PC games to really pull away from the consoles. It was both a multiplatforming issue and one where the PC really was challenged in terms of prowess. Crysis and strategy games are the only notable examples where the PC really showed its stuff in the first couple of years. Both consoles were ROP-limited; however, in the case of Xenos, it was almost perfect for 720p rendering, with very fast z-buffering and MSAA. I wouldn't play up the PS3, however, since it was a year late to the party. The 360, on the other hand, WAS equivalent to high-end gaming PCs in 2005 when you take 720p resolution into account. And not everyone bought an 8800 GTX or GTS when it came out in November of 2006. I think console gamers got quite a bit of value out of their systems over the past 9 years. I'm a PCer, but I won't deny how successful both consoles were at swaying gamers from the PC world, most notably when you consider great console exclusives, ease of use, and a decently equivalent experience to the PC.
Consoles have for a long time now tried to hide behind low resolution, lower-resolution textures, lower LOD, little to no filtering, and low framerates with "similar" graphics to the PC (when the images were shrunk to hide some of the console ugliness) to try to boast about their power. Now they're finally being exposed for it. But no worries, it's more cinematic that way. :p

Some games rely on being "cinematic", and for those I honestly do prefer 30 FPS (at least for cutscenes). You wouldn't want to watch a movie in 60 FPS, would you? Not all games are FPS or racing titles where 60 FPS really is desired.

It's the result of Sony and MS wanting to turn a profit from day one. The 360 was selling at a loss for quite some time. I'm not sure if the PS3 was, since that came out later and for a little bit more.

I think it's certainly healthier to sell at a profit or with minimal subsidizing. The PS4 and Xbone have been successful so far, and while they may be limited versus current PC hardware, plenty of console gamers are happy to see a new generation and improvements over what they saw last gen. Things will only get better as native PS4 and Xbone titles hit the shelves. The long previous generation also helped the PC, as gamers started to learn once again that it is the home of the highest framerates, resolutions, and customization. I think this generation will be much like the last, except the PC ecosystem will be much healthier due to the upsurge in PC gaming thanks to YouTube stars, the PC holding the performance lead by a huge margin from the get-go, and the continuing rise of indie and small studios.
 

SPBHM (Diamond Member, joined Sep 12, 2012)
The Xbox 360 is 2 months older than the X1900 XT... the 6600 GT was still a nice gaming card when the 360 was launched.
People here are talking as if the Xbox 360 was launched in 2008...
 

Enigmoid (Platinum Member, joined Sep 27, 2012)
Some games rely on being "cinematic", and for those I honestly do prefer 30 FPS (at least for cutscenes). You wouldn't want to watch a movie in 60 FPS, would you? Not all games are FPS or racing titles where 60 FPS really is desired.

Personally, I want to puke my guts up every time some movie pans around the scene at 24 fps. I can't look at the screen; it's nothing but blur.

Most people are saying this because they are used to 24 fps; it's the standard and thus is 'normal'. Had 60 fps become the norm, people would say the same about 24 fps.
 

Blitzvogel (Platinum Member, joined Oct 17, 2010)
Personally, I want to puke my guts up every time some movie pans around the scene at 24 fps. I can't look at the screen; it's nothing but blur.

Most people are saying this because they are used to 24 fps; it's the standard and thus is 'normal'. Had 60 fps become the norm, people would say the same about 24 fps.

Like any standard, it came down to economics, feasibility, engineering, etc. back when celluloid film was developed, and that has dictated what people are now used to. Even now, though, getting double or more the framerate requires twice as much processing power, memory, etc., much like back then it required twice as much film, faster film spooling, and faster shutter speeds.
 

SlickR12345 (Senior member, joined Jan 9, 2010)
This gets oft-repeated but still isn't true, since if the cores are faster the per-core load will correspondingly be lower. I.e., if something eats up 50% of a slow core, it'll only eat up 17-20% of a faster one that's literally triple the speed and can handle 3x the background tasks per core instead of 1. If something uses 1.5-2 slow Jaguar cores, leaving 6 for games, it's no real difference from something using 0.5-1 fast core, leaving 3-3.5 for gaming. There's more to life than core count, and 15% of a 1.5GHz Jaguar is more like 5% of a 3.4GHz Haswell. And any console which uses 4 of its 8 cores just for "background" stuff plainly and simply needs a much less bloated OS. Even if they did use 4 of 8 (which they don't), common sense alone would solve that issue (i.e., you simply program the OS to look for and silently install updates overnight at 2-5am, then go back into standby when done, freeing up not only daytime CPU cycles but also daytime Internet bandwidth on slower / more rural connections).



Indeed. I'm still bewildered at the "visions" some have of PCs with "fewer faster cores" suddenly idling at 50-75% background CPU usage the day they personally bought a 6-8 "many slow core" CPU / console... I'm not even seeing 5% on the slower i3-based HTPC of my two rigs, with a web browser and 50 tabs open + HTPC software running (NextPVR) + Word + Adobe Reader open, let alone on the main i5 gaming rig. In fact, a while ago I posted a screenshot of taking an i5-3570, disabling 2 cores in the BIOS, forcing the clock speed down to 1.6GHz (essentially mimicking a 1.6GHz Pentium that's half the speed of a G3258), then running a 1080p YouTube clip whilst refreshing a web browser whilst playing an MP3 whilst recording a TV programme to disk from a DTT tuner, and CPU usage was still only 26% (i.e., half of one 1.6GHz Haswell core, or 1/4 of one full-speed Haswell core). So God knows what genuinely "average" background tasks people are doing to rack up 3x Haswell cores (the equivalent of 9x 1.5GHz Jaguar cores before you even load a game) which 2x Jaguar cores will supposedly magically handle...

If a game uses 6 cores, no matter how slow they are, they are going to be better at processing a game than just one core. The one core would have to work at 100%, which is when the slowdown occurs, since the information for processing is coming in faster than the CPU can process it.

With 6 cores the load will be shared, and thus the 100% limit won't be reached.

If a game is undemanding enough to run on 1 fast core, then it's fast enough to run anyway, and that's not what we are talking about.
 

Lepton87 (Platinum Member, joined Jul 28, 2009)
I just hate the 30fps cut-scenes in DA3; they're a terrible stutter-fest. If the game on the consoles looks like that all the time, it would be totally unplayable for me.

The Xbox 360 is 2 months older than the X1900 XT... the 6600 GT was still a nice gaming card when the 360 was launched.
People here are talking as if the Xbox 360 was launched in 2008...

Yes, the Xbox 360 probably had the best GPU there was when it launched, better than both the X1800 XT and the 7800 GTX. It was only surpassed by the X1900 XT, and only in brute strength, because that card could draw much more power and thus could be clocked higher; architecturally it still lagged behind. OTOH, the RSX was meh at the time of the PS3's release.
 

ctsoth (Member, joined Feb 6, 2011)
Did any of you railing against the 8x Jag cores ever consider that Sony & Microsoft did not decide on their design via some mook's arbitrary blubbery?

It strikes me as inherently obvious that Microsoft and Sony have enough first hand experience with game and software development, coupled with close third party relationships, that they designed systems that were able to meet their performance metrics at a given price.

I could see people railing against the PS4 or XB if they were totally different products as far as hardware goes, but you really ought to be capable of reading the writing on the wall if you consider yourself a discerning individual. Two competing companies decided on very similar designs at roughly the same time. If you can't imagine why on your own, I don't know what I or anyone else can do to open your eyes.

I am utterly amazed by the prevalence of arguments that read like "zomg what if they made their super magic game box out of...." Do you really think the decision makers didn't dutifully consider all products available on the market?
 
(joined Aug 11, 2008)
I am utterly amazed by the assumption that because a bunch of engineers and marketing staff made a decision, it had to be the right one. The decision may or may not have been a good choice, but the people making such decisions are certainly capable of making the wrong choice.

You have to base the evaluation of a decision on objective data, the results of the product, not just on the naive assumption that because a company made the decision it had to be the right one. Was the Pentium 4 a good CPU choice? How about Bulldozer? How about the Edsel, anybody remember that fiasco? Obviously companies considered the other alternatives when making these decisions and still made the wrong one.
 
(joined Mar 10, 2006)
I am utterly amazed by the assumption that because a bunch of engineers and marketing staff made a decision, it had to be the right one. The decision may or may not have been a good choice, but the people making such decisions are certainly capable of making the wrong choice.

I think this "made the wrong choice" risk is far greater for merchant chip vendors in a competitive market than for device vendors contracting out a chip design.

That said, you're right. It's pretty silly to assume that the product planners "must" have made the right decision because they're the product planners. I'm sure the folks at Intel wish they'd gone a different route with their mobile chips, or AMD with its "big core" designs.

Healthy discussion of alternatives -- even if those alternatives ultimately prove sub-optimal -- is key to interesting discussions on forums, in real life, and so on. If we all just said, "well, it was made this way, so no sense thinking about how it could have been done differently and what those trade-offs may have meant," then nobody would exercise their critical thinking skills and we'd all be brain-dead idiots.

I think that the proponents of the "they made the right decision because...they know more than us" aren't able to articulate what they really mean...what I think they mean is that sometimes a solution that seems like it would have been "obviously better" may not have been if we consider all of the engineering/time-to-market/budget constraints. Tech enthusiasts on forums sometimes forget that there are other considerations to these decisions.

I also find it interesting that when certain posters bring a "financial" perspective to these sorts of things (which are actually critical to product decisions), these posters are immediately labeled "stock market speculators" and "shills".
 

Cerb (Elite Member, joined Aug 26, 2000)
An i3 can't even stream to Twitch with any intensive game going on a PC. Going with a dual core is a horrible idea. With Twitch taking up most of a core on an i3, add the OS, and what do you have left?

i3? You're out of your mind thinking that's a good idea.
An i3 should not pose any bottleneck to streaming video. It should only need to provide a little memory bandwidth for the task, and manage a small amount of DMA work. What it can do on Windows, compressing video with the CPU, is not a fair way to compare.

Well apparently there are A LOT of PC gamers on our forum that think a G3258 OC is better than FX8000/9000 series for gaming and think dual core is still awesome for modern gaming.
They are ignoring minimums. There was a time when max and average framerates were never enough, so we stuck to averages. That's no longer a reasonable assumption, and at the least, min FPS / max frame time is a useful comparison point, if not a full frame-time distribution graph. These days I only bother looking for minimums if results are not presented with more data. They aren't popping up every day, but those not entranced with the performance of a Pentium had the information to make a better choice, yet succumbed to hype and low cost.
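To make the point concrete, here's a tiny sketch (with made-up frame times) of how an average can look fine while minimums and frame-time percentiles expose the stutter:

[CODE]
# Sketch of why minimums / frame-time percentiles matter more than averages.
# The sample frame times below are invented purely for illustration.
frame_times_ms = [16.7] * 95 + [50.0] * 5   # mostly smooth, with occasional 50 ms hitches

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
worst_frame = max(frame_times_ms)
p99 = sorted(frame_times_ms)[int(0.99 * len(frame_times_ms)) - 1]

print(f"average FPS:           {avg_fps:.1f}")            # looks fine (~55 FPS)
print(f"99th pct frame time:   {p99:.1f} ms")              # exposes the hitching
print(f"min FPS (worst frame): {1000.0 / worst_frame:.1f}")
[/CODE]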

(single-die, cost v. performance stuff)

I mean, we will have Witcher 3 and Uncharted 4 on the PS4. Maxing out Witcher 3 on the PC will require a GPU that costs more than the PS4 alone. The diminishing returns of the highest graphics levels beyond medium/high are extremely costly, requiring 3-4X the graphical power of the PS4's GPU for a less-than-commensurate increase in graphical quality.
So, don't use them except to submit screenshots. At better settings, you'll get better IQ by miles, and smoother gameplay, assuming the game isn't a stutter-fest on any system.
 
(joined Apr 20, 2008)
An i3 should not pose any bottleneck to streaming video. It should only need to provide a little memory bandwidth for the task, and manage a small amount of DMA work. What it can do on Windows, compressing video with the CPU, is not a fair way to compare.

It absolutely is, though. Even those who stream with an i5 often have a separate system for streaming to twitch.tv. If you're unfamiliar, you should look it up before claiming it shouldn't pose any bottleneck. It really does. You need dedicated cores for it. Even watching a StarCraft 2 stream from someone with an i3/Pentium is generally choppy, low-IQ, and lower resolution. SC2 uses only two threads, so you'd think that HT would be good enough... nope.

Anyone who streams typically has an i7, Phenom X6, or FX. It's far too choppy otherwise with any sort of demanding game. Weirdly, the consoles can do it very well with such a "slow" CPU.
 

BSim500 (Golden Member, joined Jun 5, 2013)
If a game uses 6 cores, no matter how slow they are, they are going to be better at processing a game than just one core.
Games don't use "1 core" anymore and haven't in almost a decade. And PCs & consoles still don't use the equivalent of 3 large cores for background tasks, no matter how often you repeat it. I simply don't understand where you're getting any of that "quad core = 1 core for games, 3 for background tasks" from if a game is supposed to be running on "fewer larger cores" (i.e., a "big"-core quad instead of a "small"-core octo-core CPU), as that simply isn't true on ANY platform - PC or console alike... :confused:
 

Abwx (Lifer, joined Apr 2, 2011)
Do you believe both Microsoft and Sony made a bad choice?

It was a well-thought-out design decision, for obvious reasons that don't seem to be obvious to everybody.

If one has 100W dedicated to the SoC, then what is more efficient: a 20W CPU + 80W GPU, or a 40W CPU + 60W GPU?

And then for the CPU part, what is more efficient at equal throughput: 8 cores, or a dual/quad?

The answer is straightforward: a 20W 8-core CPU + 80W GPU. Overall, I think Sony made the more balanced box: 10% less CPU frequency with a substantially bigger GPU than MS.
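Spelled out as a toy sketch (the premise that the 20W 8-core cluster roughly matches a 40W faster quad in aggregate throughput is taken at face value here, purely for illustration):

[CODE]
# Toy arithmetic for the fixed-power-budget argument above. The "equal CPU throughput"
# premise is the poster's assumption; the numbers are not measurements.
SOC_BUDGET_W = 100

configs = {
    "8 small cores": 20,   # watts spent on the CPU cluster
    "faster quad":   40,
}

for name, cpu_w in configs.items():
    gpu_w = SOC_BUDGET_W - cpu_w
    print(f"{name:14s}: CPU {cpu_w} W, GPU gets {gpu_w} W ({gpu_w / SOC_BUDGET_W:.0%} of the budget)")
[/CODE]

If the CPU work gets done either way, the lower-power CPU cluster leaves 80W instead of 60W for the GPU, which is the whole argument.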

I think that the proponents of the "they made the right decision because...they know more than us" aren't able to articulate what they really mean..

Do you disagree with the above?
 