
Will PS4/Xbox One increase the need for cores?

That secondary chip is simply a DSP for the streaming feature (encoding/decoding), since the CPU is too weak to handle that.
The Encoding/Decoding part is done with VCE and UVD. The secondary chip is the primary operating system processor and it appears on both the Xbox One and the Playstation 4.
 
And at least 1 core will be reserved for the OS.

I'm not saying you're wrong, but why would you want to statically allocate one or more cores to the OS? While gaming, wouldn't it make sense to use all cores for the game if needed, and let the OS idle (or only run non-CPU-intensive background tasks, and let the OS scheduler allocate the remaining CPU cycles to the game)?

Also, what is actually meant by "allocating one core to the OS"? I mean, the OS kernel doesn't do much by itself. It's the actual programs running on that OS that do stuff.

It's not like you say that "one core is allocated to the OS" on a Windows gaming PC, so why should that be different for a console?
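For what it's worth, "reserving a core" can be made concrete with CPU affinity: pinning a process to a subset of cores so the remaining core is left for other work. A minimal sketch on a desktop OS (Linux-only; `os.sched_setaffinity` doesn't exist elsewhere, and core 0 as the "OS core" is purely an illustration):

```python
import os

# sched_getaffinity / sched_setaffinity are Linux-only APIs.
if hasattr(os, "sched_getaffinity"):
    # Logical cores this process may currently run on (0 = calling process).
    allowed = os.sched_getaffinity(0)

    # "Reserving" core 0 for the OS would mean pinning the game to the
    # remaining cores, e.g. cores 1..7 on an eight-core chip.
    game_cores = allowed - {0}
    if game_cores:  # keep at least one core for ourselves
        os.sched_setaffinity(0, game_cores)
        print(sorted(os.sched_getaffinity(0)))
```

Whether a console does this statically in firmware or lets a scheduler handle it is exactly what the thread is arguing about; the snippet just shows what the static version looks like.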
 
Try running a game and using those at the same time while streaming it to the network. And UVD can still only handle one stream, if I recall right.
VCE can stream 1080p @ 60 FPS to remote places for CAD and gaming. UVD (fixed-function hardware) can handle two streams, but with the OVD API (OpenCL software UVD) it could handle more. I'm also sure there is an OpenCL VCE API, just like OVD. Everything is really computed on the Graphics Core Next units rather than on the CPUs.

CPUs for organization of tasks.
GPUs for execution of tasks.
 
Just so you guys know.

An eight-core Jaguar at 2 GHz would consume around 28 watts and score 4 points in Cinebench, beating the Phenom II X4 975 BE and the A8-3850 at a fraction of their power consumption. If Sony/Microsoft/AMD upped that clock to 3.2 GHz, it would beat the FX-8350 in Cinebench R11.5 MT, which would probably place the TDP around 65 watts.

As for the 1 GB and 3 GB figures: 1 gigabit is 128 MB, and 3 gigabits are 384 MB.
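Quick sanity check on that conversion, using binary units (1 gigabit = 1024³ bits, 1 MB = 1024² bytes):

```python
BITS_PER_BYTE = 8
GIBIBIT = 1024 ** 3   # bits in one binary gigabit
MEBIBYTE = 1024 ** 2  # bytes in one binary megabyte

def gigabits_to_mb(gbits: int) -> int:
    """Convert binary gigabits to binary megabytes."""
    return gbits * GIBIBIT // BITS_PER_BYTE // MEBIBYTE

print(gigabits_to_mb(1))  # 128
print(gigabits_to_mb(3))  # 384
```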

Also, the PlayStation 4 and Xbox One both have special-purpose units dedicated to the operating system, which I hear are ARM Cortex-A9s.

Jaguar can go way above 2 GHz; it isn't a hard limit. It is a soft limit imposed by AMD to appeal to said market.


Think about what you are saying here. AMD has a processor that essentially demolishes their high(er)-end chips in perf/watt, but they refuse to release it because they are... appealing to a market?

If AMD could actually do what you claim, I am sure they would. They could cut down R&D costs significantly (only one core line to develop), while increasing perf/watt.
 
The operating system is based on ARMv7, while the games are based on x86-64.

So you think the ARMv7 chip is going to manage all the x86 system structures (GDT, LDT, TSS, IDT, etc.)?

Shenanigans

It is nothing more than a small subsystem on the main system. Logically it should be disabled or at the very least virtualized when the main system is active.
 
This has been pounded to death... and the answer is that no one really knows for sure, but probably more cores will be more desirable.

http://forums.anandtech.com/showthread.php?t=2304345

There are probably other threads on this somewhere too.

Pretty much this.

Personally, I think it will make i3's as well as AMD quads struggle with new games but I think SB/IB/Haswell i5's and i7s will be fine. I DO however think there will be a larger parity between i5's and i7's in future games than what we see currently.
 
Think about what you are saying here. AMD has a processor that essentially demolishes their high(er)-end chips in perf/watt, but they refuse to release it because they are... appealing to a market?
Jaguar has low clocks because it is marketed as a low-power core. I don't expect Jaguar to replace Bulldozer, because TSMC's LP process has a theoretical max of ~2.8 GHz for quad cores. Another reason is that AMD doesn't want to fill all the baskets with the same architecture.
Logically it should be disabled or at the very least virtualized when the main system is active.
The ARMv7 is the always-on main system, while the x86-64 portion is on only when gaming. The GPGPU appears to be shared between both systems.

Playstation 4: FreeBSD Main OS - ARMv7(1 GB), FreeBSD Sub OS - x86-64(7 GB)
Xbox One: Windows RT MOS - ARMv7(3 GB), Windows x86 SOS - x86-64(5 GB)
The GPGPU is the only thing that has coherency across everything.

Level 0: ARM
Level 1: ARM/x86
Level 2: x86
Level 3: x86
 
The ARMv7 is the always-on main system, while the x86-64 portion is on only when gaming. The GPGPU appears to be shared between both systems.

Playstation 4: FreeBSD Main OS - ARMv7(1 GB), FreeBSD Sub OS - x86-64(7 GB)
Xbox One: Windows RT MOS - ARMv7(3 GB), Windows x86 SOS - x86-64(5 GB)
The GPGPU is the only thing that has coherency across everything.

NO!

It's not possible. I don't know where you get this crap but there is no way. They are separate OSs, there is no way they could coexist in the same memory map, let alone efficiently route interrupts between them.

Link or shut up!!!
 
Sorry if this question has been pounded to death... I do not have a great understanding of the development side of PCs and how they work... but I have a quick question.

Since the PS4/Xbox One will be using 8-core AMD CPUs, does this translate to AMD CPUs being utilized better for future gaming? (multithreading) Or can we expect the PC version of games to still be designed to utilize 2-4 cores, thus still giving Intel the advantage due to greater single-threaded performance? I imagine this will be the case, and my question may be silly, but I was just wondering. Crysis 3 is an example that runs very well on Vishera.

I do not want this to be another AMD vs Intel thread - I'm not biased to either, although right now I wanted to change things up and am using all AMD hardware. (Which explains my curiosity for this question)

The short answer is "yes, AMD CPUs will perform better".

Current games are developed for the existing consoles, which are single-core (PS3) and triple-core (Xbox 360). There are also difficulties with their programming, which explains why developers prefer to rely on few threads. The 8-core design of the next consoles (PS4 and Xbox One) changes this radically, and games will be heavily multithreaded.

We approached a number of developers on and off the record - each of whom has helped to ship multi-million-selling, triple-A titles - asking them whether an Intel or AMD processor offers the best way to future-proof a games PC built in the here and now. Bearing in mind the historical dominance Intel has enjoyed, the results are intriguing - all of them opted for the FX-8350 over the current default enthusiast's choice, the Core i5 3570K.

http://www.eurogamer.net/articles/digitalfoundry-future-proofing-your-pc-for-next-gen
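If next-gen games do go heavily multithreaded as argued above, the usual pattern on PC is task-based parallelism: split frame work into independent jobs and let a pool spread them across however many cores exist. A toy sketch (the `simulate_job` workload is made up for illustration; a real engine would dispatch physics islands, animation rigs, culling buckets, etc.):

```python
import os
from concurrent.futures import ThreadPoolExecutor

def simulate_job(job_id: int) -> int:
    # Stand-in for one independent chunk of per-frame work.
    return sum(i * i for i in range(10_000)) + job_id

jobs = range(8)  # one batch of independent tasks

# The pool scales with the machine: 2 cores or 8, same code.
with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
    results = list(pool.map(simulate_job, jobs))

print(len(results))  # 8
```

The point of the pattern is exactly what the quoted developers describe: the code doesn't care how many cores it lands on, so a CPU with more of them simply finishes the batch sooner (CPython threads only help for workloads that release the GIL, but the structure is the same in C++ job systems).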
 
You've already been embarrassed time and again with your silly arguments and claims. When will it end?
 
Sorry if this question has been pounded to death... I do not have a great understanding of the development side of PCs and how they work... but I have a quick question.

Since the PS4/Xbox One will be using 8-core AMD CPUs, does this translate to AMD CPUs being utilized better for future gaming? (multithreading) Or can we expect the PC version of games to still be designed to utilize 2-4 cores, thus still giving Intel the advantage due to greater single-threaded performance? I imagine this will be the case, and my question may be silly, but I was just wondering. Crysis 3 is an example that runs very well on Vishera.

I do not want this to be another AMD vs Intel thread - I'm not biased to either, although right now I wanted to change things up and am using all AMD hardware. (Which explains my curiosity for this question)

Nope. The CPU handles linear computing, which benefits from fewer cores.

If you have a 4-core CPU at 3GHz, it will almost always underperform a 2-core CPU at 6GHz. The only problem is, 6GHz is pretty much impossible to manufacture due to the heat output, so multicore CPUs became mainstream 10 or so years ago.

The GPU has thousands of cores. The more, the better.
The CPU has a handful of cores. The fewer, the better.
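Whether fewer, faster cores actually win depends on how much of the workload is serial. Under a simple Amdahl's-law model, effective throughput scales roughly as clock × 1/(s + (1−s)/n), where s is the serial fraction and n the core count. A quick comparison of the hypothetical 2-core 6 GHz vs 4-core 3 GHz chips from the post above (the numbers are the poster's thought experiment, not real parts):

```python
def effective_perf(clock_ghz: float, cores: int, serial_fraction: float) -> float:
    """Relative throughput under a simple Amdahl's-law model."""
    s = serial_fraction
    return clock_ghz / (s + (1 - s) / cores)

# Fully serial workload: only clock speed matters.
print(effective_perf(6.0, 2, 1.0))  # 6.0
print(effective_perf(3.0, 4, 1.0))  # 3.0

# Fully parallel workload: only total GHz matters -- they tie.
print(effective_perf(6.0, 2, 0.0))  # 12.0
print(effective_perf(3.0, 4, 0.0))  # 12.0
```

So the "fewer, the better" claim holds only for mostly serial code; the more a game parallelizes, the more the extra cores close the gap.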
 
So the short answer from me would be no.

You can't do an apples-to-apples comparison here with the 360 and PS3, as programming and porting from them was a far different process than it will be with the next generation. Essentially, porting a PS3 title to the PC meant you had to start from SCRATCH - you would obviously keep the graphical assets, but the coding itself would be quite different.

This won't be the case with the next generation. I don't think next-gen games taking advantage of additional cores is far-fetched; I'm quite sure it will happen, since x86 is the baseline for coding. That wasn't the case in years past, *especially* with the PS3.
 
It's not possible. I don't know where you get this crap but there is no way. They are separate OSs, there is no way they could coexist in the same memory map, let alone efficiently route interrupts between them.
http://www.infoq.com/news/2011/07/Barrelfish
http://www.computerweekly.com/blogs...barrelfish-experimental-operating-system.html
http://www.barrelfish.org/
http://research.microsoft.com/apps/pubs/?id=81154
http://research.microsoft.com/apps/pubs/default.aspx?id=101903

That's as far as I can go before I have to use advanced search engines. TL;DR version: HSA
 
Pretty much this.

Personally, I think it will make i3's as well as AMD quads struggle with new games but I think SB/IB/Haswell i5's and i7s will be fine. I DO however think there will be a larger parity between i5's and i7's in future games than what we see currently.

Not trying to be picky, but I think what you mean is a larger disparity between i5 and i7, correct?
 

Dude? That is not in either system.

It's a neat system, more along the lines of highly integrated routing. Certainly not anything that could allow the level of integration you're talking about.

From what I've read, the ARM chip controls some communication via TrustZone. I'm sure at some point it acts as a launcher/browser/multimedia device, but I can guarantee it has no control over the AMD chip's OS when that is active; the technology just isn't there.
 
Probably.

Both points have been made in this thread: while the cores are weak, they will force devs to think in more threads. Yes, the old consoles supported 3+ cores/threads (the PS3's "threads" are in limbo in this comparison, since SPEs aren't cores), but they weren't x86. And yes, a weak quad could rival the upcoming 8-core consoles. But the architecture and code designs/patterns that are going to emerge from the new consoles should be easily transferable to our desktops. On *this* side of things, I predict good times ahead, and I hope it will push/set a standard for future code parallelism.
 
Fact is, the next-gen consoles have 8 cores, and logically, developers will squeeze as much performance as they can out of those (slow tablet) cores. On the PC, more threads and more cores will naturally be the answer, and duals will begin to die out.
 
I don't know, but my guess would be no, games won't be using 8 cores to good effect any time soon. We might start to creep into that territory a little more, but I don't think we'll get there in the next 5 years.
 
With some games already showing how they use 6 cores of the PS4, I'd say it at least means PC gamers should expect to see more performance extracted from 4- and 6-core CPUs. Only having 2 cores has been a questionable gaming-PC decision for several years now; we may see 2-core + HT CPUs added to that list in the next year or two.
 
I doubt it will change much. The Xbox 360 supported 6 threads; the PS3 supported... 8?

The cores in the consoles are extremely weak (equal to a fast dual-core or slow quad-core), and at least 1 core will be reserved for the OS. I doubt any game will actually use more than 6 on the consoles in the best case. Not to mention Amdahl's law regarding scaling on those cores.
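To put numbers on the Amdahl's-law point above: even if a game can spread work across all six available cores, its speedup over one core is capped by the fraction of work that parallelizes. A small illustration:

```python
def amdahl_speedup(n_cores: int, parallel_fraction: float) -> float:
    """Maximum speedup on n cores when only part of the work parallelizes."""
    p = parallel_fraction
    return 1.0 / ((1.0 - p) + p / n_cores)

for p in (0.5, 0.8, 0.95):
    print(f"{p:.0%} parallel on 6 cores -> {amdahl_speedup(6, p):.2f}x")
# 50% parallel on 6 cores -> 1.71x
# 80% parallel on 6 cores -> 3.00x
# 95% parallel on 6 cores -> 4.80x
```

Even a game that is 80% parallel gets only 3x out of six cores, which is why weak individual cores still hurt.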

AMD is also going from 8 threads to 4 threads on the desktop with its next uarch.

Remember the Crysis 1.3 patch:
http://www.pcgameshardware.de/Crysis-3-PC-235317/Tests/Crysis-3-CPU-Test-1068140/

So the short answer from me would be no.

The baseline has been moved up, so instead of an i7-3770K being around 12x faster than the Xbox 360 CPU, it's now only about 2.5x faster than the six Jaguar cores (current rumors suggest that 2 cores will be reserved in both consoles).
 
The baseline has been moved up, so instead of an i7-3770K being around 12x faster than the Xbox 360 CPU, it's now only about 2.5x faster than the six Jaguar cores (current rumors suggest that 2 cores will be reserved in both consoles).

Can you explain the 2.5x?
 