"Our semi-custom APUs" = Xbox 720 + PS4?

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
From when Cell was announced to when it actually materialized, there was a performance difference of a factor of 37 or so. Quite hilarious. And that's why the nVidia GPU was suddenly needed.

Even in the end they had to cut frequency and SPEs.
 
Last edited:

NTMBK

Lifer
Nov 14, 2011
10,411
5,677
136
From when Cell was announced to when it actually materialized, there was a performance difference of a factor of 37 or so. Quite hilarious. And that's why the nVidia GPU was suddenly needed.

Even in the end they had to cut frequency and SPEs.

Actually, the main reason they needed to go to NVidia at the 11th hour wasn't the Cell. The original plan was for Sony to develop an in-house graphics chip, with the same team that developed the graphics chip for the PS2. That team fell way behind schedule and never delivered, which is why NVidia were called in.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
November 1, 2012 Rumor: PS4 based on AMD A10 series, new dev kits shipping
- Console features revamped UI.
- Travel "anywhere" feature on the system mid-game
http://www.vg247.com/2012/11/01/ps4_details_playstation_4/
and
http://www.slashgear.com/sony-ps4-orbis-based-on-tweaked-amd-a10-tip-devs-01255214/

Dev Kits
A new version of the dev kit, housed in a standard PC case and codenamed Orbis, is being shipped to developers. There will be four versions of the dev kit. The previous version was just a graphics card. The new version is a “modified PC.” The next version, which will be sent out in January, will have close-to-final specs. The final version will be sent out “next summer.”

Developer Meetings
Sony held a “disclosure meeting” with U.S. developers this week to discuss the machine, what it’s designed to do, detail its hardware, and show a set of presentations. It will hold another meeting in “the coming weeks.” The name “PlayStation 4” was never used in these meetings. Sony has always referenced the machine as “Orbis.”

Hardware
Orbis is based on AMD’s A10 APU series, which is a combined CPU and GPU. The system’s APU is a “derivative” of existing A10 hardware, and is “based on A10 system and base platform.” The “ultimate goal” for the hardware is to run a 1080p, 60fps game in 3D with “no problem,” and to create a machine powerful enough for “today and tomorrow’s market.” Current dev kits for the platform have either “8gb or 16gb of RAM.” The system will feature a Blu-ray drive, and offer a 256gb “standard” hard drive. However, it was not clear if it will be a normal or solid state drive. It will also have Wi-Fi, and Ethernet and HDMI out ports. There is no difference between Orbis and PlayStation 3’s outputs, according to the source. With Orbis, Sony aims to avoid the problems it encountered when launching PlayStation 3. It aims to create a console that’s “very affordable,” but “isn’t a slouch.”

The machine is not being made in Japan.

User Interface
Orbis’ user interface has been revamped. Players will be able to press the PlayStation button mid-game and travel “anywhere” on the system. The source cites buying download content from the PlayStation Store mid-game, then seamlessly returning to the game. “They’re trying to make it as fluid as possible,” said the source. The system will also be able to accept system and product updates in the background, and will “always be in standby mode,” if you choose to enable that option.

Announcement
Sony is expected to announce the new console “just before E3” next year, according to the source.

-------------

The minute I read that Sony is aiming at a "very affordable" console, I knew that's not a good sign for good hardware, or even mid-range hardware for that matter. An A10-based PS4 with no discrete GPU sounds like Sony is going Nintendo-style with their next console, aggressively cost-cutting the components. I bet they are going to try to lose very little money on the PS4, or even make $ in the first year. If they use a 256GB SSD for main storage and stick with the A10 APU, that is the worst engineering decision ever! Flash storage is very expensive. They should go the Wii U route and let consumers use any SSD/HDD they choose for upgrading later. With constant price drops on SSDs, it would be a huge waste of $ to put in a 256GB SSD instead of spending that $ on a faster GPU.

This next generation of consoles is shaping up to be very underwhelming.
 
Last edited:

thilanliyan

Lifer
Jun 21, 2005
12,040
2,254
126
With constant price drops on SSDs, it would be a huge waste of $ to put in a 256GB SSD instead of spending that $ on a faster GPU.

I doubt they would put a 256GB SSD in there as standard... as you said, that would be a large cost to them, which would then be passed on to us. Maybe there will be different versions with SSDs. A 7200rpm laptop drive would be plenty, I think. From what I read, even putting an SSD into a PS3 doesn't give much benefit, as some sort of encryption needs to take place before writing, which becomes the bottleneck. I have a 250GB laptop drive in my PS3 and have never thought to myself that the machine feels slow.
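A toy model of that write path, for what it's worth: if every block must pass through a fixed-rate encryption stage before it reaches the disk, the slower stage sets the pace. All throughput numbers below are hypothetical placeholders, not measured PS3 figures.

def effective_write_rate(encrypt_mb_s, disk_mb_s, pipelined=True):
    # If encryption and disk I/O overlap, the slower stage is the bottleneck.
    if pipelined:
        return min(encrypt_mb_s, disk_mb_s)
    # Fully serialized: each megabyte pays both costs back to back.
    return 1.0 / (1.0 / encrypt_mb_s + 1.0 / disk_mb_s)

encrypt = 40.0                      # hypothetical slow in-console cipher, MB/s
for disk in (60.0, 250.0):          # roughly: laptop HDD vs SATA SSD
    rate = effective_write_rate(encrypt, disk)
    print(f"disk {disk:5.0f} MB/s -> effective {rate:5.1f} MB/s")

# With a 40 MB/s encryption stage, swapping a 60 MB/s HDD for a 250 MB/s SSD
# doesn't move the effective rate at all - matching the reports above.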
 

BlockheadBrown

Senior member
Dec 17, 2004
307
0
0
November 1, 2012 Rumor: PS4 based on AMD A10 series, new dev kits shipping
- Console features revamped UI.
- Travel "anywhere" feature on the system mid-game
http://www.vg247.com/2012/11/01/ps4_details_playstation_4/
and
http://www.slashgear.com/sony-ps4-orbis-based-on-tweaked-amd-a10-tip-devs-01255214/

Remember, these are not near-final specs. Both links seem similar enough that you may be looking at one "source" piggy-backing on the other. VG247 claims "multiple" sources, so who knows how deep/true that is.

That said, Nintendo's system does look good, from what a number of devs are saying. There's more to read in the NeoGAF forums on the Wii U, Orbis and Durango. There's also a possibility that we're not getting a complete story from these "sources". There may be a separate GPU. Lots of grey here. :)
 

BlockheadBrown

Senior member
Dec 17, 2004
307
0
0
I doubt they would put a 256GB SSD in there as standard... as you said, that would be a large cost to them, which would then be passed on to us. Maybe there will be different versions with SSDs. A 7200rpm laptop drive would be plenty, I think. From what I read, even putting an SSD into a PS3 doesn't give much benefit, as some sort of encryption needs to take place before writing, which becomes the bottleneck. I have a 250GB laptop drive in my PS3 and have never thought to myself that the machine feels slow.

The architectures of Orbis and the PS3 are going to be different anyway. The old bottlenecks may be irrelevant.
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,254
126
The architectures of Orbis and the PS3 are going to be different anyway. The old bottlenecks may be irrelevant.

I agree, but there certainly COULD be the same (or similar) bottleneck. I was just pointing out that there is(?) a bottleneck in the PS3, and there could be a similar one in the PS4, making an SSD pointless.
 

BlockheadBrown

Senior member
Dec 17, 2004
307
0
0
I agree, but there certainly COULD be the same (or similar) bottleneck. I was just pointing out that there is(?) a bottleneck in the PS3, and there could be a similar one in the PS4, making an SSD pointless.

From what I've read from a number of devs, bottlenecks are one of the things that Sony was acutely aware of.

From what I've read, PC and console hardware shouldn't be compared one-for-one.
 

cplusplus

Member
Apr 28, 2005
91
0
0
-------------

The minute I read that Sony is aiming at a "very affordable" console, I knew that's not a good sign for good hardware, or even mid-range hardware for that matter. An A10-based PS4 with no discrete GPU sounds like Sony is going Nintendo-style with their next console, aggressively cost-cutting the components. I bet they are going to try to lose very little money on the PS4, or even make $ in the first year. If they use a 256GB SSD for main storage and stick with the A10 APU, that is the worst engineering decision ever! Flash storage is very expensive. They should go the Wii U route and let consumers use any SSD/HDD they choose for upgrading later. With constant price drops on SSDs, it would be a huge waste of $ to put in a 256GB SSD instead of spending that $ on a faster GPU.

This next generation of consoles is shaping up to be very underwhelming.

They have to aim for a "very affordable" console; they basically have no choice. And if 1080p60/3D is their goal (which is basically 1080p at 120fps, right?), would an APU even be able to pull that off? It seems more likely that it will be the APU plus a mobile GPU in CrossFire, even if only a weak one. Even the quote that it's based on an A10 APU doesn't exclude the idea of a graphics card; it just says that there will be an APU in there at least.
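As a sanity check on that "1080p60 in 3D" target, a minimal sketch of the arithmetic, assuming frame-packed stereo 3D, i.e. one full 1080p image per eye per frame:

WIDTH, HEIGHT = 1920, 1080
FPS = 60
EYES = 2                       # stereo 3D renders every frame twice

mono = WIDTH * HEIGHT * FPS    # pixels/second for plain 1080p60
stereo = mono * EYES           # effectively 1080p at 120 rendered frames/s

print(f"1080p60 mono:   {mono / 1e6:.0f} Mpixels/s")    # ~124 Mpx/s
print(f"1080p60 stereo: {stereo / 1e6:.0f} Mpixels/s")  # ~249 Mpx/s

# Each rendered pixel typically costs many shader ops and memory accesses,
# so doubling the pixel rate is a heavy ask for an integrated GPU.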

And while there may be 256GB of space, it doesn't necessarily have to be one 256GB hard drive. They could take a cue from Apple and Intel (Fusion Drive and SRT) and have a small SSD/flash cache backed by a larger mechanical hard drive.

Not to mention that one of the biggest reasons we keep getting different PS3 models is that Sony keeps changing the hard drive size to track whatever 2.5" laptop drive is currently being mass-produced, which keeps its bulk purchasing simple. Going to flash memory would mean that as chip densities increase, they'd just have to buy fewer chips, instead of having to buy whatever the current mass-produced 2.5" drive is.

Also, as far as I've read, the Wii U thing is about letting consumers choose an EXTERNAL drive for use with the system. The internal storage is non-replaceable and only 8GB or 32GB. Even after the Wii U comes out, there will only be one system that allows you to swap the internal drive with any drive you can purchase that will fit, and that will be... the PS3.
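The Fusion Drive / SRT idea mentioned above boils down to something like this minimal, purely illustrative sketch (names and sizes made up): hot blocks live on a small flash cache, while the HDD holds the durable copy.

from collections import OrderedDict

class TieredStore:
    """Toy flash-cache-over-HDD store with LRU eviction."""
    def __init__(self, flash_capacity_blocks):
        self.flash = OrderedDict()            # block_id -> data, LRU order
        self.capacity = flash_capacity_blocks
        self.hdd = {}                         # stand-in for the mechanical drive

    def read(self, block_id):
        if block_id in self.flash:            # fast path: flash hit
            self.flash.move_to_end(block_id)
            return self.flash[block_id]
        data = self.hdd.get(block_id)         # slow path: HDD, then promote
        if data is not None:
            self._promote(block_id, data)
        return data

    def write(self, block_id, data):
        self.hdd[block_id] = data             # HDD always has the durable copy
        self._promote(block_id, data)

    def _promote(self, block_id, data):
        self.flash[block_id] = data
        self.flash.move_to_end(block_id)
        while len(self.flash) > self.capacity:
            self.flash.popitem(last=False)    # evict least-recently-used block

store = TieredStore(flash_capacity_blocks=4)  # tiny cache, for illustration
store.write("save-game-1", b"...")
print(store.read("save-game-1"))              # served from the flash tier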
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Remember, these are not near-final specs. Both links seem similar enough that you may be looking at one "source" piggy-backing on the other. VG247 claims "multiple" sources, so who knows how deep/true that is.

Agree, those specs are still rumors. What I don't understand is that the same rumor says the connections won't change between the PS3 and PS4, but if it only uses an A10 APU, how will Sony support 4K resolution over HDMI? I thought the HD 6000-class architecture (which the HD 7660D is based on) could only drive 4K displays over DisplayPort 1.2, and that only the GCN HD 7000 series, with its Fast HDMI (3GHz) support, can output 4K via HDMI?
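The "3GHz" figure falls out of simple arithmetic. A minimal sketch, assuming the standard CEA timing for 3840x2160 @ 30Hz (4400x2250 total pixels including blanking) and HDMI's TMDS encoding of 10 bits on the wire per 8-bit symbol:

H_TOTAL, V_TOTAL, REFRESH = 4400, 2250, 30      # 4K30 with blanking intervals

pixel_clock = H_TOTAL * V_TOTAL * REFRESH       # Hz
tmds_bits = pixel_clock * 10                    # bits/s on each TMDS lane

print(f"Pixel clock:   {pixel_clock / 1e6:.0f} MHz")   # 297 MHz
print(f"Per-lane rate: {tmds_bits / 1e9:.2f} Gbps")    # ~2.97 Gbps, the "3GHz"

# Older HDMI transmitters topped out around a 165 MHz pixel clock - fine for
# 1080p60 (~148.5 MHz) but nowhere near the ~297 MHz that 4K30 needs.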

The other piece of the puzzle is how they can deliver good graphics with only an APU. The 7660D is nearly 2x slower than the HD 4770. Based on these specs, this thing is weaker than the Wii U. It doesn't make sense to me. I think there is a dedicated GPU component Sony is hiding somewhere. :p

They have to aim for a "very affordable" console; they basically have no choice. And if 1080p60/3D is their goal (which is basically 1080p at 120fps, right?), would an APU even be able to pull that off?

Definitely not, not even in today's games, never mind next-gen DX11 games.
http://www.hardwareheaven.com/revie...pu-performance-review-gaming-performance.html

However, the A10 Trinity CPU paired with, say, an HD 7850 GPU would provide very good gaming performance for a console (but you can't Hybrid CrossFire the 7660D and the 7850).
http://www.hardwareheaven.com/revie...rformance-review-gaming-performance-7850.html

It seems more likely that it will be the APU + a mobile GPU in crossfire, even if only a weak one. Even the quote, that it's based on an A10 APU, doesn't exclude the idea of a graphics card, it just says that there will be an APU in there at least.

Ya, but I think the 7660D can only be CrossFired with the HD 6570 or HD 6670, based on what Xbitlabs says. An HD 7660D + HD 6570 still can't get 60 fps in old games. Even if they go with APU + dedicated GPU CF, it could still be way too weak.

Also, as far as I've read, the WiiU thing is about letting consumers choose an EXTERNAL drive for use with the system. The internal drive is non-replaceable and only 8GB or 32GB. Even after the WiiU comes out, there will only be one system that allows you to swap the internal drive with any drive you can purchase that will fit, and that will be....the PS3.

Ya, you are right that you cannot upgrade the Wii U's internal storage. What I was saying is that I personally much prefer the Wii U's approach of letting consumers upgrade the HDD, even if it's external. The Wii U's issue is that it has USB 2.0 ports, not 3.0 or eSATA. The question of going with a mechanical disk drive again is a tricky one; I'm not sure it's better for consumers. Both the PS3 and 360 charge a lot of extra $ for a small bump in HDD space, and MS's HDD upgrade is an especially bad rip-off.
 
Last edited:

jpiniero

Lifer
Oct 1, 2010
16,493
6,987
136
Based on these specs, this thing is weaker than the Wii U

I was under the impression that the Wii U is using the 5570 (400 shaders).

Now that I think about it, Hybrid CrossFire (HCF) would really be a mistake. It'd be like the Saturn's second processor, where most games simply wouldn't use it.

Perhaps something like the A10-5700, except with 640 shaders? 8 GB of RAM is nice, though. The 256 GB drive is almost definitely an SSD, but that won't be in the base model.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I was under the impression that the Wii U is using the 5570 (400 shaders).

The most recent information I saw after the Wii U motherboard leaked was this:

[Image: screenshot of the leaked Wii U motherboard analysis]

Source

Not sure if it's accurate, but it shows the GPU is an RV740 (that's either an HD 4770 or HD 4860). Both of those are nearly 2x more powerful than the GPU in the A10-5800K when they run at full GPU clocks. Of course, Nintendo might have downclocked the GPU to make sure the entire Wii U stays under a 75W power envelope. The biggest problem for the Wii U is its CPU - a complete dog!

Here is the shocker: even if the Wii U has an HD 5570, it's still faster than the HD 7660D. D:

Radeon HD 5570 1GB (GDDR5) (DX11) -- 33 VP
AMD (Trinity APU) A10-series Radeon HD 7660D (dual channel DDR3-1866) (DX11) -- 31 VP
Source
 
Last edited:

cplusplus

Member
Apr 28, 2005
91
0
0
I was under the impression that the Wii U is using the 5570 (400 shaders).

Now that I think about it, Hybrid CrossFire (HCF) would really be a mistake. It'd be like the Saturn's second processor, where most games simply wouldn't use it.

Perhaps something like the A10-5700, except with 640 shaders? 8 GB of RAM is nice, though. The 256 GB drive is almost definitely an SSD, but that won't be in the base model.

I think the trick is that while they're making the dev kits with 256GB SSDs, the actual console won't have that much flash storage. Right now they already have a PS3 out in Europe with 12GB (I think) of flash storage but still with the HDD tray (though when you put a hard drive in, you can no longer read the flash storage, because of limitations in the PS3's firmware). I could see them having 32-64GB of flash onboard, with either an additional mechanical hard drive (for premium versions) or space for one. And like I said before, if they could implement some version of smart SSD caching in their OS/firmware, it could actually be a best-of-both-worlds solution in terms of speed and size (while still allowing people to install their own hard drives). The system comes with the flash memory right out of the box so you can just plug and play, but if you want a lot of space, all you have to do is buy a laptop hard drive and plug it in, and you get all that extra space over a SATA connection instead of USB.

Definitely not, not even in today's games, never mind next-gen DX11 games.
http://www.hardwareheaven.com/revie...pu-performance-review-gaming-performance.html

However, the A10 Trinity CPU paired with, say, an HD 7850 GPU would provide very good gaming performance for a console (but you can't Hybrid CrossFire the 7660D and the 7850).
http://www.hardwareheaven.com/revie...rformance-review-gaming-performance-7850.html



Ya, but I think the 7660D can only be CrossFired with the HD 6570 or HD 6670, based on what Xbitlabs says. An HD 7660D + HD 6570 still can't get 60 fps in old games. Even if they go with APU + dedicated GPU CF, it could still be way too weak.

Well, neither link specifically says A10; they say an "A10 derivative". Since this thing isn't going to be mass-manufactured until the middle of next year at the earliest, isn't there a chance they could put a different graphics core into the system, something that could be CrossFired with the current GCN architecture (because that's the problem, right? That the Trinity APUs are VLIW4 instead of GCN?).

Actually, reading up on it, it looks like the next version of their APUs with GCN isn't expected to come out until after this launches, so it's probably unlikely (though not impossible). I wonder how well they could tweak asymmetric CrossFire to work if they only had to worry about one specific set of hardware? But it looks like we'll find out early next year (probably around GDC) what the final dev kits have in them, so not that much longer to wait.

Ya, you are right that you cannot upgrade the Wii U's internal storage. What I was saying is that I personally much prefer the Wii U's approach of letting consumers upgrade the HDD, even if it's external. The Wii U's issue is that it has USB 2.0 ports, not 3.0 or eSATA. The question of going with a mechanical disk drive again is a tricky one; I'm not sure it's better for consumers. Both the PS3 and 360 charge a lot of extra $ for a small bump in HDD space, and MS's HDD upgrade is an especially bad rip-off.

What I do like about Nintendo's approach is allowing customers to actually run their programs off of an external drive. That's something both companies could stand to adopt.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Actually, reading up on it, it looks like the next version of their APUs with GCN isn't expected to come out until after this launches, so it's probably unlikely (though not impossible). I wonder how well they could tweak asymmetric CrossFire to work if they only had to worry about one specific set of hardware? But it looks like we'll find out early next year (probably around GDC) what the final dev kits have in them, so not that much longer to wait.

I am not even sure you can do that, since GCN and VLIW are different GPU architectures. It'll be a driver mess! A more ideal solution would be Kaveri + an HD 7750/7770 in CrossFire, but for that to happen I bet they'd need to launch the PS4 in 2014. Also, if they have the final hardware locked down by Summer 2013, I can't see how Kaveri fits into the picture.

The CPU will still be a huge improvement over the Cell, but I am still worried about that GPU. I mean, we criticize this class of CPU in high-end gaming rigs, but paired with a modern GPU it can deliver >60 fps in modern games (unless it's a strategy game).
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
I am not even sure you can do that, since GCN and VLIW are different GPU architectures. It'll be a driver mess! A more ideal solution would be Kaveri + an HD 7750/7770 in CrossFire, but for that to happen I bet they'd need to launch the PS4 in 2014. Also, if they have the final hardware locked down by Summer 2013, I can't see how Kaveri fits into the picture.

The CPU will still be a huge improvement over the Cell, but I am still worried about that GPU. I mean, we criticize this class of CPU in high-end gaming rigs, but paired with a modern GPU it can deliver >60 fps in modern games (unless it's a strategy game).

A Steamroller + GCN APU with 640 stream processors and stacked DRAM is not an unrealistic expectation. Remember that by H2 2013 GlobalFoundries should be in volume production on 28nm, with through-silicon vias (TSVs) also available for volume production.

http://www.eetimes.com/electronics-news/4371752/GlobalFoundries-installs-gear-for-20nm-TSVs

Given that the PS4 is scheduled for a Q1 2014 launch, behind the Xbox Next, it's quite probable that it will be a very well-integrated, high-performance APU.
 

NTMBK

Lifer
Nov 14, 2011
10,411
5,677
136
The idea of an APU plus a CrossFired GPU is ridiculous. This is going to be a semi-custom chip, guys. They can put more graphics resources onto the one chip they produce, instead of producing a separate chip. A fixed-platform console is the ideal candidate for a slightly tweaked chip - they're going to produce millions of the things.

Don't read too much into early devkits. Remember that the initial 360 devkit was a PowerMac G5 with two off-the-shelf PowerPC processors, while the final hardware was a tri-core, single-chip solution with a completely different core. The only similarities were that both used the PowerPC ISA and both had more than one core. The graphics card was a different architecture from the one they used in the 360, too.

The earliest devkits are just so developers can cut their teeth on the software APIs and get used to developing for a new type of architecture before the final hardware is done. There's a reason why the initial set of games released tends to be very poorly optimized. ;) An A10-based devkit is what I'd expect: an off-the-shelf processor with roughly the same characteristics (x86, APU) that they can quickly slap together into a system and ship to developers. It doesn't mean that's what's going to be in the final machine.

I will eat my hat if the shipped hardware has Trinity-level graphics.
 

Spjut

Senior member
Apr 9, 2011
931
160
106
I am not even sure you can do that, since GCN and VLIW are different GPU architectures. It'll be a driver mess! A more ideal solution would be Kaveri + an HD 7750/7770 in CrossFire, but for that to happen I bet they'd need to launch the PS4 in 2014. Also, if they have the final hardware locked down by Summer 2013, I can't see how Kaveri fits into the picture.

The CPU will still be a huge improvement over the Cell, but I am still worried about that GPU. I mean, we criticize this class of CPU in high-end gaming rigs, but paired with a modern GPU it can deliver >60 fps in modern games (unless it's a strategy game).

Instead of the normal Crossfire, what about putting all GPGPU tasks on the APU, and the "normal" GPU stuff on the stronger, discrete GPU?
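On the PC side that kind of split is already expressible. A minimal sketch using OpenCL via pyopencl - assuming pyopencl is installed and at least one GPU is present, and using "host-unified memory" as a heuristic marker for the integrated/APU GPU (an assumption, not a guarantee):

import pyopencl as cl

gpus = []
for platform in cl.get_platforms():
    try:
        gpus.extend(platform.get_devices(device_type=cl.device_type.GPU))
    except cl.RuntimeError:
        pass                                  # this platform exposes no GPUs

# Integrated (APU) GPUs usually share memory with the host CPU.
integrated = [d for d in gpus if d.get_info(cl.device_info.HOST_UNIFIED_MEMORY)]
discrete = [d for d in gpus if d not in integrated]

compute_dev = integrated[0] if integrated else gpus[0]   # GPGPU tasks go here
render_dev = discrete[0] if discrete else gpus[0]        # graphics goes here

print("Compute on:", compute_dev.name)
print("Render on: ", render_dev.name)

# On a console the partitioning would be baked into the SDK rather than probed
# at runtime, but the idea - two queues, two memory domains - is the same.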

I don't think the CPU will be an issue either. Capcom compared the 360's CPU to a Pentium dual core, so that's a big upgrade either way.
Add some console optimisations on top of that and it will pack quite a punch.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Actually, the main reason they needed to go to NVidia at the 11th hour wasn't the Cell. The original plan was for Sony to develop an in-house graphics chip, with the same team that developed the graphics chip for the PS2. That team fell way behind schedule and never delivered, which is why NVidia were called in.

I thought it was because Crazy Ken put two Cells in the original design and was telling people to get a second job to afford the PS3.

Once Sony caught a whiff of Ken's craziness, they axed him, scaled down the design, and eventually needed a GPU to fill the role of the missing second Cell.
 

Bobisuruncle54

Senior member
Oct 19, 2011
333
0
0
I thought it was because Crazy Ken put two Cells in the original design and was telling people to get a second job to afford the PS3.

Once Sony caught a whiff of Ken's craziness, they axed him, scaled down the design, and eventually needed a GPU to fill the role of the missing second Cell.

Yep, that's the real reason AFAIK.
 

NTMBK

Lifer
Nov 14, 2011
10,411
5,677
136
I thought it was because Crazy Ken put two Cells in the original design and was telling people to get a second job to afford the PS3.

Once Sony caught a whiff of Ken's craziness, they axed him, scaled down the design, and eventually needed a GPU to fill the role of the missing second Cell.

Not according to The Race For A New Game Machine, which recounts the Cell and 360 core developments in a lot of detail. (Go read it, people!) It explicitly says that there was an in-house Sony graphics chip whose development failed and had to be replaced by NVidia's part.

Of course, Ken had plenty of other... erm... "interesting" moments. The gem recounted in the aforementioned book was that the Cell was originally meant to have 6 SPEs on die, but Ken decided, waaaay after chip layout was complete, to up that to 8 - because "8 is beautiful". (And as we all know, the 8th one was fused off anyway, to improve yields.)
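The yield argument is easy to put numbers on. A minimal sketch with a made-up per-SPE defect rate - not a real process figure:

from math import comb

def yield_at_least(good_needed, total, p_good):
    """P(at least good_needed of total units are defect-free)."""
    return sum(comb(total, k) * p_good**k * (1 - p_good)**(total - k)
               for k in range(good_needed, total + 1))

p_good = 0.90   # hypothetical: 90% chance a given SPE comes out defect-free

print(f"All 8 SPEs good: {yield_at_least(8, 8, p_good):.1%}")   # ~43%
print(f">=7 of 8 good:   {yield_at_least(7, 8, p_good):.1%}")   # ~81%

# Selling the die as "any 7 of 8 SPEs" nearly doubles the number of usable
# chips - which is the point of fusing off the 8th SPE to improve yields.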

EDIT: Edited for clarity.
 
Last edited:

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
This next generation of consoles is shaping up to be very underwhelming.

I am cautiously optimistic that the combination of underwhelming console hardware, the general (if small) revival in PC gaming taking place, and the small but real push towards the cloud will let PC gaming continue to move forward without being substantially held back by these next-gen consoles. In other words, I don't think these next-gen consoles are going to have the longevity or success that the Xbox 360, Wii, and PS3 enjoyed.
 
Last edited:

DrBoss

Senior member
Feb 23, 2011
415
1
81
The recently released specs certainly seem underwhelming. I can't help but think that (as mentioned above) there is critical GPU information missing from the developer kits, or at least from the information that's been released about them. Sony going the way of Nintendo and producing a generic do-all social-networking play toy just doesn't add up to me.

The PS3 was a powerful, albeit complex, piece of hardware when it was released. Despite the faults of the Cell architecture, it was "next gen"; it was something new and exciting. Without a discrete or hybrid GPU addition, these A10 developer kits just don't strike me as powerful enough to hit their "1080p60 3D" goal.

I fully expect (and hope) Sony will push the envelope with their next console. While it's not selling well, the Vita is a beautiful and very powerful handheld packed full of exciting technologies. I just can't imagine the PS4 won't follow suit.