From when Cell was announced to when it actually materialized, there was a performance difference of a factor of 37 or so. Quite hilarious. And that's why the nVidia GPU was suddenly needed.
Even then, in the end, they had to cut frequency and SPEs.
With constant price drops on SSDs, it would be a huge waste of $ to put in a 256GB SSD instead of spending that $ on a faster GPU.
November 1, 2012 Rumor: PS4 based on AMD A10 series, new dev kits shipping
- Console features revamped UI.
- Travel "anywhere" feature on the system mid-game
http://www.vg247.com/2012/11/01/ps4_details_playstation_4/
and
http://www.slashgear.com/sony-ps4-orbis-based-on-tweaked-amd-a10-tip-devs-01255214/
I doubt they would put a 256GB SSD in there as standard... as you said, that would be a large cost to them, which would then be passed on to us. Maybe there will be different versions with SSDs. A 7200rpm laptop drive would be plenty, I think. From what I've read, even putting an SSD into a PS3 doesn't give much benefit, as some sort of encryption needs to take place before writing, which becomes the bottleneck. I have a 250GB laptop drive in my PS3 and have never thought to myself that the machine feels slow.
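To make that bottleneck argument concrete, here's a rough sketch of an encrypt-then-write pipeline (the MB/s figures are made up purely for illustration, not measured PS3 numbers): if every write has to pass through an encryption stage first, the slowest stage caps the whole pipeline, so a faster drive behind it buys you nothing.

```python
# Toy model: sustained throughput of a two-stage encrypt-then-write
# pipeline is capped by the slowest stage. All figures below are
# illustrative assumptions, not measured PS3 numbers.
def effective_write_mb_s(encrypt_mb_s: float, drive_mb_s: float) -> float:
    return min(encrypt_mb_s, drive_mb_s)

# If encryption tops out around 40 MB/s, an SSD ends up writing no
# faster than a laptop HDD in practice:
print(effective_write_mb_s(encrypt_mb_s=40, drive_mb_s=100))  # HDD -> 40
print(effective_write_mb_s(encrypt_mb_s=40, drive_mb_s=250))  # SSD -> 40
```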
The architectures of Orbis and the PS3 are going to be different anyway. The old bottlenecks may be irrelevant.
I agree, but there certainly COULD be the same (or similar) bottleneck. I was just pointing out that there is(?) a bottleneck in the PS3, and there could be a similar one in the PS4, making an SSD pointless.
-------------
The minute I read that Sony is aiming at a "very affordable" console, I knew that wasn't a good sign for good hardware, or even mid-range hardware for that matter. An A10-based PS4 with no discrete GPU sounds like Sony is going Nintendo-style with their next console by aggressively cost-cutting the components. I bet they are going to try to lose very little money on the PS4, or even make $ in the first year. If they use a 256GB SSD for main storage and stick with the A10 APU, that is the worst engineering decision ever! Flash storage is very expensive. They should go the Wii U route and let consumers use any SSD/HDD they choose for upgrading later. With constant price drops on SSDs, it would be a huge waste of $ to put in a 256GB SSD instead of spending that $ on a faster GPU.
This next generation of consoles is shaping up to be very underwhelming.
Remember, these are nowhere near final specs. Both links seem similar enough that you may be looking at one "source" piggy-backing on the other. VG247 claims "multiple" sources, so who knows how deep/true that is.
They have to aim for a "very affordable" console; they basically have no choice. And if 1080p60 3D is their goal (which is basically 1080p at 120fps, right?), would an APU even be able to pull that off?
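For rough scale, here's just the raw pixel math (assuming frame-sequential stereo 3D, i.e. one full 1080p frame per eye per refresh, and ignoring everything else that makes 3D expensive): 1080p60 in 3D pushes the same number of pixels per second as plain 1080p at 120fps.

```python
# Raw pixel throughput only; ignores geometry and per-pixel shading cost.
WIDTH, HEIGHT = 1920, 1080

def pixels_per_second(fps: int, views: int = 1) -> int:
    return WIDTH * HEIGHT * fps * views

print(pixels_per_second(60))           # 1080p60:      ~124 million px/s
print(pixels_per_second(60, views=2))  # 1080p60 3D:   ~249 million px/s
print(pixels_per_second(120))          # 1080p120:     ~249 million px/s
```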
It seems more likely that it will be the APU plus a mobile GPU in Crossfire, even if only a weak one. Even the quote that it's based on an A10 APU doesn't exclude the idea of a graphics card; it just says that there will be an APU in there at least.
Also, as far as I've read, the Wii U thing is about letting consumers choose an EXTERNAL drive for use with the system. The internal drive is non-replaceable and only 8GB or 32GB. Even after the Wii U comes out, there will only be one system that lets you swap the internal drive for any drive you can purchase that will fit, and that will be... the PS3.
Based on these specs, this thing is weaker than the Wii U.
I was under the impression that the Wii U is using the 5570 (400 shaders).
Now that I think about it, HCF (Hybrid CrossFire) would be a mistake really. It'd be like the Saturn's second processor, where most games simply wouldn't use it.
Perhaps something like the A10-5700 except with 640 shaders? 8 GB of RAM is nice, though. The 256 GB drive is almost definitely an SSD, but that won't be in the base model.
Definitely not, not even in today's games, never mind next-gen DX11 games.
http://www.hardwareheaven.com/revie...pu-performance-review-gaming-performance.html
However, the A10 Trinity CPU, paired with say an HD7850 GPU, will provide very good gaming performance for a console (but you can't Hybrid CF the 7660 and the 7850)
http://www.hardwareheaven.com/revie...rformance-review-gaming-performance-7850.html
Ya, but I think the 7660D can only be cross-fired with the HD6570 or HD6670, based on what Xbitlabs says. HD7660D + HD6570 still can't get 60 fps in old games. Even if they go with APU + dedicated GPU CF, it could still be way too weak.
Agree, those specs...
Ya, you are right that you cannot upgrade the Wii U's internal storage. What I was saying is that I personally much prefer the Wii U's approach of letting consumers upgrade the HDD, even if it's external. The Wii U's issue is that it has USB 2.0 ports, not 3.0 or eSATA. The issue of going with a mechanical disk drive again is a tricky one; I'm not sure if it's better for consumers. Both the PS3 and 360 charge a lot of extra $ for a small bump in HDD space, and MS's HDD upgrade is especially a rip-off.
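Just to put rough numbers on the USB 2.0 complaint (these are theoretical interface ceilings derived from the signalling rates; real-world USB 2.0 storage throughput is quite a bit lower because of encoding and protocol overhead):

```python
# Convert raw signalling rates (Mbit/s) to rough MB/s upper bounds.
# This ignores 8b/10b encoding and protocol overhead, so actual
# throughput is lower (USB 2.0 drives usually manage ~30-40 MB/s).
def mbit_to_mbyte(mbit_per_s: float) -> float:
    return mbit_per_s / 8

print(mbit_to_mbyte(480))   # USB 2.0          -> ~60 MB/s ceiling
print(mbit_to_mbyte(3000))  # eSATA (SATA II)  -> ~375 MB/s ceiling
print(mbit_to_mbyte(5000))  # USB 3.0          -> ~625 MB/s ceiling
```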
Actually, reading up on it, it looks like the next version of their APUs with GCN isn't expected to come out until after this launches, so it's probably unlikely (though not impossible). I wonder how well they could tweak asymmetric Crossfire to work if they only had to worry about one specific set of hardware? But it looks like we'll find out early next year (probably around GDC) what the final dev kits have in them, so not that much longer to wait.
I am not even sure you can do that, since GCN and VLIW are different GPU architectures. It'll be a driver mess! A more ideal solution would be Kaveri + an HD7750/7770 in Crossfire. For that to happen, they'd need to launch the PS4 in 2014, I bet. Also, if they will have the hardware finalized by Summer 2013, I can't see how Kaveri can fit into the picture.
The CPU will still be a huge improvement over the Cell, but I am still worried about that GPU. I mean, we criticize this CPU as a choice for our high-end gaming rigs, but with a modern GPU onboard it can still deliver >60 fps in modern games (unless it's a strategy game).
I think there is a missing dedicated GPU component somewhere Sony is hiding.
I will eat my hat if the shipped hardware has Trinity level graphics.
Actually, the main reason they needed to go to NVidia at the 11th hour wasn't the Cell: the original plan was for Sony to develop an in-house graphics chip, with the same team that developed the graphics chip for the PS2. That team fell way behind schedule and never delivered, which is why NVidia were called in.
I thought it was because Crazy Ken put two Cells in the original design and was telling people to get a second job to afford the PS3.
Once Sony caught wind of Ken's craziness, they axed him, scaled down the design, and eventually needed a GPU to fill the role of the missing second Cell.