Discussion in 'CPUs and Overclocking' started by inf64, Nov 4, 2012.
Even with optimizations on a closed platform, expecting a 7660D to reach 1080p @ 60 frames per second is extremely optimistic, is it not?
And with a release sometime in 2014 or '15?
It's a custom chip, but based on the A10. We have no idea how many SPs it has.
As for the release date, maybe late 2013 at best.
It might feature a GPU as well.
It's also not bad. Considering how closely knit the developers are with the hardware, it wouldn't be difficult to imagine decent frame rates at 1080p from a Trinity A10. Think about how crappy current console hardware is and how smoothly games run, some even at 1080p.
Most games run at 640p and are upscaled. A few games run at 720p, and even fewer actually run at 1080p.
True. PS3 Crysis 2 runs @ 1024x720, for instance.
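A quick back-of-the-envelope comparison makes the upscaling point concrete (a sketch; the only figure taken from the thread is the quoted Crysis 2 resolution):

```python
# Pixel counts for a few render resolutions; upscaling to 1080p doesn't
# add detail, it just stretches however many pixels were actually rendered.
resolutions = {
    "Crysis 2 on PS3 (as quoted)": (1024, 720),
    "720p": (1280, 720),
    "1080p": (1920, 1080),
}
target = 1920 * 1080  # native 1080p pixel count
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} pixels ({px / target:.0%} of 1080p)")
```

So even the "720p-class" games are pushing well under half the pixels of native 1080p.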
I think it's safe to assume that the development kits are shipping with A10 derivatives simply because that's the best APU tech currently available. Just as Apple's x86 dev boxes were NetBurst Pentium 4 based.
No doubt, with retail release as far off as it still is, Sony will be getting from AMD either the best then-available or initial production of soon-to-be-released APUs.
Then why not simply ask the developers to go and get whatever recent x86 CPU and GFX card they like, if it's not going to match the actual PS4 hardware in the end anyway. Why bother sending them a special "PS4 development PC"?
That's basically what the original Xbox was.
I'm thinking that while it may not be final silicon, the architecture is. That means devs can start hand-crafting optimized code (asm, C, whatever) for the platform, and it will still be good when the final chip lands; it will just be faster.
Oh, and if this is not the 1st of April or something... I shall enjoy watching the hat-eating... and we want pics. You know who you are.
Maybe to show the devs the high-tech world of DX11.
Console definitely won't have 8GB RAM.
Surprised they're shipping a PC, as that's not going to have anywhere near as much bandwidth for the GPU as a console [hopefully] would.
If you go by the same rumour-mill crap and "insider sources" and "already shipped" claims, then it seems the Xbox 720 will be Intel and nVidia.
But that's the problem in the first place: nonsense rumours. One would think people would have learned by now, after having their fingers burned several times.
The Xbox 360 dev kit had an X800 card, btw, and the 360 was released with something between an X1800/X1900 GPU.
I believe they ship with extra RAM because they have to run the development software on it.
BTW, can an x86 CPU still be considered x86 if you rip out the x86 decoder?
Precisely, this means more about general hardware than anything specific, AMD x86 and DX11-class graphics hardware. I'd be VERY surprised if there wasn't a discrete GPU included in the final hardware. It would be interesting to see if they stay with an APU and perhaps use crossfire or perhaps even a setup with one GPU used for GPGPU while the other handles standard GPU duties (or even shut one of the chips down during video playback/web browsing/*insert light usage mode here* to stay cool and quiet).
The current Xbox 360s use a PowerPC "APU" where the AMD GPU is on the same die as the CPU.
I'm hoping the final systems will both have at least 8GB of RAM since it's so cheap now.
8GB would be nice, but I think even 4GB would do the trick provided it's hellishly fast RAM. The A10 is incredibly sensitive to memory speed in desktop benches, with DDR3-2133 making a monster of an improvement over 1066 and 1333.
If they managed to get something north of 2133 on a double-wide bus it'd be pretty solid.
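Peak theoretical bandwidth is just transfer rate times bus width, so the jump is easy to sketch (assuming "double-wide" means a 256-bit, i.e. quad-channel, DDR3 bus; that reading is my assumption, not from the thread):

```python
# Peak DDR3 bandwidth = transfer rate (MT/s) x bus width in bytes.
def peak_bw_gbs(mt_per_s, bus_bits):
    return mt_per_s * 1e6 * (bus_bits // 8) / 1e9  # GB/s

for speed in (1066, 1333, 2133):
    print(f"DDR3-{speed}, 128-bit (dual channel): {peak_bw_gbs(speed, 128):.1f} GB/s")

# A hypothetical 256-bit "double-wide" bus simply doubles that:
print(f"DDR3-2133, 256-bit: {peak_bw_gbs(2133, 256):.1f} GB/s")
```

Dual-channel DDR3-2133 tops out around 34 GB/s; doubling the bus width gets you to roughly 68 GB/s, which is still modest next to a discrete GPU's GDDR5.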
Well, Sony did use XDR in the PS3; if they used XDR2 in the PS4 it would give insane performance. If they use XDR2, I would live with 4GB of RAM.
It still means the architecture is roughly the same, though. They wouldn't send out dev kits for an AMD APU and then use an nVidia GPU + Intel chip.
We do know that AMD is supplying GPUs for all 3 next-gen consoles; we just don't know what else, if anything.
The Xenos GPU in the Xbox 360 was a unified-shader design that prefigured R600, the first of the unified VLIW5 desktop series. Nothing like the X800/X1800.
Why would you rip out the decoder? Is there any good reason not to leverage the vast x86 optimization work that goes into today's x86/amd64 compilers?
It had 240 shaders total, 3x what Brazos currently has and the same number as the A4-3400. An A10 is only 2x-3x the performance of an A4, so there's no way Sony would just use the A10 IGP. Still, I'd expect the A4 performance level to be occupied by Jaguar and its successors in the near future.
I wouldn't be that surprised if the PS4 and Xbox 3 end up with a lot of RAM; the consoles simply aren't just "gaming consoles" anymore.
It would make sense to have Trinity as the basis of the current dev kits, with Kaveri, or an even more heavily SIMD-ed APU, as the basis for the actual PS4. If it is an APU-only system with no dedicated GPU, AMD would surely modify the memory controllers for GDDR5 and either use very high-speed GDDR5 (1200 MHz+) or perhaps 3 or 4 memory controllers, depending on how comprehensive the graphics array is, or maybe eDRAM for the entire APU via an MCM. Full Cape Verde performance is enough to match today's console graphics at 1080p and 60 FPS, with room left for some growth, but it wouldn't be competitive with the rumored Tahiti-level GPU in the Nextbox.
Yields could be a concern; even at 28nm, 2 Steamroller x86 modules @ 4.0 GHz plus 640 or more GCN2 processors @ 1 GHz would make for a decent-sized chip, over 250 mm² I bet, but the savings in mobo complexity and the need for only a single type of memory would be worth it. I think the ideal PS4 APU would be (if feasible):
2 Steamroller modules
256 bit FPU per core
960 GCN2 Radeon Cores
3x GDDR5 memory controllers (192-bit total) tied to 3 GB of GDDR5 memory @ 1200+ MHz
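For what that memory setup would be worth: GDDR5 moves four bits per pin per command clock, so assuming the 1200 MHz figure above is the command clock (not the per-pin data rate; that reading is my assumption), peak bandwidth works out as:

```python
# GDDR5 is quad-pumped: effective data rate = 4 x command clock.
clock_mhz = 1200                       # command clock from the spec sheet above
bus_bits = 192                         # 3 x 64-bit GDDR5 controllers
data_rate_mts = clock_mhz * 4          # 4800 MT/s per pin
bw_gbs = data_rate_mts * 1e6 * (bus_bits // 8) / 1e9
print(f"{data_rate_mts} MT/s on a {bus_bits}-bit bus: {bw_gbs:.1f} GB/s peak")
```

Call it roughly 115 GB/s, several times what any dual-channel DDR3 APU sees today, which is exactly why the GDDR5 controllers would matter.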
If this isn't feasible in terms of yields, then Trinity or Kaveri + a dedicated GPU like Tahiti would make more sense for the PS4, especially if Sony expects developers to make use of GPGPU processing on the APU for physics, animation, sound, etc. It would be best to keep those processes on their own die with actual graphics unaffected.