The PS4's graphics array is essentially Pitcairn (Radeon HD 7870) with two Compute Units disabled for yields, leaving 1152 Stream Processors, 72 Texture Units, and 32 ROPs. It's clocked at 800 MHz for ~1843 GFLOPS.
The Xbone's graphics array is essentially Bonaire (Radeon HD 7790) with two Compute Units disabled for yields, leaving 768 Stream Processors, 48 Texture Units, and 16 ROPs. It's clocked at 853 MHz for ~1310 GFLOPS.
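If you want to sanity-check those GFLOPS figures yourself, here's a minimal sketch of the usual peak-throughput arithmetic for GCN-era GPUs (2 FLOPs per stream processor per clock, via fused multiply-add). The function name and structure are just for illustration.

```python
# Peak single-precision throughput for a GCN-style GPU:
# each stream processor retires 2 FLOPs per clock (one fused multiply-add).
def peak_gflops(stream_processors: int, clock_mhz: float) -> float:
    return stream_processors * 2 * clock_mhz / 1000  # MHz -> GFLOPS

print(peak_gflops(1152, 800))  # PS4:   1843.2 GFLOPS
print(peak_gflops(768, 853))   # Xbone: ~1310.2 GFLOPS
```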
Along with the vast disparity in GFLOPS, each APU uses a different overall memory setup. The PS4 APU uses GDDR5 memory on a 256-bit bus. The Xbone APU uses DDR3 memory on a 256-bit bus, with 32 MB of embedded SRAM (ESRAM) right on the APU die that is much faster than the DDR3. Each setup has its strengths and weaknesses.
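For a rough sense of what those bus choices mean, here's a sketch of the peak-bandwidth arithmetic. The effective transfer rates used below (5500 MT/s GDDR5 for the PS4, 2133 MT/s DDR3 for the Xbone) are the commonly reported figures and are my assumption, not something stated above; the ESRAM's bandwidth is quoted separately by Microsoft and isn't derived from the DDR3 bus.

```python
# Peak bandwidth = bus width (in bytes) * effective transfer rate.
def peak_bandwidth_gbs(bus_bits: int, transfer_mts: float) -> float:
    return bus_bits / 8 * transfer_mts / 1000  # MT/s -> GB/s

# Assumed effective rates (commonly reported, not from the post itself):
print(peak_bandwidth_gbs(256, 5500))  # PS4 GDDR5:  176.0 GB/s
print(peak_bandwidth_gbs(256, 2133))  # Xbone DDR3: ~68.3 GB/s
```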
Bringing the Wii U into the conversation: the Wii U has separate CPU and GPU dies. The Wii U GPU is reportedly based on AMD's Radeon 4000 series architecture, which was already 4 years old by the time of the Wii U's release. It has 320 Stream Processors, 16 Texture Units, and 8 ROPs, clocked at 550 MHz for ~352 GFLOPS. I don't know what Nintendo was thinking there, but at least it's not as bad as the Wii U CPU. The GPU die also contains the GPU used in the Wii (and by extension the one in the Gamecube), which includes 3 MB of framebuffer and texture cache.
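Plugging the Wii U's numbers into the same throughput formula puts the gap in perspective. This is illustrative only, using the figures quoted above, with the formula repeated so the snippet runs on its own.

```python
# Same peak-throughput formula as the earlier sketch.
def peak_gflops(stream_processors: int, clock_mhz: float) -> float:
    return stream_processors * 2 * clock_mhz / 1000

wiiu = peak_gflops(320, 550)          # 352.0 GFLOPS
print(peak_gflops(1152, 800) / wiiu)  # PS4 is roughly 5.2x the Wii U's peak
print(peak_gflops(768, 853) / wiiu)   # Xbone is roughly 3.7x
```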
As for the Wii U memory setup, it's functionally very similar to the Xbone's (or rather, the Xbone's is similar to the Wii U's), while also being integral to Wii/Gamecube backwards compatibility. The GPU die contains 32 MB of eDRAM that stands in for the 24 MB of 1T-SRAM found in the Wii and Gamecube when running in backwards compatibility mode. In Wii U mode it serves as a very fast cache, used primarily for the framebuffer and texture caching (I would assume). There is also 2 GB of DDR3 main memory connected via a 64-bit bus; developers have access to 1 GB of it, while the rest is reserved for the OS and background tasks.
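Using the same bandwidth arithmetic as before, and assuming the commonly reported DDR3-1600 (1600 MT/s) for the Wii U's main memory (a spec not given in this post), the 64-bit bus works out to a much narrower pipe than the other two machines, which is exactly why the on-die eDRAM matters so much.

```python
# Same peak-bandwidth formula as earlier; the 1600 MT/s DDR3 rate is an
# assumption based on common reporting, not a spec quoted in this post.
def peak_bandwidth_gbs(bus_bits: int, transfer_mts: float) -> float:
    return bus_bits / 8 * transfer_mts / 1000

print(peak_bandwidth_gbs(64, 1600))   # Wii U DDR3: 12.8 GB/s
print(peak_bandwidth_gbs(256, 2133))  # Xbone DDR3: ~68.3 GB/s (for contrast)
```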