
Anandtech XB1 PS4 CPU/Video/Memory Performance Evaluation

avtek21

Member
Before the next gen game consoles from Sony and Microsoft were released, there were heated debates about design decisions, but the initial games seem to get similar performance while generating high-quality imagery.

The decision to mate an AMD Jaguar tablet-class 8-core CPU with Radeon 7970-class graphics tied to either GDDR5 or eDRAM will continue to generate arguments.

Whatever one's allegiance, Battlefield 4 at high res and 60 fps on a $400 box is amazing!

http://anandtech.com/show/7528/the-xbox-one-mini-review-hardware-analysis

Console thread, moving from CPUs forum to Console forum
-ViRGE
 
Last edited by a moderator:
Before the next gen game consoles from Sony and Microsoft were released, there were heated debates about design decisions, but the initial games seem to get similar performance while generating high-quality imagery.

The decision to mate an AMD Jaguar tablet-class 8-core CPU with Radeon 7970-class graphics tied to either GDDR5 or eDRAM will continue to generate arguments.

Whatever one's allegiance, Battlefield 4 at high res and 60 fps on a $400 box is amazing!

http://anandtech.com/show/7528/the-xbox-one-mini-review-hardware-analysis
Actually, posting this would create a lot more chaos than you'd imagine. Anyway, the GPU is not much above the current 7850 or 7870 mid-range cards, but consoles make better use of game code and as such should outperform similarly specced PC configs by quite a wide margin, hence the 60 fps near-full-HD gaming experience.
 
Plus I'm not sure running BF4 at 720p or even 900p could be considered "high resolution" when my 7950 runs BF4 at 1200p solidly at 60 FPS with high-to-ultra settings.

Unless you just typo'd 7970 and meant 7870 (which would be accurate for the PS4).
 
Just to add, the PS4 is definitely that much better, as far as visuals are concerned, than the Xbox at this point in time ~
xbone-ghosts.png

The top one is from PS4 while the one below is Xbox
ps4-ghosts.png
 
The single fact is that neither the PS4 nor the Xbox One can run a growing number of titles in 1080p. Not to mention medium settings or below mixed with, for example, 720p. Plus the dynamic downscaling even further when the game gets more demanding. That's seriously a disappointment, especially when both companies have announced their consoles are meant to last at least 10 years.

And to the OP, BF4 runs at medium settings at best on consoles. And lower res.

K3m2jZ9.png

2564953-5729570096-25649.png
 
The single fact is that neither the PS4 nor the Xbox One can run a growing number of titles in 1080p. Not to mention medium settings or below mixed with, for example, 720p. Plus the dynamic downscaling even further when the game gets more demanding. That's seriously a disappointment, especially when both companies have announced their consoles are meant to last at least 10 years.

Because launch titles are always a solid assessment of a console's capabilities :|

Maybe I just imagined the gigantic worlds apart difference in graphics fidelity when comparing Call of Duty 3 to The Last of Us.
 
PC gaming for life!


You see the power/perf numbers on the CPU; I'd vomit if I owned a console and Mantle took over for all vendors on PC... You'd be in a world of worthless.
 
The first few big games will look like late 360/PS3 games with extra "shine".

They will both look better in the future.
Exclusives will look great on each.
Non exclusives will look the same on both.

And the glorious PC gaming master race will forever rule over the dirty console peasants.

hth
 
I played some BF4 on the PS4 over the weekend and it was pretty freaking nice. Certainly nothing like the game is on Ultra on the PC, but my buddy who had just gotten the console was gushing like nobody's business about how amazing the graphics were.

For console gamers these machines are incredibly huge upgrades over PS3/360. BF4 was actually pretty decent. I was impressed how good it looked considering the hardware it is running off is a lot weaker than a PC equipped with a $200 GPU.

It's a shame Anand used COD: Ghosts for that article. The game is a rotten turd and looks like total garbage even on the PC. Battlefield 4 would have been a much better choice!
 
For console gamers Doom 95 still looks good 😀 😀 😀

By all means the peasants think it looks STUPENDOUS but compared to a PC, don't even go there.

Oh, and those screenshots, my eyes are bleeding. Those are next gen consoles?
 
Based on the load vs idle numbers I'm going to guess that the only thing that happens for "idle" is that the GPU clocks and load are reduced.
 
The decision to mate an AMD Jaguar tablet-class 8-core CPU with Radeon 7970-class graphics tied to either GDDR5 or eDRAM will continue to generate arguments.

7970? Not even close.

The XB1 GPU has 768 stream processors and 16 ROPs, with a clock speed of 853 MHz. The PS4 GPU has 1152 SPs and 32 ROPs running at 800 MHz.

Here are a few discrete AMD GPUs in comparison:
7770 - 640 SPs, 16 ROPs, 1000 MHz
7790 - 896 SPs, 16 ROPs, 1000 MHz
7850 - 1024 SPs, 32 ROPs, 860 MHz
7870 - 1280 SPs, 32 ROPs, 1000 MHz
7970 - 2048 SPs, 32 ROPs, 1000 MHz

As we can see from these comparisons, the XB1 GPU falls about halfway between the 7770 and 7790 in terms of raw computing horsepower, but it will suffer from lack of GDDR5 and the lower clock rate. The PS4 GPU is probably about on par with a 7850: it has a few more SPs, but also a slightly lower clock rate.

The 7970 is more than twice as powerful as the XB1 GPU.
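A quick back-of-envelope sketch using the numbers from the list above (peak single-precision throughput estimated as SPs × 2 FLOPs/cycle × clock; this ignores bandwidth, ROPs, and everything else that matters in practice):

```python
# Rough peak throughput: SPs * 2 FLOPs/cycle (FMA) * clock in GHz -> GFLOPS.
# Clocks and SP counts are those quoted in the post above; real-world
# performance also depends on memory bandwidth, ROPs, and driver overhead.
gpus = {
    "XB1":  (768,  0.853),
    "PS4":  (1152, 0.800),
    "7770": (640,  1.000),
    "7790": (896,  1.000),
    "7850": (1024, 0.860),
    "7870": (1280, 1.000),
    "7970": (2048, 1.000),
}

def gflops(name):
    sps, ghz = gpus[name]
    return sps * 2 * ghz

for name in gpus:
    print(f"{name}: {gflops(name):.0f} GFLOPS")

print(f"7970 / XB1: {gflops('7970') / gflops('XB1'):.1f}x")
```

On raw compute the gap works out to roughly 3x, which is consistent with the "more than twice as powerful" claim even before counting the 7970's bandwidth advantage.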
 
16 ROPs? That's not a good sign, I think.
It looks like the Xbox One was designed for 720p.
Yup, that's really going to hold the Xbone back. Silly how Xboners are touting 276 GB/sec peak bandwidth by lumping in the embedded RAM. The likelihood of devs really crafting cross-platform games to take advantage of the eSRAM is nil. I am not impressed by the Xbone. The power consumption of the PS4 worries me a bit though, as I am just too traumatized by the RRoD. MS handled that situation very poorly. Any failure by design from any maker is unacceptable.
 
I wouldn't worry too much regarding the PS4 and heat, neither console maker skimped on the cooling solution.
 
I think it needs to be taken into consideration that a good majority of people don't even sit close enough to their TVs to discern the difference between 720p and 1080p.
 
I wouldn't worry too much regarding the PS4 and heat, neither console maker skimped on the cooling solution.
Aside from good thermal protection that prevents the system from exceeding a temperature that would melt POS leadless solder, I'm still concerned. Then again, I follow directions and keep consoles with adequate air gaps on all sides, and even still 360s have red-ringed on me. A lot of people put consoles in closed cabinets, on plush carpet, or in other suboptimal spots.

I think it needs to be taken into consideration that a good majority of people don't even sit close enough to their TVs to discern the difference between 720p and 1080p.
Pfff, there is no reason to keep games at 720p any longer. Xbone at 720p with no antialiasing and a higher price tag... so much fail.
 
I wish I could use one of these as a desktop
Not even interested in gaming on it, just want to install an OS and run it.

Anyone heard anything about them being unlocked or allowing this?
 
Just to add, the PS4 is definitely that much better, as far as visuals are concerned, than the Xbox at this point in time ~
xbone-ghosts.png

The top one is from PS4 while the one below is Xbox
ps4-ghosts.png

You've got that reversed. Take a look at the links: the bottom one is ps4-ghosts.png, the top one is xbone-ghosts.png.
You can tell by the (lol) pixels in the textures, e.g. the sign on the right. Aliasing on the grass is better on the bottom one. In motion the aliasing differences are pretty noticeable, especially if you have a nice TV.
 
I wish I could use one of these as a desktop
Not even interested in gaming on it, just want to install an OS and run it.

Anyone heard anything about them being unlocked or allowing this?

Why would you want to use a slow (for desktop apps) CPU with most of the power directed to the GPU for a desktop that you don't want to game on?

For normal desktop apps that are not heavily multithreaded, I would think that a Pentium, or certainly an i3, would be much more pleasant to use.
 
Why would you want to use a slow (for desktop apps) CPU with most of the power directed to the GPU for a desktop that you don't want to game on?

For normal desktop apps that are not heavily multithreaded, I would think that a Pentium, or certainly an i3, would be much more pleasant to use.

I like the idea of it too, simply to be able to use the $400 PS4 as a daily PC would be great.
8 cores would render lots of Chrome tabs pretty quickly, and that RAM bandwidth would be wonderful for WinRAR (because that's all I do all day), Photoshop, and video encoding. Mainly, I'd like to see how it "stacks up". Kinda like the fun I get out of having a quad-core 2.3 GHz phone with 2 GB of RAM. Just something great about having so many cores...
 
Pfff, there is no reason to keep games at 720p any longer. Xbone at 720p with no antialiasing and a higher price tag... so much fail.

I don't disagree, and while I doubt any of us tech types would consider an XB1 for this reason alone, the majority of people out there won't notice a difference. Average Joe notices the graphical effects and whether or not the game runs smoothly. If he's sitting 10 feet away from his 55-inch TV, he's not going to notice the difference in resolution, and probably wouldn't notice it even if he was close enough to the TV.

If the games all have the same graphical effects, and the same framerates, most people aren't going to be able to tell a difference between the 2 consoles.
 
I don't disagree, and while I doubt any of us tech types would consider an XB1 for this reason alone, the majority of people out there won't notice a difference. Average Joe notices the graphical effects and whether or not the game runs smoothly. If he's sitting 10 feet away from his 55-inch TV, he's not going to notice the difference in resolution, and probably wouldn't notice it even if he was close enough to the TV.

If the games all have the same graphical effects, and the same framerates, most people aren't going to be able to tell a difference between the 2 consoles.
I get what you're saying but even my average Joe friends see the difference from ten feet on my 50 inch. The jaggies at 720 are made all the more obvious.
 
XBone strikes me as a better living room box than the PS4, which strikes me as more gaming focused (due to specs).

Should be interesting to watch this play out while sitting on my throne on top of PC Master Race Mountain.
 
As we can see from these comparisons, the XB1 GPU falls about halfway between the 7770 and 7790 in terms of raw computing horsepower, but it will suffer from lack of GDDR5 and the lower clock rate. The PS4 GPU is probably about on par with a 7850: it has a few more SPs, but also a slightly lower clock rate.
Memory bandwidth isn't really a big deal here. It would have been, had the XB1 come with 32 ROPs, but with 16, the problem is essentially alleviated. ROPs are huge consumers of bandwidth, and when a GPU is bandwidth starved, it's almost always because of the ROPs. There's a good reason that AMD chose to feed their 32 ROPs in Tahiti with an extra 64 bit memory controller -- they couldn't get their memory bus speeds high enough to keep the ROPs fed, so they had to get extra bandwidth from somewhere. (This was likely more of a design decision rather than a "couldn't do it" sort of thing, but my point remains: ROPs are very bandwidth-hungry).
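A rough illustration of that point, using the specs quoted earlier in the thread (peak color-write demand only; real ROPs also spend bandwidth on Z/stencil and on blending reads, so treat these as illustrative ceilings, not measurements):

```python
# Peak ROP color-write demand: ROPs * clock (GHz) * bytes per pixel -> GB/s.
# 4 bytes/pixel assumes a plain 32-bit RGBA8 render target; blending roughly
# doubles the traffic (read + write). Pool bandwidths are the commonly quoted
# figures (~68 GB/s DDR3 for XB1, ~176 GB/s GDDR5 for PS4).
def rop_write_gbps(rops, clock_ghz, bytes_per_pixel=4):
    return rops * clock_ghz * bytes_per_pixel

xb1 = rop_write_gbps(16, 0.853)   # ~54.6 GB/s
ps4 = rop_write_gbps(32, 0.800)   # ~102.4 GB/s
print(f"XB1 ROP write demand: {xb1:.1f} GB/s vs ~68 GB/s DDR3")
print(f"PS4 ROP write demand: {ps4:.1f} GB/s vs ~176 GB/s GDDR5")
```

Even this crude ceiling shows why 16 ROPs keep the XB1's demand within reach of its ~68 GB/s DDR3 pool, whereas a 32-ROP configuration on the same memory system would have been starved.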

My biggest issue with the consoles, at least from a performance perspective, is their idle power. I don't actually care about the consoles at all, so the specs are rather meaningless to me, but that idle power draw is something out of pre-power-gating 2006. It's bothersome for me to see processors that are going to be in tens of millions of systems, if not more, that have such awful power draw at idle. It's just plain wrong that so many years have gone by and these things still hog that much power doing next to nothing. I blame the software though, and not the silicon. I really hope that there are some software updates that clamp down on idle and standby power.
 