Digital Foundry: next-gen PlayStation and Xbox to use AMD's 8-core CPU and Radeon HD


2is

Diamond Member
Apr 8, 2012
4,281
131
106
Obviously if you turn off effects that are present in DX10 while gaming in DX9, it would run better; it's a no-brainer. But have DX9 render the same effects and it would be much less efficient.

If you're having to turn effects off, that kind of puts a dent in the "PS3 doing more work" theory. Your example is flawed in that it rarely ever happens. PC versions of games almost always have more eye candy than their console cousins. It could be in the form of higher FPS, bigger textures, more advanced lighting, higher resolution, longer draw distances, or any combination of those things. All of which translates to the PS3 doing a heck of a lot LESS work.

Find me a PC game that has identical effects in DX9 and DX10, and that also has nothing above and beyond what the PS3 version of the game has, and you'll have found a game where the PS3 is doing more work than the PC running it in DX10 mode. I'm not aware of any games that meet those criteria. Like I said, it was a nice theory, but it doesn't happen in practice.
 

itsmydamnation

Diamond Member
Feb 6, 2011
3,056
3,865
136
It's actually not correct to compare at vastly different frequencies, because performance does not scale linearly with frequency.

For example, the 2GHz Q9000 gets 2.4 points. That's 20% better.

I know, but I couldn't find any Core 2 Quad Cinebench 11.5 results at lower clocks (granted, I spent a whole five minutes googling).
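For what it's worth, the normalization itself is just this; the Q9000 figure is from the quote above, while the second chip's score is a made-up placeholder to show the idea, not a real result:

```python
# Per-clock ("per GHz") normalization of Cinebench R11.5 scores.
# Only the Q9000 figure comes from the post above; the second data
# point is a hypothetical placeholder, not a real measurement.

def per_ghz(score: float, ghz: float) -> float:
    """Cinebench points per GHz -- a crude proxy for per-clock throughput."""
    return score / ghz

q9000   = per_ghz(2.4, 2.0)   # Core 2 Quad Q9000 @ 2.0 GHz (from the post)
mystery = per_ghz(2.0, 2.0)   # hypothetical chip at the same clock

print(f"Q9000: {q9000:.2f} pts/GHz, mystery chip: {mystery:.2f} pts/GHz")
print(f"Per-clock advantage: {(q9000 / mystery - 1) * 100:.0f}%")  # -> 20%
```

Even this slightly overstates things, since scores don't scale perfectly linearly with clock, which is exactly the point about comparing at very different frequencies.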
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
They're not running the same effects. The PS3 version of BF3 has a much, much scaled-back version of the graphics compared to the PC version.

The lowest setting on the PC is Low, but the PS3's graphical settings would be equivalent to Very Low.


So the PS3 version has to cut back on field of view, polygons, textures, draw distances, shaders, framerate and resolution to deliver an experience below that of a 2006 PC (an Intel dual core + 8800GTS) that you could have bought three days before the PS3 came out. Obviously the PC was (much) more expensive ($600 just for the C2D and 8800GTS alone), but the PS3 isn't some kind of monster, and even all the clever (and pretty amazing) programming tricks they've pulled off aren't going to magically make the PS3 better than it is.

OK, I'm not sure you read the other posts, but it seems like you're coming into this discussion having missed a lot of what we said. The context was that 7850/7870 specs in a PS4 would NOT be the equivalent of 7850/7870 performance in a PC; I said they would be the equivalent of a 7950/7970. We referenced the PS3 and said that while it was released 8 years ago, it can render an incredible amount of detail that belies its PC-equivalent specs of 8 years ago.

I never used BF3 as my example, and I sure as hell never used modern PC hardware to say it's the equivalent of an 8-year-old console. DICE themselves said BF3 was created for PC first and THEN scaled down to the consoles (very rare, but the results were awesome for PC).

I used Crysis 3, which was rendered with an incredible amount of detail and optimized for consoles first. CryEngine 3 was created to squeeze every drop of performance out of the PS3, and the results were quite impressive. It shows what an 8-year-old console can achieve with optimized, efficient code that an 8-year-old PC with similar or slightly higher specs simply can't achieve (hence why I say a 7850 in a PS4 is more like a 7950/7970, due to this optimized code). So yes, 8-year-old and even 7-year-old PC hardware can NOT match it in terms of level of detail in optimized games like Crysis 3, because that is an example of a highly optimized engine that makes those consoles shine despite their dated hardware. A PC from 7-8 years ago simply can't render the same amount of detail in Crysis 3. I'm NOT comparing the PS3 to modern PC hardware.


I'm sure that 8 years from now (2020) the PS4 will be rendering things with a level of detail that a 7850/7870 simply can't match. That's my point.
 

ChronoReverse

Platinum Member
Mar 4, 2004
2,562
31
91
I think you're the one who's confused. Everyone knows that consoles tend to get more out of the same hardware; it's kind of obvious. But they're still going to fall short of top-of-the-line PC capabilities from the same period.

The 8800GTS was top of the line three days BEFORE the PS3 came out, and the PS3 has never matched that level. You simply don't seem to get that the PS3 has to cut tons of corners just to reach playable framerates. It's hilarious to use FC3 as another example, since that one was cut back on the PS3 as well (it's running at 704p like BF3 and can't even maintain 30 FPS). From a bit of googling, it looks like an 8800GTS can at least hold 30 FPS at medium settings at 720p (with people complaining that the PS3 version looks horrible in comparison, too). So even this game, which wasn't optimized for the PC, is beaten.



The top of the line right now is the GTX Titan, and it's out at least a few months before the PS4 will be. The "7860" in the PS4 will, even at the end of its lifetime, at best only match the Titan.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Good luck finding a single PC game that can run on an 8800GTX (I could make it really hard and say a Radeon X1900XT, since the 8800 wasn't out when the 360 came out) that looks as good as Halo 4.
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,653
2,494
136
Good luck finding a single PC game that can run on an 8800GTX (I could make it really hard and say a Radeon X1900XT, since the 8800 wasn't out when the 360 came out) that looks as good as Halo 4.

The Xenos was better than anything else out when it was released; no one sane contests that. It was only outclassed when the G80 came out.

As for games that run well on the 8800GTX: basically every single PC game that has a console version. The 8800GTX has more than twice the raw power of Xenos across the board, with more than three times the pixel-pushing power, and it's actually more efficient and more flexible. Everything that runs on the console will run on it, only better.
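If anyone wants to sanity-check those multipliers, the commonly quoted spec-sheet peaks work out roughly like this (theoretical numbers from memory, so treat them as ballpark, not gospel):

```python
# Spec-sheet arithmetic behind the Xenos vs. 8800GTX comparison above.
# These are the commonly quoted theoretical peaks -- approximations only.

# Shader throughput (GFLOPS)
xenos_gflops = 48 * 10 * 0.5    # 48 ALUs x (vec4+scalar MADD = 10 FLOPs) x 0.5 GHz = 240
g80_gflops   = 128 * 3 * 1.35   # 128 SPs x (MADD+MUL = 3 FLOPs) x 1.35 GHz ~= 518

# Pixel fill rate (Gpixels/s)
xenos_fill = 8 * 0.5            # 8 ROPs x 500 MHz = 4.0
g80_fill   = 24 * 0.575         # 24 ROPs x 575 MHz = 13.8

print(f"Shader ratio:    {g80_gflops / xenos_gflops:.2f}x")  # ~2.2x -> "more than twice"
print(f"Fill-rate ratio: {g80_fill / xenos_fill:.2f}x")      # ~3.5x -> "more than 3 times"
```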

As for games that run on it and are prettier than Halo 4, how about The Witcher 2? It runs just fine on an 8800GTX and is considered rather pretty on the PC.
 

ChronoReverse

Platinum Member
Mar 4, 2004
2,562
31
91
As for games that run on it and are prettier than Halo 4, how about The Witcher 2? It runs just fine on an 8800GTX and is considered rather pretty on the PC.

Plus native 720p and all that jazz that seems to be getting ignored. Even in 2006, people mostly played at 1280x1024, not 1280x720.

(All the visual shortcuts you can see in Halo 4 shouldn't even need mentioning.)
 

inf64

Diamond Member
Mar 11, 2011
3,884
4,692
136
TressFX: A new frontier of realism in PC gaming




http://blogs.amd.com/play/tressfx/
http://havok.com/news-and-press/releases/havoks-cutting-edge-physics-technology-showcased-playstation%C2%AE-meeting-2013

21st February 2013 - San Francisco - Havok, a leading provider of 3D interactive game development technology, today announced that its cutting-edge physics technology was showcased at PlayStation® Meeting 2013 in New York. At the event, Sony Computer Entertainment presented a live demo of Havok's Physics technology running on the PlayStation®4 computer entertainment system, and also showcased a number of Havok Physics-powered games. Havok has fully optimized its leading Tools and Middleware line-up for PlayStation®4, which is now available to all licensed PlayStation®4 developers and publishers.
"We are very pleased to offer support for PlayStation®4," said David Coghlan, Managing Director at Havok. "With the features that PlayStation®4 offers, we have been able to push our technologies to new limits, and we are honoured to be showcased as part of the PlayStation® Meeting 2013. We are really looking forward to seeing game developers deliver some amazing Havok-powered games for PlayStation®4."
Big win for AMD's GPU department. Looks like PS4 titles that use the Havok engine will have GPU-accelerated physics on their PC ports too ;).
Thanks to Final8ty on XS for finding the info and posting it.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
I don't know what so many of you are complaining about. Stop looking at just the hardware, and certainly don't try to compare console hardware with PC hardware spec-for-spec. PCs will always have better hardware on paper, but they're simply not as "efficient" or as purpose-built for games as consoles are (this is, of course, a simplification, but you get the point). Have fun trying to run Crysis 3 at medium settings, 720p, and 30fps on a PC with the same type of hardware used in the PS3 or 360. Console hardware can simply be pushed further than PC hardware thanks to standardized hardware, optimization, and so on; there's no way around it.

Just look at the new Killzone game: 1080p, 30fps (admittedly, it could be better), and the game looks gorgeous. Good lighting, very large scale, lots of particle effects, character models that seemed quite detailed, and I even noticed features like ambient occlusion in use. It was tough to say from the video, but some of the smoke might even have been volumetric. IMO the game is graphically on par with BF3 or Crysis 3 at high settings (maybe not the absolute highest, and it's tough to compare directly since the Killzone demo is set in a city without vegetation and such). And this is only a launch title. Do you remember how poor most PS3 and 360 launch titles looked compared to the games we're getting now?

There's nothing wrong with having no interest in consoles or with preferring PCs. Just understand that you're putting way more money into your PC over time to get an experience similar to the consoles'. That does change over time, with the graphical advantage shifting to PCs, but only because PC hardware is constantly moving forward. All this bickering, though, just shows an unwillingness on both sides to look at this logically. If you aren't interested in something, that's fine; just don't use that as an excuse to downplay something else simply because you don't want to like it. Also, being a PC gamer in no way, shape, or form makes you a better or smarter person or gamer than a console gamer; many PC gamers have a weird elitist attitude.

I, for one, will be welcoming all of the new consoles into my home alongside my gaming PC. I am very excited for this generation and am already impressed by what I'm seeing. Not to mention that sharing the x86 architecture should spell good news for PC gamers, especially if the rumor that the new Xbox will run a version of Windows pans out.

This was not directed at any one person, BTW.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
Great post, Hans. The PS4's specs are more than adequate since it's a highly optimized platform; there's no sense comparing it to the latest and greatest PCs out there just to prove a point. Sony would not have chosen this hardware unless they saw positives. They're not dumb, and they understand this concept much better than most of us; after all, this is their fourth PlayStation generation.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
It would have been a great post if he hadn't missed the point completely. Just about everyone agrees that consoles can be pushed further than PCs with the same or similar hardware. The debate is whether they can be pushed further than PCs with hardware that has twice the capabilities.
 

ChronoReverse

Platinum Member
Mar 4, 2004
2,562
31
91
Besides, it's not like we're saying the PS4 is going to fail or do badly because of its processing power. I'm actually quite interested in how the spectating features (watching someone else play) will turn out, for instance.

We're just pointing out where the limits are, since some people have been exaggerating them.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
How likely is it that the background-download CPU is the same ARM Cortex-A5 processor that AMD announced for TrustZone capabilities last summer? And if so, what else could be implemented on it?

http://www.anandtech.com/show/6007/...cortexa5-processor-for-trustzone-capabilities
Why bother? Background downloading will, with a real hardware NIC, take next to no CPU time. With a good choice of Ethernet and wireless NICs, the spare CPU not being used by your game will be more than enough. If they're going to have dedicated hardware for that, it will likely be truly dedicated hardware, but with 8 cores, why do that? Dedicating a core to system tasks, or requiring that x% of one CPU always be kept free for system tasks, would be plenty.
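To make the "dedicate a core" idea concrete, here's a rough sketch of what a reserved-core split could look like on a Linux-like OS. The 7+1 split, core numbers, and URL are assumptions for illustration only, and os.sched_setaffinity is Linux-only:

```python
# Sketch of the "reserve one core for system/background tasks" approach.
# Assumes a Linux-like OS; the 7+1 core split is purely illustrative.
import multiprocessing
import os
import urllib.request

GAME_CORES  = set(range(7))  # cores 0-6: game threads only
SYSTEM_CORE = {7}            # core 7: OS and background work

def background_download(url: str, dest: str) -> None:
    # Pin this worker to the reserved core, then fetch the file.
    os.sched_setaffinity(0, SYSTEM_CORE)
    urllib.request.urlretrieve(url, dest)

if __name__ == "__main__":
    os.sched_setaffinity(0, GAME_CORES)  # keep the "game" off the system core
    worker = multiprocessing.Process(
        target=background_download,
        args=("http://example.com/patch.bin", "/tmp/patch.bin"),  # hypothetical URL
    )
    worker.start()
    # ... game loop runs here on cores 0-6 ...
    worker.join()
```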

I'm surprised the rumors panned out, too, but a Bobcat derivative with better SIMD performance and a big shared L2 (hopefully that means 4MB+) should have plenty of spare room and time for idle downloading, or really any idle background task, provided the OS exposes a good scheduling API (likely, though it will probably be crude) and has good default scheduling rules; or they could go with the "this core is for the OS and stuff" method.

A Cortex-A5 is a 32-bit microcontroller-class core that happens to have an ARMv7 front end. Doing any real work on it would be pointless when a 1-2 IPC core, one that will actually sustain >1 IPC and has a chance at 2 IPC with optimized code (realistically >1.5; tight int+FP code that could do better is not very common, and really AGU-heavy code tends to be low-ILP with lots of stalling), is sitting there with idle time. File serving and downloading take extremely little CPU time. As long as there is some spare L2 to use, they wouldn't even need to dedicate a core to OS tasks (though they might do that anyway, just to simplify things on their end).
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
TressFX: A new frontier of realism in PC gaming




http://blogs.amd.com/play/tressfx/
http://havok.com/news-and-press/releases/havoks-cutting-edge-physics-technology-showcased-playstation%C2%AE-meeting-2013

Big win for AMD's GPU department. Looks like PS4 titles that use the Havok engine will have GPU-accelerated physics on their PC ports too ;).
Thanks to Final8ty on XS for finding the info and posting it.

I see nothing here that suggests this will be leveraging the GCN cores to run the physics.

Havok has traditionally been done on the CPU.
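For what it's worth, the reason GPU physics keeps coming up is that the hot loop of a physics step is a uniform, data-parallel update over thousands of bodies, exactly the shape of work GPUs eat up. A toy illustration, with numpy standing in for wide SIMD or shader cores (this has nothing to do with Havok's actual API):

```python
# Toy, non-Havok illustration of why rigid/particle physics maps well to
# GPU compute: one identical instruction stream applied to every body.
import numpy as np

N = 100_000
pos = np.random.rand(N, 3).astype(np.float32)   # positions
vel = np.zeros((N, 3), dtype=np.float32)        # velocities
gravity = np.array([0.0, -9.81, 0.0], dtype=np.float32)

def step(pos, vel, dt=1.0 / 60.0):
    """One semi-implicit Euler step for all N bodies at once."""
    vel += gravity * dt                          # same op for every body
    pos += vel * dt
    np.maximum(pos[:, 1], 0.0, out=pos[:, 1])    # crude ground plane at y=0
    return pos, vel

pos, vel = step(pos, vel)
```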
 

lainse

Junior Member
Feb 1, 2013
4
0
0
A Cortex-A5 is a 32-bit microcontroller-class core that happens to have an ARMv7 front end. Doing any real work on it would be pointless when a 1-2 IPC core, one that will actually sustain >1 IPC and has a chance at 2 IPC with optimized code (realistically >1.5; tight int+FP code that could do better is not very common, and really AGU-heavy code tends to be low-ILP with lots of stalling), is sitting there with idle time. File serving and downloading take extremely little CPU time. As long as there is some spare L2 to use, they wouldn't even need to dedicate a core to OS tasks (though they might do that anyway, just to simplify things on their end).

Well, you are probably right. I was just wondering whether the ARM A5 core in Jaguar could be used for more than TrustZone alone; they probably have two A5 cores on the die.
How difficult would it be to swap the A5 for an A9? From what I can see, the bus would be the same, and Jaguar is a synthesized design.
 

HeXen

Diamond Member
Dec 13, 2009
7,835
37
91
So I guess this is the demo Sony showed. Better than Heavy Rain's close-ups, I suppose.

http://www.tested.com/tech/gaming/453666-playstation-4-press-conference/
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Well, you are probably right. I was just wondering whether the ARM A5 core in Jaguar could be used for more than TrustZone alone; they probably have two A5 cores on the die.
How difficult would it be to swap the A5 for an A9? From what I can see, the bus would be the same, and Jaguar is a synthesized design.

I think you're missing the entire point of TrustZone. Chipsets today also contain such a CPU in some form, but it's isolated; otherwise it could be compromised, and there wouldn't be much trust left in TrustZone.

And that's the entire reason the ARM core is there in the first place. Otherwise you would just use the x86 cores.

Also, your scenario would only work in a case where power was severely limited, but this is on grid power.
 

lainse

Junior Member
Feb 1, 2013
4
0
0
I think you're missing the entire point of TrustZone. Chipsets today also contain such a CPU in some form, but it's isolated; otherwise it could be compromised, and there wouldn't be much trust left in TrustZone.

And that's the entire reason the ARM core is there in the first place. Otherwise you would just use the x86 cores.

Also, your scenario would only work in a case where power was severely limited, but this is on grid power.
You are probably right. But I do understand TrustZone; you could use the A5 for other functions, because:


“The second aspect of the TrustZone hardware architecture is the extensions implemented in some of the ARM processor cores. These additions enable a single physical processor core to execute code safely and efficiently from both the Normal world and the Secure world in a time-sliced fashion”
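A toy model of what that time-slicing means in practice; this is purely conceptual Python, and the real mechanism (the SMC instruction trapping into monitor mode, banked registers, and so on) is not modeled here:

```python
# Conceptual toy model of one core time-slicing between TrustZone's
# Normal and Secure worlds. Nothing here resembles real TrustZone code.

class World:
    def __init__(self, name, tasks):
        self.name = name
        self.tasks = list(tasks)
        self.saved_state = {}    # stand-in for the world's saved context

normal = World("Normal", ["game frame", "background download"])
secure = World("Secure", ["DRM key check", "secure storage access"])

def monitor_switch(from_world, to_world, core_state):
    """Stand-in for monitor mode: save one world's context, restore the other's."""
    from_world.saved_state = dict(core_state)
    return dict(to_world.saved_state)

core_state = {}
current, other = normal, secure
for _ in range(4):                       # time-sliced execution on ONE core
    if current.tasks:
        print(f"[{current.name} world] running: {current.tasks.pop(0)}")
    core_state = monitor_switch(current, other, core_state)
    current, other = other, current
```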