[Techspot] - AC: Unity "too much" for console CPUs


SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Did you ever think that they could actually have exceeded the CPU capacities of both console chips?


Do you think we'll ever see a game at some point over the lifetime of the consoles that exceeds AC Unity's AI and what is on screen? Or do you think that game is the absolute upper limit of what the PS4 and Xbox One will ever be able to do?
 

jhu

Lifer
Oct 10, 1999
11,918
9
81
The problem is they are already familiar with the hardware from the PC side.

Even the aspect ratio is now being chopped to 2.35:1, and the same 30FPS nonsense.

Halo vs. Doom 3 on the original Xbox. Still, it's low-end PC hardware, yet they were able to cram Doom 3 onto it.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
1) Why do the OSes take 2 cores? What can they possibly be doing in a bare-metal programming model that needs 2 cores? Give the game 7 of 8 cores, and 7 of 8 GB RAM. Given how little RAM and CPU the 360 and PS3 OS worked on, surely 4-5x more RAM (1GB) and 4-5x more CPU (one Jaguar core @ 1.6-1.7GHz) is enough. Potential throughput with 7 cores instead of 6 obviously increases.
Because consoles are about multitasking. You can play, have a Skype call, update a game, stream to Twitch, and have a camera scan your motion all at the same time. If a dev targets every single bit of available resources with their game, the gaming experience suffers as soon as something happens in the background. There needs to be guaranteed performance - that is why there are cores dedicated only to background tasks, while the rest are 100% for the game.
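
To make the "guaranteed performance" idea concrete, here is a minimal sketch of reserving cores for the game, written against the Linux affinity API purely as a stand-in - the console SDKs expose their own equivalents, and the 2-vs-6 split below is just illustrative, not the actual PS4/Xbox One partitioning.

/* Sketch: pin the calling thread to the "game" cores only.
 * Assumes cores 0-1 are imagined as OS/background cores and 2-7 as game
 * cores; real consoles do this in the system software, not the game. */
#define _GNU_SOURCE
#include <sched.h>
#include <stdio.h>

static int pin_to_game_cores(void)
{
    cpu_set_t set;
    CPU_ZERO(&set);
    for (int core = 2; core <= 7; core++)  /* leave cores 0-1 to the OS */
        CPU_SET(core, &set);
    return sched_setaffinity(0, sizeof(set), &set);  /* pid 0 = calling thread */
}

int main(void)
{
    if (pin_to_game_cores() != 0)
        perror("sched_setaffinity");
    else
        printf("game threads restricted to cores 2-7\n");
    return 0;
}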

2) Why did they downclock Jaguar from its design target of 2.0GHz? They have a thermal budget, yes, but AMD clearly already did the optimization math of speed vs. TDP and settled around 2.0GHz for its top models of Jaguar. It's already a very low power CPU... Maybe the process the console chips are fabbed on has an inferior power curve?

Simply, maybe that is enough CPU power? The top models are not running at the perf/watt sweet spot. Maybe they cut a few watts on the CPU and add a bit more MHz to the GPU?
Also, there is no binning done with console chips. While AM1 drops into 3 bins, all console APUs need to be functional and within spec.
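
As a rule-of-thumb illustration of that sweet-spot argument: dynamic power scales roughly with frequency times voltage squared, and voltage has to climb to reach the top clocks, so the last few hundred MHz cost disproportionately many watts. A quick sketch with made-up voltage/frequency pairs (not real Jaguar numbers):

/* Illustrative perf-vs-power scaling: P_dyn ~ f * V^2.
 * The V/f pairs below are assumed placeholders, not measured Jaguar data. */
#include <stdio.h>

int main(void)
{
    struct { double ghz, volts; } points[] = {
        {1.6, 0.90}, {1.75, 0.95}, {2.0, 1.05},
    };
    double base = points[0].ghz * points[0].volts * points[0].volts;

    for (int i = 0; i < 3; i++) {
        double rel_power = points[i].ghz * points[i].volts * points[i].volts / base;
        double rel_perf  = points[i].ghz / points[0].ghz;
        printf("%.2f GHz: ~%.2fx perf for ~%.2fx dynamic power\n",
               points[i].ghz, rel_perf, rel_power);
    }
    return 0;
}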
 

Spjut

Senior member
Apr 9, 2011
932
162
106
Halo vs. Doom 3 on the original Xbox. Still, it's low-end PC hardware, yet they were able to cram Doom 3 onto it.

Doom 3 had parts of the levels remade in the Xbox version though.


I think it's worth mentioning that Capcom once compared the Xbox 360's CPU with a dual-core 3.2GHz Pentium 4 Extreme Edition CPU.
http://www.eurogamer.net/articles/face-off-resident-evil-5-article?page=2


And the chief technical officer of 4A Games had the following to say about the 360's CPU:
You can calculate it like this: each 360 CPU core is approximately a quarter of the same-frequency Nehalem (i7) core. Add in approximately 1.5 times better performance because of the second, shared thread for 360 and around 1.3 times for Nehalem, multiply by three cores and you get around 70 to 85 per cent of a single modern CPU core on generic (but multi-threaded) code.
http://www.eurogamer.net/articles/digitalfoundry-tech-interview-metro-2033?page=4
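
Plugging the CTO's rough factors into a quick back-of-envelope check - treating "approximately a quarter" as anywhere from 0.20 to 0.25, which is my assumption - lands right around the quoted 70-85 per cent range:

/* Back-of-envelope check of the 4A Games estimate quoted above. */
#include <stdio.h>

int main(void)
{
    const double smt_360     = 1.5;  /* claimed SMT uplift for the 360 cores */
    const double smt_nehalem = 1.3;  /* claimed SMT uplift for Nehalem */
    const int    cores       = 3;    /* Xenon core count */

    /* assumed range for "approximately a quarter of a Nehalem core" */
    for (double per_core = 0.20; per_core <= 0.251; per_core += 0.05) {
        double fraction = per_core * (smt_360 / smt_nehalem) * cores;
        printf("per-core factor %.2f -> ~%.0f%% of one Nehalem core\n",
               per_core, fraction * 100.0);
    }
    return 0;
}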
 

DrMrLordX

Lifer
Apr 27, 2000
22,931
13,011
136
The 360 fiasco definitely was a big part in this. It clearly showed the highest TDP that they can reliably fit into a console-sized package - namely, slightly lower than the launch Xbox 360. Remember that even the hottest original incarnation of the 360 only drew 180W while gaming - tiny compared to a modern PC PSU!

Well, right, but an A8-7600 can go as low as 45W, and that includes CPU and GPU on one die . . . mind you, that is one-third as many GCN cores as the PS4 chip has, at a lower clockspeed. They totally could have done 3x A8-7600 in a 130W power envelope, and those chips run pretty cool too. Whether or not they would have wanted to deal with a 3P monstrosity is an entirely different matter.

While console power consumption has fallen since last generation, gaming PC power consumption has massively increased. The entire PS4 only consumes 140W at most, less than the TDP of a single HD 7870 - and less than half the power consumption of an R9 290X.

Gaming PC power consumption has increased massively on the GPU side (ignoring the latest Nvidia offerings). For anyone running an APU, power consumption is quite low, and you can still run at decent resolution/quality settings with a Kaveri chip. Based on what AMD has been able to do on the PC with Kaveri, it almost looks like the PS4 and Xbox One have overbalanced their APUs by using too many GCN cores. 1152 shaders? Yowza. That's got to be a huge slice of the 140W power consumption figure you listed.

TDP was never an issue. Cooling has also improved dramatically since then, plus there's throttle behaviour if it should overheat.

Well yes, but my claim was that it was more of a psychological issue than a technical issue. Some executive proclaims "no more thermal hardware failures! AND we're spending less on components this time!", and voila, the engineers roll their eyes and spec a lower overall TDP so they can go in front of the brass and claim to have less power consumption, lower temperatures, and lower hardware failure rates without having to improve the cooling solutions.

I have a feeling they went with the 8 slower cores due to PR. A 2M/4T chip just wouldn't look good on paper when you had a 3C/6T and an "8 core" Cell before. The cores are also so weak that we see a whole 2 cores dedicated to non-gaming tasks.

That may well have been a factor. History has shown us many examples of engineering decisions in the PC sector being made based on marketing hype (such as the old "MHz myth").

It's a real shame though, because it impacts games in such a negative manner.

I'm still a little confused as to why Ubisoft is having so much trouble, though that issue is being well-discussed elsewhere, so I'll leave others to that.
 

NTMBK

Lifer
Nov 14, 2011
10,448
5,831
136
Well, right, but an A8-7600 can go as low as 45W, and that includes CPU and GPU on one die . . . mind you, that is one-third as many GCN cores as the PS4 chip has, at a lower clockspeed. They totally could have done 3x A8-7600 in a 130W power envelope, and those chips run pretty cool too. Whether or not they would have wanted to deal with a 3P monstrosity is an entirely different matter.

That 45W figure only includes the APU. The number for the PS4 includes the 256-bit GDDR5 interface, optical drive, hard drive, network controller, and power supply inefficiencies on top of all that.
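
A rough way to picture why the 45W APU TDP and the ~140W wall figure aren't directly comparable - every number below other than the 140W and 45W quoted in this thread is an assumed placeholder, not a measurement:

/* Sketch: peel the assumed non-APU costs off the measured wall power. */
#include <stdio.h>

int main(void)
{
    double wall_watts      = 140.0;  /* whole-console draw quoted above */
    double psu_efficiency  = 0.85;   /* assumed */
    double drives_net_misc = 15.0;   /* assumed: optical + HDD + network */
    double gddr5_and_board = 25.0;   /* assumed: 256-bit GDDR5 + board */

    double dc_budget  = wall_watts * psu_efficiency;
    double apu_budget = dc_budget - drives_net_misc - gddr5_and_board;

    printf("DC budget after PSU losses: %.0f W\n", dc_budget);
    printf("left for the APU itself:    %.0f W\n", apu_budget);
    return 0;
}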
 

jhu

Lifer
Oct 10, 1999
11,918
9
81
There's discussion over on slashdot about this. I agree with this guy:

I think what Ubisoft is trying to say here is that its programmers are shit, and so is its game engine (AnvilNext). Other publishers manage to do okay, but poor old Ubisoft are stuck with this turd and can't just switch to Unreal or something more competent, so sorry guys, you only get 900p on consoles.

I mean, obviously if you are CPU bound the solution is to reduce the load on the GPU by making the game render at 900p. The problem is the AI for the large number of characters in the game, so clearly reducing the pixel count will help with that.
 

mindbomb

Senior member
May 30, 2013
363
0
0
Well, he specifically said the AI was responsible. I think the trend in games is to be multiplayer though, and to have other players control characters rather than the CPU. So the limited CPU power still doesn't seem like that big of a deal.
 

DrMrLordX

Lifer
Apr 27, 2000
22,931
13,011
136
That 45W figure only includes the APU. The number for the PS4 includes the 256-bit GDDR5 interface, optical drive, hard drive, network controller, and power supply inefficiencies on top of all that.

Oh of course, there's no doubt about that. There's the motherboard and the DSPs that I've heard are in there as well. They probably would have had to undervolt/underclock or just disable turbo on those A8s to really get them to fit within a smaller overall TDP limit vs. the PS3/Xbox 360. Bet they coulda done it, though. Spreading out heat dissipation between three low-temperature sockets should be easier than dealing with one hot monolithic chip. Considering how long it took AMD to bring Kaveri to market (especially the 7600), availability would have been an issue.
 

NTMBK

Lifer
Nov 14, 2011
10,448
5,831
136
Bet they coulda done it, though. Spreading out heat dissipation between three low-temperature sockets should be easier than dealing with one hot monolithic chip.

Three heat sources means three heatsinks, which means more complex airflow design and case layout. The PS4 has a rather elegant cooling design in my opinion: a single big fat blower pushing air straight out the rear vents. Not to mention you would have all sorts of NUMA considerations when trying to get maximum performance out of the design - you want your data in the memory attached to the processor the job is executing on. And splitting rendering over multiple GPUs is not going to be fun! And then you have the energy costs of the high-bandwidth interconnects, and the poor latencies... I wouldn't enjoy developing for it, that's for sure.
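
For anyone curious what that NUMA point looks like in code, here is a minimal sketch using libnuma (link with -lnuma); the node choice and working-set size are made up, and a real engine would also pin the worker thread to the same node:

/* Sketch: allocate a job's working set on the node it will execute on. */
#include <numa.h>
#include <stdio.h>

int main(void)
{
    if (numa_available() < 0) {
        fprintf(stderr, "no NUMA support on this system\n");
        return 1;
    }

    int    node  = 1;                 /* hypothetical: socket running this job */
    size_t bytes = 64 * 1024 * 1024;  /* 64 MB working set */

    void *buf = numa_alloc_onnode(bytes, node);  /* memory local to that socket */
    if (!buf) {
        fprintf(stderr, "numa_alloc_onnode failed\n");
        return 1;
    }

    /* ... fill and process buf from a thread bound to that node ... */

    numa_free(buf, bytes);
    return 0;
}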
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
There's discussion over on slashdot about this. I agree with this guy:

I think what Ubisoft is trying to say here is that its programmers are shit, and so is its game engine (AnvilNext). Other publishers manage to do okay, but poor old Ubisoft are stuck with this turd and can't just switch to Unreal or something more competent, so sorry guys, you only get 900p on consoles.

I mean, obviously if you are CPU bound the solution is to reduce the load on the GPU by making the game render at 900p. The problem is the AI for the large number of characters in the game, so clearly reducing the pixel count will help with that.

I'm sorry, but that doesn't make any sense. If you're CPU bound, then reducing GPU work won't do diddly-squat unless you're also reducing draw calls and commands from the CPU to the GPU - and AFAIK a resolution decrease on its own won't do a thing to improve CPU performance, unless the LOD becomes more aggressive to cut draw calls and the lower resolution is used to hide those faults.

My only other guess here is that he's specifically referring to the visual presentation of a large AI crowd, where a reduced resolution means that Ubisoft can reduce the number of people in the crowd because you wouldn't be able to see them as well in 900p versus 1080p. LOD tricks would be easier to hide.

Next question is: is GPGPU being used for the crowds at all?
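
A toy sketch of that point: the CPU-side cost of a frame tracks the number of objects and draw calls submitted, not the pixel count. submit_draw_call() below is a made-up stand-in for the real D3D/GL/GNM submission path, so dropping from 1080p to 900p changes nothing on the CPU side unless the crowd or LOD is cut too.

/* Sketch: per-frame CPU work is per-object; resolution only moves GPU work. */
#include <stdio.h>

struct object { int id; };

static void submit_draw_call(const struct object *obj)
{
    (void)obj;  /* imagine state setup + command-buffer work here */
}

static void render_frame(const struct object *objs, int count,
                         int width, int height)
{
    /* CPU cost: one submission per visible object, whatever the resolution */
    for (int i = 0; i < count; i++)
        submit_draw_call(&objs[i]);

    /* GPU cost is what the resolution actually changes */
    printf("%d draw calls, roughly %d pixels to shade\n",
           count, width * height);
}

int main(void)
{
    static struct object crowd[5000];
    render_frame(crowd, 5000, 1600, 900);   /* 900p: still 5000 draw calls */
    render_frame(crowd, 5000, 1920, 1080);  /* 1080p: still 5000 draw calls */
    return 0;
}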
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
It's surely difficult to understand. What about this explanation:
"It's probably not the AI calculations related to gameplay, but Ubisoft's AI calculations related to their DRM that get highest priority in their games ..."

- from slashdot

:)
 

DrMrLordX

Lifer
Apr 27, 2000
22,931
13,011
136
Three heat sources means three heatsinks, which means more complex airflow design and case layout.

Yeah, but it also means (probably) lower delta T per source of flux. That makes things a little easier (well, sort of).

Not to mention you would have all sorts of NUMA considerations when trying to get maximum performance out of the design - you want your data in the memory attached to the processor the job is executing on. And splitting rendering over multiple GPUs is not going to be fun! And then you have the energy costs of the high-bandwidth interconnects, and the poor latencies... I wouldn't enjoy developing for it, that's for sure.

Oh I know, multiproc systems aren't so simple a thing, though it isn't much of a problem for Opteron systems and the like. They'd just have to adapt Kaveri to a cut-down version of G34. The HT spec allows for non-CPU devices to have direct access to HT links as well. I think AMD would have been able to figure something out. Would it have been cheap enough for Sony to want to pay for it? Probably not.

It woulda been sweet, though.