Assassin's Creed Unity Dev speaks: Consoles locked at 900/30, CPU's holding game back

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
Pretty interesting and informative article here. Interested to see what you guys make of it...

http://www.videogamer.com/ps4/assas...y_is_900p_30fps_on_both_ps4_and_xbox_one.html

Assassin's Creed Unity will run at 900p/30fps on both PlayStation 4 and Xbox One, Ubisoft has confirmed, with the publisher opting to aim for platform parity to avoid discussion over any differences in performance.

"We decided to lock them at the same specs to avoid all the debates and stuff," senior producer Vincent Pontbriand told VideoGamer.com while explaining that it's the consoles' CPUs – not the GPU – that prevents Ubisoft Montreal from improving the game's performance.

"Technically we're CPU-bound," he said. "The GPUs are really powerful, obviously the graphics look pretty good, but it's the CPU [that] has to process the AI, the number of NPCs we have on screen, all these systems running in parallel.

"We were quickly bottlenecked by that and it was a bit frustrating, because we thought that this was going to be a tenfold improvement over everything AI-wise, and we realised it was going to be pretty hard. It's not the number of polygons that affect the framerate. We could be running at 100fps if it was just graphics, but because of AI, we're still limited to 30 frames per second."


Last year's Assassin's Creed 4: Black Flag also shipped at 900p/30fps on both PS4 and Xbox One. A post-release patch, however, bumped the PS4 version to 1080p. Ubisoft has given no indication that it has plans to do the same for Unity.
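
To put some rough numbers on the "CPU-bound" claim, here's a back-of-the-envelope sketch; the per-NPC cost below is a made-up illustrative figure, not anything Ubisoft has published:

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    // GPU side: Pontbriand says graphics alone could run at 100 fps (~10 ms).
    const double gpu_ms = 1000.0 / 100.0;

    // CPU side: AI for a huge crowd. The per-NPC cost is a made-up
    // illustrative number, not a figure Ubisoft has published.
    const double ai_ms_per_npc = 0.003;
    const int npc_count = 10000;          // the touted "10,000 characters"
    const double cpu_ms = ai_ms_per_npc * npc_count;

    // CPU and GPU work overlap, so the frame takes as long as the slower side.
    const double frame_ms = std::max(cpu_ms, gpu_ms);
    std::printf("GPU %.1f ms, CPU %.1f ms -> %.0f fps\n",
                gpu_ms, cpu_ms, 1000.0 / frame_ms); // ~33 fps: CPU-bound
}
```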
 

Lil Frier

Platinum Member
Oct 3, 2013
2,720
21
81
Interesting to hear. Isn't this something XBLC could help with, if the game was made by Microsoft--offloading that CPU work to the cloud? I thought that was the idea behind using XBLC power for games like Titanfall, which makes me wonder if being multi-platform hurts Ubisoft here (not to suggest they should make a deal with Microsoft, just a theory).
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
I do not understand why people care about a magical and pointless resolution or frame rate number in a game like this. 1080/60 is just a buzzword. If they never at any point in development mentioned the resolution or frame rate, 99% of people would never know or care. Mention it's less than 1080/60 in an article and people lose their minds.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I do not understand why people care about a magical and pointless resolution or frame rate number in a game like this. 1080/60 is just a buzzword. If they never at any point in development mentioned the resolution or frame rate, 99% of people would never know or care. Mention it's less than 1080/60 in an article and people lose their minds.

Yeah, I'm much more interested in the crowds that support 10,000 characters or whatever.
 

Rakehellion

Lifer
Jan 15, 2013
12,181
35
91
Process AI on the GPU. Problem solved.

opting to aim for platform parity to avoid discussion over any differences in performance.

Avoid discussion? This is possibly the worst reason I've ever heard.
 

alcoholbob

Diamond Member
May 24, 2005
6,380
449
126
These CPUs have 6 cores dedicated for gaming. Ubisoft could try some more multi-threading instead of loading 2 cores to the max.
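
For illustration, a minimal sketch of that idea: fan the per-NPC AI updates out across the available cores instead of leaving them on one or two threads. None of this is Ubisoft's actual engine code, just the general pattern:

```cpp
#include <algorithm>
#include <thread>
#include <vector>

struct Npc { float x = 0; /* position, goals, etc. */ };

// Hypothetical per-NPC update; assumes it touches no shared mutable state.
void updateNpc(Npc& npc, float dt) {
    npc.x += dt; // stand-in for pathfinding/decision logic
}

// Split the crowd into contiguous chunks, one worker thread per chunk.
void updateAll(std::vector<Npc>& npcs, float dt, unsigned workers) {
    std::vector<std::thread> pool;
    const size_t chunk = (npcs.size() + workers - 1) / workers;
    for (unsigned w = 0; w < workers; ++w) {
        const size_t begin = w * chunk;
        const size_t end = std::min(begin + chunk, npcs.size());
        if (begin >= end) break;
        pool.emplace_back([&npcs, begin, end, dt] {
            for (size_t i = begin; i < end; ++i)
                updateNpc(npcs[i], dt);
        });
    }
    for (auto& t : pool) t.join();
}

int main() {
    std::vector<Npc> crowd(10000);
    updateAll(crowd, 1.0f / 30.0f, 6); // six cores available to the game
}
```

The catch, as posts further down note, is that real AI reads and writes shared world state, which is where the complexity comes from.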
 

mmntech

Lifer
Sep 20, 2007
17,501
12
0
Ubisoft Montreal doesn't know how to program for x86 hardware?
YouDontSay.jpg


Somehow I'd think they'd say the same thing even if you threw a Haswell i7 at them. Granted, I knew it wouldn't be long before someone started complaining about the CPUs in the PS4 and XB1.

Not that it particularly matters, as all other console AC games have run at 30fps, and it's never been an issue. Though I would expect true 1080p to be the de facto standard by now.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
That's also why there are uncompressed audio and textures. The CPU simply can't handle decompressing them on the fly.

It also explains the lack of physics and AI in recent console games :/
 

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
Indeed, physics and AI seem to have taken a step back even as developers cram every resource into eye candy to appear "next gen".
 

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
And people still don't understand how hard it is to multithread the CPU-intensive elements of a game...
 

Rakehellion

Lifer
Jan 15, 2013
12,181
35
91
That's also why there are uncompressed audio and textures. The CPU simply can't handle decompressing them on the fly.

It also explains the lack of physics and AI in recent console games :/

Compressed textures are handled by the GPU's texture unit at zero performance cost.

AI and audio could also be done on the GPU and I expect this will be the case more so in the future.
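
For context on the texture side: block-compressed formats like DXT/BC stay compressed in VRAM and the texture units decode them during sampling, so the CPU never touches them. A minimal OpenGL sketch, assuming a GL context already exists and the input is pre-compressed DXT1 block data:

```cpp
#include <GL/gl.h>
#include <vector>

// From glext.h; defined here in case the system header lacks it.
#ifndef GL_COMPRESSED_RGBA_S3TC_DXT1_EXT
#define GL_COMPRESSED_RGBA_S3TC_DXT1_EXT 0x83F1
#endif

// DXT1 packs each 4x4 pixel block into 8 bytes, so a 256x256 texture
// is (256/4)*(256/4)*8 = 32 KiB in VRAM instead of 256 KiB as RGBA8.
GLuint uploadDxt1(const std::vector<unsigned char>& blocks, int w, int h) {
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    // The blocks are uploaded as-is: no CPU-side decode, and the GPU's
    // texture units decompress on the fly while sampling.
    glCompressedTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_S3TC_DXT1_EXT,
                           w, h, 0, (GLsizei)blocks.size(), blocks.data());
    return tex;
}
```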
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Compressed textures are handled by the GPU's texture unit at zero performance cost.

AI and audio could also be done on the GPU and I expect this will be the case more so in the future.

I am talking about the loading from storage.

And any compressed audio is still handled by the CPU.

Hence why we got 10-15GB games taking up 40-50GB.
 

Rakehellion

Lifer
Jan 15, 2013
12,181
35
91
And people still don't understand how hard it is to multithread the CPU-intensive elements of a game...

Depends on what the elements are. You could write them from the ground up with multithreading in mind instead of recycling your old Windows code. AI is something that typically multithreads extremely well.
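
A sketch of why: if each NPC reads only last frame's (immutable) world state and writes only its own slot, workers never contend and you don't even need locks. Hypothetical code, just showing the double-buffering pattern:

```cpp
#include <functional>
#include <thread>
#include <utility>
#include <vector>

struct NpcState { float x = 0; };

// Workers read only the immutable previous frame and write only their own
// slots in the next frame, so no locks are needed even when NPCs look at
// each other.
void step(const std::vector<NpcState>& prev, std::vector<NpcState>& next,
          size_t begin, size_t end) {
    for (size_t i = begin; i < end; ++i) {
        float crowdPush = prev[(i + 1) % prev.size()].x * 0.01f; // reads a neighbour
        next[i].x = prev[i].x + crowdPush + 0.1f;
    }
}

int main() {
    std::vector<NpcState> curr(10000), next(curr.size());
    const unsigned workers = 6;
    const size_t chunk = curr.size() / workers;

    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w)
        pool.emplace_back(step, std::cref(curr), std::ref(next), w * chunk,
                          (w + 1 == workers) ? curr.size() : (w + 1) * chunk);
    for (auto& t : pool) t.join();
    std::swap(curr, next); // the frame just written becomes next frame's input
}
```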
 

Rakehellion

Lifer
Jan 15, 2013
12,181
35
91
I am talking about the loading from storage.

And any compressed audio is still handled by the CPU.

Loading from storage is limited by bus bandwidth, not CPU speed. And I said audio could be done on the GPU and AI (which is holding Ubisoft back) should be done on the GPU.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Loading from storage is limited by bus bandwidth, not CPU speed. And I said audio could be done on the GPU and AI (which is holding Ubisoft back) should be done on the GPU.

Decompression of a file pack is CPU-intensive, especially on a very slow CPU. Same for audio; it doesn't matter what GPU or DSP you've got. It's mainly a CPU task.
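
That part is easy to test in miniature with zlib's one-shot API. A sketch (absolute timings obviously depend on the CPU; the point is just that inflating a pack file is pure CPU work):

```cpp
#include <chrono>
#include <cstdio>
#include <vector>
#include <zlib.h> // link with -lz

int main() {
    // 64 MiB of mildly repetitive data standing in for a game file pack.
    std::vector<unsigned char> raw(64 * 1024 * 1024);
    for (size_t i = 0; i < raw.size(); ++i) raw[i] = (unsigned char)(i % 251);

    uLongf packedLen = compressBound(raw.size());
    std::vector<unsigned char> packed(packedLen);
    compress(packed.data(), &packedLen, raw.data(), raw.size());

    std::vector<unsigned char> out(raw.size());
    uLongf outLen = out.size();
    auto t0 = std::chrono::steady_clock::now();
    uncompress(out.data(), &outLen, packed.data(), packedLen); // pure CPU work
    auto t1 = std::chrono::steady_clock::now();

    std::printf("inflated %zu MiB in %.0f ms of CPU time\n", out.size() >> 20,
                std::chrono::duration<double, std::milli>(t1 - t0).count());
}
```

Shipping assets uncompressed trades that CPU time for disk space, which is the 10-15GB vs. 40-50GB trade-off above.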
 

Rakehellion

Lifer
Jan 15, 2013
12,181
35
91
Decompression of a file pack is CPU-intensive, especially on a very slow CPU. Same for audio; it doesn't matter what GPU or DSP you've got. It's mainly a CPU task.

You don't compress files that are already compressed. And that still doesn't apply to AI, which is what Ubisoft is saying limits them.
 

Socio

Golden Member
May 19, 2002
1,732
2
81
These CPUs have 6 cores dedicated for gaming. Ubisoft could try some more multi-threading instead of loading 2 cores to the max.

Agreed. Multi-core is the norm now regardless of whether it's in a console or a PC; heck, even the iPhone uses a dual-core processor, and multi-core is clearly where everything is headed for the foreseeable future.

If gaming companies want to stay on the cutting edge, they must invest in developing for, and getting the most out of, multi-core systems.
 

mmntech

Lifer
Sep 20, 2007
17,501
12
0
Agreed. Multi-core is the norm now regardless of whether it's in a console or a PC; heck, even the iPhone uses a dual-core processor, and multi-core is clearly where everything is headed for the foreseeable future.

If gaming companies want to stay on the cutting edge, they must invest in developing for, and getting the most out of, multi-core systems.

They've only just started realizing that most gaming PCs are 64-bit now.

The problem is that despite all its cores, Jaguar was mainly intended for netbooks, micro-servers, and low-power embedded devices. It's basically AMD's answer to the Bay Trail Atom. It was a curious choice for MS and Sony, though I would assume they knew what they were getting into. First party games seem to handle things just fine.

I guess it was more a matter of 1) looking for something that would run cool, 2) could be combined with a GPU on die, and 3) would be easy and quick to mass-produce.

Nvidia has yet to pair any CPU with a performance GPU on die, and Intel's Iris certainly wouldn't be fast enough.
 
Last edited:

smackababy

Lifer
Oct 30, 2008
27,024
79
86
This is pretty clear ITT: people think throwing more threads at something automatically makes it perform better.

With the addition of multithreading you get increased complexity, and that complexity increases development time and bug count.

And you still run into the problem of threads finishing before one another and simply sitting idle until everything else catches up. That's fine if your main logic thread is the one lagging behind, but if it's a different thread, you're not going to enjoy playing that game.

At what point do you draw the line between the added complexity and bug risk of multithreading and the performance "loss" of keeping things on the same thread? That's a question whose answer is so specific it can't be settled in this thread.
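
To make the idle-thread problem concrete, here's a contrived sketch: two frame tasks of unequal cost joined at a sync point, where the finished core just waits:

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

using namespace std::chrono;

// Stand-in for a frame task of a given cost.
void work(milliseconds cost) { std::this_thread::sleep_for(cost); }

int main() {
    auto start = steady_clock::now();

    // Uneven split: render prep finishes early, game logic lags.
    std::thread renderPrep(work, milliseconds(8));
    std::thread gameLogic(work, milliseconds(25));

    renderPrep.join(); // done at ~8 ms, then that core just sits idle...
    gameLogic.join();  // ...until the slow thread catches up at ~25 ms

    auto ms = duration_cast<milliseconds>(steady_clock::now() - start).count();
    std::printf("frame took ~%lld ms; one core idled ~17 ms of it\n",
                (long long)ms);
}
```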

And, thankfully, they went the 900p@30fps route. It would just look awful at 1080p@60fps. This we know is true.
 

Lil Frier

Platinum Member
Oct 3, 2013
2,720
21
81
They've only just started realizing that most gaming PCs are 64-bit now.

The problem is that despite all its cores, Jaguar was mainly intended for netbooks, micro-servers, and low-power embedded devices. It's basically AMD's answer to the Bay Trail Atom. It was a curious choice for MS and Sony, though I would assume they knew what they were getting into. First party games seem to handle things just fine.

I guess it was more a matter of 1) looking for something that would run cool, 2) could be combined with a GPU on die, and 3) would be easy and quick to mass-produce.

Nvidia has yet to pair any CPU with a performance GPU on die, and Intel's Iris certainly wouldn't be fast enough.

I haven't seen much (for lack of looking) on how the Jaguar performance numbers actually look. Two questions on that front:

1. How does Jaguar compare to, say, Kaveri and Richland (on a CPU and GPU level)?
2. Is there an MSRP out there on the Jaguar stuff, and if so, what would the price difference have looked like (I know console makers don't pay MSRP, but it could give a general cost indication) if they had aimed for one of the "big boy" APUs?
 

jpiniero

Lifer
Oct 1, 2010
16,613
7,098
136
It's still 6 very slow cores. A dual-core Celeron is 2-3 times faster, for example.

Not with a power target of 20 W, which is roughly what the 8C Kabini draws. You'd give up a lot of MT performance for a small increase in ST. You also have to remember that there would be no turbo enabled, because the devs want the clock speed to be consistent.

Even now, Avoton would be the most realistic Intel candidate for a theoretical console CPU. Denver or even a straight A57 would also be possibilities, but ARM wasn't a real option without 64-bit.

1. How does Jaguar compare to, say, Kaveri and Richland (on a CPU and GPU level)?

IIRC, the IPC is pretty similar.