[Techspot]- AC:Unity "too much" for console CPUs


mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
So by blowing the TDP budget, or waiting an extra 3 years for 14nm to be viable. Sounds like a great solution.

Not really. Just look at what mobile Maxwell can do with an Intel CPU.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I'm saying that Intel and Nvidia are entering into bad deals now while trying to break into proven markets, which are much bigger than the consoles, while AMD is entering into a bad deal today that won't generate further benefits in the future.

Beyond the console chips, what other players do you think might be attracted to similar deals with AMD? That's right: none. Almost nobody needs bleeding-edge graphics, and custom solutions can't compete with off-the-shelf solutions on cost for anyone who needs some graphics capacity. The console deals are really two of a kind and unlikely to generate further benefits for AMD, except making it a top contender for the next generation of console chips.

Last year AMD was floating semi-custom deals in the pipeline of around 500MM; now they are floating 100MM deals, and nothing has materialized so far. It is not a big market.


And AMD is entering a decent deal now while trying to win mind share in an existing and proven market. It goes beyond just the consoles. How many kids are going to unwrap their shiny new console on Christmas morning and then see AMD's name every time they play a game for the next five years? That could translate into sales with future PC gamers. I know every armchair CEO on the internet would have done it differently if building a console, but I'd like to think there are smart people at Sony, MS, and Nintendo.

No matter how you spin it, what you're more or less saying is that it is good for Intel and Nvidia to lose millions of dollars now because they will hopefully make up for it with future sales. But for AMD to make millions of dollars now while better positioning themselves for hopefully even more future sales is somehow a bad thing.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
How many kids are going to unwrap their shiny new console on Christmas morning and then see AMD's name every time they play a game for the next five years?

I am sorry, where is that AMD name on consoles? My previous Wii, and now my Wii U, doesn't show any such thing. Does it show on the PS4 or the Xbox One?

Being an embedded supplier usually means being completely anonymous.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Not really. Just look at what mobile Maxwell can do with an Intel CPU.

1. Maxwell came to market more than 12 months later than the consoles.
2. How much does Maxwell cost?
3. How much does an Intel CPU cost?
4. How high is the TDP for both of them?

By the same token, I could say that AMD could have a better solution than even Intel + Maxwell next year with 14nm products. ;)
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
And AMD is entering a decent deal now while trying to win mind share in an existing and proven market. It goes beyond just the consoles. How many kids are going to unwrap their shiny new console on Christmas morning and then see AMD's name every time they play a game for the next five years? That could translate into sales with future PC gamers.

Not many, I would say. Nvidia and Intel are pretty much premium, established brands in the PC market. In fact, the entire point of a gaming PC is higher-quality gaming than on a console. I doubt that AMD will be able to get anything out of it.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I am sorry, where is that AMD name on consoles? My previous Wii, and now my Wii U, doesn't show any such thing. Does it show on the PS4 or the Xbox One?

Being an embedded supplier usually means being completely anonymous.



Maybe I'm wrong; I only played a PS4 for 10 minutes at a GameStop and thought I saw it. I know my other (but older) console has an ATI sticker, and I may have made the mistake of assuming that was the case with the new consoles as well. Obviously the information is out there, and I'm sure many gamers will read plenty about their consoles on Wikipedia or wherever.
 

NTMBK

Lifer
Nov 14, 2011
10,448
5,831
136
I am sorry, where is that AMD name on consoles? My previous Wii, and now my Wii U, doesn't show any such thing. Does it show on the PS4 or the Xbox One?

Actually, my Wii came with an ATi logo on the side, and so did my GameCube. Don't know whether the next-gen consoles do, though; I certainly haven't spotted one on my Wii U.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I don't believe there is an AMD logo on the PS4 or XBone, but a lot of people already know the SoC is designed by AMD.

Edit: Even the SoC die doesn't have a laser-cut AMD logo on it.

[images: console SoC die shots]
 

NTMBK

Lifer
Nov 14, 2011
10,448
5,831
136
Also, when comparing CPU performance, don't forget that the consoles have a lot of dedicated coprocessors which will get used. There are audio coprocessors, DMA engines, and compression/decompression accelerators; these are all going to take load off the CPU, whereas in a PC game all of that would run in software.
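As a minimal sketch of that last point (illustrative only: the asset is fake data, and zlib stands in for whatever codec a real engine uses), here is the kind of pure-CPU decompression work that a console's fixed-function unit would absorb:

Code:
#include <zlib.h>
#include <chrono>
#include <cstdio>
#include <vector>

int main() {
    // Fake "asset": 16 MiB of mildly compressible bytes.
    std::vector<unsigned char> asset(16u * 1024 * 1024);
    for (size_t i = 0; i < asset.size(); ++i)
        asset[i] = static_cast<unsigned char>(i % 64);

    // Compress once up front; on a PC this would be the on-disk format.
    uLongf compressedLen = compressBound(static_cast<uLong>(asset.size()));
    std::vector<unsigned char> compressed(compressedLen);
    compress(compressed.data(), &compressedLen,
             asset.data(), static_cast<uLong>(asset.size()));

    // Time the decompression: pure CPU load in a PC game, but work that
    // dedicated decompression hardware on a console would handle instead.
    std::vector<unsigned char> out(asset.size());
    uLongf outLen = static_cast<uLongf>(out.size());
    auto t0 = std::chrono::steady_clock::now();
    uncompress(out.data(), &outLen, compressed.data(), compressedLen);
    auto t1 = std::chrono::steady_clock::now();

    auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0);
    std::printf("decompressed %lu bytes in %lld ms of CPU time\n",
                static_cast<unsigned long>(outLen),
                static_cast<long long>(ms.count()));
    return 0;
}

On a PC this loop competes with game code for cores; on the consoles the equivalent work can be claimed by dedicated silicon.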
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
1. Maxwell came to market more than 12 months later than the consoles.
2. How much does Maxwell cost?
3. How much does an Intel CPU cost?
4. How high is the TDP for both of them?
5. Why would Microsoft want to make another console deal with nV if AMD has what they need?
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I don't believe there is an AMD logo on the PS4 or XBone, but a lot of people already know the SoC is designed by AMD.

Edit: Even the SoC die doesn't have a laser-cut AMD logo on it.


Yup, I was probably wrong about that. But I have little doubt that many people who use the consoles will be aware that AMD is providing the CPU and graphics, and that it could pay dividends for them in the future.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
I don't know that all the bickering about the CPU is necessary when it is obvious that Ubisoft is incompetent. Why would you design a game that cannot run well on the current game consoles in the first place? It would be like saying that current-gen consoles are bad because they can't do real-time ray tracing. This is a simple ploy to distract us from the parity comment and shift blame. Clever marketing.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
A discrete GPU and CPU means going back to a split memory model with higher costs for communication between GPU and CPU. An integrated design offers a lot of theoretical performance benefits, which console games will no doubt exploit in the years to come.

The real alternative would have been an APU from either Nvidia or Intel, but 64-bit ARM cores weren't ready in time for this generation, which knocked Nvidia out of the running, and Intel GPU tech isn't that competitive compared to Nvidia or AMD.

What about something like a G1820, but a quad-core custom version with an extra 1MB of cache, matched to a custom AMD dGPU? I wonder if MS actually approached Intel and tried to work out a decent chip option. So far those theoretical benefits are just that: theoretical. The CPU just doesn't have enough puff.
 
Aug 11, 2008
10,451
642
126
Funny how every comment by the developers about how great the consoles were going to be, how they would be better than a top-end PC, etc., was greeted with total acceptance by a certain crowd, but now that a developer criticizes the hardware, it is totally dismissed as a lie or incompetence. And it may be true that Ubisoft is making excuses for their poor programming. It just seems a bit over the top, the outrage flowing from this thread just because somebody said something critical about a console chip.
 

erunion

Senior member
Jan 20, 2013
765
0
0
Using a more powerful core doesn't exclude using many cores on a given chip. I think that the real problem, as you stated in the first place, is that the console chips of the current generation are designed to be very low-cost solutions. Time, Cost, Scope: pick two and send the third to the sacrificial chamber, and this time it was Scope.

Despite how similar the Xbone and PS4 ended up being, it's important to remember that they had very different development processes.

The Xbone was years in the making and had a grand scope to expand the role of consoles. MS wanted many small cores for multitasking and dedicated processes; early docs showed them considering both x86 and ARM. They also wanted a single-chip solution, like the late-model 360s were already using. AMD was able to provide what MS was searching for.

Sony had no concrete plans for a PS3 replacement because of their financial troubles; the future of the PlayStation division wasn't certain. Sony decided late in the game to launch a new console alongside the Xbone. AMD had a cheap solution ready to go, which was exactly what Sony needed, as PlayStation couldn't survive another generation of being a year late and $599.

The PS4 is an example of marketing success. Features that MS thought were obvious in the post-iOS, post-Steam world (like digital downloads and cloud-based gamer IDs) received pushback from gamers. MS stuck to their plan at first, while Sony learned from MS's mistakes and made sure to say what gamers wanted to hear. Sony also saw the writing on the wall with Kinect and unbundled their Move camera from the console.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I don't know that all the bickering about the CPU is necessary when it is obvious that Ubisoft is incompetent. Why would you design a game that cannot run well on the current game consoles in the first place? It would be like saying that current-gen consoles are bad because they can't do real-time ray tracing. This is a simple ploy to distract us from the parity comment and shift blame. Clever marketing.

Thing is, the game had been in development since long before the PS4 and Xbox One became available. The AC games are handled by multiple development teams working in cycles.

Whatever AC game comes out next year has probably already been in development for a couple of years.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
What about something like a G1820, but a quad-core custom version with an extra 1MB of cache, matched to a custom AMD dGPU? I wonder if MS actually approached Intel and tried to work out a decent chip option. So far those theoretical benefits are just that: theoretical. The CPU just doesn't have enough puff.

A single Haswell core at 22nm is the same size as 4x Jaguar cores at 28nm. So the overall die size of the SoC would be even bigger, translating to higher cost. Don't think MS and Sony would like that ;)

[image: 2013 CPU core size comparison]
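To put rough numbers on that ratio (the core sizes below are approximate figures from published die analyses, not from this thread):

$$8 \times 3.1\,\mathrm{mm^2} \approx 25\,\mathrm{mm^2}\ \text{(8 Jaguar cores @ 28nm)} \quad \text{vs.} \quad 4 \times 14.5\,\mathrm{mm^2} \approx 58\,\mathrm{mm^2}\ \text{(4 Haswell cores @ 22nm)}$$

So even halving the core count while switching to Haswell would roughly double the die area spent on CPU cores.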
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
Thing is, the game had been in development since long before the PS4 and Xbox One became available. The AC games are handled by multiple development teams working in cycles.

Whatever AC game comes out next year has probably already been in development for a couple of years.

OK, I'll admit that the incompetent comment was unfair, and I acknowledge that making games takes a lot of time and effort, especially for a game of this scope. I will, however, stick to the point that they are using the CPU as a scapegoat, and boy does the internet like its goats.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
OK, I'll admit that the incompetent comment was unfair, and I acknowledge that making games takes a lot of time and effort, especially for a game of this scope. I will, however, stick to the point that they are using the CPU as a scapegoat, and boy does the internet like its goats.

Did you ever think that they could actually have exceeded the CPU capacities of both console chips?
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Did you ever think that they could actually have exceeded the CPU capacities of both console chips?

I for one definitely think that's possible. This game will have thousands of AI entities on screen. I can't think of any console game with that many AI entities on screen at the same time; usually only strategy games on PCs have that many.

I understand that people are skeptical towards Ubisoft given their history, but AC Unity is an extremely ambitious title. It will probably be the most technically advanced title released this year, and by quite a distance as well.
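For scale, a back-of-envelope budget (the 5,000-agent count is a hypothetical for illustration, not Ubisoft's figure): at 30fps, with six Jaguar cores available to the game,

$$\frac{33.3\,\mathrm{ms} \times 6\ \text{cores}}{5000\ \text{agents}} \approx 40\,\mu\mathrm{s}\ \text{of core time per agent per frame},$$

and that slice has to cover animation and pathfinding as well as decision-making.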
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Funny how every comment by the developers about how great the consoles were going to be, how they would be better than a top-end PC, etc., was greeted with total acceptance by a certain crowd, but now that a developer criticizes the hardware, it is totally dismissed as a lie or incompetence. And it may be true that Ubisoft is making excuses for their poor programming. It just seems a bit over the top, the outrage flowing from this thread just because somebody said something critical about a console chip.

Because anything would be a welcome relief from the PS3 or Xbox 360?

A single Haswell core at 22nm is the same size as 4x Jaguar cores at 28nm. So the overall die size of the SoC would be even bigger, translating to higher cost. Don't think MS and Sony would like that ;)

[image: 2013 CPU core size comparison]

And 2 Haswell cores with Hyper-Threading @ 3.2GHz give the exact same performance as 8 Jaguar cores @ 1.6GHz, likely at similar power consumption (judging by mobile chips). Don't get me wrong, that would not have worked for an APU.
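The rough arithmetic behind that claim, treating cores × clock as a throughput proxy (a sketch that folds Hyper-Threading into per-core throughput and ignores memory effects):

$$\frac{8\ \text{cores} \times 1.6\,\mathrm{GHz}}{2\ \text{cores} \times 3.2\,\mathrm{GHz}} = 2,$$

i.e. each Haswell core plus its HT thread only has to deliver about twice Jaguar's per-clock throughput for the two setups to tie, which is roughly where public IPC comparisons put them.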
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
Did you ever think that they could actually have exceeded the CPU capacities of both console chips?

Of course it is possible that they could use all the CPU resources, and no one is doubting that. The argument has two forks: the parity comment and the overdesign. The parity comment tells me that they are making excuses to shift blame and use the CPU as a scapegoat, while the overdesign tells me that they haven't done their homework to optimize and produce something that the consoles are capable of. As for the bickering about the CPUs, it is quite possible that these same issues would occur even on a more powerful uarch.

I still think this is just damage mitigation and clever marketing rather than a true technical issue.
 

Piroko

Senior member
Jan 10, 2013
905
79
91
Sadly, not every job is as parallel (and almost free of interdependencies) as rendering or ray tracing. Those have always been super parallel.

Running real code, even with nominally parallelizable algorithms, is going to incur overhead, and sadly the less powerful the cores, the bigger the overhead fraction becomes (because things like getting data into caches, synchronization, and extra copies have fixed costs).

That post got me thinking for a moment, trying to name a prime example of work that can't be parallelized in the context of a modern game.

I mean, I'm quite sure that Creative has demonstrated in the past how embarrassingly parallel EAX computing is. Draw calls? Ditto. Object calls? AI? Proper AI, where individual units do situation assessment based on their own data with minimal input from an AI director? Player input? Yes, finally found one.

I really don't think a modern game should be limited by "only" 6 to 8 threads. There's a crapton of stuff happening in every frame, and most of it only has dependencies because historically that was the way to do it. Not every problem can be parallelized, sure, but oftentimes it can be rephrased as a problem of a parallel nature, with little to no negative effect on the underlying game concept.
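A minimal sketch of that rephrasing (the Agent/World names are illustrative, not from any engine): per-agent situation assessment that reads a shared read-only world snapshot and writes only its own agent needs no locks at all, just one join per frame.

Code:
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

// Illustrative types: a read-only world snapshot and an agent whose
// assessment writes only to its own state.
struct World { /* read-only data shared by all agents this frame */ };

struct Agent {
    float x = 0.0f, y = 0.0f;
    void assess(const World&) {
        // situation assessment: reads the world, writes only *this
        x += 0.1f;
    }
};

// Split the agent list into contiguous chunks, one per hardware thread.
// No cross-agent writes means no locks; the joins are the only sync point.
void updateAgents(std::vector<Agent>& agents, const World& world) {
    unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::size_t chunk = (agents.size() + n - 1) / n;
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < n; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = std::min(agents.size(), begin + chunk);
        if (begin >= end) break;
        workers.emplace_back([&agents, &world, begin, end] {
            for (std::size_t i = begin; i < end; ++i)
                agents[i].assess(world);
        });
    }
    for (auto& w : workers) w.join();
}

int main() {
    World world;
    std::vector<Agent> agents(10000);
    updateAgents(agents, world);  // one parallel AI pass per frame
    return 0;
}

A real engine would use a persistent job system rather than spawning threads each frame, but the decomposition is the same.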
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
Why don't you link all the devs that celebrate the 6-8 weak cores? We have examples of devs complaining that they are too slow, so it would be nice if you could list those that think the opposite.

And spending time to optimize for slow hardware = more money, more time, and more work. I am sure everyone loves that as well.

Yeah.... Right.....

I also remember all the developers complaining about programming for the Emotion Engine. That clearly led to the PlayStation 2 being a complete sales flop, right?
 

Rakehellion

Lifer
Jan 15, 2013
12,181
35
91
The team anticipated "a tenfold improvement over everything AI-wise," Pontbriand said, but they were "quickly bottlenecked."

They have no business running AI on the CPU. That's just bad programming.