Quality/Performance Issues in Assassin's Creed: Unity [WCCF]


Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Wow keysplayr, looks very smooth. I bet that in two months you will have a locked 60fps with a single GTX 980 at the same settings (I left a comment on your vid)

________________________________

I just tried to run Black Flag to compare and apparently Eyefinity doesn't work anymore with recent drivers. I haven't found much info about it on the net, but I did find some.

http://nerdanswer.com/answer.php?q=580632
http://forums.amd.com/game/messageview.cfm?catid=474&threadid=180100



It really sucks.

Ok Cool!

Carfax mentioned that with FXAA, somebody was getting 40fps avg at 1440p with AC Unity on a single GTX980. That is not out of the question. I can't test it yet because my 27" monitor took a dive, but soon I'll be able to.

As far as 1440 at 60fps, perhaps that was with 980 SLI. I can test that too. I just have to put the other card in.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
The MSAA implementation isn't terrible. If you were familiar with the disadvantages of MSAA in a deferred rendering engine, then you wouldn't even be asking me this question.

I respect your opinion, but I also think it's B.S. because I know you're pro-AMD and you hate anything that has to do with NVidia's Gameworks. So basically, although you're entitled to your opinion, it cannot be taken seriously as it's not objective.

Frostbite 3 is a capable engine, but I haven't seen the kind of visuals from any FB3 game that I've seen in ACU.

Examples. Not my personal screenshots, but they serve. These two screenshots alone are superior to anything seen in any FB3 game to date, including Dragon Age Inquisition.

The detail in ACU is unmatched, and so are the number of animations, the quality of lighting, and even the hair simulation. Plus, ACU supports physically based rendering. Dragon Age Inquisition by comparison has plasticky looking hair and skin, and has no support for PBR as far as I know. Plus the tessellation factor is weak for a modern title, because AMD's tessellation performance is average at best.

And unlike you, my opinion is objective. I have the game on preorder, and I have no irrational hatred towards Bioware or EA..

[Two attached ACU screenshots]

That's great that you can post screens of ACU looking good.

Since they aren't your screens, want to tell us what build they are on, what the system is being used to run them, etc.?

Otherwise, they could be running at 1 FPS for all we know and were only made for art and not to show remotely achievable gameplay.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
2nd vid not working

Sorry, Gotcha. Still being published. Should be done shortly.

EDIT: Interesting. Even after being published, it's only available at 360p for about 5 minutes. Then after that, all the other quality options become available.

2nd video is published, but wait a few for the 1080p option to become available.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I noticed some odd problems with the hair and beard during the first video (it looked like white lines through it all), and I was curious about that box attached to his butt near the beginning. Are those some of the graphical problems mentioned, or something story-related I don't know about?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Don't see 980 dropping below 280x levels there, don't even see it falling behind 290x with similar cpu.

No one said anything about the 980 dropping below the 290X's level. If you look at Computerbase's review, the 980 is way more sensitive to CPU speed than the 290X. Therefore, Ubisoft's claim that only AMD CPUs and GPUs will experience adverse performance in Unity is BS, as the 980 clearly bombs with a slow Intel CPU.

On our forum, some members tried to insinuate that in almost all modern games it is AMD GPUs that require the faster CPU to show their full potential, but it's clearly a game-by-game thing. AC shows the opposite, in fact. When running a 4770K with 2 cores @ 3.5GHz, both the 290X and the 980 can only manage 30.8 fps. When running a 4770K with 4 cores @ 2.5GHz, both the 290X and the 980 again hit an identical 39.9 fps. Only when we get to the 4770K @ 3.5GHz with HT disabled does the 980 gain a measurable 17% lead (50.2 vs. 43). That's what I mean: with a slower CPU, all of the 980's performance advantage is wiped out, while the 290X hardly loses much in comparison.
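Just to make the scaling in those Computerbase numbers explicit, here's a quick sketch using only the fps values quoted above (nothing here is my own measurement):

```python
# Relative 980-vs-290X lead at each CPU config, using the Computerbase
# numbers quoted above (fps values are theirs, not my own measurements).
configs = {
    "4770K, 2 cores @ 3.5 GHz":         {"290X": 30.8, "GTX 980": 30.8},
    "4770K, 4 cores @ 2.5 GHz":         {"290X": 39.9, "GTX 980": 39.9},
    "4770K, 4 cores @ 3.5 GHz, HT off": {"290X": 43.0, "GTX 980": 50.2},
}

for cpu, fps in configs.items():
    lead = (fps["GTX 980"] / fps["290X"] - 1) * 100
    print(f"{cpu}: 290X {fps['290X']:.1f} fps, 980 {fps['GTX 980']:.1f} fps "
          f"-> 980 lead {lead:.0f}%")
# Only the fastest CPU config shows the ~17% lead; on the slower configs the two
# cards tie, i.e. the 980's advantage evaporates once the CPU becomes the bottleneck.
```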

At 2560x1600 with the typical IQ reduction of FXAA (blur), the 970 itself gets only 31 fps, the 780 Ti gets <34 fps, and the 980 manages <37 fps average. Therefore, it's impossible to maintain 60 fps minimums throughout the entire game with FXAA at 1440P/1600P even with dual 980s. With MSAA enabled, there's no need to even comment.
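To make that reasoning concrete, here's a rough sanity check; the SLI scaling factor and the minimum-to-average ratio below are generic assumptions for illustration, not measured AC Unity values:

```python
# Rough sanity check of the "even dual 980s can't hold 60 fps minimums" claim.
single_980_avg = 37.0   # fps average at 2560x1600 with FXAA, as quoted above
sli_scaling = 0.85      # assumed ~85% benefit from the second card
min_to_avg = 0.7        # assumed minimums around 70% of the average

sli_avg = single_980_avg * (1 + sli_scaling)
sli_min = sli_avg * min_to_avg
print(f"980 SLI estimate: ~{sli_avg:.0f} fps average, ~{sli_min:.0f} fps minimums")
# ~68 fps average but only ~48 fps minimums under these assumptions, which is why
# a locked 60 fps minimum at 1600P looks out of reach even for dual 980s.
```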

I am inclined to believe the professional performance data, from GameGPU to Computerbase, over some PC gamer online who cherry-picks less demanding scenes to show good performance because he isn't objective about AC Unity and will defend it at all costs. When someone has a vested interest in the franchise and needs to justify expensive GPU upgrades that happen to put him in the 1% of GPUs that work well enough to run the latest iteration of said franchise, we have to look at more objective opinions from professionals regarding optimizations and overall performance characteristics.

Far Cry 4 may be the final straw for Ubisoft in PC gaming. Another technical disaster like Unity or Watch Dogs and they'll have a near impossible time restoring their reputation.

Far Cry 4 can't be another low-tech, shoddy port with mediocre visuals and huge performance demands. The pre-release shots are a mixed bag; one screen looks alright.

FC4 will most likely just be FC3+, and nothing more. I don't understand why it's so difficult for modern PC gamers to accept that we are in a period of console ports. With Crytek, DICE and CDPR not releasing any powerhouse games annually, 90% of the games we'll get are going to be console ports. The added features will be whatever AMD and NV throw in via GE or GW. As an added benefit of higher resolution, we'll get sharper textures and some AA modes on the PC. By and large the game itself will look 90% identical to the PS4 version, especially when it comes to Ubisoft games as they are all for parity / an identical gaming experience (Oct 14, 2014 -- N4G).

Computerbase posted a comparison video of AC Unity on PS4 vs. PC and I suggest you download it or watch it in full screen.
http://www.computerbase.de/2014-11/assassins-creed-unity-grafikvergleich-pc-und-ps4/

^ If you notice, apart from slightly sharper textures and resolution, which is to be expected on the PC since the consoles run at 900P, and slightly better/more detailed shadows on the PC, AC Unity is basically 95% identical to the PS4 version! This comparison was run on a 980 with FXAA on the PC at 1080P, and for the life of me I am not wowed even for a second that a $550 GPU can barely produce better visuals than a $400 console with an underpowered Jaguar CPU and a 7850. It's actually shocking how much of a console port Unity really is. I said from the beginning that Unity was an XB1 game with GW NV code thrown in at the last minute. Heck, even NV's tessellation patch is not ready nearly a week after launch. The videos show the PS4 and PC providing near-identical visuals, while the XB1 runs faster than the PS4 despite a 40-50% GPU disadvantage. What more needs to be said about Unity?

This is nothing at all like when Crysis 1 came out. When C1 came out, it was 1 full generation ahead of PS3/Xbox 360. There wasn't a single game on those consoles for 5 years to match the graphics of C1. You don't need to be an art or design major to notice that AC Unity is NOT a true next generation PC game. In no way shape or form is it 1 generation ahead of PS4's graphics. The same fate awaits FC4.

That is because there is a big difference between making a next generation game specifically for the PC on a next generation game engine and making a next generation game by targeting PS4/XB1 on a slightly updated last generation game engine. Unfortunately for Ubisoft, they do not have any true next generation game engines at their disposal, so naturally it is impossible for them to make a true next generation PC game. All those PC-specific effects that NV provides for free wouldn't even be in the game if it weren't for GW, and this "next generation" PC game would look 99% identical to the PS4 version. :whistle:

Maybe some will say that my expectation of a next generation PC game is too unrealistic. Perhaps, but I expect a generational leap from BF4, Crysis 3, and Metro LL as those are "old games" and I am not seeing it on any game on the PC or PS4/XB1 so far. Clearly AC Unity and FC4 will not be those games. Next year we will get GPUs 40-50% more powerful than 980 but instead of next generation graphics, we seem to be getting next generation performance demands, minus the great AI, graphical and physics leap. Games like Evil Within, Lords of the Fallen, Titanfall, Watch Dogs all have poor performance and average graphics at best. What gives? Star Citizen, Uncharted 4 and Witcher 3 might get there.

--

For anyone paying attention it looks like AC Rogue also bombed. Gamespot - 6/10, IGN 6.8. To get scores that low on those sites implies the game is really not fun to play. This is one of Ubisoft's worst years ever. FC4 is the only game they released this holiday worth buying, but even then the reviewers are saying it isn't as fun as FC3's islands.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Single GTX980
https://www.youtube.com/watch?v=KBSyd1rR9sk

GTX980 SLI 4X-MSAA
http://youtu.be/ngpFxaxkOcU

GTX980SLI FXAA
http://youtu.be/Bv4D0v3ZlBg

LOL somebody all of a sudden disliked all three videos.
Wonder who that could have been........ ;)
that is pretty horrible performance bro :) it is obviously the game's fault :)

final question: is the 4x msaa one with all max or mostly max settings? if it is, your performance is 10 fps above what I stated :) oh and, please try to run in your next video, I want to see the frames dip to 30 or lower :)
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Make a video to gain some credibility. If you want to prove a point, prove it right.

I'm not going to upload any videos, mostly because I just can't be bothered. It's not important enough for me to take the time to do so..

I'm already wasting enough time arguing with you people over this.

Yesterday I tried AC Unity on quad R9 290X with triple monitors at 8040x1440 and to my surprise (not) only one card was utilized and it was literally a slide show. 10 fps max (at ultra settings of course), and please don't call me incompetent.
It looks like AMD doesn't have a Crossfire profile for the game yet. That's not really Ubisoft's fault. It's AMD's responsibility to provide working Crossfire profiles, just like it's NVidia's responsibility to provide working SLI profiles.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
MSAA is still functional in a deferred rendering engine, case in point: BF3, BF4. Performance loss isn't major, certainly nowhere near the turd implementation from Ubisoft.

Surely you know why that is.. BF3 and BF4 both use prebaked lighting, plus those games aren't rendering anywhere near as much detail as ACU. AC Unity also uses baked lighting, but it's the global illumination variety and so is much more complex than what is found in the BF series..

I mean, how is it acceptable for MSAA to carry such a massive performance loss? It's almost the same perf hit as SSAA! Rather than demanding better optimizations, you diss others who run with MSAA because they prefer a cleaner, sharper image.
MSAA does not give you a cleaner and sharper image in AC Unity. Plus, MSAA uses FXAA in this game anyway so you can't have one without the other.
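To put rough numbers on why MSAA hurts so much more in a deferred engine than in a forward one, here's a back-of-envelope sketch. The G-buffer layout and per-pixel sizes below are generic assumptions for illustration, not AC Unity's actual render targets:

```python
# Back-of-envelope G-buffer cost with MSAA in a deferred renderer.
# Target count and formats are assumptions for illustration only.

WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4        # e.g. an RGBA8 target (assumed)
GBUFFER_TARGETS = 4        # e.g. albedo, normals, material params, depth (assumed)

def gbuffer_mb(msaa_samples: int) -> float:
    """G-buffer size in MB; with MSAA every target stores one value per sample."""
    return WIDTH * HEIGHT * BYTES_PER_PIXEL * GBUFFER_TARGETS * msaa_samples / (1024 ** 2)

for samples in (1, 2, 4):
    print(f"{samples}x MSAA: ~{gbuffer_mb(samples):.0f} MB of G-buffer")
# ~32 MB at 1x vs ~127 MB at 4x -- and the deferred lighting pass has to read and
# shade all of those samples, so the bandwidth/shading cost scales the same way,
# approaching SSAA when the lighting is complex. A forward renderer only multiplies
# a single color+depth target, which is why MSAA is comparatively cheap there.
```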

Those fake screenshots are great, straight from the pre-rendered factory. The fact that you play the game on "Max" settings and your own screenshots...
LOL you think they are fake? I hate to break it to you, but they're not. The top one I posted is in game, and the bottom one is from a cut scene which is still IN ENGINE and is not pre-rendered.

Source for screenshots

...your own screenshots are rubbish in comparison should have alerted you to Ubi's often misleading PR selling "special" rendered images as in-game screenshots.
My screenshots are not rubbish in comparison, and they are in totally different areas. I haven't progressed far in the game at all as I've been going around the map collecting as many collectibles as I can.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
That's great that you can post screens of ACU looking good.

Since they aren't your screens, want to tell us what build they are on, what the system is being used to run them, etc.?

Otherwise, they could be running at 1 FPS for all we know and were only made for art and not to show remotely achievable gameplay.

Here's the source for those screenshots..

The top one is in game; the second is from a cut scene, but it's still in engine..
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Single GTX980
https://www.youtube.com/watch?v=KBSyd1rR9sk

GTX980 SLI 4X-MSAA
http://youtu.be/ngpFxaxkOcU

GTX980SLI FXAA
http://youtu.be/Bv4D0v3ZlBg

LOL somebody all of a sudden disliked all three videos.
Wonder who that could have been........ ;)

Thanks for uploading. Frame rate hits 60 FPS as I knew it would, but it looks like your CPU is holding you back man. On my own system, the frame rate is very steady, which I think is because I have a hex core processor..

With V-sync disabled, I was seeing frame rates into the 70s if I recall.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
that is pretty horrible performance bro :) it is obviously the game's fault :)

final question: is the 4x msaa one with all max or mostly max settings? if it is, your performance is 10 fps above what I stated :) oh and, please try to run in your next video, I want to see the frames dip to 30 or lower :)

hehe, wait. Tell us again exactly what you said before. Saves us the trouble of backtracking through the thread.

Ah heck I'll do it for ya: Post 78

"that is bs. I know for a fact that 980 sli is getting 30 to 40 fps. hell, someone even tested it with quad 980 sli and still get redacted fps. and yes I am talking about max or almost max settings. why bother with anything if you got 980 sli. if a gamer got 1200$ of gpus, he will be expecting max in 1080p at the least."

Apparently you didn't know it for a fact.

The settings are as high as they can go in ALL 3 videos, with the exception of using FXAA in the last one. All other settings completely maxxed.
Also keep in mind that Shadowplay, while it doesn't impact performance much, does lower the framerate by 3-4 fps compared to running without Shadowplay recording. Also consider my i5 2500K as opposed to, say, the latest mainstream high core count Intel CPU. And my res is just a scooch higher at 1200p instead of 1080p. Not much, but these little things add up.

I am due for a platform upgrade as I have skipped both Ivy and Haswell. I'm hoping Broadwell will be at least another 15% over Haswell. For me, that would be a substantial upgrade opportunity.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Here's the source for those screenshots..

The top one is in game; the second is from a cut scene, but it's still in engine..

So those pics are from a High Res Screenshot thread and aren't representative of what you'd see in actual gameplay. I've seen GREAT screenshots, but when I see gameplay it looks like Keysplayr's and your screenshots. Rubbish.

These screens were taken with the INTENTION of looking cinematic so I don't see your point in posting them. They weren't taken with an intention of showing what it'd look like in actual gameplay.

If I had the hardware Keysplayr had running at the settings he has stated, I'd be extremely disappointed.

It's nice, though, to see that the engine is capable of making the game look amazing. Now the question is whether the AC Unity team can get the performance to make those shots feasible.

Edit: If we're just taking screens as the measure of the "best looking game", we can find tons of highly modded Skyrim screens out there. Doesn't mean those screens are playable.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
So those pics are from a High Res Screenshot thread and aren't representative of what you'd see in actual gameplay. I've seen GREAT screenshots, but when I see gameplay it looks like Keysplayr's and your screenshots. Rubbish.

These screens were taken with the INTENTION of looking cinematic so I don't see your point in posting them. They weren't taken with an intention of showing what it'd look like in actual gameplay.

If I had the hardware Keysplayr had running at the settings he has stated, I'd be extremely disappointed.

It's nice, though, to see that the engine is capable of making the game look amazing. Now the question is whether the AC Unity team can get the performance to make those shots feasible.

Edit: If we're just taking screens as the measure of the "best looking game", we can find tons of highly modded Skyrim screens out there. Doesn't mean those screens are playable.

To be fair, there is a LOT of detail in this game everywhere you look. And my rig isn't the fastest out there in terms of platform. I should be getting over 70fps (I feel) in 980SLI testing with all settings maxxed and 4X-MSAA at 1080p.
That said, I am quite in agreement that Ubi needs to optimize this title far more than they have, if they've done any optimization at all. Poor port.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
hehe, wait. Tell us again exactly what you said before. Saves us the trouble of backtracking through the thread.

Ah heck I'll do it for ya: Post 78

"that is bs. I know for a fact that 980 sli is getting 30 to 40 fps. hell, someone even tested it with quad 980 sli and still get redacted fps. and yes I am talking about max or almost max settings. why bother with anything if you got 980 sli. if a gamer got 1200$ of gpus, he will be expecting max in 1080p at the least."

Apparently you didn't know it for a fact.


Also keep in mind that Shadowplay, while it doesn't impact performance much, does lower the framerate by 3-4 fps compared to running without Shadowplay recording. Also consider my i5 2500K as opposed to, say, the latest mainstream high core count Intel CPU. And my res is just a scooch higher at 1200p instead of 1080p. Not much, but these little things add up.

I am due for a platform upgrade as I have skipped both Ivy and Haswell. I'm hoping Broadwell will be at least another 15% over Haswell. For me, that would be a substantial upgrade opportunity.
come on bro, show me a video of you running through the game, you know, play the game ahahah. not just casually walking :) I want to see those frames dip! :twisted: you got to the low 30s, I want to see low 20s or 10s :twisted:

fyi, the 30 to 40 is from a guy who has the new intel 8 core cpu + 980 sli. he played the game, not just stared up at the sky for screen caps or casually walked through it. 100% max on every graphical setting of course :awe:. he experienced frame dips constantly.

in response to my post 78, I point to my post 108 :biggrin: I don't mind more anecdotal evidence, the more the better, even if they aren't the same.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
come on bro, show me a video of you running through the game, you know, play the game ahahah. not just casually walking :) I want to see those frames dip! :twisted: you got to the low 30s, I want to see low 20s or 10s :twisted:

fyi, the 30 to 40 is from a guy who has the new intel 8 core cpu + 980 sli. he played the game, not just stared up at the sky for screen caps or casually walked through it. 100% max on every graphical setting of course :awe:. he experienced frame dips constantly.

in response to my post 78, I point to my post 108 :biggrin: I don't mind more anecdotal evidence, the more the better, even if they aren't the same.

Yeah, I got to the low 30s with a single card running. Your claim was that two GTX 980s suffered that fate. We can plainly see it's untrue.

And as you continue to claim things, it only becomes more bizarre. You can see for yourself that I am scoring double what you claim a new Intel 8-core CPU + 980 SLI setup gets. Who is this "guy" you keep referring to? I'd like to speak with him if possible.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
FC4 will most likely just be FC3+, and nothing more. I don't understand why it's so difficult for modern PC gamers to accept that we are in a period of console ports. With Crytek, DICE and CDPR not releasing any powerhouse games annually, 90% of the games we'll get are going to be console ports. The added features will be whatever AMD and NV throw in via GE or GW. As an added benefit of higher resolution, we'll get sharper textures and some AA modes on the PC. By and large the game itself will look 90% identical to the PS4 version, especially when it comes to Ubisoft games as they are all for parity / an identical gaming experience (Oct 14, 2014 -- N4G).

Computerbase posted a comparison video of AC Unity on PS4 vs. PC and I suggest you download it or watch it in full screen.
http://www.computerbase.de/2014-11/assassins-creed-unity-grafikvergleich-pc-und-ps4/

^ If you notice, apart from slightly sharper textures and resolution, which is to be expected on the PC since the consoles run at 900P, and slightly better/more detailed shadows on the PC, AC Unity is basically 95% identical to the PS4 version! This comparison was run on a 980 with FXAA on the PC at 1080P, and for the life of me I am not wowed even for a second that a $550 GPU can barely produce better visuals than a $400 console with an underpowered Jaguar CPU and a 7850. It's actually shocking how much of a console port Unity really is. I said from the beginning that Unity was an XB1 game with GW NV code thrown in at the last minute. Heck, even NV's tessellation patch is not ready nearly a week after launch. The videos show the PS4 and PC providing near-identical visuals, while the XB1 runs faster than the PS4 despite a 40-50% GPU disadvantage. What more needs to be said about Unity?

This is nothing at all like when Crysis 1 came out. When C1 came out, it was 1 full generation ahead of PS3/Xbox 360. There wasn't a single game on those consoles for 5 years to match the graphics of C1. You don't need to be an art or design major to notice that AC Unity is NOT a true next generation PC game. In no way shape or form is it 1 generation ahead of PS4's graphics. The same fate awaits FC4.

That is because there is a big difference between making a next generation game specifically for the PC on a next generation game engine and making a next generation game by targeting PS4/XB1 on a slightly updated last generation game engine. Unfortunately for Ubisoft, they do not have any true next generation game engines at their disposal, so naturally it is impossible for them to make a true next generation PC game. All those PC-specific effects that NV provides for free wouldn't even be in the game if it weren't for GW, and this "next generation" PC game would look 99% identical to the PS4 version. :whistle:

Maybe some will say that my expectation of a next generation PC game is too unrealistic. Perhaps, but I expect a generational leap from BF4, Crysis 3, and Metro LL as those are "old games" and I am not seeing it on any game on the PC or PS4/XB1 so far. Clearly AC Unity and FC4 will not be those games. Next year we will get GPUs 40-50% more powerful than 980 but instead of next generation graphics, we seem to be getting next generation performance demands, minus the great AI, graphical and physics leap. Games like Evil Within, Lords of the Fallen, Titanfall, Watch Dogs all have poor performance and average graphics at best. What gives? Star Citizen, Uncharted 4 and Witcher 3 might get there.

--

For anyone paying attention it looks like AC Rogue also bombed. Gamespot - 6/10, IGN 6.8. To get scores that low on those sites implies the game is really not fun to play. This is one of Ubisoft's worst years ever. FC4 is the only game they released this holiday worth buying, but even then the reviewers are saying it isn't as fun as FC3's islands.


What's odd is that we now have two Ubisoft games, AC Unity and Watch Dogs, that perform terribly and have lackluster visuals, especially given their performance demands. Unity really amped this fail metric up. Look at the examples in the thread: the game looks like it's on PS4 and runs like a turd even at 1080p with the best single GPU available. Odder still, the two are built on completely different game engines and were made by different Ubisoft studios.

I am waiting on the Far Cry 4 performance benches to come in. I think it's probably nvidia's gameworks that is destroying the experience in these games by killing performance: closed libraries that not even the devs can look into and optimize for. The games perform unreasonably better on nvidia cards than on AMD, a far bigger gap than the normal performance disparity we usually see, but even then they perform like dogs on nvidia cards.

Far Cry 4 is the next game to use all these gameworks features, and it also has its own unique engine and dev team. If it is another unoptimized turd like AC Unity and Watch Dogs, then it's probably gameworks making all these games perform like crap while delivering nothing technically impressive in return.

We need to see a game from another developer that uses gameworks to see what is happening here. Only Ubisoft seems willing to implement these features, and they are possibly paying for it with terribly performing games. The only other game on the horizon I know of is Witcher 3, and AFAIK it is not using all the features the Ubi games are. I've only seen GPU PhysX features mentioned.