
AC: Unity - GTX 680 or above minimum, 780+ recommended


Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Did something go disastrously wrong at Ubisoft to warrant such high minimum requirements? It's almost like they want the game to be only for elite players with high-end rigs.
Was the development team too understaffed or underqualified, so that the tuning work was mostly left out and brute-forced by hardware instead?
Considering how Microsoft and Sony are pressuring developers to play down the PC and gimp games to 30fps etc., it wouldn't surprise me if they were deliberately publishing higher specs so most people with lesser PCs would just buy it on a console.

It kind of helps both industries to do this, PC hardware as well as Consoles makers.

It could just be a case of them not having a lower-spec PC to try it on. I remember when Dishonored came out and it recommended a 460, when it could easily run on an 8800GT.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
744
126
^ That's why it's generally a good idea to see how your particular GPU performs before making the upgrade for that particular game you want to play. I wouldn't be surprised if Ubisoft completely ignored using DirectCompute for global illumination because that would have required more intense programming than just throwing this on the CPU. The fact that the XB1 and PS4 versions will look nearly identical despite the PS4 having 50% more computational prowess and graphics performance shows that Ubisoft didn't even try to optimize the game specifically for the PS4. This game was running at just 9 fps on the consoles 9 months ago, which shows how little the game engine was optimized for modern GPUs. Imagine 9 fps on an HD 7850-class GPU at 900p!
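For context, the "50% more" figure lines up with the consoles' shader counts; factoring in the Xbox One's slightly higher GPU clock narrows the theoretical compute gap somewhat. A rough back-of-the-envelope sketch, using the commonly reported launch GPU specs (these numbers are public spec-sheet figures, not anything from Ubisoft):

```python
# Back-of-the-envelope GPU compute comparison for PS4 vs. Xbox One,
# based on publicly reported launch specs.
def gpu_tflops(shaders: int, clock_mhz: float) -> float:
    """Theoretical single-precision TFLOPS: 2 FLOPs per shader per cycle."""
    return shaders * clock_mhz * 1e6 * 2 / 1e12

ps4 = gpu_tflops(1152, 800)  # PS4: 18 CUs = 1152 shaders at 800 MHz
xb1 = gpu_tflops(768, 853)   # Xbox One: 12 CUs = 768 shaders at 853 MHz

print(f"Shader count ratio: {1152 / 768:.2f}x")            # 1.50x (the "50% more")
print(f"PS4 {ps4:.2f} TFLOPS vs XB1 {xb1:.2f} TFLOPS")     # ~1.84 vs ~1.31
print(f"Theoretical compute advantage: {(ps4 / xb1 - 1) * 100:.0f}%")  # ~41%
```

So the raw shader count is 50% higher on the PS4, while the theoretical TFLOPS gap works out to roughly 40%; either way, identical-looking versions suggest that headroom went unused.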

I think what Ubi did was take their "good enough" optimized game and just port it to the PC with higher-resolution textures. Then they worked with NV to add in the GW-specific stuff. Instead, the game should have been made for the PC from the ground up, used compute to accelerate intensive GPU effects like global illumination, and then been ported back to consoles. Since they talk about how slowly the game ran on consoles just 9 months ago, I think by that point they hadn't even started on PC optimizations.

OK, so it's a next-generation game for PS4/XB1 that seems like it hasn't been optimized for the PC. If the developer cared to optimize for GPU scaling, there is no way the PS4 and XB1 versions would look nearly identical. You would expect the unused PS4 GPU power to go somewhere.

Also, since NV basically supplies their specific GW code for the features they want implemented, the optimization for all AMD GPUs from day 1 will be non-existent.

After Watch Dogs though I bet Ubisoft wants to invest as little as possible into PC ports. That's why even Far Cry 4 looks like a console port, only partially saved by NV's GW's advanced features. I bet if NV didn't help finance/send engineers to help Ubisoft, we would have gotten even worse versions of AC, FC and WD franchises.
 
Last edited:

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
How did you come to that conclusion?

One could also turn it around and say it's to get people to buy 290/X cards.
Possibly. But anyone looking to buy a new card would generally go with a newer product I guess. I wouldn't make a lateral move from a 780 or 780ti but if I didn't know better about inflated VRAM requirements I would just get a 970/980 to cover myself. But from a lesser card perhaps I may consider a used 290 if the price were really competitive.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
If they wanted us to buy it on console, they wouldn't waste their time implementing PC technologies such as PCCS, HBAO+, TXAA, hairworks, enhanced surface tessellation etcetera.
Yeah we've gone a little too doom and gloom in the PC forums. It's not perfect but it isn't terrible either.
 

Carfax83

Diamond Member
Nov 1, 2010
6,064
868
126
How many times are people on this forum going to repeat this complete nonsense? Comparing console specs to PC specs is not even remotely close to a 1:1 comparison of what kind of PC you can expect to run a game.
Nobody said anything about a 1:1 comparison. It's just an approximation. The fact is hardware is hardware, and the PS4 and Xbox One represent the base gameplay experience.

A GTX 680/7970 should be easily capable of delivering above that base experience.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
744
126
Yeah we've gone a little too doom and gloom in the PC forums. It's not perfect but it isn't terrible either.
There are 350+ comments on Ubi's blog pretty much mirroring the responses on our forum:
http://blog.ubi.com/assassins-creed-unity-pc-specs/

Look at the video Ubisoft just released and stop at the 0:26 mark. Are you kidding me? Crysis 3, Metro LL and Ryse: SOR blow this out of the water:

- Low polygon character models
- Unrealistic hair
- Low level of detail

Go to 0:36-0:38

- Low quality to non-existent character shadows
- Low textures
- Low polygon models/low level of detail with overpowering DOF that's trying to hide these shortcomings.

https://www.youtube.com/watch?v=3-h8me2zrRw

This could be the best game of the year to some or the best AC game ever. But I am just saying given that level of graphics, a 680 should be able to hit 50-60 fps at 1080P. This is not like a game that looks 2-3x better than Black Flag.

I am not surprised though, as Black Flag was crazy demanding for its graphics: <30 fps on a 580, <40 fps on a 680, and <60 fps on a 780Ti!



At 1600P, you need 290X CF or 780 SLI to hit 60 fps.



You tell me, are AC games optimized?

Black Flag has drops to 49 fps on a 2600K for crying out loud with 780Ti SLI, and there was hardly scaling beyond 4 threads, not to mention i5-760 stock was barely slower than i7 4770K. What?! :eek:

 

Aikouka

Lifer
Nov 27, 2001
29,939
596
126
If the developer cared to optimize for GPU scaling, there is no way Ps4 and XB1 versions would look nearly identical.
If I had to guess, the issue is that during their development, neither console got to 60 FPS (e.g. X1 @ 35 FPS, PS4 @ 45 FPS), so both were "dumbed down" to 30 FPS. Why would Ubisoft want to do anything more for the PS4? Unless Sony gives them an "incentive", there's no reason for them to provide a visual advantage to a single console given a sale for either is good for Ubisoft.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
744
126
If I had to guess, the issue is that during their development, neither console got to 60 FPS (e.g. X1 @ 35 FPS, PS4 @ 45 FPS), so both were "dumbed down" to 30 FPS. Why would Ubisoft want to do anything more for the PS4? Unless Sony gives them an "incentive", there's no reason for them to provide a visual advantage to a single console given a sale for either is good for Ubisoft.


Why hasn't that been the case for BF4, COD: Ghosts, The Evil Within, Metro Redux and many other games? If Ubisoft didn't even bother to optimize for the PS4, I doubt they spent much time optimizing for the PC, since they know people have a GTX 780 paired with an i7 4770K and so on. It seems Ubisoft either cares little about providing the best possible experience on a per-platform basis, or the agreement with MS is what made them keep every version similar, excepting the tweaks NV introduced to the PC version courtesy of their other agreement with NV.

The problem with these recent PS4/XB1 ports being locked to 30 fps could be scalability of the game on faster hardware, since the developer never bothered optimizing the game engine:

"Bethesda has given PC users the tools to unlock The Evil Within's built-in 30fps cap, but we couldn't help but wonder - just how much computational power is required to double the game's frame-rate and produce a sustained, consistent 60fps experience at 1080p? It turns out that our PC test rig - fitted with a Core i7 3770K overclocked to 4.3GHz and matched with 16GB of DDR3 - couldn't handle it, not even when outfitted with the GeForce GTX 980, the fastest single-chip graphics card available on the market today.

What's curious is that despite The Evil Within's apparently mammoth system requirements for the best experience, what looks like a straight, under-optimised PC conversion works fairly well with entry-level enthusiast hardware. As we've seen in the past, a hard 30fps lock can give both CPU and GPU a lot of "wiggle room" - and in the case of The Evil Within, a lowly Core i3 4130 paired with a £100 GTX 750 Ti produces an experience very close indeed to the PS4 version. The problem is scalability - while it's relatively straightforward to maintain 30fps at higher resolutions with a good enthusiast card, locking to 60fps at 1080p just wasn't possible on any of the hardware configurations we tested.

The Evil Within joins the ranks of a steadily growing number of disappointing PC conversions, where achieving console-style image quality and performance is relatively straightforward on mid-range gaming PCs, but scaling up beyond the 30fps console standard to a locked 60fps proves inordinately difficult. "
~ Digital Foundry
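The "wiggle room" Digital Foundry mentions comes from the cap itself: with a hard 30 fps lock, the engine simply sleeps away whatever is left of each frame's time budget, so hardware that could render faster just idles. A minimal sketch of such a frame limiter (all names here are hypothetical, not from any actual engine):

```python
# Minimal sketch of a hard frame-rate cap: render, then sleep off the
# remainder of the frame budget. Hardware headroom is burned as idle time.
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame

def run_capped(render_frame, num_frames):
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()  # simulate/render one frame
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            # The "wiggle room": a frame that took 22 ms still occupies 33 ms,
            # so a GPU capable of ~45 fps looks identical to one capable of 60.
            time.sleep(FRAME_BUDGET - elapsed)
```

This is also why a modest CPU/GPU pairing can match the console experience at a locked 30 fps, while removing the cap exposes the true, unoptimized per-frame cost and makes a locked 60 fps so hard to reach.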

Is 30 fps locked AC U going to have the same issues with scaling? Hopefully not.
 
Last edited:

Fire&Blood

Platinum Member
Jan 13, 2009
2,331
16
81
If I had to guess, the issue is that during their development, neither console got to 60 FPS (e.g. X1 @ 35 FPS, PS4 @ 45 FPS), so both were "dumbed down" to 30 FPS. Why would Ubisoft want to do anything more for the PS4? Unless Sony gives them an "incentive", there's no reason for them to provide a visual advantage to a single console given a sale for either is good for Ubisoft.
Hey buddy long time no see on the forums :)

The incentive for parity came from MS's side; they have a marketing deal with Ubisoft for Unity. All the videos published so far have Xbox button prompts. The preload for the game started today, but only for the Xbox, of course. I'm leaning towards thinking that without the deal, the PS4 would have had 1080p and the Xbox would have stayed at 900p, as has been the case so far for the majority of games. Obviously 60fps was out of the question for this game, but the PS4 tends to hit the target resolution of 1080p. AFAIK, this is the first multi-platform title where both consoles are set to an equal sub-1080p resolution. In games where the PS4 rendered at sub-1080p, the Xbox version was even lower.
 

Aikouka

Lifer
Nov 27, 2001
29,939
596
126
Why hasn't that been the case for BF4, COD : Ghosts, Evil Within, Metro Redux and many other games?
Different companies with different priorities. I have no idea whether the change was influenced by money or not (although it's not farfetched), but putting more effort into the weaker system and letting the stronger system brute-force the unoptimized code certainly sounds quicker (i.e. cheaper) to me.

Although, there's another thing to consider. Just because it's locked to 30 FPS doesn't mean that's the capability of the system. I mentioned 35 FPS and 45 FPS just as numbers pulled out of my nether region, but if those numbers were even close, one thing you could guess is that the PS4 would probably have far less chance of dipping below 30 FPS when things got wild. It's kind of like how Bayonetta 2 is 60 FPS on the Wii U... most of the time. :p

If Ubisoft didn't even bother to optimize for PS4, I doubt they spent much time optimizing for PC since they know people have GTX780 paired with i7 4770k and so on.
Maybe... maybe not. I don't necessarily want to guess, but we've definitely seen how past releases have been affected by console hardware with things like uncompressed audio and such. Although, that normally affects us by messing up our ISP's bandwidth cap more than anything, and with Comcast finally enforcing my 300GB cap, I don't really want to download a 50GB game.

See the problem with these recent PS4/XB1 ports being locked to 30 fps could become scalability of the game with faster hardware since the developer never bothered optimizing the game engine
That's why I'm wondering if it just comes down to money. Why spend the time (i.e. money) when you can make up some reason to lock it to 30 FPS (e.g. "The Cinematic Feel"), which involves much less effort. We've actually seen other sorts of awful ports recently such as Square-Enix's Final Fantasy XIII, which pulls some Dark Souls-like rendering snafus.

The thing is... people always say, "If you don't like it, don't buy it!" Although, that makes me wonder what would happen if the PC version sold poorly as a result. Would companies just start abandoning them altogether?

Hey buddy long time no see on the forums :)
Hey! I usually just lurk on this sub-forum. By the time I find an interesting topic to post in, there's usually too much bickering for me. :p

I'm leaning towards thinking that without the deal, PS4 would have had 1080p and the xbox would stay @900p as has been the case so far for majority of games.
Yeah, that's why I'm saying that there just isn't a reason for Ubisoft to do anything. I mean... if Microsoft pays them promotional money (or whatever they call it) and Sony pays them nothing, why worry about the PS4 release? Is it playable? Yes. Is it just as good as the competing console release? Yes. So, why care about making it better?

I'm not trying to say that this lazy and/or dubious approach is acceptable, but from a business perspective, it seems to make sense. Ultimately, if Ubisoft screws up the PC release, then that's their problem when the sales (might) reflect that. I think that the Assassin's Creed series can be quite fun, but if the game is poorly ported, I'd rather wait until I can get it for about $20 or less. So, the best advice is to just wait and see. I just wish they'd stop offering possibly good game content (i.e. extra missions) for pre-ordering. :\

EDIT:

I think the biggest disappointment for me is just that developers seem to forget what the point of a PC is. PCs are all about customization, and I'm not even talking about strictly for gaming. If you want an awesome experience with tons of whiz-bang effects at a high resolution, you're usually the type that pays extra to get it. Unfortunately, companies are in a console-centric mindset where it's all about providing a single experience with only a few adjustable settings. Sure, a lot of people don't want to deal with tweaking settings, but that's also why NVIDIA and AMD have solutions (GeForce Experience and Gaming Evolved respectively) to help users with that.
 
Last edited:
