
Mirror's Edge Catalyst won't run on Pentiums

AnandTech Forums, page 6

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Yes, because a Sempron 3850 is gonna offer an excellent experience.

No idea what you're talking about here.. o_O What does a Sempron 3850 have to do with what we're talking about?

Minimum AMD CPU requirement for the game is an FX 6350..
 

Shivansps

Diamond Member
Sep 11, 2013
3,918
1,570
136
No idea what you're talking about here.. o_O What does a Sempron 3850 have to do with what we're talking about?

Minimum AMD CPU requirement for the game is an FX 6350..

Then your argument of "forcing the best experience by removing dual cores" makes no sense.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Carfax83, you will be happy to know that the latest Frostbite iteration for Mirror's Edge has implemented dynamic global lighting.

http://www.frostbite.com/2016/04/lighting-the-city-of-glass/

No longer baked static illumination.

PBR + Dynamic GI, lots of compute. This game is going to tank on Maxwell, worse on Kepler and shine on GCN.

That's great to know. It's about time that DICE joins the rest of the top tier engines in supporting dynamic global illumination. Now the choice as to who is top dog between UE4, CryEngine and Frostbite 3 just got a lot harder.

As to how it will perform on Maxwell, Kepler and GCN, I guess we'll have to see..

It's a myth though that Maxwell is weak on compute. UE4 uses compute for its dynamic global illumination as well, and from the benchmarks we saw with Fable Legends, Maxwell actually had a large lead over its GCN counterparts in that respect.

I'm sure CryEngine uses compute for its dynamic global illumination too, and we see the same result, with NVidia hardware taking the lead.

[Image: fable-1080p-timings.png]
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Then your argument of "forcing the best experience by removing dual cores" makes no sense.

So I suppose your point is that since the Sempron 3850 is a weak quad core which probably has terrible performance in the game, the number of logical cores is actually irrelevant.

I already responded to VirtualLarry on the previous page about this, when he asked whether I'd admit that a sufficiently powerful dual core could run the game. I said yes, and I even one-upped him by admitting that it was theoretically possible for a sufficiently powerful single core CPU to run the game, with the caveat that the engine would probably have to be reprogrammed..

However, none of this matters, because DICE isn't simply listing a quad core as the minimum specification. It's listing a minimum Intel Core i3-3250 and an AMD FX-6350. Anything less than those two CPUs (regardless of how many logical cores) will likely result in a substandard experience.
 

jpiniero

Lifer
Oct 1, 2010
16,841
7,285
136
Artificially limiting the game to a quad isn't solving anything. The "ignorant fools" will still buy the game and just complain that it doesn't launch.

The Pentium and Celeron are the only ones really affected by this. It's not a big market.

I do kind of subscribe to the "People who buy a $60 CPU probably would just pirate the game" theory.
 
Feb 19, 2009
10,457
10
76
It's a myth though that Maxwell is weak on compute. UE4 uses compute for its dynamic global illumination as well, and from the benchmarks we saw with Fable Legends, Maxwell actually had a large lead over its GCN counterparts in that respect.

IIRC, the 390X beat the 980 at 1080p, while Fury X comes close to the 980Ti at that resolution, which doesn't happen often.

[Image: fable-1080p-avg.png]
Now, it's a UE4 game, and UE4 titles typically have a huge advantage for NV (compare any other UE4 game and you will see how much faster NV is), yet with Fable there's no advantage, even a slight loss.

What I take from that is that if UE4 uses more compute, GCN actually starts to excel and overcome the default deficit it has due to NV's sponsorship and GameWorks integration in that engine, which causes AMD to perform 25-50% worse in those titles (look up Ark, Hatred, and other UE4 titles).

https://developer.nvidia.com/nvidia-gameworks-and-ue4

http://physxinfo.com/news/12540/pre...ntegration-into-unreal-engine-4-is-available/

https://developer.nvidia.com/unrealengine

https://blogs.nvidia.com/blog/2015/11/09/gameworks-vr-unreal-engine-4-ue4/

Were you aware of how awful UE4 games normally run on GCN?

ps. When I say Kepler/Maxwell is bad at compute in games, I don't mean raw compute flops. Their architecture has a much higher latency for switching between graphics and compute workloads, so games that use more compute will cause more delays for graphics rendering. This is the one area where Pascal has improved, with its fine-grained preemption.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
TBH, the actual problem with dual-core CPUs "stuttering" is partially due to the Windows scheduler. If background processes are deferred from being scheduled for like 3 seconds, the NT scheduler then gives them a "boost" over other processes, even if those other processes have higher priority. This is designed to prevent CPU resource starvation for low-priority background processes. The problem is, games are real-time apps, and having their CPU cycles taken away every three seconds, for a series of scheduler quanta, is what causes "stutters" in the game every few seconds.

Whereas most quad-core CPUs, and games written to use quad cores, do NOT use 100% of each core. There is a primary thread, and then several secondary threads, which each use up a portion of a core but not the whole core. That allows background processes to be scheduled on the same cores as the secondary threads, which prevents scheduler starvation for background processes.

Microsoft could alleviate this issue by implementing a game-specific scheduler that kicks in when you are actively running a game.
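A toy simulation makes the stutter mechanism concrete. This is a sketch under stated assumptions, not the real NT scheduler: one core is shared by a game thread and a low-priority background thread, and after roughly 3 seconds of starvation the background thread gets a boost for a few quanta, which shows up as a frame-time spike. The quantum length, boost interval, and boost size are all hypothetical round numbers.

```python
# Toy model of anti-starvation boosting (NOT the real NT scheduler).
# Assumptions: 16 ms quantum, one game frame needs 16 ms of CPU, and a starved
# background thread is boosted for 4 quanta after ~3 s without CPU time.

QUANTUM_MS = 16
FRAME_COST_MS = 16
BOOST_INTERVAL_MS = 3000
BOOST_QUANTA = 4

def simulate(n_frames):
    """Return per-frame wall-clock times (ms) for the game thread."""
    frame_times = []
    wall = 0
    since_boost = 0     # ms the background thread has gone without CPU
    boost_left = 0      # quanta the boosted background thread still gets
    for _ in range(n_frames):
        start = wall
        need = FRAME_COST_MS
        while need > 0:
            if boost_left > 0:
                boost_left -= 1          # background thread runs; game stalls
            else:
                need -= QUANTUM_MS       # game thread runs
                since_boost += QUANTUM_MS
                if since_boost >= BOOST_INTERVAL_MS:
                    boost_left = BOOST_QUANTA   # anti-starvation boost kicks in
                    since_boost = 0
            wall += QUANTUM_MS
        frame_times.append(wall - start)
    return frame_times

times = simulate(400)
# Flat 16 ms frame times, with an 80 ms spike roughly every 188 frames.
```

On a quad core the spike disappears, because the background thread can be scheduled on a core the game isn't saturating instead of preempting the game thread.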
 
Aug 11, 2008
10,451
642
126
I'm only the messenger ():)

Per DICE's recommendation, the game requires at least four logical cores to run..

People should respect the developer's decision, and know what they're getting into when it comes to PC gaming. To be more specific, it is not guaranteed that a game or any other application will work if you can't meet its basic requirements.

Well then nobody should complain about gameworks either, because the devs have to be respected, right? They never make a bad choice, or are lazy, or are influenced by outside factors.
 
Feb 19, 2009
10,457
10
76
I seriously can't believe you guys are complaining about AAA gaming in mid-2016 finally requiring a quad core CPU ...

We complain about the lack of innovation, and when gamedevs raise the ceiling, we complain about that. lol
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
IIRC, the 390X beat the 980 at 1080p, while Fury X comes close to the 980Ti at that resolution, which doesn't happen often.

Yep, and that was using DX12 as well. But when reviewers use reference clocked GPUs (NVidia was very conservative with Maxwell's clock speed), that gives NVidia a disadvantage because aftermarket models can be much faster.

I understand the use for reference clocked GPUs, to have a strict baseline for performance. But it's unrealistic in the sense that the vast majority of NVidia owners will have aftermarket parts with significantly higher clock speeds.

Case in point, my GTX 980 Ti has a boost clock that can reach as high as 1443MHz without overclocking, and typically averages around 1430. If you compare that to the reference model, you are looking at a massive performance increase..

What I take from that is if UE4 uses more compute, GCN actually starts to excel and overcome any default deficit it has due to NV's sponsorship & GameWorks integration in that engine which causes AMD to perform 25-50% worse in those titles (look up Ark, Hatred, other UE4 titles).

That's a vast oversimplification. I don't think GameWorks has anything to do with the performance, or lack thereof, of AMD parts. The main reason for poor performance in these UE4 titles on AMD hardware is that they are mostly made by indie developers.

Indie developers don't have the resources that major developers have, and so spend whatever time they have for optimization mainly on NVidia hardware as NVidia hardware represents the large majority of the discrete GPU market.

With professional developers, the gap will decrease tremendously. Though that's not to say that I don't believe UE4 favors NVidia hardware. It does, but that's not unusual.

Some engines favor certain hardware over others, and it's always been like that. Frostbite 3 favors AMD hardware as well, but it's not overt favoritism because FB3 is a well tuned engine.

Were you aware of how awful UE4 games normally run on GCN?

Like I said, it's because they are mostly indie titles. The UE4 engine is marvelous, and it runs well on everything if it's properly tuned by competent developers..

ps. When I say Kepler/Maxwell is bad at compute in games, I don't mean raw compute flops. Their architecture has a much higher latency for switching between graphics and compute workloads, so games that use more compute will cause more delays for graphics rendering. This is the one area where Pascal has improved, with its fine-grained preemption.

I guess we'll have to see when the game finally ships.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Microsoft could alleviate this issue by implementing a game-specific scheduler that kicks in when you are actively running a game.

I agree that background processes are partly to blame for stuttering on dual core rigs. But there is a solution for this which doesn't just alleviate the problem, but outright cures it.

M0aR CPU Powah! :D

Get a quad core with HT and the problem is solved. Or get a hex-core with HT and you'll never have to worry about CPU power, or lack thereof, for a really long time.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Well then nobody should complain about gameworks either, because the devs have to be respected, right? They never make a bad choice, or are lazy, or are influenced by outside factors.

You can't use the GameWorks argument for this, because Mirror's Edge Catalyst isn't a GameWorks title..

Also, DICE is one of the most competent developers around. If they say the game requires four logical threads to run, then I'll take them at their word..

DICE also has a habit of pushing boundaries. They were the first major developer to completely axe DX9 from a major title, Battlefield 4, and make it DX11 only.

Progress is good! :thumbsup:
 

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
I seriously can't believe you guys are complaining about AAA gaming in mid-2016 finally requiring a quad core CPU ...

We complain about the lack of innovation, and when gamedevs raise the ceiling, we complain about that. lol
I agree with VirtualLarry. He's not saying that the game needs to provide a great experience on your Pentium G3258; he's saying that there's no reason a Pentium G3258 owner, or the owner of a later-gen Pentium, shouldn't be able to play the game. Unless there's a reason that the game literally cannot run on a dual core CPU (extremely unlikely), there's no reason to artificially limit the choices. All you need is a disclaimer that pops up if you try playing the game on a two-thread system, warning you that performance might suck.

I used to enjoy gaming at 9 FPS on a $300 laptop, and so can people today. ;)
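The "warn, don't block" idea above is easy to sketch. This is a hypothetical launcher check, not anything DICE actually ships; it just reads the logical core count with Python's `os.cpu_count()` and returns a warning string instead of refusing to start.

```python
import os

MIN_LOGICAL_CORES = 4  # the game's stated minimum

def launch_check(logical_cores=None):
    """Warn (don't block) when the machine is under the stated core minimum."""
    cores = os.cpu_count() if logical_cores is None else logical_cores
    if cores is not None and cores < MIN_LOGICAL_CORES:
        return (f"Warning: this game lists {MIN_LOGICAL_CORES} logical cores as "
                f"its minimum; this system has {cores}. Performance may suffer.")
    return None  # meets the stated requirement, no warning shown

# launch_check(2) returns a warning string; launch_check(8) returns None.
```

The game still launches either way; the under-spec user just gets told what they're in for.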
 
Aug 11, 2008
10,451
642
126
I agree with VirtualLarry. He's not saying that the game needs to provide a great experience on your Pentium G3258; he's saying that there's no reason a Pentium G3258 owner, or the owner of a later-gen Pentium, shouldn't be able to play the game. Unless there's a reason that the game literally cannot run on a dual core CPU (extremely unlikely), there's no reason to artificially limit the choices. All you need is a disclaimer that pops up if you try playing the game on a two-thread system, warning you that performance might suck.

I used to enjoy gaming at 9 FPS on a $300 laptop, and so can people today. ;)

Well said, and I totally agree.
 
May 11, 2008
22,565
1,471
126
Another developer too lazy to remove the console 0+1 dedication.

I expect a trashy PC port.

I think this is a very good idea, because it pushes the limits of the hardware. Or do you still want C64-quality games? Software ideas that tax the hardware are what drive hardware innovation.

Game developers these days are obviously taking state-of-the-art concepts from the technology world. A few years ago, multithreading in gaming was quite a challenge because of the seemingly serial nature of game engines. But as time progresses, smart people come up with better tools and ideas to tap into all the available processing power.

I have no issue with upgrading to a future 8-thread/4-core or 8-core CPU if that means I can do all kinds of fun things with it.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I agree with VirtualLarry. He's not saying that the game needs to provide a great experience on your Pentium G3258; he's saying that there's no reason a Pentium G3258 owner, or the owner of a later-gen Pentium, shouldn't be able to play the game. Unless there's a reason that the game literally cannot run on a dual core CPU (extremely unlikely), there's no reason to artificially limit the choices. All you need is a disclaimer that pops up if you try playing the game on a two-thread system, warning you that performance might suck.

There is already a warning in the system requirements that the game "requires" four logical cores. That's pretty explicit language, and DICE isn't a mediocre developer, so I don't even know why this is such a huge deal..

This will be the most technically advanced game released on Frostbite 3 yet, so if the developer puts that kind of caveat in the system specs, then you know something must be up.
 

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
I completely understand why targeting a decent/playable experience on dual cores is no longer necessary or even desirable for developers. What I don't understand is why people are actively advocating for locking dual core owners out of the game entirely, in the vein of Far Cry 4 and CoD, when as far as we know there's no reason the game can't run on a dual core system.

Someone owning a dual core system might still find enjoyment in the game even if it could be considered unplayable and their system is way under spec. People playing modern games on dual cores certainly understand compromise at this point.
 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
I completely understand why targeting a decent/playable experience on dual cores is no longer necessary or even desirable for developers. What I don't understand is why people are actively advocating for locking dual core owners out of the game entirely, in the vein of Far Cry 4 and CoD, when as far as we know there's no reason the game can't run on a dual core system.

Someone owning a dual core system might still find enjoyment in the game even if it could be considered unplayable and their system is way under spec. People playing modern games on dual cores certainly understand compromise at this point.

I certainly do. Running modern games like AC Syndicate and Ryse, among others, on my Pentium is not such a great experience, but that is mainly due to the low-end GPU rather than the CPU itself. If I could get myself a GTX 970 I would be in seventh heaven, so the CPU is still good enough.
Now, if someone tried playing the latest games on an AMD Thunderbird 1GHz, maybe then there could be some problems, but locking out Pentium owners while quad core Kabini and Atom owners can still play the game if they want to is a bit not OK.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
What I don't understand is why people are actively advocating for locking dual core owners out of the game entirely in the vein of Far Cry 4 and CoD, when as far as we know there's no reason that the game can't run on a dual core system.

Likely because they can't understand why anyone would consciously play a game on low end hardware that is practically guaranteed to deliver a worse gaming experience than the consoles.

I mean, who uses a PC with low end hardware for playing AAA games if it can't even deliver the same standard of experience as the consoles?

That's not what PC gaming is about...

Someone owning a dual core system might still find enjoyment from the game even if it could be considered unplayable and their system is way under spec. People playing with modern games with dual cores certainly understand compromise at this point.

The other reason is that the developer specifically noted that four logical cores were necessary. I understand why some might be cynical, given that several other games like Far Cry 4, DAI, etcetera listed quad core CPUs as their minimum recommendation, but could certainly be played with dual core CPUs.

But those games never used the explicit language concerning logical cores that Mirror's Edge Catalyst has in its minimum recommendations.
 

airfathaaaaa

Senior member
Feb 12, 2016
692
12
81
Yep, and that was using DX12 as well. But when reviewers use reference clocked GPUs (NVidia was very conservative with Maxwell's clock speed), that gives NVidia a disadvantage because aftermarket models can be much faster.

I understand the use for reference clocked GPUs, to have a strict baseline for performance. But it's unrealistic in the sense that the vast majority of NVidia owners will have aftermarket parts with significantly higher clock speeds.

Case in point, my GTX 980 Ti has a boost clock that can reach as high as 1443MHz without overclocking, and typically averages around 1430. If you compare that to the reference model, you are looking at a massive performance increase..



That's a vast oversimplification. I don't think Gameworks has anything to do with the performance, or lack thereof for AMD parts. The main reason for poor performance in these UE4 titles on AMD hardware, is because they are all Indie developers.

Indie developers don't have the resources that major developers have, and so spend whatever time they have for optimization mainly on NVidia hardware as NVidia hardware represents the large majority of the discrete GPU market.

With professional developers, the gap will decrease tremendously. Though that's not to say that I don't believe UE4 favors NVidia hardware. It does, but that's not unusual.

Some engines favor certain hardware over others, and it's always been like that. Frostbite 3 favors AMD hardware as well, but it's not overt favoritism because FB3 is a well tuned engine.



Like I said, it's because they are mostly indie titles. UE4 engine is marvelous, and it runs well on everything if it's properly tuned by competent developers..



I guess we'll have to see when the game finally ships.

Well, you can always check benchmarks of games with GameWorks features enabled versus disabled.
In 90% of cases, when you disable the GameWorks effects AMD gets a nice boost.
Now, if that were the case for one or two games it would have been fine, but when it's the norm we can't really blame the devs.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
I seriously can't believe you guys are complaining about AAA gaming in mid-2016 finally requiring a quad core CPU ...

We complain about the lack of innovation, and when gamedevs raise the ceiling, we complain about that. lol

You must have missed the entire thread if that's what you believe.

Now, what happens if 2 cores end up unused on your quad? :)
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
I think this is a very good idea, because it pushes the limits of the hardware. Or do you still want C64-quality games? Software ideas that tax the hardware are what drive hardware innovation.

Game developers these days are obviously taking state-of-the-art concepts from the technology world. A few years ago, multithreading in gaming was quite a challenge because of the seemingly serial nature of game engines. But as time progresses, smart people come up with better tools and ideas to tap into all the available processing power.

I have no issue with upgrading to a future 8-thread/4-core or 8-core CPU if that means I can do all kinds of fun things with it.

It doesn't push anything if cores 0+1 are unused, for example.

Cores 0+1 are locked for the OS on the consoles, so games start off at cores 2-7.
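The reservation described above can be written as simple affinity-mask arithmetic. A sketch, assuming the 8-core console layout from the post with cores 0-1 reserved for the OS; on a real system the resulting mask is the kind of value you would hand to `os.sched_setaffinity` on Linux or `SetProcessAffinityMask` on Windows.

```python
# Build the bitmask of cores a game may use when cores 0-1 are OS-reserved.
TOTAL_CORES = 8
OS_RESERVED = {0, 1}

def game_affinity_mask():
    """Set a bit for every core the game may run on (cores 2-7)."""
    mask = 0
    for core in range(TOTAL_CORES):
        if core not in OS_RESERVED:
            mask |= 1 << core
    return mask

# game_affinity_mask() == 0b11111100: six usable cores, cores 0-1 masked out.
```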