Yes, because a Sempron 3850 is going to offer an excellent experience.
No idea what you're talking about here. What does a Sempron 3850 have to do with what we're talking about?
Minimum AMD CPU requirement for the game is an FX-6350.
Carfax83, you will be happy to know that the latest Frostbite iteration for Mirror's Edge has implemented dynamic global illumination.
http://www.frostbite.com/2016/04/lighting-the-city-of-glass/
No longer baked static illumination.
PBR + Dynamic GI, lots of compute. This game is going to tank on Maxwell, worse on Kepler and shine on GCN.
Then your argument of "forcing the best experience by removing dual cores" makes no sense.
Artificially limiting the game to a quad isn't solving anything. The "ignorant fools" will still buy the game and just complain that it doesn't launch.
It's a myth, though, that Maxwell is weak at compute. UE4 uses compute for its dynamic global illumination as well, and in the benchmarks we saw from Fable Legends, Maxwell actually had a large lead over its GCN counterparts in that respect.
I'm only the messenger.
Per DICE's recommendation, the game requires at least four logical cores to run.
People should respect the developer's decision and know what they're getting into when it comes to PC gaming. To be specific: a game, or any other application, is not guaranteed to work if you can't meet its basic requirements.
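To make the requirement concrete, here is a minimal sketch of how a launcher could gate on (or merely warn about) logical core count. The four-core minimum comes from the post above; the function name and messages are hypothetical, not from any real launcher.

```python
import os

# Minimum from the system requirements discussed above.
MIN_LOGICAL_CORES = 4

def check_cpu_requirement(minimum=MIN_LOGICAL_CORES):
    """Return (ok, message) describing whether the host meets the core count."""
    cores = os.cpu_count() or 1  # os.cpu_count() can return None
    if cores >= minimum:
        return True, f"OK: {cores} logical cores detected."
    return False, (f"Warning: only {cores} logical cores detected; "
                   f"this game lists {minimum} as the minimum. "
                   "Performance may be severely degraded.")

if __name__ == "__main__":
    ok, msg = check_cpu_requirement()
    print(msg)
```

Note the design choice being debated in the thread: a launcher can use the same check to print a disclaimer and continue, or to refuse to start entirely.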
IIRC, the 390X beat the 980 at 1080p, while Fury X comes close to the 980Ti at that resolution, which doesn't happen often.
What I take from that is that if UE4 uses more compute, GCN actually starts to excel and overcome the default deficit it has from NV's sponsorship and GameWorks integration in that engine, which causes AMD to perform 25-50% worse in those titles (look up Ark, Hatred, and other UE4 titles).
Were you aware of how awful UE4 games normally run on GCN?
PS: When I say Kepler/Maxwell is bad at compute in games, I don't mean raw compute FLOPS. Their architecture has much higher latency when switching between graphics and compute workloads, so games that use more compute cause more delays in graphics rendering. This is one area where Pascal has improved, with its fine-grained preemption.
Microsoft could alleviate this issue by implementing a game-specific scheduler that kicks in when you are actively running a game.
Well then nobody should complain about GameWorks either, because the devs have to be respected, right? They never make a bad choice, are never lazy, and are never influenced by outside factors.
I agree with VirtualLarry. He's not saying that the game needs to provide a great experience on your Pentium G3258; he's saying that there's no reason a Pentium G3258 owner, or the owner of a later-gen Pentium, shouldn't be able to play the game. Unless there's a reason the game literally cannot run on a dual-core CPU (extremely unlikely), there's no reason to artificially limit the choices. All you need is a disclaimer that pops up if you try playing the game on a two-thread system, warning you that performance might suck.
I seriously can't believe you guys are complaining about AAA gaming in mid-2016 finally requiring a quad-core CPU...
We complain about the lack of innovation, and when game devs raise the ceiling, we complain about that too. lol
I used to enjoy gaming at 9 FPS on a $300 laptop; people today can do the same.
Another developer too lazy to remove the console 0+1 dedication.
I expect a trashy PC port.
I completely understand why targeting a traditionally decent/playable experience on dual cores isn't necessary, or even desirable, anymore. What I don't understand is why people are actively advocating locking dual-core owners out of the game entirely, in the vein of Far Cry 4 and CoD, when as far as we know there's no reason the game can't run on a dual-core system.
Someone with a dual-core system might still enjoy the game, even if it could be considered unplayable and their system is way under spec. People playing modern games on dual cores certainly understand compromise at this point.
Yep, and that was using DX12 as well. But when reviewers use reference-clocked GPUs (NVidia was very conservative with Maxwell's clock speeds), NVidia is at a disadvantage, because aftermarket models can be much faster.
I understand the case for reference-clocked GPUs: they provide a strict baseline for performance. But it's unrealistic in the sense that the vast majority of NVidia owners have aftermarket parts with significantly higher clock speeds.
Case in point: my GTX 980 Ti has a boost clock that can reach as high as 1443MHz without overclocking, and typically averages around 1430MHz. Compared to the reference model, that is a massive performance increase.
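The size of that gap is easy to put a number on. A quick sketch, assuming a reference GTX 980 Ti boost clock of roughly 1075 MHz (my figure, not from the post; real boost behaviour also varies with temperature and power limits):

```python
# Rough clock-speed comparison for the aftermarket card described above.
reference_boost_mhz = 1075   # assumed reference GTX 980 Ti boost clock
aftermarket_boost_mhz = 1430 # typical boost reported in the post above

uplift_pct = (aftermarket_boost_mhz / reference_boost_mhz - 1) * 100
print(f"Clock uplift over reference: {uplift_pct:.0f}%")  # prints "Clock uplift over reference: 33%"
```

A roughly one-third clock advantage is why reference-clocked review numbers can understate what a typical aftermarket owner actually sees.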
That's a vast oversimplification. I don't think GameWorks has anything to do with the performance, or lack thereof, on AMD parts. The main reason for poor performance in these UE4 titles on AMD hardware is that they are all from indie developers.
Indie developers don't have the resources that major developers have, so they spend whatever optimization time they have mainly on NVidia hardware, since NVidia represents the large majority of the discrete GPU market.
With professional developers, the gap will shrink tremendously. That's not to say I don't believe UE4 favors NVidia hardware; it does, but that's not unusual.
Some engines favor certain hardware over others, and it's always been like that. Frostbite 3 favors AMD hardware as well, but it's not overt favoritism, because FB3 is a well-tuned engine.
Like I said, it's because they are mostly indie titles. The UE4 engine is marvelous, and it runs well on everything if it's properly tuned by competent developers.
I guess we'll have to see when the game finally ships.
I think this is a very good idea, because it pushes the limits of the hardware. Or do you still want C64-quality games? Software that taxes the hardware is what drives hardware innovation.
Game developers these days are clearly adopting state-of-the-art concepts from the technology world. A few years ago, multithreading in games was quite a challenge because of the seemingly serial nature of game engines. But as time progresses, smart people come up with better tools and ideas to tap into all the available processing power.
I have no issue with upgrading to a future 4-core/8-thread or 8-core CPU if that means I can do all kinds of fun things with it.
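The "better tools" mentioned above usually take the form of a task-based job system: independent per-frame work is fanned out to a worker pool and joined before the frame completes. A minimal sketch of the pattern, with illustrative job names not taken from any real engine:

```python
# Minimal job-system sketch: the pattern modern engines use to spread
# per-frame work (animation, physics, culling, ...) across CPU cores.
from concurrent.futures import ThreadPoolExecutor

def simulate_job(name, units):
    # Stand-in for real per-frame work: just sum some numbers.
    return name, sum(range(units))

# Illustrative, mutually independent jobs for one frame.
jobs = [("animation", 1000), ("physics", 2000), ("culling", 500)]

def run_frame(jobs):
    # Fan the independent jobs out to a worker pool, then join all
    # results before the frame can be handed off for rendering.
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(lambda j: simulate_job(*j), jobs))
    return dict(results)

if __name__ == "__main__":
    print(run_frame(jobs))
```

This is exactly why a four-logical-core minimum can be a reasonable design target: the more independent jobs a frame is split into, the more the engine benefits from extra hardware threads.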
