monstercameron
Diamond Member
FP is so bad on FX (BD, PD, and SR), but it's all in the name of Fus... HSA. I kinda hope AMD doubles down on their FX design methodology: more fast integer, a decent ST FPU, and delegate the rest of the FP workload to the GPU.
And for someone whose primary goal is not gaming? Why talk about gaming only? If gaming is all you do, then yes, don't pick FX. But if you need your PC for other stuff that might benefit from several threads too, FX is a very compelling option.
Setting The Stilt's comments about Vishera-K aside, the 8370E does not surprise much here. Nothing to see here, move right along . . .
Now, if the Vishera-K info he posted is legit, then that is a somewhat interesting development. Could we start seeing 5 GHz overclocks on the 8320E?
The Stilt? You mean the guy who lacks a basic understanding of Tcase? He may get more credit than he deserves.
Only that there are more well-rounded and efficient offerings at very similar prices.
So under DX an i3 is faster, and even with Mantle the i3 is as fast as the 8350 while using 100 watts less power. Doesn't seem like a very good example to justify the 8350.
I do admire that they brought 8 cores down to 95 W; it means you get to pick between the FX-6300 and the 8370E as your best 95 W options.
FP is so bad on FX (BD, PD, and SR), but it's all in the name of Fus... HSA. I kinda hope AMD doubles down on their FX design methodology: more fast integer, a decent ST FPU, and delegate the rest of the FP workload to the GPU.
The fact that both consoles, the Xbox One and Sony PlayStation 4, have 8-core AMD Jaguar CPUs makes me believe these FX 8-core chips will have a really long lifetime in terms of gaming.
Most PC ports, starting with Grand Theft Auto V, should scale and perform very competitively on the AMD FX chips for the life of these new consoles (until 2020-ish).
Eight-core Jaguar, FX, and Haswell-E all suffer from the same terminal condition: diminishing returns.
One: Amdahl's law.
Two: synchronization overhead. The more cores you have to synchronize, the bigger the overhead becomes. That 8th core will rarely go over 50% efficiency. A technology like Intel's TSX would greatly help with this issue. Unfortunately, they derped the very feature that should help pave the way to the many-core future.
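The diminishing returns from Amdahl's law are easy to sketch numerically. The 90% parallel fraction below is an assumed figure for illustration, not a measurement from any real game:

```python
# Amdahl's law: speedup S(n) = 1 / ((1 - p) + p / n), where p is the
# parallel fraction of the workload and n is the number of cores.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

p = 0.90  # assumed parallel fraction; real games are often lower
for n in range(1, 9):
    s = amdahl_speedup(p, n)
    # Marginal contribution of the n-th core, in units of one ideal core.
    marginal = s - amdahl_speedup(p, n - 1) if n > 1 else s
    print(f"{n} cores: speedup {s:.2f}x, core #{n} adds {marginal:.2f}x")
```

Even at a generous 90% parallel fraction, the 8th core adds only about 0.33x of an ideal core's throughput, consistent with the sub-50% efficiency claim above.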
The fact that both consoles, the Xbox One and Sony PlayStation 4, have 8-core AMD Jaguar CPUs makes me believe these FX 8-core chips will have a really long lifetime in terms of gaming.
Most PC ports, starting with Grand Theft Auto V, should scale and perform very competitively on the AMD FX chips for the life of these new consoles (until 2020-ish).
Consoles run a different version of the game, because of the OS and fixed-spec differences.
Dead Rising 3 was made as an Xbox One exclusive, released in 2013 (1.75 GHz 8-core Jaguar), and it runs at 30 FPS (with slowdowns).
It doesn't seem like that brings much for AMD CPUs.
Perhaps PC ports of newer Xbox One games will be better optimized for MT, but the PC version of Dead Rising 3 loves 4 threads, like most games from the past 5 years.
The best-case scenario I can see for the 8-core FX in gaming is closing the gap in some games (like BF4 or anything highly optimized for more than 4 threads). But for many years to come, I think other games will remain heavily dependent on per-core performance and limited to 1-4 threads, so it's hard to see the FX as a better option for gaming than most Intel CPUs. Let's wait and see, though; GTA 5 should be a good one.
Those are pretty unusual results for a modern game; I have to admit I was pretty shocked when I saw them. They do perfectly illustrate the problem with FX for gaming, though: it is wildly inconsistent. In some games it is very competitive, while in others it is atrociously slower, and there are no games in which FX is markedly faster than an i5 or i7. One can blame the software or speculate about what will happen in the future, but right now the performance "is what it is": equal at best and in some cases markedly slower, while negating some of the initial cost savings by using more power.

This is why I don't understand those who so vigorously defend the higher power consumption of the FX. If the performance were reversed and FX were faster, I would have no problem with the higher power use. But giving up performance while using more power, all for something like a 10 percent saving on the total cost of a system, just seems like a poor compromise, especially for a relatively expensive hobby like PC gaming.
So AMD's GPUs scale a lot better with Intel CPUs in Dead Rising 3. Not the first time I saw something like that but it's kinda ironic.
An HD 7950 or a GTX 660 Ti is NOT a cheap gaming card.
I would suspect most people running it would be doing so at lower settings anyway, at under 60 FPS.
Even then it will be a better experience than the console version.
So AMD's GPUs scale a lot better with Intel CPUs in Dead Rising 3. Not the first time I saw something like that but it's kinda ironic.
What's ironic is when AMD ships GPU review units installed in Intel systems, or when their reviewer's guides say to use Intel CPUs.
So AMD's GPUs scale a lot better with Intel CPUs in Dead Rising 3. Not the first time I saw something like that but it's kinda ironic.
Looking at this, the minimum frame rates (and the averages as well) pretty much double up until you hit the Intel CPUs, which get a ~50% boost.
Given that the game is well threaded, this might imply that AMD's drivers really are limiting performance here.
Say Dead Rising 3 has 6 threads, each doing independent tasks. The primary thread may handle the graphics-rendering work and interact directly with the DX11 API, with the graphics driver running on the same core. That could leave 2 threads busy waiting for the primary thread to finish, plus 3 additional threads that can spin up as needed for non-graphics/physics tasks.
Anyhow, since the workload is the same regardless of the GPU in play, it really gives a strong indication that, whether through threading or simply being more efficient, NVIDIA's driver is leaving a lot more time available for that primary thread than AMD's driver is.
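A minimal sketch of that threading model, with the thread roles, names, and timing all assumed for illustration (none of this comes from the game's actual code):

```python
import threading
import time

# Hypothetical six-thread layout: one render thread that pays the driver's
# CPU cost, two threads that block on it, and three independent workers.
frame_submitted = threading.Event()
completed = []
lock = threading.Lock()

def primary(driver_cost_s):
    # Render thread: issues draw calls through the graphics API, so the
    # driver's CPU time is charged here. A heavier driver delays the
    # dependent threads below.
    time.sleep(driver_cost_s)  # stand-in for API + driver overhead
    frame_submitted.set()
    with lock:
        completed.append("render")

def dependent(name):
    # Threads that wait idle until the render thread submits the frame.
    frame_submitted.wait()
    with lock:
        completed.append(name)

def independent(name):
    # Non-graphics work (AI, audio, streaming) unaffected by driver cost.
    with lock:
        completed.append(name)

threads = [threading.Thread(target=primary, args=(0.05,))]
threads += [threading.Thread(target=dependent, args=(f"wait{i}",)) for i in (1, 2)]
threads += [threading.Thread(target=independent, args=(f"task{i}",)) for i in (1, 2, 3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The two `dependent` threads spend the whole `driver_cost_s` accomplishing nothing, which is the mechanism by which a leaner driver would leave more usable CPU time for the primary thread and everything gated on it.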