Digital Foundry: all surveyed AAA developers recommend AMD CPUs for gaming PCs

tulx

Senior member
Jul 12, 2011
257
2
71
Quoting this highly intriguing article by Digital Foundry:

"We approached a number of developers on and off the record - each of whom has helped to ship multi-million-selling, triple-A titles - asking them whether an Intel or AMD processor offers the best way to future-proof a games PC built in the here and now. Bearing in mind the historical dominance Intel has enjoyed, the results are intriguing - all of them opted for the FX-8350 over the current default enthusiast's choice, the Core i5 3570K. " [emphesis mine]

Seems like AMD's win in the console space will significantly improve their position in the PC market as well. Having most if not all AAA developers preferring your hardware is a very real advantage in such a competitive space. It was about time Intel was forced to step up its game. Interesting times ahead.
 

grimpr

Golden Member
Aug 21, 2007
1,095
7
81
Yeap, it was gonna happen sooner or later, i7s & FX will dominate the gaming desktop.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
I guess the one developer they actually have a quote from counts as "a number". And he talks about future-proofing, lol!
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Quoting this highly intriguing article by Digital Foundry:

"We approached a number of developers on and off the record - each of whom has helped to ship multi-million-selling, triple-A titles - asking them whether an Intel or AMD processor offers the best way to future-proof a games PC built in the here and now. Bearing in mind the historical dominance Intel has enjoyed, the results are intriguing - all of them opted for the FX-8350 over the current default enthusiast's choice, the Core i5 3570K. " [emphesis mine]

Seems like AMD's win in the console space will significantly improve their position in the PC market as well. Having most if not all AAA developers preferring your hardware is a very real advantage in such a competitive space. It was about time Intel was forced to step up its game. Interesting times ahead.

That's one of those outcomes that just defies reason.

From the developer's perspective in the product chain, whether their games are running on an FX-8350 or an i5 3570K is irrelevant. It's all x86, same ISA, etc.

May as well ask them whether they prefer their games to run on computers with black cases or beige cases.

Likewise, it's not actually a good thing for AMD to have game developers basically say that 99% of the rest of AMD's product lineup (everything that falls below the performance of a 3570K, including the 8320, 6300, etc.) doesn't make the grade either.

And on what basis is the argument being made?

"I'd go for the FX-8350, for two reasons. Firstly, it's the same hardware vendor as PS4 and there are always some compatibility issues that devs will have to work around (particularly in SIMD coding), potentially leading to an inferior implementation on other systems - not very likely a big problem in practice though," he says.

"Secondly, not every game engine is job-queue based, even though the Avalanche Engine is, some games are designed around an assumption of available hardware threads. The FX-8350 will clearly be much more powerful [than PS4] in raw processing power considering the superior clock speed, but in terms of architecture it can be a benefit to have the same number of cores so that an identical frame layout can be guaranteed."

Ah, I see. It is a two-pronged argument.

The first is already deemed largely irrelevant by the very person claiming it matters, and the second is an embarrassing admission that the individual doesn't realize the 8350 is a CMT microarchitecture, which doesn't give you eight cores' worth of unshared resources, unlike the CMP-based microarchitecture he is attempting to draw comparisons to.
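
Back-of-the-envelope, taking AMD's own roughly-80%-per-module CMT scaling claim at face value (purely illustrative numbers, not measurements):

// Rough CMT arithmetic -- AMD's "~80% of two full cores per module" scaling
// claim is taken at face value here; these are assumptions, not measurements.
#include <cstdio>

int main() {
    const int modules = 4;            // FX-8350: 4 Piledriver modules
    const double cmt_scaling = 1.8;   // ~1.8 "full cores" of integer throughput per module
    std::printf("integer core-equivalents: %.1f (marketed as 8 cores)\n",
                modules * cmt_scaling);
    std::printf("shared front-ends / FP-SIMD units: %d (one per module)\n", modules);
    // An 8-core CMP design like Jaguar gives each core its own front end and
    // FPU, which is exactly the difference being glossed over in that quote.
    return 0;
}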

Sad to see such obvious errors in understanding going to print. He'll likely be ribbed about this by his friends once they read what he is quoted as having claimed.

Meanwhile the AMD execs are groaning to themselves "thanks buddy for basically telling our would-be customers of sub-8350 products that they are obsolete before they hit the checkout stand :\"
 

Ayah

Platinum Member
Jan 1, 2006
2,512
1
81
Article sounds like a joke. I'd be all for developers parallelizing game workloads to >8 full cores though.
 

inf64

Diamond Member
Mar 11, 2011
3,884
4,692
136
I wonder what will happen when future AMD cores (SR and EX) further bridge the "IPC" gap with their Intel counterparts? We have good indications (from AMD tech docs) that AMD will launch a mainstream 6T SR-based APU with 512 SPs supporting the DDR4/GDDR5 memory standards. This core, if AMD's claims come true, should easily be on par with the FX-83xx in MT workloads and outclass it in ST ones. This is a mainstream APU, mind you. The "FX" counterpart, if AMD ever opts for such a model, should offer >8T, more IPC and probably a better thermal spec. We know what Intel will have in store too (not too exciting from what we could see so far). What happens when you have a 3570K/3770K-class AMD chip (SR 6T/512SP APU) that costs a lot less than any Haswell part, offers more or less similar CPU performance, and outclasses it in the GPU part? And is unlocked and can OC decently? Will we then have a "wait for Broadwell, it will destroy AMD" answer?
 

podspi

Golden Member
Jan 11, 2011
1,982
102
106
The first is already deemed largely irrelevant by the very person claiming it matters, and the second is an embarrassing admission that the individual doesn't realize the 8350 is a CMT microarchitecture, which doesn't give you eight cores' worth of unshared resources, unlike the CMP-based microarchitecture he is attempting to draw comparisons to.

Sad to see such obvious errors in understanding going to print. He'll likely be ribbed about this by his friends once they read what he is quoted as having claimed.


He states his reason is that the FX-8 series has eight hardware threads. While there are material differences in how Jaguar, Piledriver, and Ivy Bridge get you those 8 hardware threads, your post implies to me that you think each Jaguar thread will have a higher level of performance than each FX-8 thread, which I find unlikely despite the CMP/CMT difference.

I too believe that article is suspect, though I expect games are going to become more multithreaded in the future. We're going to want more threads, at the very least.

Meanwhile the AMD execs are groaning to themselves "thanks buddy for basically telling our would-be customers of sub-8350 products that they are obsolete before they hit the checkout stand :\"

Aren't they? :colbert: Seriously though, I have a Phenom II X6, and I would not consider upgrading to anything other than an FX-8350 on the AMD side. I am cautiously optimistic for Jaguar, though perhaps not as optimistic as you, it seems :D (or maybe I misunderstand what you were trying to say?)
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
I believe this 100%. The PS4 and 720 are both going to be x86-based, and we know the PS4 is using an eight-core CPU. If game developers plan to go fully multithreaded with the new console games going forward, more cores will indeed be better for gaming. So the FX-8350 will outperform a 3570K in new multithreaded games. This is no surprise, as there are already games out there coded to use many cores and threads where the 8350 beats out the 3570K.

All that said, an i7 will perform better, but the i5 will lag behind. If this comes to fruition, maybe it will force Intel's hand to offer HT-enabled CPUs at a better price point or more than four cores in their mainstream desktop CPUs.


[image: proz.jpg]


[image: far cry proz.png]
 

GreenChile

Member
Sep 4, 2007
190
0
0
I wonder what will happen when future AMD cores (SR and EX) further bridge the "IPC" gap with their Intel counterparts? We have good indications (from AMD tech docs) that AMD will launch a mainstream 6T SR-based APU with 512 SPs supporting the DDR4/GDDR5 memory standards. This core, if AMD's claims come true, should easily be on par with the FX-83xx in MT workloads and outclass it in ST ones. This is a mainstream APU, mind you. The "FX" counterpart, if AMD ever opts for such a model, should offer >8T, more IPC and probably a better thermal spec. We know what Intel will have in store too (not too exciting from what we could see so far). What happens when you have a 3570K/3770K-class AMD chip (SR 6T/512SP APU) that costs a lot less than any Haswell part, offers more or less similar CPU performance, and outclasses it in the GPU part? And is unlocked and can OC decently? Will we then have a "wait for Broadwell, it will destroy AMD" answer?
And what if SR and EX sprout butterfly wings and shoot rainbows out their owner's butts?

Do you not think Intel could respond very quickly to any new threat? I really do hope AMD puts out something that competes. Competition drives everyone to make better stuff at better prices.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Even discounting the points IDC makes, I'm not sure I buy the main argument either. Back when the 360 was launched a very similar argument was thrown out - that it would take a 6 core x86 processor to best handle 360 ports (1 core for each thread) - and that has generally proven false. Thuban and SNB-E are typically only marginally faster than their 4 core counterparts in most games when all other things are held equal.

On that note, do we even know if game developers get all 8 cores? Sony locked out one of the SPEs on Cell just for the OS; I would fully expect they're going to lock out at least 1 core in such a manner, with the possibility that it's more (1 for the OS, 1 for background processes, etc).

I wonder how much that article cost. ;)
I can't tell whether you're being serious or not. But Digital Foundry is one of the last good hardware news/analysis groups out there. I'd put them right up there with AT.
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
All that said, an i7 will perform better, but the i5 will lag behind. If this comes to fruition, maybe it will force Intel's hand to offer HT-enabled CPUs at a better price point or more than four cores in their mainstream desktop CPUs.

Crysis 3 is only one game, and not all the tests show the same thing,

[image: Crysis3-CPU.png]


[image: f2-3.jpg]


the problem here is,
let's say an i5 3570K at 4.5GHz vs an 8350 at 4.5GHz: ST performance on the i5 is much higher, so in the end it shouldn't be too different even when taking advantage of all 8 cores... and that's assuming you can even use all 8 cores properly for something...
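
Here's a rough Amdahl-style sketch of what I mean (the per-core numbers and serial fractions below are just assumptions for illustration, not benchmarks):

// Hypothetical 4-fast-cores vs 8-slower-cores comparison under Amdahl's law.
// All numbers are assumptions for illustration, not benchmark results.
#include <cstdio>
#include <initializer_list>

// Relative frame throughput when `serial` of the work can't be spread across cores.
double frame_throughput(double per_core_perf, int cores, double serial) {
    double time = serial / per_core_perf +
                  (1.0 - serial) / (per_core_perf * cores);
    return 1.0 / time;
}

int main() {
    const double i5_core = 1.5;  // assumed ~1.5x single-thread performance per core
    const double fx_core = 1.0;
    for (double serial : {0.1, 0.3, 0.5}) {
        std::printf("serial=%2.0f%%  i5 (4c): %.2f   FX-8350 (8c): %.2f\n",
                    serial * 100.0,
                    frame_throughput(i5_core, 4, serial),
                    frame_throughput(fx_core, 8, serial));
    }
    return 0;
}

Only when nearly the whole frame scales across cores does the 8-core side pull level; as soon as there's a meaningful serial chunk, the faster cores win.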
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
1024x768 benches, I thought he had finally gotten past that nonsense.

Anyway, I thought that was a really interesting article; it raises a lot of good points. I say the more cores games are coded to take advantage of, the better. 8 cores on the PS4/Xbox Next will most certainly translate perfectly to the 8-core AMD desktop CPUs; the synergy is very obvious.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
Crysis 3 is only one game, and not all the tests show the same thing,

[image: Crysis3-CPU.png]


[image: f2-3.jpg]


the problem here is,
let's say an i5 3570K at 4.5GHz vs an 8350 at 4.5GHz: ST performance on the i5 is much higher, so in the end it shouldn't be too different even when taking advantage of all 8 cores... and that's assuming you can even use all 8 cores properly for something...

I don't know that that is the case though. If I were to take an application written to take full advantage of available cores/threads, clock my CPU at 3.2GHz, and compare it to a 4.6GHz 3570 in that application, my CPU is still going to plow over that i5.

The new consoles are going x86-based, which is changing the game for console developers this time round. The PS4 is going for an eight-core chip. Right now most games, particularly older titles, are only making use of two cores/threads. Is the plan for the x86-based PS4 to have an eight-core CPU and just make use of two of those cores, leaving the rest to attend to background processes or stand idle? I tend to believe not; they wanted that hardware and those cores so games could leverage them for performance.

Right now the bulk of games on PC are ported from consoles, consoles that currently are not x86-based and are using a PowerPC or Sony's Cell CPU. With the new consoles being x86-based and having many cores, I can see developers leveraging multithreading to make the most of the hardware. Console game developers focus heavily on eking everything they can out of the console hardware. Now with consoles and PCs both being x86-based, I expect game developers to continue to do so and make use of multithreading on consoles, translating to better performance on PCs with more cores/threads when those games are ported over.

Right now on PC the games that make good use of more cores/threads are few and far between, but we're starting to see some of the latest games taking better advantage of them. I'm hoping the upcoming next-gen Xbox 720 / PS4 games will benefit a great deal more from the available threads when they are ported over to the PC.

With the next-gen consoles coming x86 based and the CPUs they are packing, as a gamer I think it's time to step up to an i7 if you are buying Intel or to take a look at whatever AMD has that offers moar coars :D
 

Spjut

Senior member
Apr 9, 2011
932
162
106
Games began being more multithreaded on the PC as well back when the 360 was released.
And as has been pointed out, we already have some games that benefit from more than four cores (Crysis 3, BF3)
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
I don't know that that is the case though. If I were to take an application written to take full advantage of available cores/threads, clock my CPU at 3.2GHz, and compare it to a 4.6GHz 3570 in that application, my CPU is still going to plow over that i5.

The new consoles are going x86-based, which is changing the game for console developers this time round. The PS4 is going for an eight-core chip. Right now most games, particularly older titles, are only making use of two cores/threads. Is the plan for the x86-based PS4 to have an eight-core CPU and just make use of two of those cores, leaving the rest to attend to background processes or stand idle? I tend to believe not; they wanted that hardware and those cores so games could leverage them for performance.

Right now the bulk of games on PC are ported from consoles, consoles that currently are not x86-based and are using a PowerPC or Sony's Cell CPU. With the new consoles being x86-based and having many cores, I can see developers leveraging multithreading to make the most of the hardware. Console game developers focus heavily on eking everything they can out of the console hardware. Now with consoles and PCs both being x86-based, I expect game developers to continue to do so and make use of multithreading on consoles, translating to better performance on PCs with more cores/threads when those games are ported over.

Right now on PC the games that make good use of more cores/threads are few and far between, but we're starting to see some of the latest games taking better advantage of them. I'm hoping the upcoming next-gen Xbox 720 / PS4 games will benefit a great deal more from the available threads when they are ported over to the PC.

With the next-gen consoles coming x86 based and the CPUs they are packing, as a gamer I think it's time to step up to an i7 if you are buying Intel or to take a look at whatever AMD has that offers moar coars :D

The main argument against more multithreaded PC ports is that Jaguar will probably have only 1/3 to 1/4 the per-core performance of Ivy Bridge at 3.5 GHz (its clock speed is less than half, and Jaguar isn't close to Ivy Bridge clock for clock). So developers can probably fold three console threads into one PC thread.
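
Rough arithmetic behind that (the clock and relative-IPC numbers are assumptions for illustration, not measured figures):

// Rough thread-mapping arithmetic; the clock and IPC figures are assumptions
// for illustration, not measured numbers.
#include <cstdio>

int main() {
    const double jaguar_core = 1.6 * 0.6;  // ~1.6 GHz x assumed ~0.6 relative IPC
    const double ivb_core    = 3.5 * 1.0;  // 3.5 GHz Ivy Bridge as the baseline
    std::printf("one Ivy Bridge core ~= %.1f Jaguar cores\n", ivb_core / jaguar_core);
    // => roughly 3-4 console threads' worth of work fits in one desktop core,
    // so an 8-thread console workload maps comfortably onto a fast quad.
    return 0;
}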

one is at 1080p,

the other was using 1024x768 for obvious reasons

[image: jLSSykv.png]

Yeah, but no one is going to be playing at those settings. 30 fps isn't going to cut it.

1024x768 benches, I thought he had finally gotten past that nonsense.

Anyway, I thought that was a really interesting article; it raises a lot of good points. I say the more cores games are coded to take advantage of, the better. 8 cores on the PS4/Xbox Next will most certainly translate perfectly to the 8-core AMD desktop CPUs; the synergy is very obvious.

This benchmark is actually more useful than the 1080p one because it shows the maximum extractable performance. 30 fps at any resolution is not good (how many would PLAY the game at those settings at 1080p?); 45 is doable.

There will be a setting and resolution between 1080p and 1024x768 which is the sweet spot at 40 fps. Will the 8350 hit that? No. The i5/i7 is capable of hitting it, and that is the setting gamers will be playing at.

1024x768 benches are reliable ONLY IF the fps is CPU-bottlenecked and really low.
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
And as has been pointed out, we already have some games that benefit from more than four cores (Crysis 3, BF3)

Sure, but a single Intel core is faster than a single AMD core.

[image: LQrnQ9x.jpg]




I don't know that that is the case though. If I were to take an application written to take full advantage of available cores/threads, clock my CPU at 3.2GHz, and compare it to a 4.6GHz 3570 in that application, my CPU is still going to plow over that i5.

Trouble is, will games really use 8 cores to the max? A game engine is not like 3D software rendering or video encoding, and again, each Ivy Bridge core at the same clock is MUCH faster than each Piledriver core, so it's not as simple as "4 vs 8".

Also, the PS4 uses a low-performance 8-core CPU; in reality it should perform more like a Core i3 than an i5, I think... and if it's x86 it should be easier to port the games to the PC, making them use resources more efficiently than what we get now with the more... "exotic" 3C/6T PowerPC and Cell!?
 

parvadomus

Senior member
Dec 11, 2012
685
14
81
Yes, they can use all cores to the max. I can't think of any part of a game that cannot be parallelized. I only see lazy developers. AI, physics, particles, animations, etc. can all run faster with more cores.
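
As a toy sketch of what a job-style frame update looks like (the subsystem functions are empty placeholders, and a real engine has dependencies between these stages):

// Toy job-style frame update: each subsystem goes into the job system and the
// runtime spreads the work over the available hardware threads.
#include <cstdio>
#include <future>
#include <vector>

void update_ai()        { /* pathfinding, decision making */ }
void update_physics()   { /* rigid-body integration */ }
void update_particles() { /* particle simulation */ }
void update_animation() { /* skeletal animation blending */ }

int main() {
    std::vector<std::future<void>> jobs;
    jobs.push_back(std::async(std::launch::async, update_ai));
    jobs.push_back(std::async(std::launch::async, update_physics));
    jobs.push_back(std::async(std::launch::async, update_particles));
    jobs.push_back(std::async(std::launch::async, update_animation));
    for (auto& j : jobs) j.wait();  // frame sync point
    std::printf("frame done\n");
    return 0;
}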
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
833
136
I wonder what will happen when future AMD cores (SR and EX) further bridge the "ipc" gap with intel counterparts? We have good indications (from AMD tech docs) that AMD will launch mainstream 6T SR based APU with 512SP supporting DDR4/GDDR5 memory standard. This core, if AMD's claims come true, should easily be on par with FX83xx in MT workloads and outclass it in ST ones. This is mainstream APU mind you. The "FX" counterpart, if AMD ever opts for such a model, should offer >8T, more IPC and probably better thermal spec. We know what intel will have in store too( not too exciting from what we could see so far). What happens when you have 3570/3700K class AMD chip (SR 6T/512SP APU) that costs a lot less than any Haswell part and offers more or less similar CPU performance and outclasses it in GPU part? Also is unlocked and can OC decently? Will then we have a "wait for broadwell, it will destroy AMD" answer?

So when will this awe-inspiring 6T SR-based APU be available to Joe Public?
 

Soulkeeper

Diamond Member
Nov 23, 2001
6,736
156
106
I really hope when DDR4 hits that AMD has all their ducks in a row.
A nice new chipset, a new CPU, and a unified socket for APU and non-GPU parts.

What I'd like to see is the APU being the low/mid option and a non-GPU option using the extra die space for another module or two, all working on the same socket/boards.
 
Aug 11, 2008
10,451
642
126
I wonder what will happen when future AMD cores (SR and EX) further bridge the "IPC" gap with their Intel counterparts? We have good indications (from AMD tech docs) that AMD will launch a mainstream 6T SR-based APU with 512 SPs supporting the DDR4/GDDR5 memory standards. This core, if AMD's claims come true, should easily be on par with the FX-83xx in MT workloads and outclass it in ST ones. This is a mainstream APU, mind you. The "FX" counterpart, if AMD ever opts for such a model, should offer >8T, more IPC and probably a better thermal spec. We know what Intel will have in store too (not too exciting from what we could see so far). What happens when you have a 3570K/3770K-class AMD chip (SR 6T/512SP APU) that costs a lot less than any Haswell part, offers more or less similar CPU performance, and outclasses it in the GPU part? And is unlocked and can OC decently? Will we then have a "wait for Broadwell, it will destroy AMD" answer?

The CPU part could be interesting, but even with 512 SPs you will still need a discrete card to do any sort of high-end gaming. After all, that is only HD 7750-level performance, assuming they completely overcome any bandwidth bottlenecks. And this is what, 1 or 2 years down the line at best? By then low/mid-level cards will hopefully have improved as well.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Yes, they can use all cores to the max. I can't think of any part of a game that cannot be parallelized. I only see lazy developers. AI, physics, particles, animations, etc. can all run faster with more cores.

I am writing a simple program for the purpose of simulating varying levels of microstutter. I was attempting to run physics and split my game world into a series of jobs such that it utilised multiple threads very well, hopefully optimally. I hit a number of interesting problems with a purely functional game world and multithreading. It's definitely not lazy developers at fault. Having tried to write such a thing myself, I can attest that it takes considerable additional engineering to achieve this; it's not easy. I had to give up on the goal of such scalability; there were simply some activities I could not make multithread well at all without limiting what the simulation could do. For a real game simulation and all its complexities the challenge is greater.

There is no lazy in this, multithreading is extremely difficult, and in many cases impossible.
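
A deliberately simplified example of the kind of wall you hit: the moment jobs have to touch the same piece of shared game state, you end up serializing again (all names and numbers below are made up for illustration):

// Simplified illustration: once workers must update shared state atomically,
// a single coarse lock drags the threads back toward serial execution.
#include <mutex>
#include <thread>
#include <vector>

struct World {
    std::vector<int> entities;
    std::mutex lock;
};

void resolve_interaction(World& w, int a, int b) {
    std::lock_guard<std::mutex> guard(w.lock);  // every worker queues here
    w.entities[a] += 1;
    w.entities[b] -= 1;
}

int main() {
    World w;
    w.entities.assign(1000, 0);
    std::vector<std::thread> workers;
    for (int t = 0; t < 8; ++t) {
        workers.emplace_back([&w, t] {
            for (int i = 0; i < 100000; ++i)
                resolve_interaction(w, (t * 37 + i) % 1000, (t * 91 + i) % 1000);
        });
    }
    for (auto& th : workers) th.join();
    // Getting real scaling back means partitioning the world, building job
    // dependency graphs, double-buffering state, etc. -- none of it free.
    return 0;
}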
 
Aug 11, 2008
10,451
642
126
Even discounting the points IDC makes, I'm not sure I buy the main argument either. Back when the 360 was launched a very similar argument was thrown out - that it would take a 6 core x86 processor to best handle 360 ports (1 core for each thread) - and that has generally proven false. Thuban and SNB-E are typically only marginally faster than their 4 core counterparts in most games when all other things are held equal.

On that note, do we even know if game developers get all 8 cores? Sony locked out one of the SPEs on Cell just for the OS; I would fully expect they're going to lock out at least 1 core in such a manner, with the possibility that it's more (1 for the OS, 1 for background processes, etc).

I can't tell whether you're being serious or not. But Digital Foundry is one of the last good hardware news/analysis groups out there. I'd put them right up there with AT.

I don't know about the article, but again, as in the PS4 thread, devs can say anything they want and are not necessarily unbiased, because they may be involved with the hardware or, for the PC, perhaps with the AMD Gaming Evolved program. I don't know if that is true or not, but I would just take this kind of statement with a grain of salt without any supporting benchmarks.
 
Aug 11, 2008
10,451
642
126
Crysis 3 is only one game, and not all the tests show the same thing,

[image: Crysis3-CPU.png]


[image: f2-3.jpg]


the problem here is,
let's say an i5 3570K at 4.5GHz vs an 8350 at 4.5GHz: ST performance on the i5 is much higher, so in the end it shouldn't be too different even when taking advantage of all 8 cores... and that's assuming you can even use all 8 cores properly for something...

Also, as Tom's pointed out, the minimum framerate for the 8350 is much lower than even a low-end i5 without Hyper-Threading. And these are 1080p benchmarks at high detail settings. It is too bad, though, that they did not test the 3770 to see if Hyper-Threading improved performance.