Do AMD cpus at least give a smoother desktop experience w/more cores?


Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
Bad analogy. You can always take a Corvette to a race track to see the difference legally.

What you're basically saying is I'm going to buy a 'Vette and then put mini-spare tires on it for its wheels (it is pointless to buy a CPU for 100+ fps if the monitor itself can only display 60).

While true, an FX CPU is not going to provide even 60fps in quite a few games, especially without overclocking. They're good for more games than not, but a decent portion will see minimums in the 40's or even 30's, and a tiny fraction will actually see framerates in the teens at times, where they'd be closer to 40 on an Intel CPU.

On the flip side, an i3 is going to show its limitations in games like BF4, online. Both chips have compromises that an i5 will not.

If you're satisfied with the performance of an FX CPU, you'd likely be just as satisfied with an i3 overall, though which one is more appropriate, again, depends on your specific use-case.
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
Well, I was referring to the Pentium G3258 that I just sold.

It didn't matter how much overclocking or how awesome the IPC on that chip was -- 2-thread chips are lousy gaming CPUs in 2016.

Nope, 2015/16 games just run like crap, period.
Plenty of games ran lousy on any CPU, no matter what.
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
Bad analogy. You can always take a Corvette to a race track to see the difference legally.

What you're basically saying is I'm going to buy a 'Vette and then put mini-spare tires on it for its wheels (it is pointless to buy a CPU for 100+ fps if the monitor itself can only display 60).

Yeah, because 120/144 Hz monitors are a thing of science fiction.
 
Aug 11, 2008
10,451
642
126
It's not really the case that AMD chips give better bang for your buck. They're cheaper, definitely, and they perform worse too. You get about what you pay for.

Most of the pro-AMD CPU debate seems to center around the idea that you don't need anything faster than a ~$100 CPU to have a good gaming experience - which I would agree with, as applies to myself, but a "good experience" is relative. Some people aren't happy with drops to ~30fps due to CPU limitations, while others are fine with it. I'm quite happy with my HD7850, while many would consider it to be around the minimum GPU to even play modern games. There are others still who will say it's false economy not to spend an extra $50-100 on an $800+ computer (just as an example), believing that spending extra will give it a much longer useful life - which is probably true, in most cases.

Ultimately, if you acknowledge these things, the only relevant question you'll be left with is whether or not AMD CPUs are better or worse choices than Intel CPUs of approximately the same cost, to which the answer is "it depends on what you're doing with it". For my own needs, Intel CPUs are better suited, and I tend to project my use cases onto others when recommending parts.

Yeah, I agree. There are only a few very small niches where AMD even offers better value. Maybe if you insist on gaming on the iGPU (why???) or for a very low-end gaming build with something like the Athlon X4 and a dGPU. Otherwise, even on the low end, I would pick a Celeron or Pentium (big core, not Atom) over anything AMD offers.
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
Nope, 2015/16 games just run like crap, period.
Plenty of games ran lousy on any CPU, no matter what.

Yeah right.... There isn't a single game that my Devil's Canyon doesn't run silky smooth. This is just pretentious nonsense -- I don't care if the game is running a smooth 30 fps or 200 fps, as long as it doesn't stutter -- I'm good. You are just complaining to hear the sound of your own voice. The whole "merely a console port" argument is stupid -- let me see one of these "merely a port(s)" run in Ultra detail on a PS4 / Xbone. Yeah, that ain't happening......
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
Yeah, I agree. There are only a few very small niches where AMD even offers better value. Maybe if you insist on gaming on the iGPU (why???) or for a very low-end gaming build with something like the Athlon X4 and a dGPU. Otherwise, even on the low end, I would pick a Celeron or Pentium (big core, not Atom) over anything AMD offers.

It really depends -- the FX grows more competitive as resolution increases. So if someone is gaming @ 4K, where every game is GPU bound, upgrading from an FX 8-core is pointless.... The FX runs toe-to-toe with considerably more expensive CPUs.

If someone is planning to game at low resolutions, that's when Intel's superior single-threaded power kicks in. But personally, I'll never game at a lowly 1080p ever again. I'm not even crazy about the way 1440p looks. I'm a 2160p guy.

And at 2160p -- there is absolutely no difference between my FX-8320 and my i7 4790K.... Because the GPU is holding me back at that resolution.

So if you are going to play @ 4k -- all your money should go into the video card(s). As long as you've got 4+ threads, the CPU is a moot point.

http://www.technologyx.com/featured/amd-vs-intel-our-8-core-cpu-gaming-performance-showdown/4/

As for a Celeron or Pentium, I'd never recommend it. Sure, you've got a good upgrade path -- but dual cores are obsolete. I personally recommend the i3 at the bare minimum, and suggest friends doing Intel builds stretch the budget to an i5.
 

ultimatebob

Lifer
Jul 1, 2001
25,134
2,450
126
In normal desktop use, you would be hard pressed to tell any difference between an i3, an i7, an FX4300, or an FX8350.

An SSD will make a big difference to you if you are currently using a hard drive.

You can almost always come up with a rare scenario to make your favorite CPU look good. :D

It depends on what you use your desktop for. Something like Word or Chrome should perform about the same, but something resource intensive like AutoCAD or Adobe Premiere will be slower if you're going from a shiny new Core i7 to most of AMD's current offerings. Why someone would actually DO that is beyond me, though.

You might also notice a difference if you compress big files with 7Zip. But, seriously, how often do you do that?
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
It depends on what you use your desktop for. Something like Word or Chrome should perform about the same, but something resource intensive like AutoCAD or Adobe Premiere will be slower if you're going from a shiny new Core i7 to most of AMD's current offerings. Why someone would actually DO that is beyond me, though.

You might also notice a difference if you compress big files with 7Zip. But, seriously, how often do you do that?

Not really -- GPU acceleration is far more critical to good performance in Premiere. The CPU isn't nearly as important as it was before GPU acceleration, as long as you have enough threads.

I mean seriously, look at the GPU results. A Titan cut rendering time from 527 seconds to 39 seconds. Pretty darn amazing video card. Heck, even a lowly GT 240 managed to cut rendering time to 108 seconds. I'd call Premiere a GPU-bound application, IMO.
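
A quick back-of-the-envelope check of those render times (the numbers are the ones quoted above, from the attached charts):

```python
# Rough speedup arithmetic for the render times quoted above (527 s CPU-only baseline).
baseline = 527  # seconds, software-only (CPU) render

for gpu, seconds in {"GTX Titan": 39, "GT 240": 108}.items():
    print(f"{gpu}: {seconds} s -> {baseline / seconds:.1f}x faster than CPU-only")
# GTX Titan: 39 s -> 13.5x faster than CPU-only
# GT 240:   108 s ->  4.9x faster than CPU-only
```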


[Attached images: Premiere Pro GPU-acceleration benchmark chart (premiere.png) and PPBM5 FX-6300 timeline result (PPBM5-FX-6300-Timeline.jpg)]
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
It depends on what you use your desktop for. Something like Word or Chrome should perform about the same, but something resource intensive like AutoCAD or Adobe Premiere will be slower if you're going from a shiny new Core i7 to most of AMD's current offerings. Why someone would actually DO that is beyond me, though.

You might also notice a difference if you compress big files with 7Zip. But, seriously, how often do you do that?

AutoCAD and Premiere are a far cry from normal home desktop use, though.

Anyone running those ought to be doing their research before putting a system together.
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
Yeah right.... There isn't a single game that my Devil's Canyon doesn't run silky smooth. This is just pretentious nonsense -- I don't care if the game is running a smooth 30 fps or 200 fps, as long as it doesn't stutter -- I'm good. You are just complaining to hear the sound of your own voice. The whole "merely a console port" argument is stupid -- let me see one of these "merely a port(s)" run in Ultra detail on a PS4 / Xbone. Yeah, that ain't happening......

I never said console port, but look at these games:
Watch Dogs runs way slower than it should.
Arkham Knight was a total mess.
Fallout 4 has massive drops.
Black Ops 3 needed a day-one patch just to start up on i5s.
Assassin's Creed Syndicate has massive stutter, lockups really, as soon as you enter a coach.
GTA V loses whole city blocks' worth of textures.
And so on and so on.
None of this has anything to do with the CPU the games are running on; it's just crappy programming.

let me see one of these "merely a port(s)" run in Ultra detail on a PS4 / Xbone. Yeah, that ain't happening......
What's that got to do with anything?
We (me at least) are talking about CPU performance/utilization, not about what kind of GPU you need for ultra settings.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
It really depends -- the FX grows more competitive as resolution increases. So if someone is gaming @ 4K, where every game is GPU bound, upgrading from an FX 8-core is pointless.... The FX runs toe-to-toe with considerably more expensive CPUs.

If someone is planning to game at low resolutions, that's when Intel's superior single-threaded power kicks in. But personally, I'll never game at a lowly 1080p ever again. I'm not even crazy about the way 1440p looks. I'm a 2160p guy.

And at 2160p -- there is absolutely no difference between my FX-8320 and my i7 4790K.... Because the GPU is holding me back at that resolution.

So if you are going to play @ 4k -- all your money should go into the video card(s). As long as you've got 4+ threads, the CPU is a moot point.

http://www.technologyx.com/featured/amd-vs-intel-our-8-core-cpu-gaming-performance-showdown/4/

As for a Celeron or Pentium, I'd never recommend it. Sure, you've got a good upgrade path -- but dual cores are obsolete. I personally recommend the i3 at the bare minimum, and suggest friends doing Intel builds stretch the budget to an i5.


This is a fallacious argument. The FX doesn't grow more competitive as resolution increases, but rather, it's harder for a GPU to maintain a high framerate as resolution increases.

Let's take a hypothetical GPU bound scenario:

Game @ max settings
4K - 30fps (average)
1440p - 45fps
1080p - 60fps

Medium-high
4K - 45fps
1440p - 60fps
1080p - 75fps

Medium
4k - 60fps
1440p - 75fps
1080p - 90fps


Now, let's assume you're in the majority and have a 60hz monitor, and so going above 60fps is pointless. Let's also assume that with an FX CPU, you can get 45fps average in the above game, with ~35fps minimums, and with an Intel CPU (i5 or i7?), 70fps average with 50fps minimums.

^ I feel this is a scenario that's quite realistic and representative of more demanding games.

Would it be correct to conclude, then, that because you can only average 30fps at 4K/max with your GPU, there's no point in buying a CPU that will deliver more than 35fps minimums and 50 average, since that's still higher than the lowest framerate you can bring your game down to with in-game settings?

In the hypothetical scenario above, the FX CPU owner essentially has two choices: 4K max at 30fps, or lower settings with 45fps average and dips into the mid 30's because of the CPU.

The Intel CPU owner has the options of 4K @ 30fps, slightly lower settings for 45fps average with no drops due to CPU limitations, or lower settings still with 60fps average and dips into the 50's due to CPU limitations.

Frankly, I think most people would rather drop their settings a little bit and get higher framerates, rather than play their games on their $1,000 monitor and $800+ GPU(s) at 30fps average. In fact, it doesn't matter how much you spend on your monitor or GPU - you'll want a CPU that can deliver fluid gameplay, if you can afford it, because there aren't really many settings in-game that you can change that will improve a CPU bottleneck, whereas any modern discrete GPU can deliver fluid framerates in any game, given the right combination of settings.
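
To put the bottleneck logic in concrete terms, here's a minimal sketch (all figures are the hypothetical ones from the scenario above): the framerate you actually see is capped by whichever of the CPU and GPU limits is lower, and in-game settings only move the GPU limit.

```python
# Hypothetical numbers from the scenario above: effective fps = min(CPU limit, GPU limit).
def effective_fps(cpu_limit, gpu_limit):
    return min(cpu_limit, gpu_limit)

gpu_limits = {"4K / max": 30, "1440p / med-high": 60, "1080p / medium": 90}
cpu_limits = {"FX": 45, "i5/i7": 70}  # average fps each CPU can sustain

for cpu, cpu_fps in cpu_limits.items():
    for setting, gpu_fps in gpu_limits.items():
        print(f"{cpu:6} @ {setting:18}: {effective_fps(cpu_fps, gpu_fps)} fps average")
# Lowering settings raises the GPU limit, but the FX stays pinned at ~45 fps average.
```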

I'm not saying that FX CPUs don't deliver the better part of 60fps in most games today - but so would an i3. Both an FX and an i3 will be fine for more uses than not, and both will offer a compromised experience in some situations.

I generally don't advocate cheaping out on a CPU purchase in order to budget more for a GPU because 1) CPU limits generally can't be fixed with in-game settings, so when you have one, you have to live with it, 2) GPUs become obsolete and depreciate far more quickly than CPUs, so it makes more sense to replace GPUs more regularly, and 3) GPUs are easier to replace/upgrade than CPUs, especially if you have to replace your entire platform, which is generally the case these days.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
So after 18 pages, the answer is still "no"

It's also amusing that the ADF has started portraying AMD as competitive by virtue of making another component the main bottleneck
 

superstition

Platinum Member
Feb 2, 2008
2,219
221
101
Using your Cinebench results, I decided to compare against my own system, as listed below in my signature. I used a Kil-A-Watt and measured from the wall for my power numbers, and used CPU-Z to obtain my voltages.

Total system power
FX @ 4.4ghz, 1.2250v
Idle: 120w
Cinebench: 242w
Score: 688

Ivy Bridge i5 @ 3.6ghz, 1.016v (stock speed, undervolted)
Idle: 37.4w (+/- 0.1w)
Cinebench: 69.0w (+/- 0.3w)
Score: 505

~

When comparing total system power consumption, my i5 comes out having approximately 2.6x the performance per watt.

It's not just the FX platform's high parasitic power draw skewing the results though. When you subtract idle watts from load watts (effectively isolating the CPU's power draw), and compare:

i5 Delta Watts: 31.6w
FX Delta Watts: 122w

The 3 generation old i5, in this light, has approximately 2.8x the performance per watt. Cranking it up to 4.0ghz reduces performance per watt slightly, but it still delivers 2.4x the performance per watt.

I highly doubt it's directly correlative, but AtenRa's testing with i3's shows Haswell as having ~33% better performance per watt than Ivy Bridge, and Skylake as having ~60% better performance per watt than Ivy Bridge. (watt hours to complete Cinebench R15 x264 benchmark). Also, I'm under the impression that HT CPUs have slightly better performance per watt than non-HT CPUs, which would make the FX look even worse when compared with an i7 (or i3, though they're not in the same ballpark in terms of performance).

Now, admittedly, your FX system still scores 36% higher in the benchmark, which has value in and of itself, but the power numbers don't look good at all.

FWIW
I hope someone pointed out that Cinebench, at least to my knowledge, is a heavily FPU-biased benchmark that will make the FX chips look worse because they only have four floating point units. Bulldozer was designed to focus on integer performance.

It may be as unfair as only using an integer benchmark to compare architectures.
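
Bias aside, the per-watt arithmetic in the quoted post is easy to reproduce; a minimal sketch (figures taken straight from the quote):

```python
# Reproducing the quoted perf/watt figures (Cinebench score per watt, measured at the wall).
systems = {
    # name: (Cinebench score, load watts, idle watts)
    "FX @ 4.4GHz":            (688, 242.0, 120.0),
    "Ivy Bridge i5 @ 3.6GHz": (505,  69.0,  37.4),
}

for name, (score, load, idle) in systems.items():
    total_ppw = score / load           # whole-system performance per watt
    delta_ppw = score / (load - idle)  # load minus idle, isolating the CPU's own draw
    print(f"{name}: {total_ppw:.1f} pts/W total, {delta_ppw:.1f} pts/W on delta watts")
# The ratio works out to ~2.6x (total system) and ~2.8x (delta watts) in the i5's favor,
# matching the quoted post.
```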
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
So after 18 pages, the answer is still "no"
The answer is still "it all depends".
Running a bunch of badly cooperating software*, when you have no idea how multitasking works on a PC, will be smoother on an FX-8xxx than on a Pentium/i3/i5. It might be slower overall than an i5/i3, but it will be smoother.

Then again, anybody who has figured out Ctrl+Alt+Del can set priorities on badly cooperating software* and run the whole shebang smoothly even on a pure dual core. Again, it might (will) be slower overall than XX (insert beloved CPU here), but just as smooth or smoother.

*Programs or games that use a lot of CPU resources and yet start with a priority higher than normal (cough cough Assassin's Creed cough).
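
For anyone who doesn't want to click through Task Manager on every launch, the same thing can be scripted -- a minimal sketch, assuming Windows and the third-party psutil package (the process name here is just a made-up example):

```python
# Sketch only: drop an over-eager process back to Normal priority (Windows + psutil assumed).
import psutil

TARGET = "SomeGame.exe"  # hypothetical process name; substitute the offending executable

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        proc.nice(psutil.NORMAL_PRIORITY_CLASS)  # same as Task Manager -> Set priority -> Normal
        print(f"Reset priority of {TARGET} (PID {proc.pid})")
```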
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
I hope someone pointed out that Cinebench, at least to my knowledge, is a heavily FPU-biased benchmark that will make the FX chips look worse because they only have four floating point units. Bulldozer was designed to focus on integer performance.

It may be as unfair as only using an integer benchmark to compare architectures.

Definitely. I'm curious about exploring performance per watt of different architectures in different scenarios, but the Cinebench numbers were already very conveniently there and easy to compare to.
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
I hope someone pointed out that Cinebench, at least to my knowledge, is a heavily FPU-biased benchmark that will make the FX chips look worse because they only have four floating point units. Bulldozer was designed to focus on integer performance.

It may be as unfair as only using an integer benchmark to compare architectures.
You need floating-point calculations for nearly anything in computing...
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
The answer is still "it all depends".
Running a bunch of badly cooperating software*, when you have no idea how multitasking works on a PC, will be smoother on an FX-8xxx than on a Pentium/i3/i5. It might be slower overall than an i5/i3, but it will be smoother.

Then again, anybody who has figured out Ctrl+Alt+Del can set priorities on badly cooperating software* and run the whole shebang smoothly even on a pure dual core. Again, it might (will) be slower overall than XX (insert beloved CPU here), but just as smooth or smoother.

*Programs or games that use a lot of CPU resources and yet start with a priority higher than normal (cough cough Assassin's Creed cough).


No, sorry, the answer is no.

1) Make-believe scenarios don't exist; that's why they're called make-believe.
2) The thread is Intel vs AMD, not FX8 vs i3, so even if your make-believe scenario existed, the answer is still no.
3) The FX8 loses far more often than not to an i5, even in multi-threaded environments, since Ivy Bridge.

In other words, no.
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
1) There is nothing make-believe about crappy software.
2) Exactly, the thread is vague, so any combo goes.
3) Losing is not the same as running smoothly; I thought I explained that pretty well.

FXs are notoriously slow and no amount of OC can fix that, but that is not part of the original topic.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
1) There is nothing make-believe about crappy software.
2) Exactly, the thread is vague, so any combo goes.
3) Losing is not the same as running smoothly; I thought I explained that pretty well.

FXs are notoriously slow and no amount of OC can fix that, but that is not part of the original topic.

Oh, right, running smoothly. The voodoo metric that only the ADF attests to. In other words, more make-believe fairy tales. Enjoy them; it's the only time AMD beats Intel -- in your own fantasies.

Where's this benchmark chart that measures smoothness? I've never seen it. If it's not a fantasy metric, it has to exist somewhere. How smooth a game runs isn't always tied to the video card with the highest fps, and that can be measured by frame times. So where is your data measuring smoothness?
 

Abwx

Lifer
Apr 2, 2011
11,854
4,829
136
3) The FX8 loses far more often than not to an i5, even in multi-threaded environments, since Ivy Bridge.

I would say it's the other way around; you would indeed be hard pressed to prove this misleading point. Even in benches that use AVX2, the i5 has trouble equaling the FX...
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I hope someone pointed out that Cinebench, at least to my knowledge, is a heavily FPU-biased benchmark that will make the FX chips look worse because they only have four floating point units. Bulldozer was designed to focus on integer performance.

It may be as unfair as only using an integer benchmark to compare architectures.


The FX numbers there are from my machine. It should be noted that I'm running my FX at a speed that gives me a higher score than Yuriman's i5. If I lowered my clocks to the point where I scored more or less the same, the wattage used by my CPU would be significantly lower. Let me try clocking my CPU for a ~500 score and get new numbers. I'm sure my FX will be quite a bit more competitive.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I would say it's the other way around; you would indeed be hard pressed to prove this misleading point. Even in benches that use AVX2, the i5 has trouble equaling the FX...

I'm not terribly concerned about what you would say considering you have said a lot already. Unfortunately for you, most of it has been wrong at best and complete BS at worst.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I'm not terribly concerned about what you would say considering you have said a lot already. Unfortunately for you, most of it has been wrong at best and complete BS at worst.

http://forums.anandtech.com/showthread.php?t=2443441

For whatever it's worth, Cinebench isn't known to be an AMD-friendly bench, and with a healthy overclock the four-module FXs seem to be able to pull away from all the i5s except the Skylake i5s.
 

Abwx

Lifer
Apr 2, 2011
11,854
4,829
136
I'm not terribly concerned about what you would say considering you have said a lot already. Unfortunately for you, most of it has been wrong at best and complete BS at worst.

The BS is when one is left unable to prove a statement he made; explicitly, you talked too much. As pointed out by the member above, even in the Intel-centric Cinebench the i5s do not manage to beat the FX-8350...

http://www.hardware.fr/articles/940-19/indices-performance-cpu.html

Check the benches: in video encoding they use AVX2, and the FX is castrated with 1600MHz RAM, to help a little...