Is there any reason to use FX CPUs right now?

cebalrai

Senior member
May 18, 2011
250
0
0
Historically I've used AMD CPUs in media and modest gaming rigs if nothing else to support the little guy. I haven't built a machine in a few years though and now I'm about to build a mid-caliber gaming rig for our new 65" high-end Samsung TV in the basement.

Is there any reason to look at FX processors right now? Thanks.
 

Shehriazad

Senior member
Nov 3, 2014
555
2
46
Well, the pricing for the FX-6300 and FX-8350 can be ridiculously low at the moment (depending on region)... that's the only reason to look at those CPUs.


Over here the FX-8350 is like $160 and the FX-6300 is like $100... and I guess models like the $130-140 FX-8320/E also exist.

But yeah... just pricing, nothing else.
 
Last edited:

cebalrai

Senior member
May 18, 2011
250
0
0
Well, the pricing for the FX-6300 and FX-8350 can be ridiculously low at the moment (depending on region)... that's the only reason to look at those CPUs.


Over here the FX-8350 is like $160 and the FX-6300 is like $100... and I guess models like the $130-140 FX-8320/E also exist.

But yeah... just pricing, nothing else.


At those prices, is the performance equal to or better than Intel's offerings?
 

janeuner

Member
May 27, 2014
70
0
0
I've played both of those games on an FX-6300 and a 4670K.

The framerate for Tanks is substantially higher on the Intel part, but I can't actually see the difference. Min FPS will be in the low 40s for AMD, or mid-50s for Intel.

Elite runs great on multicore systems. No matter which new processor you use, you will be GPU-bound.

However, you are talking about an HTPC as well. AMD will use >50W more power than Intel. The cost savings now can be lost through the power bill awfully fast if the computer is running under load for several hours every day.
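A back-of-the-envelope sketch of that power-bill point; the 50W delta comes from the post above, but the hours per day and the $0.12/kWh electricity rate are assumed numbers, not measurements:

```python
# Rough estimate of the extra electricity cost from a ~50W higher draw under load.
# Assumed: 50W delta, 4 hours/day under load, $0.12 per kWh.
extra_watts = 50
hours_per_day = 4
rate_per_kwh = 0.12

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * rate_per_kwh
print(f"{extra_kwh_per_year:.0f} kWh/year, ${extra_cost_per_year:.2f}/year")
# prints: 73 kWh/year, $8.76/year
```

So at light duty cycles the difference is single-digit dollars per year; it only starts eating the purchase-price savings if the box crunches under load many hours a day.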
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
AMD's FX line hasn't had a true refresh in a while. Recently some new models were released with an "e" suffix that indicates they're binned for better power usage characteristics, but AFAIK there were no design changes.

A brief history:

Late 2011, AMD released the Bulldozer-based FX-81xx, 61xx and 41xx as the successor to Stars (Phenom II), which was largely a Core 2 competitor. Bulldozer introduced "modules", which were basically core pairs that shared resources. If both cores were in use, each would suffer a performance penalty (something like 25% each on Bulldozer?), but the advantage was that each module was significantly smaller than two full cores would have been. It was up against Sandy Bridge, which had been on the market for around 9 months. Generally speaking, Bulldozer was slower than Stars and its power consumption was higher despite a process shrink, so it was pretty non-competitive with Intel's offerings, considering Stars was a Core 2 competitor and Intel had their 2nd-generation i7 out.
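To put that module penalty in perspective, a quick sketch; the ~25% figure is the rough number quoted above, not a measured one:

```python
# Effective throughput of a 4-module / 8-core Bulldozer-style chip under full
# load, assuming each core loses ~25% when its module sibling is also busy.
PENALTY = 0.25
MODULES = 4
CORES_PER_MODULE = 2

shared = MODULES * CORES_PER_MODULE * (1 - PENALTY)   # 6.0 "full-core equivalents"
independent = MODULES * CORES_PER_MODULE * 1.0        # 8.0 for truly separate cores
print(shared / independent)  # prints 0.75
```

In other words, eight shared cores behave like roughly six full cores when everything is loaded, which was the trade AMD made for the smaller die area.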


In mid-2012 Intel released Ivy Bridge, which was a die shrink and slight design change from Sandy Bridge. Performance was up 0-10%, while power consumption was down. Shortly after, AMD released the Piledriver-based FX-83xx, 63xx and 43xx, which were a significant improvement over Bulldozer in single-threaded performance, clocked higher, and had less of a penalty from the modules' shared resources. Despite this, an Ivy Bridge core was often more than 50% faster than a Piledriver core. AMD would sell you twice as many cores for the same price though, which made FX-8 chips a fair value at around $250. You could get performance competitive with an i7 (if your programs could saturate 8 threads) for $75 less, though single-threaded performance was still relatively bad, and power consumption (when all threads were saturated) was considerably higher too. Games such as StarCraft II and Guild Wars 2 performed badly on Piledriver, while Battlefield ran as well as or better than on Intel's counterparts.

Since then, AMD has not released a new FX architecture on AM3+, only a 2-module/4-core Steamroller APU on FM2+, while Intel has released Haswell, which brought another single-threaded performance improvement and efficiency bump, and is relatively close to releasing Broadwell, Haswell's successor. AMD has gradually lowered their prices, and it's arguable that at around $100, an FX-8310 would be a suitable chip for some. Multithreaded performance is equal to or better than that of a Haswell i5, which starts at around $180, but power consumption when doing the same amount of work is much higher. Single-threaded performance on Haswell is 50-60% higher. AM3+ is also a very old socket and lacks some of the features of AMD's more recent FM2+ and Intel's LGA 1150 sockets, which matters to some but often isn't a deal-breaker.

The FX-9xxx series is a recent release of chips binned to run at very high frequencies, but they also have very high power consumption, so much so that AMD originally shipped them with water coolers. They are also a lot more expensive than the FX-8xxx chips.

~

It's worth noting that most games, while bottlenecked by Piledriver's poor single-threaded performance, are still bottlenecked above 40fps minimums. This may improve when DirectX 12 is released too, but Piledriver will probably be more than 4 years old before we see the first games that take advantage of it.

Here is a benchmark from the Battlefield Hardline beta:

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_Hardline_Beta_2/test/proz.jpg


FarCry 4:

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Far_Cry_4/nv/test/fc_proz.jpg


CoD Advanced Warfare:

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Call_of_Duty_Advanced_Warfare/test/cod_proz_amd.jpg



It's worth noting that AMD's chips idle nearly as well as Intel's, and that multithreaded performance is significantly better than Intel's similarly priced i3:

[chart: 65065.png, idle power consumption]


[chart: 65067.png, multithreaded performance]



I probably wouldn't put one in a gaming rig though.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Wish AMD would hurry up and make an FM2+ CPU with more than 4 cores.

What would that change? They are not getting any faster, and there is the 95W limit.

It would only make a product that a handful of people might buy.
 

el etro

Golden Member
Jul 21, 2013
1,584
14
81
Get the cheapest i5 you can for gaming.


FX does well in games that truly scale well with 6/8 cores.
 

Maximilian

Lifer
Feb 8, 2004
12,604
15
81
Yes there are several reasons to use an FX chip!

- Curiosity

- Brand loyalty

- Uh...

Okay so only two reasons, better than none though I suppose :thumbsup:
 

Maximilian

Lifer
Feb 8, 2004
12,604
15
81
AMD's FX line hasn't had a true refresh in a while. Recently some new models were released with an "e" suffix that indicates they're binned for better power usage characteristics, but AFAIK there were no design changes.

A brief history:

Late 2011, AMD released the Bulldozer-based FX-81xx, 61xx and 41xx as the successor to Stars (Phenom II), which was largely a Core 2 competitor. Bulldozer introduced "modules", which were basically core pairs that shared resources. If both cores were in use, each would suffer a performance penalty (something like 25% each on Bulldozer?), but the advantage was that each module was significantly smaller than two full cores would have been. It was up against Sandy Bridge, which had been on the market for around 9 months. Generally speaking, Bulldozer was slower than Stars and its power consumption was higher despite a process shrink, so it was pretty non-competitive with Intel's offerings, considering Stars was a Core 2 competitor and Intel had their 2nd-generation i7 out.


In mid-2012 Intel released Ivy Bridge, which was a die shrink and slight design change from Sandy Bridge. Performance was up 0-10%, while power consumption was down. Shortly after, AMD released the Piledriver-based FX-83xx, 63xx and 43xx, which were a significant improvement over Bulldozer in single-threaded performance, clocked higher, and had less of a penalty from the modules' shared resources. Despite this, an Ivy Bridge core was often more than 50% faster than a Piledriver core. AMD would sell you twice as many cores for the same price though, which made FX-8 chips a fair value at around $250. You could get performance competitive with an i7 (if your programs could saturate 8 threads) for $75 less, though single-threaded performance was still relatively bad, and power consumption (when all threads were saturated) was considerably higher too. Games such as StarCraft II and Guild Wars 2 performed badly on Piledriver, while Battlefield ran as well as or better than on Intel's counterparts.

Since then, AMD has not released a new FX architecture on AM3+, only a 2-module/4-core Steamroller APU on FM2+, while Intel has released Haswell, which brought another single-threaded performance improvement and efficiency bump, and is relatively close to releasing Broadwell, Haswell's successor. AMD has gradually lowered their prices, and it's arguable that at around $100, an FX-8310 would be a suitable chip for some. Multithreaded performance is equal to or better than that of a Haswell i5, which starts at around $180, but power consumption when doing the same amount of work is much higher. Single-threaded performance on Haswell is 50-60% higher. AM3+ is also a very old socket and lacks some of the features of AMD's more recent FM2+ and Intel's LGA 1150 sockets, which matters to some but often isn't a deal-breaker.

The FX-9xxx series is a recent release of chips binned to run at very high frequencies, but they also have very high power consumption, so much so that AMD originally shipped them with water coolers. They are also a lot more expensive than the FX-8xxx chips.

~

It's worth noting that most games, while bottlenecked by Piledriver's poor single-threaded performance, are still bottlenecked above 40fps minimums. This may improve when DirectX 12 is released too, but Piledriver will probably be more than 4 years old before we see the first games that take advantage of it.

Here is a benchmark from the Battlefield Hardline beta:

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_Hardline_Beta_2/test/proz.jpg


FarCry 4:

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Far_Cry_4/nv/test/fc_proz.jpg


CoD Advanced Warfare:

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Call_of_Duty_Advanced_Warfare/test/cod_proz_amd.jpg



It's worth noting that AMD's chips idle nearly as well as Intel's, and that multithreaded performance is significantly better than Intel's similarly priced i3:

[chart: 65065.png, idle power consumption]

[chart: 65067.png, multithreaded performance]


I probably wouldn't put one in a gaming rig though.

These benches are pretty embarrassing; at 4.7GHz the thing gets whooped by the old 2500K at its stock 3.3GHz.

Maybe 2-3 years ago I would have said it's a reasonable alternative to Intel, but not today. I agree, I wouldn't put one in a gaming rig either.
 

schmuckley

Platinum Member
Aug 18, 2011
2,335
1
0
Personally I can't think of one...
Oh wait... add LN2 and go for frequency scores.
That's it.
So yes, there is a reason.
 

DrMrLordX

Lifer
Apr 27, 2000
22,768
12,776
136
These benches are pretty embarrassing; at 4.7GHz the thing gets whooped by the old 2500K at its stock 3.3GHz.

Two of those (3DPM, Cinebench) are the worst possible benches for an FX CPU. Seriously. Well, okay, he didn't throw in SuperPi, but it was meant to be a show of MT power.

As for the 2600K beating the 9590 in those games, bear in mind that a 2600K still has a market value of ~$200+. Big emphasis on the plus. You can get an FX chip + board for less than the cost of the 2600K alone. Heck, you can probably throw in an aftermarket HSF too and still come out within the same budget.

Regardless.

The 8310 is cheap and is probably the best FX on the market for a budget buyer. It really isn't that hard to tune it to run at 4-4.5GHz, where it can best be considered "respectable". People have done this on various cheap 4+1 boards, which means you don't have to go crazy trying to find an 8+2 board to accommodate it. By all means do so if you like, but pushing an 8310 past 4.8GHz is probably not sane.

But 4.4-4.5GHz at 1.36V vcore (or lower) is achievable, which is the stock vcore of an 8350 and which puts the chip at roughly a 125W TDP. It isn't guaranteed, but your odds are pretty good if you go that route. You won't need exceptional cooling or an exceptional motherboard to achieve this, which is contrary to the reputation FX chips have earned over the years. You can go lower power and lower frequency as well, if you really want to.
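For the undervolting math, the usual first-order rule of thumb is that dynamic power scales with frequency times voltage squared (P ∝ f·V²). A sketch with assumed numbers, not measurements; the 1.25V undervolt target is hypothetical:

```python
# First-order CPU dynamic power scaling: P ∝ f × V².
# Assumed baseline: a 125W chip at 4.0GHz and the 8350's 1.36V stock vcore.
def scaled_power(p_base, f_base, v_base, f_new, v_new):
    """Estimate power at a new frequency/voltage from a known baseline."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# Same 4.0GHz, undervolted to a hypothetical 1.25V:
print(round(scaled_power(125, 4.0, 1.36, 4.0, 1.25), 1))  # prints 105.6
```

That's roughly a 15% power cut from a 0.11V undervolt alone, which is why finding a low stable vcore matters more than the clock for keeping an FX cool.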

As for gaming, I would consider this solution "adequate". Combined with something like a 290X (which is getting pretty cheap itself, btw), this setup will probably look pretty good running Mantle-enabled games or future DX12 titles. Not great, but pretty good.

But really, this kind of a chip is at its best doing video encoding, WCG, stuff like that. It might not win any benchmarks, but tuned properly for a nice, low voltage, you aren't going to eat up a lot of power running it, and it will churn through MT workloads rather nicely.
 
Last edited:

Atreidin

Senior member
Mar 31, 2011
464
27
86
I replaced a Phenom II with an FX because it supports AES instructions, which really helped for the application, and it was easier to do that than redo everything.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
The FX CPUs are still fairly solid, but they are getting harder to justify in a new rig vs. Intel's CPUs. The platform isn't as current, and performance can be average to quite good depending on the game/application. I think there are still some good buys for those with an AM3+ board or who can get a decent one cheaply. The FX-6300, FX-8310, and FX-8320E all come to mind: solid bang for the buck, especially for someone who doesn't mind overclocking and tweaking voltages.
 

positivedoppler

Golden Member
Apr 30, 2012
1,145
237
116
What is the maximum resolution of your TV? If it's 1080p, everything is overkill. Don't spend too much unless it's a 4K TV.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
What is the maximum resolution of your TV? If it's 1080p, everything is overkill. Don't spend too much unless it's a 4K TV.

Fallacy.


CPU bottlenecks are independent of resolution. If there's a specific game the OP wants to play that is bottlenecked by the CPU at 20fps (not talking about an FX, mind you, but a hypothetical CPU and game; FX chips will provide an adequate experience), then no matter how high or low your resolution is, you won't exceed 20fps.
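That argument can be sketched as a simple min() model; the frame-rate numbers here are made up purely for illustration:

```python
# Frame rate is capped by whichever side is slower. The CPU cap is roughly
# constant per frame, while the GPU cap falls as the pixel count rises.
def fps(cpu_cap, gpu_fps_at_1080p, pixels):
    gpu_cap = gpu_fps_at_1080p * (1920 * 1080) / pixels
    return min(cpu_cap, gpu_cap)

CPU_CAP = 20  # the hypothetical CPU-bound game from above
print(fps(CPU_CAP, 120, 1280 * 720))    # 720p: prints 20, CPU-bound
print(fps(CPU_CAP, 120, 3840 * 2160))   # 4K:   prints 20, still CPU-bound
print(fps(60, 120, 3840 * 2160))        # only a GPU-bound game drops at 4K
```

Dropping the resolution raises the GPU cap, but the CPU cap never moves, so a CPU-limited game stays at the same frame rate at any resolution.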

With regard to using Cinebench and 3DPM to compare, take it up with the writers at AnandTech; those were among the only multithreaded benches the FX chips were competitive in from the recent Devil's Canyon review, so I had no others to link from it.

EDIT: Looking at the review again, I missed the h.265 bench, which shows them in a good light, but in the Handbrake encoding bench the FX chips looked worse relative to Intel's offerings.
 
Last edited:

rancherlee

Senior member
Jul 9, 2000
707
18
81
If you can get an 8-core in the $99-120 range, then yes. I wouldn't spend more than $120 on an FX part. I paid $210 for my 8320/99pro combo and it's a decent all-around rig that performs well at 4.5GHz. I haven't played any games where I've felt the 8320 is holding me back much.
 

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
http://www.overclock.net/t/1534128/vishera-vs-devils-canyon-a-casual-comparison-by-an-average-user

They are fun CPU's, I've enjoyed my 8350 and 9590 and wanted for nothing performance wise for single player 1080p use with a pair of 280x. Don't think I'd build one today if gaming is the only goal, but they are a good experience and plenty useable overall still. As more software has become able to use multiple cores they seem to have better performance overall the last few years, but they are old and some apps lean on that single core performance a lot. They still remind me of a Cyrix vs an early Pentium years ago, good integer performance and cheap but poor FPU. Not an exact comparison but the spirit of it was similar. I enjoyed mine a lot.