Is there any reason to use FX CPUs right now?


SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
yes, I think they could be fun in terms of OC and tweaking, certainly more than the locked Intel CPUs

also if you are playing with video encoding and rendering (without the money for a 6 core+ Intel part) they can make a lot of sense... for those gamers also doing a lot of youtube/twitch stuff I guess, the potential to help with the video work is there

they also look good for file compression work with 7zip, but I think most of the time it's limited by something else (like the HDD/SSD) anyway
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
yes, I think they could be fun in terms of OC and tweaking, certainly more than the locked Intel CPUs

also if you are playing with video encoding and rendering (without the money for a 6 core+ Intel part) they can make a lot of sense... for those gamers also doing a lot of youtube/twitch stuff I guess, the potential to help with the video work is there

they also look good for file compression work with 7zip, but I think most of the time it's limited by something else (like the HDD/SSD) anyway

Only issue for those doing encoding/rendering/compression is, if they don't otherwise need a video card, they'll need one with an FX because it lacks an IGP.
 
Apr 20, 2008
10,067
990
126
For the same price as the cheapest i5 (i5-4430, 3.0GHz, 84W, $185) you can get an FX-8300 (95W), Gigabyte GA-78LMT-USB3 (up to 125W FX, room to OC the 8300 if needed, IGP on board), and a 4GB stick of G.Skill DDR3 1600.

Start running down the list from the top of the most played games on Steam at this moment. Can the FX play every one of them well? Will an 8-core FX be suited for future AAA and indie games alike? Would saving $70 to put towards an even better video card (for those who game) or an SSD (for those who would use it for work) give a much better experience?

Unless you're willfully ignorant, you know the answers to these questions. Even if the one metric of gaming isn't factored in, the FX is extremely responsive in everyday desktop and productivity tasks. Chrome will hit 55+% utilization on one image-heavy tab (no Flash, more if you surf without an adblocker) when an 8-core FX is clocked down to 1.4GHz, meaning at minimum 5 threads can be saturated if the load is high enough. We're at the point where our browsers can utilize as many cores as you can throw at them. Modern software is only getting more parallel by the day. If you're on a budget and you can't afford to upgrade every other year, the FX is an even better value.

But you know, it is not Intel and doesn't mesh with the regurgitated talking points spread all over these forums by people who spend more time talking about hardware than using it.
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
keep in mind the 760G IGP is a joke compared to the current Intel IGPs, it's a cut down 780G from 2008
 
Apr 20, 2008
10,067
990
126
keep in mind the 760G IGP is a joke compared to the current Intel IGPs, it's a cut down 780G from 2008

That's what you nitpick about? Really?

For productivity it doesn't really matter which one is chosen. For the gamer you need to get a discrete card either way.
 

positivedoppler

Golden Member
Apr 30, 2012
1,148
256
136
Fallacy.


CPU bottlenecks are independent of resolution. If there's a specific game OP wants to play that is bottlenecked by CPU at 20fps (not talking about an FX, mind you, but a hypothetical CPU and game, FX chips will provide an adequate experience), no matter how high or low your resolution is, you won't exceed 20fps.

I stand by my claim that it's all overkill for 1080p gaming if that's what his TV supports. But how would you prove that cpu bottleneck is independent of resolution?

http://www.anandtech.com/show/7963/...ew-core-i7-4790-i5-4690-and-i3-4360-tested/10

http://www.anandtech.com/show/8864/amd-fx-8320e-cpu-review-the-other-95w-vishera/5
 
Aug 11, 2008
10,451
642
126
For the same price as the cheapest i5 (i5-4430, 3.0GHz, 84W, $185) you can get an FX-8300 (95W), Gigabyte GA-78LMT-USB3 (up to 125W FX, room to OC the 8300 if needed, IGP on board), and a 4GB stick of G.Skill DDR3 1600.

Start running down the list from the top of the most played games on Steam at this moment. Can the FX play every one of them well? Will an 8-core FX be suited for future AAA and indie games alike? Would saving $70 to put towards an even better video card (for those who game) or an SSD (for those who would use it for work) give a much better experience?

Unless you're willfully ignorant, you know the answers to these questions. Even if the one metric of gaming isn't factored in, the FX is extremely responsive in everyday desktop and productivity tasks. Chrome will hit 55+% utilization on one image-heavy tab (no Flash, more if you surf without an adblocker) when an 8-core FX is clocked down to 1.4GHz, meaning at minimum 5 threads can be saturated if the load is high enough. We're at the point where our browsers can utilize as many cores as you can throw at them. Modern software is only getting more parallel by the day. If you're on a budget and you can't afford to upgrade every other year, the FX is an even better value.

But you know, it is not Intel and doesn't mesh with the regurgitated talking points spread all over these forums by people who spend more time talking about hardware than using it.

If you want to limit yourself to those ten games, I would bet that a stock i3 would be faster in most of them, while using much less power, especially if you overclock the 8300 to get some kind of decent single threaded performance.
 

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
In one of these threads, a fellow pointed out that folks with an FX or APU+cheap GPU kind of budget probably aren't spending $30 to $60 for higher end, more recent games that don't run on them well anyway, or at least not too often. I think there's some truth to that. I learned long before the FX that one of the keys to being happy with older or budget hardware was to buy games that were on sale for under ten bucks or so. It's a pretty good working system.

That said, Crysis 3 is my benchmark for a well written and good running game now.
I Google'd i3 Crysis 3 and it looks like 30fps-ish with a GTX750, which is playable but not amazing, an i5 of some sort picked up 10-20fps which is "plenty" imo, and as I recall still a bit below where my FX with a pair of 280x's was. CPU usage was not high on C3, and it seems to spread out well, and was still a beautiful and really fun game. If some other game can't pull that off on an FX cpu system without bringing something new and somehow "more" to the table, I don't blame the CPU myself.

Unless I've misread a lot of articles, the trend in game development is toward making use of more cores and the GPU rather than a few fast CPU cores. While it is getting a bit long in the tooth as they say, if you can get some flavor of FX up to 4GHz-ish or a bit more for half the price of an i5 or less, it's not a bad budget CPU for some light to even moderate gaming if you can give it enough GPU. Yes, an i5 or better is the preferred choice, but it's money.
 
Apr 20, 2008
10,067
990
126
If you want to limit yourself to those ten games, I would bet that a stock i3 would be faster in most of them, while using much less power, especially if you overclock the 8300 to get some kind of decent single threaded performance.

How about clicking right below that and seeing the top 100 games. There isn't a game on that list that's not perfectly playable on an FX. That is especially true because if you didn't have the cash to buy an i5 or i7, you certainly don't have the $$ to drop on a high end video card. Does the 8 core Piledriver FX not hit at least 60fps in almost every single title? How about AAA titles and console ports?

An i3 is the most shortsighted line of CPUs you can purchase, especially for those on a budget who would have to upgrade within a couple years. Most people use their desktop for 3-5 years until upgrading. How on earth could you recommend an i3?

Power consumption? You think the average person is going to fret about 60 watts? Unless you have a Prius in your driveway, you don't actually care about power consumption when making your purchases. That is especially true of negligible amounts of peak power consumption. I hope you don't recommend a discrete GPU either because an IGP is so much more efficient.
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
That's what you nitpick about? Really?

For productivity it doesn't really matter which one is chosen. For the gamer you need to get a discrete card either way.

but it will show very easily if you try to use the IGP for more than just showing your Windows desktop; it's always good to give a warning about this ancient chipset.


I stand by my claim that it's all overkill for 1080p gaming if that's what his TV supports. But how would you prove that cpu bottleneck is independent of resolution?

http://www.anandtech.com/show/7963/...ew-core-i7-4790-i5-4690-and-i3-4360-tested/10

http://www.anandtech.com/show/8864/amd-fx-8320e-cpu-review-the-other-95w-vishera/5


CPU bottleneck can happen at any resolution, and it's more likely to happen at 1080P than 4K (GPU will suffer a lot with 4K, CPU not really)

just because some test shows no relevant difference between CPUs in gaming, it doesn't mean it's always the case; games are larger than just 30s random scenes (with huge performance variations), and people play a huge variety of games and settings...

if CPU performance was not a problem, Mantle and DX12 would not really be needed.
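To put that in concrete terms, here's a minimal sketch (Python, with purely made-up per-frame millisecond numbers) of the usual simplification: frame rate is set by whichever of the CPU or GPU takes longer on a frame, so raising the resolution only inflates the GPU term and never relieves a CPU limit:

[CODE]
# Toy model: a frame is ready only when both the CPU work (game logic, draw calls)
# and the GPU work (rendering) are done; the slower side sets the frame rate.
# All millisecond figures below are invented for illustration.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 20.0  # hypothetical CPU-limited game: 20 ms of CPU work per frame -> 50 fps cap

for res, gpu_ms in [("720p", 8.0), ("1080p", 14.0), ("4K", 33.0)]:
    print(res, round(fps(cpu_ms, gpu_ms), 1))

# 720p 50.0    <- CPU-bound: lowering resolution doesn't raise fps
# 1080p 50.0   <- still CPU-bound
# 4K 30.3      <- now GPU-bound, so the CPU difference stops showing up
[/CODE]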
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
For the same price as the cheapest i5 (i5 4430, 3.0Ghz, 84w, $185) you can get an FX-8300 (95w), Gigabyte GA-78LMT-USB3 (up to 125w FX, room to OC the 8300 if needed, IGP on board), and a 4GB stick of G.Skill DDR3 1600.

Start running down the list from the top of the most played games on Steam at this moment. Can the FX play every one of them well? Will an 8-core FX be suited for future AAA and indie games alike? Would saving $70 to go towards an even better video card give a much better experience for those who game or an SSD for those who would use it for work?

Unless you're willfully ignorant, you know the answers to these questions. Even if the one metric of gaming isn't factored in, the FX is extremely responsive in everyday desktop and productivity tasks. Chrome will hit 55+% utilization on one image heavy tab (no flash, more if you surf without an adblocker) in when an 8-core FX is clocked down to 1.4Ghz , meaning at the minimum 5 threads can be saturated if the load is high enough. We're at the point where out browsers can utilize as many cores as you can throw at it. Modern software is only getting more parallel by the day. If you're on a budget and you can't afford to upgrade every other year, the FX is an even better value.

But you know, it is not Intel and doesn't mesh with the regurgitated talking points spread all over these forums by people who spend more time talking about hardware than using it.

A 2009-era frisbee of a mobo with 4+1 VRMs that will likely blow if you start OCing an FX, and 4GB of RAM isn't enough for any recent AAA game from the past year. Your cheapo build is full of compromises.

Look at Unity:

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Assassins_Creed_Unity/test/ac_ram2.jpg


http://www.gamegpu.ru/images/stories/Test_GPU/Action/Assassins_Creed_Unity/test/ac_proz.jpg


A 9590 is required to even come close to 60FPS minimums, and is outclassed by an i3. Any game that wants single-threaded grunt will suffer compared to Intel.

Even in the original Witcher from 2007 (!):

http://www.gamegpu.ru/images/stories/Test_GPU/Retro/The_Witcher/test/the_witcher_proz.jpg


an FX still can't crack 60FPS minimums. You will feel those minimum dips way more than on Intel.

And surfing with no adblocker? Eh? Haswell can do that clocked down to 800MHz.
 

Roland00Address

Platinum Member
Dec 17, 2008
2,196
260
126
Sometimes they are ridiculously cheap.

Here was a sale where the 8310 was under $100. The 8310 is a 4-module, 8-core CPU that is within 5% of the 8350 in single thread and within 12% of it in multi thread, and it is unlocked, so you can always overclock this 95W CPU.

http://forums.anandtech.com/showthread.php?t=2417172&highlight=amd

Yes an i5 or i7 haswell will be faster overall but the 8310 is as fast as the pentium g3258 at stock speeds in single thread and as fast as the i5 haswell in purely multithreaded tasks. Thus getting such a cpu for $100 allows you to spend more money on a bigger ssd, a bigger gpu, or just save $100.

AMD FX is not for everyone, but it may work for some everyday users, as long as they spend money on an SSD and a GPU.
 
Apr 20, 2008
10,067
990
126
A 2009-era frisbee of a mobo with 4+1 VRMs that will likely blow if you start OCing an FX, and 4GB of RAM isn't enough for any recent AAA game from the past year. Your cheapo build is full of compromises.

Look at Unity:





A 9590 is required to even come close to 60FPS minimums, and is outclassed by an i3. Any game that wants single-threaded grunt will suffer compared to Intel.

Even in the original Witcher from 2007 (!):



an FX still can't crack 60FPS minimums. You will feel those minimum dips way more than on Intel.

And surfing with no adblocker? Eh? Haswell can do that clocked down to 800MHz.

Change those settings to high quality and it's a whole different story. Unless you've got a $300 or more GPU in there, you're running nothing at VHQ or better settings. Who buys an FX and pairs it with a $450 GPU like the 780ti? With lower GPU settings comes less CPU intensive models/physics/animations. The majority of gamers on desktops have 650ti's and Radeon 250/260x class performance. Also, those games you linked are completely playable at 30fps as that's what they are when I've played them on consoles. The original Witcher? How many people play that game from 2007? I get what you're saying, but that's unrealistic. Look at the most played PC games and tell me the FX isn't enough. Every game is playable, most near 60fps minimum on high settings or better.

Those motherboards are adequate for OCing an FX up to 4.2-4.4GHz for daily usage. I've got a Biostar TA970 and it's a 4+1 with a small heatsink on the VRMs, and it can overclock to 4.6GHz without issues. It's then more a problem of adequately cooling the CPU than anything else.

1400MHz is the minimum clock in CCC.
 
Aug 11, 2008
10,451
642
126
How about clicking right below that and seeing the top 100 games. There isn't a game on that list that's not perfectly playable on an FX. That is especially true because if you didn't have the cash to buy an i5 or i7, you certainly don't have the $$ to drop on a high end video card. Does the 8 core Piledriver FX not hit at least 60fps in almost every single title? How about AAA titles and console ports?

An i3 is the most shortsighted line of CPUs you can purchase, especially for those on a budget who would have to upgrade within a couple years. Most people use their desktop for 3-5 years until upgrading. How on earth could you recommend an i3?

Power consumption? You think the average person is going to fret about 60 watts? Unless you have a Prius in your driveway, you don't actually care about power consumption when making your purchases. That is especially true of negligible amounts of peak power consumption. I hope you don't recommend a discrete GPU either because an IGP is so much more efficient.

Good thing the super bowl was not played on your field. Nobody would know which way to go because you keep shifting the goalposts. You were the one who wanted to limit the discussion to the top ten Steam games. Now you want to include the top 100, as well as plan for two years into the future.

And I see you are using the classic AMD argument that cost matters when you buy the cpu, but not when you are paying for electricity. Whatever else you use power for is irrelevant when examining the cpus. The only difference that matters is the difference in cost between the cpus and the amount of cost savings from lower electricity use. If you take your own estimate of 60 watts, 4 hours per day, and 15 cents per kWh (pretty conservative for a lot of areas, including taxes and fees), that comes out to 13 dollars per year. If you keep the cpu for 3 years, that adds up to at least half of the cost difference. Whatever else you spend money on is irrelevant to this particular comparison.
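For reference, here is the arithmetic behind that figure as a quick sketch, using exactly the assumptions above (60 W difference, 4 hours a day, 15 cents per kWh):

[CODE]
# Yearly cost of a 60 W consumption difference at 4 hours/day and $0.15 per kWh
watts = 60
hours_per_day = 4
price_per_kwh = 0.15  # USD

kwh_per_year = watts / 1000 * hours_per_day * 365   # 87.6 kWh
cost_per_year = kwh_per_year * price_per_kwh        # about $13.14
print(round(kwh_per_year, 1), round(cost_per_year, 2))
[/CODE]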
 
Apr 20, 2008
10,067
990
126
You were the one who wanted to limit the discussion to the top ten Steam games. Now you want to include the top 100, as well as plan for two years into the future.

Start running down the list from the top of the most played games on Steam at this moment.


Looks like you didn't even read what I wrote, just who responded to me. If you need to run down a list of ten titles, you need to run more often. That page has the top 100 most active Steam games.

And I see you are using the classic AMD argument that cost matters when you buy the cpu, but not when you are paying for electricity. Whatever else you use power for is irrelevant when examining the cpus. The only difference that matters is the difference in cost between the cpus and the amount of cost savings from lower electricity use. If you take your own estimate of 60 watts, 4 hours per day, and 15 cents per kWh (pretty conservative for a lot of areas, including taxes and fees), that comes out to 13 dollars per year. If you keep the cpu for 3 years, that adds up to at least half of the cost difference. Whatever else you spend money on is irrelevant to this particular comparison.

If you have time to play video games for 4 hours a day, every single day, GPU power consumption is much more important than a CPU spread. That and you've got no other hobbies. $13 a year is nothing compared to a $70 upfront cost. Monitor panel type, number of monitors, GPU power consumption, PSU efficiency and how many light bulbs you have in your room are much more important. This might appear to be pedantic to you but that's how ridiculous it is that you think a 60w peak difference on one component is a big deal. It's not the end of the world and the vast majority of people truly don't care.

By your own example, if everyone bought a product because it would cost them less to use it over time, every single passenger vehicle on the road at this moment would be a hybrid or electric. Up front cost is everything. If it wasn't we wouldn't have credit cards or home loans to remove those barriers.
 

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
Part of the nature of a budget purchase is that it is highly dependent on right-then money rather than later-money. It's a luxury being able to spend more now to save later; while it is technically more thrifty, it's often not a viable option. Having been broke-ass poor a significant portion of my life, I can say with assuredness that $50 today is much more pressing and significant than $150 spread over a period of years, especially given the ability to minimize it with some effort.

Not arguing, just offering a common scenario.
 

redzo

Senior member
Nov 21, 2007
547
5
81
I find it good only for cheap VM box use right now: those AMD integer cores are way better than 8 hyperthreaded Intel threads, which are more expensive, so you get better/more resources for less $. Otherwise, a non-hyperthreaded Intel Haswell i5 is faster in pretty much everything for only a few $ more.
 

DrMrLordX

Lifer
Apr 27, 2000
22,931
13,011
136
With regards to using Cinebench and 3DPM to compare, take it up with the writers for Anandtech; those were some of the only multithreaded benches run in the recent Devil's Canyon review that the FXs were competitive in, so I had no others to link from it.

EDIT: Looking at the review again, I missed the h.265 bench, which shows them in a good light, but in their Handbrake encoding bench the FX chips looked worse relative to Intel's offerings.

A fair point, though one can look to other sources besides Anandtech to find MT results for FX chips. y-cruncher, wPrime, PiFast, all kinds of encoding and rendering benchmarks out there. Heck there's this thread here:

http://forums.anandtech.com/showthread.php?t=2294486

showing an 8350 vs 3770k vs non-HT 3770k. Idontcare up to his usual informing-the-public dogooderishness (heh). Regardless, an 8310 pushing 4.5 ghz in the same power envelope as a stock 8350 would be pretty tasty as a budget alternative to more-expensive Intel chips. Or you could go for savings up front and lower power by dialing it in to 4 ghz at a much lower vcore. Whatever floats your boat.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
Scholzpdx, I'm glad you've brought up some other use-cases for FX chips. I've always considered an 8 core CPU to be rather excessive for Google Chrome, but I tend to have less than 6 tabs open at any given point, because I'm an organization freak. Someone like my brother, who regularly leaves 40+ tabs open, might benefit. Do background tabs use as much CPU as foreground ones though?

I think you'll have a hard time selling 40fps as being "plenty" to some gamers on here, but for you, the FX chips are a good match and, as you've pointed out, they can free up money for other parts on a fixed budget compared with an i5 in some cases.

redzo also points out that FX chips are also good for VM boxes.

Intel has a nasty habit of disabling useful features on their lower-end chips. AFAIK, all of AMD's CPUs have a full feature set (including virtualization) and, with a proper motherboard, ECC too.

As for the power consumption argument, I feel in many cases it works out to near-negligible over the useful life of a CPU. Most of us here haven't (in the past) used a chip for more than 3-4 years. Things are starting to change with the slowed pace of advancement, but even so, I'm likely to sell my Ivy Bridge chip in a year or two, recoup most of the costs, and purchase a more modern platform to play with.

That said, I do look at the long-term cost of ownership, and it does factor into my decisions, among other things, and I put an i3 in my wife's computer after looking long and hard at AMD's APUs. Part of my decision was based on how quickly AMD chips depreciate vs Intel ones, and it's nice to know I can sell an i3 for nearly what I paid, several years later. Since we're in Florida right now, we pay for electricity twice when it comes to electronics due to the almost year-round need for air conditioning, which isn't true for those up north, whose CPUs double as heaters.

Beyond that, the games we regularly play are very CPU bottlenecked, and run badly on FX chips. She's not a power user with a bunch of tabs open, so a Pentium might have been perfectly fine too, while still providing the single-threaded performance that is necessary for our use-case. I appreciate that the Intel stock cooler is near-silent because we tend to choose studio apartments for the sake of keeping lifestyle inflation down, which means our PCs are near where we sleep, and noise is a concern. Size is also a concern, and our PCs are both in ITX cases and, despite both having HD7850's, can easily fit together in an average backpack and free up desk and floor space. We were also able to use $25 Antec power supplies in our builds, because even 380w is overkill. Our PCs together idle at less power than a single incandescent lightbulb, and both have integrated Wi-Fi and tons of USB3 ports, despite paying less than $60 for our motherboards.

With regards to hybrids:

[image]
 
Aug 11, 2008
10,451
642
126
Looks like you didn't even read what I wrote, just who responded to me. If you need to run down a list of ten titles, you need to run more often. That page has the top 100 most active Steam games.



If you have time to play video games for 4 hours a day, every single day, GPU power consumption is much more important than a CPU spread. That and you've got no other hobbies. $13 a year is nothing compared to a $70 upfront cost. Monitor panel type, number of monitors, GPU power consumption, PSU efficiency and how many light bulbs you have in your room are much more important. This might appear to be pedantic to you but that's how ridiculous it is that you think a 60w peak difference on one component is a big deal. It's not the end of the world and the vast majority of people truly don't care.

By your own example, if everyone bought a product because it would cost them less to use it over time, every single passenger vehicle on the road at this moment would be a hybrid or electric. Up front cost is everything. If it wasn't we wouldn't have credit cards or home loans to remove those barriers.

Well, if "up front cost is everything" then we would all be riding bicycles. Repeating a fallacious argument ad infinitum with increasing extreme and irrelevant red herrings thrown in does not somehow make it valid.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I stand by my claim that it's all overkill for 1080p gaming if that's what his TV supports. But how would you prove that cpu bottleneck is independent of resolution?

http://www.anandtech.com/show/7963/...ew-core-i7-4790-i5-4690-and-i3-4360-tested/10

http://www.anandtech.com/show/8864/amd-fx-8320e-cpu-review-the-other-95w-vishera/5


hmm, something is wrong here otherwise the 35W 4765T is the best CPU of all time :p

Ian should fix this ;)

http://www.anandtech.com/show/7963/...ew-core-i7-4790-i5-4690-and-i3-4360-tested/10
[benchmark chart from the review]
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
That's what you nitpick about? Really?

For productivity it doesn't really matter which one is chosen. For the gamer you need to get a discrete card either way.

760G has partial video decode for a number of formats (H.264) and no multi-monitor support. It's pretty crappy for desktop use.

hmm, something is wrong here otherwise the 35W 4765T is the best CPU of all time :p

Ian should fix this ;)

http://www.anandtech.com/show/7963/...ew-core-i7-4790-i5-4690-and-i3-4360-tested/10
[benchmark chart from the review]

Minimums have a large variation, and there is probably a fade-to-black/cutoff moment. Not terribly useful, as it may be a single large downward spike.
 

el etro

Golden Member
Jul 21, 2013
1,584
14
81
The only advantage of Piledriver processors over Sandy Bridge and Ivy Bridge processors is the higher number of cores: FX6 vs i3 and FX8 vs i5.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
http://www.techspot.com/review/943-best-value-desktop-cpu/page7.html

"When clocked at 4.6GHz, the FX-8320E used 63% more power on average in our application tests, 55% more when encoding and 27% more when gaming."

" There may be some great reasons to buy the FX-8320E, but we don't think it's the chip you want if you're after the best overall performance for the price."

"The FX-8320E holds its own in productivity apps: Microsoft Excel, Adobe Photoshop CC, After Effects CC, WinRAR or 7-zip, though in order to achieve that result it uses ~62% more power than the Core i3-4360."

. . . . Then after you take the power consumption figures into account, arguments for the FX-8320E begin to seem rather indefensible."

Settles the power argument once and for all I'd say. There is no reason to go FX.