Question The FX 8350 revisited. Good time to talk about it because reasons.

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,465
20,486
146
PhilsComputerLab tested it.


Much like Intel right now, it was knocked for needing more power, running hotter, worse performance per watt, and a dead platform. You know, I had an 8320e and an 8350, and never understood what all the hate was about. Well, I did, and do, but let's leave that at wink wink, nudge nudge. Yeah, yeah, Intel is at least competitive in some areas whereas the FX was not, losing in both productivity and gaming. And yes, it was a more severe performance gap. We will get to that.

All that said, if you bought one like I did, paired it with a FreeSync display, and overclocked it much like Phil did, all the negativity felt like fake drama. I had no issue with gaming or anything else. I am guilty of recommending against an Intel build lately for all those reasons, but mostly because of the inferior core count at a given price point, until this latest generation. I think perhaps the core count argument is the only one that will hold up. Yeah, quiet computing and other niche needs play a role, but overall I have been unfair to Intel the last couple of years.

Phil plays the games, he does not just run a canned bench over and over. He has done this with old kit, and he claims the 8350 has held up better than the Ivy Bridge i5 from that period. A very hollow victory, since back when we would all have been using them, the i5 was easily superior from an empirical standpoint.

Also, prices on those old FX chips are terrible now, as he pointed out. Cliff's Notes of the retro review: AMD fine wine tech; makes no sense unless you already have one; it did eventually meet the claims fanboys made back in the day that the cores would make it better.

Now before you start spamming me with other reviews that show the i5 is still superior, I have seen them. The problem is they are not about the gameplay experience, just bar graphs of canned benchies, 60-second runs, that kind of stuff. The 8350 is now smoother in some newer titles when you actually play them. I have no trouble accepting that; I have used 4/4 i5s overclocked too. Furthermore, it took 8 years, plus a new, more resource-hungry OS and versions of said OS, to get there. Talk about too little, too late.

Weird thread, right? I state the i5 was better when it mattered, and that the FX still makes no sense because the market is bonkers. It also needed a board with great VRMs or it was a ticking time bomb; I know that first hand.

But here is where all this is going, besides an interesting retro review I wanted to share: perhaps we all need to take a deep breath sometimes and chill out. We are always trying to make certain people seeking advice get the best bang for buck. But good enough bang for buck, that fits the budget and ticks the boxes they need ticked, is good too. I will keep pointing out the better value, but I will also try to understand that good enough is good enough. I will also try to put my crystal ball down when offering advice. Man, I think that is the hardest one to do.

TL;DR?

I drank too much coffee.
I feel bad about trashing other people's choices, because some perspective has been found.

Thanks, and flame suit on. :p

Note: Some critiques, e.g. the dead platform, were obviously made later in its life cycle.


 

DrMrLordX

Lifer
Apr 27, 2000
21,624
10,833
136
Too little too late for Piledriver. Had AMD been able to launch something like the 8320e in 2011, CON cores might have done pretty well. It took them 4 years of iterating on the design and the process to get to that point (and it took IBM engineers joining GF to get 32nm up to snuff).

It should also be pointed out that the longevity of old i7s has been much higher than that of old i5s. If you were to compare FX9590 or 9370 against the i7s of the day, I'm guessing the i7s would still win in modern software.
 

therealmongo

Member
Jul 5, 2019
109
247
116
Had the same CPU on a DFI board at the time. The DFI is still alive; I was looking to pair it with a Phenom II 6-core, but that never got supported.

Also watched that review the other night; I was amazed to see how well it's doing in 2020

:p

** EDIT **
Memory got crossed, it was paired with an Asus Crosshair V Formula; the DFI was a generation before!
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,465
20,486
146
Too little too late for Piledriver. Had AMD been able to launch something like the 8320e in 2011, CON cores might have done pretty well. It took them 4 years of iterating on the design and the process to get to that point (and it took IBM engineers joining GF to get 32nm up to snuff).

It should also be pointed out that the longevity of old i7s has been much higher than that of old i5s. If you were to compare FX9590 or 9370 against the i7s of the day, I'm guessing the i7s would still win in modern software.
Absolutely.

I am using this more as an opportunity to reflect on how overblown everything is in the forums. I happily used FX 8-cores when, as far as forums were concerned, they were garbage. For the latter part of its life, the 8350 was priced against the i3 after free games and/or discounts. At the time, around 2015, people were saying to get the i3 for all the obvious reasons. It was not the better pick in many ways then, and it aged far worse. And replacing it with a Haswell i7 never became a good value, as they retained far too much of their launch price. Heck, I can build an AM4 system for what @Meghan54 sold that old Haswell i7 for a couple of weeks back. We make arguments that, regardless of how certain we are, often rely on our crystal ball. If you buy for what is known, the 6/6 i5 is a good value. Something I had trouble accepting until perspective on it hit me.
 

NostaSeronx

Diamond Member
Sep 18, 2011
3,686
1,221
136
The best Family 15h models were the AM3 Opteron parts.
;; AMD Opteron 3280 (for Bulldozer) and AMD Opteron 3380 (for Piledriver)

Generally operating at the efficiency sweet spot, well above the 2 GHz bottleneck on feeding Fiji. Whereas the FX-8150/8350 were pushed for performance for no real reason, as interconnect delays killed any frequency benefit.
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
231
106
I was annoyed with the FX series as it didn't support my good quality AM3 board so I gave away my Thuban to a relative and bought a 1150 mobo instead. Later on, I remember seeing 95w 8300s regularly for around ~$100, though. Great deal for the money, no question about that. But yeah, too little too late.

AMD is no longer a budget brand, though.
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
I used the FX8350 along with a Core i7 3770K, and in most games I was GPU limited even back in 2013-2014.
There were games that required high single-thread performance, and that's where the FX was lagging behind the i7, but in games like BF4 MT and in Mantle/DX12 and Vulkan games the FX is doing admirably for its age.
Also don't forget that the Core i7 was 50% more expensive back in 2012 when the FX8350 launched.
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
I went with a cheap Sandy Bridge setup around 2012 because I could get away with using any of the CPUs on a cheap motherboard with the free cooler. Also, ST performance at the level of an Intel CPU clocked 1 GHz lower made the FX a weird proposition back then, but I'm sure I could've managed with a 4 GHz FX just fine.

It's possible that in some current games the FX shows a clear advantage over 4c/4t Intel, but if you look at BF5 it still can't maintain 60 FPS, so the value of that is questionable right now... and for the used market, as stated, it's not a good option. Perhaps too little, too late?

It might look better now than it did due to how software evolved, but it still doesn't really make much sense or justify its existence back then; the big changes from K10.5 to Bulldozer often resulted in the same or lower performance at the time...
 

zir_blazer

Golden Member
Jun 6, 2013
1,164
406
136
The thing about Bulldozer and Piledriver is that it took YEARS of hype waiting for the next big architectural jump from AMD since the K10 Barcelona (which was mediocre at launch due to the TLB bug, but a worthy competitor in its 45nm iteration), and instead we got something that couldn't consistently beat what it was intended to replace. How are you not gonna be disappointed after years of waiting? It was precisely the same thing that happened when the Pentium 4 Willamette launched: it did poorly against the Pentium 3 Coppermine.
Bulldozer was known for being slower per core than Phenom II. The PIIX6 Thuban, released in April 2010, was faster than an October 2011 4M/8T Bulldozer in ST and not much slower in MT. You had to wait one more year for Piledriver, which got ST back on par with Thuban. Basically, you had a node shrink and two and a half years of waiting just to get something equivalent to a hypothetical 8C Thuban, whereas Intel already moved from Nehalem to Sandy Bridge and then to Ivy Bridge. An interesting detail is that AMD had a 32nm shrink of the K10 in the form of Llano, which was the first APU. I don't know if it was viable to simply continue pushing the K10 on 32nm, yet I have no reason to believe that it would have been worse than 45nm Deneb/Thuban, and that already makes it better than Bulldozer. Note that a modern review of those processors may give misleading results, since later Windows CPU scheduler updates and compiler optimizations may make them appear to have aged better than the K10s, but that was not true at launch.
The only thing that can't be argued is that at least AM3+ motherboards and processors were priced decently enough to be price/performance competitive with Intel offerings. But otherwise, they sucked for anything not budget. And for anyone waiting since AMD got Conroed in 2006 for AMD to have a product with a fighting chance to take back the performance crown, it was ragequit worthy.
 

EXCellR8

Diamond Member
Sep 1, 2010
3,982
839
136
Owned 2 of them, but since one was used I didn't get a 2nd $35 settlement check lol

Anyway, the 8350 was a pretty good, not excellent, but still an OK buy back in 2013/2014. Never had many issues gaming on it, but in heavier MT tasking you could definitely see its weaknesses. I could have upgraded to an FX 9000 chip, but I actually held on to the 8350 for my AMD machine until Ryzen dropped in early 2017. I had built two Intel PCs (using a 4790K & 6600K) around 2016 to tide me over, which were leaps and bounds ahead of the aforementioned AMD chip.

I believe I sold my remaining 8350 alongside the Crosshair V Formula I had been using since 2013.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,675
3,797
136
I bought an Ivy i5 and overclocked it rather than bother with the 8350. At the time it was the right choice. The 8350 has aged better in many ways though. Even overclocked, I saw performance struggle at times in BF1. I upgraded to the 2600X, and with the same GPU performance improved noticeably.
 

blckgrffn

Diamond Member
May 1, 2003
9,123
3,061
136
www.teamjuchems.com
Well, if anything this made me feel less "guilty" about all the 8xxx AMD builds I put together for friends and families when those chips cost about what an i3 did. Thanks for sharing :)

I see your point - I might turn up my nose at a new Intel i9 build at this point because there are fine details I don't like, but 8 years from now when that rig is finally getting to retirement, who cares? Buy what you can afford, and be thankful we live in a time when all this fun can be had! :)
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,465
20,486
146
Well, if anything this made me feel less "guilty" about all the 8xxx AMD builds I put together for friends and families when those chips cost about what an i3 did. Thanks for sharing :)

I see your point - I might turn up my nose at a new Intel i9 build at this point because there are fine details I don't like, but 8 years from now when that rig is finally getting to retirement, who cares? Buy what you can afford, and be thankful we live in a time when all this fun can be had! :)
I just saw a meme today comparing 1990 gamers to 2020 gamers. The nineties guy says, "I'm just glad it works." The 2020 guy says, "Why I no getting 144 FPS in every game?!" Perspective makes the difference, I surmise.
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
I used to have a Thuban X6 that overclocked quite well. I had a friend with an 8350 at the time. We used to play the same games using the same cards, and I don't remember an instance where he was ever able to noticeably outperform my setup in any way. I later picked up a used 3770K setup and, with just a mild overclock, the differences were substantial in anything that wasn't heavily GPU bottlenecked. I'll never fully understand why AMD didn't just iterate on Thuban one more time on a smaller node while working the kinks out of the construction cores some more. A product based on the Husky cores in Llano, in an 8-core module on 32nm with an 8MB L3, would have been relevant through the Haswell era, even without the later revisions of SSE4 and the new AVX extensions. The time that it would have bought them would have allowed them to fix a whole lot that was wrong at the job site...
 

EXCellR8

Diamond Member
Sep 1, 2010
3,982
839
136
I had a great OC board for the FX 8350 but it still didn't stack up against the Intel chips at the time; once Haswell hit the market the waning FX series really took a hit.

Did anyone use an AM3+ ITX board? Was there even such a thing for the mainstream?
 

NostaSeronx

Diamond Member
Sep 18, 2011
3,686
1,221
136
I'll never fully understand why AMD didn't just iterate on Thuban one more time on a smaller node while working the kinks out of the construction cores some more. A product based on the Husky cores in Llano in an 8 core module on 32nm with an 8MB L3 would have been relevant through the Haswell era, even without the later revisions of SSE4 and the new AVX extensions. The time that it would have bought them would have allowed them to fix a whole lot that was wrong at the job site...
Husky had issues and wasn't as advanced as Bulldozer, which is why they launched with an 8-core Bulldozer rather than an 8-core Husky. Bulldozer was still the better option to have launched with, since CMT compute units have multiple possible advancements that a CMP compute cluster can't have.
 

Avalon

Diamond Member
Jul 16, 2001
7,565
150
106
Had the same CPU on a DFI board at the time. The DFI is still alive; I was looking to pair it with a Phenom II 6-core, but that never got supported.

Also watched that review the other night; I was amazed to see how well it's doing in 2020

:p

** EDIT **
Memory got crossed, it was paired with an Asus Crosshair V Formula; the DFI was a generation before!

Man, I sure miss my DFI lanparty boards. Never had a bulldozer, though. Went all in on Intel at the time.
 

chrisjames61

Senior member
Dec 31, 2013
721
446
136
The best Family 15h models were the AM3 Opteron parts.
;; AMD Opteron 3280 (for Bulldozer) and AMD Opteron 3380 (for Piledriver)

Generally operating at the efficiency sweetspot, well above the 2 GHz bottleneck on feeding Fiji. Whereas >FX-8150/8350 were pushed for performance, for no real reason as interconnect delays killed any frequency benefit.


How exactly are these CPUs better than an 8350? They are low frequency and have a locked multiplier. I'm not getting why you think these are good parts for an enthusiast.
 

chrisjames61

Senior member
Dec 31, 2013
721
446
136
I had a great OC board for the FX 8350 but it still didn't stack up against the Intel chips at the time; once Haswell hit the market the waning FX series really took a hit.

Did anyone use an AM3+ ITX board? Was there even such a thing for the mainstream?


There were a couple, maybe three, ITX AM3+ boards. An ITX board just couldn't handle 125 W TDP CPUs.
 

NostaSeronx

Diamond Member
Sep 18, 2011
3,686
1,221
136
How exactly are these cpu's better than an 8350? They are of a low frequency have a locked multiplier. I'm not getting at why you think these are good parts for an enthusiast?
If they had been as popular as the G34 processors, then the AM3+ octo-core Opterons would probably have been cheaper than the FX-8300 ($99) in 2014+.

Budget enthusiasts are a thing.
Cheap processor/cheap mobo/cheap ram/cheap HDD/cheap OS(linux?)/cheap power supply/cheap GPU.

Going from an Opteron 3280 to the FX-8150 isn't really all that good given the TDP/frequency increase, making the FX models not really worth the effort from a performance perspective, since most of the cheap heatsinks at the time were rated for 65W and lower.

Op-3280 -> FX-8150 => baseline frequency scaling of ST workloads
1260L -> 2600K => At best 3x the frequency scaling on ST workloads
Gain from the lower-TDP SKU to the higher-TDP SKU: SNB has good scaling where BLD doesn't. From a performance aspect, perf/OC enthusiasts would get more perf from Sandy Bridge.

imho, FX should have used the K9(design w/ trace cache) architecture. While, the Opteron/A-series should have used a more aggressively power/area/cost-focused K10(Bulldozer architecture Krypton#). [[ K9 @ 5+ GHz, K10/Bulldozer @ 2+ GHz, Geode II/Bobcat @ 1+ GHz; highest client-server performance to lowest client-server performance(65nm project targets btw) // https://images.anandtech.com/reviews/cpu/amd/analystday/slide33.png <== K9 is the actual successor to those cores and https://images.anandtech.com/reviews/cpu/amd/analystday/slide42.png <== K10/Bulldozer is the actual successor to these cores ]]
 

chrisjames61

Senior member
Dec 31, 2013
721
446
136
If they had been as popular as the G34 processors, then the AM3+ octo-core Opterons would probably have been cheaper than the FX-8300 ($99) in 2014+.

Budget enthusiasts are a thing.
Cheap processor/cheap mobo/cheap ram/cheap HDD/cheap OS(linux?)/cheap power supply/cheap GPU.

Going from an Opteron 3280 to the FX-8150 isn't really all that good given the TDP/frequency increase, making the FX models not really worth the effort from a performance perspective, since most of the cheap heatsinks at the time were rated for 65W and lower.

Op-3280 -> FX-8150 => baseline frequency scaling of ST workloads
1260L -> 2600K => At best 3x the frequency scaling on ST workloads
Gain from the lower-TDP SKU to the higher-TDP SKU: SNB has good scaling where BLD doesn't. From a performance aspect, perf/OC enthusiasts would get more perf from Sandy Bridge.

imho, FX should have used the K9(design w/ trace cache) architecture. While, the Opteron/A-series should have used a more aggressively power/area/cost-focused K10(Bulldozer architecture Krypton#). [[ K9 @ 5+ GHz, K10/Bulldozer @ 2+ GHz, Geode II/Bobcat @ 1+ GHz; highest client-server performance to lowest client-server performance(65nm project targets btw) // https://images.anandtech.com/reviews/cpu/amd/analystday/slide33.png <== K9 is the actual successor to those cores and https://images.anandtech.com/reviews/cpu/amd/analystday/slide42.png <== K10/Bulldozer is the actual successor to these cores ]]

You just typed a wall of text and numbers which basically said nothing, tbh.