Question The FX 8350 revisited. Good time to talk about it because reasons.


NostaSeronx

Diamond Member
Sep 18, 2011
3,683
1,218
136
You just typed a wall of text and numbers which basically said nothing tbh.
FX Bulldozer/Piledriver sucks, Opteron Bulldozer/Piledriver was better.
Practically no worthwhile gains whatsoever in games, apps, etc from going from 2.4/2.7 GHz to 4.7 GHz.
Buying for overclocking/higher clocks is useless and wasn't worth it.

While Sandy Bridge is the complete opposite, with a monstrous gain from 2.4 GHz to 4.4-5 GHz.
Buying for overclocking/higher clocks is useful and was worth it.

Move from Xeon 1260L to i7-2600K is a big deal in scaling. Since they were priced the same, the 2600K was the better deal at launch.
Move from Opteron 3280 to FX-8150 is a big joke in scaling. Since the Opterons were going to be priced lower, they were the better deal.
 
Last edited:

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
When Bulldozer came out I had a Phenom II 965BE system clocked to 4GHz, and at the time, Bulldozer really wasn't any faster in games. When I did build a new system in Jan of 2015, I ended up going with an Intel 4690K (which I recently replaced with a 3700X).

I never really considered Bulldozer or Piledriver to be "bad", I just felt they were kind of meh. They functioned fine, and I built systems for other people with them. I just never built one for myself.

I have wondered before: what if AMD had not cheaped out on the chip design itself, how much better could it have been? One of the reasons Bulldozer had such poor IPC was because the chips were laid out by computer software. There was no hand tuning done, which as I recall, is one of the reasons Piledriver was better; it had some tuning to its circuit layouts done. Yeah, ignore this part, I was wrong :)
 
Last edited:

NostaSeronx

Diamond Member
Sep 18, 2011
3,683
1,218
136
I have wondered before: what if AMD had not cheaped out on the chip design itself, how much better could it have been? One of the reasons Bulldozer had such poor IPC was because the chips were laid out by computer software. There was no hand tuning done, which as I recall, is one of the reasons Piledriver was better; it had some tuning to its circuit layouts done.
Bulldozer/Piledriver are both custom layout designs, just like Greyhound/Husky were. All four (five for GH+) are hand-tuned and have hand-tweaked flip-flops. Bulldozer has SEFF (soft-edge flip-flops) while Piledriver has HEFF (hard-edge flip-flops).
 

AtenRa

Lifer
Feb 2, 2009
14,000
3,357
136
Bobcat/Jaguar were synthesized, Bulldozer/Piledriver were custom.

I don't remember if Steamroller and Excavator were also synthesized or custom.
 

NostaSeronx

Diamond Member
Sep 18, 2011
3,683
1,218
136
I don't remember if Steamroller and Excavator were also synthesized or custom.
Steamroller/Excavator were semi-custom: standard track macros with hand-placed critical paths and automated (synthesized) non-critical paths.

Post-Steamroller was canned and Zen development ramped up. That led to SteamrollerB/Excavator utilizing the same tools that would eventually create Zen, which is also a semi-custom design. Critical paths are hand-placed in Zen; everything else is synthesized/automated.
 
Last edited:

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
Still, i5s had a long life; I used my 2500K from 2011 to 2017, when I got a 1700. The 8C FXs only started to catch up around 2015-2016.
No one should use an FX today, no matter the performance.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,357
20,173
146
No one should use an FX today, no matter the performance.

What about that old i5? Should they be using that? If the answer is no, you are wrong twice. I have clients on systems that old that are perfectly happy with them. I usually just have to replace a dying HDD with an SSD or a bad stick of ram and they are back to doing what they always do.
 

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136

What about that old i5? Should they be using that? If the answer is no, you are wrong twice. I have clients on systems that old that are perfectly happy with them. I usually just have to replace a dying HDD with an SSD or a bad stick of ram and they are back to doing what they always do.

Well, if someone wants to use a 4/4 CPU in 2020, that's his problem; there is not much difference between using an 8-year-old 2500K and a 9100F/3200G.

Regardless of that, the FXs have really bad power efficiency and thermals, and power efficiency alone is a good reason to replace them ASAP.
 

Spjut

Senior member
Apr 9, 2011
928
149
106

What about that old i5? Should they be using that? If the answer is no, you are wrong twice. I have clients on systems that old that are perfectly happy with them. I usually just have to replace a dying HDD with an SSD or a bad stick of ram and they are back to doing what they always do.

I hate the mindset of PC hardware being badmouthed just because it's old on paper, and sadly, you always see that mindset on various PC forums. That's what gives the PC its bad reputation for having to be upgraded constantly.

People should be praising the fact that even the 12-year-old i7 920 can play the latest games and still do so better than the PS4, likewise for the Bulldozer lineup.
 

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
I hate the mindset of PC hardware being badmouthed just because it's old on paper, and sadly, you always see that mindset on various PC forums. That's what gives the PC its bad reputation for having to be upgraded constantly.

People should be praising that even the 12 year old i7 920 can play the latest games and still do so better than the PS4, likewise for the Bulldozer lineup.

It's not about replacing it because it's old; it's because the power-to-performance ratio is trash... this will also apply to Intel 10th gen owners in a few years, and it already applies to Intel HEDT.
 
Last edited:

blckgrffn

Diamond Member
May 1, 2003
9,111
3,030
136
www.teamjuchems.com
It's not about replacing it because it's old; it's because the power-to-performance ratio is trash... this will also apply to Intel 10th gen owners in a few years, and it already applies to Intel HEDT.

Trash? I mean, we should drop several hundred dollars on upgrades because our PC might use $30-$40 more in power over the course of an entire year if used heavily? :eek::cool::tearsofjoy:

At this rate my Dad's 5930k setup is going to be going... indefinitely? Until the motherboard dies? I just dropped an E12-powered 2TB NVMe drive in that thing and wow, it flies. That's an Intel HEDT platform that I should dustbin pronto? :D

Truth is that if you are packing an i7 or a solid i5 or better and are just using your PC like a lot of people are - MOBAs, Fortnite/Apex/CSGO - then you are fine. DAPUNISHER's meme on "why not 144hz" definitely applies at that point.

I sold a Phenom I system with a GTX 650 in it last spring. Really cheap, but I found a motherboard in my tubs with an old Athlon II in it. I decided to find the fastest Phenom it supported ($8 from Korea) and 8GB of AMD-only DDR2 (another $11 shipped from China) and put it in a new case with a new old-stock CX430 and a 180GB Intel SSD, all for like $100. The Phenom I was sinfully slow even with a quad core, but overall it was still good enough to play a lot of games. The kid I sold it to was super pumped. For him, with his $100 that was mostly $10s, the power usage to performance ratio was about perfect, I think :p
 

ondma

Platinum Member
Mar 18, 2018
2,718
1,278
136
It's not about replacing it because it's old; it's because the power-to-performance ratio is trash... this will also apply to Intel 10th gen owners in a few years, and it already applies to Intel HEDT.
It depends on the usage. For light, day to day usage, even the FX will not use that much power, and be fast enough. For encoding or gaming, then, yes, a platform upgrade is probably warranted.
 

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
Trash? I mean, we should drop several hundred dollars on upgrades because our PC might use $30-$40 more in power over the course of an entire year if used heavily? :eek::cool::tearsofjoy:

At this rate my Dad's 5930k setup is going to be going... indefinitely? Until the motherboard dies? I just dropped an E12 powered 2TB NVME drive in that thing and wow, it flies. That's an Intel HEDT platform that I should dustbin pronto? :D

Truth is that if you are packing an i7 or a solid i5 or better and are just using your PC like a lot of people are - Mobas, Fortnite/Apex/CSGO then you are fine. Dapunishers meme on "why not 144hz" definitely applies at that point.

I sold a Phenom I system with a GTX 650 in it last spring. Really cheap, but I found a motherboard in my tubs with an old Athlon 2 in it. I decided to find the fastest Phenom it supported ($8 from Korea) and 8GB of AMD only DDR2 (another $11 shipped from China) and put it in a new case with a new old CX430 and a 180GB Intel SSD all for like $100. The Phenom I was sinfully slow even with a quad core, but overall it was still good enough to play a lot of games. The kid I sold it to was super pumped. For him, with his $100 that was mostly $10s, the power usage to performance ratio was about perfect I think :p

Several hundred dollars? Come on, a Ryzen 3 3100 + B450 + 16GB would devastate an FX-8350 for $250. And that +100W at idle is going to cost over $40 a year.

Better yet, a 3400G/3200G/2200G would be a far better option for non-gaming, as you get rid of an unnecessary dGPU.

I have nothing against using old hardware, but there are good and cheap options today.
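For the curious, the electricity claim is easy to sanity-check. A back-of-envelope sketch in Python; the $/kWh rate and hours of daily use are my own illustrative assumptions, not figures from the thread:

```python
# Rough annual cost of an extra 100 W of draw. The electricity rate
# (0.13 USD/kWh) and the usage hours are illustrative assumptions.
def annual_cost_usd(extra_watts, hours_per_day, usd_per_kwh=0.13):
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

print(round(annual_cost_usd(100, 8), 2))   # about $38/year at 8 h/day
print(round(annual_cost_usd(100, 24), 2))  # about $114/year if always on
```

So "over $40 a year" is roughly right for heavy daily use at typical US rates, and considerably more in regions with expensive electricity.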
 
Last edited:

blckgrffn

Diamond Member
May 1, 2003
9,111
3,030
136
www.teamjuchems.com
Several hundred dollars? Come on, a Ryzen 3 3100 + B450 + 16GB would devastate an FX-8350 for $250 with motherboard and RAM. And that +100W at idle is going to cost over $40 a year.

Better yet, a 3400G/3200G/2200G would be a far better option for non-gaming, as you get rid of an unnecessary dGPU.

:rolleyes: The power thing is always funny to me. That's assuming we don't use sleep or ever shut it off, right? I mean, I could spend $2 on an LED bulb in my house, replace an incandescent and save an equivalent wattage over time... or I should upgrade my PC to save on power. Since my time is not free... I think I know which I would choose.

And the time and expertise to swap components is not free and non-trivial either.

If someone wanted to go that route I certainly wouldn't dissuade them, but I wouldn't be enthusiastic about it either. To get an upgrade that is "worth the trouble" I am firmly thinking a 3600 is a minimum step. And yeah, that's "hundreds of dollars" either way, because it's multiples of 100. And if I am using the same PSU that I bought when I got the 8xxx setup, that needs replacing too at this point, because I am not trusting new hardware to a PSU that old... so on and so forth. Your personal risk tolerance may vary.

BUT... if they said they were going to flip that sweet, sweet FX on eBay for $100+! I mean, let's go :D (but that's not throwing it away)
 

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
:rolleyes: The power thing is always funny to me. That's assuming we don't use sleep or ever shut it off, right? I mean, I could spend $2 on an LED bulb in my house, replace an incandescent and save an equivalent wattage over time... or I should upgrade my PC to save on power. Since my time is not free... I think I know which I would choose.

And the time and expertise to swap components is not free and non-trivial either.

If someone wanted to go that route I certainly wouldn't dissuade them, but I wouldn't be enthusiastic about it either. To get an upgrade that is "worth the trouble" I am firmly thinking a 3600 is a minimum step. And yeah, that's "hundreds of dollars" either way, because it's multiples of 100. And if I am using the same PSU that I bought when I got the 8xxx setup, that needs replacing too at this point, because I am not trusting new hardware to a PSU that old... so on and so forth. Your personal risk tolerance may vary.

BUT... if they said they were going to flip that sweet, sweet FX on eBay for $100+! I mean, let's go :D (but that's not throwing it away)

Generally speaking, it is a personal decision, I do understand that. I just don't see any real reason to keep using an FX other than not wanting to spend the extra money, and you are already throwing some of that away every month.

And about time, that's another issue. For example, my neighbour had a G2020; the motherboard died, and he got a used Athlon II X4 since he just does web browsing. After 2 months he started to have issues again, not to mention I had to lend him an HD5450 because W10 does not work OK with a 785G IGP... After that I finally convinced him to get a 200GE/A320/8GB: no issues for over a year now.
 

chrisjames61

Senior member
Dec 31, 2013
721
446
136
FX Bulldozer/Piledriver sucks, Opteron Bulldozer/Piledriver was better.
Practically no worthwhile gains whatsoever in games, apps, etc from going from 2.4/2.7 GHz to 4.7 GHz.
Buying for overclocking/higher clocks is useless and wasn't worth it.

While Sandy Bridge is the complete opposite, with a monstrous gain from 2.4 GHz to 4.4-5 GHz.
Buying for overclocking/higher clocks is useful and was worth it.

Move from Xeon 1260L to i7-2600K is a big deal in scaling. Since they were priced the same, the 2600K was the better deal at launch.
Move from Opteron 3280 to FX-8150 is a big joke in scaling. Since the Opterons were going to be priced lower, they were the better deal.

Oh please. Overclocking "wasn't worth it"? On chips that routinely could be overclocked a full GHz or more? Or saying there is no scaling between a 2.4 GHz Steamroller chip and 4.7? Lol! That is absurd. As far as the comparisons to the Intel chips you trotted out: who cares? The thread is about 8350s, so quit with the Intel thread crap.
 
  • Like
Reactions: The red spirit

chrisjames61

Senior member
Dec 31, 2013
721
446
136
No one should use an FX today, no matter the performance.
That is ridiculous. I am sure there are people all over the world (South America, many parts of Asia, Russia, China and Eastern Europe, the Middle East, Africa) where 8350s and even older gear are what they use, because that is all they can afford. Not everyone lives in North America or Western Europe and has much disposable income to draw from.
 

NostaSeronx

Diamond Member
Sep 18, 2011
3,683
1,218
136
Oh please. Overclocking "wasn't worth it"? On chips that routinely could be overclocked a full GHz or more? Or saying there is no scaling between a 2.4 GHz Steamroller chip and 4.7? Lol! That is absurd. As far as the comparisons to the Intel chips you trotted out: who cares? The thread is about 8350s, so quit with the Intel thread crap.
Go backwards, since going up has such bad scaling.

Opteron 3280/3380 has better perf/watt/cost for AMD compared to FX-8100+/FX-8300+. Sandy Bridge loses performance 3 times faster than Bulldozer when TDP/clocks are pushed down.

i7-2600K 95W -> 1260L 45W => 3x Bulldozer's loss within 50W TDP difference.
FX-8150 125W -> Opteron 3280 65W => Baseline(1/3rd SandyB's) loss within 60W TDP difference.

In a power-constrained environment Bulldozer is better than Sandy Bridge. The Opteron models are very much better than the FX models, since they sit in Bulldozer/Piledriver's best freq/volt zone. Bulldozer is a mobile core => big caches, wide OoO, better utilization of units (compared to K8/Greyhound, which had linear static decode x -> scheduler x*), etc.

* Decode A -> Retire A(24-entry) -> Scheduler A(8-entry) -> ALU or AGU Cluster A
Decode B -> Retire B(24-entry) -> Scheduler B(8-entry) -> ALU or AGU Cluster B
Decode C -> Retire C(24-entry) -> Scheduler C(8-entry) -> ALU or AGU Cluster C
^-- Three single issue pipelines
Whereas Bulldozer:
Decode ABCD goes forward to Retire A(128-entry) and feeds into Scheduler A(40-entry) -> EX0/EX1/AGLU0/AGLU1
^-- Single four issue pipeline
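The utilization argument above can be illustrated with a toy model (my own sketch, not anything from AMD documentation): three statically partitioned single-issue lanes versus one unified four-issue scheduler, on a cycle where the ready ops happen to bunch up in one lane. The workload numbers are made up.

```python
# Toy issue-width model. K8/Greyhound-style backend: three lanes, each can
# issue at most 1 op/cycle, and only from its own queue. Bulldozer-style
# backend: one shared scheduler that can issue up to 4 ready ops/cycle,
# regardless of which queue they came from.
def issued_static(ready_per_lane):
    # Each lane is capped at 1 op/cycle from its own private queue.
    return sum(min(r, 1) for r in ready_per_lane)

def issued_unified(ready_per_lane, width=4):
    # A shared scheduler can pick any ready op, up to `width` per cycle.
    return min(sum(ready_per_lane), width)

# Bursty cycle: lane A has 3 ready ops, lanes B and C are stalled.
burst = [3, 0, 0]
print(issued_static(burst))   # 1 -> two issue slots go to waste
print(issued_unified(burst))  # 3 -> the shared scheduler soaks up the burst
```

With an evenly spread workload both designs do fine; the unified scheduler only pulls ahead when readiness is uneven, which is the "better utilization of units" point.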
 
Last edited:

chrisjames61

Senior member
Dec 31, 2013
721
446
136
Go backwards, since going up has such bad scaling.

Opteron 3280/3380 has better perf/watt/cost for AMD compared to FX-8100+/FX-8300+. Sandy Bridge loses performance 3 times faster than Bulldozer when TDP/clocks are pushed down.

i7-2600K 95W -> 1260L 45W => 3x Bulldozer's loss within 50W TDP difference.
FX-8150 125W -> Opteron 3280 65W => Baseline(1/3rd SandyB's) loss within 60W TDP difference.

In a power-constrained environment Bulldozer is better than Sandy Bridge. The Opteron models are very much better than the FX models, since they sit in Bulldozer/Piledriver's best freq/volt zone. Bulldozer is a mobile core => big caches, wide OoO, better utilization of units (compared to K8/Greyhound, which had linear static decode x -> scheduler x*), etc.

* Decode A -> Retire A(24-entry) -> Scheduler A(8-entry) -> ALU or AGU Cluster A
Decode B -> Retire B(24-entry) -> Scheduler B(8-entry) -> ALU or AGU Cluster B
Decode C -> Retire C(24-entry) -> Scheduler C(8-entry) -> ALU or AGU Cluster C
^-- Three single issue pipelines
Whereas Bulldozer:
Decode ABCD goes forward to Retire A(128-entry) and feeds into Scheduler A(40-entry) -> EX0/EX1/AGLU0/AGLU1
^-- Single four issue pipeline
All you do is trot out numbers that mean nothing and prove nothing. I have experienced the opposite, and with a quick visit to YouTube you can see people actually playing games with these CPUs stock vs. overclocked.
 

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
That is ridiculous. I am sure there are people all over the world (South America, many parts of Asia, Russia, China and Eastern Europe, the Middle East, Africa) where 8350s and even older gear are what they use, because that is all they can afford. Not everyone lives in North America or Western Europe and has much disposable income to draw from.

We are talking about people who can and don't. If you don't have money, you are going to keep using what you have, regardless of performance or anything else.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,357
20,173
146
We are talking about people who can and don't. If you don't have money, you are going to keep using what you have, regardless of performance or anything else.
Or they simply do not need anything more. When my son was young, he and his friends all played on console. I went to console gaming so we could co-op together. Subsequently, I used an AMD 3800+ dual core on an MSI K8N-SLI for nearly the next 7 years. It handled what I was doing with it, with just a storage upgrade later in its life. I could certainly have afforded to build new systems; there was just no point. The argument about electricity costs is a silly one. I can get that whole year's difference in power cost by skipping one night, just one night, of eating out.

I referred to perspective, and that is what keeps us from agreeing here. Yours is the entrenched dogma that utterly dominates forum discussions. Mine is, I hope, evolving beyond those talking-point memes.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,357
20,173
146
This channel spends a lot of time with the FX CPUs.


He tunes them well, and juices them for most of what they have to give, without going bananas. Once he has the 6300 and 8350 tuned and compared, you can see the 6300 struggles more with frame pacing. The 8350 is getting it done remarkably well, but those VRM temps are no bueno hour after hour; I fried a board like that. I was too lazy to actively cool them, and the Hyper 212 was certainly no help there. Case cooling was excellent, THOR v2 full tower, but it was too much for too long. Total War: Warhammer marathons will destroy the unworthy. The MSI 970 Gaming was definitely unworthy. A noteworthy negative point on the CPU though, as many coupled it with boards that claimed to support it but lacked the power delivery necessary to do so under sustained loads.

He has another video on the 8350 where he is spot on too. We go back to the generic testing that fails to convey the true gameplay experience. Who cares what the FPS is in the sections that do not hammer the CPU? But there are the bar graphs, showing you how great the old i3 was in Witcher 3. Hit an area with a lot more A.I. and other stressors and it falls on its face. The 8350 really was better than all the haters claimed. Odd how I do not see those same people roasting Intel for the same issues right now. Remember what Anand said: there are no bad CPUs, only bad prices. The 8350 was a good value, even factoring in cooling and board costs, at the expense of power and thermals.
 

ondma

Platinum Member
Mar 18, 2018
2,718
1,278
136
This channel spends a lot of time with the FX CPUs.


He tunes them well, and juices them for most of what they have to give, without going bananas. Once he has the 6300 and 8350 tuned and compared, you can see the 6300 struggles more with frame pacing. The 8350 is getting it done remarkably well, but those VRM temps are no bueno hour after hour; I fried a board like that. I was too lazy to actively cool them, and the Hyper 212 was certainly no help there. Case cooling was excellent, THOR v2 full tower, but it was too much for too long. Total War: Warhammer marathons will destroy the unworthy. The MSI 970 Gaming was definitely unworthy. A noteworthy negative point on the CPU though, as many coupled it with boards that claimed to support it but lacked the power delivery necessary to do so under sustained loads.

He has another video on the 8350 where he is spot on too. We go back to the generic testing that fails to convey the true gameplay experience. Who cares what the FPS is in the sections that do not hammer the CPU? But there are the bar graphs, showing you how great the old i3 was in Witcher 3. Hit an area with a lot more A.I. and other stressors and it falls on its face. The 8350 really was better than all the haters claimed. Odd how I do not see those same people roasting Intel for the same issues right now. Remember what Anand said: there are no bad CPUs, only bad prices. The 8350 was a good value, even factoring in cooling and board costs, at the expense of power and thermals.
Seriously??? They do it in thread after thread.