
Question: The FX 8350 revisited. Good time to talk about it because reasons.


Furious_Styles

Senior member
Jan 17, 2019
I fixed a couple, but I still have a 6(3?)00 here where the kid was having trouble, so he decided to follow some Internet guides, which included reseating his CPU (one that had been working fine for years). He "did" that, but when he put the CPU back, the lever was down - not up - so the chip was sitting mostly on top of the socket. He then proceeded to put the heatsink on, and as he was tightening it down the HS slid to the side, so a tremendous number of the pins got squished.

I was really optimistic I could make it work when he said bent pins. A couple of hours and a few failed POSTs later, I decided he was right and that a new keychain dongle had been created instead ;)
Here we go, worst one I've attempted so far.
Attachments: DSC03683.JPG, DSC03682.JPG
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
I did find this:


RX5700
That is the video I posted in the OP :D

The only game in common looks to be BFV. Phil is MP; the other vid is Campaign or Benchmark? Interestingly, MP is harder on the CPU, and the 3080 is obviously a faster card, but the avg fps look to be low 80s for both. We might be looking at the expected performance improvement using AMD, but it not being apples-to-apples benchmarks = ?
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
I paired the 8350 with a 2070 Super and 2x8GB DDR3 1333 CL9, everything stock - 4.2GHz boost - for starters. I have 2x4GB 1866 I will use later; the 1333 does 1600 by simply changing the speed setting, so that will be next. Using the Wraith Prism for these stock tests. I have the same board as RA Tech - a GA-990FXA-UD3 - with a HyperX Fury SSD.

Fallout 4 at 1440p, maxed out everything except motion blur: stays at the 60 fps cap most of the time, but there are drops into the 40s and 30s once in a while. Adaptive sync keeps it smooth, with the exception of a rare 1/4-second hitch in the more stressful parts of the city.

Tomb Raider 2013 at 1440p, maxed out including TressFX, adaptive sync on: stays locked at my monitor's 75Hz with a rare drop of a few frames, but you'd never know it without checking the frame counter, thanks to G-Sync compatibility.

Modern stuff is boring in the sense that I know it is going to run everything great and not have a problem with my TV or monitor's refresh rates. But old stuff is mysterious and interesting - aka the box of chocolates.
 

Furious_Styles

Senior member
Jan 17, 2019
I was expecting much worse from the description. If none of them break off, I think you got this. Real question is - whether or not it is worth your time? For fun and challenge, I vote yes. Otherwise back to the keychain idea.
I'm going to give it to a friend to upgrade his computer that he uses for desktop/business use. Probably will take 30m-1hr to get right. If it had broken pins I probably wouldn't even bother.
 

bononos

Diamond Member
Aug 21, 2011
I paired the 8350 with a 2070 Super and 2x8GB DDR3 1333 CL9, everything is stock - 4.2GHz boost, for starters. .........
How are the old motherboards holding up with the high power consumption/heat of the FX generation? I think AMD had to deal with the TDP issue with motherboard manufacturers for the first time, when boards were throttling.
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
How are the old motherboards holding up with the high power consumption/heat of the FX generation? I think AMD had to deal with the TDP issue with motherboard manufacturers for the first time, when boards were throttling.
The board is a rev. 3.0 and is easily handling the stock 8350 @ 4.2GHz boost. However, it lacks the heat pipe between the NB and VRMs that RA Tech's rev. 4.0 has. So as a precaution, I will actively cool everything when I get to overclocking.

The FX 6100 that was in it did not run hot at all. I can see why reviewers like Phil and RA Tech like the 6-core so much. It responds well to overclocking without burning down the house, and it sat at a very sweet price point for years.
 

amd6502

Senior member
Apr 21, 2017
If anyone here has a spare FX processor - 6 or 8-core - with bent pins, I'd be happy to try fixing it.
If you aren't going to use it for more demanding gaming or heavy crunching, I highly recommend looking for an AM3+ Opteron on eBay. I got a 3365 for under $40 and am very pleased. These are starting to show up as old retired server chips, and they are tuned to a TDP of 65W. The Opteron 3365 has a top non-boost frequency of 2300 MHz, at which it only draws 0.975 V and matches top-bin FX like the 8300; I'd guess it consumes somewhat under 40W with all cores under full load. At its all-core boost of 2600 MHz it's almost the same (in voltage and wattage) as what an 8350 would draw. At 3.3GHz boost it's overvolted like a lower-bin 8320 would be. So they picked silicon that worked very well at the speeds where the system would get peak perf/watt.
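For anyone curious how that "somewhat under 40W" figure can fall out of the clocks and voltages quoted above, here is a rough back-of-the-envelope sketch using the classic CMOS approximation that dynamic power scales with frequency times voltage squared. The FX-8350 baseline numbers (4.0 GHz at ~1.35 V, ~125 W) are illustrative assumptions for the sake of the arithmetic, not measured values:

```python
# Back-of-the-envelope dynamic power scaling: P ~ f * V^2.
# Baseline figures below are assumptions for illustration only.

def scaled_power(p_base, f_base_ghz, v_base, f_new_ghz, v_new):
    """Estimate dynamic power at a new frequency/voltage operating point."""
    return p_base * (f_new_ghz / f_base_ghz) * (v_new / v_base) ** 2

# Assumed FX-8350-style baseline (4.0 GHz, ~1.35 V, ~125 W) scaled down
# to an Opteron 3365-style point (2.3 GHz at 0.975 V, per the post above).
estimate = scaled_power(125.0, 4.0, 1.35, 2.3, 0.975)
print(f"~{estimate:.0f} W")  # lands in the same ballpark as the sub-40W claim
```

Obviously this ignores static leakage and uncore power, so treat it as a sanity check on the claim rather than a measurement.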
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
Forget overclocking, everyone knows RGB is the best way to increase FPS. All kidding aside, it runs great, and I am not going to overclock, since others have already explored max fps with this CPU. It is doing well with adaptive sync, which is all it needs to do. Next up is Batman: Arkham Knight. The game came out about the time I bought an FX 8320e for sub-$100. I am not going to try to beat up the 8350 by using CPU-limited settings. If it can do 1440p with adaptive sync well, with everything on max including PhysX, that is what matters to me.
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
See if you can undervolt it instead and save some power. I’ve had very good experiences undervolting AMD CPUs.
That is a great idea, cooler running is always better. :beercheers: Especially this time of year in Florida.
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
RA Tech has a FX 8350 v. i3 7100 uploaded.

A couple of personal observations: I like his testing because he actually plays all the games - most importantly, multiplayer, which can be the hardest on the CPU. And he hangs out on Discord, which is another real-world usage when gaming. Forget anything Windows is doing, or not having a fresh OS install, etc.

He throws a lot of shade at Gamers Nexus in his last few vids, and he ain't wrong IMO. Both long-hair Steve and benchmark Steve trash the FX, but their testing methodology, based on their vids, is incapable of conveying the real user experience. Scripts running benchmark suites, or babysitting canned benchmarks, likely without audio, or even brief 60-second runs in single player in a "demanding area," are inadequate for imparting the user experience, period. I know this first hand. I have mentioned before that HardOCP was ahead of its time on that. Telling you how the games felt when they played them on the hardware was unpopular. But all that empirical testing means nada if the real-world experience is unpleasant.

Which brings me to the TL;DW: the i3 can have audio issues, assets not loading, frame pacing issues, and communication issues where the FX does not. When the 2/4 i3 is already maxed out, even light-duty stuff like Discord is too much for it. Heck, it can't even keep up with the SoTR benchmark in his testing: audio issues and NPCs not loading in. The result is that the FPS numbers for the i3 are virtually meaningless.
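To put a number on why average fps alone can mislead here: a short sketch comparing average fps with a crude 1% low (worst 1% of frame times). The frame-time lists are made-up illustrative data, not anyone's benchmark results:

```python
# Why average fps can hide stutter: two runs with similar averages
# but very different worst-case frame times. Data is invented.

def avg_fps(frame_times_ms):
    """Average fps over a run, from per-frame times in milliseconds."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def one_percent_low_fps(frame_times_ms):
    """fps implied by the mean of the worst 1% of frame times."""
    n = max(1, len(frame_times_ms) // 100)
    worst = sorted(frame_times_ms)[-n:]
    return 1000.0 / (sum(worst) / n)

smooth = [12.5] * 1000                 # perfectly steady 80 fps
stutter = [11.0] * 990 + [130.0] * 10  # mostly fast, with periodic hitches

print(round(avg_fps(smooth)), round(avg_fps(stutter)))                      # similar averages
print(round(one_percent_low_fps(smooth)), round(one_percent_low_fps(stutter)))  # very different lows
```

The stuttering run actually posts the *higher* average, which is exactly the trap: the hitches, dropped audio, and late-loading assets live in the tail, not the mean.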

 

legcramp

Golden Member
May 31, 2005
RA Tech has a FX 8350 v. i3 7100 uploaded. ......... The result is that the FPS numbers for the i3 are virtually meaningless.

How many of those games were relevant though when the FX chip came out? Why isn't he comparing it to like a 3770K or 4670K but instead uses a DUAL CORE processor...


Edit: Okay, I see what the point of the video is now... and it's not really to compare these CPUs but to throw shade at long hair girl and annoying accent guys.
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
How many of those games were relevant though when the FX chip came out? Why isn't he comparing it to like a 3770K or 4670K but instead uses a DUAL CORE processor...


Edit: Okay, I see what the point of the video is now... and it's not really to compare these CPUs but to throw shade at long hair girl and annoying accent guys.
You still missed the point, my good man. The i3 7100 and FX 8350 were in the same price bracket. By Q1 2017, when the 7100 came out with an MSRP of $117, the FX 8350 could be found for under $100 during Newegg flash deals. And it demonstrates how complacent Intel had become, releasing a 2/4 i3 just months before Zen launched. The FX 8350 released in 2012, and the fact that in 2021 it has still aged better than a competitively priced 2017 offering from Intel is astonishing, from a bang-for-buck perspective.

As Anand said, there are no bad CPUs, only bad pricing.

Also, we covered what legends the old i7s were in this thread, so no need to rehash that.
 

fleshconsumed

Diamond Member
Feb 21, 2002
RA Tech has a FX 8350 v. i3 7100 uploaded. ......... The result is that the FPS numbers for the i3 are virtually meaningless.

Ignoring the personal stuff, this is pretty brutal. One could say games of the past relied more on single-thread performance and were far less multi-core optimized, so a fast dual core like the i3-7100 was fine; but on the other hand there were some pretty strong indicators even in the Q6600/E8400 days that the Q6600 aged much better than the E8400.

I had an i3-4170 in my HTPC, and it was noticeably sluggish waking up from sleep for the first 10-15 seconds, which annoyed me enough to upgrade to a 2400G and later a 3400G, both of which provided a much better experience.
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
Ignoring personal stuff this is pretty brutal. One could say games of the past relied more on single thread performance and were far less multi-core optimized so fast dual core like i3-7100 was fine. .........
The i3 7100 has never been fine IMO. There were already games taxing a 2/4 i3 in 2017. And I have read a fair number of comments from salty Kaby Lake owners over the last year or so. They feel like they have been smacked in the face for buying it. Their $330 i7 became a $125 i3 within a couple of years, and their i3s and i5s have aged like milk. The whole upgrade-path meme fell apart, because used 7-series i7s are usually too expensive - historically the case with i7s, i.e. they hold their value. They can get a new i3 and board for about the same money. And 8th gen released the same year, to add insult to injury.

Kaby Salt Lake, I dub thee. :p Everyone glosses over 7th gen, but if any CPU series deserves ire, it is top of the list IMO.
 

Ranulf

Golden Member
Jul 18, 2001
You still missed the point my good man. The i3 7100 and FX 8350 were in the same price bracket. .........
I picked up my 8350 in mid-2014 at $170, so about $50 cheaper than a Haswell i5 K model.

Good video by RA Tech overall. His numbers in The Division 2 and Far Cry 5 matched my own experiences on the 8350 (in DX11 and Win7, no less). Far Cry 5 has always been a weird game performance-wise for most systems (CPU or GPU tests), so much so that I have tended to discount it in reviews.
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
I picked up my 8350 in mid 2014 at $170. So about $50 cheaper than a haswell i5 k model. .........
Those i5s were great gamers at the time. The Devil's Canyon I used for my son's build served him well. He put over 1,000 hours into ARMA III with it, and that game loved a couple of fast cores. I think that 4690K did 4.5GHz at default vcore? Don't remember anymore. But if I'd have bought him a 4790K for another $130, it would have stayed around a lot longer.

I happily gamed on an overclocked FX 8320e back then, before swapping for the 8350. Despite the 500MHz? higher max clock speed, the swap never made enough difference to warrant it. I should have stuck with that 8320e until I built the first of many Ryzen systems, in 2017.
 

Ranulf

Golden Member
Jul 18, 2001
Yeah, Haswell was probably the last generation of i5s that were worth the money, and even then... the i7 just had more longevity. It was noticeable by the Haswell gen and certainly obvious by Skylake in 2015. By 2014 I regretted not buying the 2600K instead of the 2500K. Not that I hated the 2500K.
 

NostaSeronx

Diamond Member
Sep 18, 2011
In regards to a revisit of the threading type used in FX processors: I've uncovered more of a timeline.

Sometime 2017 to September 2019: planned to use the prior perpetual MIPS32/64 license, then MIPS Open
2019 to present: moved to the RISC-V ISA (which MIPS Open & 8th-gen MIPS eventually did as well)

Pretty sure it is a CMT design and will be using Opteron, FX, Sempron branding. In the same places as EPYC, Ryzen, Athlon respectively.

-RISC-V Opteron targets;
https://www.supermicro.com/Aplus/motherboard/Opteron3000/ <== AMD only has kept APU Opterons on site: https://www.amd.com/en/opteron
-RISC-V FX targets the prior FX markets but with significantly reduced TDP.
-RISC-V Sempron targets are probably extra-low-TDP versions of FX.

BMI/BMI2/TBM/ABM ?? and is in Public Review
SVE ?? and is about to enter Public Review, eventually.
MMX/SSE/NEON ?? v0.95 - 06/07/2021
LUT, AES, SHA, SM3, SM4 ?? v0.9.2 probably reliant on bitmanip and vector(etc/VAES/VSHA/VSM3/VSM4 forms)

Timeframe before an actual announcement: the above all have to be frozen & ratified. Then AMD would need to pay $35,000 or more to use the RISC-V trademark/logo commercially.

Guesswork/Speculation:
Opteron-MD [Many Die] = 3D processor: up to six top dies (8-core/4-proc) and one bottom die (3x 64-bit DDR) => below 2x315mm2 and above 315mm2
Opteron/FX/Sempron-DD [Dual Die] = 2.5D processor: one 8-core/4-proc die and one IOD die (1x 64-bit DDR) => Seattle die size across two chips.
Opteron/FX/Sempron-SD [Single Die] = monolithic processor: one 4-core/2-proc die and integrated IO (1x 32-bit DDR) => Ontario die size

BD-XV -> This:
Critical Fast & Slow = LVT/HVT -> LVT/HVT (Same)
Non-critical Slow Path = RVT -> RVT (Same)
Non-critical Fast Path = RVT -> LVT (Different)

Should be closer to Cortex-A57(use Hiroshige Goto image), Hence, the A1100 reference from above.
 
