Question: The FX 8350 revisited. Good time to talk about it because reasons.


NTMBK

Lifer
Nov 14, 2011
10,400
5,636
136
I remember arguments back in the day when Intel was accused of making their compilers favour Intel chips first, then others later (not that there were many "others"). This in turn handicapped the FX line of CPUs in benchmarks. I used to have links to articles about this bookmarked, but I've lost the backups of those bookmarks... that was all more than a decade ago now.

Agner Fog had a good blog post: https://www.agner.org/optimize/blog/read.php?i=49
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,210
29,823
146
Haha, that 8-core, sub-2GHz Jaguar CPU set the bar low and wide 😂

I would expect that, as we see more and more console ports exclusive to the newest generation of consoles, many older CPUs will choke, with the FX CPUs and the 4c/8t Intel parts probably being the notable casualties. Notable relative to this thread, that is; I think the world at large will mainly only notice the issues with the i7s :D
This post is aging well.
 

Abwx

Lifer
Apr 2, 2011
11,783
4,692
136
i think my system is a bit short. i just did a Cinebench R15 run and the score came out to 1420cb, with all 16 cores running at 4 GHz and a ref clock of 205. Temps were 43C on processor 2 and 39C on processor 1. Have not tried running it with all 32 cores yet, since i actually do not know if all 32 cores can run at 4 GHz without burning up the sockets :) Perhaps that will be the next experiment. The problem is the Noctua coolers. They were good for keeping the temps of my former 61xx ES processors at around 55C when doing IntelBurnTest runs for hours on end, but i have no idea how they would fare with another 100W of power hitting them per socket.

CB R15 when i had the 61xx ES procs installed (all 48 cores at 3.0 GHz at 1.2V) scored around 3225cb. These are doing 1420cb with just 16 cores. i'd say that is a pretty good showing for the Piledriver cores compared to the K10.

Ah... yes. That would give you 4 non-shared cores. Having excess cores allows the Piledrivers to run as they should on all 8 cores.

PS: These 16 cores are pulling about 540W at the wall during the CB R15 run. So the power draw of having all 4 sockets populated with these 6380 ES and running all 64 cores at 4 GHz would be something else. i've got dual redundant 1400W power supplies, so that would be okay, but i don't think the motherboard could take that.

The most efficient approach is to run all cores at quite a low frequency, with 4GHz+ only for ST and a handful of threads. Using a single core in a module is highly inefficient, since the two cores share a common front end that consumes about the same as if it were loaded with two threads.

For instance, a stock FX8370E had a 4.3GHz ST boost and used only 65W with all cores loaded at about 3.35GHz. That would amount to about 260W of CPU power in your setup using all 32 cores, and be quite a bit more efficient than boosting 16 cores at 4GHz within 16 modules. MT score should actually go up at far lower power, with the same ST perf.
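A back-of-envelope sketch of the 260W estimate above, using only the figures quoted in the post (65W for a fully loaded FX8370E, 32 active cores); these are the poster's numbers, not measurements:

```python
# The post's reasoning: a stock FX-8370E holds ~65 W with all 8
# cores loaded at ~3.35 GHz, so running 32 cores at that operating
# point is four 8-core chips' worth of modules.
watts_per_8_cores = 65          # FX-8370E all-core package power, per the post
active_cores = 32
estimated_watts = watts_per_8_cores * (active_cores // 8)
print(estimated_watts)          # 260, matching the post's figure
```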
 

rvborgh

Member
Apr 16, 2014
195
94
101
The most efficient approach is to run all cores at quite a low frequency, with 4GHz+ only for ST and a handful of threads. Using a single core in a module is highly inefficient, since the two cores share a common front end that consumes about the same as if it were loaded with two threads.

For instance, a stock FX8370E had a 4.3GHz ST boost and used only 65W with all cores loaded at about 3.35GHz. That would amount to about 260W of CPU power in your setup using all 32 cores, and be quite a bit more efficient than boosting 16 cores at 4GHz within 16 modules. MT score should actually go up at far lower power, with the same ST perf.

Hi, the reason i downcored this way (i.e., per "Compute Unit") is that i read, specifically in section "2.4.3 Processor Cores and Downcoring" of the Family 15h BIOS and Kernel Developer's Guide, that the disabled cores are CC6 power gated and shut off. It states:

"• Both cores of a compute unit must be downcored if either core needs to be downcored.
• Exception: The odd core of a compute-unit can be [software] downcored without downcoring the even core for 16-core processors in the G34 package as long as the silicon revision is not OR_B2.
• Clocks are turned off and power is gated to downcored compute units. The power savings is the same as CC6."

Since these 6380 ES are 16-core and silicon revision OR_C0, this exception thankfully applies to me. i wanted to reduce the heat being pushed up into my heat sinks, as well as give a single core access to all the shared resources, and this seems to work.
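A hypothetical sketch (not code from the BKDG) of the Family 15h core pairing the quoted passage describes: cores 2n and 2n+1 share one compute unit (front end, L2, FPU), so downcoring every odd core leaves each surviving core a whole module to itself.

```python
def compute_unit(core_id: int) -> int:
    """Compute-unit index for a core: even/odd pairs share one unit."""
    return core_id // 2

def downcored_survivors(total_cores: int) -> list[int]:
    """Cores left enabled when the odd core of every unit is downcored."""
    return [c for c in range(total_cores) if c % 2 == 0]

# A 16-core (8-module) Opteron keeps cores 0, 2, 4, ..., 14, each
# with exclusive use of its module's shared resources:
print(downcored_survivors(16))
```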

i've really only spent time trying to characterize the outer envelope of frequencies that can be run with stability. Going by what you stated, it sounds like it might be an interesting experiment to graph frequency/voltage against, say, Cinebench R15 results to see where the most efficient point is.

The main problem with these Piledrivers (aside from relatively low single-thread IPC vs more modern processors) is the software. Very little software seems able to leverage the cores. One of the RTS games i play is horribly coded in this respect: as soon as its working data set exceeds the L3 cache, frame rates take a plunge, no matter how high the frequency :(

PS: i will have to see if i can find the stock power states of the FX8370E. i would not mind experimenting with these and setting the power states the same.
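A hypothetical sketch of the experiment suggested above: sweep frequency points, record a benchmark score and wall power at each, and rank the runs by score per watt. All numbers here are made up for illustration except the 4 GHz/540W point reported earlier in the thread; real data would come from CB R15 runs and a power meter.

```python
sample_runs = [
    # (GHz, CB R15 score, watts at the wall) -- illustrative values
    (3.0, 1150, 380),
    (3.5, 1300, 450),
    (4.0, 1420, 540),  # the 16-core result reported above
]

def points_per_watt(score: float, watts: float) -> float:
    """Efficiency metric: benchmark points per watt of wall power."""
    return score / watts

# The most efficient run is not necessarily the fastest one.
best = max(sample_runs, key=lambda run: points_per_watt(run[1], run[2]))
print(best)
```

With these made-up numbers the lowest-frequency run wins on efficiency, which is the pattern Abwx's comment predicts.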
 

zir_blazer

Golden Member
Jun 6, 2013
1,219
511
136
I remember arguments back in the day when Intel was accused of making their compilers favour Intel chips first, then others later (not that there were many "others"). This in turn handicapped the FX line of CPUs in benchmarks. I used to have links to articles about this bookmarked, but I've lost the backups of those bookmarks... that was all more than a decade ago now.
 

rvborgh

Member
Apr 16, 2014
195
94
101
Really neat. Any experience with the 6366 HE? How would 4 of those compare to this?

Sorry i missed your reply. The 6366HEs would run extremely slowly; Pb0 power state on those is only 3.1 GHz. Since they are production processors, power states are locked down, and the only way to up the speed would be to increase the ref clock using OCNG on a supported SuperMicro server motherboard.

i had a pair of 6328s (3.5 GHz pb1, the all-core turbo, and 3.8 GHz pb0, the single-core turbo) and played with them for about a month before these 6380 ES. i upped the ref clock on those quite easily and hit over 4 GHz. For a production chip i think those would be the way to go; they were the highest-frequency Opteron 63xx made.
 
Last edited:
Aug 16, 2021
134
96
61
Sorry i missed your reply. The 6366HEs would run extremely slowly; Pb0 power state on those is only 3.1 GHz. Since they are production processors, power states are locked down, and the only way to up the speed would be to increase the ref clock using OCNG on a supported SuperMicro server motherboard.

i had a pair of 6328s (3.5 GHz pb1, the all-core turbo, and 3.8 GHz pb0, the single-core turbo) and played with them for about a month before these 6380 ES. i upped the ref clock on those quite easily and hit over 4 GHz. For a production chip i think those would be the way to go; they were the highest-frequency Opteron 63xx made.
The 6366HEs should be pretty decent, perhaps better than an FX 6100 or FX 8300, mostly due to way bigger caches and double the memory bandwidth. That's in poorly threaded tasks; in highly multithreaded tasks the Opteron should beat them both while drawing fewer watts.
 

blckgrffn

Diamond Member
May 1, 2003
9,637
4,198
136
www.teamjuchems.com
Getting a friend of mine to come play Deep Rock with us, and I had forgotten his "main" was an 8320; I had remembered it as a Core 2 Duo. So it's "better", but I'm trying to decide between: hoping the OG Antec Eco Neo 520 holds up to the GTX 980 I am loaning him, seeing if he can fit a stick of RAM in the first slot under the massive Zalman, and having him remove the hard drive cage because even that GPU is too long (cases from like 15 years ago are so quaint), OR just loaning him the pretty but not-selling-for-what-I-need-it-to 12th Gen i3 build I have aging in the office and calling it a day.

Leaning hard towards the i3 or having him just invest in that 3600 with free board at MC and still giving him the 980. That’d be a huge upgrade from the 8320/GTX 460 anyway.
 

blckgrffn

Diamond Member
May 1, 2003
9,637
4,198
136
www.teamjuchems.com
Wow, still rocking a GTX 460. Such a classic. I had a couple of them. Surprised that they can still play modern games, though they probably won't play DX12-only titles.

It is the absolute minimum card listed for Deep Rock. So... I am thinking it would work? I just don't want to have him playing some slide show and taking a long time to load in while the rest of us are playing what might look and feel like a totally different game. That wouldn't bring him back next week :)

I think he got his money out of it!
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,210
29,823
146
Looks like it might pack as much punch as a Radeon iGPU. Only missing lots of things like DX12 support.
Thems fightin' words! :p The Vega11 is the 2400G on that chart. That said, the chart isn't going to be at all accurate in 2022 for the 460: 1GB of VRAM, legacy drivers, and a lack of modern features would have it trailing considerably in the newer games it can run. Hopefully it doesn't have any rendering issues with DRG.

BTW, one of the members started an AnandTech gaming Discord. They are talking about playing DRG; you should join.

As to the FX: I played Fallout 4 at 1080p max settings with the RX 6400. It was great paired with a Ryzen 5600 in a B550 board, but it cannot hold 60fps with the FX. It manages it most of the time, but it can drop into the 30s. Weak CPU, PCIe 2.0 x4, DDR3, and no SAM. Not the right card for this system, no doubt about it.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,695
136
Wow, still rocking a GTX 460. Such a classic. I had a couple of them. Surprised that they can still play modern games, though they probably won't play DX12-only titles.
Looks like it might pack as much punch as a Radeon iGPU. Only missing lots of things like DX12 support.

Fermi did get sort-of DX12 support with the last(?) drivers released for it. It's only feature level 11_0, so it's not "real" support in that sense, but it will run DX12.

The lack of Vulkan support on the other hand is costly.
 
Aug 16, 2021
134
96
61
The lack of Vulkan support on the other hand is costly.
Not on FX systems, at least. Vulkan, especially in Doom, was bad for those chips, because Vulkan is more demanding on the CPU side and FX chips are already not so great at that. Sure, the GTX 460 is even worse, but DX11 is by far the better choice there: it puts less load on the CPU without sacrificing GPU performance.
 

blckgrffn

Diamond Member
May 1, 2003
9,637
4,198
136
www.teamjuchems.com
Thems fightin' words! :p The Vega11 is the 2400G on that chart.

I SAID MIGHT. :p

Yeah, I have a 6500 XT that would be an easier "drop in" for his system given its low power, and it's nice and tiny, but that card is so gross I didn't really consider it. The GTX 980 can stand on its own, but it needs some juice. That said, I didn't realize it right away, but the 2x 6-pin the GTX 980 requires is also required for the GTX 460. Wasn't it cute how we held the line for about a decade before going full wackadoodle on GPU power consumption?

I'll keep the Discord in mind :)

I met my man for lunch and he's got the 12th Gen i3 + GTX 980 Ti for a tour of duty, since we play weekly (for like 60-90 minutes! being dads of little ones gives us so much free time :D). That means he can get up and running and then decide how much effort to put into his older PC. I gave him a shoebox with the 980, DDR3 RAM kits, a 500GB EVO SSD (so he can remove his drive cage and not lose capacity when his 500GB 3.5" drive has to be retired!!) and a PCIe Trendnet Archer because his wifi is an old USB G dongle :D

Having a smaller amount of random kit at my house ensures less recreational PC building. For now.
 
Last edited:

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,210
29,823
146
That's also a very wrong selection of card. With such kneecapping, you lose a lot of performance. I think even a GTX 1050 Ti would beat an RX 6400 kneecapped that badly.
If I stumble across a 1050 Ti for $50 or so I will let you know if that is accurate. :D I expect that with playable settings for each card the 6400 will still win easily, with maybe a rare outlier. TPU tested the RX 6600 XT's PCIe bandwidth scaling, including at PCIe 2.0 x8: https://www.techpowerup.com/review/amd-radeon-rx-6600-xt-pci-express-scaling/28.html

Games just don't use as much bandwidth as many think they do in the vast majority of titles.

EDIT: I like messing with stuff; I have much better cards I pair with it too. It can run Witcher 3 at 1080p maxed with HairWorks on a GTX 1080 at a locked 60fps all day and all night.
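For scale on the bandwidth point above, here is a rough calculation of nominal PCIe link rates (after encoding overhead; real throughput is lower still). The per-lane figures are standard PCIe numbers, not from the TPU test:

```python
USABLE_GBPS_PER_LANE = {
    2: 0.5,    # PCIe 2.0: 5 GT/s with 8b/10b encoding
    3: 0.985,  # PCIe 3.0: 8 GT/s with 128b/130b encoding
    4: 1.969,  # PCIe 4.0: 16 GT/s with 128b/130b encoding
}

def link_gbps(gen: int, lanes: int) -> float:
    """Nominal one-direction link bandwidth in GB/s."""
    return USABLE_GBPS_PER_LANE[gen] * lanes

# The RX 6400 is an x4 card: 2 GB/s on the FX board's PCIe 2.0
# versus roughly 7.9 GB/s on a PCIe 4.0 Ryzen board.
print(link_gbps(2, 4), link_gbps(4, 4))
```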
 
Last edited:
Aug 16, 2021
134
96
61
If I stumble across a 1050 Ti for $50 or so I will let you know if that is accurate. :D I expect that with playable settings for each card the 6400 will still win easily, with maybe a rare outlier. TPU tested the RX 6600 XT's PCIe bandwidth scaling, including at PCIe 2.0 x8: https://www.techpowerup.com/review/amd-radeon-rx-6600-xt-pci-express-scaling/28.html

Games just don't use as much bandwidth as many think they do in the vast majority of titles.

EDIT: I like messing with stuff; I have much better cards I pair with it too. It can run Witcher 3 at 1080p maxed with HairWorks on a GTX 1080 at a locked 60fps all day and all night.
Jeebs, where did you find an RX 6400 for 50 bucks? Anyway, that's just average performance; the problem is that the bandwidth limit usually hits the 1% lows very disproportionately. That's the main thing that ruined the RX 6500 XT even in old games.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,210
29,823
146
Jeebs, where did you find RX 6400 for 50 bucks? Anyway, that's just an average performance, the problem is that it usually affects 1% lows very disproportionally. That's the main thing that ruined RX 6500 XT even in old games.
I paid about $113 for the 6400; I meant I'd pay $50 for a 1050 Ti.

I did not have a bad gaming experience with the setup. The dips were occasional, and only the hard drops into the 30s a few times an hour were detectable without Afterburner to tell me, playing on a VRR monitor. I am going to try it in SteamOS instead of Win11 Pro and see how that goes. I don't know how it will treat the old FX; should be fun finding out.
 

blckgrffn

Diamond Member
May 1, 2003
9,637
4,198
136
www.teamjuchems.com
I want $50 GTX 1050s too - we are getting there with 1060 6GB sightings at $100 but that's still too much.

Check out this clearance! The 8320 is going to ride again tonight! Rock & Stone!

lmao, I am glad that card is a blower because that case has almost no airflow.

[Attached image: gtx_980.jpg]
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,210
29,823
146
I don't want a 1050 Ti at any price. $50 is my "mess with it and sell it on" price.

Is that a Zalman cooler I see in that pic?