mikeymikec
Lifer
- May 19, 2011
I'm just going to say the way that guy handles boards makes me feel deeply uncomfortable.
The FX board died on him, so, there's that.
Reminds me of the time I saw techies at a computer shop tossing brand new HDDs and playing catch with them. I guess if it falls, they send it in for warranty.
I remember arguments back in the day when Intel was accused of making their compilers favour Intel chips first and others later (not that there were many "others"). This in turn downgraded the FX line of CPUs in benchmarks. I used to have links to articles about this bookmarked, but I lost the backups of those bookmarks... that was all more than a decade ago now.
This post is aging well.
Haha, that 8 core, sub-2 GHz Jaguar CPU set the bar low and wide 😂
I would expect that, as we see more console ports exclusive to the newest-generation consoles, many older CPUs will choke, with the FX CPUs and the 4c/8t Intel parts probably being the notable casualties. Notable relative to this thread, anyway; I think the world at large will mainly only notice the issues with the i7s!
I think my system is a bit short. I just did a Cinebench R15 run and the score came out to 1420 cb. All 16 cores running at 4 GHz, ref clock of 205. Temps were 43C on processor 2 and 39C on processor 1. I have not tried running it with all 32 cores yet, since I actually do not know if all 32 cores can run at 4 GHz without burning up the sockets. Perhaps that will be the next experiment. The problem is the Noctua coolers. They were good for keeping my former 61xx ES processors at around 55C during hours-long IntelBurnTest runs, but I have no idea how they would fare with another 100 W of power hitting them per socket.
CB R15 with the 61xx ES procs installed (all 48 cores at 3.0 GHz at 1.2 V) scored around 3225 cb. These are doing 1420 cb with just 16 cores. I'd say that is a pretty good showing for the Piledriver cores compared to the K10.
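As a rough sanity check on those two runs (the scores and clocks are from the posts above; the normalization to points per core per GHz is my own back-of-the-envelope math, ignoring memory and turbo differences):

```python
# Rough per-core, per-GHz comparison of the two Cinebench R15 runs
# quoted above. Scores and clocks come from the posts; the rest is
# simple arithmetic.

k10_score, k10_cores, k10_ghz = 3225, 48, 3.0   # 61xx ES (K10) run
pd_score, pd_cores, pd_ghz = 1420, 16, 4.0      # 6380 ES (Piledriver) run

k10_eff = k10_score / (k10_cores * k10_ghz)
pd_eff = pd_score / (pd_cores * pd_ghz)

print(f"K10:        {k10_eff:.2f} cb per core per GHz")
print(f"Piledriver: {pd_eff:.2f} cb per core per GHz")
print(f"Ratio:      {pd_eff / k10_eff:.2f}x")
```

On these numbers the two runs come out at roughly the same points per core per GHz, with each Piledriver core here getting a module's front end to itself.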
Ah... yes. That would give you 4 non-shared cores. Having excess cores allows the Piledrivers to run as they should on all 8 cores.
PS: These 16 cores pull about 540 W at the wall during the CB R15 run. So the power draw with all 4 sockets populated with these 6380 ES chips and all 64 cores running at 4 GHz would be something else. I've got dual redundant 1400 W power supplies, so that would be okay, but I don't think the motherboard could take it.
Most efficient is to run all cores at a fairly low frequency and push 4 GHz+ only for single-threaded work and a handful of threads. Using a single core in a module is highly inefficient, since the two cores share a common front end that consumes about the same whether it is feeding one thread or two.
For instance, a stock FX-8370E had a 4.3 GHz single-thread boost yet used only 65 W with all cores loaded at about 3.35 GHz. With your setup, using all 32 cores that way would amount to roughly 260 W of CPU power and be quite a bit more efficient than boosting 16 cores to 4 GHz across 16 modules; the MT score should actually go up at much lower power with the same ST perf.
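A quick sketch of the efficiency argument above. The 65 W figure for an FX-8370E with all cores at ~3.35 GHz is from the post; scaling it per socket is a naive multiplication that ignores uncore and memory power, so treat the result as a rough estimate:

```python
# Back-of-the-envelope check of the 32-core-at-low-clock vs
# 16-core-at-4-GHz argument. Assumes throughput scales roughly
# linearly with cores x clock for Cinebench-style loads.

sockets = 4
per_socket_w = 65                 # FX-8370E package power, all cores ~3.35 GHz
all_cores_w = sockets * per_socket_w

rel_throughput_32c = 32 * 3.35    # 32 cores at 3.35 GHz
rel_throughput_16c = 16 * 4.0     # 16 cores at 4 GHz

print(f"Estimated CPU power, 32 cores @ 3.35 GHz: {all_cores_w} W")
print(f"Relative throughput vs 16c @ 4 GHz: "
      f"{rel_throughput_32c / rel_throughput_16c:.2f}x")
```

On this naive model, the 32-core configuration delivers roughly 1.7x the multithreaded throughput at an estimated 260 W of CPU power, versus the ~540 W at the wall reported for the 16-core run.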
Really neat. Any experience with the 6366 HE? How would 4 of those compare to this?
6366HEs should be pretty decent, perhaps better than an FX 6100 or FX 8300, mostly due to way bigger caches and double the memory bandwidth. That's in poorly threaded tasks; in highly multithreaded tasks the Opterons should beat them both while drawing fewer watts.
Sorry I missed your reply. The 6366HEs would run extremely slowly. The Pb0 power state on those is only 3.1 GHz. Since they are production processors, the power states are locked down, and the only way to raise the speed would be to increase the ref clock using OCNG on a supported SuperMicro server motherboard.
I had a pair of 6328s (3.5 GHz pb1, the all-core turbo, and 3.8 GHz pb0, the single-core turbo) and played with them for about a month before these 6380 ES. I upped the ref clock on those quite easily and hit over 4 GHz. For a production chip I think those would be the way to go; they were the highest-frequency Opteron 63xx made.
Wow, still rocking a GTX 460. Such a classic. I had a couple of them. Surprised that they play modern games. Maybe won't play DX12-only games.
That'd be a huge upgrade from the 8320/GTX 460 anyway.
Indeed. That card is what, nearly 10 years old?
I think he got his money out of it!
Thems fightin' words!
Looks like it might pack as much punch as a Radeon iGPU. Only missing lots of things like DX12 support.
That's also a very wrong selection of card, too. With such kneecapping, you lose a lot of performance. I think even a GTX 1050 Ti would beat an RX 6400 kneecapped so badly.
PCIE 2.0 x4
Not the right card for this system, no doubt about it.
Not on FX systems at least. Vulkan, especially in Doom, was bad for those chips, because Vulkan is more demanding on the CPU side and FX chips are already not so great at that. Sure, the GTX 460 is even worse, but DX11 is far better there: it puts less load on the CPU without sacrificing GPU performance.
The lack of Vulkan support on the other hand is costly.
The Vega11 is the 2400G on that chart.
If I stumble across a 1050ti for $50 or so I will let you know if that is accurate.
Jeebs, where did you find an RX 6400 for 50 bucks? Anyway, that's just the average performance; the problem is that it usually hurts 1% lows very disproportionately. That's the main thing that ruined the RX 6500 XT even in old games.
I expect that with playable settings for the cards the 6400 will still win easily, with maybe a rare outlier. TPU tested the RX 6600 XT's PCIe bandwidth scaling, including at PCIe 2.0 x8: https://www.techpowerup.com/review/amd-radeon-rx-6600-xt-pci-express-scaling/28.html
Games just don't use as much bandwidth as many think they do in the vast majority of titles.
EDIT: I like messing with stuff; I have much better cards to pair with it too. It can run Witcher 3 at 1080p maxed with HairWorks on a GTX 1080 at a locked 60 fps all day and all night.
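The "kneecapping" discussed above (an x4 card in a PCIe 2.0 slot) can be put into numbers using the standard per-lane data rates. A quick sketch, using the spec figures for effective per-lane bandwidth:

```python
# Theoretical one-direction bandwidth of the PCIe links discussed above.
# Effective per-lane data rates: PCIe 2.0 = 500 MB/s (5 GT/s, 8b/10b),
# PCIe 3.0 ~ 985 MB/s, PCIe 4.0 ~ 1969 MB/s (8/16 GT/s, 128b/130b).

per_lane_mb = {"2.0": 500, "3.0": 985, "4.0": 1969}

def link_bw_gb(gen: str, lanes: int) -> float:
    """Approximate one-direction link bandwidth in GB/s."""
    return per_lane_mb[gen] * lanes / 1000

# The RX 6400 is a PCIe 4.0 x4 card, so on a PCIe 2.0 board it gets:
print(f"PCIe 2.0 x4: {link_bw_gb('2.0', 4):.1f} GB/s")
# versus what it was designed around:
print(f"PCIe 4.0 x4: {link_bw_gb('4.0', 4):.1f} GB/s")
# and the 2.0 x8 point from the TPU scaling test mentioned above:
print(f"PCIe 2.0 x8: {link_bw_gb('2.0', 8):.1f} GB/s")
```

So the card ends up with roughly a quarter of its intended link bandwidth, which is why the hit shows up mostly in 1% lows when data spills over the bus.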
I paid about $113 for the 6400; I meant I'd pay $50 for a 1050 Ti.