[Techspot] The Best CPU for the Money: Intel Core i3-6100 vs. i3-4360, i5-4430 & AMD


Phynaz

Lifer
Mar 13, 2006
10,140
819
126
How does anyone consider the FX "an upper-end product" in a world of Xeons and Extreme Editions? Seriously, the FX-8350 debuted at a retail price of $199.00 in October 2012. The Intel i7-3970X debuted a month later at a retail price of $1,000.00.

I just dropped the mic.

What price did the FX-9590 debut at?
 

dark zero

Platinum Member
Jun 2, 2015
2,655
138
106
Vishera is in fact just a Sandy Bridge competitor. If we put the Core i5 2500K or even its Ivy Bridge counterpart in those tests, they would also have a very bad time, given the tech involved.

So the chart now looks like this if we consider everything and not only games:

Tier...
FX 83xx series -- Between SB/IB Core i5/i7 -- HW/BW Core i5 -- SKL Core i3

FX 63XX series -- FX 81XX series -- Between SB/IB Core i3/i5 -- HW Core i3 -- Between SKL Pentium DC/Core i3

FX 43XX Series -- FX 61XX series -- SB/IB Core i3 -- Between HW Pentium and Core i3 -- SKL Pentium DC
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
Bingo! We have a winner here. A GTX 960 running Crysis 3 at high settings @ 1080p is just not enough to show any CPU advantage that comes from overclocking.

It's enough to show 30 vs 40 fps for minimum framerate. I think the bottleneck is not just the GPU here, but also something on the CPU side that isn't solved by higher core clocks (like the inferior L3 and memory controller?).

You know what? Using the R9 380 + the 8-core CPU will be much better than the Core i3 + GTX 960 in games that support Mantle or DX12.

So if we had the same review with the AMD FX-8320E + R9 380, it would be faster, with lower frame times, than the Core i3-6100 + GTX 960 in Battlefield 4 MP, Thief, Civilization: BE etc.

Shame they didn't include a Tonga GPU as well.

Mantle or DX12? Ouch. I'm talking about most games, not just a few.

We know from Digital Foundry that AMD GPUs suffer a bigger loss when moving from an i5 to an i3 in DX11 games, but I would like to see this scenario with slower i5s and AMD CPUs too.
If you are buying an AMD CPU, it's very likely you would consider an AMD GPU, so the test would be more interesting if they had included those.
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
Does anyone know if they used the Metro Last Light Redux benchmark? Because at the same settings used in the review, the FX-8150 + HD 7950 @ 1GHz is way faster than the review's Core i5 + GTX 960.

I get 56 fps average vs 40 on the Core i5 + 960.

Gaming_02.png
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The FX is getting thoroughly beaten in games. They also tested the i3-6100 with 2133MT/s and 3000MT/s RAM. I would like to see how low-timing 2133MT/s performs in comparison, though.

Wow, in typical AT fashion, the defense of and cherry-picking for i3s is stronger than ever. Let's just turn a blind eye to all the popular AAA games where the FX performs well, and to all the multi-threaded non-gaming applications, because apparently 99.9% of computer users just play games and nothing else. :sneaky:

AnandTech's CPU sub-forum has become an utter embarrassment of tech knowledge and extremely biased cherry-picking since the time I joined this forum.

You cannot just focus on games where an i3 does well and ignore games where the FX would trounce it handily. That's not an objective comparison.

http://www.gamegpu.ru/images/stories/Test_GPU/Action/STAR_WARS_Battlefront/test/starwars_proz.jpg

http://www.gamegpu.ru/images/stories/Test_GPU/strategy/Ashes_of_the_Singularity/test/Ashes_proz_980.jpg

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Call_of_Duty_Black_Ops_III/test/blackops3_proz.jpg

http://www.gamegpu.ru/images/stories/Test_GPU/Retro/The_Witcher_3_Wild_Hunt_-_Hearts_of_Stone/test/w3_proz.jpg

http://www.gamegpu.ru/images/stories/Test_GPU/MMO/Dota_2_Reborn/test/dota2_proz.jpg

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Mad_Max_/test/MadMax_proz.jpg

http://www.gamegpu.ru/images/stories/Test_GPU/strategy/Total_War_Arena/test/arena_proz.jpg

http://www.gamegpu.ru/images/stories/Test_GPU/MMO/Armored_Warfare_/test/AW_proz.jpg
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Continued...

http://www.gamegpu.ru/images/stories/Test_GPU/Simulator/F1_2015_/test/f1_proz.jpg

http://www.gamegpu.ru/images/stories/Test_GPU/MMO/World_of_Warships_Beta_/test/ws_proz.jpg


Are there AAA games where AMD's FX series doesn't perform well? Absolutely.
Are there AAA games where an i3 doesn't perform well? Absolutely.

This comparison also doesn't take into account that many gamers with FX-series CPUs have been using them for 2-3 years already. This is like saying, "Oh look, a $170 GTX 960 4GB is as fast as a $449 HD 7950 from 2012. FAIL AMD."

:whistle:

The whole point of assessing how good a product is overall is to look at the entire picture and context. I might as well claim that an i3 is 27% faster in minimum fps than an i7-4770K based on FO4: "i7 sucks, extra cores and HT are useless!!" Is that a reasonable assessment of the overall performance of the i7-4770K vs. an i3-4360 in games + multi-tasking?
http://www.techspot.com/review/1089-fallout-4-benchmarks/page6.html

It's constantly amusing how certain users on AT always ignore the multi-threaded performance of FX CPUs in well-threaded games, and constantly ignore the overall performance of FX vs. i3 outside of games, as if it doesn't even matter. A lot of people play games but also use their computers for other things. How well does an i3 perform in that case against a 4.5-4.7GHz FX or an i5/i7?

we know from Digital Foundry that the AMD GPUs suffer a bigger loss when moving from an i5 to i3 on DX11 games, but I would like to see this scenario also with slower i5s and AMD CPus

We also know that an i3 is garbage for multi-tasking of any kind and would get wiped out by an overclocked i7 920 from 2008, but apparently that doesn't matter. We only buy our computers for games, just games and nothing else. Let's see how an i3 does in the real world when I am taking a break from work but still have 30-50+ tabs open, Skype with international clients/friends, and various Excel, PDF and MS Word files. Let me guess, I now have to close everything and restart my computer so that I can get the perfect performance I see in benchmarks online?

All those users who were using i5/i7/FX CPUs for the last 2-6 years should have just jumped into the future and purchased a Skylake i3, right?

Should we just ignore that future games will use more threads and focus on how a gamer is better off stepping up to an i5 or is this about Intel vs. AMD as always here?

If I weren't going to upgrade or do a new build for three or four years and had no interest in OCing, I'd get a cheaper socket 1150 i5 and reuse some DDR3. Newegg has an i5-4460 for $177. At least that's what I would do if it were down to just the CPUs in that Techspot review.

Exactly. Debating what's better between an i3 and a 3-year-old, power-hungry, low-IPC AMD architecture in late 2015 misses the point: they are both poor products overall for anyone who wants to build a rig today for 5+ years and use it for a wide variety of tasks. Even a nearly 5-year-old used 2500K OC is better than any Skylake i3; so what are we arguing here, which CPU sucks more? :)

What is a gamer going to do for future AAA games that use 4+ threads with that i3? What is a gamer going to do with a low-IPC AMD CPU in games that barely scale with more cores and love IPC?
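The cores-vs-IPC trade-off above is basically Amdahl's law. Here is an illustrative sketch with hypothetical numbers (the 1.5x per-core speed figure and the parallel fractions are assumptions for illustration, not measurements, and HT threads are treated as full cores for simplicity):

```python
# Toy Amdahl's-law model: effective throughput of a CPU on a workload
# where only `parallel_fraction` of the work scales across `threads`.
def effective_speed(per_core_speed, parallel_fraction, threads):
    serial = 1.0 - parallel_fraction
    return per_core_speed / (serial + parallel_fraction / threads)

# Hypothetical per-core speeds: assume an i3 core is ~1.5x an FX core.
i3_core, fx_core = 1.5, 1.0

# A typical DX11-era game: say ~60% of the frame work parallelizes.
game_i3 = effective_speed(i3_core, 0.60, 4)  # 2C/4T i3
game_fx = effective_speed(fx_core, 0.60, 8)  # 8-core FX

# A well-threaded encoder/renderer: say ~95% parallel.
enc_i3 = effective_speed(i3_core, 0.95, 4)
enc_fx = effective_speed(fx_core, 0.95, 8)

print(game_i3 > game_fx)  # True: strong cores win when parallelism is limited
print(enc_fx > enc_i3)    # True: many weak cores win when the work scales
```

With these made-up numbers the i3 comes out ahead in the game case (~2.7 vs ~2.1) while the FX wins the encoding case (~5.9 vs ~5.2), which matches the pattern both sides of this thread keep describing: the verdict depends entirely on how parallel the workload is.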

The best budget gaming CPU is none of these. It's a used Core i5 K series @ 4.5GHz+ with a $30-40 CPU cooler.
 
Last edited:

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
Does anyone know if they used the Metro Last Light Redux benchmark? Because at the same settings used in the review, the FX-8150 + HD 7950 @ 1GHz is way faster than the review's Core i5 + GTX 960.

I get 56 fps average vs 40 on the Core i5 + 960.

Gaming_02.png

My guess is that it's the Metro Last Light Redux built-in benchmark, but yes, that's pretty poor. It could also be a gameplay scene, or even a different game (Metro 2033 Redux). This kind of thing, the lack of clarity in how these sites test the games, is pretty bad.
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
My guess is that it's the Metro Last Light Redux built-in benchmark, but yes, that's pretty poor. It could also be a gameplay scene, or even a different game (Metro 2033 Redux). This kind of thing, the lack of clarity in how these sites test the games, is pretty bad.

I can tell you that for Metro Last Light Redux, the Core i5 2500K and the FX-8150, both with the same HD 7950 @ 1GHz at 1080p High settings and Normal tessellation (lower IQ settings than what they tested in the review), get almost the same average fps. The Core i5 2500K is better, with less stuttering, though.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
My guess is that it's the Metro Last Light Redux built-in benchmark, but yes, that's pretty poor. It could also be a gameplay scene, or even a different game (Metro 2033 Redux). This kind of thing, the lack of clarity in how these sites test the games, is pretty bad.

Funny enough, the bench you posted shows that none of the CPUs could even manage 30 fps minimums at 1080p, implying the game is heavily GPU-limited. I guess playing at sub-30 fps minimums, with drops to 24-25 fps, is just more cinematic on a Skylake i3; the better CPU for a cinematic PC gaming experience when paired with a GTX 960. Sounds like a great gaming PC on the brink of 2016. ;)
 
Last edited:

dark zero

Platinum Member
Jun 2, 2015
2,655
138
106
Oh... I forgot VMs in my chart. But the FX octa-core sits between an i5 and an i7, no matter the generation, in terms of how many virtual machines the CPU can manage. Modules and HT are not as helpful as expected, but modules are far better than logical cores in that regard. However, the weak IPC and weak cores kill the advantage.

On the server side it's different, since there the logical cores are backed by much stronger cores.
 

erunion

Senior member
Jan 20, 2013
765
0
0
Wow, in typical AT fashion, the defense of and cherry-picking for i3s is stronger than ever. Let's just turn a blind eye to all the popular AAA games where the FX performs well, and to all the multi-threaded non-gaming applications, because apparently 99.9% of computer users just play games and nothing else. :sneaky:

AnandTech's CPU sub-forum has become an utter embarrassment of tech knowledge and extremely biased cherry-picking since the time I joined this forum.

You cannot just focus on games where an i3 does well and ignore games where the FX would trounce it handily. That's not an objective comparison.

This is an entirely inappropriate post.

Did you even click the link in the OP? The single game benchmark that the OP included was actually fairly middle of the pack. If he had chosen a newer AAA title, like GTA V, the FX would have been shown in an even worse light.

Plus, the OP's comments echoed the findings of the linked article.
 

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
Whoa guys don't shoot the messenger!

I think most would agree that the i3 is the better CPU for the majority of PC games, and that trend doesn't seem to be changing. For years people have said "wait for the highly multithreaded games!", but nothing has really changed that much, probably as a consequence of the consoles' weak CPUs. Sure, the FX is good in areas besides games, but that is beside the point of the thread.

As for those who say the FX is better for multitasking: most tasks don't require a lot of CPU time, especially while you're playing a game. You certainly can't use Office, a PDF reader and a browser at the same time while playing a game, can you? The only background task I can think of that would use a lot of CPU time without user intervention is encoding, which is something most people don't do. Before you say I don't know what I'm talking about: I've used an i3-4130 as my main CPU before. That i3-4130 had to play games with Skype (group + video), iTunes, Firefox and possibly torrents all running in the background, and it did so without skipping a beat. In fact, it did so just as well as my friend's FX 8320 system, which also ran similar stuff in the background.

Over the past four years I've had the pleasure of running the following CPUs in my main system: Xeon X5650, i7 2600K, i5 4670, i5 4690K, i3-4130 and Phenom II X4 965. All of them have lived up to my expectations in terms of multitasking.

P.S. Not to start WWIII but why do people have 50+ tabs???
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
This is an entirely inappropriate post.

Did you even click the link in the OP? The single game benchmark that the OP included was actually fairly middle of the pack. If he had chosen a newer AAA title, like GTA V, the FX would have been shown in an even worse light.

Plus, the OP's comments echoed the findings of the linked article.

What are you talking about? The overclocked FX actually won in a newer AAA game in the article cited by the OP, namely Batman. And all the other "victories" by the Intel chips were by barely 3 to 4 frames per second. That is such a fractional difference when you are running around 60 fps that most people probably can't even see it with the naked eye.

Anyone running any of the chips (Intel or AMD) in this comparison is going to get an excellent gaming experience regardless. The 2 or 3 fps difference between the various chips in the majority of the benchmarks is arbitrary.

BTW, the benchmarks that RussianSensation posted are definitely more compelling than I would have expected. I knew the FX octo-cores were strong at integer work (which is what I've used them for); I didn't realize how much stronger they seem to be in modern multithreaded games.
 
Last edited:

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Batman? The game that was actually recalled because it's so broken?

There's a good endorsement for AMD.
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
Funny enough, the bench you posted shows that none of the CPUs could even manage 30 fps minimums at 1080p, implying the game is heavily GPU-limited. I guess playing at sub-30 fps minimums, with drops to 24-25 fps, is just more cinematic on a Skylake i3; the better CPU for a cinematic PC gaming experience when paired with a GTX 960. Sounds like a great gaming PC on the brink of 2016. ;)

Funny thing is, this game runs at 60 fps on the consoles most of the time. Techspot probably used settings that are just too high for the 960, and as far as I know this is not a notoriously CPU-limited game anyway.

As always, how you test can change a lot of things, and most sites are very superficial, just using the built-in benchmark or not playing enough. Same game and settings, but a different test scene, can change things considerably.

c3_r.png

c3_j.png


One reason the Eurogamer/Digital Foundry tests are so good is that they actually show you the test scene. The sites that don't could at least describe what they are doing, instead of just showing results without even properly writing the name of the game.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
What about the cost of buying the new motherboard, DDR4 and CPU in order to upgrade to Skylake?

Happy looks down at his signature. :) Not very much... but I went with high-priced RAM, and my CPU runs @ 3.8GHz (stock 3.7) with XMP memory enabled. Haven't tried BCLK overclocking yet.

Compared to an AMD 8350 system, my system will have an upgrade path to a Kaby Lake K CPU late next year. Do you think Zen will drop into the old AMD motherboards?
 
Last edited:

TheELF

Diamond Member
Dec 22, 2012
3,973
730
126
Funny thing is, this game runs at 60 fps on the consoles most of the time.
Aren't you in tunnels all the time in Metro? Only encountering groups of monsters? I only played it a bit, but the limited view distance and sparse objects make for a light game.

The in-game benchmark is intense; the game itself is a corridor shooter, or at least that's what I saw.
 

DrMrLordX

Lifer
Apr 27, 2000
21,629
10,841
136
Yes, it did. The E is just an FX-8300 (debuted in Japan in 2012) with a different multiplier. Same chip; the only difference is a more mature yield. The FX-8320E is a rebrand, much like the Radeon 300 series is to the Radeon 200 series. The reality is that Vishera is a 2012 design.

Eh. That's disingenuous. All the post-week-29 2014 chips have favorable updates to voltage scaling versus older Visheras, and the 8320E is further binned for low leakage. It most certainly is not just an FX-8300.

What happened to the 4330 and 6330 FX chips?

I want to know why they didn't bother benching the 860K or 7870K. I mean seriously, if you're going to have i3s in there, why not AMD's most modern 2M/4T chips as well?
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
Aren't you in tunnels all the time in Metro? Only encountering groups of monsters? I only played it a bit, but the limited view distance and sparse objects make for a light game.

The in-game benchmark is intense; the game itself is a corridor shooter, or at least that's what I saw.

Not all the time, but the consoles seem to hold it at 60 fps well:
https://www.youtube.com/watch?v=eUT819D26uY

So I would guess it's not too bad for the CPUs, but the built-in benchmark might be worse than the rest of the game, kind of like Thief's was (basically made to show some advantage for Mantle in that case, I guess).

The Techspot test is too GPU-limited at times. CPUs still have some impact, but not that much (I understand trying to create a "realistic" combination with a cheaper card, but considering some of the results they should have used lowered settings for some games, and as I said, an AMD GPU would be great, or even better).

The pclab test is more interesting, due to less GPU bottleneck and a more varied set of games:
http://pclab.pl/art66945-10.html

Basically the same thing as before: the i3 wins in some games, the 8-core FX in others... but overall I think it looks better for the i3, especially considering other factors like power and platform.
 
Aug 11, 2008
10,451
642
126
I would call it a clear win for the i3 in this test. It wins nine games to two, with a couple of ties. Most of the games are close, but in a few the FX falls off badly.
 

crashtech

Lifer
Jan 4, 2013
10,524
2,111
146
Are there AAA games where AMD's FX series doesn't perform well? Absolutely. Are there AAA games where an i3 doesn't perform well? Absolutely. [...] should have just jumped into the future and purchased a Skylake i3, right?

Leaving aside for a moment that Skylake i3s are absent from all your examples, there is only one game among them where a Haswell i3 doesn't do well, Armored Warfare, where it posts a 33 fps minimum (some would still consider that playable as a minimum). So even in your own examples, a budget Haswell dual core scores eminently playable framerates in 9 out of 10 games. And Skylake has improved HT even further, as each generation has; note the large difference between the Sandy Bridge i3 and the Haswell i3. Skylake cores have added execution resources that seem to be aimed at increasing throughput with SMT.
 
Last edited:

mysticjbyrd

Golden Member
Oct 6, 2015
1,363
3
0
http://www.techspot.com/review/1087-best-value-desktop-cpu/

Gaming_05.png


The FX is getting thoroughly beaten in games. They also tested the i3-6100 with 2133MT/s and 3000MT/s RAM. I would like to see how low-timing 2133MT/s performs in comparison, though.

AMD CPUs often do poorly when paired with a weak GPU.

This list reeks of cherry-picked titles as well.

What's more, you would be a fool to pick a dual-core CPU going into the future! How is it the best bang for your buck when you have to upgrade in a year or two because it can't handle the better-optimized games of tomorrow? Buying an outdated AMD CPU isn't really a better alternative though, which is why waiting is the best option available atm. Well, unless you have money to throw around, in which case you shouldn't be in the market for i3s and AMD CPUs anyway.
 
Last edited:

TheELF

Diamond Member
Dec 22, 2012
3,973
730
126
What's more, you would be a fool to pick a dual-core CPU going into the future! How is it the best bang for your buck when you have to upgrade in a year or two because it can't handle the better-optimized games of tomorrow?
That is a very brave thing to say, looking at the poorly optimized titles we got over the last few years.

In addition, Inquisition got "optimized" to run only one worker thread, and the day-one patch/optimization of COD Black Ops 3 did the same: they lowered the thread count because it wouldn't start up even on i5s. If this is the level of the "games of tomorrow"...
Developers are starting to see that synchronizing a lot of threads is extremely hard. They see massive frame drops even on the consoles, and they are really trying to make the games run well on the consoles.

How is it the best bang for your buck when you have to upgrade in a year or two
Because you spend only half the money on a dual core, and in 1-2 years you will be able to get a 1-2 generation newer dual core with the money you saved.
It is not worth it anymore to buy a monster PC and keep it for many, many years.