[Techspot] The Best CPU for the Money: Intel Core i3-6100 vs. i3-4360, i5-4430 & AMD


Ancalagon44

Diamond Member
Feb 17, 2010
3,274
202
106
I think a better comparison would have been Kaveri or Godavari. I mean, nobody buys Piledriver anymore. Right? A 4 "core" Kaveri vs a dual-core i3 would be more interesting, especially since Kaveri should be better for games, given that it clocks higher and has higher IPC than Piledriver.

EDIT: Something interesting I just found out - according to AnandTech's Bench page, Kaveri is still not as fast as Thuban clock for clock. It's still about 20% behind.
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
That is a very brave thing to say, looking at the poorly optimized titles we got over the last few years.

In addition, Inquisition got optimized to run only one worker thread, and the day-one patch/optimization of COD Black Ops 3 was the same: they lowered the thread count because it wouldn't start up even on the i5s. If this is the level of the "games of tomorrow"...
Developers are starting to see that synchronizing a lot of threads is extremely hard; they see massive frame drops even on the consoles, and they are really trying to make the games run well on the consoles.


Because you spend only half the money on a dual core, and in 1-2 years you will be able to get a 1-2 generation newer dual core for the money you saved.
It is not worth it anymore to buy a monster PC and keep it for many, many years.
Sorry, but that sounds idiotic. Who the hell would want a stuttery, inconsistent experience in some games with a dual core and then go through the hassle of getting yet another dual-core CPU for a minuscule improvement? Anyone buying a dual-core CPU now for modern games is an idiot.
 

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
202
106
Sorry, but that sounds idiotic. Who the hell would want a stuttery, inconsistent experience in some games with a dual core and then go through the hassle of getting yet another dual-core CPU for a minuscule improvement? Anyone buying a dual-core CPU now for modern games is an idiot.

See the first graph on the first page. The i3 is perfectly capable of producing a good experience in Crysis 3.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
See the first graph on the first page. The i3 is perfectly capable of producing a good experience in Crysis 3.
Well, I was mainly thinking about just a plain dual core, not an i3. And I hate to break it to you, but where you test in a game makes a HUGE difference. Crysis 3 eats even an i5 for lunch in spots and will even bring me down to right at 60 fps for the minimum with my OCed 4770K. Turning off HT in that spot, I am in the 40s.

And framerate is not the only thing that matters, as some games can feel pretty choppy even if the fps counter looks fine. Batman: Arkham Knight feels nowhere near as smooth if I disable HT on my 4770K and will peg the CPU at times. Watch Dogs hitches and stutters in spots if I disable HT and feels terrible. Witcher 3 will hitch a little in some spots too with HT off.

Bottom line: even an i5 really does not cut it anymore in some games at settings a high-end GPU can run.
 
Last edited:

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
These benchmarks are missing something rather important:
[power consumption chart]

Overclocking the 8320E to 4.6 GHz pushes its load power consumption to about 200W, well above the 95W TDP. The i3s should use about 60W under load, consistent with their TDP, without overclocking.

How much extra does it cost to install a PSU that can sustain an extra 140W on its 12V rail? How much does it cost to get an AIO liquid cooler that's powerful enough to cool a 200W CPU? How much extra does it cost to get a case that has good enough airflow to cool all the components when there's an extra 140W in the case?

I am highly skeptical of the claim that the 8320E at 4.6 GHz is actually in the same price range as the i3s and lower i5s.
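
For rough context, here is a back-of-the-envelope sketch of that gap (a Python snippet using the ~200W and ~60W load figures assumed above, not new measurements):

# Assumed load figures from the discussion above, not measurements.
fx_load_w = 200   # FX-8320E at 4.6 GHz, approximate CPU load power
i3_load_w = 60    # i3 under load, approximate

extra_w = fx_load_w - i3_load_w               # ~140 W of additional load
extra_amps_12v = extra_w / 12.0               # extra current the PSU's 12 V rail must sustain
kwh_per_year = extra_w / 1000.0 * 3 * 365     # assuming ~3 hours at load per day

print(f"Extra load: {extra_w} W (~{extra_amps_12v:.1f} A on the 12 V rail)")
print(f"Extra energy: ~{kwh_per_year:.0f} kWh/year at 3 h/day under load")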


Gaming is not AVX code running on eight cores. While gaming, several cores on a four-module FX are often sitting idle or only lightly used; that's not the case for an i3, and often not the case for an i5.
 

TheELF

Diamond Member
Dec 22, 2012
3,967
720
126
Well, I was mainly thinking about just a plain dual core, not an i3. And I hate to break it to you, but where you test in a game makes a HUGE difference. Crysis 3 eats even an i5 for lunch in spots
Yeah, exactly. 95% of Crysis 3 plays at 60 FPS on a dual core, but everybody only counts Welcome to the Jungle, which is a 5-minute passage of the game.

Batman and Watch Dogs are two perfect examples of crappy programming that do not run well no matter what system you have.

A budget gamer will be happy with the Pentium, even if it is a weaker CPU than the i3/i5/i7, because there are things called v-sync and frame limiters which make games run smooth.
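
To illustrate what a frame limiter actually does, here is a minimal sketch in Python (not how any particular game or driver implements it; the simulate_and_render callback is hypothetical):

import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame

def run(num_frames, simulate_and_render):
    # If a frame finishes early, sleep off the rest of the budget so frames
    # are delivered at an even cadence instead of as fast as possible.
    for _ in range(num_frames):
        start = time.perf_counter()
        simulate_and_render()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)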
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
Eh. That's disingenuous. All the post-week-29 2014 chips have favorable updates to voltage scaling versus older Visheras. The 8320E is further binned for low leakage. It most certainly is not an FX-8300.

Exact same die and architecture. A mature yield doesn't mean it is a 2014-designed chip. It was designed in 2012.
 

Rickyyy369

Member
Apr 21, 2012
149
13
81
I am of the belief that there is no reason to buy an AMD FX CPU in 2015 - at least not for gaming. But I actually walked away from that review a little impressed that this incredibly out of date architecture is still only a couple % behind in most games. FX has aged pretty well.
 

mysticjbyrd

Golden Member
Oct 6, 2015
1,363
3
0
Sorry, but that sounds idiotic. Who the hell would want a stuttery, inconsistent experience in some games with a dual core and then go through the hassle of getting yet another dual-core CPU for a minuscule improvement? Anyone buying a dual-core CPU now for modern games is an idiot.

Exactly, and that's assuming you can even get a replacement, with Intel changing the sockets all the time.

I am of the belief that there is no reason to buy an AMD FX CPU in 2015 - at least not for gaming. But I actually walked away from that review a little impressed that this incredibly out of date architecture is still only a couple % behind in most games. FX has aged pretty well.

I wouldn't say it's a matter of aging well so much as the technology approaching a brick wall. I have had this i7-930 for like 5 years now, and it's still doing fine, more or less. If I could OC it, I would be doing better, but I think this board is half shot. Anyways, think back to around 2000. Could you keep a CPU from 2000 to 2005 and still be only a few % behind current-generation tech? Not even close. 1 GHz CPUs compared to 3 GHz, some with dual cores.
 
Last edited:

dark zero

Platinum Member
Jun 2, 2015
2,655
138
106
I am of the belief that there is no reason to buy an AMD FX CPU in 2015 - at least not for gaming. But I actually walked away from that review a little impressed that this incredibly out of date architecture is still only a couple % behind in most games. FX has aged pretty well.

FX and SB are still useful if you want to emulate older but way better (in terms of plot and gameplay) games.
 

majord

Senior member
Jul 26, 2015
433
523
136
Exactly, and that's assuming you can even get a replacement, with Intel changing the sockets all the time.



I wouldn't say it's a matter of aging well so much as the technology approaching a brick wall. I have had this i7-930 for like 5 years now, and it's still doing fine, more or less. If I could OC it, I would be doing better, but I think this board is half shot. Anyways, think back to around 2000. Could you keep a CPU from 2000 to 2005 and still be only a few % behind current-generation tech? Not even close. 1 GHz CPUs compared to 3 GHz, some with dual cores.

Exactly.

Everyone is clutching at straws to make these CPUs look exciting based on gaming performance vs an old FX, the results of which only demonstrate that current games cannot utilize 8 threads properly. You would have to have been living in a cave for a very long time to be impressed by that. Not to mention the fact that Intel have had a [large] IPC advantage for how many years now? I mean, holy shit... is it a slow news week or something?

We're also at the dawn of DX12 titles, which we already know will improve the situation regarding CPU bottlenecks in general but also improve multi-threading. So whilst I do think the present is more important than the future, it's a little late in the "pre-DX12 era" to be making the call that higher thread counts aren't needed for gaming. A little future-proofing is a good thing.

What I find astounding is that none of these reviews compare these new i3s to their actual competition, the 7-series APUs, to see how much the improved IGP has caught up. This is the real problem (potentially) for AMD, as it eats away at the only advantage they've had to counter their lower x86 performance vs Haswell i3s.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Ryse at 1080p with all settings maxed except no motion blur and no SSAA. This was tested in a very graphically demanding part of the game where the GPU was typically well over 90% and approaching 99%, at least when testing with the 4770K fully enabled.


4770K at 4.3

Min, Max, Avg
84, 130, 100.767

This was perfectly smooth, of course.


4770K using just two cores at 4.3 and HT

Min, Max, Avg
58, 122, 71.740

This was mostly smooth, but with some stuttering for sure. No way could vsync be enjoyed like this since it was dropping below 60 fps. A normal i3 would be MUCH slower than two cores of a 4770K at 4.3 with HT and would be a mess trying to maintain 60 fps.


4770K using just two cores at 4.3 and no HT

Min, Max, Avg
36, 102, 51.314

This was a pretty terrible experience and not smooth, as the CPU was pegged basically the whole time. And two cores of my 4770K at 4.3 would blow away any real dual-core CPU; a normal Pentium dual core would be dropping well below 30 fps all over the place.


EDIT: screenshot added to show area tested: http://postimg.org/image/vxx8guarn/full/
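
For reference, Min/Max/Avg figures like these are derived from a frame-time capture; a minimal Python sketch with made-up numbers (tools such as FRAPS typically compute min/max over one-second intervals rather than per frame):

# Hypothetical frame times in milliseconds, not the actual capture above.
frame_times_ms = [9.9, 10.2, 11.8, 14.1, 9.5, 12.0]

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)  # frames / total time
min_fps = 1000.0 / max(frame_times_ms)   # slowest frame -> lowest instantaneous fps
max_fps = 1000.0 / min(frame_times_ms)   # fastest frame -> highest instantaneous fps

print(f"Min {min_fps:.0f}, Max {max_fps:.0f}, Avg {avg_fps:.3f}")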
 
Last edited:
Aug 11, 2008
10,451
642
126
And here's a link to the gaming page of benches: http://www.techspot.com/review/1087-best-value-desktop-cpu/page4.html

Gaming seems like a mixed bag, often the FX is as good as any of them if not better. Seems like this thread exists to start arguing, not much else.

Seriously? Are we looking at the same charts? Overclocked, it wins one benchmark. Otherwise it is pretty consistently at the bottom, even with a huge overclock. The differences are relatively small because they are only using a 960 at fairly high settings.
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
A mature yield

Mature yield? It's binned for low leakage. It can run on ratty-arsed 760G boards that won't accommodate an 8300 except at rock-bottom clockspeeds. It's a whole new animal in terms of how and where you can use it.

Vishera ain't what it used to be.