Haswell Celerons officially launched.


nwo

Platinum Member
Jun 21, 2005
What distinguishes a Haswell Celeron from a Haswell i3 (and vice versa)?

Well, the major difference is that i3s have HT and Celerons do not.

But the confusing part to me is Celeron vs. Pentium. Other than the small difference in cache size and clock speed, I fail to see the difference between the two.
 

SPBHM

Diamond Member
Sep 12, 2012
i3s have no Turbo Boost,

it's mostly the difference in clocks and HT, plus AVX support,
and the IGP: i3s (Sandy Bridge and later) have Quick Sync while Pentiums and lower do not, and Haswell i3s have a faster IGP (GT2) compared to the Pentium and lower (GT1).

Also, most of the Pentiums and lower only support lower-clocked memory.
 

AtenRa

Lifer
Feb 2, 2009
Well, to begin, i3s have HT and Celerons do not. They also have Turbo Boost; Celerons certainly do not have either of these two major features.

But the confusing part to me is Celeron vs. Pentium. Other than the small difference in cache size and clock speed, I fail to see the difference between the two.

Core i3 doesn't have Turbo Boost; only Core i5 and i7 have it.
 

NTMBK

Lifer
Nov 14, 2011
Core i3 doesn't have Turbo Boost; only Core i5 and i7 have it.

But it only really makes a difference in mobile: an ultrabook i5 might get a ~50% clock speed increase from Turbo, while a desktop i5 would boost by ~7%.
 

Tsavo

Platinum Member
Sep 29, 2009
What? No 12C, $75 Celeron?

Intel is clearly losing its way here!
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
Well, to begin, i3s have HT and Celerons do not. They also have Turbo Boost; Celerons certainly do not have either of these two major features.

But the confusing part to me is Celeron vs. Pentium. Other than the small difference in cache size and clock speed, I fail to see the difference between the two.

Celeron is a terrible brand. Maybe its only purpose is to get people to move up to the Pentium.

Blueshirt: "Well, we do have this one over here, but it's a Celeron."
Customer: "Oh, I'm not sure about that."
Blueshirt: "But for only $30 more we have this one with a Pentium."
Customer: "Oooo, shiny."
 

Torn Mind

Lifer
Nov 25, 2012
Celeron is a terrible brand. Maybe its only purpose is to get people to move up to the Pentium.

Blueshirt: "Well, we do have this one over here, but it's a Celeron."
Customer: "Oh, I'm not sure about that."
Blueshirt: "But for only $30 more we have this one with a Pentium."
Customer: "Oooo, shiny."

Yes. Many will eschew it based on reputation alone, a rep built in the days of the Pentium 4 and before. In releasing such cheap chips, Intel is likely concerned that folks will just pick the cheapest option instead of a $99 Pentium or a $120 i3, so they slap on the Celeron branding instead of more favorable marketing, such as a bar graph pitting a Celeron against an E6550.
 

nwo

Platinum Member
Jun 21, 2005
Celeron is a terrible brand. Maybe its only purpose is to get people to move up to the Pentium.

That's a good point. The way I look at it is that Intel is trying to take as much of AMD's low-end segment as they can. Celerons are dirt cheap, and they are much more attractive than AMD's budget Semprons. Once you get a Celeron for $30-40, you will eventually upgrade to an i3/i5/i7. That's Intel's thinking.
 

86waterpumper

Senior member
Jan 18, 2010
I call BS on the claim that a G1610 cannot play 1080p video without loading up to 100 percent. Something was wrong with that setup. Even a 1037U (1.8 GHz) doesn't do this. I agree that H81 is a nice chipset for the money.
 

nwo

Platinum Member
Jun 21, 2005
I call BS on the claim that a G1610 cannot play 1080p video without loading up to 100 percent. Something was wrong with that setup. Even a 1037U (1.8 GHz) doesn't do this.

Yeah, that doesn't sound right to me either. Sounds like a driver-related issue to me.
 

Lepton87

Platinum Member
Jul 28, 2009
I can understand that product segmentation is vital to increasing profits and thus lining executives' pockets with even more money; that's all fine and dandy, but what concerns me is the pace of adoption of new ISA extensions. At this rate it will take years for AVX to become standard, not to mention AVX2, which gave a lot of the forum members a hard-on. It was also touted as the next big thing in computing. Gather instructions and auto-vectorization in particular made some members of this very forum ecstatic, even more so than FMA. It's a wonder, at least to me, why Intel doesn't want to encourage the usage of those new extensions as much as it can. A mere 10% improvement in IPC coupled with generally worse overclockability doesn't exactly drive me to dump my top-of-the-line mobo and upgrade to a new CPU and motherboard.
PS. There seems to be a scarcity of reviews; the only one I managed to find is on X-bitlabs, but the graphs seem a bit cluttered to me. I'd like to see how much IPC improved, with the graphs side by side.
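
For concreteness, here is a minimal C sketch of the two AVX2 features mentioned above, a gather load feeding an FMA, using the standard immintrin.h intrinsics. The function and array names are made up for illustration, and the scalar remainder loop is omitted.

#include <immintrin.h>

/* out[i] = table[idx[i]] * scale[i] + bias[i], 8 floats per step. */
void gather_fma(float *out, const float *table, const int *idx,
                const float *scale, const float *bias, int n)
{
    for (int i = 0; i + 8 <= n; i += 8) {
        __m256i vidx = _mm256_loadu_si256((const __m256i *)(idx + i));
        __m256  vt   = _mm256_i32gather_ps(table, vidx, 4);  /* AVX2 gather */
        __m256  vs   = _mm256_loadu_ps(scale + i);
        __m256  vb   = _mm256_loadu_ps(bias + i);
        _mm256_storeu_ps(out + i, _mm256_fmadd_ps(vt, vs, vb));  /* FMA: vt*vs+vb */
    }
}

Built with something like gcc -O2 -mavx2 -mfma, this runs on a Haswell i3 and up; on a Haswell Celeron or Pentium it simply can't execute, which is exactly the segmentation being complained about.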
 

ShintaiDK

Lifer
Apr 22, 2012
I can understand that product segmentation is vital to increasing profits and thus lining executives' pockets with even more money; that's all fine and dandy, but what concerns me is the pace of adoption of new ISA extensions. At this rate it will take years for AVX to become standard, not to mention AVX2, which gave a lot of the forum members a hard-on. It was also touted as the next big thing in computing. Gather instructions and auto-vectorization in particular made some members of this very forum ecstatic, even more so than FMA. It's a wonder, at least to me, why Intel doesn't want to encourage the usage of those new extensions as much as it can. A mere 10% improvement in IPC coupled with generally worse overclockability doesn't exactly drive me to dump my top-of-the-line mobo and upgrade to a new CPU and motherboard.
PS. There seems to be a scarcity of reviews; the only one I managed to find is on X-bitlabs, but the graphs seem a bit cluttered to me. I'd like to see how much IPC improved, with the graphs side by side.

I fully agree on this. It is a little different as such: AVX is already standard in quite a bit of code, and AVX2 is easy as well with auto-vectorization and such.

However, it does create a bit of a PhysX vs. no-PhysX situation. If we talk about a demanding real-time application of some sort, be it a game or something else, no-AVX vs. AVX will be a relatively large difference, and maybe the dealbreaker for any actual benefit, due to the lowest-bar approach. Celerons are already the bottom of the food chain, and ISA extensions are the real place to harvest performance benefits. It's simply getting dumb now.

We now have non-AVX chips, AVX, AVX2 (with 256-bit paths) and "soon" Skylake/Xeon Phi with AVX3.2 (512-bit units, 256-bit paths).

Celerons and Pentiums with Haswell should have had AVX, to my mind. There is no excuse. The segmentation should be at AVX2 and the 256-bit paths.
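
To illustrate the auto-vectorization point, a hedged sketch: the plain C loop below compiles to 256-bit AVX2/FMA code with, say, gcc -O3 -march=haswell, while the identical source falls back to 128-bit SSE2 with gcc -O3 -msse2. Non-AVX Celerons and Pentiums force developers toward that second, lowest-bar build.

/* A textbook saxpy loop; nothing AVX-specific in the source. */
void saxpy(float *y, const float *x, float a, int n)
{
    for (int i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];  /* becomes a vfmadd instruction on AVX2+FMA targets */
}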
 

KompuKare

Golden Member
Jul 28, 2009
The lack of AVX/AVX2 is truly mysterious. If it is as vital to Intel's plans as the hype indicated, it should be on all their Core-derived CPUs. Obviously they designed it so it could be disabled, but if they had put a bit more work into that, they could have designed it so that, by blowing a certain AVX fuse, Celerons and Pentiums run the code at half speed or whatever. That way they would get to have their cake (market segmentation) and eat it too (AVX code adoption). Truly strange.
 

ShintaiDK

Lifer
Apr 22, 2012
The lack of AVX/AVX2 is truly mysterious. If it is as vital to Intel's plans as the hype indicated, it should be on all their Core-derived CPUs. Obviously they designed it so it could be disabled, but if they had put a bit more work into that, they could have designed it so that, by blowing a certain AVX fuse, Celerons and Pentiums run the code at half speed or whatever. That way they would get to have their cake (market segmentation) and eat it too (AVX code adoption). Truly strange.

That wouldn't make sense. Then you could just run the non-AVX code that your compiler also creates.
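
To sketch what "the compiler also creates non-AVX code" looks like in practice (the kernel names here are hypothetical; __builtin_cpu_supports is a real GCC/Clang builtin): the program ships both builds of a hot function and picks one at startup, so a non-AVX Celeron just takes the slower path.

void kernel_avx2(void);     /* built with -mavx2 in its own translation unit */
void kernel_generic(void);  /* baseline fallback the compiler also produces */

void (*kernel)(void);       /* selected once, called everywhere after that */

void pick_kernel(void)
{
    __builtin_cpu_init();                  /* populate the CPU feature flags */
    if (__builtin_cpu_supports("avx2"))
        kernel = kernel_avx2;              /* Haswell i3/i5/i7 */
    else
        kernel = kernel_generic;           /* Celeron/Pentium and older CPUs */
}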
 

KompuKare

Golden Member
Jul 28, 2009
That wouldn't make sense. Then you could just run the non-AVX code that your compiler also creates.

Well, that was the simplistic idea, of course. Obviously this would have meant doing more work for Celeron/Pentium, so they'd have to decide that AVX code adoption is strategic.

But they would have to make it so that, for a given program which benefits from AVX/AVX2, the Celeron/Pentium AVX code path would still be faster than the non-AVX one. Something like this:
Haswell i3/i5/i7: +200% speedup
Celeron/Pentium: +100% speedup
Normal code path: 0% speedup (obviously)
That would make it worthwhile for developers to optimise for AVX (because not everything can be done with a magic compiler flag), and Intel could still sell their more expensive chips too.
 

jhu

Lifer
Oct 10, 1999
Not so much. Those that know/make use of AVX will know which product to buy.

The point is to spread the presence of AVX so that developers will want to target it. AVX2 is as significant as SSE2 was when it was introduced. But I guess Intel doesn't see it that way. Chicken and egg problem.
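
A hedged aside on that chicken-and-egg problem: with GCC's function multi-versioning (the target_clones attribute, added in GCC releases later than those current at this launch), one function definition yields an AVX2 clone, a default clone, and a resolver that picks between them at load time, so the developer-side cost of targeting AVX2 is roughly one attribute.

/* GCC emits an AVX2 version and a default version of this function,
   plus a resolver that selects the right one when the program loads. */
__attribute__((target_clones("avx2", "default")))
float dot(const float *a, const float *b, int n)
{
    float s = 0.0f;
    for (int i = 0; i < n; i++)
        s += a[i] * b[i];   /* auto-vectorized in the AVX2 clone */
    return s;
}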
 

NTMBK

Lifer
Nov 14, 2011
Not so much. Those that know/make use of AVX will know which product to buy.

Intel should want AVX(2) to be used in every single piece of software out there. The more that AVX is used, the better Intel's latest processors look, both to someone sitting on a Core 2 Quad desktop and to someone trying to decide whether to buy a Haswell tablet or an iPad Air.

Intel are intentionally reducing the number of AVX chips in existence. The fewer AVX chips out there, the less incentive there is for a software developer to implement AVX, and hence the less AVX-aware software in existence. It's just a flat-out idiotic move from Intel.
 

NTMBK

Lifer
Nov 14, 2011
And there is a high chance that Cherry Trail would not support it either.

That's fair enough, though: the CPU core literally does not have the ability to execute AVX code. But intentionally fusing it off on Haswell parts is just dumb.
 

Torn Mind

Lifer
Nov 25, 2012
That's a good point. The way I look at it is that Intel is trying to take as much of AMD's low-end segment as they can. Celerons are dirt cheap, and they are much more attractive than AMD's budget Semprons. Once you get a Celeron for $30-40, you will eventually upgrade to an i3/i5/i7. That's Intel's thinking.

I don't think Semprons are the main competition; those chips seem to be "leftovers" that retailers just want to get rid of. It's the A4-4000 series they are competing against, which has the reputation of better graphics, but the graphics performance is actually quite similar.
 