So how does Intel Core M's HD 5300 compare to Intel Iris Pro?

john5220

Senior member
Mar 27, 2014
551
0
0
How does this performance comparison look?

Is the next-gen Intel iGPU much better than the HD 4000 etc.?

Specifically for ultrabooks: I am looking into buying an Acer Aspire S3 ultrabook, but I'm wondering if they might come with Core M next year.
 

III-V

Senior member
Oct 12, 2014
678
1
41
Should be, what, 80% faster than HD 4000? Not sure about Iris Pro.

Core M is mostly going into tablets. Broadwell will still appear as Core i3/i5/i7, which is what would go in your ultrabook. Broadwell-U is about three quarters out.
 

Burpo

Diamond Member
Sep 10, 2013
4,223
473
126
http://www.ultrabookreview.com/5165-broadwell-ultrabooks/
"the 5th generation Core platform should deliver up-to increased CPU performance and improved graphics over the 4th generation Haswell solution with Intel HD 4400 chips, while running more efficient at the same time.

On top of those, the high efficiency side of the Broadwell mobile line, also known as Intel Core-M (successor of the Haswell Core Y) is expected to spur a fair-number of extra-thin and fanless designs."

Acer Aspire Switch 12

https://www.youtube.com/watch?v=Jq1Vd3rLf8Y
 

tarlinian

Member
Dec 28, 2013
32
0
41
How does this performance comparison look?

Is the next-gen Intel iGPU much better than the HD 4000 etc.?

Specifically for ultrabooks: I am looking into buying an Acer Aspire S3 ultrabook, but I'm wondering if they might come with Core M next year.

The HD 5300 is definitely not going to be faster than any current Iris Pro chip. You simply can't make up for Iris Pro's extra EUs, higher TDP, and eDRAM.

I think a 30%+ improvement over the HD 4400 is probably a reasonable expectation though...
 

Roland00Address

Platinum Member
Dec 17, 2008
2,196
260
126
The HD 5300 is definitely not going to be faster than any current Iris Pro chip. You simply can't make up for Iris Pro's extra EUs, higher TDP, and eDRAM.

I think a 30%+ improvement over the HD 4400 is probably a reasonable expectation though...

Exactly this. Iris Pro has 66% more EUs; there is a limit to how impressive this new architecture can be in a 5W TDP. Also, the HD 5300's max turbo will be 850 MHz with 24 EUs, vs. 20 EUs at 1100 MHz for 15W Haswell.

I am expecting great performance for a fanless tablet, but there are limits.
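
As a rough back-of-the-envelope sketch of that point, here is the EU-count times max-clock arithmetic. Performance does not scale perfectly with this product, and the Iris 5100 clock used here is approximate, so treat it as an upper bound rather than a prediction:

Code:
# Back-of-the-envelope sketch: peak shader budget as EU count x max turbo clock.
# Real performance does not scale perfectly with this product, and the Iris 5100
# clock below is approximate -- this only bounds the comparison, it is not a model.
parts = {
    "HD 5300 (Core M, ~5W)":      (24, 850),   # EUs, max turbo MHz (from this thread)
    "HD 4400 (Haswell-U, 15W)":   (20, 1100),
    "Iris 5100 (Haswell-U, 28W)": (40, 1200),  # roughly 1.1-1.2 GHz depending on SKU
}

base_eus, base_mhz = parts["HD 5300 (Core M, ~5W)"]
for name, (eus, mhz) in parts.items():
    ratio = (eus * mhz) / (base_eus * base_mhz)
    print(f"{name}: {eus} EU x {mhz} MHz -> {ratio:.2f}x the HD 5300's raw budget")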
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
How does this performance comparison look?

Is the next-gen Intel iGPU much better than the HD 4000 etc.?

HD 4000? You mean the one on Ivy Bridge that came in 2012? Sure. Maybe about 30%.

Against Iris Pro? No way at all. You simply can't make up for that big of a difference in two generations nowadays, let alone one. You'll probably have to wait for 10nm Cannonlake in 2017 *and* with eDRAM enabled to merely match Iris Pro. If you are lucky.

I think a 30%+ improvement over the HD 4400 is probably a reasonable expectation though...

I highly doubt this. I think it'll be a very good result if at 6W cTDPup mode Core M devices can merely match HD 4400 running at much higher 15W.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I believe there will be some benchmarks where, even at its default TDP, the HD 5300 could come very close to the HD 5000 at 15W TDP.

But once again the naming is misleading: the HD 5300 at 4-6W TDP will not come even close to the 28W TDP HD 5100.
 

john5220

Senior member
Mar 27, 2014
551
0
0
OK, so how come those smartphones with Snapdragon 800 series chips and the Adreno 330 GPU are so powerful compared to the Intel HD 4000?

And they use so few watts?

Also, what about competing against the Iris 5100? Can the HD 5300, which has a higher model number, compete with the HD 5100?
 

dahorns

Senior member
Sep 13, 2013
550
83
91
OK, so how come those smartphones with Snapdragon 800 series chips and the Adreno 330 GPU are so powerful compared to the Intel HD 4000?

And they use so few watts?

Also, what about competing against the Iris 5100? Can the HD 5300, which has a higher model number, compete with the HD 5100?

Uh, what is this based on?

I mean, the HD 4000 gets something like 44,000 on Ice Storm Unlimited for the graphics score. Even the Adreno 420 seems to top out around 20,000.

Yes, we are obviously talking about completely different power consumption numbers, but the HD 4000 is also really old.
 

kimmel

Senior member
Mar 28, 2013
248
0
41
OK, so how come those smartphones with Snapdragon 800 series chips and the Adreno 330 GPU are so powerful compared to the Intel HD 4000?
You mean less than half as powerful...
And they use so few watts?
Because they are less than half as powerful...

It boggles the mind that Qcom and Apple have succeeded in marketing so well that people keep posting this uninformed junk.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
You think it even competes, with the dispatcher and tessellation tricks?
I agree, but it's just a different level of marketing BS for different segments.
The level of uninformed junk is about the same.
 

III-V

Senior member
Oct 12, 2014
678
1
41
HD 4000? You mean the one on Ivy Bridge that came in 2012? Sure. Maybe about 30%.

Against Iris Pro? No way at all. You simply can't make up for that big of a difference in two generations nowadays, let alone one. You'll probably have to wait for 10nm Cannonlake in 2017 *and* with eDRAM enabled to merely match Iris Pro. If you are lucky.



I highly doubt this. I think it'll be a very good result if at 6W cTDPup mode Core M devices can merely match HD 4400 running at much higher 15W.
Your estimates are way, way off. The HD 4600 was found to be ~24% faster on average than the HD 4000 in AnandTech's Iris Pro review, which included both in its results.

Gen 8 should be 30-40% faster than Gen 7.5. That would put it a little below HSW Iris Pro, particularly at lower resolutions.
You think it even competes, with the dispatcher and tessellation tricks?
I agree, but it's just a different level of marketing BS for different segments.
The level of uninformed junk is about the same.
What in the world are you going on about?
 

Roland00Address

Platinum Member
Dec 17, 2008
2,196
260
126
Your estimates are way, way off. The HD 4600 was found to be ~24% faster on average than the HD 4000 in AnandTech's Iris Pro review, which included both in its results.

Gen 8 should be 30-40% faster than Gen 7.5. That would put it a little below HSW Iris Pro, particularly at lower resolutions.

What in the world are you going on about?

You do realize this is a 5W SKU we are talking about? You can't compare a 5W SKU to the 77W Intel HD 4000 or the 57W Iris Pro.

For review, let's look at the TDP of the entire chip and compare (IP stands for Iris Pro, to make the table line up; data from AnandTech, Bioshock Infinite at 1366x768. I picked this game because all the reviews included it; it is just illustrating a point, so the specific game does not matter too much.)
45.2 fps - Intel IP 5200, 57W (i7-4950HQ, with 47W in BIOS)
43.5 fps - Intel IP 5200, 47W (i7-4950HQ, with 47W in BIOS)
27.0 fps - Intel HD 4600, 84W (i7-4770K)
21.7 fps - Intel HD 4000, 77W (i7-3770K)
20.4 fps - Intel HD 5000, 15W (i5-4250U)
17.4 fps - Intel HD 4400, 15W (i7-4500U)
16.4 fps - Intel HD 4000, 17W (i7-3517U)

Why do the two Intel HD 4000 entries, the exact same GPU, have such different scores? Either it is thermally limited or CPU limited. Let's look at the specs of the GPUs to find out.

45.2 fps - Intel IP 5200, 57W - eDRAM + 40 EU @ 200-1300 MHz (boost)
43.5 fps - Intel IP 5200, 47W - eDRAM + 40 EU @ 200-1300 MHz (boost)
27.0 fps - Intel HD 4600, 84W - 20 EU @ 350-1250 MHz (boost)
21.7 fps - Intel HD 4000, 77W - 16 EU @ 650-1150 MHz (boost)
20.4 fps - Intel HD 5000, 15W - 40 EU @ 200-1100 MHz (boost)
17.4 fps - Intel HD 4400, 15W - 20 EU @ 200-1100 MHz (boost)
16.4 fps - Intel HD 4000, 17W - 16 EU @ 350-1150 MHz (boost)

So you are telling me a 40 EU part only gains 17% over a 20 EU part, even though it has double the GPU resources? And this can't be a CPU bottleneck, since the 20 EU part has the better CPU (i7 at 1.8 GHz with a 3.0 GHz turbo, vs. i5 at 1.3 GHz with a 2.6 GHz turbo).

The 84W part should not be kicking the ass of the 15W 40 EU part, since the latter has almost double the resources (the frequency is 13% higher on the 84W part, so not quite double), unless the 15W GPU can't use its turbo to its full potential because it would blow past its TDP: 27.0 fps for the 20 EU 84W part vs. 20.4 fps for the 40 EU 15W part.

Also compare the 17W HD 4000 vs. the 77W HD 4000: 16 EUs and the same turbo frequency, yet 21.7 vs. 16.4 fps.
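
To make that concrete, here is a small sketch, using only the numbers quoted above, that compares the observed fps against each part's raw EU x max-clock budget. It is only meant to illustrate how far the 15W parts sit below their paper specs, not to model the GPUs:

Code:
# Sketch: observed Bioshock Infinite fps vs. raw EU x max-clock budget, using the
# AnandTech numbers quoted in this post. If performance scaled purely with EUs and
# clocks, the 40 EU / 15W HD 5000 would roughly double the 20 EU / 15W HD 4400 --
# instead it gains ~17%, because both are pinned against the 15W wall.
results = [
    # (name,               fps,  EUs, max boost MHz)
    ("Iris Pro 5200, 57W", 45.2, 40, 1300),
    ("Iris Pro 5200, 47W", 43.5, 40, 1300),
    ("HD 4600, 84W",       27.0, 20, 1250),
    ("HD 4000, 77W",       21.7, 16, 1150),
    ("HD 5000, 15W",       20.4, 40, 1100),
    ("HD 4400, 15W",       17.4, 20, 1100),
    ("HD 4000, 17W",       16.4, 16, 1150),
]

base_fps, base_budget = 17.4, 20 * 1100   # 15W HD 4400 as the reference point
for name, fps, eus, mhz in results:
    budget = eus * mhz
    print(f"{name:>18}: {fps:4.1f} fps | "
          f"{fps / base_fps:.2f}x observed vs {budget / base_budget:.2f}x raw budget")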

Your estimates are way, way off. The HD 4600 was found to be ~24% faster on average than the HD 4000 in AnandTech's Iris Pro review, which included both in its results.

Gen 8 should be 30-40% faster than Gen 7.5. That would put it a little below HSW Iris Pro, particularly at lower resolutions.
And you think a similar thing will not happen with a 5W part? I am sorry, but we will be lucky to break 30 fps in Bioshock Infinite at 1366x768 on the Value preset.

Or to put it another way: 49,917 is the 3DMark Ice Storm Unlimited score for the Intel HD 5300 Core M reference tablet. The Surface Pro 3 gets 48,173 on an Intel HD 4400 in a 15W SKU, and the Surface Pro 3 sometimes throttles. That is a sub-4% difference. Now, 3DMark is not always representative of real game performance, but to make a 15W Intel HD 4400 hit 20 fps instead of 17.4 we would need a 15% improvement, and we need to achieve that in an SoC with one third the TDP. Hoping for a 72% improvement at one third the TDP (to hit 30 fps) is dreaming, let alone the roughly 160% improvement (2.6x) needed to make 17.4 equal 45.2, all within a 5W TDP.
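
For clarity, here is the arithmetic behind those percentages (again just a sketch, using only the scores quoted in this post):

Code:
# Sketch of the arithmetic behind the percentages above, using the scores quoted here.
core_m_score = 49917   # 3DMark Ice Storm Unlimited, Core M (HD 5300) reference tablet
sp3_score    = 48173   # Surface Pro 3, 15W HD 4400

lead = (core_m_score / sp3_score - 1) * 100
print(f"Core M lead over the 15W HD 4400 in 3DMark: {lead:.1f}%")   # roughly 3.6%

hd4400_fps = 17.4      # Bioshock Infinite 1366x768 Value, from the table above
for target_fps in (20, 30, 45.2):
    uplift = (target_fps / hd4400_fps - 1) * 100
    print(f"{target_fps} fps would need a {uplift:.0f}% uplift over the 15W HD 4400")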

Let's put it this way: if Intel were able to achieve this, Nvidia would fire the Maxwell GPU engineers who did such amazing work and hire the Intel GPU engineers in a heartbeat. They would be throwing money at them, because that kind of improvement in efficiency would be astounding.

witeken said:
We'll have a review soon.
Smart man, I made the same point earlier but somehow got sucked in :p
 

tarlinian

Member
Dec 28, 2013
32
0
41
I highly doubt this. I think it'll be a very good result if at 6W cTDPup mode Core M devices can merely match HD 4400 running at much higher 15W.

I was assuming that the GT2 part in U-Broadwell will also be called HD5300. Upon further thought, that's probably not true. Yes, HD5300 will at best be similar to HD4400 at 15 watts.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Nomenclature is a mess as always; the HD 5300 will actually be slower than the HD 4600 or the HD 5100.
 

Roland00Address

Platinum Member
Dec 17, 2008
2,196
260
126
I was assuming that the GT2 part in U-Broadwell will also be called HD5300. Upon further thought, that's probably not true. Yes, HD5300 will at best be similar to HD4400 at 15 watts.

We only know the clock speeds of the HD 5300, but this is what we know about Broadwell graphics so far.

What we know for sure (names and EUs):
Intel HD Graphics - 12 EU (cheaper chips such as Pentiums)
Intel HD Graphics 5300 - 24 EU, 800 or 850 MHz depending on SKU
Intel HD Graphics 5500 - 24 EU
Intel HD Graphics 5600 - 24 EU
Intel HD Graphics 6000 - 48 EU
Intel Iris Graphics 6100 - 48 EU
Intel Iris Pro Graphics 6200 - 48 EU
Intel Iris Pro Graphics P6300 - 48 EU


Everything below is guesswork. Why the P6300 has a P in the name I do not know; if I were to guess, it signals the eDRAM plus the overclockable socketed chip that is rumored. The rest is also a guess (based on Haswell's numbering scheme), but:
5300 - 5W SoCs
5500 - 15W and 28W TDP ultrabook-type processors
5600 - 37W and 47W mainstream dual- and quad-core processors, plus desktop TDPs (that said, who uses a 37W dual-core mainstream chip anymore? The only case I can think of is laptops with quad-core motherboards that offer a cheaper dual-core option, since they only have to swap the CPU and some people want the cheaper option)
6000 - 48 EU at a 15W TDP; the nicer ultrabooks, like the Haswell MacBook Air
6100 - 48 EU with a higher minimum TDP; if I were to guess, 37W and above
Pro 6200 - 48 EU plus eDRAM; the best laptop part you can get
Pro P6300 - 48 EU plus eDRAM; the rumored unlocked desktop processor with eDRAM
 

III-V

Senior member
Oct 12, 2014
678
1
41
You do realize this is a 5W SKU we are talking about? You can't compare a 5W SKU to the 77W Intel HD 4000 or the 57W Iris Pro.
Smart man, I made the same point earlier but somehow got sucked in :p
Well, you could have just left it at the quote above. But really, the issue is that the question that the OP's asking doesn't make much sense. Core M is Broadwell-Y, and the vast majority of ultrabooks are going to be using Broadwell-U. The thread title would be better written as "how do apples compare to oranges?"

Comparing apples to apples though, in this case, Haswell-U to Broadwell-U, should be fairly similar to what I had said, even though the comparison I made was with desktop models.
Nomenclature is a mess as always; the HD 5300 will actually be slower than the HD 4600 or the HD 5100.
Yes, there's no doubt that Intel's nomenclature is a mess, and easily the worst of the three big GPU vendors.
 

Roland00Address

Platinum Member
Dec 17, 2008
2,196
260
126
Well, you could have just left it at the quote above. But really, the issue is that the question that the OP's asking doesn't make much sense. Core M is Broadwell-Y, and the vast majority of ultrabooks are going to be using Broadwell-U. The thread title would be better written as "how do apples compare to oranges?"

Comparing apples to apples though, in this case, Haswell-U to Broadwell-U, should be fairly similar to what I had said, even though the comparison I made was with desktop models.

Okay I think everyone is on the same page now :)

Yes, there's no doubt that Intel's nomenclature is a mess, and easily the worst of the three big GPU vendors.

And the sad fact is that with Haswell's HD, Iris, and Iris Pro branding, Intel had a chance to start over and get the naming right. How hard would it have been to do something like this?

Better name = what Intel actually named it

eDRAM + 40 EU
Iris Pro 4900 = Intel Iris Pro 5200
40 EU
Iris 4700 Graphics = Intel Iris 5100
Iris 4600 Graphics = Intel Iris 5000
20 EU
AHD 4400 Graphics = Intel HD 4600 (laptop)
AHD 4300 Graphics = Intel HD 4400 (ultrabook)
AHD 4200 Graphics = Intel HD 4200 (tablet)
10 EU
Intel HDv4 Graphics = Intel HD Graphics

Just four easy tiers, and the name tells you both the generation and where the part sits on the product stack:

Iris Pro
Iris
Advanced High Definition (AHD), or another name like HD+ or something similar
HDv4, showing it can do 1080p and that it is version 4 of the graphics, without making people think it is a gaming part. The fact that it is a "4" rather than a 4000-style number easily tells people it is the cheap one.

Then with Broadwell it is the same pattern, just with the 4 replaced by a 5. You give people a rough idea of where a part sits on the performance stack, and the name conveys the feature set in an easy fashion.
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
Nomenclature is overrated in my opinion. The name doesn't matter. Neither does the company. The name tells you nothing: not the clock speed, nor the number of shaders, nor anything about the architecture. If you see some name -- and you happen not to know the details of what it represents -- you go to Wikipedia and look it up.

Intel's naming isn't too bad. Their top SKUs aren't named HD but Iris (Pro). To differentiate, the SKU family number is bumped by one when the part uses the next GT configuration up.
So if there is a new Gen architecture, they increase the number (e.g. from 3 to 4, or 4 to 5), and if the part is above GT2, the number is increased again.

This was also the case with Sandy Bridge, which had HD 2000 and HD 3000. Ivy Bridge had HD 2500 and HD 4000.


Much ado about nothing.
 

Roland00Address

Platinum Member
Dec 17, 2008
2,196
260
126
Nomenclature is overrated in my opinion. The name doesn't matter. Neither does the company. The name tells you nothing. Not the clock speed, nor the amount of shaders, nor anything about the architecture. If you see some name -- and you happen not to know the details of what it represents -- you go to Wikipedia and you look it up.

I agree when it comes to techies, but normal people do not think that way. Normal people do not realize that Wikipedia has hardware knowledge like that, and even if they go there, they do not understand the difference between an EU, a ROP, or a clock speed.

Thus naming matters not for the techies but for normal people. Being able to say: 4th generation came out in 2013; 1 to 9 is the speed, 9 being best; and even if you get a 6 it is nowhere near as good as a 9, which you can tell by looking for the Iris Pro, Iris, or HD moniker.

In effect, Nvidia's current naming strategy. Except with Intel it would work even better, because their generations are homogeneous: you don't have to worry about rebrands, since Intel keeps the GPU technology updated along with its CPUs. They are not going to throw a Penryn-era 4500HD-class GPU on a Haswell chip, but you do find this with Nvidia, where the lower numbers may be based on a five-year-old architecture.

Much ado about nothing.
I do agree there are limits to naming. I would never want the name to tell me things like ROPs, shaders, or clock speed; that is too much info for a name. That is why you have Intel ARK, Wikipedia, or preferably somewhere easy to read on the box or the manufacturer's website.

You should balance an easy-to-remember name with a name that tells you some information without being overwhelming.

If the name (plus Wikipedia) were all that mattered, you could name the graphics after dead US presidents. You see, this GPU is amazing because it is the Washington; the Nixons are the integrated ones. But what about the middle ones: are the Johnsons or the Jacksons any good?
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
Fair enough, but I honestly think that your purchase decision should not be based on specs alone. This is what reviews are for, although I admit it isn't realistic to expect that from "normal people". But this isn't the case just with things like parts nomenclature. How about whole products like iPhones? I'm very certain most people would be just as well off with a cheap, lower-end phone instead of an expensive $800 one.
 

Roland00Address

Platinum Member
Dec 17, 2008
2,196
260
126
Fair enough, but I honestly think that your purchase decision should not be based on specs alone. This is what reviews are for, although I admit it isn't realistic to expect that from "normal people". But this isn't the case just with things like parts nomenclature. How about whole products like iPhones? I'm very certain most people would be just as well off with a cheap, lower-end phone instead of an expensive $800 one.

I agree, but you have to understand that people make a lot of decisions on instinct rather than rational thought. Did you look at the nutrition label before lunch? At the supermarket, how many things did you buy because of the picture or buzzwords like "low carb"? Did you look at the label to see if it had fewer carbs than the other brand? Same thing with price: did you check the cents per ounce?

Very simple names are better than confusing names. Why? Because we only have so much decision-making capacity per day, and simple, honest names make life easier. Sometimes we do not realize our error until later: wouldn't you be annoyed if your $900 convertible tablet could not play StarCraft II or Diablo, even though you thought it could when you bought it?