Broadwell ULV performance review and benchmarks


IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
It's a disappointment that you can't read these results properly. The difference is higher than 20% in Vantage, and in 3DMark11 the difference is roughly 50% despite the lower clocks. 3DMark is not that important; Haswell did quite well there. Gaming performance is an unanswered question.

Compared to Haswell U, the 3DMark11 results are almost 50% higher, while 3DMark Vantage is only 10-20% better (when we isolate it to higher-end scores which feature dual-channel memory).

Games reflect that, being in the 15-25% range at playable settings, but much higher, 40-50%, with newer games at higher (unplayable) settings.

Meanwhile, from HD 3000 to HD 4000 we got 80-90% average gains, with some games showing a 2.5x difference.

Excuse me if I am disappointed that a brand-spanking-new 14nm process with "revolutionary" gains brings a product whose gains are far less than half of what came before, and no better than what other companies (like Nvidia) achieve without the shrink.

You can spin it, bake it and fry it, but it does not change that it's a disappointment. Do you really think even 30% gains are impressive for a GPU, which is embarrassingly parallel, with bottom-of-the-barrel gaming performance, on a new process?

(Who's going to buy eDRAM-equipped chips when the competition will be even further ahead and they don't even have a cost or power advantage?)
 

Dufus

Senior member
Sep 20, 2010
675
119
101
More Core i7-5500U benchmarks: http://news.laptop.bg/revyuta/revyu...ot-novoto-broadwell-pokolenie-na-kompaniyata/

3.11 in Cinebench R11.5. At least the CPU is pretty good. We might be able to see 3.25-3.3 points on the top 5600U part.

This isn't far from the Core i7-4558U (28W)'s score of 3.5 points.

A 3.3 score for i7-5500U has already been posted in the Cinebench thread.
http://forums.anandtech.com/showpost.php?p=37036512&postcount=248

My own testing of 3DMark with the i7-5500U results in power limiting, so the scores don't really paint a true picture of clock-for-clock performance.
 

TreVader

Platinum Member
Oct 28, 2013
2,057
2
0
I'm relieved that the CPU isn't a poor performer, but I am curious to see battery life tests. If 14nm leaks more than 22nm and they up the clock speed, I don't see how they will avoid a decrease in battery life.

The graphics performance is, as usual for Intel, highly disappointing.
 

greatnoob

Senior member
Jan 6, 2014
968
395
136
I'm relieved that the CPU isn't a poor performer, but I am curious to see battery life tests. If 14nm leaks more than 22nm and they up the clock speed, I don't see how they will avoid a decrease in battery life.

The graphics performance is, as usual for Intel, highly disappointing.

CPU performance is nothing new; it matches the 2410M (2.3GHz) exactly, though at half the TDP.
 

Nothingness

Diamond Member
Jul 3, 2013
3,315
2,385
136
My own testing of 3DMark with i7-5500U results in power limiting so they don't really paint a true picture of performance clock for clock.
But it's representative of the devices where the CPU is used, and that's a concern, since you don't buy a CPU but a system.
 

mikk

Diamond Member
May 15, 2012
4,308
2,395
136
Haswell U's 3DMark11 results are almost 50% higher while 3DMark Vantage is only 10-20% better(when we isolate it to higher end scores which feature dual channel memory).


Haswell ULT scores ~3300 (same driver, dual-channel, comparable CPU) versus 4300 for Broadwell ULT (which I doubt is a higher-end result). This is a 30% difference. And by the looks of it, the Broadwell ULT GPU clocks 100-150 MHz lower than Haswell ULT. I remember that Haswell ULT mostly clocked at around 1000 MHz in tests; I doubt Broadwell ULT can hold a steady max Turbo of 900/950 MHz. Clock-for-clock efficiency seems quite a bit better in 3DMark. An "unlimited" TDP model like Broadwell-K isn't necessarily lower clocked than its HSW 47W/65W predecessor.
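Taking the rough figures above at face value (3300 vs. 4300 points, ~1000 MHz vs. ~900 MHz; these are the ballpark numbers quoted in this thread, not measurements), the clock-for-clock gap works out like this:

```python
# Rough clock-for-clock comparison using the scores and clocks quoted above.
hsw_score, hsw_clock = 3300, 1000   # Haswell ULT 3DMark11, ~MHz GPU clock
bdw_score, bdw_clock = 4300, 900    # Broadwell ULT 3DMark11, ~MHz GPU clock

raw_gain = bdw_score / hsw_score - 1
per_clock_gain = (bdw_score / bdw_clock) / (hsw_score / hsw_clock) - 1

print(f"raw gain:       {raw_gain:.0%}")        # ~30%
print(f"per-clock gain: {per_clock_gain:.0%}")  # ~45%
```

So if the lower Broadwell clocks hold up, the per-clock improvement would be noticeably larger than the raw 30% score difference.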
 

Dufus

Senior member
Sep 20, 2010
675
119
101
But it's representative of the devices where the CPU is used, and that's a concern, since you don't buy a CPU but a system.
Yes, it's important to know whether the device/system you might be interested in cripples the performance of the CPU, so it is very handy in that respect.

The i7-5500U system I tested is power limited by the manufacturer to 15W, which is its TDP. TDP is a specification for providing a minimal power and cooling solution for the worst-case scenario, not the maximum power the processor can use. The CPU itself, however, supports an unlimited power limit (unlimited being the usual up to 4kW for up to 42 days, IIRC). From that testing there is also plenty of thermal headroom to run higher than 15W, so provided the system's power delivery is up to it, why not run it with enough power that it does not throttle when on AC power?

Personally I would like to see the full performance the CPU is capable of and use that for my buying decisions, of course also taking into account the hardware it might be used in.
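For anyone who wants to check this kind of manufacturer limit on their own machine: on Linux, the package power limits (PL1/PL2) are usually exposed through the intel_rapl sysfs interface. A minimal sketch (the path assumes the intel_rapl driver is loaded; the domain numbering can vary per system, and reading may require root):

```python
# Sketch: read the package power limits (PL1 long-term / PL2 short-term)
# from the Linux intel_rapl power-capping sysfs interface.
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl:0")  # package power domain

def uw_to_watts(microwatts: int) -> float:
    """sysfs reports power limits in microwatts."""
    return microwatts / 1_000_000

def read_limit(constraint: int) -> float:
    """Power limit in watts for constraint 0 (long term) or 1 (short term)."""
    path = RAPL / f"constraint_{constraint}_power_limit_uw"
    return uw_to_watts(int(path.read_text()))

if RAPL.exists():
    print("PL1 (long term): ", read_limit(0), "W")  # e.g. 15.0 on a TDP-limited ULT system
    print("PL2 (short term):", read_limit(1), "W")
else:
    print("intel_rapl interface not present on this system")
```

On a system locked down the way Dufus describes, PL1 would read back as the 15W TDP even though the silicon could sustain more with adequate cooling.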
 

thunng8

Member
Jan 8, 2013
167
72
101
Yes, it's important to know whether the device/system you might be interested in cripples the performance of the CPU, so it is very handy in that respect.

The i7-5500U system I tested is power limited by the manufacturer to 15W, which is its TDP. TDP is a specification for providing a minimal power and cooling solution for the worst-case scenario, not the maximum power the processor can use. The CPU itself, however, supports an unlimited power limit (unlimited being the usual up to 4kW for up to 42 days, IIRC). From that testing there is also plenty of thermal headroom to run higher than 15W, so provided the system's power delivery is up to it, why not run it with enough power that it does not throttle when on AC power?

Personally I would like to see the full performance the CPU is capable of and use that for my buying decisions, of course also taking into account the hardware it might be used in.


That is unrealistic. Intel sells other CPUs that have different TDP figures. If they allowed a 15W CPU to go to unlimited power, what would be the use of producing 28W, 45W, or even desktop CPUs?
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
It's pretty neat. I just hope that OEMs put it into good systems that make the correct user-experience-focused trade-offs.

The OEMs need to drop the super-high-resolution panels (esp. since Windows DPI scaling is pretty bad), which drain battery life, as well as the crazy obsession with paper-thin designs, and instead focus on better touch pads, sturdier industrial design, and higher-quality displays (color, viewing angles, etc.)

Sounds like you're describing a MacBook
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
I'm relieved that the CPU isn't a poor performer, but I am curious to see battery life tests. If 14nm leaks more than 22nm and they up the clock speed, I don't see how they will avoid a decrease in battery life.

The graphics performance is, as usual for Intel, highly disappointing.

My gut feeling is that OCing headroom for the desktop parts is gonna be just as bad or even worse than the non-Devil's Canyon Haswell chips. If Intel ups the stock clocks by 100MHz as usual but the headroom falls to the 4-4.1GHz range, overclocking will be dead for anybody with a lick of common sense.

Paying an extra $100 for a ~15% OC already seems silly as it is...
 

Dufus

Senior member
Sep 20, 2010
675
119
101
That is unrealistic. Intel sells other CPUs that have different TDP figures. If they allowed a 15W CPU to go to unlimited power, what would be the use of producing 28W, 45W, or even desktop CPUs?

As already stated, TDP is a specification for providing a minimal power and cooling solution for the worst-case scenario, not the maximum power the processor can use. A 15W TDP means designing a system for at least that amount of power so the chip runs reasonably at its specified clocks. The same goes for other TDPs.

I have an i7-4700MQ with a TDP of 47W. Running Linpack using its two extra unlocked turbo bins and some BCLK adjustment to run 4 cores at 3.5GHz (default is 3.2GHz) results in a package power of 80W. Power delivery is okay, but thermals are the limiting factor, allowing only a minute or so at that power before TCC kicks in, depending on room temperature.


i7-5500U power limits

 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
60% with a product "clock for clock" that doesn't exist.

And it's 60% with four extra EUs (20% more). So, a 40% performance improvement over Haswell at the same clocks and the same EU count?

If that's your math, then dividing the ratios gives a ~33% improvement per clock per EU (1.60 / 1.20 ≈ 1.33), not 60% minus 20%.
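Treating the figures as ratios rather than subtracting or dividing percentage points gives the per-EU number; a quick sketch using the 60% / 20% figures from the quote:

```python
# Per-EU scaling check: divide speedup ratios, not percentage points.
total_speedup = 1.60   # claimed 60% overall gain vs. Haswell
eu_ratio = 1.20        # 20% more EUs

per_eu_gain = total_speedup / eu_ratio - 1
print(f"per-EU, per-clock gain: {per_eu_gain:.0%}")  # ~33%

# Naively subtracting (60% - 20% = 40%) or dividing the percentage
# points (60 / 1.2 = 50) both overstate the per-EU improvement.
```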
 

mikk

Diamond Member
May 15, 2012
4,308
2,395
136
20% more EUs does not result in 20% more performance (unfortunately); there are too many other limitations and bottlenecks which affect GPU performance. Vantage doesn't look so great, but 3DMark11 seems 50% faster with a lower GPU clock. Hopefully 3DMark11 is more representative of games than Vantage. We should see the first ULT tests soon.
 
Mar 10, 2006
11,715
2,012
126
I'm hoping Skylake-U + eDRAM will actually launch in 2H 2015. That should be an even bigger graphics boost.
 

mikk

Diamond Member
May 15, 2012
4,308
2,395
136
The 28W GT3 parts were disappointing for Haswell. These 48 EU parts really need eDRAM to make a big difference. I doubt this will happen before H1 2016. Maybe we will see Broadwell ULT + eDRAM in H2 2015.
 
Mar 10, 2006
11,715
2,012
126
The 28W GT3 parts were disappointing for Haswell. These 48 EU parts really need eDRAM to make a big difference. I doubt this will happen before H1 2016. Maybe we will see Broadwell ULT + eDRAM in H2 2015.

Is Broadwell ULT + eDRAM even a planned SKU?
 

hhhd1

Senior member
Apr 8, 2012
667
3
71
I'm confused about the harping on graphics performance for a ULV part. The point of these chips is power sipping.

I have never bought an Intel laptop running on their integrated graphics thinking its graphics performance was anything more than good enough for web and office type stuff.

If someone really wants decent graphics they will have to get a non-ULV part or look for an added discrete GPU, like a GT 820M or beefier.

Haswell's integrated graphics is better than the GT 820M you mention (in some benchmarks):

http://www.videocardbenchmark.net/gpu.php?gpu=Intel+HD+4600
http://www.videocardbenchmark.net/gpu.php?gpu=GeForce+820M&id=2785