There has been a lot of debate about whether Intel CPUs stay within their TDP limit or draw more power than they are rated for, so I decided to measure power on my laptop under a variety of conditions.
Lenovo Y580
i7-3630QM / GTX 660M / 8 GB 1600 MHz RAM / 1080p / Plextor M5M mSATA SSD + WD Black 500 GB HDD
Using a ‘Hampton Energy Monitor’ ($20 at Canadian Tire; probably not terribly accurate, so assume ±5%)
All values are measured at the wall, BEFORE the power brick. Values may vary by a couple of watts (e.g., a 30 watt reading is really 30 ± 2 watts).
(Note: my notebook BIOS is acting strangely for some reason; the IGP is not allowed to go over 650 MHz. It could before, but not now.)
Idle
Min brightness, ‘power saver’ settings, IGP, WiFi off: 12 watts (CPUID HWMonitor reports 4.5 watts for the CPU)
+ max brightness: 17 watts
+ WiFi: 17 watts
+ ‘max performance’ setting (idle CPU clock goes from 1.2 GHz to 3.2 GHz): 19 watts (CPU: 6-7 watts)
+ dGPU (overclocked to 1085/2500): 19 watts (I believe that even with Optimus set to the dGPU, the IGP is used when there is little to no load)
+ monitoring software open + Chrome (6 tabs): 28 watts. This forces the dGPU active.
(Note: oddly, having GPU-Z or MSI Afterburner running forces the dGPU active, even if it is turned off in the NVIDIA Control Panel.)
Do Intel’s mobile chips exceed their TDP values?
Base power consumption as measured above is 28 watts.
Handbrake (Big Buck Bunny encode): 68 watts (same procedure as in a thread a couple of months ago). 40 watts for the CPU package at 3.2 GHz, average encoding speed 178 fps.
Cinebench R11.5 64-bit multi: 64 watts (37 watts CPU package, 3.2 GHz, 6.37 points)
Furmark (IGP): 43 watts (CPUID HWMonitor reports 10.3-12 watts for the IGP). IGP speed 650 MHz.
Furmark (IGP) + Cinebench R11.5: 76 watts @ 3.2 GHz, 6.27 points; after throttling, 62 watts @ 2.4 GHz, 5.0 points
(Running Furmark on the IGP makes the UI extremely laggy; a typed character can take about 2 seconds to appear. On the dGPU there is no lag.)
Running Cinebench or Handbrake on the CPU clearly shows that the 45 watt mark is not being exceeded. Furmark alone consumes minimal power (because the BIOS will not allow the IGP to boost above 650 MHz). Running both consumes 76 watts, throttling down to 62 watts after multiple runs. 76 watts - 28 watts = 48 watts, and considering that this delta covers the whole system, carries a relatively high degree of uncertainty, and is measured upstream of the power brick (which I highly doubt is more than 85% efficient), it indicates that the 3630QM is not exceeding its TDP. I expect that with a boosted IGP more power would be required, but turbo would be disabled (it would probably run at 1150 MHz on the IGP and 2.4 GHz on the CPU, using ~76 watts) to stay within power limits.
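To make the wall-to-package arithmetic explicit, here is a minimal sketch. The 85% brick efficiency is my assumption, not a measured value:

```python
# Estimate the DC-side power delta from wall readings.
# The 85% brick efficiency is a guess, not a measured value.
BRICK_EFFICIENCY = 0.85

def dc_side_delta(load_wall_w: float, idle_wall_w: float,
                  efficiency: float = BRICK_EFFICIENCY) -> float:
    """Whole-system DC power attributable to the load."""
    return (load_wall_w - idle_wall_w) * efficiency

# Furmark (IGP) + Cinebench: 76 W at the wall against the 28 W baseline.
print(f"~{dc_side_delta(76, 28):.0f} W")  # ~41 W, under the 45 W TDP
```

Even if the brick were perfectly efficient, the 48 watt whole-system delta would barely exceed 45 watts; with realistic losses it lands comfortably under.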
Now with the dGPU
Furmark (dGPU overclocked to 1085/2500): 84-86 watts
Furmark (dGPU at 950/2500, its boost levels): jumps between 74 and 85 watts (I cannot figure out why this jump occurs; there is no throttling of the CPU or GPU that I can see)
Furmark (dGPU) + Cinebench R11.5: massive power fluctuations, from 74-78 watts (CPU at 2.4 GHz) to 92-101 watts (CPU at 3.2 GHz). GPU at 950/2500 with no throttling. 5.47 points.
Furmark appears to draw as much power as it is given, regardless of performance. Under Furmark, the 660M alone at stock uses around 46-57 watts, slightly more when overclocked. Under a gaming load this is MUCH lower (note that under Furmark the 660M's voltage jumps to 1.0875 V, versus 0.9375 V under gaming loads). The GPU does not throttle even with the overclock. A boosted CPU and GPU together under Furmark use around 95 watts. Adding a CPU load (Cinebench with the CPU at 2.4 GHz) to Furmark uses EXACTLY the same amount of power as Furmark alone (with no throttling), which again suggests Furmark soaks up whatever power it can get.
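The voltage readings alone hint at why Furmark draws so much more than games: dynamic CMOS power scales roughly with V²·f, so at the same clock the Furmark voltage bump implies roughly a third more power. A back-of-the-envelope sketch:

```python
# Dynamic CMOS power scales roughly with V^2 * f. Clocks are equal here,
# so the GPU-Z voltage readings alone suggest the ratio below.
# Back-of-the-envelope only, not a measurement.
v_gaming, v_furmark = 0.9375, 1.0875  # volts, from GPU-Z
ratio = (v_furmark / v_gaming) ** 2
print(f"Furmark ~{ratio:.2f}x gaming power at the same clock")  # ~1.35x
```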
Games (the IGP is only running at 650 MHz)
GW2 (dGPU), Lion's Arch: 72-77 watts, CPU at 2.4 GHz, 17-22 fps on site (lots of people).
GW2 (IGP), Lion's Arch: 52 watts, CPU at 3.2 GHz, 5-6 fps. With settings reduced to reach the 17-22 fps level: 54-56 watts. (The IGP is using 7 watts according to GPU-Z.)
Tomb Raider: IGP 43 watts at 6 fps; dGPU 73 watts at 35 fps. (The dGPU figure is inflated somewhat by the extra CPU load needed to feed 35 fps.)
Gaming looks pretty good. Running GW2 in a crowded region (30+ people on screen) on the dGPU uses less power than Furmark on the GPU alone, and roughly the same amount of power as a full CPU + IGP load (Furmark + Cinebench).
3DMark: 31 watts on the IGP (base power for this run was 20 watts)
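As a crude efficiency read on the Tomb Raider numbers, here are frames per wall watt (crude because each reading covers the whole system, not just the GPU):

```python
# Frames per wall watt for Tomb Raider (crude: each reading covers the
# whole system, not just the GPU, and the two runs share a baseline).
results = {"IGP": (43, 6), "dGPU": (73, 35)}  # (wall watts, fps)
for name, (watts, fps) in results.items():
    print(f"{name}: {fps / watts:.2f} fps per wall watt")
# IGP: 0.14, dGPU: 0.48 -- far better per-frame efficiency on the dGPU
```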
Conclusion
Intel's mobile chips do not appear to be exceeding their power limits IF kept to non-boost levels. I expect that an i7 quad can use more than 45 watts if turbo is active and the IGP is boosting over 650 MHz under very heavy loads.
Surprisingly, using the dGPU under gaming loads does not increase power consumption substantially. 75 watts average in GW2 vs. 55 watts with the IGP is not a large gap (and the difference would be even smaller if the IGP were running at 1150 MHz). I do not expect Iris to have any power advantage over a discrete GPU.
Edit: I realized that in my games testing the dGPU was powered on even when the game was running on the IGP. The dGPU adds about 10 watts to the IGP gaming numbers, but keep in mind that the IGP was only running at 650 MHz; at 1150 MHz it would be consuming an estimated 20 watts (twice as much).
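Putting the edit's corrections together, here is a rough sketch of what an IGP-only run might look like. All four inputs are the approximate figures above (the 650 MHz draw is the "twice as much" baseline; GPU-Z actually showed ~7 watts), not new measurements:

```python
# Rough correction of the GW2 numbers from the edit above. The 10 W
# dGPU-idle penalty and the 20 W full-speed IGP figure are estimates.
igp_run_wall = 55        # watts, IGP run with settings matched to dGPU fps
dgpu_idle_penalty = 10   # the dGPU was powered on during the IGP run
igp_650_draw = 10        # IGP draw at 650 MHz ("twice as much" baseline)
igp_1150_draw = 20       # estimated IGP draw at its full 1150 MHz boost

igp_alone = igp_run_wall - dgpu_idle_penalty                # ~45 W
igp_full_speed = igp_alone - igp_650_draw + igp_1150_draw   # ~55 W
print(f"IGP-only at 650 MHz: ~{igp_alone} W")
print(f"IGP-only at 1150 MHz: ~{igp_full_speed} W vs ~75 W on the dGPU")
```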