General question: Why the increase in notebook battery life in recent years?

LMF5000

Member
Oct 31, 2011
84
0
61
My last laptop purchase was in around 2010, and back then the average real-world battery life used to be 2.5 hours of light use (or 1.5 hours if it was a gaming laptop).

Several years of desktop builds later, I'm now looking for a 15.6" laptop. The capacities of the batteries haven't changed much (still around 50-70 Wh), but I am amazed at the jump in battery life since then.

One review of a decently powerful laptop (Core i7 + Nvidia GTX 860M) actually said "the battery life came close to the worst in our list, at just over 4 hours". I notice that now most low-power laptops quote 5-7 hours, and some from Fujitsu even go up to 23 hours (!!).

These sorts of figures were unthinkable for common inexpensive laptops a few years ago (ignoring some special long-life models). My question is: what's changed internally for a modern laptop to consume less than half the power of one from 4-5 years ago doing the same work?

It seems unlikely to be the CPU or GPU - the TDPs of these components are roughly unchanged from back then (~45W). Nor is it the disk drive, as I'm comparing ones with rotating platters, not SSDs. Could it be the switch from CCFLs to LEDs for display backlighting? Or motherboard power consumption? Or something else I've missed?
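As a quick back-of-envelope check on what those runtimes imply about average draw (battery capacities and runtimes taken from the figures above; real-world draw obviously varies with workload):

```python
# Rough estimate: average system power = battery energy / runtime.
# Capacities and runtimes are the ballpark figures quoted above.

def average_power_w(capacity_wh, runtime_h):
    return capacity_wh / runtime_h

# ~2010 laptop: ~55 Wh battery, ~2.5 h of light use
print(average_power_w(55, 2.5))   # ~22 W average draw

# Modern laptop: similar ~55 Wh battery, ~7 h of light use
print(average_power_w(55, 7))     # ~7.9 W average draw
```

So the question is really where roughly 14 W of average draw went.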
 

Commodus

Diamond Member
Oct 9, 2004
9,215
6,820
136
The CPU plays a large part, actually. It's a combination of manufacturing processes (denser chips are typically more efficient) and smarter energy usage, such as shutting down parts of the chip or individual cores that aren't needed. An example is the 13-inch MacBook Pro: Apple got an extra hour of longevity with the latest model through a simple processor switch. Remember that TDP is the ceiling for power consumption, not where the chip sits all the time.
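To make the TDP-versus-typical-draw point concrete, here's a minimal duty-cycle sketch; the idle and load wattages and the 5% busy fraction are assumed illustrative figures, not measurements:

```python
# Average package power is a time-weighted mix of idle and loaded states,
# so a chip rarely sits anywhere near its TDP during light use.

def average_package_power(idle_w, load_w, load_fraction):
    return load_w * load_fraction + idle_w * (1 - load_fraction)

# A 45 W TDP chip that idles near 1.5 W and is busy ~5% of the time:
print(average_package_power(idle_w=1.5, load_w=45, load_fraction=0.05))  # ~3.7 W
```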

Besides that, more efficient displays and graphics make a difference (NVIDIA's Maxwell architecture is about delivering better performance with less power draw, as I recall).
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Exactly. Intel did a huge amount of work to change the amount of power used by the CPU during the 99% of the time that it's sitting there waiting for you to move the mouse or press a key. It lowers the speed, shuts off cores, etc., until some piece of work actually needs all the power it can give. The same goes for the integrated graphics.

When Intel was first pushing Ultrabooks, they also worked on lowering the idle power use of every other part of the laptop, so (for example) the WiFi card might also partially power itself down in between sending and receiving packets.

All of that work trickled down from the initial $1,500 Ultrabooks to the $400 laptops of today.
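If you want to watch this happening on a Linux laptop, the kernel exposes the current core clock and the time spent in each idle (C-)state through sysfs. A small sketch using the standard cpufreq/cpuidle paths (whether they exist depends on your kernel and idle driver, e.g. intel_idle):

```python
# Read cpu0's current frequency and per-C-state idle residency from sysfs.
from pathlib import Path

cpu0 = Path("/sys/devices/system/cpu/cpu0")

# Current scaling frequency (kHz) - drops sharply when the machine is idle.
freq_khz = int((cpu0 / "cpufreq" / "scaling_cur_freq").read_text())
print(f"cpu0 frequency: {freq_khz / 1000:.0f} MHz")

# Time (microseconds) spent in each idle state since boot.
for state in sorted((cpu0 / "cpuidle").glob("state*")):
    name = (state / "name").read_text().strip()
    usec = int((state / "time").read_text())
    print(f"{name:>8}: {usec / 1e6:.1f} s idle")
```

On an idle modern laptop, the vast majority of that time ends up in the deeper C-states.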
 

LMF5000

Member
Oct 31, 2011
84
0
61
Interesting... can you be more specific as to how a modern CPU makes such an improvement? And maybe a wild guess at how big that improvement is in terms of watts during low intensity tasks?

Frequency scaling is something my 2010 laptops already had, though not the ability to completely power down a core. As for GPU, I doubt the Nvidia 8xxx series popular in those days had any power saving abilities to speak of.
 

Mushkins

Golden Member
Feb 11, 2013
1,631
0
0
Interesting... can you be more specific as to how a modern CPU makes such an improvement? And maybe a wild guess at how big that improvement is in terms of watts during low intensity tasks?

Frequency scaling is something my 2010 laptops already had, though not the ability to completely power down a core. As for GPU, I doubt the Nvidia 8xxx series popular in those days had any power saving abilities to speak of.

http://en.wikipedia.org/wiki/Thermal_design_power

Less heat = less power. Less power = longer battery life.

Simply put, the past few generations of processors have been running up against a wall: there's only so much compute performance we can pull out of current technology, and current technology is *beyond* "good enough" for 99% of consumers, so there's no money in pumping more R&D into raw performance like they used to. Instead they've been focusing on smaller dies and doing the same thing with less power, because that's precisely where the money is: mobile devices.
 

corkyg

Elite Member | Peripherals
Super Moderator
Mar 4, 2000
27,370
240
106
In addition, contemporary displays use LED backlights rather than CCFLs, and that makes a significant difference. Also, current "books" do not have optical drives, and that is another significant elimination of power consumption. And, SSDs use less power than HDDs. It all adds up.
 
Feb 25, 2011
16,994
1,622
126
As much as we like to get giddy over new CPUs, I'd also point out that the capacity of those batteries has gone way the hell up. (Most new laptops seem to be double or triple the capacity of the old ones.)

And SSDs don't really use much less/more power than 2.5" HDDs.

http://www.anandtech.com/show/3734/seagates-momentus-xt-review-finally-a-good-hybrid-hdd/7

Compare the numbers for the Momentus 5400.6 to the X25-M, or any modern SSD, for that matter. The 7200rpm version wasn't much different.
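To put rough numbers on why the drive makes so little difference: even a generous saving from an SSD barely moves the total when the whole system is drawing several watts. The wattages below are assumed round figures, not numbers from the linked review:

```python
# Runtime sensitivity to a small per-component saving.

def runtime_h(capacity_wh, system_w):
    return capacity_wh / system_w

base_system_w = 8.0      # assumed light-use draw with a 2.5" HDD
ssd_saving_w = 0.5       # assumed average saving from an SSD swap

print(runtime_h(55, base_system_w))                 # ~6.9 h with the HDD
print(runtime_h(55, base_system_w - ssd_saving_w))  # ~7.3 h with the SSD
```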
 

ninaholic37

Golden Member
Apr 13, 2012
1,883
31
91
Also, current "books" do not have optical drives, and that is another significant elimination of power consumption.
Ah I never use my optical drive, maybe I should get out the screwdriver and take it off. What else can I chuck from the insides of this old laptop? Wireless and bluetooth parts for sure (I use ethernet), speakers maybe (I use headphones), I guess I could throw out the hard drive and just boot to USB instead too... I never use the microphone jack... or the F12 key... and those LED lights that flicker when something is working are just annoying. Sounds like this could be a fun power-saving adventure, like those people who demolish houses :D
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
Well, for one thing, the discrete GPU is now completely switched off when not gaming (Optimus). That came out in 2010, but it wasn't really until 2011 that it fully penetrated the market. That was the single largest leap for machines with a discrete GPU. The 2nd biggest leap was probably power gating of cores. The 3rd biggest was probably the 32nm process itself. And the 4th biggest was probably the 22nm process leap. After that you have panel self refresh. Lower-power RAM would fit in there somewhere, but those have been very small incremental changes, usually 0.05V at a time.
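A rough tally of how those changes might stack up during light use on a machine with a discrete GPU; every per-item wattage below is an assumed, illustrative figure rather than a measurement - the point is just that the savings add up:

```python
# Hypothetical per-feature savings at idle/light use, summed.
estimated_savings_w = {
    "Optimus: dGPU fully off when not gaming": 5.0,
    "Power gating of idle cores":              2.0,
    "32nm and 22nm process shrinks":           2.0,
    "Panel self refresh":                      0.5,
    "Lower-voltage RAM":                       0.3,
}

for item, watts in estimated_savings_w.items():
    print(f"  {item}: ~{watts} W")
print(f"Total: ~{sum(estimated_savings_w.values()):.1f} W")
```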
 

xgsound

Golden Member
Jan 22, 2002
1,374
8
81
Some of the new CPUs that score around 2000 in PassMark only use 7.5 to 15 watts. Add lower-power memory and increased battery capacity and you are well on your way to long battery life.
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
Interesting... can you be more specific as to how a modern CPU makes such an improvement? And maybe a wild guess at how big that improvement is in terms of watts during low intensity tasks?

Frequency scaling is something my 2010 laptops already had, though not the ability to completely power down a core. As for GPU, I doubt the Nvidia 8xxx series popular in those days had any power saving abilities to speak of.

Between the introduction of the Core iX chips and Ivy Bridge, the most popular CPUs shifted from the M series (28-35W) to the U series (17W). Thanks to better clock speed control mechanisms and turbo boost (plus improved architectures and lithographies), that happened without any real loss in performance. The i5-520M in my old ThinkPad (2.4-2.9 GHz @ 35W) has the same maximum turbo speed as the i5-3437U (1.9-2.9 GHz @ 17W), yet the newer chip performs better (due to increased IPC), at least in bursty workloads, while idling at far lower power levels. In addition to that, there's increased power gating, IGP scaling, and so on. Which leads us to today's Broadwell-U, which is back at the high minimum speeds of the i5-520M, but turbos better and runs at 15W (mostly due to the 14nm process). Most 15" laptops today run dual-core 15-17W chips. Some have quad cores, and lose quite a bit of battery life - but all the same efficiency gains apply to those as well.
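Condensing that M-versus-U comparison into rough numbers (TDPs and turbo clocks are from the post; the ~15% IPC uplift is an assumed illustrative figure, not a benchmark result):

```python
# Same peak clock from half the TDP = roughly double the burst perf per watt.

chips = [
    {"name": "i5-520M",  "turbo_ghz": 2.9, "tdp_w": 35, "ipc": 1.00},
    {"name": "i5-3437U", "turbo_ghz": 2.9, "tdp_w": 17, "ipc": 1.15},  # assumed IPC gain
]

for c in chips:
    perf = c["turbo_ghz"] * c["ipc"]          # relative burst performance
    print(f"{c['name']}: burst perf ~{perf:.2f} (arbitrary units), "
          f"perf per TDP watt ~{perf / c['tdp_w']:.3f}")
```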

All of this also adds up to less heat produced, so you can have smaller fans that use less power at maximum, running at lower speeds. Fan blade design has also improved, both in terms of noise and air flow. This reduces power draw quite a bit as well. Many Ultrabooks shut their fans off most of the time.

In addition to this, OEMs have started using the far more power-efficient eDP display interface, WiFi/BT cards have improved quite a bit, more features have moved onto the CPU die (first the NB, then, at least for a few chips, the SB as well), and the number of optional/unused features in computers (ports, controllers) has been cut drastically. LED-backlit displays were already common on mainstream laptops when I last worked in computer retail, five or six years ago. But of course those have improved as well (IGZO, for example).

Still, my ThinkPad X201 manages 6-7 hours of regular use with its 6-cell battery (down from 7-8 when it was new), but for its time that was considered pretty stellar. These days, that's pretty normal for a thin-and-light.
 

Hulk

Diamond Member
Oct 9, 1999
5,196
3,829
136
Interesting... can you be more specific as to how a modern CPU makes such an improvement? And maybe a wild guess at how big that improvement is in terms of watts during low intensity tasks?

Frequency scaling is something my 2010 laptops already had, though not the ability to completely power down a core. As for GPU, I doubt the Nvidia 8xxx series popular in those days had any power saving abilities to speak of.


Intel has put on an all-out push for lower power over the last 3 or 4 years. I suggest reading both of the articles below if you want to bring yourself up to date.

http://www.anandtech.com/show/7003/the-haswell-review-intel-core-i74770k-i54560k-tested/2

http://www.anandtech.com/show/8355/intel-broadwell-architecture-preview/4
 

Topweasel

Diamond Member
Oct 19, 2000
5,437
1,659
136
Besides the obvious work Intel has put into the platform, the OS makes a huge difference as well. During the SB days I was setting up several dozen new PCs. One department couldn't give up XP yet, so I was doing nearly the same work on two machines with the same specs. I'd start the Win 7 machine first, do some work, then power up the XP machine. By the time I powered down the Win 7 machine, the XP machine had only been on half the time and I had done maybe a third of the work on it. Yet at power-down the Win 7 unit was at 93% charge and the XP machine was at 74%. The Win 7 machine projected over 6 hours left; the XP machine only 1 hour and 45 minutes.

The OS managing CPU, memory and other resources better, code becoming simpler (or more efficient), more powerful CPUs, and better power saving on those CPUs all mean the problem is being attacked from every side. Efficient code on a better CPU means the work is done quicker. The OS managing resources better means only the hardware needed for a task is actually active. And the better the processor is at managing its own power, the less it draws both when active and when idle.
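Working that anecdote through as rough numbers (the exact session length is unknown, so only the relative drain rate can be estimated):

```python
# Win 7 machine: 100% -> 93% over one full session.
# XP machine:    100% -> 74% over roughly half a session.

win7_drop = 100 - 93            # 7 points per full session
xp_drop = 100 - 74              # 26 points per ~half session

xp_rate_per_session = xp_drop / 0.5
print(f"XP drained ~{xp_rate_per_session / win7_drop:.1f}x faster")  # ~7.4x
```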
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Interesting... can you be more specific as to how a modern CPU makes such an improvement? And maybe a wild guess at how big that improvement is in terms of watts during low intensity tasks?

It's not so much the CPU... the biggest jump EVER came with the 4th Gen Core "Haswell" generation. With that, every hardware manufacturer and OS vendor worked to coordinate their power management efforts. Before that they all worked independently, so devices couldn't power down properly. Since Haswell they can.

It's similar to how vertically integrated smartphone manufacturers purpose-build for lower power, but brought down into a horizontally integrated market.

Haswell brought a 50%+ improvement in battery life per Wh.
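In runtime terms, a 50% improvement per Wh on an unchanged battery works out like this (the 4-hour baseline is an assumed example, not a figure from the thread):

```python
# 1.5x the runtime per watt-hour means 1.5x the runtime from the same battery.
improvement = 1.5
baseline_hours = 4.0            # assumed pre-Haswell runtime
print(baseline_hours * improvement)   # 6.0 hours on the same battery
```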
 

mikeymikec

Lifer
May 19, 2011
21,128
16,333
136
I'm pretty sure that battery capacity has increased as well. I recently fixed up a laptop with a C2D processor and outfitted it with a new battery (and SSD). I'm getting >4 hrs of light use out of it; I'm pretty sure that would have been unheard of back in the Vista era of laptops, which is when this one is from.

But yes, I think the processor efficiency increases since then have played a huge part as well. IMO it won't be long before laptop chargers are about the size of current netbook chargers (i.e. larger plugs rather than a block), or maybe even phone chargers.