That in itself didn’t strike me as too odd. Users have certainly been reporting widely varying battery life, with some of them getting well below half the ten hours Apple claims. Our video rundown showed as much, and our poll found that the largest group of readers was reporting five hours or less.
Given that Consumer Reports was seeing as little as 3.75 hours in its own tests, that would be good enough reason to withhold a recommendation. But it was the high-end results the organization reported that puzzle me …
In a series of three consecutive tests, the 13-inch model with the Touch Bar ran for 16 hours in the first trial, 12.75 hours in the second, and just 3.75 hours in the third. The 13-inch model without the Touch Bar worked for 19.5 hours in one trial but only 4.5 hours in the next. And the numbers for the 15-inch laptop ranged from 18.5 down to 8 hours.
It’s those high-end figures I can’t understand. Apple’s own estimates assume something close to ideal conditions, and yet the company claims only ten hours while Consumer Reports lists 16, 18.5 and 19.5 hours among its results. I haven’t heard a single other report come anywhere remotely close to these numbers, and I’m doubtful the machines could last that long sitting idle with only the screen on.
The organization is a highly credible one with a careful, scientific testing regimen, right down to measuring screen brightness with an external meter.
During the tests, we set each laptop screen to remain on. We use an external meter to set the display brightness to 100 nits—a typical level you might use indoors or out. And, we turn off any automatic brightness adjustment in the laptop’s settings.
We also update every computer’s operating system before we begin any testing. We began our tests several weeks ago, but repeated the battery tests using macOS Sierra 10.12.2 after it was released. We saw no difference in the results.
From that description, it’s hard to see what could explain both the reported variability and the incredibly high maximum numbers. But the numbers are so hard to believe that I have to think something must have gone wrong.
Given the source, I think we can immediately dismiss schoolboy errors like allowing the screens to auto-dim or even letting the machines go to sleep. But perhaps non-human error is to blame?
One possibility is that the meter used to measure screen brightness was faulty. If it was giving inconsistent readings, that could explain the variability, and if it was greatly over-reading brightness in some tests, the brightness would have been set to extremely low levels, which could explain the extremely long battery life reported in some tests.
If the brightness was set to a really low level, you would have expected the testers to notice this visually and to question the meter readings, but it’s not unknown for people to believe numbers from a machine over the evidence from their own eyes.
A second possibility is that the app auto-opening the webpages failed, or even that the timing app failed to register the correct times. Those might seem unlikely events, but I really cannot see any way the high figures can be correct.
Consumer Reports has sent the test logs to Apple, so perhaps the mystery will be explained. In the meantime, I’ve suggested that it re-run the tests with a fresh brightness meter to see whether the same high-end numbers are still achieved. We’ll update with any response.
Update: Apple says it is looking into it.