I'm surprised the author hasn't mentioned the positive effect it has had. After upgrading the OS and seeing the new menu entries, I reported a number of applications that showed up as using a lot of energy, and within a few releases there was a change note, usually mentioning some simple fix that dramatically reduced the application's wakeups or removed a long-running animation that didn't add any value. This little menu addition has probably helped the battery life of MacBook users more than any other change.
I suspect that's the real reason why Apple is weighting wakeups heavily and exaggerating the energy impact: it shames developers into making their apps more efficient.
Except Chrome, which on every laptop I've ever used seems to decrease battery life by about a third compared to Safari and is always marked as using significant power.
(Currently 10.9.5 with Chrome 44 on a 2014 MacBook Air, but I've had the issue on newer computers at work.)
This is especially noticeable when watching videos on sites whose players prefer Flash. I often fall asleep to documentaries (maybe not a typical use case), and I've seen massive gains in battery life using Safari (HTML5 video) vs. Chrome (Flash). Granted, this isn't a fair comparison, since one is a "native" implementation and one is Flash. But I've seen something like a 250% gain in battery life, which sometimes means the difference between seeing the ending and not.
It's interesting watching one's projected battery life climb from 40m to 1h 40m simply by swapping one application for another. However, it's unfortunate that some websites prefer Flash-based implementations over their native counterparts, although this trend seems to be shifting in favor of native, and I assume the smart folks at Google are taking note. YouTube, for example, has done a particularly spectacular job of adding AirPlay functionality to its HTML5 player.
My guess is that native HTML5 video is just H.264, and most modern systems have a hardware decoder that is much more efficient than a software implementation. Flash video is a wrapper around a number of codecs, and if the codec isn't H.264, then you're doing software decoding, which is far less efficient.
There is currently no good browser on OS X. Safari (8.0.8) struggles with HTML5. Firefox (40.0.2) struggles with video/media of any sort and bogs down with multiple tabs. Chrome (44.0.2403.157, 64-bit) drains your battery. I'm on a 13-inch, Early 2015 MacBook Pro running 10.10.5. They are all uniformly disappointing.
At least for me, that's partly because Chrome makes it _so_ easy to "overuse" it. I'm right now writing this comment in a window with 29 tabs open, and I have three other Chrome windows with 22, 17, and 14 tabs open respectively. It's sitting there at 42.3 in the Energy Impact column and 32.23 in the Avg column, but to be fair it's got a whole bunch of stuff going on (two separate Gmail instances and a fair few other JavaScript-heavy webapps as well). I've probably got more bogomips going on in this one browser window than all the machines I learnt to program on were capable of between them...
Hmm… comparing that to the number of tabs and windows I have open in Safari right now (which is actually pretty low for me, since I usually have more than 250 tabs open) seems to make Chrome look even worse…
Specifically (running Safari v7.1 on OS X v10.9.5), I have 18 windows open containing 2, 18, 26, 16, 2, 12, 17, 13, 21, 6, 2, 2, 1, 1, 2, 1, 1, and 4 tabs for a grand total of 147 tabs. Oh, and I forgot to mention, it’s also downloading two files in the background right now. It’s showing up with 13.2 in the Energy Impact column and 4.47 in the Avg column.
(Oh, and since I already have it open, I may as well mention that Firefox 39.0 shows up as around 14.7 in the Energy Impact column (although it keeps spiking to 23.9 every few seconds) and 2.99 in the Avg column with 2 tabs open in 1 window with no downloads).
I'm at 47.91 average impact for FF 40.0 with uBlock Origin and Ghostery, no Flash installed. Doing local web development and poking around some news sites to read about all these market crashes. Only a few tabs open at a time for me.
If you're a developer at Google who is looking for a 20% time project, here is your opportunity! Make some fixes, write it up, post about it somewhere, and we will upvote your story and give you sweet HN karma. :)
Adding info because I didn't understand the reference. https://en.wikipedia.org/wiki/Cobra_effect "The cobra effect occurs when an attempted solution to a problem actually makes the problem worse. This is an instance of unintended consequences."
Because I'm the author of the post that this comment thread is about, and one of my primary conclusions in the post is that "Energy Impact" is inaccurate.
I am not sure they are weighting wakeups heavily. The article's "tax" of 500 microseconds for each wakeup could be a reasonable estimate of the time the OS needs after a wakeup before it can hand the CPU over to the process that triggered it. That time isn't spent in the process that wakes up the system, but it can be attributed to it.
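As a rough back-of-the-envelope illustration of why that tax dominates for chatty processes (the 500 µs figure is from the article; the idea of simply adding it onto CPU time is my own sketch, not Apple's actual formula):

    # Rough sketch of how a per-wakeup "tax" could dominate an energy score.
    # The 500 microsecond figure comes from the article; adding it straight
    # onto CPU time is my own guess, not Apple's real formula.
    WAKEUP_TAX_SECONDS = 500e-6  # 500 microseconds charged per wakeup

    def rough_energy_impact(cpu_seconds_per_second, wakeups_per_second):
        """Unitless score: CPU utilisation plus the wakeup tax."""
        return cpu_seconds_per_second + wakeups_per_second * WAKEUP_TAX_SECONDS

    # A process that is 1% busy but wakes up 1000 times a second gets charged
    # as if it were 51% busy, which is why idle-but-chatty apps look so bad.
    print(rough_energy_impact(0.01, 1000))  # 0.51
    print(rough_energy_impact(0.01, 10))    # 0.015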
The article discusses this. Look again at the first table and compare the power values (measured in watts), which climb slowly, with the "Energy Impact" values, which climb incredibly quickly.
If "Energy Impact" were a good measure it would correlate closely with real power consumption. It doesn't.
> Activity Monitor is a tool that was introduced in Mac OS X 10.9.
Activity Monitor was introduced with OS X 10.3 (https://en.wikipedia.org/wiki/Activity_Monitor) and was significantly revamped in OS X 10.9, which added the various energy-related measurements.
Wakeups are a bit tricky to measure accurately: with the Power Nap feature, the actual number of wakeups triggered by a particular application varies based on what other activity is going on in the system.
In general, from the get-go, it is a bit of a myth that you can really know how much of a power budget is a consequence of a particular application. You can get a rough sense of relative impact on the overall power budget, but even that can be seriously skewed by a LOT of factors. At best, you can get a sense of "if I sampled at the moment power stores hit a multiple of X, how often would I see the system doing something on behalf of each app that was draining power at that moment". Even that is bloody hard to pull off, but it is your best rough approximation of what is going on.
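Conceptually it's just this kind of bookkeeping, where the per-tick samples below are entirely made up and on a real system you'd need the OS to tell you which process it was busy with at each drain tick:

    # Sketch of the "sample at each unit of drain, see who was running" idea.
    # The samples are invented; the point is only the attribution arithmetic.
    from collections import Counter

    # Each entry: the process the system was working on behalf of when the
    # battery dropped by one more unit of charge.
    samples = ["Chrome", "Chrome", "Safari", "kernel_task", "Chrome",
               "Mail", "Chrome", "Safari", "Chrome", "Chrome"]

    counts = Counter(samples)
    total = sum(counts.values())
    for proc, n in counts.most_common():
        print(f"{proc:12s} ~{100 * n / total:.0f}% of drain samples")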
I could have sworn I read somewhere that Activity Monitor also included GPU usage in its measurements; naturally, heavier GPU use means more power/battery drain.
They can't really do that on LED-backlit screens, though on OLED black is cheaper than other colors.
But unnecessary animations (like blinking text cursors) cause the entire graphics hardware stack to wake up all the way to the display, which is a serious power cost.
The original CGA/EGA/VGA text modes had a hardware cursor blinking option; I find blinking cursors useful, so it'd be interesting if modern GPUs added a low-power animation loop feature.
Their power cost is actually a pretty recent development. It used to be that it didn't matter because the display refreshed every frame no matter what, but in the last few years laptop displays added panel self-refresh. If the screen becomes static for a little while, the host side can stop driving it entirely and it'll keep the same picture up.
I'd guess it wouldn't be hard to keep two frames in memory, so the panel could alternate between them and give you one blink at a time without waking the host.
Except when it is transitioning between color temperatures, I don't see why this would be the case. I reverse-engineered f.lux, and it just calls some UpdateRGBLUT-type function based on the color temperature. I think this LUT is applied to the frame buffer (perhaps this is how color profiles work?) even if f.lux is not being used.
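For what it's worth, the table itself is just three per-channel ramps. A toy example of the kind of thing a function like that might hand to the OS (the scale factors below are invented; f.lux derives its real curves from the target color temperature):

    # Toy version of a per-channel LUT like the one an "UpdateRGBLUT"-style
    # call might install. The 0.85/0.70 scale factors are made up for
    # illustration, not f.lux's real color-temperature curves.
    TABLE_SIZE = 256

    def warm_lut(green_scale=0.85, blue_scale=0.70):
        ramp = [i / (TABLE_SIZE - 1) for i in range(TABLE_SIZE)]
        red   = ramp                              # leave red untouched
        green = [v * green_scale for v in ramp]   # dim green a little
        blue  = [v * blue_scale for v in ramp]    # dim blue more
        return red, green, blue

    r, g, b = warm_lut()
    print(r[255], g[255], b[255])  # 1.0 0.85 0.7 -- white becomes warm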
First, f.lux periodically adjusts the LUT, so it does call this fairly often. Second, in my experience, adding this functionality seriously increases the odds that my laptop will switch from the Intel GPU to the Nvidia GPU.
> Recent Intel hardware provides high-quality estimates of processor and memory power consumption ... But the big problem is that they are machine-wide measures that cannot be used on a per-process basis.
Would it not be possible for the OS to poll the CPU for these power consumption stats, and attribute the value returned to the currently running process?
Over time, I think it would be possible to see how much power the Intel CPU thinks each process is using.
Ideally you'd do an accounting update on task switch, along with all the other housekeeping. Then you don't need to do any polling. There's a "process accounting" infrastructure which perhaps could be adapted for this.
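Roughly like this, as a pure sketch: the counter readings and the schedule are invented, and it charges each whole delta to whichever process happened to be on the CPU for that slice, which ignores multiple cores and everything the GPU and display are doing.

    # Sketch of charging machine-wide energy readings to processes at each
    # task switch. Real RAPL-style counters give cumulative joules for the
    # whole package, so all you can do is split each delta over whatever ran
    # during that interval; here it's one process per slice, which is a big
    # simplification.
    from collections import defaultdict

    # (process that ran, package energy counter in joules at the end of its slice)
    schedule = [
        ("Chrome",      12.0),
        ("Safari",      13.1),
        ("Chrome",      16.9),
        ("kernel_task", 17.2),
        ("Chrome",      21.0),
    ]

    energy_by_process = defaultdict(float)
    last_reading = 10.0  # counter value when accounting started
    for proc, reading in schedule:
        energy_by_process[proc] += reading - last_reading
        last_reading = reading

    for proc, joules in sorted(energy_by_process.items(), key=lambda kv: -kv[1]):
        print(f"{proc:12s} {joules:5.1f} J")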
Work on bootstrapping energy-consumption estimates exists in academia; see, for example, Pathak et al. [0]. The changes in 10.10.4 could be an implementation of something along the same lines, fitting a model to the measured power consumption.
As comex notes in the comments on the article, there appears to be a table of weights for various features, presumably obtained by regression.
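Something along these lines, with synthetic numbers and a guessed-at feature list, just to show the shape of the fit:

    # Sketch of how a table of weights could be recovered: record per-interval
    # features alongside measured package power, then do an ordinary
    # least-squares fit. All numbers here are synthetic and the feature
    # columns are guesses at what such a table might use.
    import numpy as np

    # Columns: CPU-seconds, wakeups, MB of disk I/O
    features = np.array([
        [0.10,  50, 0.0],
        [0.50, 200, 1.0],
        [0.05, 900, 0.2],
        [0.80,  20, 5.0],
        [0.30, 400, 0.5],
    ])
    measured_watts = np.array([1.2, 5.5, 3.1, 7.9, 4.0])

    weights, *_ = np.linalg.lstsq(features, measured_watts, rcond=None)
    print(dict(zip(["cpu", "wakeup", "disk_mb"], weights.round(3))))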
Let me rephrase: There are temperature and battery charge sensors in the machines. We can use these to better calibrate the "virtual" energy usage metrics.
These only help you determine the state of the entire system. They provide no information about the impact individual applications or processes have, especially not relative to each other.
You can't use temperature to accurately model power consumption, if for no other reason than that it's too heavily influenced by environmental factors.
"Battery charge" sensors are problematic because they're hard to keep in calibration without interrupting normal operation of the system.