Hello. I know this isn’t completely related to Linux, but I was still curious about it.
I’ve been looking at Linux laptops and one that caught my eye from Tuxedo had 13 hours of battery life on idle, or 9 hours of browsing the web. The thing is, that device had a 3k display.
My question is, as someone used to 1080p and someone that always tries to maximise the battery life out of a laptop, would downscaling the display be helpful? And if so, is it even worth it, or are the benefits too small to notice?
I don’t think it would matter that much, since rendering a desktop at 3k on modern hardware costs about the same as rendering one at 1080p.
But I’d be interested in hearing from someone who has the hardware to test this. Right now I use my laptop for school work, and in trying to squeeze out every ounce of battery life I was running my display at 45Hz instead of 60Hz. I had a free day during the summer, so I charged it up, ran a YouTube video on repeat, and timed the battery life, then changed the display frequency and repeated the test: it was about a 2-minute difference. I also tried it while running a second 1080p monitor over HDMI, and the difference was something like 10 minutes. Either way, the difference was too small to matter.
I don’t have the data sheet anymore so these numbers are anecdotal etc etc YMMV. The biggest change for me was buying a 65w PD battery bank and keeping that charged in my bag.
Yes, but by very little.
You’re saving on GPU processing, but that’s unlikely to be that much for browsing.
The display on my laptop is 4k, and I can tell you I tried downscaling: it was not as big a difference as simply turning the brightness down as low as was comfortable.
Short answer: no.
Long answer: also no, but in some specific circumstances yes.
Your display uses energy to do two things: change the colors you see, and make them brighter or dimmer. Honestly speaking, it has a little processor in it, but that sucker is so tiny and energy efficient that it isn’t affecting things much, and you can’t affect it anyway.
There are two ways to do the things your display does. One way is to have a layer of tiny shutters in front of a light source that open up when energized and allow light through their red, blue, or green tinted windows. In this case you can use two techniques to reduce the energy consumption: open fewer shutters or reduce the intensity of the light source. Opening fewer shutters seems like it would be part of lowering the resolution, but when you lower the resolution you just get more shutters open for one logical “pixel” in the framebuffer (more on that later).
Another way to do what your display does is to have a variable light source behind each tinted window and send more or less luminance through each one. In this case there is really only one technique you can use to reduce the energy consumption of the display, and that’s turning down the brightness. Lowering the resolution has the same (non-)effect as before. It’s worth noting that a darker displayed image will consume less energy in this case, so if you have an OLED display, consider using a dark theme!
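That dark-theme point can be sketched numerically. This is a toy model, assuming emitted power is simply proportional to summed subpixel brightness; real panels have per-channel efficiencies and driver overhead, and the pixel values below are made up for illustration:

```python
def oled_relative_power(pixels):
    """pixels: iterable of (r, g, b) values in 0..255.
    Returns a unitless 'power' score proportional to emitted light,
    under the toy assumption that power scales with brightness."""
    return sum(r + g + b for r, g, b in pixels)

light_theme = [(240, 240, 240)] * 1000   # mostly white background
dark_theme = [(20, 20, 20)] * 1000       # mostly near-black background

# The dark theme's score is a small fraction of the light theme's
print(oled_relative_power(dark_theme) / oled_relative_power(light_theme))
```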
So the display itself shouldn’t save energy with a lowered resolution.
Your GPU has a framebuffer, which is some memory that corresponds to the display frame. If the display is running at a lower resolution, the framebuffer will be smaller; if it’s running at a higher resolution, it’ll be bigger. Memory is pretty energy efficient nowadays, so the effect of a larger framebuffer on energy consumption is negligible.
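For scale, here’s the framebuffer-size arithmetic, assuming the common 32-bit (4 bytes per pixel) format and using 2880x1800 as an example “3k” panel:

```python
def framebuffer_bytes(width, height, bytes_per_pixel=4):
    # 4 bytes/pixel assumes a 32-bit XRGB/RGBA format, the common case
    return width * height * bytes_per_pixel

# 1080p vs an example 2880x1800 "3k" panel, in MiB
print(framebuffer_bytes(1920, 1080) / 2**20)  # ~7.9 MiB
print(framebuffer_bytes(2880, 1800) / 2**20)  # ~19.8 MiB
```

A dozen extra megabytes of mostly-idle memory is not where your battery goes.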
Depending on your refresh rate, the framebuffer gets updated some number of times a second. But the GPU doesn’t just completely wipe, rewrite, and resend the framebuffer; it only changes the stuff that needs it. So when you move your mouse at superhuman speed exactly one cursor width to the left in one sixtieth of a second, only two cursor-sized areas of the framebuffer get updated: the place the cursor was gets redrawn with whatever was underneath, and the place the cursor now is gets a cursor drawn on it.
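To put a number on that cursor example (the 24x24 cursor size below is an assumption for illustration):

```python
def dirty_pixels_for_cursor_move(cursor_w, cursor_h):
    # Two patches change: restore what was under the old position,
    # draw the cursor at the new position. Nothing else is touched.
    return 2 * cursor_w * cursor_h

full_frame = 1920 * 1080                      # every pixel at 1080p
moved = dirty_pixels_for_cursor_move(24, 24)  # 24x24 cursor (assumed size)
print(moved, full_frame, moved / full_frame)  # a tiny fraction of the frame
```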
Okay but what if I’m doing something that changes the whole screen at my refresh rate? In that case the whole framebuffer gets updated!
But that doesn’t often happen…
Let’s say you’re watching a movie. It’s 60fps source material, so wouldn’t the framebuffer be updating 60 times a second? No! Not only is the video itself encoded to mark the parts whose colors don’t change from frame to frame, so the decoder doesn’t need to worry about them, the decoder is also actively looking for even more ways to avoid the work of changing parts of the framebuffer.
So the effect of a larger framebuffer on battery is minimized while playing movies, even when the frame buffer is huge!
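The “only changed parts get touched” idea can be sketched as a toy delta update. Real codecs use motion vectors and residuals rather than per-pixel diffs, but the intuition is the same:

```python
def delta(prev, curr):
    """Toy inter-frame delta: record only the pixels that differ
    between two frames (flat lists of pixel values)."""
    return {i: v for i, (p, v) in enumerate(zip(prev, curr)) if p != v}

frame1 = [0] * 100
frame2 = frame1.copy()
frame2[40:44] = [9, 9, 9, 9]   # only four pixels change

changes = delta(frame1, frame2)
print(len(changes))  # 4 updates instead of rewriting all 100 pixels
```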
But actually decoding a 3k movie is much more CPU intensive than decoding a 1080p one. So maybe watch in 1080p, but that’s not about your display or its resolution; it’s the resolution of the source material.
Okay, but what about games? Games use the framebuffer too, but because they aren’t pre-encoded, they can’t take advantage of someone having already done the work of figuring out what parts are gonna change and what parts aren’t. So you pop into e1m1, and the only way the computer can avoid updating the whole framebuffer is when the stuff chocolate doom sends it doesn’t change the whole framebuffer, like those imps marching in place.
But chocolate doom still renders the whole scene, using computer resources to calculate and draw the frame and send it to the framebuffer, which looks up and says, “you did all this work just to show me imp arms swinging over a one-inch-square portion of the screen?”
But once again, chocolate doom takes more computer resources to render a 3k e1m1 than one in 1080, so maybe turn down your game resolution to save that energy.
Hold on, what about that little processor on the display? Well, it can do lots of stuff, but most of the time it’s doing scaling calculations, so that when you run chocolate doom full screen at 1080p the image is scaled as accurately and nicely as possible across the whole screen instead of being stuck at the top left or in the middle or something. So you could actually make that little sucker do less work and use less energy by running at the display’s “native” resolution than at 1080p.
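What that scaler is doing is roughly this: a nearest-neighbour upscale, sketched here on a flat pixel list (real scaler chips use fancier filtering, this is just the shape of the work):

```python
def upscale_nearest(src, src_w, src_h, dst_w, dst_h):
    """Stretch a src_w x src_h image (flat list, row-major) to
    dst_w x dst_h by picking the nearest source pixel for each
    destination pixel."""
    dst = []
    for y in range(dst_h):
        for x in range(dst_w):
            sx = x * src_w // dst_w   # nearest source column
            sy = y * src_h // dst_h   # nearest source row
            dst.append(src[sy * src_w + sx])
    return dst

small = [1, 2, 3, 4]                       # a 2x2 image
big = upscale_nearest(small, 2, 2, 4, 4)   # each pixel becomes a 2x2 block
print(big)
```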
So when jigsaw traps you in his airport terminal shaped funhouse and you wake up with the exploder on your neck and a note in front of you that says “kill carmack” and no charger in your bag, yes, you will save energy running at a lower resolution.
E: running chocolate doom at a lower resolution, not the display.
You’ll have to scale the display unless you have superhuman vision. I suspect that the laptop is configured to ~150% scaling ootb, which would mean those battery estimates are based on that as well.
I don’t think upscaling the text/UI and downscaling the whole screen are the same thing.
The one usually works best with the other, though.
EDIT: nvm, I see what they were getting at in their comment now. They also meant downscaling the text/UI, not upscaling.
No, the majority of the energy consumption is in the backlight.
Maybe if it allowed you to switch to integrated graphics versus discrete, putting the GPU to sleep.
For just browsing, even integrated graphics has been plenty since the beginning of the internet, maybe with some exceptions around when Flash games reached their pinnacle.
Using the iGPU might save power, but the resolution doesn’t need to be turned down for that.
That might save a bit of power, but your dedicated GPU is usually in an idle/powered-down state until your compositor gives it specific applications to accelerate. For Nvidia laptops, this is what the PRIME/Optimus feature does.
I’d think so. 3k is so many pixels to compute and send 60 times a second.
But this video says the effect on battery life in their test was like 6%, going from 4k to 800x600. I can imagine that some screens are better at saving power when running at lower resolutions… but what screen manufacturer would optimize energy consumption for anything but maximum resolution? 🤔 I guess the computation of the pixels isn’t much compared to the expense of having those physical dots. But maybe if your web browser was ray-traced? … ?!
Also, if you take a 2880x1800 screen and divide by 2 (to avoid fractional scaling), you get 1440x900 (this is not 1440p), which is a little closer to 720p than 1080p.
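The arithmetic, for reference:

```python
w, h = 2880, 1800
print(w // 2, h // 2)         # 1440 900 -- a clean integer 2x scale, not "1440p"
print(w * h / (1920 * 1080))  # 2.5 -- the 3k panel has 2.5x the pixels of 1080p
```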
Your GPU doesn’t need to re-render your entire screen every frame. Your compositor will only send regions of the screen that change for rendering, and most application stacks are very efficient with laying out elements to limit the work needed.
At higher resolutions those regions will obviously be larger, but they’ll still take up roughly the same % of the screen space.
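As a quick sanity check on that: a region covering the same fraction of the screen contains proportionally more physical pixels at a higher resolution (2880x1800 used here as the example 3k panel, 5% as an arbitrary region size):

```python
def region_pixels(width, height, fraction):
    # Physical pixels inside a region covering `fraction` of the screen
    return int(width * height * fraction)

at_1080p = region_pixels(1920, 1080, 0.05)  # 5% of a 1080p screen
at_3k = region_pixels(2880, 1800, 0.05)     # the same 5% of a 3k screen
print(at_1080p, at_3k, at_3k / at_1080p)    # the 3k region is 2.5x the pixels
```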
But you don’t lower the number of pixels you use. You just increase the number of physical pixels used to display one logical “pixel” when lowering the resolution, so the same amount of power is going to be used to light those pixels.
My PowerBook G4 might be a bit dated, but running other resolutions than native is quite heavy on that thing. Your built-in display can handle one resolution only - anything else will require upscaling.
Your GPU can probably do that upscaling for cheap. But cheaper than rendering your desktop applications? 🤷♂️
You’ll have to benchmark your particular device with powertop.
Isn’t rescaling usually done by the display driver? I am fairly certain this is the case for external displays. Are laptop displays any different?
Edit: with “display driver” I mean the hardware chip behind the display panel, dedicated to converting a video signal to the electrical signals necessary to turn on the individual pixels.
For an external display I’d bet the case is the hardware driver for the panel.
At least my 17" Powerbook G4 with a massive 2560x1440 display does it in the software display driver. I’m sure some laptop panels do it in hardware as well, but it seems there’s some very janky shit going on, at least with laptops that have both integrated and discrete GPUs.
It would, but it would be a very small difference. Maybe 2-3% at most.
Unless you’re running games or 3D intensive apps no. Resolution is cheap on power under normal circumstances.
As a web developer, I noticed that some elements, such as very big tables, struggle to render at 4K but are absolutely fine at 1080p. I would assume that means the CPU and/or GPU are more taxed drawing at a higher resolution, and therefore I assume they would draw more power. I might be mistaken. Do you speak from experience?
I’m a flutter dev, and I’ve seen testimonies from a former Windows 98 dev about limiting the number of redraws in the shell.
There’s deffo extra overhead, but it’s not linear: 4k being 4 times as many pixels as 1080p doesn’t mean 4x the work to render after the first frame, as the browser/framework will cache certain layout elements.
The initial layout is still expensive, though, so big tables will take longer to appear, but that big table at high res will probably be less chuggy when scrolling once loaded.
You’re not gonna get much from that. You’re much better off looking for more efficient processors. If you’re looking at a brand new Tuxedo you have a pretty high budget already, so I’d suggest waiting a bit and looking at the new mobile CPUs from Intel and AMD coming out, which seem to have really good efficiency. Linux support for them should roll out pretty quickly, since being x86 they don’t have the same challenges as ARM chips.