A nice 1080p Dolby-surround video will eat more energy: the processor has to do more work per second, and faster means more energy.
Poorly written playback software burns even more CPU cycles just to exist.
It still shocks me, as a guy who was all over computers as soon as they appeared (think Ohio Scientific, Atari, classic Mac), how horrifically inefficient they have become.
Computers are very fast now; funny how they can still lag at times just tracking the mouse.
It is called bloat.
Simple things done in the most complex way imaginable, from the CPU's point of view.
Everyone assumes the compiler will automatically find the fastest possible solution. Pfft.
I recompiled a simple VirtualDub plugin and spotted a divide-by-2 in one of its functions. The magically lovely compiler should have seen that and emitted a shift instead.
Nope, the compiler did nothing.
I changed it to a simple shift, and (for heaven knows what reason)
the speed of the plugin doubled.
Anyway, things could be done MUCH faster, globally, across all software, if people (programmers) would stop doing things the easy way and use their brains instead of trusting the compiler.
lol, I've read on Reddit from old-timer programmers how the new kids just type crap very quickly, over and over again. If it doesn't work, hit 'undo'. They take forever to make something that works; they do not wish to sit around thinking things through.
It is much faster in the end to use the brain than to type frantically just to look busy to the boss.
Machine language would be much better, but that could make people's brains explode.
(think of the poor children)