
Much hot air over blinking cursors

Posted Feb 7, 2009 16:20 UTC (Sat) by mjg59 (subscriber, #23239)
In reply to: Much hot air over blinking cursors by jwb
Parent article: Much hot air over blinking cursors

Two wakeups per second, combined with some hysteresis on the drop back to the lower power mode to avoid rapid flicking between power states. It ends up with the card being in high power mode about 10% of the time.
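For the curious, the logic is roughly this (a sketch only -- the names and the hold time are made up for illustration, not the actual driver code):

    # Sketch only: illustrative names and numbers, not the real driver logic.
    HOLD_MS = 50                          # hysteresis: stay in high-power mode this long after an update

    class GpuPowerPolicy:
        def __init__(self):
            self.high_power = False
            self.ms_since_update = 0

        def tick(self, elapsed_ms, screen_was_updated):
            if screen_was_updated:
                self.high_power = True        # any update forces the high-power state immediately
                self.ms_since_update = 0
            else:
                self.ms_since_update += elapsed_ms
                if self.ms_since_update >= HOLD_MS:
                    self.high_power = False   # drop back only after a sustained quiet period

With two cursor blinks per second and a hold time of about 50ms, the card spends roughly 2 x 50ms = 100ms of every second, i.e. about 10%, in the high-power state.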



Much hot air over blinking cursors

Posted Feb 8, 2009 2:44 UTC (Sun) by dimi (subscriber, #2732) [Link] (8 responses)

Yes, but we should not forget that:
* if a computer is idle for ~10 min, the screensaver kicks in
* the cursor blinks only in windows with an editable text field, so if you browse the web, read email, or read a PDF document there is no such blinking
* you have to be idle at the computer for the savings to happen, since you type faster than the cursor blinks

When all of these are put together, the "load factor" of the savings is very small, probably just a few percent over a typical session. Even if you waste 2W as claimed (while the cursor is blinking on an otherwise idle box), that would translate to maybe 0.05-0.1W in real life.
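Back of the envelope (the fractions below are my own guesses, purely for illustration):

    # Rough estimate -- the fractions are guesses, not measurements.
    idle_waste_watts = 2.0      # claimed extra draw while idle with a blinking cursor

    focus_on_editor = 0.3       # fraction of a session with focus on an editable field
    truly_idle      = 0.2       # of that, the fraction with no typing, mouse, audio, video
    before_saver    = 0.7       # of that, the fraction before the screensaver kicks in

    load_factor = focus_on_editor * truly_idle * before_saver    # ~0.04
    print(idle_waste_watts * load_factor)                        # ~0.08 W averaged over the session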

Maybe that's significant somewhere, but please stop claiming we're saving 2W. We are not.

Much hot air over blinking cursors

Posted Feb 8, 2009 7:43 UTC (Sun) by PaulWay (guest, #45600) [Link] (7 responses)

> Even if you waste 2W as claimed (while the cursor is blinking on an
> otherwise idle box), that would translate to maybe 0.05-0.1W in real life.

> Maybe that's significant somewhere, but please stop claiming we're
> saving 2W. We are not.

So what you're saying is that Matthew Garrett measuring this with a meter is wrong and your unsupported and unjustified guesses are right. Matthew Garrett, who knows exactly what kind of hardware he's working with and what kind of power it consumes, is wrong and you, who offer no proof or knowledge of any kind, are right.

Yeah. Thanks for your useless contribution.

Matthew Garrett gave a talk on this at LCA, in which he basically pointed out that the GPU can be switched from updating at 60Hz to updating at 0Hz, with most of its functions also switched off, whenever the graphics driver detects that we're not updating the screen regularly. That's roughly 90% of the power usage of that chip turned off in one go. The start-up time is faster than 1/60th of a second, so you can do it in the time the screen would normally refresh anyway. If that chip's using 5W - and I'd be surprised if even the Intel 915 GPUs use less than that - then that's easily going to deliver 2W of savings.
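As I understood the talk, the driver-side logic is conceptually something like this (a rough sketch with made-up names and thresholds, not the actual code):

    # Conceptual sketch only -- made-up names and numbers, not the real driver code.
    IDLE_FRAMES_BEFORE_POWERDOWN = 30       # ~half a second at 60Hz with no screen updates

    class DisplayEngine:
        def __init__(self):
            self.frames_without_damage = 0
            self.low_power = False

        def on_frame_interval(self, frame_had_damage):
            if frame_had_damage:
                self.frames_without_damage = 0
                if self.low_power:
                    self.low_power = False   # waking up takes < 1/60 s, so no refresh is missed
            else:
                self.frames_without_damage += 1
                if self.frames_without_damage >= IDLE_FRAMES_BEFORE_POWERDOWN:
                    self.low_power = True    # stop scanning out at 60Hz, gate most of the chip

A blinking cursor defeats this by generating an update twice a second, which is why turning the blink off matters.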

Hope these facts help you,

Paul

Much hot air over blinking cursors

Posted Feb 8, 2009 14:33 UTC (Sun) by dimi (subscriber, #2732) [Link] (6 responses)

You are missing the point. I have not taken issue with the measurements that Matthew took, but with the _implied_ power savings.

What Matthew measured is the difference in power usage for an *idle* computer with and without a blinking cursor. Great. But to go from there to saying that we'll be saving 2W over an entire user session is balderdash. It's like measuring the power needed to do the actual blink (which may be 20W over 10ms) and claiming we'll be saving 20W.

There is this notion of a "load factor". First we applied it to the blinking itself, which reduced the 20W instantaneous power usage to about 2W average for an idle computer with a blinking cursor.

To calculate real power savings over a user session you have to ask yourself: out of a typical session (8h?), how long would the computer have a blinking cursor and be totally idle? Not a lot (please re-read my original comment). The 2W savings would easily drop to 0.2W, likely even lower.

Hope this helps,
Dimi.


Much hot air over blinking cursors

Posted Feb 8, 2009 22:29 UTC (Sun) by PaulWay (guest, #45600) [Link] (5 responses)

> You are missing the point. I have not taken issue with the measurements
> that Matthew took, but with the _implied_ power savings.

Oh, good. Then you know that most Linux installs these days have six text screens sitting in virtual space, waking up twice a second to turn their unseen cursor on and off. Yes, that's not a hardware feature. Matthew's been complaining about that one since LCA 2007. As far as I know it's still the case. Whether those wake-ups are coalesced by the kernel these days is anyone's guess.

Right now I have a terminal with two tabs underneath my Firefox window. I'd be more than happy to see any evidence that X, GNOME and gnome-terminal are smart enough to stop that wake-up from occurring twice a second given that the window is completely obscured. Likewise, when I stop typing in this text field, the bar cursor blinks on and off. I think you're underestimating the ubiquity of cursors in our GUIs.

The real problem here is that, as I understand what Matthew said in his talk, the only way to know for sure that we didn't write anything to the screen is by having no display code entered during the period. It would require much more extensive surgery on X (that no-one wants to do) to know that no pixels were touched even though display code was entered. So from what I understand of this, putting in a patch to turn off blinking cursors really can save 2W or more.
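To make that concrete, the coarse heuristic as I understood it looks roughly like this (illustrative pseudocode, not anything from the X tree):

    # Illustrative sketch of the coarse "no display code entered" heuristic.
    rendering_code_entered = False

    def on_render_request():
        global rendering_code_entered
        # Any entry into a drawing path sets the flag, even if no pixel ends up
        # changing -- detecting that would need the deeper surgery on X nobody wants to do.
        rendering_code_entered = True
        # ... actual drawing ...

    def end_of_interval():
        global rendering_code_entered
        safe_to_power_down = not rendering_code_entered   # only certain when nothing ran at all
        rendering_code_entered = False
        return safe_to_power_down

A blinking cursor enters the drawing path twice a second, so the flag is always set and the card can never power down -- hence the patch.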

And I'd still be happy with a 0.2W saving totalled across millions of laptops.

Have fun,

Paul

Much hot air over blinking cursors

Posted Feb 8, 2009 22:48 UTC (Sun) by dimi (subscriber, #2732) [Link] (4 responses)

> Oh, good. Then you know that most Linux installs these days have six
> text screens sitting in virtual space, waking up twice a second to turn
> their unseen cursor on and off.

Of course I am aware. I have addressed this problem in my first post:
A cursor blinks only in a window that has the following properties:
- it has focus
- it's waiting for keyboard input

Moreover, only _one_ window at a time has focus.

This is trivial to test -- just put your six terminals side-by-side.

So your entire argument is based on a false premise. Considering this,
you can now see that the 2W savings can happen only when:
- the focus is on a window with an editor
  (not that common in a typical session)
- the computer is totally idle, i.e. the mouse is not moving,
  no music or video is playing, etc.
- the screensaver hasn't kicked in

How often does that happen?

Continuing to claim the 2W saving is like saying we in fact save 20W because there is a 20W spike every few seconds. It's ridiculous.

Dimi.

Much hot air over blinking cursors

Posted Feb 8, 2009 23:36 UTC (Sun) by jwb (guest, #15467) [Link] (3 responses)

I think he's referring to the six real consoles, not xterms.

Much hot air over blinking cursors

Posted Feb 8, 2009 23:42 UTC (Sun) by dimi (subscriber, #2732) [Link] (2 responses)

That's a complete non-issue. The population of users who use only real consoles (no X) is too small for words. Besides, those are ultra-technical users who are more than capable of changing that setting themselves.

BTW, I'm not even sure if the change under discussion here applies to real consoles...

Much hot air over blinking cursors

Posted Feb 9, 2009 0:46 UTC (Mon) by nix (subscriber, #2304) [Link] (1 responses)

You don't get it. Those six text consoles are there for almost everyone,
including non-technical users, but *even though they're not visible* and
are on different virtual consoles, they *still* wake up the video card
twice a second to blink their invisible cursors on and off.

Much hot air over blinking cursors

Posted Feb 9, 2009 1:11 UTC (Mon) by dimi (subscriber, #2732) [Link]

This claim seems rather dubious -- do you have any reference to support it?

Nevertheless, *nobody* argued against a patch to fix this brain-dead behaviour. Obviously there would be no point in blinking a cursor on a hidden console...

The entire discussion was about turning off blinking on the regular cursor used in X, something that users have been accustomed to for a long time.

