I assume the reason the performance sucks is that the video controller is constantly reading from memory to refresh the screen, even if no one is looking at it or no monitor is plugged in.
Modern computers have power-saving features that disable the display after a period of no keyboard/mouse activity. Does that shut down the refresh in the video controller, thus eliminating the performance impact?
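One rough way to check that empirically (just a sketch, assuming an X session where the standard `xset` utility and the DPMS extension are available; buffer size and iteration count are arbitrary): force the display off and see whether a memory-bandwidth microbenchmark speeds up.

```python
#!/usr/bin/env python3
"""Rough test: does blanking the display (DPMS off) change memory bandwidth?

Don't touch the keyboard or mouse while this runs, or the display
will wake back up mid-measurement.
"""
import subprocess
import time

BUF_SIZE = 256 * 1024 * 1024  # 256 MiB source buffer (arbitrary)
ITERATIONS = 10

def copy_bandwidth_gib_s() -> float:
    """Return approximate memory copy bandwidth in GiB/s."""
    src = bytearray(BUF_SIZE)
    start = time.perf_counter()
    for _ in range(ITERATIONS):
        dst = bytes(src)  # forces a full copy through main memory
    elapsed = time.perf_counter() - start
    # Each iteration reads BUF_SIZE bytes and writes BUF_SIZE bytes.
    return (2 * BUF_SIZE * ITERATIONS) / elapsed / (1024 ** 3)

print(f"display on:  {copy_bandwidth_gib_s():.2f} GiB/s")

# Blank the display; whether scanout actually stops depends on the hardware.
subprocess.run(["xset", "dpms", "force", "off"], check=True)
time.sleep(2)  # give the monitor time to actually power down
try:
    print(f"display off: {copy_bandwidth_gib_s():.2f} GiB/s")
finally:
    subprocess.run(["xset", "dpms", "force", "on"], check=True)
```

If the numbers are the same with the display on and off, the controller is presumably still scanning out the framebuffer regardless.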
What about DVI? VGA CRTs have to be refreshed constantly by the video controller (and a VGA LCD monitor, emulating a CRT, would too), but what about a DVI LCD display?