> While the fashion right now is for every app to render things as a complete bitmap, that's just a fashion of what is considered 'good' this year, that swings back and forth over time, in part depending on the available processing power at each end.
I think this statement is a gross mischaracterization of the last 30+ years of graphics development, a history of hard-won experience with apps, operating systems, and hardware that can't be hand-waved away as mere "fashion". Where is the yearly "swing" in the evolution from serial terminals to framebuffers to fixed-function accelerators to programmable GPUs, a progression that has taken decades? If anything is cyclic here, computing has existed for too short a time to observe it.
The best you can say is that the approach of sending high-level drawing commands across the wire, as X or NeWS did, is what is now being reimplemented by HTML5 applications using JS, the DOM, CSS, and WebGL.
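To make the analogy concrete, here's a minimal sketch of that idea: the server streams high-level drawing commands (rough analogues of X11 requests like XFillRectangle or XDrawString) as JSON, and the client replays them against a Canvas 2D context. The command names and JSON shape are invented for illustration, not any real protocol:

```javascript
// Replay a list of server-sent drawing commands against a Canvas 2D
// context. Analogous to an X server executing wire-protocol requests.
function applyCommands(ctx, commands) {
  for (const cmd of commands) {
    switch (cmd.op) {
      case "rect": // filled rectangle, roughly XFillRectangle
        ctx.fillStyle = cmd.color;
        ctx.fillRect(cmd.x, cmd.y, cmd.w, cmd.h);
        break;
      case "text": // draw a string, roughly XDrawString
        ctx.fillStyle = cmd.color;
        ctx.fillText(cmd.str, cmd.x, cmd.y);
        break;
      default:
        throw new Error("unknown op: " + cmd.op);
    }
  }
}

// In a browser this would be driven by a WebSocket, e.g.:
//   ws.onmessage = e =>
//     applyCommands(canvas.getContext("2d"), JSON.parse(e.data));
```

The point is that the client stays "thin": it only knows how to execute a small drawing vocabulary, while all application logic lives on the other end of the wire, exactly the split X and NeWS made.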