An Early Look at GNOME 3.0 (Linux.com)
Posted Mar 4, 2011 23:07 UTC (Fri) by elanthis (guest, #6227)
In reply to: An Early Look at GNOME 3.0 (Linux.com) by walters
Parent article: An Early Look at GNOME 3.0 (Linux.com)
I can show, measurably, how many of the gnome-shell UI design idioms are actively harmful and in complete opposition to much of the work of the old HIG and to general, well-known good design. I've had a few discussions with Microsoft UX folks (living in Redmond has its advantages, even if I am living in the Belly of the Beast ;) and they're backing up a lot of the hunches I had about this stuff. (On a side note, many Microsoft employees are pretty cool; many of them even have Linux machines for personal use, although they rarely contribute to FOSS for legal reasons. I'm good friends with one of the guys who previously worked on the core .NET networking layers, who quit because he got sick of the Microsoft corporate idiocy, despite loving .NET; he was pretty bummed that he's permanently barred from working on Mono because of his previous involvement with the upstream .NET source. I've heard from more than a few Microsoft games folks that they wish to God that software patents would die, that copyright expired within five years, and that releasing source once a work enters the public domain were mandatory... too bad the lawyers and executives don't see things the way their engineers do.)
Back on the topic of gnome-shell's UI anti-features, take the example of the application "menu." There's a reason all the major OSes use a drop-down/pop-up menu for that: pointer locality. A drop-down menu means that the following click will be close to the original click. With gnome-shell, I have to put my cursor in the upper-left corner, and then possibly move it to the far right (on a 30" screen) to select the application I want. On small netbook screens, it may not be all that bad, and on touch UIs the issue disappears entirely. But on conventional desktop computers it's a big problem that makes the gnome-shell UI harder to use, more tiring for the arm, and so on. On the other hand, the kinds of menus that are good for mice are bad for touch.

This goes to another point of design: you have to actually have a target device in mind. There's a reason that iOS and OS X have some very different UI behavior, despite looking very similar from the 10,000-foot view. OS X is meant for Big Computers, and iOS is meant for Little Devices. You simply CANNOT make a UI that works well for both. gnome-shell has chosen in many places to take a UI that works well for little devices while in other places still being very clearly designed for big computers. It's a horrific mix of UI design patterns that ends up fitting neither class of device particularly well. The design isn't "incomplete," it's flat-out incorrect. Unless gnome-shell plans on including two UIs (which would not be hard given its CSS/JS implementation), each optimized for a different screen-size and input-device class, it's a dead-end UI project, judged purely on measurable, quantifiable aspects of how the UI is interacted with, backed by three decades of collective UX knowledge and wisdom from the professionals.
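To put a rough number on the pointer-locality argument, here's a back-of-the-envelope Fitts's law comparison. Everything in it is an assumption on my part: the screen width, the travel distances, the target sizes, and the constants a and b are illustrative placeholders, not measurements.

    import math

    def fitts_time(distance_px, width_px, a=0.1, b=0.1):
        """Fitts's law: movement time T = a + b * log2(D/W + 1).

        a and b vary by device and user; the values here are
        illustrative placeholders, not measurements.
        """
        return a + b * math.log2(distance_px / width_px + 1)

    # Case 1: a conventional drop-down application menu. The follow-up
    # click lands close to the click that opened it (assume ~150 px of
    # travel to a ~30 px-tall menu item).
    menu = fitts_time(150, 30)

    # Case 2: the gnome-shell overview on a 30" screen (~2560 px wide).
    # Leg 1: reach the hot corner. Edge pinning makes the corner behave
    # like a very large target, so it is cheap to acquire.
    leg1 = fitts_time(1200, 500)
    # Leg 2: travel from the corner back across the screen to an
    # application icon on the far right (assume a ~60 px icon).
    leg2 = fitts_time(2400, 60)

    print(f"drop-down menu:     ~{menu:.2f} s per acquisition")
    print(f"overview, two legs: ~{leg1 + leg2:.2f} s per acquisition")

The absolute numbers are meaningless; the point is that the index of difficulty grows with log2(D/W), so a path whose distance is an order of magnitude longer comes out measurably slower no matter how you tune the constants.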
As another example, take the separate widget set for the shell, which breaks consistency, harms discoverability, duplicates engineering effort, etc. There's no good reason why gnome-shell's run dialog should look the way it does, for instance; it's needless variation from what every other app looks like. gnome-shell's widget set is just as bad as XMMS or mplayer-ui or xine-ui or any of those other weird custom-UI applications that GNOME used to stand firmly against with its investment in, and adherence to, a well-researched HIG. It's still not clear, for instance, that the "Windows | Applications" text in the overview is really a pair of weird-looking tabs rather than a pair of labels.
A final issue is the screen-edge magic for the applications/windows menu. Windows 7 has something similar for Show Desktop (the bottom-right corner) that can easily be turned off, and for good reason: when my laptop is sitting in my lap and the mouse is moving around a bit on the armrest, the cursor often ends up in that corner (that's the reason corner hotspots are supposedly so great: they're very easy to reach with a mouse) and all my windows go away. With gnome-shell, I'd end up with the same obnoxious effect, except there's no way to turn it off like in Win7. A touch device does not have this problem; but then, a touch device has absolutely no use for edge/corner hotspots, either. Desktop PCs don't usually have this problem either, but laptops can. So gnome-shell has a design that's optimal for touch devices, except where it's optimal for non-touch devices, except where it's optimal for small screens without pointers, except where it's not optimal for being used with notebooks at all... it's schizophrenic UI incarnate.
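The corner trade-off is quantifiable too. Here's a toy sketch of its two-sided nature; the resolution, the trigger zone, and the drift model are all made-up assumptions, and the only real mechanism in it is cursor clamping.

    import random

    W, H = 2560, 1600  # assumed desktop resolution

    def clamp(x, y):
        """The OS pins the cursor to the screen edges. This is exactly
        why corners are 'infinitely large' Fitts targets: any overshoot
        toward a corner still lands on it."""
        return min(max(x, 0), W - 1), min(max(y, 0), H - 1)

    # Deliberate use: a careless flick with huge overshoot is still a
    # bullseye.
    print(clamp(-5000, -5000))  # -> (0, 0): the hot corner, every time

    # Accidental use: a mouse wandering on an armrest, modelled as
    # jitter with a slight drift toward the top-left (numbers made up).
    random.seed(0)
    x, y = W // 2, H // 2
    entries = 0
    for _ in range(10_000):
        x, y = clamp(x + random.randint(-35, 30), y + random.randint(-35, 30))
        if x < 5 and y < 5:  # inside the hot-corner trigger zone
            entries += 1
    print(f"accidental hot-corner entries in 10,000 jittery moves: {entries}")

The same clamping that makes the corner trivially easy to hit on purpose makes it trivially easy to hit by accident, which is presumably why Win7 lets you turn its corner off.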
Back to your original comment, I'd like to note that "being GNOME 2" is not important to me, nor should it really be important to anyone. Being anything _specific_ is not important; being a good, usable, consistent, desktop-PC-friendly environment is what's critical.
The only problem I have with change in general is when it happens too often. Users don't like constant change. Once they learn something, they want it to keep working that way, unless the "way" is horrifically difficult, slow, or error-prone. Moving icons and menus around for discoverability's sake means that a handful of new users save a few seconds each while hundreds of thousands of existing users lose several minutes each (it's easier to learn a new thing than to relearn an old habit). So I was irritated with how GNOME 2 and its core apps tended to move things around with every release while trying to home in on an optimal "best n00b-friendly UI" at the expense of the existing userbase. Even in the Windows world, which is often criticized for not changing enough, I've seen multiple people go into a borderline rage over changes from XP to Vista/7. The Add/Remove Programs entry, for instance, moved from the Control Panel to My Computer. I personally had never used WinXP much, so when I got my first Win7 machine, I looked in My Computer for it first and was pleased to find it there. Other people threw a fit because they couldn't find it: they had ten years of repetition training them to go to Start->Control Panel->Add/Remove Programs, and now it was gone. Users hate change, unless the change actually fixes something that was annoying them in the first place.
Change is still necessary, sure, especially when you got it really wrong at first. But if you only got it a little wrong, or you just _think_ you got it wrong and are wildly guessing at what might be better, change is bad bad bad, and it should not be done. In other words, "don't fix what ain't broke."