Personally, I think Apple has gone too far in attempting to unify their interfaces across devices, at least from the point of view of an advanced user.
As an example, the new auto-hide scrollbars mean you can no longer tell at a glance how much of the document there is or where you are in it. Unless some of the content is actually visibly clipped by the edge of the scrollable area, you have no indication that you're looking at something that scrolls.
The real problem with a unified interface is trying to wedge all the differing capabilities of your device set into a unifying metaphor that doesn't stretch too badly.
I'm coming at the problem from a game development point of view; I do console and mobile phone games, and we've developed a cross-platform game engine that (much as an OS does) presents a unified programming interface regardless of the underlying hardware.
Or that's the theory, anyways. In practice, there's abstraction breakage, sometimes severely. As an example, consider the situation we faced when the Wii came along; we had a couple of implicit assumptions in the engine design that the Wii violated.
Single Pool of System Memory: The Wii has two banks of system memory, one slow and large, the other fast and smaller.
Many Gamepads, One Mouse/Keyboard: The Wii controllers all have lightgun capabilities, so they can all have screen positions.
Gamepads Don't Change: The Wii controller has dockable subcontrollers.
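To make the first breakage concrete, here's a minimal sketch of what patching the "single pool of memory" assumption looks like; all names here are invented for illustration, not our actual engine API, and the size-based policy is a toy stand-in for real per-pool budgeting:

```cpp
#include <cassert>
#include <cstddef>
#include <cstdlib>

// On hardware with two banks (one fast and small, one slow and large),
// the old one-big-heap signature EngineAlloc(bytes) no longer fits:
// either callers say which kind of memory they want, or the engine
// has to guess on their behalf.
enum MemPool { kFastSmall, kSlowLarge };

struct Allocation {
    void*   ptr;
    MemPool pool;  // which bank actually satisfied the request
};

// Toy policy: small allocations go to the fast bank, big ones to the
// large bank. A real engine would track per-pool budgets and fall back
// when the preferred bank is exhausted.
Allocation EngineAlloc(std::size_t bytes) {
    MemPool pool = (bytes <= 64 * 1024) ? kFastSmall : kSlowLarge;
    return { std::malloc(bytes), pool };
}
```

The point isn't the policy itself; it's that the extra `pool` field leaks the hardware's shape into every caller that cares where its memory lives.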
This kind of thing breaks your abstractions, often badly. You either wind up wrapping a superset of all possible cases around everything (in which case the game has to understand which platform it's on to reach the functional parts), using capability detection (in which case the game has to adapt to whatever capabilities the hardware reports), or finding a new abstraction that hides the differences once again.
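The capability-detection option might look something like the sketch below, using the Wii controller as the running example; the names and the capability set are hypothetical, chosen for illustration rather than drawn from our engine:

```cpp
#include <cassert>

// Hypothetical capability flags a pad driver might report per controller.
struct PadCaps {
    bool hasPointer;      // can report an on-screen position (e.g. Wii remote IR)
    bool hasAnalogStick;  // e.g. a docked Nunchuk subcontroller
    int  buttonCount;
};

// The platform layer fills this in at controller enumeration time...
enum PadKind { kClassicPad, kWiiRemote, kWiiRemoteWithNunchuk };

PadCaps DetectCaps(PadKind kind) {
    switch (kind) {
        case kWiiRemote:            return { true,  false, 7 };
        case kWiiRemoteWithNunchuk: return { true,  true,  9 };
        default:                    return { false, true,  12 };
    }
}

// ...and game code branches on capabilities, not platform names:
bool UsePointerAiming(const PadCaps& caps) {
    return caps.hasPointer;
}
```

The win is that the game never asks "am I on a Wii?"; the cost is that every input path now needs a fallback for each capability it might not get.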
With our game engine, we have the luxury of retooling our abstractions whenever a new platform arrives; we don't need to have a stable API or ABI because we're effectively our only user.
General purpose operating systems and development ecosystems don't have that luxury. Even if you aggressively deprecate your old mechanisms for new software, you have to keep the old stuff working.
The same is somewhat true for GUIs. You can only change so much out from under your users before they revolt. More to the point, if you change things too much, you lose the "lock in" effect of accumulated user experience. If your users have to relearn everything anyway, they may well choose to go learn a different system instead, perhaps one that won't yank the rug out from under them every time the major version number changes...