Mobile devices, tablets included, spend much of their time in exactly the environment that exposes the flaws in "treat remote things as local" models of computation, from RPC right up to the present day. Internet access is patchy: it's available one moment and gone the next, bandwidth varies enormously, and two apparently simultaneous requests can experience completely different network conditions for no discernible reason.
So it turns out that you want to do a lot of work in the client. Instead of a dumb renderer making a lot of calls to a remote server where all the smarts live, you have two copies of everything: one in the client, making things appear to work for the user instantly, and one in the server, catching up with the client state whenever a window of Internet access opens. They could be exactly the same, but for a variety of reasons they are more likely to be quite different, perhaps not even written in the same programming language.
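The two-copy pattern above can be sketched in a few lines. This is a minimal illustration, not a real sync engine; all the names (`Server`, `Client`, `edit`, `flush`) are hypothetical, and a production client would also need conflict handling and durable storage.

```python
class Server:
    """The authoritative copy, reachable only intermittently."""
    def __init__(self):
        self.state = {}

    def apply(self, op):
        key, value = op
        self.state[key] = value


class Client:
    """The local copy: instant for the user, synced opportunistically."""
    def __init__(self, server):
        self.server = server
        self.state = {}      # what the user sees right now
        self.pending = []    # edits not yet acknowledged by the server
        self.online = False

    def edit(self, key, value):
        self.state[key] = value          # appears to work instantly
        self.pending.append((key, value))
        self.flush()

    def flush(self):
        # Push queued edits only during a window of connectivity.
        if not self.online:
            return
        while self.pending:
            self.server.apply(self.pending.pop(0))


server = Server()
client = Client(server)
client.edit("title", "Draft")   # offline: the edit lands locally only
assert client.state == {"title": "Draft"} and server.state == {}
client.online = True
client.flush()                  # connectivity returns; the queue drains
assert server.state == {"title": "Draft"}
```

The design choice worth noticing is that `edit` never waits on the network: the user's view updates immediately, and the pending queue is the only thing that cares whether the Internet happens to exist right now.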
If you do this perfectly you get a really nice app, something every Nexus / iPad / whatever owner is happy to use, and you contribute to the (false) notion that everywhere has Internet access all the time these days.
Obviously you can't ever quite get it perfect, and as you deviate further and further from the ideal you interfere with the illusion. A spinning "wait" icon appears and the user curses. Somewhat more frustratingly, something they just did magically undoes itself before their eyes as the client state catches up with a server that has lost an update, or reports that an action which seemed possible was actually rejected. Worst of all, the app just freezes, unable either to complete the intended action or to return the user to their state of bliss, waiting forever for the Internet.