Because Linux systems aren't always binary compatible. There's no guarantee that the FF binary shipped by Mozilla will actually run on a given distro. It's rather difficult for the FF folks to make the binary as widely compatible as it is, and they make some sacrifices in the process. For example, they bundle their own versions of quite a few libraries, which may not be built with the same settings as other apps' bundled versions and thus may not be perfectly compatible. FF's NSS library is a good example of this.
If there's a security hole in libpng, then with a distro-shipped app all you need to do is update one package. If you're using vendor-supplied packages, you need to find out which of your apps are affected and manually update each of them.
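To make that hunt concrete, here's a rough sketch of auditing a binary's shared library dependencies with ldd (assuming a Linux system with ldd available; /bin/sh and libpng here are just illustrative choices). On a distro-managed system one package-manager query replaces this; with bundled libraries you'd have to repeat it for every vendor-supplied app directory:

```shell
# Sketch: ldd lists the shared libraries a binary depends on, so you can
# see what a fix to one library actually affects.
deps=$(ldd /bin/sh)
echo "$deps"

# Check whether a particular binary links a given library (libpng is
# just an illustration; substitute whichever library got the fix).
ldd /bin/sh | grep -i 'libpng' || echo "no libpng dependency in /bin/sh"
```

Note this only catches dynamically linked copies; a vendor app that statically linked or privately bundled the library won't even show up, which is exactly the problem.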
Why not share a common libpng, you ask? Well, in the case of libpng that's not unreasonable, but not all libraries have strong API/ABI compatibility guarantees. One app may need version 1.1 while another needs version 1.2.
OK, so parallel-install the libraries. Sounds OK, right? Well, it does until you realize that the 1.1 and 1.2 libraries are _both_ linked into the same process via a third-hand library dependency, and everything falls in a heap. To work around this you need strict symbol versioning, and even then it's far from reliable.
I see two ways to tackle app distribution. One way is for the app to be completely self-contained, including all dependencies that aren't part of the OS itself. This results in bloated installs (lots of duplicate libraries and data), security problems, app compatibility problems ("DLL Hell"), memory bloat (DLLs/shared libs can't be shared in memory between different apps), and all sorts of other issues. If some rules are very strictly followed it can work, as Microsoft Windows proves. Ever packaged an app for Windows, though? It's not fun. Coding for the platform is also more complex: you must, for example, never assume that other DLLs are using the same C runtime as you, so you can't free() memory they've malloc()'d or vice versa, you can't pass a FILE* between DLLs, etc. etc.
The other way is to ensure that the OS has a way to provide all dependencies centrally, so apps don't carry them at all. To make this work in practice the OS has to be responsible for app packaging and installation too, as it's hard to guarantee perfect compatibility of all libraries across distros, especially given the problems with multi-versioning. This means the app user is somewhat dependent on the OS vendor to provide updated versions, but it allows the OS vendor to ensure app compatibility and stability, eliminate shared library conflicts, etc.
I don't think either approach is "right" ... and personally, I wouldn't mind seeing improved distro compatibility in package formats, package names, etc., so a package could be produced that'd reliably install on multiple distros. Sometimes, though, such a package simply _couldn't_ be installed due to library versioning conflicts, so you'd still have to maintain a couple of versions or start bundling your libraries.
It's not a simple problem, and all the current solutions kind of suck.