(Disclaimer: Also a Google employee, although I don't work on Chrome).
As much griping as goes on about Google's forks and patches, I generally
find their approach to be responsible and practical. The focus is on
getting something working well, even if that means patching and bundling
system libraries, followed by pushing those patches upstream and
unbundling, if possible. Or, occasionally, becoming upstream.
For open source projects, it's usually not an issue. You publish the
source of your application. You publish patches to or patched versions of
the necessary dependencies. Packagers/maintainers then have the option of:
1) Updating to the upstream version that has your patches
2) Patching the shipped version
3) Bundling the library with the application (in some manner)
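For a concrete sense of option 2: in RPM terms, carrying the application's patch downstream is roughly a one-line affair in the library's spec file (package and patch names here are made up for illustration):

```
# somelib.spec (fragment) -- carrying the application's patch downstream
Patch0:  somelib-1.2-fix-threading.patch

%prep
%autosetup -p1   # unpacks the source and applies Patch0 automatically
```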
Yes, it's a little extra work for the repository maintainer, but not much.
It's also a (very) little extra work for the application developer to make
sure the application can be built and linked against the system version of
the library. As a former Fedora maintainer, I've had to resort to some version
of that several times. Including coordinating with the maintainers of
dependent packages. Of course, Ubuntu has kind of shot themselves in the
foot on this issue with the LTS stuff, but that's (kind of) beside the point.
Now, for closed source apps, the burden is on the application developer
and, frankly, I don't want them trying to use the more change-prone system
libraries. Bundle. Install into /opt/<application> and go away. That's
all they usually do anyway.
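The usual shape of that is a self-contained tree under /opt plus a thin wrapper on $PATH that points the loader at the bundled libraries. A minimal sketch, with a hypothetical app called "acmeapp":

```shell
# Hypothetical layout for a bundled closed-source app "acmeapp":
#   /opt/acmeapp/bin/acmeapp        the real binary
#   /opt/acmeapp/lib/               bundled copies of its shared libraries
#   /usr/local/bin/acmeapp          a thin wrapper script on $PATH
# Nothing here touches or depends on system library versions.
cat > acmeapp-wrapper <<'EOF'
#!/bin/sh
# Prepend the bundled library directory so the loader finds those
# copies first, then exec the real binary in place of this shell.
LD_LIBRARY_PATH="/opt/acmeapp/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}" \
    exec /opt/acmeapp/bin/acmeapp "$@"
EOF
chmod +x acmeapp-wrapper
```

An RPATH of $ORIGIN/../lib baked into the binary at build time achieves the same thing without a wrapper, but the wrapper works even when you can't relink the vendor's binary.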
I'm not entirely up to speed on the Firefox/Mozilla situation, but the last
time I checked at least part of the problem was distributions packaging
mostly-internal unstable libraries as "system" libraries and then linking
other packages against them. Oh, and failing to understand/use ELF
versioning. Most distributions are perfectly capable of having multiple
versions of libraries installed and linked appropriately, provided the
library authors follow the .so versioning rules.