> Some of us would like choices that are a little more in between.
Well, Debian gets somewhat close to that.
As you know, there are three main branches of Debian at any one time: Stable, Testing, and Unstable.
Debian's package management system is fairly flexible when dealing with those sorts of things. You can optionally configure 'apt pinning', which gives different weights to different branches.
For example, you can configure apt-get to always pull packages from Testing, but fall back to Unstable for packages that don't exist in Testing.
If you configure Stable + Testing + Unstable on your system, you can give the greatest weight to Testing. This way you avoid a lot of the churn that comes into play when using Unstable, but can still reach into Unstable for specific packages. For example, OpenOffice.org 3 is available in Unstable but not Testing, so you can pull that specific package down and give it priority.
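A pinning setup along those lines might look like this (a sketch only; the priority numbers are illustrative, adjust to taste):

```
# /etc/apt/preferences -- prefer Testing, allow Unstable on request
Package: *
Pin: release a=testing
Pin-Priority: 900

Package: *
Pin: release a=unstable
Pin-Priority: 300
```

With both branches in sources.list, apt then prefers Testing by default, and a one-off `apt-get -t unstable install openoffice.org` reaches into Unstable for that one package and whatever dependencies it genuinely needs.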
This sort of thing is how I get by with Debian. My work system uses Testing and my home systems use Unstable. I use approx (a package proxy) to cache packages so that I don't waste bandwidth downloading multiple copies of the same packages.
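If you're curious, an approx setup is only a couple of lines (the hostname and mirror here are made up for illustration; approx listens on port 9999 by default):

```
# /etc/approx/approx.conf on the caching box
debian http://ftp.us.debian.org/debian

# /etc/apt/sources.list on each client, pointed at the proxy
deb http://proxybox:9999/debian testing main
```

Every client asks the proxy, the proxy fetches each package from the mirror once, and later requests come out of its cache.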
The major suckage about Testing, however, is that directly after a release it's mostly useless. Unless you're involved in development, you're far better off tracking Stable for some time after a release before switching back.
Now if you want to track 'Stable' and backport packages, it usually takes a bit more effort.
For that, you're better off leaving your apt-get configuration pointed at Stable only for binary packages, while tracking Testing or Unstable for source packages. Tracking backports.org is also a good idea.
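Concretely, that split looks something like this in sources.list (the mirror and backports lines are just examples; substitute your own mirror and your release's codename):

```
# /etc/apt/sources.list -- binaries from Stable, sources from Testing
deb     http://ftp.us.debian.org/debian stable main
deb-src http://ftp.us.debian.org/debian testing main

# backports.org (use your release's codename in place of etch)
deb http://www.backports.org/debian etch-backports main contrib non-free
```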
This way you can recompile packages from source specifically for Stable. Almost all packages will backport themselves with little to no effort; sometimes you have to recompile a few dependencies as well, but that's usually not much work.
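The actual rebuild is only a couple of commands (the package name is hypothetical; `apt-get source -b` needs the deb-src line for Testing or Unstable that you configured):

```
# backporting 'somepackage' from your deb-src branch
apt-get build-dep somepackage    # install its build dependencies
apt-get source -b somepackage    # fetch the source and build binary .debs
dpkg -i somepackage_*.deb        # install the freshly built package
```

If `build-dep` complains about versions Stable can't satisfy, those are the dependencies you backport first, the same way.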
This matters because if you pull binary packages from Testing instead, apt has a tendency to pull in many more dependencies than are actually needed. You end up with a hybrid testing/unstable/stable system, which defeats the whole point of running Stable.
But by recompiling packages you keep the benefits of a stable system while still getting the newer versions of software you need for your production machines.
This sort of approach is largely unsuccessful with Ubuntu, unfortunately, because Ubuntu is much less disciplined about packaging and backwards compatibility. The only reason it works as well as it does in Debian is sheer brute-force developer hours.
However, even if you run a pure Unstable system you won't be able to keep up with Fedora. Fedora is a developer's playground and gets cutting-edge features before any other system.
Debian benefits from this hugely. If it weren't for Fedora living on the cutting edge, shaking out bugs, and doing all that sort of work, that work would fall to Debian, which would consume massive amounts of resources and manpower. I don't think Debian, as an organization, would be able to cope with what Fedora does.
Without Fedora (and Ubuntu) Debian would be much more of a mess to deal with.
(And vice versa: without Debian working on multiple architectures and making sure that all the diverse software options work cleanly with one another, Fedora and the others would be worse off. For example, with Debian I can choose equally between Sendmail, Exim, or Postfix without badly breaking things, and choose equally between KDE, GNOME, LXDE, XFCE, and even others. Everything mostly 'just works'.)