when making a change (no matter how trivial) you would need to write a modified version of the entire file and do the appropriate fsyncs (for the file and the metadata)
with a database you just tell it what changes you want done and it makes them (probably far more efficiently than you would, and with far less chance of missing some corner case)
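to make the contrast concrete, here's a rough sketch of both approaches. the flat-file side is the full safe-rewrite dance (temp file, fsync, atomic rename, fsync of the directory); the database side is one statement. the `settings` table and its columns are just made-up examples:

```python
import os
import sqlite3
import tempfile

def rewrite_file_safely(path, new_contents):
    # the flat-file way: write a complete new copy, fsync it,
    # atomically rename it over the original, then fsync the
    # directory so the rename itself is durable
    d = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=d)
    try:
        with os.fdopen(fd, "w") as f:
            f.write(new_contents)
            f.flush()
            os.fsync(f.fileno())
        os.replace(tmp, path)
        dirfd = os.open(d, os.O_RDONLY)
        try:
            os.fsync(dirfd)
        finally:
            os.close(dirfd)
    except Exception:
        if os.path.exists(tmp):
            os.unlink(tmp)
        raise

def update_db(dbpath):
    # the database way: describe the change; the engine handles
    # journaling, syncing, and crash recovery for you
    con = sqlite3.connect(dbpath)
    with con:  # commits on success, rolls back on error
        con.execute("UPDATE settings SET value = ? WHERE key = ?",
                    ("dark", "theme"))
    con.close()
```

note how much of the flat-file version is pure corner-case handling, and it still only covers the cases i happened to think of.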
searching an XML text file involves reading the file, parsing it, then walking the data in memory (or reading the file, parsing it as you go, and aborting the parse once you have the data you need)
databases have many tricks they can use to speed up data access (indexes, binary data representations, etc)
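a small illustration of the difference, with an invented `contacts` dataset: the XML lookup has to parse and walk the whole document, while the indexed database lookup goes straight to the row:

```python
import sqlite3
import xml.etree.ElementTree as ET

# a toy dataset, stored once as XML and once in an indexed table
XML = "<contacts>" + "".join(
    f"<contact><name>user{i}</name><email>u{i}@example.com</email></contact>"
    for i in range(1000)) + "</contacts>"

def find_in_xml(name):
    # flat-file way: parse everything, then walk it in memory
    root = ET.fromstring(XML)
    for c in root:
        if c.find("name").text == name:
            return c.find("email").text
    return None

def find_in_db(con, name):
    # database way: the index turns the scan into a direct lookup
    row = con.execute("SELECT email FROM contacts WHERE name = ?",
                      (name,)).fetchone()
    return row[0] if row else None

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE contacts (name TEXT, email TEXT)")
con.execute("CREATE INDEX idx_name ON contacts(name)")
con.executemany("INSERT INTO contacts VALUES (?, ?)",
                [(f"user{i}", f"u{i}@example.com") for i in range(1000)])
```

with a thousand rows you won't notice either way; with a few million, you will.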
as for locking the data in a 'hard for the user to use' format, I would say that putting it in a standard database that many different tools can readily access (in many cases making it look like a spreadsheet, and including dump/restore to/from flat text files) makes it pretty easy to get at the data.
also, keep in mind that one commonly used database for this is sqlite. that database engine keeps all the data in one file (just as easy to check into a version control system), and does not require communication with a separate daemon. it basically acts as a file manipulation library in your software, so it has very low overhead.
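this is what using it looks like in practice; no server to install or start, just open a file (the `notes.db` name and table are made up for the example):

```python
import sqlite3

# no server process: connecting just opens (or creates) one ordinary file
con = sqlite3.connect("notes.db")
con.execute("CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)")
con.execute("INSERT INTO notes (body) VALUES (?)", ("hello",))
con.commit()

# the whole database dumps to flat SQL text, which diffs and
# checks into version control just like any other text file
dump = "\n".join(con.iterdump())
con.close()
```

python ships the `sqlite3` module in its standard library, which is part of why sqlite ends up as the default answer for this kind of thing.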
as for redundancy and failover, that can mean many things. if you want the data stored on a central box (rather than sitting on someone's laptop that can break, get lost/stolen, etc), databases can fail over to backup boxes transparently to the user.
for a single home user this (usually) does not matter, for a small business this can be critical.
In addition, with a database back-end it is relatively easy to allow multiple people to share access to the data at once. again, not a feature that a single home user normally cares about.
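even with sqlite you get the basic version of this for free: several programs can open the same file and the engine sorts out the locking, where with a hand-rolled flat file you'd be inventing your own lock protocol. a minimal sketch (the `tasks` table is invented; a client/server database extends this to writers on different machines):

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "shared.db")

# two independent connections, as two separate programs (or users) would have
writer = sqlite3.connect(path)
writer.execute("CREATE TABLE tasks (id INTEGER PRIMARY KEY, title TEXT)")
writer.execute("INSERT INTO tasks (title) VALUES ('write report')")
writer.commit()  # once committed, other connections see the row

reader = sqlite3.connect(path)
rows = reader.execute("SELECT title FROM tasks").fetchall()

writer.close()
reader.close()
```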