When we dealt with numbers in the thousands (10^3), approximating a kilo as 2^10 was only 2.4%
off. Now that we routinely deal with numbers in the billions (10^9), approximating a giga as
2^30 is 7.4% off. Some of us already deal with numbers in the trillions (10^12), and
approximating a tera as 2^40 is nearly a full 10% off!
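The drift compounds by a factor of 1.024 at each prefix step, which is where those percentages come from. A quick sketch in Python makes the progression explicit:

```python
# Relative error of each binary power when used to approximate
# its decimal SI prefix: 2**(10*n) / 10**(3*n) - 1.
prefixes = ["kilo", "mega", "giga", "tera"]
for n, name in enumerate(prefixes, start=1):
    drift = 2 ** (10 * n) / 10 ** (3 * n) - 1
    print(f"{name:>4}: 2^{10*n} vs 10^{3*n} -> {drift:6.2%} off")
# kilo: 2.40%, mega: 4.86%, giga: 7.37%, tera: 9.95%
```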
Now if you do binary arithmetic in your head, so that when you see 14,463,188,475,466, you
instantly know that it is 13.2 * 2^40, then this comment doesn't apply to you. But you don't.
When you see "14,463,188,475,466" you approximate it in your head as "14.5 tera". If you tell
someone else that you are looking at 14.5 tera, and they think that you mean 14.5 2^40's, then
they are overestimating the number you are looking at by more than 10%!
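That overestimate can be checked directly; this sketch recomputes the figures from the example above:

```python
n = 14_463_188_475_466

# In decimal units, n is about 14.46 * 10^12 -- it rounds to "14.5 tera".
print(f"{n / 10**12:.2f} * 10^12")    # 14.46

# In binary units, the same number is only about 13.15 * 2^40.
print(f"{n / 2**40:.2f} * 2^40")      # 13.15

# A listener who hears "14.5" and assumes 14.5 * 2^40 overshoots by:
print(f"{14.5 * 2**40 / n - 1:.1%}")  # 10.2%
```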
A "kilo" has meant 10^3 to the scientific world since 1795. A "tera" has meant 10^12 since
1960. Programmers' use of units is eventually going to have to become compatible with the
larger scientific world, not least because the numbers we deal with are getting bigger.