
Mozilla on the coming version-100 apocalypse

Posted Feb 17, 2022 12:24 UTC (Thu) by excors (subscriber, #95769)
In reply to: Mozilla on the coming version-100 apocalypse by rsidd
Parent article: Mozilla on the coming version-100 apocalypse

He also declared that on his death the version number will be changed to π, and the code will be frozen forever: "From that moment on, all 'bugs' will be permanent 'features.'" (https://tug.org/TUGboat/tb11-4/tb30knut.pdf)
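Knuth's scheme is that each bugfix release after TeX 3.0 appends one more digit of π to the version number (3.1, 3.14, 3.141, ...), so the version converges to π without ever reaching it. A minimal sketch of that numbering, with the digits of π hardcoded for illustration:

```python
# Illustrative sketch of Knuth's versioning scheme: each post-3.0
# bugfix release appends one more digit of pi to the version string.
PI_DIGITS = "31415926535897932384"  # leading decimal digits of pi

def tex_version(n):
    """Version string after the n-th bugfix release following TeX 3.0."""
    return PI_DIGITS[0] + "." + PI_DIGITS[1:1 + n]

print(tex_version(1))  # 3.1
print(tex_version(8))  # 3.14159265
```

The `tex_version` helper is hypothetical, not anything TeX itself provides; the point is just that the sequence of versions is determined in advance and bounded, which is the formal expression of the freeze.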

He's not claiming that the software will have reached perfection at that point, merely that it will be a local optimum. Long-term stability is very valuable to users, and no bug that has gone unnoticed for >30 years will be important enough to be worth breaking that stability.

I think it turns out he was wrong. Nobody was asking for PDF or Unicode or OpenType support in 1990, but they became important over time, and those features had to be added in forks like e-TeX and pdfTeX and XeTeX and LuaTeX. Nowadays if you want to write a *TeX document or process someone else's *TeX document, you'll almost certainly have to use one of those other engines, and you'll probably end up reading outdated documentation and suffering package compatibility problems until you eventually figure out which is the right gigabyte-sized TeX distribution to download (TeX Live, MiKTeX, proTeXt, ...) and the right command to run (lualatex?). It's kind of a mess. That seems worse than having a single official continuously-developed project that is easy for users to follow, which can still aim to largely preserve backward compatibility and can let you download old releases when you really need perfect bug compatibility.



Mozilla on the coming version-100 apocalypse

Posted Feb 17, 2022 12:59 UTC (Thu) by anselm (subscriber, #2796)

I think it turns out he was wrong. Nobody was asking for PDF or Unicode or OpenType support in 1990, but they became important over time, and those features had to be added in forks like e-TeX and pdfTeX and XeTeX and LuaTeX.

Remember that Knuth started TeX (and METAFONT) mostly because he was unhappy with the way his own books were being typeset. If you think of TeX as an engine for typesetting The Art of Computer Programming, it's probably fine to freeze it because at least the system is very well documented – as long as you have a way of putting black pixels onto a white sheet of paper, you'll always be able to use TeX and METAFONT to typeset Knuth's books, even if you're a space alien from the 43rd century. (I'm saying this as someone who spent quite some time developing DVI-to-device software in the late 1980s and early 1990s.)

PDF, Unicode and all that only come in if you insist on using fonts that aren't Computer Modern, or languages that aren't English. It's true that that applies to many if not most of us these days, but to be fair, it wasn't really part of the original specification. We can probably count ourselves lucky that the original system was open enough to admit that sort of radical change, and that now, 40 years after it originally came out, it is still a force to be reckoned with.

Mozilla on the coming version-100 apocalypse

Posted Feb 19, 2022 5:50 UTC (Sat) by rsidd (subscriber, #2582)

I agree with you; I myself use pdflatex or xelatex (the latter for Unicode support, which is indeed the way to go). METAFONT was an interesting idea but far too slow in practice; moreover, it doesn't render vector font outlines directly, but generates a bitmap for each size and device, and it turned out that outline formats with simpler curves (quadratic Béziers in TrueType, cubic in PostScript and OpenType CFF) are good enough. So I think TeX should be regarded as a frozen base that can be built upon by other projects like LuaTeX and XeTeX, while METAFONT, which most TeX distributions no longer rely on, can be regarded as perfect, but dead.


Copyright © 2026, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds