Font and type development at LGM 2014
Font production occupied a significant subset of the sessions at Libre Graphics Meeting 2014 in Leipzig, Germany. Driven in large part by the massive increase in open-licensed fonts over the past few years, recent interest in the community has turned toward automating those parts of the production pipeline that are still dominated by repetitive, manual work. At the same time, the increased interest in font development in the open-source community is visible in other ways, with far-reaching effects.
Historically, the focus of free-software font development effort has been FontForge, the graphical outline editor. Progress on FontForge has increased in recent years; in addition to general stability improvements and small feature enhancements, the editor has gained a built-in web server for immediate testing of web font rendering as well as a collaboration server that allows multiple users to work on the same font file in real time from different machines.
Most recently, though, the project's emphasis has been on improving FontForge support for the Unified Font Object (UFO) file format, which was highlighted in FontForge's January 2014 stable release. UFO is an XML-based format that stores each glyph from a given font as a separate file in its directory structure. Unlike FontForge's native, monolithic file format SFD, UFO is readable by a wide variety of external applications (including the leading proprietary font tools), and simplifies tracking changes to individual glyphs in version control systems like Git. Consequently, it is popular as an interchange format not just for full-fledged applications, but for smaller scripts and utilities; making FontForge compatible with other UFO implementations opens it up to a significant number of new users.
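To illustrate the format, each UFO glyph lives in its own small standalone XML document (a "GLIF" file). The sample below is hand-written for illustration, not generated by FontForge, and omits the companion files (metainfo.plist, fontinfo.plist, contents.plist) that a complete UFO package contains; it can be parsed with nothing more than Python's standard library:

```python
import xml.etree.ElementTree as ET

# A hand-written sample of a UFO glyph file ("GLIF"): one glyph per
# XML file, here a crude three-point outline for "A". A complete UFO
# package also carries metainfo.plist, fontinfo.plist, and a
# contents.plist mapping glyph names to .glif file names.
GLIF = """\
<glyph name="A" format="2">
  <advance width="600"/>
  <unicode hex="0041"/>
  <outline>
    <contour>
      <point x="50" y="0" type="line"/>
      <point x="300" y="700" type="line"/>
      <point x="550" y="0" type="line"/>
    </contour>
  </outline>
</glyph>
"""

glyph = ET.fromstring(GLIF)
print(glyph.get("name"),                   # glyph name: A
      glyph.find("advance").get("width"),  # advance width: 600
      len(glyph.findall(".//point")))      # number of outline points: 3
```

Because each glyph sits in its own small file like this one, a change to a single glyph shows up in Git as a one-file diff, which is exactly the version-control benefit described above.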
One example of such an external utility is Øyvind "Pippin" Kolås's Kernagic. Kernagic analyzes a UFO font looking for automatic "rhythm points"; it can then set the spacing of the glyphs to produce a uniform vertical-stem rhythm. Naturally, such automatic metrics are only a first approximation; Kernagic lets the user manually adjust and fine-tune the spacing using a variety of samples.
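The underlying spacing idea can be sketched in a few lines. In this simplified model (the function name and the heuristics are mine, not Kernagic's), each glyph's detected stems are placed half a rhythm unit from the glyph's edges, so stems across a glyph boundary always land one rhythm unit apart:

```python
def space_glyph(stem_xs, rhythm):
    """Given the x-positions of a glyph's detected "rhythm points"
    (e.g. vertical stems), return (offset, advance): a horizontal
    shift for the outlines and a new advance width that place the
    first and last stems half a rhythm unit from the glyph edges.
    When every glyph is spaced this way, stems across any glyph
    boundary sit exactly one rhythm unit apart.

    This is an illustrative simplification; Kernagic's actual stem
    detection and spacing heuristics are richer.
    """
    first, last = min(stem_xs), max(stem_xs)
    offset = rhythm / 2 - first           # translate outlines by this
    advance = last + offset + rhythm / 2  # right edge past last stem
    return offset, advance

print(space_glyph([100, 400], 200))  # → (0.0, 500.0)
print(space_glyph([50], 200))        # → (50.0, 200.0)
```

Note that a single-stem glyph such as "l" ends up with an advance width of exactly one rhythm unit, which matches the intuition of a uniform vertical-stem grid.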
Kolås actually started work on Kernagic after LGM 2013. Subsequently, though, he was introduced to Letter Model, the ongoing research of type designer (and founder of the Dutch Type Library) Frank E. Blokland. Blokland presented Letter Model at LGM 2014 and showed how he and Kolås had implemented support for it in Kernagic. Letter Model is a somewhat controversial theory in type design circles: Blokland argues that Renaissance type designers actually constructed their designs on a fixed grid. That idea runs counter to the traditional claim that old designs were "optically" perfected to the designer's eye, and that the mechanization of font production in subsequent centuries has eroded the original artistic ideals.
![Frank E. Blokland [Blokland at LGM]](https://static.lwn.net/images/2014/04-lgm-blokland-sm.jpg)
Blokland walked the audience through his evidence, starting with a large body of precise measurements of the actual matrices from which Renaissance fonts were cast—measurements that indeed show fixed-width components, not irregular ones with proportions that varied from letter to letter. He also showed how Letter Model is implemented not only in Kernagic, but also in his own application LetterModeller, which he said he was working to release as free software.
Semi-automatic letterspacing is a major time saver; so too is semi-automatic weight adjustment, which is the purpose of Metapolator, the program presented by developers Simon Egli and Nicolas Franck Pauly. Metapolator is a web-based editor for interpolating between a pair of "master" fonts. Such interpolation is a common task: creating the intermediate instances between a "regular" weight and an "extra bold," for example. What makes Metapolator different is that under the hood it uses Donald Knuth's METAFONT system: it converts the input fonts to METAFONT representations as parametric equations, which can be interpolated more easily than the outline curves of TrueType, PostScript, and the like.
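Plain outline interpolation, the baseline that Metapolator improves upon, is just a point-by-point weighted average of two compatible outlines. The sketch below uses illustrative names (not Metapolator's API) and assumes both masters have identical point structure, which is the usual compatibility requirement:

```python
def interpolate(master_a, master_b, t):
    """Weighted average of two compatible outlines: t=0.0 yields
    master_a, t=1.0 yields master_b, and values in between give an
    intermediate instance. Both masters must have the same point
    structure; incompatible masters cannot be interpolated this way.
    """
    if len(master_a) != len(master_b):
        raise ValueError("masters must have matching point counts")
    return [((1 - t) * ax + t * bx, (1 - t) * ay + t * by)
            for (ax, ay), (bx, by) in zip(master_a, master_b)]

# Two imaginary masters for one stroke: the heavier weight pushes
# the outer points outward.
regular    = [(50, 0), (300, 700), (550, 0)]
extra_bold = [(30, 0), (300, 700), (570, 0)]
print(interpolate(regular, extra_bold, 0.5))
# → [(40.0, 0.0), (300.0, 700.0), (560.0, 0.0)]
```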
![Simon Egli and Nicolas Franck Pauly [Egli and Pauly at LGM]](https://static.lwn.net/images/2014/04-lgm-metapolator-sm.jpg)
Metapolator takes the outlines defining an input font and derives the "skeleton" path at their center. This skeleton becomes the "pen path" that is the cornerstone of a METAFONT representation. Of course, anyone who knows METAFONT knows that the system could be used to parameterize far more than just font weight: proportions, stroke contrast, slant angle—just about anything is fair game. Ultimately, Metapolator could become a tool that greatly reduces the time needed to design and build a large font family; for now the emphasis is being put on essentials like weight.
The aforementioned tools are poised to simplify design tasks, but design is only one part of font production. Like any software product, a binary font requires testing and quality analysis before it is packaged up and shipped out to users. As Dave Crossland, who curates Google's web-font service, explained, that process can be extremely time consuming when hundreds or thousands of fonts are involved. Consequently, he has been working with Mikhail Kashkin on Font Bakery, a continuous-integration tool for font building.
![Dave Crossland [Crossland at LGM]](https://static.lwn.net/images/2014/04-lgm-crossland-sm.jpg)
The tests involved currently focus on technical features: language coverage, certain easy-to-calculate sanity checks (for example, virtually all applications expect that the "general-purpose space" character U+0020 and the "non-breaking space" character U+00A0 will be the same width), and whether or not the font source builds with FontForge. Future extensions may also assist with design issues, though, by generating test documents for a reviewer to inspect.
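The space-width rule, for instance, reduces to a simple comparison once a font's advance widths have been extracted. This is an illustrative sketch of that kind of check, not Font Bakery's actual code; `advance_widths` is assumed to map Unicode code points to widths in font units:

```python
def check_space_widths(advance_widths):
    """Return a list of problem strings for one sanity check: the
    general-purpose space (U+0020) and the no-break space (U+00A0)
    are expected to have the same advance width.

    `advance_widths` maps Unicode code points to widths in font
    units, as extracted from a font's horizontal metrics table.
    """
    problems = []
    space = advance_widths.get(0x0020)
    nbsp = advance_widths.get(0x00A0)
    if space is None:
        problems.append("missing U+0020 SPACE")
    if nbsp is None:
        problems.append("missing U+00A0 NO-BREAK SPACE")
    if space is not None and nbsp is not None and space != nbsp:
        problems.append(f"U+0020 width {space} != U+00A0 width {nbsp}")
    return problems

print(check_space_widths({0x0020: 250, 0x00A0: 250}))  # → []
print(check_space_widths({0x0020: 250, 0x00A0: 300}))
```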
Speaking of Google Fonts, the web service has been responsible for delivering the lion's share of CSS @font-face resources, but there are plenty of people who for one reason or another would prefer to run their own server. Raphaël Bastide presented Use & Modify, his personal web-font directory server, which runs on his open-source "CMS for typeface collections" called ofont. Anyone can use ofont to self-host a public web font collection, he said. While specialty software is not required to serve TTF, OTF, or WOFF files to web browsers, ofont provides a slick, catalog-like front end to the collection, which might make it a popular option for independent font designers or open-source projects.
Coincidentally, there were a handful of font design talks in the schedule as well, although each presented a unique take on the subject. Kolås spoke about 0xA000, a font family he designed originally to emulate the look of low-resolution raster displays—in particular, to be readable with lower-case letters just three pixels wide. While the initial incarnation of 0xA000 used square blocks to represent the pixelated look, he said, he soon began experimenting with other sorts of component pieces—curves, circles, hash marks, etc.—that still produced the same shapes when reduced in size.
Simon Budig also presented his personal font project, which tackles creating nice-looking typefaces for printed circuit board (PCB) design. The PCB manufacturing process, he explained, introduces some peculiar technical requirements (such as only permitting circle segments, rather than the quadratic or cubic Bézier curves normally used in outline fonts), so making a quality PCB font involves technical as well as artistic challenges. Nevertheless, he noted that PCB designs have a long history of attracting artistic expression, and he hopes to convince the makers of PCB design tools to support better font options.
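The arc-only constraint matters because circular arcs can only approximate Bézier curves. A rough way to quantify the error, using an illustrative single-arc replacement of my own devising (a real conversion would subdivide each curve into several arcs), is to fit the circle through a quadratic curve's endpoints and midpoint and then measure the worst-case gap:

```python
import math

def quad_bezier(p0, p1, p2, t):
    """Evaluate a quadratic Bezier curve at parameter t."""
    u = 1 - t
    return (u * u * p0[0] + 2 * u * t * p1[0] + t * t * p2[0],
            u * u * p0[1] + 2 * u * t * p1[1] + t * t * p2[1])

def circle_through(a, b, c):
    """Center and radius of the circle through three non-collinear
    points, via the standard circumcenter formula."""
    (ax, ay), (bx, by), (cx, cy) = a, b, c
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax * ax + ay * ay) * (by - cy) + (bx * bx + by * by) * (cy - ay)
          + (cx * cx + cy * cy) * (ay - by)) / d
    uy = ((ax * ax + ay * ay) * (cx - bx) + (bx * bx + by * by) * (ax - cx)
          + (cx * cx + cy * cy) * (bx - ax)) / d
    return (ux, uy), math.hypot(ax - ux, ay - uy)

# Replace one quadratic curve with the single arc through its
# endpoints and midpoint, then sample the curve to find the largest
# gap between curve and arc.
p0, p1, p2 = (0, 0), (50, 100), (100, 0)
center, r = circle_through(p0, quad_bezier(p0, p1, p2, 0.5), p2)
deviation = max(abs(math.hypot(x - center[0], y - center[1]) - r)
                for x, y in (quad_bezier(p0, p1, p2, i / 100)
                             for i in range(101)))
print(deviation)
```

On this 100-unit-wide example the single arc strays from the curve by several units, which is why a faithful arc-only conversion needs a series of short arc segments rather than one big one.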
Last but not least, Edward Trager spoke about the development of Hariphunchai, his font for the Tai Tham script, which is used for several languages in northern Thailand, Laos, and surrounding areas. Local computer users typically rely on non-Unicode fonts for Tai Tham, he explained, which makes it difficult or impossible to create web pages or exchange electronic documents in those languages, so he set out to create a quality, Unicode-compliant font. Along the way, though, he encountered mistakes in the official Unicode normalization rules for Tai Tham that consistently resulted in misplaced characters.
![Edward Trager [Trager at LGM]](https://static.lwn.net/images/2014/04-lgm-trager-sm.jpg)
When the Unicode Technical Committee refused to amend the standard because doing so would violate "stability," Trager turned to the developers of the open source HarfBuzz shaping engine and Firefox browser looking for a solution. Ultimately, Mozilla's Jonathan Kew wrote a HarfBuzz patch that works around the broken Unicode specification, and the current Firefox Aurora channel is the first to properly support Tai Tham.
Font development has always involved balancing technical and aesthetic considerations, although working around bugs in the Unicode specification itself is thankfully a rare requirement. But it is educational to consider all of the ways in which the two halves of the task demand attention, and to see how the community responds. Continuous integration is an idea with roots firmly in software engineering, for example, while Blokland's Letter Model comes straight from Renaissance type design. Nevertheless, both can provide insights to font developers today, thanks to a community that is open to concepts from anywhere.
[The author would like to thank Libre Graphics Meeting for assistance to travel to Leipzig for LGM 2014.]