
Accessibility in Linux systems

October 8, 2008

This article was contributed by Samuel Thibault

The Linux kernel recently saw the addition of a "basic Braille screen reader", and thus the addition of a drivers/accessibility subdirectory and its corresponding CONFIG_ACCESSIBILITY option. It is worth noting that one of the first reactions was "what the heck is accessibility?" That shows how unfamiliar the idea still is to many developers.

And yet the issue of GNU/Linux accessibility, i.e. the usability of GNU/Linux by disabled people (e.g. blind people), is, of course, not new. Work in that area has been going on for a long time: the speakup speech screen reader released its 0.07 version against Linux 2.2.7 in 1999, and the brltty Braille screen reader started in 1995. The basic Braille screen reader that has just been added to the Linux kernel is merely the visible tip of work that has been around since then.

With the popularization of GNU/Linux among non-technical people, there has been renewed interest in mainline accessibility support: the GNOME desktop, OpenOffice.org and Firefox 3 can now be rendered via Braille and speech synthesis thanks to the AT-SPI framework and the Orca screen reader. KDE will soon follow when these technologies get rebased on D-BUS. In addition, accessibility menus have started appearing in the upstream distributions.

One of the main concerns for disabled people used to be the lack of JavaScript support in text-mode web browsers and the lack of an accessible office suite. With more and more companies and governments migrating to Linux, particularly since some states require that tools used in government be accessible, renewed development effort was becoming more and more of a must. In Massachusetts, people even signed a petition against the migration to libre software because it was not yet accessible at the time!

What is Accessibility?

Accessibility, sometimes abbreviated a11y, means making software usable by disabled people. That includes blind people of course, but also people who have low vision, are deaf, are colorblind, have only one hand, or can move only a few fingers, or even only the eyes. It also includes people with (even mild) cognitive impairments, or who are simply not familiar with the language. Last but not least, it includes elderly people, who often have a bit of all of these disabilities. Yes, that actually means everybody is concerned, eventually. Accessibility means support for special devices, but also general care during development, like not assuming that an audible alarm will be heard or that a transient message will be read.

Maybe one of the most obvious accessibility techniques is speech synthesis, which turns text into audio that can be sent to speakers or headphones. There used to be hardware speech synthesizers (supported by the speakup drivers), but these have largely been replaced by software speech synthesis. While the quality of commercial software speech synthesis is very good these days, the quality of free software engines varies a lot. There is very good libre English speech synthesis, but support for other languages is uneven. For instance, the Festival and eSpeak libre engines easily support a wide range of languages, but they sound rather robotic. There are better phoneme libraries, like mbrola, but they are often not completely libre. To better handle all these potential speech synthesis backends, the speech dispatcher daemon takes care of automatically choosing the appropriate synthesizer according to the desired language and style.
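
To give a concrete idea of how an application hands text to that daemon, here is a minimal sketch using Speech Dispatcher's C client library, libspeechd. It is an illustration, not a recipe: the "speak-demo" client name is made up, and the header location and build flags may differ between distributions.

    /* speak.c - minimal Speech Dispatcher client sketch.
       Build (flags may vary): gcc speak.c -o speak -lspeechd */
    #include <stdio.h>
    #include <libspeechd.h>

    int main(void)
    {
        /* Connect to the speech-dispatcher daemon. */
        SPDConnection *conn = spd_open("speak-demo", NULL, NULL, SPD_MODE_SINGLE);
        if (conn == NULL) {
            fprintf(stderr, "could not connect to speech-dispatcher\n");
            return 1;
        }

        /* Ask for English; the daemon picks a suitable synthesizer. */
        spd_set_language(conn, "en");

        /* Queue a message at ordinary text priority. */
        spd_say(conn, SPD_TEXT, "Hello from speech dispatcher.");

        spd_close(conn);
        return 0;
    }

The point is that the application never chooses Festival, eSpeak, or mbrola itself: speech dispatcher maps the requested language and style onto whatever backend happens to be installed.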

Another very popular kind of device is the Braille terminal. These "display" text by raising and lowering little pins which form Braille patterns. Because their cost is very high, a Braille terminal often has room for only 40 characters, or even 20 or 12. They integrate keys for navigating around the screen, so the user ends up reading it piece by piece. Compared to speech synthesis, the reading accuracy is far better, but not everybody can read Braille, and the cost remains very high (on the order of $5,000). Support for the various existing devices is very good: both the brltty and suseblinux screen readers handle a very wide range of them.
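
Applications and screen readers can drive these devices through brltty's BrlAPI client library, which reappears at the end of this article. The following rough sketch, assuming a BrlAPI 0.5-era API and with error handling trimmed, grabs the display for the current console and writes a message to it:

    /* braille.c - minimal BrlAPI sketch.
       Build (flags may vary): gcc braille.c -o braille -lbrlapi */
    #include <stdio.h>
    #include <unistd.h>
    #include <brlapi.h>

    int main(void)
    {
        unsigned int x, y;

        /* Connect to the local brltty daemon with default settings. */
        if (brlapi_openConnection(NULL, NULL) < 0) {
            brlapi_perror("brlapi_openConnection");
            return 1;
        }

        /* A 40-cell device reports x == 40, y == 1. */
        brlapi_getDisplaySize(&x, &y);
        printf("Braille display: %u cells, %u row(s)\n", x, y);

        /* Take over the display for this terminal and show a message. */
        brlapi_enterTtyMode(BRLAPI_TTY_DEFAULT, NULL);
        brlapi_writeText(BRLAPI_CURSOR_OFF, "Hello from BrlAPI");
        sleep(5);

        brlapi_leaveTtyMode();
        brlapi_closeConnection();
        return 0;
    }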

Blind people will often use a combination of speech synthesis and a Braille device. For other kinds of disabilities, the devices used vary a lot, ranging from joysticks (natively supported by X.org) to eye-tracking systems (managed by dasher), via push buttons (supported by the GNOME Onscreen Keyboard) or simple screen magnification (implemented by gnome-mag).

Everyday Use

The eternal Command Line Interface vs. Graphical User Interface flamewar also applies to people using a Braille terminal or speech synthesis. The contrast is perhaps even exacerbated by the inherent difficulty of doing anything with a computer while disabled.

The old traditional way of using a GNU/Linux system, the text console, has been working well with Braille devices and speech synthesis for a long time. The principle is indeed quite simple: there are 25 lines of 80 characters, and text appears sequentially. Screen readers for Braille terminals thus just automatically display what was last written and let the user navigate among these 25 lines. Screen readers for speech synthesis (e.g. speakup or yasr) speak text as it appears on the screen, and have review facilities similar to those of Braille screen readers. This works quite well because applications are limited to the TTY interface: they cannot have non-accessible fancy features such as graphical buttons. Some applications may still not be so easy to read, e.g. if they draw ASCII art or use colors to show active buttons, but they often have options to become more accessible; a collection of tips can be found on this wiki.
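
Part of what makes the console so tractable is that the kernel exports each virtual console's character cells through the /dev/vcsa* devices (documented in vcs(4)), so a screen reader does not have to parse terminal output at all; it can simply read the screen back. A minimal sketch, assuming the usual device naming and appropriate permissions:

    /* readvcs.c - print the text content of a virtual console.
       Build: gcc readvcs.c -o readvcs; run e.g.: ./readvcs /dev/vcsa1 */
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        FILE *f = fopen(argc > 1 ? argv[1] : "/dev/vcsa", "rb");
        if (f == NULL) {
            perror("open vcsa device");
            return 1;
        }

        /* Header: lines, columns, cursor x, cursor y (one byte each). */
        unsigned char hdr[4];
        if (fread(hdr, 1, 4, f) != 4)
            return 1;
        int rows = hdr[0], cols = hdr[1];

        /* Then rows*cols pairs of (character, attribute) bytes. */
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                unsigned char cell[2];
                if (fread(cell, 1, 2, f) != 2)
                    return 1;
                putchar(cell[0]); /* drop the color attribute */
            }
            putchar('\n');
        }
        fclose(f);
        return 0;
    }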

Accessibility of graphical desktops is, on the other hand, quite a recent matter, in part because the issue is technically much harder: while applications on the text console are limited to producing text, graphical applications these days usually render text as bitmaps themselves, so the textual information is not available outside the application for screen readers to pick up. There have been application adaptation attempts in the past (like ultrasonix), but they never really got popular. The GNOME project has been developing AT-SPI (Assistive Technology Service Provider Interface) for the past decade, and that has become really promising with the advent of the Orca screen reader. AT-SPI can be understood as a protocol between screen readers (e.g. Orca) and applications. To be "accessible", applications thus have to implement AT-SPI, or use a toolkit that implements it (like GTK+ and soon Qt), so that screen readers can get at the logical and textual content of the application.

Orca is not yet as good as what mature, proprietary Windows screen readers can achieve, but it is already usable for everyday work. It is progressing rapidly, notably thanks to the support of Sun and the involvement of the Accessibility Free Software Group. At the time of this writing, only GTK+ 2 (and thus the GNOME desktop and GTK+ 2 applications), Java/Swing, the Mozilla suite, OpenOffice.org, and Acrobat Reader implement AT-SPI and thus are accessible. Qt (and thus the KDE desktop) is expected to support it once it gets rebased on D-BUS. To get the best results, the latest versions of applications should be used: Firefox, for instance, is really usable only starting from version 3.
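
On the application side, most of the work is done by the toolkit; what typically remains for the developer is filling in accessible metadata through ATK, the interface GTK+ widgets expose toward AT-SPI. Here is a deliberately trivial sketch of the most common case, an icon-only button that would otherwise be mute to a screen reader (GTK+ 2 API; the strings are invented):

    /* acc-button.c - naming a GTK+ 2 widget for screen readers.
       Build: gcc acc-button.c -o acc-button $(pkg-config --cflags --libs gtk+-2.0) */
    #include <gtk/gtk.h>

    int main(int argc, char *argv[])
    {
        gtk_init(&argc, &argv);

        GtkWidget *window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
        g_signal_connect(window, "destroy", G_CALLBACK(gtk_main_quit), NULL);

        /* An icon-only button carries no text for a screen reader... */
        GtkWidget *button = gtk_button_new();
        gtk_container_add(GTK_CONTAINER(button),
                          gtk_image_new_from_stock(GTK_STOCK_SAVE,
                                                   GTK_ICON_SIZE_BUTTON));

        /* ...so describe it through its ATK object; a screen reader
           such as Orca can then announce "Save" when it gets focus. */
        AtkObject *acc = gtk_widget_get_accessible(button);
        atk_object_set_name(acc, "Save");
        atk_object_set_description(acc, "Save the current document");

        gtk_container_add(GTK_CONTAINER(window), button);
        gtk_widget_show_all(window);
        gtk_main();
        return 0;
    }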

Another approach is the use of self-reading applications. For instance, Firevox is a version of Firefox that integrates a dedicated screen reader. That permits tighter interaction between the reader and the application, but it is of course limited to that particular application. Another example is emacspeak, a vocalized version of emacs. Some people just use emacspeak and nothing else, as emacs already meets all their needs.

All in all, as usual, mileage varies. Some people will be very happy with the mature, efficient screen reading of the text console, while others will consider it a regression (like going back to DOS) and prefer intuitive environments such as the GNOME desktop, even if the Orca screen reader is still quite young. It is actually quite common to use both: for instance, the text console for everyday work, and the graphical environment for tasks that require it, like browsing Javascript-powered websites or manipulating OpenOffice.org documents.

Upstream Integration

Now, how can all of this be installed? Most distributions already provide most of the useful packages, but they often lack documentation on which tools are useful for which disabilities. The Linux Accessibility Resource Site is a fairly complete source of information on the various tools one could use. There is also a wiki page meant to help administrators get started with accessibility needs.

A point worth noting, however, is that some distributions have accessibility components built into their installation CDs. For instance, starting from Etch (aka Debian GNU/Linux 4.0), the Debian installer automatically detects Braille terminals; if one is found, it switches to text mode, runs brltty, and makes sure that brltty gets installed and configured on the target system. Other distributions have often been unofficially adapted into so-called "Braillified" installation images. The crucial point is that this lets disabled people be completely independent of the help of sighted people, even when a system has to be (re)installed! That is clearly one area in which Windows is far behind GNU/Linux.

Future Challenges

To sum up, accessible GNU/Linux is going through its own democratization, just a bit behind the democratization of Linux in general. There are, of course, things that could be improved. Even if distributions usually contain accessibility software, it is hard for newcomers to accessibility to know which software will be useful for the various kinds of disabilities users can have, so distributions will have to develop wizards to help them. In the meantime, websites such as the Linux Accessibility Resource Site can be used as sources of information. In any case, discussion with disabled users is essential to establish a suitable solution (setting up Braille output is useless if the user cannot read Braille, for instance).

Beyond the everyday use of GNU/Linux or its installation, one area that is still not really accessible at all is the early stage of the boot process. With further development of the recently added basic Braille screen reader, the Linux kernel should eventually be able to provide basic feedback even before user-space screen reader daemons can be started from the hard disk. Bootloaders like lilo and grub are able to emit basic beeps, but accurately editing the kernel command line, for example, would require real support. Last but not least, tinkering with BIOS settings is currently possible for disabled people only on high-end machines that can drive a serial console. The democratization of the EFI platform could be an opportunity to embed basic screen reading functionality.

[Samuel Thibault has been working on accessibility since 2002, when he and a blind colleague designed the BrlAPI client/server Braille output engine, now used by Orca for Braille support. Since then he has worked on various accessibility tasks, from Debian installer support to Braille standardization. In his professional life, he completed a PhD on thread scheduling for high-end machines, and is now a lecturer at the University of Bordeaux.]




Accessibility in Linux systems

Posted Oct 9, 2008 1:04 UTC (Thu) by i3839 (guest, #31386)

I wonder: there are systems in place for internationalisation, and it seems to me that adapting or plugging into those would be a good way to get at the text. After all, if the system is flexible enough to load different texts depending on the language, it's a small step towards "displaying" text differently when needed. It's a similar problem, and you might want to have a different, more appropriate text if it's read aloud, for instance. This way it also doesn't matter whether it is a CLI or GUI program.

Accessibility in Linux systems

Posted Oct 9, 2008 9:33 UTC (Thu) by jamesh (guest, #1159)

The point at which a string gets translated for internationalisation is often not the point at which you'd want it read by text-to-speech software or sent to a Braille terminal.

A graphical application will usually translate most of its UI strings with gettext() at startup. You'd only want these read when the user interacts with the corresponding controls. The same probably goes for full-screen text-mode applications.

Accessibility in Linux systems

Posted Oct 10, 2008 1:46 UTC (Fri) by i3839 (guest, #31386)

Gah, you're right about how it currently works. I have no experience with internationalisation, and for some reason I expected something more elaborate than just one gettext() function called at startup for each string.

Accessibility in Linux systems

Posted Oct 11, 2008 2:49 UTC (Sat) by nix (subscriber, #2304)

It's not: _() (aka gettext()) is called *when a translation is needed* for each string (it translates each format string into a new one given the current locale, possibly reordering arguments in the process). It's just that GUI applications often need to translate most of their strings in one big lump at initial-window-mapping time.
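
[For reference, the usual pattern looks roughly like the sketch below; the "myapp" text domain is invented for illustration, and _() is just the conventional shorthand. The catalog lookup indeed happens at each call:

    /* hello.c - the conventional gettext pattern.
       Build: gcc hello.c -o hello */
    #include <stdio.h>
    #include <locale.h>
    #include <libintl.h>
    #define _(s) gettext(s)    /* the usual shorthand */

    int main(void)
    {
        setlocale(LC_ALL, "");                        /* honor the user's locale */
        bindtextdomain("myapp", "/usr/share/locale"); /* where .mo catalogs live */
        textdomain("myapp");

        /* The catalog lookup happens here, at the moment of the call. */
        printf(_("Hello, world!\n"));
        return 0;
    }

-- Ed.]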

Accessibility in Linux systems

Posted Oct 9, 2008 17:53 UTC (Thu) by sthibaul (✭ supporter ✭, #54477)

It would also be a lot of work. Internationalization is already a difficult task just because of finding translators. Finding translators who _also_ know a bit about accessibility and have an idea of how they should translate strings is even harder; I'd say squared as much :)

That's why having a technical solution that just uses what already exists is the most pragmatic solution for now.

That being said, having different output for sighted and non-sighted users is not so handy, in particular when you have both kinds of users working together: if they don't have the same output, they will not understand each other.

Accessibility in Linux systems

Posted Oct 10, 2008 14:12 UTC (Fri) by Cato (guest, #7643)

It's worth pointing out that there is more to accessibility than screen readers or Braille output for the seriously visually impaired. It's also important to support the following:

- mild visual impairments: large fonts and magnification

- for the deaf: visual equivalents of sound effects, and subtitling of videos

- for those with motor/dexterity impairments (including RSI): speech input, alternative input mechanisms (e.g. tongue, foot, blowing), modified input mechanisms (alternative pointing devices)

- for those with speech and language impairments: alternative input/output, communication boards (i.e. using a computer to communicate with others interactively through symbols); inconsistent speech can be a problem for speech input, though.

See http://en.wikipedia.org/wiki/Computer_accessibility#Consi... for more complete coverage of this.

Bravo

Posted Oct 9, 2008 4:17 UTC (Thu) by ncm (guest, #165)

Articles like this are what force me to renew my subscription.

Braille support in SuSE

Posted Oct 9, 2008 7:52 UTC (Thu) by niner (subscriber, #26151)

I started using SuSE (when it was still called that) at version 5.2, and as far as I can remember it had Braille support in the installation system even then. I noticed because loading support and detecting Braille output took quite a few seconds.

Braille support in SuSE

Posted Oct 9, 2008 17:57 UTC (Thu) by sthibaul (✭ supporter ✭, #54477)

Yes, SuSE's blinux has been providing auto-detection for quite some time, though it's sometimes considered unsafe to send characters out a serial port just to see what is there :)

Screen reading for web browsing on Ubuntu 7.10

Posted Oct 9, 2008 17:48 UTC (Thu) by plaxx (guest, #53703)

I always thought that Linux was accessible, since almost every distro release talked about accessibility improvements (mostly done in GNOME, but still).

As part of a little assignment at university, I wanted to try the state of screen reading in a web-browsing context under Ubuntu 7.10. It was not very successful.

See my blog post[1] describing what I tried.

[1] http://www.bottomlesspit.org/2008/08/04/blind-web-browsin...

Screen reading for web browsing on Ubuntu 7.10

Posted Oct 9, 2008 18:04 UTC (Thu) by sthibaul (✭ supporter ✭, #54477)

Yes, distros talk about accessibility improvements, but we are still a bit far from _real_ mainline accessibility. GUI accessibility is still something of a mess, but hopefully with Firefox 3 integrated in all distros the web part will work nicely.

As for the bugs and such that you encountered, that's unfortunately mostly because distributions haven't yet done their part of integrating components together, so you have to mix things together by hand.

Now, what you tried was GUI accessibility, which is quite recent. Try CLI accessibility; _that_ has been working for more than a decade now, so you can say a Linux _text_ system definitely _is_ accessible.

Accessibility in Linux systems

Posted Oct 9, 2008 18:13 UTC (Thu) by riddochc (guest, #43)

Disabilities come in many different forms. I suspect that my form is going to become increasingly common: I suffer from RSI, closely related but not identical to carpal tunnel, and thus largely untreatable.

I use a few tools to make my computing life easier:

  • Dasher, for text-entry. Unfortunately, being a university project, it's on a maintainership hiatus and most of the developers appear to have treated it as a means to a thesis rather than as a piece of software that needs stable releases for users that rely on it. The ability to copy all text automatically into the clipboard when movement stops was broken over a year ago, and hasn't been fixed. It takes more work to select all and copy by hand. And yes, they know about the bug - I told them.
  • A Wacom tablet. Mine is an old Graphire. Repetitive stress comes from repetitive actions - switching between different input methods helps to reduce that stress. I find that controlling Dasher with the stylus on the tablet involves the least physical movement possible for entering text, of the mechanisms that use physical movement at all. Unfortunately, the drivers for X can take a bit of twiddling to work. SaX on SuSE 11.0 can't produce a working xorg.conf file without help, and the newer versions of the Wacom drivers don't play nice with the tablet or crash the X server at random. Yes, I've reported these bugs, too.
  • Workrave, a program to remind you to take typing breaks. The best way to keep RSI from happening, and to keep it from getting worse, is to take frequent breaks. Unfortunately, with my upgrade from SuSE 10.2 to 11.0, it crashes after running for a minute. I'm trying to figure out why, before filing a bug.
  • And now, the obligatory non-free software: Dragon NaturallySpeaking 10, running in Windows XP, running in VirtualBox. Through some miracle, VirtualBox can feed audio from ALSA into XP and thus into Dragon. The ~$200 version of Dragon can also take sound files from "digital recorders" in the form of wave files and transcribe them. It's nice, but it doesn't exactly integrate very well with Linux - it leads to a lot of temporary files and clipboard work.
  • Then there's Voxforge. If you want good speech recognition on Linux as much as I do, please read a book to them. The speech recognition engines (Sphinx, Julius, and HTK) are great, but what we need is a good collection of training data for speaker-independent recognition. This is tedious work, and needs volunteers. Please contribute!
  • I recently purchased an Orbitouch keyless keyboard. Unfortunately, they didn't put much thought into the arrangement of positions for letters - they're in sequential order. I'm working on remapping the keyboard so that going from one letter to the next will involve fewer physical movements. I haven't had the device long enough yet to really judge whether it's worth it.
As you can tell, there just aren't that many good options. People with disabilities face a really tough road trying to use computers - even more so if they intend to do any kind of software development. None of the options I've used so far are adequate for the kind of non-linear text editing necessary to do real work on code.

I've come to the conclusion that I have two choices: either stop using computers altogether, or focus my attention on improving accessibility myself. The truth is, the expectation that people will write code to scratch their own itch simply can't apply to people with RSI. The people who need these tools the most are the very same people who are least able to write that software, for obvious reasons.

Please, if you want to contribute and haven't yet, fix accessibility bugs!

Accessibility in Linux systems

Posted Oct 9, 2008 22:56 UTC (Thu) by nix (subscriber, #2304)

OK, that keyboard is dramatically weirder than my Maltron.

(Regarding the key rearrangement: you might want to do as the Maltron
folks did and treat it as a large-search-space problem searching for
minimum finger motion across a large corpus of representative input: both
GP and simulated annealing seem likely to find something reasonably good.)

Accessibility in Linux systems

Posted Oct 9, 2008 23:06 UTC (Thu) by riddochc (guest, #43)

Yes, that was my plan. I'm working on code for a GP approach, similar to what has been tried for traditional keyboards. I've collected letter bigram frequencies from text I've written (which I already use for training Dasher), and I need to talk to someone in the nearby university's physiology department to get help building a metric for the work involved in different arm movements.

Accessibility in Linux systems

Posted Oct 16, 2008 13:57 UTC (Thu) by nocomment (guest, #33767)

In the US, Kinesis keyboards are similar and (somewhat) easier to find. See

http://www.kinesis-ergo.com/

Accessibility in Linux systems

Posted Oct 17, 2008 19:53 UTC (Fri) by nix (subscriber, #2304)

Indeed. I went for a Maltron mainly because they last (much) longer than
Kinesis keyboards do.

Accessibility in Linux systems

Posted Oct 10, 2008 12:11 UTC (Fri) by Wummel (guest, #7591)

There is currently no Linux distribution specialized for older people. I ended up installing Debian and changing a lot of stuff to make a simple but usable workspace for my mother:
  • Autologin (using the script from Knoppix)
  • Autoshutdown on logout
  • Large fonts
  • Locked-down options (i.e. the user cannot change application or desktop options)
  • A simplified desktop (in this case KDE). The only icons on the desktop are Mail, Browser, OpenOffice Writer, and Freecell :-)
  • Remote administration with SSH and NX server
  • ... and possibly lots of other small adjustments I forgot in this list
So I'd like to see a distribution targeted at older people. After all, the target audience for such a distribution is growing every year.

Accessibility in Linux systems

Posted Oct 10, 2008 13:25 UTC (Fri) by Cato (guest, #7643)

Those aren't really optimisations for older people (except for large fonts, perhaps); they would also suit novices who can only deal with a simple setup and won't want to install their own software. I did a similar setup for an elderly relative, using GNOME, which seems harder to customize completely than KDE.

Accessibility in Linux systems

Posted Oct 10, 2008 14:25 UTC (Fri) by Cato (guest, #7643)

Here's a great resource for end users of Linux: http://www.bbc.co.uk/accessibility/ - it lists Linux alongside Windows and Mac, with helpful tips on how to make websites and applications more accessible, and covers KDE and GNOME. It may be a little dated, as it talks about KDE 3.4 and GNOME 2.1, but it's good to see this on a mainstream site that is itself quite accessible.


Copyright © 2008, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds