Accessibility and the graphics stack
Posted Oct 23, 2014 8:23 UTC (Thu) by nim-nim (subscriber, #34454)
Parent article: Accessibility and the graphics stack
Keyboard input actually involves four distinct concepts:
1. the hardware key location (real or virtual hardware)
2. the symbol this location is mapped to (Unicode code points or application actions)
3. the language being input (really a locale: the same language is not written the same way everywhere; the Unicode/OpenType rendering changes, the spellcheck dictionary changes, and you cannot safely hand symbols to apps without locale context)
4. the app's localization (the locale used to render its UI). This fourth one is actually a red herring: it should not matter at all on the input side, but C locales conflate it with the rest, so it is part of the problem today when it should not be.
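The separation above can be sketched in a few lines of code. This is a purely illustrative model, not any real API; all the names are made up to show that the four concepts live in independent fields:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class KeyEvent:
    keycode: int   # 1. physical key location (real or virtual hardware)
    keysym: str    # 2. symbol this location maps to under the active layout

@dataclass(frozen=True)
class InputContext:
    text_locale: str   # 3. locale of the text being written (rendering, spellcheck)
    ui_locale: str     # 4. locale of the app's UI -- irrelevant to input

# A French azerty user typing English text in a German-localized app.
# (On X, keycode 24 is the top-left letter key: 'q' on qwerty, 'a' on azerty.)
evt = KeyEvent(keycode=24, keysym="a")
ctx = InputContext(text_locale="en_US", ui_locale="de_DE")
```

Note that none of the four fields can be derived from any of the others: the same keycode yields different keysyms per layout, and neither locale follows the keyboard.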
To get accessibility and i18n working (and you need both simultaneously: disabled people do not all work in a single locale, and every locale has both disabled and non-disabled users) your stack must work very hard to avoid confusing these four concepts.
So, while you may want to transmit both keyboard codes and keysyms to apps, it is a very dangerous thing to do, because app devs then confuse the two concepts and map keyboard locations to actions. Many, many games make this mistake, starting with the "console key", which is *not* a console key and is needed to input human languages in various parts of the world. They assume those locations are "free to use" because in *their* locale those locations do not map to symbols they care about. Then the app is not i18n-safe: switching to any mapping that uses the same location for something important to that particular locale will break it. A localized app must allow a different action mapping per locale; the key symbols available for action overriding are not the same in every locale.
So by all means allow virtual keyboards to emit key locations, but these should never, ever be exposed to anything but the component responsible for mapping those locations to symbols. Otherwise different parts of the app stack will fight for ownership of the key locations, and depending on which part wins, things will break and users will be unhappy.
Likewise, X (and IIRC Wayland too) makes the mistake of conflating the input mapping with the language being input. They are not one and the same. It is common for the same symbol to be mapped to different locations depending on the layout the user uses (typically azerty vs qwerty, where a and z move), but that does not mean the user of an azerty (French) layout wants a and q to be swapped when he starts writing English text. As the article states, "Belgian French, Canadian French, and French in France do not share the same symbol mappings" — yet they are all used to input French. A Belgian braille user does not want his mapping to change just because he is contributing to a document in Canadian French (the dictionaries are not exactly the same; you use the one of the target locale).
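The layout-vs-text-locale split can be sketched as well: the layout determines where a symbol lives, the document's locale determines which dictionary is used, and neither follows the other. Again, all names here are illustrative only:

```python
# Where each symbol lives, per layout (X-style keycodes, illustrative subset):
# on azerty, a and q swap physical locations relative to qwerty.
SYMBOL_TO_KEYCODE = {
    "qwerty": {"a": 38, "q": 24},
    "azerty": {"a": 24, "q": 38},
}

def dictionary_for(text_locale):
    # Keyed on the locale of the document being written,
    # never on the keyboard layout.
    return f"dict-{text_locale}"

# A Belgian braille user (azerty layout) contributing to a
# Canadian-French document: the layout stays put, only the
# spellcheck dictionary follows the target locale.
layout = "azerty"
dictionary = dictionary_for("fr_CA")
```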
But there is no way to tell apps the target language of the text you are writing, so most of them infer it from your input mapping! This is seriously broken compared to other OSes. And don't get me started on the common "UI locale = text locale" confusion (many people work in an English UI locale because they feel the UI localization sucks, but they are still writing text targeted at non-English locales).
So while reusing X input concepts in Wayland was a good idea to get started quickly, they really need a refresh, or Wayland will get stuck with the same i18n and ATK problems that plagued X for decades.
