Touch screens - such as the one on the Openmoko FreeRunner - tend to deliver noisy data, particularly when touched lightly. Should that data be processed in-kernel and then passed out as nice clean mouse events, or should the raw data go to user space to be processed there, and then possibly be fed back through the kernel (via uinput) or delivered directly to the application?
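To make the user-space option concrete, here is a minimal sketch in C of what such a filtering daemon could look like: it reads raw events from the touch screen's evdev node, applies a trivial smoothing step, and injects the cleaned-up coordinates back into the kernel's input subsystem through uinput, where they appear as an ordinary input device. The device paths, the assumed 480x640 panel, and the crude averaging filter are purely illustrative, not a description of any existing driver.

    /*
     * Illustrative sketch only: a user-space daemon that reads raw touch
     * events, smooths them, and re-injects clean events through uinput.
     * Device paths, axis ranges and the averaging filter are assumptions.
     */
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/input.h>
    #include <linux/uinput.h>

    static void emit(int fd, int type, int code, int value)
    {
        struct input_event ev;

        memset(&ev, 0, sizeof(ev));
        ev.type = type;
        ev.code = code;
        ev.value = value;
        write(fd, &ev, sizeof(ev));
    }

    int main(void)
    {
        struct uinput_user_dev uidev;
        struct input_event ev;
        int x = 0, y = 0;
        int raw = open("/dev/input/event1", O_RDONLY);  /* raw touch screen */
        int ufd = open("/dev/uinput", O_WRONLY);        /* virtual device   */

        if (raw < 0 || ufd < 0) {
            perror("open");
            return 1;
        }

        /* Declare what the virtual device can report: absolute X/Y, touch. */
        ioctl(ufd, UI_SET_EVBIT, EV_ABS);
        ioctl(ufd, UI_SET_ABSBIT, ABS_X);
        ioctl(ufd, UI_SET_ABSBIT, ABS_Y);
        ioctl(ufd, UI_SET_EVBIT, EV_KEY);
        ioctl(ufd, UI_SET_KEYBIT, BTN_TOUCH);

        memset(&uidev, 0, sizeof(uidev));
        snprintf(uidev.name, UINPUT_MAX_NAME_SIZE, "filtered-touchscreen");
        uidev.id.bustype = BUS_VIRTUAL;
        uidev.absmax[ABS_X] = 479;      /* assumed 480x640 panel */
        uidev.absmax[ABS_Y] = 639;
        write(ufd, &uidev, sizeof(uidev));
        ioctl(ufd, UI_DEV_CREATE);

        while (read(raw, &ev, sizeof(ev)) == sizeof(ev)) {
            if (ev.type == EV_ABS && ev.code == ABS_X)
                x = (x + ev.value) / 2;                 /* crude smoothing */
            else if (ev.type == EV_ABS && ev.code == ABS_Y)
                y = (y + ev.value) / 2;
            else if (ev.type == EV_KEY && ev.code == BTN_TOUCH)
                emit(ufd, EV_KEY, BTN_TOUCH, ev.value); /* pass touch through */
            else if (ev.type == EV_SYN && ev.code == SYN_REPORT) {
                emit(ufd, EV_ABS, ABS_X, x);
                emit(ufd, EV_ABS, ABS_Y, y);
                emit(ufd, EV_SYN, SYN_REPORT, 0);
            }
        }

        ioctl(ufd, UI_DEV_DESTROY);
        return 0;
    }

An X server or application would then open the virtual device exactly as it would a plain in-kernel touch screen driver.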
Either way, it sounds like much the same problem as the LIRC case, and it seems likely that the same theme will keep coming up as Linux gets used in more interesting devices.
To my mind, the principles of freedom and openness suggest that the author of a particular driver should be free to manage the processing however (s)he wants. As long as we do get the fully processed events coming from the input subsystem in the kernel, the path they take to get there must be left to the author or maintainer of the code.
Putting it a different way, I think that the interface that needs to be stable is the interface to applications, not the interface to the kernel. More and more, there are user-space programs that work closely with the kernel to provide particular functionality. Freezing the interface between those programs and the kernel seems silly. Freezing the interface between the functionality and the applications is where the focus should be. For both LIRC and touch screens, the frozen interface should be the input subsystem. Where raw noisy inputs are processed is just an implementation detail.
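Concretely, the application-facing side of that frozen interface is the evdev event stream: a program reading it cannot tell, and need not care, whether the events were cleaned up by a kernel driver or by a user-space daemon feeding uinput. A minimal reader might look like the following, with the event node path being an assumption that varies from system to system:

    /*
     * Sketch of the application side: reading the input subsystem's evdev
     * interface, regardless of where the filtering was actually done.
     */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <linux/input.h>

    int main(void)
    {
        struct input_event ev;
        int fd = open("/dev/input/event2", O_RDONLY);   /* node number varies */

        if (fd < 0) {
            perror("open");
            return 1;
        }
        while (read(fd, &ev, sizeof(ev)) == sizeof(ev)) {
            if (ev.type == EV_ABS && ev.code == ABS_X)
                printf("x = %d\n", ev.value);
            else if (ev.type == EV_ABS && ev.code == ABS_Y)
                printf("y = %d\n", ev.value);
            else if (ev.type == EV_SYN && ev.code == SYN_REPORT)
                printf("-- touch report --\n");
        }
        close(fd);
        return 0;
    }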