[Dev] Display servers (was Cynara)
jussi.laako at linux.intel.com
Mon Apr 14 12:44:55 GMT 2014
On 12.4.2014 6:21, Carsten Haitzler (The Rasterman) wrote:
> it isn't. it's just a historical artifact we live with. if the display server
> is determining things like location of window, visibility of it, etc. i would
> imagine it should also have a say in audio routing - eg audio of an app is
> muted if it is hidden (unless app request a grab on audio ala grab on keys).
Exactly, ROUTING, which doesn't mean that it would need to see the actual
audio samples. Even less would it need to understand what kind of music
is being played.
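The distinction above can be sketched as a thin policy layer: the display
server only issues mute/route decisions keyed on window visibility and
never touches the sample stream. This is a minimal illustrative sketch,
not any real compositor or PulseAudio API; every class and method name
here is hypothetical.

```python
# Hypothetical sketch: a display server driving audio ROUTING from window
# visibility, analogous to the "grab on audio ala grab on keys" idea quoted
# above. No audio samples ever pass through this code -- only policy.

class AudioPolicy:
    """Tracks per-application mute state driven by window visibility."""

    def __init__(self):
        self.muted = {}      # app id -> bool (current mute decision)
        self.grabs = set()   # apps allowed to keep playing while hidden

    def grab_audio(self, app):
        # Analogous to a keyboard grab: the app requests to keep its
        # audio unmuted even when its window is hidden.
        self.grabs.add(app)

    def window_visibility_changed(self, app, visible):
        # The policy decision: mute hidden windows unless they hold a grab.
        # A real implementation would forward this as a routing/mute
        # command to the audio daemon.
        self.muted[app] = not visible and app not in self.grabs

    def is_muted(self, app):
        return self.muted.get(app, False)
```

A media player that is hidden gets muted; one that grabbed audio (say, a
background music app) keeps playing.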
> too late. input is tied to its visual representation and that representation is
> decided by the display server, so it decided input transforms and routing.
Keyboard input's visual representation is decided by the application
receiving it (only one at a time!), not by the display server.
> pulse audio doesnt need input because it doesnt auto control audio via input
> devices, it doesnt provide input feedback to pulse audio clients. a display
> server does. a display server lets you share the keyboard, mouse etc. and the
PulseAudio lets you share audio devices. You could have audio feedback
for clicks or keyboard presses. It may also play back haptic feedback
for touch. So it may produce exactly the same kind of input feedback as
a display server does.
What if your user is blind? With only a braille display, keyboard and
audio output available, audio is no less important than graphics output.