Latency and real-time audio programming
I am not an expert on real-time programming issues, hence this question :-)
Suppose I have to play audio while displaying some stuff on the screen, and
the display should stay in time with the audio output. My tools for this task
are libsdl and alsa (I cannot use libsdl's audio subsystem for various reasons),
but I would like to keep the discussion independent of my particular setup.
From what I have read, the best way to avoid latency problems is to run the
audio output code in a separate thread. But then how do I synchronize what is
drawn on the screen? That is, how can the other threads know how many samples
have been flushed to the audio output stream so far? I think a simple shared
variable would solve this. Perhaps it would not even need to be protected
against race conditions, since only the audio thread updates it while the
others merely read it, is that right?
Any suggestions or papers?