[FFmpeg-devel] Time in ffmpeg / live streams

Alexandre Ferrieux alexandre.ferrieux at orange-ftgroup.com
Wed Apr 20 09:02:44 CEST 2011


What's the best point to start (in the sources or documentation) to get acquainted with the way ffmpeg manages time in
live streams?

The kind of questions I'd like to see answered:

  - in single-input, single-output mode, where does I/O happen in blocking, nonblocking, and blocking-with-timeout fashion?
  - when input and output framerates differ, what's the strategy for duplicating frames without adding too much delay?
  - what kind of control can be gained over the delay? What does the low_delay flag do? What does genpts do? How does one
handle a systematic drift between the sender's clock and the system clock? Is there any way to detect cumulative delay and
handle it by dropping frames?
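To make the dup/drop question concrete, here is the kind of pure timestamp arithmetic I imagine is involved — a hypothetical sketch with made-up names (frames_to_emit and friends are mine, not ffmpeg's), just to show what I mean by "dup without adding too much delay":

```c
#include <stdint.h>

/* Hypothetical sketch, not ffmpeg's actual code: given the pts of the next
 * decoded input frame and the pts the output clock expects next, decide how
 * many copies of the frame to emit.  0 means drop (we are running late),
 * 1 means pass through, >1 means duplicate to fill the gap.  All values are
 * in the same time base; out_dur is the output frame duration. */
static int frames_to_emit(int64_t in_pts, int64_t out_pts, int64_t out_dur)
{
    int64_t delta = in_pts - out_pts;

    if (delta <= -out_dur)        /* input lags a full output frame: drop */
        return 0;
    if (delta >= out_dur)         /* input is ahead: dup to fill the hole */
        return 1 + (int)(delta / out_dur);
    return 1;                     /* close enough: emit exactly once      */
}
```

For instance, with a 40 ms output frame duration, a frame arriving 100 ms ahead of the output clock would be emitted 3 times, and one lagging a full frame or more would be dropped. Is this roughly the model, and where in the sources does the real decision live?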

  - the same questions in the case of overlay, with multiple movie sources
  - do some containers, like RTP, lend themselves to proper async operation better than others (e.g. a raw codec stream
over a pipe)?
  - in case the current state is not suitable for live overlay (e.g. because input demuxing is all done in a single
thread even when there are several movie sources), are there side experiments, like multithreaded vfilters, that would
ease that a bit?
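For the pipe case, what I mean by "blocking with timeout" is the classic select() pattern — a self-contained sketch of my assumption, not anything I found in the ffmpeg sources:

```c
#include <sys/select.h>
#include <unistd.h>

/* Wait up to timeout_ms for fd to become readable.
 * Returns 1 if readable, 0 on timeout or error. */
static int wait_readable(int fd, int timeout_ms)
{
    fd_set set;
    struct timeval tv = { timeout_ms / 1000, (timeout_ms % 1000) * 1000 };

    FD_ZERO(&set);
    FD_SET(fd, &set);
    return select(fd + 1, &set, NULL, NULL, &tv) > 0;
}

/* Tiny self-check: an empty pipe times out, a written-to pipe is readable. */
static int demo(void)
{
    int p[2], before, after;

    if (pipe(p))
        return -1;
    before = wait_readable(p[0], 10);     /* nothing written yet: 0 */
    write(p[1], "x", 1);
    after = wait_readable(p[0], 100);     /* one byte pending: 1    */
    close(p[0]);
    close(p[1]);
    return before == 0 && after == 1;
}
```

Is the demuxing path able to poll its inputs like this, or does it sit in a plain blocking read per source? That is the crux of the multiple-movie-sources question above.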

Thanks for any insight,

