[FFmpeg-user] Multithreaded Asynchronous Encoding/Muxing

BIGLER Don (Framatome) don.bigler at framatome.com
Wed Feb 27 00:32:57 EET 2019


All,

I am writing an application that displays, encodes, and muxes live video using FFmpeg as the backend.  The audio and video encoding happens asynchronously in the background, each in its own thread, so the encoded packets arrive at the av_interleaved_write_frame() call at different times.  H.264 video encoding by itself works fine.  However, when I add audio, the audio is out of sync with the video even though the audio and video pts values are in sync (both come from avdevice; I am using the matroska muxer).  The cause of the problem is not clear to me.  Specifically, here are my questions:
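
For reference, here is a minimal, simplified sketch of that write path.  The helper name mux_packet() and the variables out_fmt_ctx, out_stream, and enc_time_base are placeholders from my own code, not FFmpeg names; both the audio and the video encoder thread end up calling av_interleaved_write_frame() through this path:

#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>

/* Called from both the audio and the video encoder thread whenever
 * avcodec_receive_packet() returns a packet.  Whether this call needs
 * to be serialized externally is what question 1 below is about. */
int mux_packet(AVFormatContext *out_fmt_ctx, AVPacket *pkt,
               AVRational enc_time_base, AVStream *out_stream)
{
    /* Convert pts/dts/duration from the encoder time base to the
     * output stream time base before handing the packet to the muxer. */
    av_packet_rescale_ts(pkt, enc_time_base, out_stream->time_base);
    pkt->stream_index = out_stream->index;

    /* av_interleaved_write_frame() takes ownership of the packet. */
    return av_interleaved_write_frame(out_fmt_ctx, pkt);
}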


1.     Can the av_interleaved_write_frame() function handle asynchronous calls coming from separate audio and video threads, as in the sketch above, or do the calls need to be serialized?

2.     The transcoding example uses a filter graph built from the buffer/abuffer filters.  My current implementation does not use them because I am not applying any filters before encoding (see the encode-loop sketch after this list).  Are the buffer/abuffer filters required for my situation?

3.     Encoding starts at an arbitrary point in the live stream, so the first pts the muxer receives is not zero.  Is it required that the first pts be zero?
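
Here is a rough sketch of the per-stream encode loop referenced in question 2, without any buffer/abuffer filter graph.  The names enc_ctx, in_stream, out_stream, out_fmt_ctx, and encode_frame() are placeholders from my code; error handling and end-of-stream flushing are omitted for brevity.  Decoded frames from the avdevice capture go straight to the encoder, and every packet the encoder produces is handed to the mux_packet() helper from the earlier sketch:

#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>

/* Helper from the first sketch. */
int mux_packet(AVFormatContext *out_fmt_ctx, AVPacket *pkt,
               AVRational enc_time_base, AVStream *out_stream);

int encode_frame(AVCodecContext *enc_ctx, AVFrame *frame, AVPacket *pkt,
                 AVStream *in_stream, AVStream *out_stream,
                 AVFormatContext *out_fmt_ctx)
{
    int ret;

    /* Rescale the capture timestamp into the encoder time base; it is
     * otherwise passed through unchanged, which is why the first pts
     * the muxer sees is not zero (question 3). */
    frame->pts = av_rescale_q(frame->pts, in_stream->time_base, enc_ctx->time_base);

    ret = avcodec_send_frame(enc_ctx, frame);
    if (ret < 0)
        return ret;

    while (ret >= 0) {
        ret = avcodec_receive_packet(enc_ctx, pkt);
        if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
            return 0;
        if (ret < 0)
            return ret;

        /* mux_packet() ends in av_interleaved_write_frame(), which
         * takes ownership of the packet reference. */
        ret = mux_packet(out_fmt_ctx, pkt, enc_ctx->time_base, out_stream);
        if (ret < 0)
            return ret;
    }
    return 0;
}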

I would greatly appreciate any assistance!

Regards,
Don Bigler

