<div dir="ltr">Hi,<div><br></div><div>I am consuming a multi-program transport stream containing several video streams and decoding them all simultaneously. This works well.</div><div><br></div><div>I am currently doing it all on a single thread.</div>
<div>Each AVPacket returned by av_read_frame() is checked for its stream_index and passed to the <i>corresponding</i> decoder.</div><div>Hence, I have one AVCodecContext per decoded elementary stream; each AVCodecContext handles exactly one elementary stream, calling avcodec_decode_video2() etc.</div>
<div><br></div><div>The current single-threaded design means that the next packet isn't decoded until the previous one has been.</div><div>I'd like to move to a multi-threaded design in which each AVCodecContext lives on its own thread with its own concurrent (SPSC) AVPacket queue, while the master thread calls av_read_frame() and pushes each coded packet into the relevant queue (Actor Model / Erlang style).</div>
<div>Note that each elementary stream is always decoded by the same single thread.</div><div><br></div><div>Before I refactor my code to do this, I'd like to know if there is anything on the avlib side <i>preventing</i> me from implementing this approach.</div>
<div><ul><li>An AVPacket holds pointers to internal and external data. Is any of that data shared between elementary streams?<br></li><li>What else should I beware of?</li></ul></div><div>Please advise,</div><div>Thanks,</div>
<div>Adi</div><div><br><br></div></div>