Hi Vistas,

> I want to be able to open a video file on one end, encode it to h264 and
> then send it over RTP (custom library) to another end for decoding.
> I am not sure how exactly I am going to open the AVFormatContext and
> AVCodecContext on the receiving end. I am able to serialize AVPackets correctly.
> I am new to libav* and I can't see how I need to initialize the AVFormat
> state on the decoding side.

I use the same work-flow in my live transcoder: https://www.gitorious.org/live-transcoder
It reads and decodes packets in one thread, passes the decoded frames to multiple encoder threads (to encode into several formats) and, at the end, passes the encoded packets to one or more muxing threads, which use a custom writer to send the muxed data via HTTP.
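Since you already have your own transport, the "custom writer" part boils down to a custom AVIOContext: the muxer writes into a callback you provide, and that callback hands the muxed bytes to whatever transport you like (HTTP, your RTP library, ...). Below is only a minimal sketch, not the actual live-transcoder code; write_cb, open_custom_muxer and send_over_network are placeholder names for illustration.

extern "C" {
#include <libavformat/avformat.h>
#include <libavutil/mem.h>
}

// Called by the muxer whenever it has a chunk of muxed data ready.
static int write_cb(void *opaque, uint8_t *buf, int buf_size)
{
    // send_over_network() is hypothetical: push the bytes to your transport here,
    // using 'opaque' to reach your socket / session object.
    // send_over_network(opaque, buf, buf_size);
    return buf_size;
}

static AVFormatContext *open_custom_muxer(const char *format_name, void *my_transport)
{
    // On older libav/FFmpeg versions you also need av_register_all() once at startup.
    AVFormatContext *oc = nullptr;
    if (avformat_alloc_output_context2(&oc, nullptr, format_name, nullptr) < 0)
        return nullptr;

    const int buf_size = 32 * 1024;
    unsigned char *buf = static_cast<unsigned char *>(av_malloc(buf_size));

    // Route the muxer's output through our write callback instead of a file.
    oc->pb = avio_alloc_context(buf, buf_size,
                                1,            // write_flag
                                my_transport, // opaque passed to write_cb
                                nullptr,      // no read callback
                                write_cb,
                                nullptr);     // no seek: pick a streamable format (e.g. mpegts)
    oc->flags |= AVFMT_FLAG_CUSTOM_IO;

    // ... add streams, then avformat_write_header(), av_interleaved_write_frame()
    // and av_write_trailer() exactly as you would for a normal file.
    return oc;
}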
The transcoder uses my AVCPP lib, a C++ wrapper for libavformat, libavcodec and libavfilter (also libavutil and libswscale).