[Libav-user] Streaming from camera to server

Timur Guseynov s1.sam.1.93 at gmail.com
Sat Sep 10 10:56:02 EEST 2016


I am new to C++ and FFmpeg and want to write an application that streams
video from a camera (a webcam, at least) to a server.

I've read the StreamingGuide <https://trac.ffmpeg.org/wiki/StreamingGuide> and
want to know how to implement it.
I think the basic flow is as follows; please correct me if I'm wrong:

   1. Get the input device's AVInputFormat from libavdevice
   2. Open that input with avformat_open_input
   3. Find the streams and their codec parameters
   4. Get a decoder with avcodec_find_decoder
   5. Decode the packets into raw frames
   6. Re-encode the frames for streaming
   7. Write the encoded data with a muxer
   8. Send the muxed data to the server

So I can imagine how to implement the first half of this list, but not the second.

The two questions I have are:

   1. Do I understand the streaming flow correctly? What nuances must I
   consider? Which modules/functions should I look into to implement it?
   2. How can I show a preview of the stream in a GUI using, for example, Qt
   Quick? Is the input device locked by one of the components (either FFmpeg or
   Qt)? If it is, should I somehow copy the frames so the GUI can show them to
   the user, or just reference them?

Thanks in advance!

Kind regards,
Timur Guseynov