[FFmpeg-user] Using FFMPEG to stream continuously videos files to a RTMP server

LANGLOIS Olivier PIS -EXT olivier.pis.langlois at transport.alstom.com
Mon Jan 13 17:53:31 CET 2014

> -----Original Message-----
> Hello,
> I am looking to a proper way to broadcast several files to a remote rtmp
> server.
> Currently I do this with a bash script, which has the big inconvenience of
> closing the connection to the server at the end of each video file.
> I heard (and read) a bit about using pipe like:
> vid1: pipe1
> vid2: pipe2
> cat pipe1 pipe2 > pipestream
> and using pipestream as ffmpeg source.
> It didn't work, most likely because I don't know the right syntax.
> I was looking for some help on this.
I'm not sure exactly why you need pipes if you have files. You could just pass the file paths on the command line, no?

If your setup really does require pipes, this is something I have been thinking about for some time, as I have a program generating raw audio/video streams and am looking for a way to interface it with ffmpeg. It requires some systems programming, and I have not yet tested it myself to see if it works as imagined. Here is the idea:

1. Call pipe() for each stream
2. fork()
3. In the child, keep the pipes' read sides open and close the other fds
4. exec ffmpeg, passing the pipe fds on the command line as /dev/fd/xxx paths

Also note that the default pipe buffer size on Linux is 64KB. It can be changed with the F_SETPIPE_SZ fcntl(), and the maximum allowed size can be tweaked via /proc/sys/fs/pipe-max-size. I would think that for efficient operation, the pipe buffer for the video stream should be big enough to hold at least one frame.

