[FFmpeg-user] Live streaming using ffmpeg

Robert Reinhardt robert at theMakers.com
Sat Mar 10 17:06:04 CET 2012

HTML5 video is not a ubiquitous specification. Most of my clients mean "Apple iOS" when they say "HTML5", especially with respect to video implementations. For a live stream, non-H.264 HTML5 browsers don't even have a live streaming specification (!!). Firefox and Opera only play WebM (VP8) or the older Theora codec, and even then only progressive download/HTTP range requests work with those codecs/formats.

My recommendation is to stick with H.264: use Flash Player where it is supported (which covers all of your desktop browsers), and fall back to HTML5 on iOS, where HLS is supported with H.264. I've written two articles on the topic here:
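As a rough illustration of the HLS side of that recommendation, something along these lines can segment an H.264/AAC stream for iOS using ffmpeg's segment muxer. This is a sketch only: it assumes a build with libx264, AAC support, and the segment muxer, and the input and segment names are placeholders, not anything from the thread.

```shell
# Sketch: cut a live H.264/AAC feed into ~10-second MPEG-TS segments
# plus an .m3u8 playlist that an iOS <video> tag can play via HLS.
# "input.mp4" and the segment names are placeholders.
ffmpeg -re -i input.mp4 \
    -c:v libx264 -profile:v baseline \
    -c:a aac -strict experimental \
    -f segment -segment_time 10 \
    -segment_format mpegts \
    -segment_list playlist.m3u8 \
    'segment%03d.ts'
```

Point the video tag (or a Flash fallback) at playlist.m3u8; iOS Safari handles the rest natively.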

"The World of Pain that is HTML5 Video"

"Solving HTML5 Video Problems with Adaptive Streaming"



Robert Reinhardt
The difference knowledge + experience makes | Consultant @ [theMAKERS]
{ work: http://www.theMakers.com }
{ video:  http://videoRx.com }
{ blog: http://probablyjustme.com }

From: ffmpeg-user-bounces at ffmpeg.org [ffmpeg-user-bounces at ffmpeg.org] on behalf of TERRY WILSON [twilson7755 at rogers.com]
Sent: Saturday, March 10, 2012 7:49 AM
To: ffmpeg-user at ffmpeg.org
Subject: [FFmpeg-user] Live streaming using ffmpeg

I want to input a continuous stream of images into FFmpeg (using a NamedPipe) and send the resulting continuous video stream out through a NamedPipe, and then on to an HTML5-based client application utilizing the video tag.

I have the input and output mechanisms working but I am not sure about the format I should be using for the output video stream.  I was going to use MP4 but I have read a couple of posts that suggest MP4 is not an appropriate format for a continuous video stream.  I was hoping that someone here could clarify this for me and suggest what video format could be used for a continuous output stream that can be generated by ffmpeg and subsequently displayed by the HTML5 video tag.

Note: I tried an intermediate step where I simply write the output stream I receive through the NamedPipe to an mp4 file. The resultant file is not recognized as a valid mp4 file. If I change my ffmpeg command to write the output directly to an mp4 file, the resultant file does display correctly. The two files appear almost identical, except that the one written from the NamedPipe output is about 68 bytes longer than the valid one (viewed in a binary editor, the files are identical except near the end). If I try to use ffplay to display the broken file, it says "moov atom not found". Perhaps this is related to my first question about trying to stream mp4. Just to be clear, here are the two different ffmpeg commands I was trying:
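(For what it's worth, the "moov atom not found" error usually means exactly this: the mp4 muxer writes the moov index at the end of the file and then seeks back to patch the header, and a pipe is not seekable, so the file never gets a usable moov atom. One commonly suggested workaround is fragmented MP4, sketched below using the pipe paths from the commands that follow; whether it helps depends on the ffmpeg build having the frag_keyframe/empty_moov movflags.)

```shell
# Sketch: fragmented MP4 puts an (empty) moov atom up front and writes
# self-contained fragments, so no end-of-file seek is needed on the pipe.
# Pipe paths are the ones from the original commands; -sameq omitted.
ffmpeg -re -f mjpeg -r 20 -i //./pipe/InputPipe -an \
    -movflags frag_keyframe+empty_moov \
    -f mp4 //./pipe/OutputPipe
```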

ffmpeg -re -f mjpeg -r 20 -sameq -i //./pipe/InputPipe -an mytest.mp4      (resultant mp4 file works)
ffmpeg -re -f mjpeg -r 20 -sameq -i //./pipe/InputPipe -an //./pipe/OutputPipe.mp4  (file generated by writing output from pipe does not work)
Any suggestions on what output video format should be used and why my intermediate test doesn't work would be appreciated.
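(One more note on the second command: when the output is a pipe, ffmpeg cannot reliably guess the container from the name, so the format should be forced with -f, and it should be a container that doesn't need a trailing index. A hedged sketch using MPEG-TS, which is designed for streaming:)

```shell
# Sketch: force a stream-friendly container (MPEG-TS) on the pipe output.
# MPEG-TS needs no end-of-file index, so it survives non-seekable writes.
ffmpeg -re -f mjpeg -r 20 -i //./pipe/InputPipe -an \
    -f mpegts //./pipe/OutputPipe
```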

