[FFmpeg-user] Low latency streaming from webcam

Tim Pitman tapitman11 at gmail.com
Wed Jun 20 02:43:03 CEST 2012


Hey, first off I want to thank all the great people who take the time
to make ffmpeg possible. Truly a great piece of software. I'm trying
to obtain raw h264 from an mjpeg webcam with as little latency as
possible and a constant ~620kbps bitrate. I'm currently using the
following pipeline:

ffmpeg -y -f v4l2 -video_size 640x480 -r 30 -i /dev/video0 -bufsize 50 -f yuv4mpegpipe -pix_fmt yuv420p startpipe

x264 --vbv-bufsize 50 --vbv-maxrate 2000 --bitrate 650 --fps 30 --tune zerolatency --intra-refresh --ref 0 --muxer raw -o endpipe startpipe
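
In case the wiring matters, the whole setup looks roughly like this
(give or take the exact launch order on my end):

mkfifo startpipe endpipe
# x264 reads y4m from startpipe and writes raw h264 to endpipe
x264 --vbv-bufsize 50 --vbv-maxrate 2000 --bitrate 650 --fps 30 --tune zerolatency --intra-refresh --ref 0 --muxer raw -o endpipe startpipe &
# ffmpeg captures from the webcam and writes y4m into startpipe
ffmpeg -y -f v4l2 -video_size 640x480 -r 30 -i /dev/video0 -bufsize 50 -f yuv4mpegpipe -pix_fmt yuv420p startpipe &
# ...and only some time later does a consumer start reading endpipe, e.g.:
cat endpipe > capture.264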

I'm using Linux, and startpipe and endpipe were both created with
mkfifo from bash. The problem I'm having is that if I start everything
up and let the buffers fill before reading from endpipe, ffmpeg records
the first couple of seconds, and then whatever frame it is on while the
buffers sit full gets repeated constantly until I start reading from
endpipe, so the resulting output is the same size as if I had been
reading the whole time. I had hoped that ffmpeg would keep updating the
frames so that they are up to date when I finally start reading the
pipe. Is there any setting to accomplish this?
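
For reference, a FIFO only buffers a small fixed amount before the
writer blocks (64 KiB by default on a typical Linux kernel, i.e. well
under a single 640x480 yuv420p frame of 640*480*1.5 = 460800 bytes),
which is presumably why the buffers fill up almost immediately. A quick
way to see the blocking behaviour:

mkfifo testpipe
# open a reader that never actually reads
sleep 60 < testpipe &
# stalls after writing about 64 KiB (the pipe capacity), well short of the 400 KiB requested
dd if=/dev/zero of=testpipe bs=4k count=100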

From reading the Linux page about pipes, it seems that I'm doing this
wrong and that I should be making sure the pipes never fill up. That's
the approach I was about to take, but if there's an easier way to do it
with ffmpeg, I'd like to do that instead.
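
If it comes to that, one variant I'm considering (untested) is to drop
startpipe entirely and connect the two processes with an ordinary shell
pipe, so ffmpeg is throttled directly by x264. endpipe would still need
a reader attached before it fills, but at least there's one less buffer
sitting between capture and encode:

ffmpeg -y -f v4l2 -video_size 640x480 -r 30 -i /dev/video0 -bufsize 50 -f yuv4mpegpipe -pix_fmt yuv420p - \
  | x264 --demuxer y4m --vbv-bufsize 50 --vbv-maxrate 2000 --bitrate 650 --fps 30 --tune zerolatency --intra-refresh --ref 0 --muxer raw -o endpipe -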

