[Libav-user] feeding frames to ffmpeg manually?

Csillag Kristof csillag.kristof at gmail.com
Fri Jan 27 19:40:58 CET 2012

Hi there,

I am interested in using ffmpeg in a very non-standard way.

There is an application that currently displays images to an X11 display.
I would like to modify the application so that the pictures are not 
displayed via X,
but fed into ffmpeg and encoded as frames of an H.264 video stream instead.

How do I do this?

(I am aware that one can build a video file from a group of images, like 

      ffmpeg -f image2 -i foo-%03d.jpeg -r 12 -s WxH foo.avi

... but that's not what I want!
I want to feed the data from my application directly to ffmpeg, without 
encoding it into JPEG and writing it out to a file.)

(The purpose of this is to get smaller latencies and better throughput.)
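One way to sketch what is being asked for (just an illustration, not from the original post): spawn ffmpeg as a child process with the rawvideo demuxer and pipe each frame's raw pixels into its stdin, so no JPEG files are ever written. All names below (`build_ffmpeg_cmd`, `stream_frames`, the pixel format, the preset) are illustrative assumptions; it assumes an ffmpeg binary with libx264 is on the PATH.

```python
import subprocess

def build_ffmpeg_cmd(width, height, fps, out_path):
    # Tell ffmpeg to read raw RGB24 frames from stdin and encode H.264.
    return [
        "ffmpeg",
        "-f", "rawvideo",           # no container, plain pixel data
        "-pix_fmt", "rgb24",        # must match what the app writes
        "-s", f"{width}x{height}",
        "-r", str(fps),
        "-i", "-",                  # "-" means read input from stdin
        "-c:v", "libx264",
        "-preset", "ultrafast",     # favor low latency over compression
        out_path,
    ]

def stream_frames(frames, width, height, fps, out_path):
    # Each element of `frames` is one raw RGB24 frame:
    # exactly width * height * 3 bytes.
    proc = subprocess.Popen(
        build_ffmpeg_cmd(width, height, fps, out_path),
        stdin=subprocess.PIPE,
    )
    for frame in frames:
        proc.stdin.write(frame)
    proc.stdin.close()
    proc.wait()
```

The alternative, with lower overhead but more code, is to link against libavcodec directly and hand it frames in memory instead of going through a pipe.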

   * * *

Obviously, I will need to write code, but that's no problem; the 
question is where do I start?
Where can I find relevant documentation and/or code examples? Whom do I 
need to talk to?

Thank you for your help:
