[Ffmpeg-devel] How to use ffserver and ffplay to make a H.264 C/S system
Sun Mar 4 07:45:26 CET 2007
I want to develop a system that uses H.264 to transfer video
data between a desktop server and an embedded client, both running Linux.
For instance, I could use a video camera to grab live video on the
server, then use ffmpeg on the server to encode the data as H.264
and transfer the encoded data to the client (an embedded ARM board)
with ffserver. Finally, I could decode the data on the client and play
the video.
Is it possible to use ffmpeg to achieve this?
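As a rough sketch of that pipeline (assuming a V4L2 camera at /dev/video0, an ffserver instance listening on port 8090, and example feed/stream names feed1.ffm and test.h264 defined in its config -- all of these are placeholders to adapt to your setup):

```shell
# On the server: grab live video from the camera and push it into
# ffserver's feed; ffserver performs the H.264 encoding as configured
# in its <Stream> section.
ffmpeg -f video4linux2 -i /dev/video0 http://localhost:8090/feed1.ffm

# On the ARM client: pull the published stream and play it.
# "server" is the hostname or IP of the machine running ffserver.
ffplay http://server:8090/test.h264
```

Note that ffplay would need to be cross-compiled for the ARM target along with an H.264 decoder (e.g. via FFmpeg's own decoder), and the board must be fast enough to decode in real time.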
If this is too difficult, I would like to simply send pre-encoded
H.264 data to the client using ffserver. Does ffserver support this?
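For reference, serving an H.264 stream is driven by ffserver's configuration file. A minimal ffserver.conf sketch might look like the following (feed and stream names, sizes, and rates are illustrative assumptions, not a tested configuration):

```shell
# /etc/ffserver.conf -- illustrative sketch only
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxClients 100
MaxBandwidth 1000

# Feed that ffmpeg pushes raw captured video into
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 200M
</Feed>

# Stream that clients pull; ffserver encodes the feed to H.264
<Stream test.h264>
Feed feed1.ffm
Format rtp
VideoCodec libx264
VideoFrameRate 25
VideoSize 320x240
NoAudio
</Stream>
```

Whether your ffserver build supports libx264 depends on how FFmpeg was configured at compile time (--enable-libx264), so check `ffmpeg -codecs` first.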
Thanks for your help.