[FFmpeg-devel] ffmpeg or ffserver to stream?
Fri Aug 31 15:21:02 CEST 2007
For the second time I'm posting a help request about RTP streaming. I'd
like to add streaming functionality to my software, in order to stream
every single frame out of the encoding functions instead of saving it
to a .mpg video file. I didn't find any tutorial or documentation
explaining the correct functions to call inside my code, so I decided
to study the source code, in which I hope to find some useful information.
Luca Abeni told me to use this command line to stream something:
"ffmpeg -re -i $1 -vcodec copy -an -f rtp rtp://127.0.0.1:10000 -vn
-acodec copy -f rtp rtp://127.0.0.1:20000 -newaudio", now that the RTP
functionality has been fixed. I looked at the ffmpeg.c code, but I didn't
fully understand the whole streaming mechanism (I can only stream a few
frames at a time, watching them with VLC, but with many sync errors, after
changing "fmt = guess_format("mpeg", NULL, NULL);" to "fmt =
guess_format("rtp", NULL, NULL);"). Now my question is: should I keep
studying the ffmpeg code, or is it better to study the ffserver.c code
(which, I think, implements more functionality than just RTP, such as
HTTP and others)?
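For reference, here is roughly what I'm trying to do around that
guess_format() change: a minimal, untested sketch against the current
libavformat API (the header path, the stream parameters, and all error
handling are placeholders; the real encoder setup lives elsewhere in my
code):

```c
#include <avformat.h>   /* libavformat public API; path may differ per install */

/* Sketch: open the "rtp" muxer instead of the "mpeg" one and prepare it
 * so that each encoded packet can be sent with av_write_frame(). */
int open_rtp_output(AVFormatContext **out, const char *url)
{
    AVOutputFormat *fmt;
    AVFormatContext *oc;
    AVStream *st;

    av_register_all();

    /* "rtp" selects the RTP muxer; the destination URL goes in filename */
    fmt = guess_format("rtp", NULL, NULL);
    if (!fmt)
        return -1;

    oc = av_alloc_format_context();
    oc->oformat = fmt;
    snprintf(oc->filename, sizeof(oc->filename), "%s", url);

    /* one video stream; st->codec must be filled from the encoder context */
    st = av_new_stream(oc, 0);
    if (!st)
        return -1;

    if (av_set_parameters(oc, NULL) < 0)
        return -1;

    /* open the network "file" and write the stream header */
    if (url_fopen(&oc->pb, oc->filename, URL_WRONLY) < 0)
        return -1;
    if (av_write_header(oc) < 0)
        return -1;

    *out = oc;
    return 0;
    /* then, per encoded frame: av_write_frame(oc, &pkt);
     * and at the end: av_write_trailer(oc); url_fclose(&oc->pb); */
}
```

Is this, at least in outline, the right way to use the RTP muxer from my
own code, or does ffserver do something fundamentally different?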
In any case, can someone give me a little help in understanding the
streaming mechanism (I mean, how it is implemented in the code)?
Thanks in advance.