[Ffmpeg-devel] OpenGL stream ?

Luca Barbato lu_zero
Mon Jan 8 14:45:49 CET 2007

Joerg Anders wrote:
> Hi all!
> I want to use the ffserver to stream a live animation produced
> by an OpenGL (Mesa-) based program (virtual reality).

Looks interesting =)

> The idea behind this is to send an animation which can be influenced
> by the user at the receiving side.


> I have already had some success with the ffmpeg API. My problem is:
> I don't know how to bring the Windows Media Player to play a live
> stream without sending a "Content-Length:" HTTP header. It seems the
> "asf" data header must contain an appropriate "play endlessly" (or
> similar) flag.

Side note: why not use RTP?

> My problem is: ffserver is not prepared for a live stream created by
> an OpenGL animation. I tried a named pipe to which the OpenGL process
> writes asf data and which ffserver reads and forwards, but that does
> not work. So I want to change the source, and I'd like to know your
> opinion.

Don't do it

> Near line 2056 in "ffserver.c" is
>  redo:
>       if (av_read_frame(c->fmt_in, &pkt) < 0) { ...
> It seems this is the location where a frame is read from the webcam
> (video4linux).

Nope: av_read_frame()[1] drives the demuxer and, when needed, the codec
parsing layer (av_parser_parse()[2]) in order to return one complete
frame per call. (Admittedly the path is a bit convoluted, and both
functions could be better documented.)

> My idea is to change this:
>  redo:
>       if (read_opengl_frame(c->fmt_in, &pkt) < 0) { ...
> whereby
>      static int read_opengl_frame(AVFormatContext *s, AVPacket *pkt) {..}
> fills an RGB image and converts it to YUV using "sws_scale".
> Any comments ?

All you need is to create a codec for your format (or a capture/input
device, if you want to send pixel data rather than vector data) and
then just configure ffserver; that way is clean and relatively easy.
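For the "just configure ffserver" part, a feed/stream pair would look
roughly like this (all names, sizes and bitrates are illustrative):

```
Port 8090
BindAddress 0.0.0.0
MaxClients 100

<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 5M
</Feed>

<Stream test.asf>
Feed feed1.ffm
Format asf
VideoSize 640x480
VideoFrameRate 25
VideoBitRate 512
NoAudio
</Stream>
```

The capture program then pushes its frames into the feed (e.g. via
ffmpeg writing to http://localhost:8090/feed1.ffm) and clients pull
test.asf from ffserver.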

I hope it helps




Luca Barbato

Gentoo/linux Gentoo/PPC
