[Libav-user] (no subject)

david.weber at l-3com.com
Thu Aug 25 21:49:52 CEST 2011


Hey all, I've searched quite a bit but haven't found a good answer, so I
thought I'd post it here.

I have a custom video device which gives me a simple framebuffer (24-bit BGR).

I am trying to stream it elsewhere, but would like to encode/compress it in
realtime.

On the client side, I would decode it and display it in an OpenGL window.
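
Just for context, this is roughly the client-side path I'm picturing.  It is
only a sketch against the current libavcodec/libswscale API; decode_one() and
upload_texture() are placeholder names for code I'd still have to write, and
the sws context is assumed to convert from the decoder's pixel format to RGB24.

#include <stdint.h>
#include <libavcodec/avcodec.h>
#include <libswscale/swscale.h>

extern void upload_texture(const uint8_t *rgb, int w, int h);  /* my GL upload */

/* Sketch: push one packet received from the network into the decoder and
 * hand every finished picture to OpenGL as packed RGB24. */
static void decode_one(AVCodecContext *dec, struct SwsContext *sws,
                       AVPacket *pkt, uint8_t *rgb, int rgb_stride)
{
    avcodec_send_packet(dec, pkt);

    AVFrame *frame = av_frame_alloc();
    while (avcodec_receive_frame(dec, frame) == 0) {
        uint8_t *dst[1]   = { rgb };
        int dst_stride[1] = { rgb_stride };
        sws_scale(sws, (const uint8_t *const *)frame->data, frame->linesize,
                  0, frame->height, dst, dst_stride);
        upload_texture(rgb, frame->width, frame->height);
        av_frame_unref(frame);
    }
    av_frame_free(&frame);
}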

I have it working today with uncompressed images, but it is killing the
network (as one would expect).

Are there any examples for making something like this work?

In short:

[Custom Video Framebuffer] -> [encode] -> [network] -> [decode] -> [display]

Considerations that I have to deal with:

1.)    Relatively realtime.  I have <16 ms to encode a 640x480x3 framebuffer
and ship it off.

2.)    I need a streaming-type on-the-wire format.  Clients may be turned
on/off at any point, so I need to support that (see the sketch below).
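
For reference, this is roughly the encoder setup I'm imagining to satisfy both
points.  It is only a sketch against a current libavcodec (older releases spell
some of these names differently), open_encoder() is just a name I made up, and
the preset/tune/GOP choices are guesses rather than anything I've validated:

#include <libavcodec/avcodec.h>
#include <libavutil/opt.h>

/* Sketch: open an H.264 encoder tuned for low latency.
 * Error checking omitted to keep this short. */
static AVCodecContext *open_encoder(void)
{
    const AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_H264);
    AVCodecContext *c = avcodec_alloc_context3(codec);

    c->width     = 640;
    c->height    = 480;
    c->pix_fmt   = AV_PIX_FMT_YUV420P;   /* libx264 wants planar YUV, not BGR24 */
    c->time_base = (AVRational){1, 60};  /* one frame per ~16 ms */
    c->gop_size  = 30;                   /* frequent keyframes for late joiners */

    /* libx264 private options: trade compression ratio for speed/latency */
    av_opt_set(c->priv_data, "preset", "ultrafast", 0);
    av_opt_set(c->priv_data, "tune",   "zerolatency", 0);

    avcodec_open2(c, codec, NULL);
    return c;
}

My (possibly naive) understanding is that a short GOP, with the parameter sets
repeated in the stream rather than sent only once up front, is what would let a
client that is switched on mid-stream lock on at the next keyframe.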

The naive implementation is simply running the video frames through zlib,
which shrinks the data nicely, but it's rather expensive and makes me feel
dirty.  All of the examples I can find read from an existing encoded file,
which is not quite what I need.
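
The kind of example I'm after would replace the zlib step with something like
this per frame (again only a sketch: send_to_network() stands in for the socket
code I already have, enc comes from the open_encoder() sketch above, frame is a
YUV420P AVFrame allocated once with av_frame_get_buffer(), and sws was created
with sws_getContext(640, 480, AV_PIX_FMT_BGR24, 640, 480, AV_PIX_FMT_YUV420P,
SWS_POINT, NULL, NULL, NULL)):

#include <stdint.h>
#include <libavcodec/avcodec.h>
#include <libswscale/swscale.h>

extern void send_to_network(const void *buf, int len);  /* my existing socket code */

/* Sketch: convert one 640x480 BGR24 framebuffer to YUV420P, encode it, and
 * ship every packet the encoder produces. */
static void encode_one(AVCodecContext *enc, struct SwsContext *sws,
                       AVFrame *frame, const uint8_t *bgr, int64_t pts)
{
    const uint8_t *src[1]   = { bgr };
    const int src_stride[1] = { 640 * 3 };   /* tightly packed BGR24 rows */

    sws_scale(sws, src, src_stride, 0, 480, frame->data, frame->linesize);
    frame->pts = pts;

    avcodec_send_frame(enc, frame);

    AVPacket *pkt = av_packet_alloc();
    while (avcodec_receive_packet(enc, pkt) == 0) {
        send_to_network(pkt->data, pkt->size);
        av_packet_unref(pkt);
    }
    av_packet_free(&pkt);
}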

Thanks for your input.

--dw
