[FFmpeg-devel] [PATCH] lavdevice: SDL Audio Playback

Ivo
Sun Dec 13 19:44:27 CET 2009

On Wednesday 09 December 2009, 22:50:30, Michael Niedermayer wrote:
> On Tue, Dec 08, 2009 at 02:56:48PM +0100, Ivo wrote:
> > IMHO audio out formats need a few functions similar to libao2 in
> > MPlayer to be really useful in building an audio/video player, like
> > get_delay/latency/space. If there's get_space(), perhaps a non-blocking
> > output isn't necessary or wanted.
> indeed, it seems this stuff has been overlooked as the audio out stuff
> grew out of muxers for which such things are not that obviously useful.
> i guess adding a
> get_space(int stream, int *bytes_available, int *space_available)
> to AVOutputFormat

Yes, I think that's the best way to go forward and remove the half-baked 
implementation of non-blocking outputs. I am not really interested in 
hacking ffmpeg.c, but I am interested in creating the basis for a 
stand-alone player which could eventually make MPlayer obsolete ;)

> would be the obvious thing to do
> the alternative would be to wait until AVFilters support audio and then
> implement audio out as an audio filter
> > Video out formats should have functions
> > like draw_osd, get_buffer (for direct rendering) and flip_page. If
> > write_packet is used/defined to write full frames, something like
> > draw_slice would be useful too. Both video and audio out formats need a
> > control function to control for example hardware treble/bass/volume/pan
> > or hardware chroma/luma/saturation etc.. Also, a function to capture
> > events (IR remote control, keypresses, mouse movements) and send them
> > back to the application would be useful too. Perhaps by calling a user
> > supplied callback function that processes the event.
> You seem to be thinking of writing video out based on the muxer API, this
> is an option for sure. But maybe avfilters are more appropriate as they
> already have direct rendering & slice support. Also control commands
> could be easily intercepted by a brightness changing filter when the
> hardware doesn't support it.
> but above all, we need ffplay updated for whichever system is chosen or
> we won't have a player to test anything ...

Yes, the next step after SDL audio and video out would be to update 
ffplay. But first we need to decide whether we want to further 
implement audio out as muxers, which would make it logical to implement 
video out as muxers too, so they both can be used without having a lavfi 
dependency. To have muxers at the end of a filter chain could be done 
similarly to MPlayer, i.e. by creating vf_vo and af_ao filters (or maybe 
even vf/af_muxer).

Or, as you said, we could move over all audio out to lavfi and implement 
video out there too. I'm not sure if that's a good idea though as it 
creates an extra dependency and logically lavdevice audio/video in should 
be filters/generators too, no?

> Also we should try to reuse existing code if at all possible, like
> mplayers video and audio out. Rewriting all the video outs from mplayer
> is just the most brain amputated thing someone could attempt.

I agree partly. Some ao's/vo's are very good (OpenGL comes to mind) but 
others are somewhat messy. And a lot of them are not that much work to 
rewrite. The added benefit is that the new ao's/vo's are LGPL (well, at least 
the ones I will write) and MPlayer's are GPL. And besides that, it is not 
that trivial to lift libvo or libao2 out of MPlayer and remove all MPlayer 
dependency. It's something I started doing before I 'discovered' 
libavdevice :)
