[FFmpeg-devel] [PATCH] lavdevice: SDL Audio Playback
Thu Dec 10 20:33:23 CET 2009
On date Wednesday 2009-12-09 22:50:30 +0100, Michael Niedermayer encoded:
> On Tue, Dec 08, 2009 at 02:56:48PM +0100, Ivo wrote:
> > Hi,
> > On Saturday 05 December 2009, 10:08:48, Luca Abeni wrote:
> > > On Fri, 2009-12-04 at 21:38 +0100, Ivo wrote:
> > > [...]
> > >
> > > > Yes, you are right. I'll fix it next time I resend the patch. First I
> > > > want to have the libavdevice API for audio and video output devices
> > > > changed, so I can start writing a whole bunch of those. I'll send a
> > > > proposal next week.
> > >
> > > Just out of curiosity: what needs to be changed, and why?
> > > (sorry if it was discussed somewhere and I missed it)
> > IMHO audio out formats need a few functions similar to libao2 in MPlayer to
> > be really useful in building an audio/video player, like
> > get_delay/latency/space. If there's get_space(), perhaps a non-blocking
> > output isn't necessary or wanted.
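[A minimal C sketch of what such a libao2-style interface could look like; every name here (AudioOut, ao_get_space, ao_get_delay, ao_play) is hypothetical and not actual FFmpeg or MPlayer API:]

```c
#include <assert.h>

/* Hypothetical audio-output state, sketched after MPlayer's libao2.
 * All names are illustrative only. */
typedef struct AudioOut {
    int buffer_size;     /* total buffer capacity in bytes          */
    int buffered;        /* bytes queued but not yet played         */
    int bytes_per_sec;   /* playback rate, for latency estimates    */
} AudioOut;

/* How many bytes can be written right now without blocking. */
static int ao_get_space(const AudioOut *ao)
{
    return ao->buffer_size - ao->buffered;
}

/* Estimated playback delay, in seconds, of the data already queued. */
static double ao_get_delay(const AudioOut *ao)
{
    return (double)ao->buffered / ao->bytes_per_sec;
}

/* Queue up to len bytes; returns the number actually accepted,
 * so the caller never blocks. */
static int ao_play(AudioOut *ao, const char *data, int len)
{
    int space = ao_get_space(ao);
    int n = len < space ? len : space;
    (void)data;          /* a real device would copy/submit here */
    ao->buffered += n;
    return n;
}
```

[With get_space() exposed this way, the player clamps each write to the reported space, which is why a separately non-blocking write call may indeed be unnecessary.]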
> Indeed, it seems this stuff has been overlooked, as the audio out code grew
> out of muxers, for which such things are not that obviously useful.
> I guess adding a
> get_space(int stream, int *bytes_available, int *space_available)
> to AVOutputFormat would be the obvious thing to do.
> The alternative would be to wait until AVFilter supports audio and then
> implement audio out as an audio filter.
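[As a rough illustration of the proposed hook, here is a hypothetical sketch; OutputDev, dev_get_space and nonblocking_write are invented names, and the real AVOutputFormat carries no such callback:]

```c
#include <assert.h>

/* Sketch of a muxer-style output device that reports how much can be
 * written without blocking, following the signature proposed above.
 * All names are hypothetical. */
typedef struct OutputDev {
    int space;           /* writable bytes right now */
    int (*get_space)(struct OutputDev *dev, int stream,
                     int *bytes_available, int *space_available);
} OutputDev;

static int dev_get_space(OutputDev *dev, int stream,
                         int *bytes_available, int *space_available)
{
    (void)stream;
    if (bytes_available) *bytes_available = 0;  /* nothing readable back */
    if (space_available) *space_available = dev->space;
    return 0;
}

/* A player loop can then clamp each write to the reported space and
 * stay non-blocking. */
static int nonblocking_write(OutputDev *dev, int want)
{
    int space = 0;
    dev->get_space(dev, 0, 0, &space);
    return want < space ? want : space;
}
```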
Just want to mention that the mere act of waiting will not make audio
support go forward; more people working on libavfilter is what is
required just now.
Also, I wonder if integrating the afilters repo (currently
unmaintained) into the vfilters one would make things smoother for
someone potentially interested in developing audio support.
> > Video out formats should have functions like draw_osd, get_buffer (for
> > direct rendering) and flip_page. If write_packet is used/defined to write
> > full frames, something like draw_slice would be useful too. Both video and
> > audio out formats need a control function to control, for example, hardware
> > treble/bass/volume/pan or hardware chroma/luma/saturation etc. Also, a
> > function to capture events (IR remote control, keypresses, mouse movements)
> > and send them back to the application would be useful, perhaps by calling a
> > user-supplied callback function that processes the event.
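[The video-out functions listed above could be grouped into a vtable along these lines; every name in this sketch (VideoOut, VideoOutOps, VOCTRL_SET_BRIGHTNESS, the dummy_* callbacks) is illustrative only and does not exist in libavdevice:]

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical video-output state and vtable in the spirit of the
 * functions proposed in the mail. */
typedef struct VideoOut {
    int width, height;
    int frames_flipped;  /* counts flip_page() calls, for illustration */
} VideoOut;

typedef struct VideoOutOps {
    int  (*draw_slice)(VideoOut *vo, const uint8_t *data,
                       int x, int y, int w, int h);
    void (*flip_page)(VideoOut *vo);
    int  (*control)(VideoOut *vo, int cmd, void *arg);
} VideoOutOps;

enum { VOCTRL_SET_BRIGHTNESS = 1 };  /* illustrative control command */

static int dummy_draw_slice(VideoOut *vo, const uint8_t *data,
                            int x, int y, int w, int h)
{
    (void)data;
    /* reject slices that fall outside the frame */
    if (x < 0 || y < 0 || x + w > vo->width || y + h > vo->height)
        return -1;
    return 0;
}

static void dummy_flip_page(VideoOut *vo)
{
    vo->frames_flipped++;  /* a real device would swap buffers here */
}

static int dummy_control(VideoOut *vo, int cmd, void *arg)
{
    (void)vo; (void)arg;
    /* return -1 for unsupported commands so a filter can take over,
     * as suggested below for brightness */
    return cmd == VOCTRL_SET_BRIGHTNESS ? 0 : -1;
}
```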
> You seem to be thinking of writing video out based on the muxer API; this
> is an option for sure. But maybe avfilters are more appropriate, as they
> already have direct rendering & slice support. Also, control commands
> could easily be intercepted by a brightness-changing filter when the hardware
> doesn't support it.
> But above all, we need ffplay updated for whichever system is chosen, or we
> won't have a player to test anything ...
> Also we should try to reuse existing code if at all possible, like MPlayer's
> video and audio outs. Rewriting all the video outs from MPlayer is just
> the most brain-amputated thing someone could attempt.
> In that sense I'd like to mention it would be nice if MPlayer's video filters
> could be supported by libavfilter. Porting them one by one is silly as well.
> Someone should write code to use them unchanged; it's a lot more gain per
> unit of work.
Maybe a vf_libmpcodecs.c filter?
FFmpeg = Fabulous and Forgiving Magnificent Philosophic Erudite Gangster