[FFmpeg-devel] [PATCH] Efficiently support several output pixel formats in Cinepak decoder
u-9iep at aetey.se
Sun Feb 5 15:56:26 EET 2017
On Sun, Feb 05, 2017 at 12:12:37PM +0000, Mark Thompson wrote:
> On 05/02/17 10:02, u-9iep at aetey.se wrote:
> > To make it a bit more clear, my use case is
> > - various devices and videobuffers
> > - different applications which are not feasible to patch/teach/replace
> > Replacing ffmpeg with something else or modifying the applications
> > unfortunately does not fit there, that's also why get_format() does not help.
> Even if you need to support such a use-case, doing it via the get_format() callback is still the right thing to do. Once you've done that, all normal applications (including ffmpeg.c itself) can use the standard API as it already exists to set the output format. For your decoding-into-framebuffer case the calling application must already be fully aware of the state of the framebuffer (after all, it has to be able to make a suitable AVFrame to pass to get_buffer2() so that you can avoid the extra copy), so adding get_format() support to also communicate the format is not onerous.
Note that it is generally impossible to leave this decision to the
application. Sometimes that may work, but the applications lack the
necessary knowledge, which cannot be guessed from
"what the layers before and after me report as supported formats,
in a certain order".
> Then, if you have a proprietary application which cannot be modified because you don't have the source, you could make a shim layer like:
Proprietary or not, I can hardly modify them.
A shim layer is certainly a possible workaround, but from my
perspective it would mean moving from a mildly inelegant technology
to a basically broken and unreliable one.
The problem is
<offtopic - system administration>
the technology of LD_*, an old and genuinely bad design in itself.
Compared to a _library_specific_ envvar, it is much further from
being usable as a general solution.
Note that an LD_* variable affects _linking_ (which is very intrusive)
for _all_ programs, related or not, in all child processes.
Note also that some applications do play tricks with LD_* on their own.
Have I said enough? :(
> static enum AVPixelFormat get_format_by_env_var(pix_fmt_list)
> requested_pix_fmt = getenv(SOMETHING);
> if (requested_pix_fmt in pix_fmt_list)
> return requested_pix_fmt;
Exactly, but in my situation it is much more robust and easier to
enable the corresponding code in the decoder (or even add it there
myself in the worst case) than to play with binary patching on the
fly, which is what dynamic linking basically amounts to.
So instead of forcing possible fellow sysadmins in a similar situation
to patch, it would be nice to just let them build libavcodec with
this slightly non-standard (and pretty safe) behaviour.
> and LD_PRELOAD it into the application to achieve the same result (insofar as that is possible - it seems unlikely that it will be able to use get_buffer() appropriately, so there are probably going to be more redundant copies in the application and you would need to patch it directly to eliminate them).
In any case, thanks for understanding the original problem.