[FFmpeg-devel] Ticket 6375, Too many packets buffered for output stream

wm4 nfxjfg at googlemail.com
Mon Jul 10 12:20:04 EEST 2017


On Sun, 9 Jul 2017 13:38:10 +0200
Reimar Döffinger <Reimar.Doeffinger at gmx.de> wrote:

> On 09.07.2017, at 13:09, Michael Niedermayer <michael at niedermayer.cc> wrote:
> 
> > Hi all
> > 
> > It does appear this regression affects a lot of people, judging from
> > the number of different people in the ticket.
> > People also ask me about this; baptiste, for example, hit it
> > yesterday too.
> > 
> > I believe multiple people were in favor of the change that caused
> > this regression. Does someone who was in favor of this change have
> > time to fix this ticket?
> > 
> > I believe it's important to fix this, as it seems to affect many people.
> > 
> > Thanks
> > 
> > For reference:
> > Ticket: https://trac.ffmpeg.org/ticket/6375
> > Regressing Commit: https://github.com/FFmpeg/FFmpeg/commit/af1761f7b5b1b72197dc40934953b775c2d951cc  
> 
> Huh? I don't know if the commit message is accurate, but if it is, the basic concept of this patch is utterly broken and can never work.
> There can be hours of video data before you actually get a frame on one of the "missing" streams (subtitles might be the most obvious case, but there are others), and buffering that much data simply is not possible.

That's a libavfilter issue. In my own code, I do some sort of
stream-with-no-data detection to work around this.
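
For illustration, a minimal sketch of such detection (my own invention,
not what ffmpeg.c does): read a bounded number of packets up front and
mark which streams actually produce data, so silent streams can be
given dummy parameters instead of stalling the filter graph setup.

    #include <string.h>
    #include <libavformat/avformat.h>

    #define PROBE_PACKETS 500   /* arbitrary cap; tune for your inputs */

    /* Set has_data[i] = 1 for every stream that delivers at least one
     * packet within the first PROBE_PACKETS packets of the file. */
    static void find_silent_streams(AVFormatContext *ic, int *has_data)
    {
        AVPacket pkt;
        int i;

        memset(has_data, 0, ic->nb_streams * sizeof(*has_data));
        for (i = 0; i < PROBE_PACKETS; i++) {
            if (av_read_frame(ic, &pkt) < 0)
                break;  /* EOF or read error */
            if ((unsigned)pkt.stream_index < ic->nb_streams)
                has_data[pkt.stream_index] = 1;
            av_packet_unref(&pkt);
        }
        /* Caller should seek back to the start before real demuxing. */
    }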

The previous design had exactly the same problem, except it was within
libavformat. ffmpeg.c used the codec parameters which libavformat
"found" to initialize the filter chain. That means it used
libavformat's guess at what the decoded AVFrames would look like.
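
Roughly, that old flow looked like the sketch below (names simplified,
the actual ffmpeg.c code differed): the buffer source was configured
straight from the stream's AVCodecParameters, i.e. from the demuxer's
claim, before a single frame had been decoded.

    #include <stdio.h>
    #include <libavfilter/avfilter.h>
    #include <libavformat/avformat.h>

    /* Configure a video buffer source from what the demuxer *claims*.
     * If that claim is wrong, the whole downstream filter chain is
     * configured wrongly. */
    static int src_from_codecpar(AVFilterGraph *graph, AVStream *st,
                                 AVFilterContext **src)
    {
        AVCodecParameters *par = st->codecpar;
        char args[256];

        snprintf(args, sizeof(args),
                 "video_size=%dx%d:pix_fmt=%d:time_base=%d/%d",
                 par->width, par->height, par->format,
                 st->time_base.num, st->time_base.den);
        return avfilter_graph_create_filter(src,
                                            avfilter_get_by_name("buffer"),
                                            "src", args, NULL, graph);
    }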

Now if a stream was "missing" data, libavformat just read more data
ahead (sometimes making opening a stream extremely slow), or filled in
the parameters with pure guesses. In some cases, libavformat would get
parameters merely by opening a decoder and, e.g., using the sample
format it set in its init function.
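
That read-ahead is bounded by the standard demuxer options; capping
them trades opening speed against parameter accuracy. Plain API usage,
shown here only to illustrate the trade-off:

    #include <libavformat/avformat.h>
    #include <libavutil/dict.h>

    int main(void)
    {
        AVFormatContext *ic = NULL;
        AVDictionary *opts = NULL;

        /* Cap how much data avformat may read while probing:
         * probesize is in bytes, analyzeduration in microseconds. */
        av_dict_set(&opts, "probesize",       "1000000", 0);
        av_dict_set(&opts, "analyzeduration", "1000000", 0);
        if (avformat_open_input(&ic, "input.mkv", NULL, &opts) == 0) {
            avformat_find_stream_info(ic, NULL); /* honors the caps */
            avformat_close_input(&ic);
        }
        av_dict_free(&opts);
        return 0;
    }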

Often enough, these parameters were incorrect too, especially if a
decoder other than the default was used (typically an external decoder,
but also if you explicitly chose the float vs. the fixed-point decoder -
although it's possible that ffmpeg.c's messy option handling tried to
force that decoder as the "probing" decoder in libavformat to avoid
this problem).

The best example of this is hardware decoding - libavformat will be
completely wrong about the output format, so ffmpeg.c had to configure
the filter chain again on the first decoded frame. The consequence was
that it was really hard to build fully hardware-accelerated filter
chains. Also remember the awful hacks QSV used; they were needed not
because QSV is bad, but because ffmpeg.c's filter setup was bad.
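
This is why configuring the buffer source from the first decoded frame
is the only reliable approach for hardware pixel formats. A sketch
using the public buffersrc API, which also carries the hardware frames
context:

    #include <libavfilter/buffersrc.h>
    #include <libavutil/error.h>
    #include <libavutil/frame.h>
    #include <libavutil/mem.h>

    /* Feed the buffer source the properties of an actually decoded
     * frame - including the hw_frames_ctx, which no amount of
     * container probing could have produced. */
    static int src_from_frame(AVFilterContext *src, const AVFrame *frame,
                              AVRational time_base)
    {
        AVBufferSrcParameters *p = av_buffersrc_parameters_alloc();
        int ret;

        if (!p)
            return AVERROR(ENOMEM);
        p->format        = frame->format;  /* e.g. AV_PIX_FMT_VAAPI */
        p->width         = frame->width;
        p->height        = frame->height;
        p->time_base     = time_base;
        p->hw_frames_ctx = frame->hw_frames_ctx; /* referenced, not stolen */
        ret = av_buffersrc_parameters_set(src, p);
        av_freep(&p);
        return ret;
    }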

In conclusion, what ffmpeg.c did before this change was certainly more
broken than what it does now.

Also, I don't know how many times I've explained this already.

Anyway, there is no real problem for ffmpeg.c. It's not like it ever
behaved well with missing/sparse streams. And in most cases, just
enlarging the queue size will help. If you really care about the
remaining cases, you could add code to set dummy filter parameters when
no data is available.
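
(For reference, the queue size is exposed on the command line as the
per-stream output option -max_muxing_queue_size, so something like
"ffmpeg -i input.mkv -max_muxing_queue_size 1024 out.mp4" is usually
enough, assuming your ffmpeg build is new enough to have the option.)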

> You can do something like it if, instead of failing when the buffering limit is reached, you force processing of the streams that are available - which is kind of a compromise that usually works well, but also makes things a bit unpredictable.
> Though since it seems to cause issues with audio files with a cover image, there are probably also bugs in the implementation itself, since handling those correctly is entirely possible...

