[Libav-user] Improving HTTP demux throughput
Jabotical
james.nickerson at vistasystems.net
Fri Oct 31 01:40:34 CET 2014
wm4 wrote
> From what I understand, mov/mp4 files can be indeed quite terrible at
> this, and require heavy seeking. The file format was not designed with
> streaming in mind (insert grumpy note about Apple mis-engineering), and
> they are often not properly interleaved.
Ok, that certainly matches my experience with many (though not all!)
mov/mp4 files (and has prompted many a similar grumble about Apple
mis-engineering). I kept thinking I must be missing something, but the
interleaving really does seem terrible.
> I don't actually know much about mp4 file structure itself, but maybe
> you could analyze the seek patterns, and determine yourself whether
> some sort of caching could reduce the seeks.
Yeah, I've in fact done that. And it really does look like the audio streams
aren't interleaved properly with the video, so that the demuxer keeps
seeking back and forth between the two of them. Since the gap can be quite
extreme (video data at one end of the file with audio at the other), it
doesn't really work to just use a large buffer.
> Actually using multiple connections sounds a bit unrealistic, because
> it'd probably raise code complexity a lot.
That it does! I know this from experience, since I've already implemented
exactly that using a custom AVIOContext with my own HTTP I/O. But as you say
it really drives up complexity, and optimizing it is tricky. So I was hoping
I was just missing/misinterpreting something, or that libav had implemented
a similar feature internally that I just hadn't been able to find out about.
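For context, the custom-I/O wiring itself is the easy part; the sketch below
shows the shape of it, with a plain FILE* standing in for my HTTP layer (the
real callbacks issue range requests, possibly over several connections, which
is where all the complexity lives). Error handling and cleanup omitted:

#include <stdio.h>
#include <libavformat/avformat.h>
#include <libavutil/mem.h>

static int io_read(void *opaque, uint8_t *buf, int buf_size)
{
    /* Real version: read from the current HTTP response body. */
    size_t n = fread(buf, 1, buf_size, (FILE *)opaque);
    return n > 0 ? (int)n : AVERROR_EOF;
}

static int64_t io_seek(void *opaque, int64_t offset, int whence)
{
    /* Real version: issue a new Range request, or reuse whichever open
     * connection is already closest to 'offset'. */
    FILE *f = (FILE *)opaque;
    if (whence == AVSEEK_SIZE) {
        off_t cur = ftello(f), size;
        fseeko(f, 0, SEEK_END);
        size = ftello(f);
        fseeko(f, cur, SEEK_SET);
        return size;
    }
    if (fseeko(f, offset, whence) < 0)
        return -1;
    return ftello(f);
}

AVFormatContext *open_with_custom_io(FILE *f)
{
    const int bufsize = 64 * 1024;
    unsigned char *buf = av_malloc(bufsize);
    AVIOContext *avio  = avio_alloc_context(buf, bufsize, 0, f,
                                            io_read, NULL, io_seek);
    AVFormatContext *fmt = avformat_alloc_context();

    fmt->pb = avio;   /* demuxer now does all its I/O through my callbacks */
    if (avformat_open_input(&fmt, "", NULL, NULL) < 0)
        return NULL;
    return fmt;
}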
I think it would be great if I could even configure libav not to seek back
and forth between streams. If I could have one instance that seeked within
and returned only the video stream, and another that did the same for only
the audio, that would probably work nicely.
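Concretely, what I had in mind is something like opening the input twice and
marking the unwanted stream as discard-all in each instance. Whether the mov
demuxer's sample selection then actually stops bouncing between the two
streams' chunks is exactly the part I haven't verified:

#include <libavformat/avformat.h>

/* Sketch of the "one demuxer instance per stream" idea: the streams I
 * don't want are set to AVDISCARD_ALL in this instance. */
static int open_single_stream(const char *url, enum AVMediaType want,
                              AVFormatContext **out)
{
    AVFormatContext *fmt = NULL;
    unsigned i;

    if (avformat_open_input(&fmt, url, NULL, NULL) < 0)
        return -1;
    if (avformat_find_stream_info(fmt, NULL) < 0) {
        avformat_close_input(&fmt);
        return -1;
    }
    for (i = 0; i < fmt->nb_streams; i++)
        if (fmt->streams[i]->codec->codec_type != want)
            fmt->streams[i]->discard = AVDISCARD_ALL;

    *out = fmt;
    return 0;
}

/* e.g. one instance opened with AVMEDIA_TYPE_VIDEO and another with
 * AVMEDIA_TYPE_AUDIO, each reading over its own HTTP connection. */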
>> Thanks for the feedback! But as noted above, I probably wasn't clear --
>> *this problem isn't due to HLS*, and HLS content always works
>> wonderfully.
>> Presumably because it's been muxed in a way that's friendly to streaming.
>
> Yep, sorry.
Haha, understandable. I shouldn't even have mentioned HLS; I just thought it
was remarkable that it worked really well, while single contiguous files
often performed horribly.
> XBMC might be using their own mp4 demuxer, or so. Or possibly have
> hacks by providing their own HTTP I/O.
From what I can tell from analyzing their code, they appear to be using ffmpeg
directly for the cases I was testing. That's what both mystified me and gave
me hope. But they don't handle all Quicktime files equally well, either.
All right, it sounds like the only solution, unfortunately, is to handle my
own HTTP streaming with a custom AVIOContext, like I've been doing. It's a
pain, but not as much a pain as writing my own demuxer from scratch (or HLS
parser/player!), so I shouldn't complain.
Thanks for the discussion. If anyone reading this has any further secret
knowledge that might help, I will continue to welcome it.