[Libav-user] newb mp4 question

Atli Thorbjornsson atlithorn at gmail.com
Thu Oct 6 01:05:16 CEST 2011


Thanks Dan. When we stream live we often falsify the Content-Length to
something like 2 GB; users are never going to hit that limit with
normal usage, so it's "good enough". The underlying HTTP client will
often cut off at that point, which is fine; people will forgive
restarting a stream after 2 GB of listening.

Could I use QTFastStart and provide similarly false information to
the container so it could at least start streaming?
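
Concretely, this is the kind of thing I'm hoping is possible on the muxer
side. Untested sketch, and I'm assuming here that my build of movenc.c
exposes a "movflags" private option with frag_keyframe/empty_moov support,
which may not be true:

    /* Untested: ask the mp4 muxer to write an empty MOOV up front and emit
     * fragments at keyframes, so the output is playable without a trailer.
     * oFormatCtx is the output context from my remux skeleton below. */
    AVDictionary *opts = NULL;
    av_dict_set(&opts, "movflags", "frag_keyframe+empty_moov", 0);
    if (avformat_write_header(oFormatCtx, &opts) < 0) {
        /* handle error */
    }
    av_dict_free(&opts);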

While I have you on the line ;) which transport stream format should I
otherwise be looking at? I don't have a lot of options to choose
from...
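
If MPEG-TS is the answer, is something along these lines the right way to
set up the muxer? (Untested sketch; I'm assuming avformat_alloc_output_context2
is available in my libavformat.)

    /* Untested: create an MPEG-TS output context instead of mp4. */
    AVFormatContext *oFormatCtx = NULL;
    if (avformat_alloc_output_context2(&oFormatCtx, NULL, "mpegts", NULL) < 0) {
        /* handle error */
    }
    /* then add the output stream, call avformat_write_header, and run the
     * same av_read_frame / av_write_frame loop as in my skeleton below */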

Atli.

On 5 October 2011 22:38, Dan Haddix <dan6992 at hotmail.com> wrote:
> MP4 files contain something called a MOOV atom, which holds vital
> information about the file that is needed for playback. The MOOV atom is not
> written until you call av_write_trailer, and even then it is written at the
> very end of the file, which means the file cannot be streamed as it is being
> written. There is another chunk of code in FFmpeg called QTFastStart which
> moves the MOOV atom to the start of the file, which in turn makes it possible
> to stream the file, but you still need a finished file before you can run it.
> So it's really only meant for streaming complete files from a server to a
> player.
>
> If you need to remux and stream in realtime then you'd be better off using
> another format. Transport streams are the best for streaming. They have no
> header or footer and can be picked up at any point in the stream without
> issue. This is why broadcasters use TS for digital TV signals.
>
> Dan
>
>> Date: Wed, 5 Oct 2011 22:18:24 +0000
>> From: atlithorn at gmail.com
>> To: libav-user at ffmpeg.org
>> Subject: [Libav-user] newb mp4 question
>>
>> Hi, I'm trying to remux an FLV-muxed AAC stream using the mp4 muxer
>> (movenc.c). It works fine so long as I call av_write_trailer.
>>
>> The problem is I also need to remux live (long-running) streams; how would
>> I go about that? Currently a skeleton of my code looks like this:
>>
>> init_input
>> init_output
>>
>> do {
>>     av_init_packet(&ipacket);
>>     av_read_frame(iFormatCtx, &ipacket);
>>     if (ipacket.stream_index == audioStream) {
>>         av_init_packet(&opacket);
>>
>>         /* copy stuff around, code borrowed from ffmpeg.c */
>>
>>         av_write_frame(oFormatCtx, &opacket);
>>         av_free_packet(&opacket);
>>     }
>>     av_free_packet(&ipacket);
>> } while (stuff is incoming);
>>
>> av_write_trailer(oFormatCtx);
>>
>>
>> I was hoping I could buffer the av_write_frame calls and write them out to
>> a client, but it seems the client can't understand the file without the
>> trailer. I'm new to this stuff, so I was hoping someone could point me in
>> the right direction.
>>
>> Before you ask, I am streaming to the Android MediaPlayer so my format
>> options are a little limited.
>>
>> Thanks,
>> Atli.

