[FFmpeg-devel] [PATCH][RFC] avcodec/avutil: Add timeline side data

Derek Buitenhuis derek.buitenhuis at gmail.com
Mon Apr 2 18:46:18 EEST 2018

On 4/1/2018 10:38 PM, Michael Niedermayer wrote:
>> I agree that the APIs are annoyingly different, but that they should remain
>> separate APIs. The suggestion of aligned (same names, args, etc.) seems the
>> most reasonable to me - consistency.
>> I don't really think they can share some common structs, but they can certainly
>> be aligned.
>> However, some of the stuff you listed doesn't make sense to me:
>> * Codec changes are not part of timeline data; they're entirely separate things
>>   in every format I'm aware of. In MPEG-TS it just changes, in MOV/MP4 it is done
>>   via multiple 'stsd' atoms/boxes, which don't really carry any time related
>>   data - there is another atom/box that maps samples to a particular 'stsd',
>>   and that's it.
> yes, but in an abstract way each timespan has one codec.
> It's the same kind of "data structure" even if it's not stored this way in
> formats.

I disagree with this. Codec changes shouldn't be represented as timespans.
They may happen in-band, with a new PMT, new stsd, etc. Short of indexing
and possibly decoding the entire file, you won't have a timespan at all
to output. I also think it's just a fundamentally wrong way to go about
supporting codec or format changes.

>> * What is AVTimelineMetadataList meant to be?

That is very vague. Could you elaborate on this? A lot of different and not
necessarily time-related data can be stored in this generic type. I know
e.g. WebM DASH cue points are, but that's the only one I think is even vaguely
related. And cue points are not timespans; each is a singular timestamp.
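To make that distinction concrete, here is a minimal sketch (these struct and function names are illustrative, not from the RFC or from FFmpeg): a cue point carries a single timestamp, while a timeline edit covers a span.

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical illustration only: a cue point is a single instant... */
typedef struct CuePoint {
    int64_t pts;        /* one timestamp, in stream time base */
} CuePoint;

/* ...while a timeline edit describes a span of media time. */
typedef struct TimelineSpan {
    int64_t start;      /* first timestamp covered by the edit */
    int64_t duration;   /* length of the span */
} TimelineSpan;

/* A span can answer "does this pts fall inside me?"; a cue point cannot. */
static int span_contains(const TimelineSpan *s, int64_t pts)
{
    return pts >= s->start && pts < s->start + s->duration;
}
```

This is why shoehorning cue points into a timespan-shaped list feels like a mismatch.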

>> Lastly, what is the intent here? Is it to align all of these before timeline
>> support, or to deprecate and align after? Right now the API is align with
>> other side data APIs. I need to know which scenario this is before I go
>> ahead and implement this.
> Part of the intent is to have an easier-to-understand API where things follow
> a similar design.

I don't think anybody will disagree with this. But I need a clear path forward,
not a vague 'we should be consistent', which could mean many things.

> Another part is "compatibility". Ideally a format like mpegts that just
> changes codecs should fill whatever data structure we use for codec changes
> in the future. And simply passing these structures over to a mov/mp4 muxer
> would make it write stsd accordingly.

This can be accomplished with the current packet side data, I think. The code
just hasn't been written to do so yet. Extending or replacing the
'new extradata' type is probably the best solution IMO. It's not timeline/timespan
data, and certainly can't be provided up-front, only at the packet level.
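For illustration, here is a simplified model of that per-packet mechanism. The real API is av_packet_get_side_data() on an AVPacket with AV_PKT_DATA_NEW_EXTRADATA; the struct and function names below are stand-ins so the sketch stays self-contained, not FFmpeg's actual definitions.

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Simplified stand-in for FFmpeg's per-packet side data. */
enum SideDataType {
    SD_NEW_EXTRADATA,   /* models AV_PKT_DATA_NEW_EXTRADATA */
    SD_OTHER,
};

typedef struct SideData {
    enum SideDataType type;
    const uint8_t *data;
    size_t size;
} SideData;

typedef struct Packet {
    SideData *side_data;
    int nb_side_data;
} Packet;

/* A consumer probes each packet for a codec-configuration change as it
 * arrives, rather than relying on an up-front list of timespans. */
static const SideData *get_side_data(const Packet *pkt, enum SideDataType type)
{
    for (int i = 0; i < pkt->nb_side_data; i++)
        if (pkt->side_data[i].type == type)
            return &pkt->side_data[i];
    return NULL;  /* no such side data on this packet */
}
```

The point is that the change is discoverable only when the packet carrying it is reached, which is exactly why it can't be exported as timeline data at open time.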

> It would be suboptimal for example if these 2 would use different ways to
> export this codec change data at least unless we also provide code to
> convert. 

Can't disagree with that, but I think this problem is entirely orthogonal to
timeline data. There is nothing that, IMO, should be shared.

> I didn't check just now what exactly is stored in it currently, but
> with AV_PKT_DATA_STRINGS_METADATA and the chapter array (which also has
> metadata in it) we possibly already have something that goes a bit in that direction

Ideally we can deprecate the chapter array in favour of an actual struct that aligns
with how we currently do side data (encryption, this RFC, etc.). I don't think
there is much overlap between timeline edits and chapters, and conflating the two,
in my opinion, just ends up making things more annoying for users.

> It's absolutely not my intent to suggest that other APIs would need to
> be changed first. Much more to bring the current inconsistent APIs to people's
> attention so it is given a thought when a new API is added.
> Deprecating APIs and replacing them also has a cost for our users ...

OK. What is my path forward here, then? Make a good API and then work on fixing
the older, worse APIs like chapters? Or use the old chapter-style API for timelines
for the sake of consistency?

I understand the worry about keeping a large API surface consistent, but I cannot
continue the RFC, or even write an implementation, until I know exactly what I need
to do to move it forward; an actionable path to take. You have very valid concerns,
but I don't know what I'm actually supposed to take action on.

- Derek
