[FFmpeg-devel] [RFC] mpegts demuxer: fix timestamps wrapped around 33 bits
andrey.krieger.utkin at gmail.com
Fri Aug 17 19:17:02 CEST 2012
While working with broadcast MPEG-TS streams, I found that they exhibit
several kinds of timestamp discontinuities. One kind happens due to
UDP packet loss. Another is caused by an actual change of the
broadcast stream (e.g. a streamer reload). A third kind is caused by
timestamp wraparound: the timestamp field is effectively 33 bits wide,
which at a timebase of 1/90000 covers about 26.5 hours. This last kind
of timestamp discontinuity is the easiest to detect and fix, but it
still makes it hard to work seamlessly with MPEG-TS streams longer
than that, causing errors like "Application provided invalid
timestamp". Of course, a programmer can handle this in his own
application, but the ffmpeg utility should work, too.
I propose adding a check to the mpegts demuxer's read_packet function.
It should behave in the following way:
All dts/pts values computed in the original way are incremented by a
shift value. This shift is 0 initially, and every time the described
kind of discontinuity is detected, it is incremented by 0x200000000
(i.e. 2^33, the 33-bit range).
After shifting, if the previous packet's timestamp from the same
elementary substream is larger than that of the currently read packet,
check whether the previous dts is close enough to 0x200000000.
Strictly, it should hold that prev_dts + prev_duration = curr_dts +
0x200000000. But MPEG-TS streams are often transmitted over UDP and
some packets may be lost, so it is better to allow some tolerance in
this comparison. If this check indicates a real wraparound, the shift
value is incremented by 0x200000000.
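A minimal sketch of this detection logic (the struct, field names, and the tolerance value are illustrative assumptions, not actual mpegts.c code):

```c
#include <stdint.h>

#define TS_WRAP        (INT64_C(1) << 33)    /* 0x200000000, the 33-bit range */
#define WRAP_TOLERANCE (INT64_C(10) * 90000) /* assumed: 10 s of slack at 90 kHz */

/* Hypothetical per-elementary-stream state. */
typedef struct {
    int64_t shift;    /* accumulated wrap offset, starts at 0 */
    int64_t last_dts; /* previous shifted dts, or -1 if none yet */
} WrapState;

/* Shift a raw 33-bit dts, and bump the shift when a wraparound is seen:
 * i.e. the timestamp jumped backwards by roughly the whole 33-bit range. */
static int64_t update_and_shift(WrapState *st, int64_t raw_dts)
{
    int64_t dts = raw_dts + st->shift;
    if (st->last_dts != -1 && dts < st->last_dts &&
        st->last_dts - dts > TS_WRAP - WRAP_TOLERANCE) {
        st->shift += TS_WRAP;
        dts       += TS_WRAP;
    }
    st->last_dts = dts;
    return dts;
}
```

Note that a small backwards jump (e.g. from a streamer reload) stays far below TS_WRAP - WRAP_TOLERANCE and is left alone; only a jump of nearly the full 33-bit range triggers the shift.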
The bad thing is that, as far as I know, packets pulled while demuxing
MPEG-TS are currently ordered by dts within an elementary stream, but
not across streams. I am also not sure whether this dts ordering
happens inside the container demuxer code or in utils.c; could the
developers please be so kind as to explain this? (I've read that
mpegts muxers shuffle ("interleave") frames in a certain pattern to
obtain resilience against signal loss.)
In summary, I am convinced that this issue should be fixed, and I
would like comments on what can be done, how, and where, to improve
the libavformat demuxing system in this regard.