[Libav-user] Handle signal loss

Andrey Utkin andrey.krieger.utkin at gmail.com
Fri Jan 27 10:42:44 CET 2012


In my opinion this approach is inadequate, because its assumptions are wrong.
Specifically:
1. That exceeding some interval between av_read_frame() calls means
data loss. In fact, your UDP stream can "pulse" in terms of bitrate
and frame arrival.
2. Calling gettimeofday() on each frame arrival is unneeded load,
because the frames you receive are already timestamped (see the
sketch after this list). If you still want to check wall-clock time,
how do you imagine it working on streams near 10 MB/s bitrate? Which
trigger threshold will you use? No fixed value will work. And in that
situation data loss will be detected at the demuxer anyway, which
will just log "Continuity check failed" warnings.
3. av_read_frame() returns frames ordered not by PTS (presentation
timestamp) but by DTS (decoding timestamp), so how to timestamp your
"forged" frames takes some thought. MPEG-TS arrives "interleaved",
meaning the frames' arrival order relative to presentation order can
look like 1,5,9,13; 2,6,10,14; 3,7,11,15; 4,8,12,16. They are lined
up again after decoding (the decoder outputs frames in presentation
order).
4. Not all frames store a full picture. Most of them hold just an
incremental diff against the last reference frame, and the loss of
such frames is not a big problem.
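
Regarding point 2, here is a minimal sketch of timestamp-based gap
detection using the current FFmpeg API. The one-second threshold, the
already-opened AVFormatContext and the video stream index are
assumptions for illustration only. Note that after decoding, frames
come out in presentation order with their PTS filled in, so the
reordering from point 3 is handled for you:

#include <inttypes.h>
#include <libavformat/avformat.h>

/* Sketch: detect discontinuities from packet DTS instead of
 * wall-clock time. fmt_ctx is an already-opened AVFormatContext,
 * video_idx the video stream index; the one-second gap threshold
 * is arbitrary. */
static void read_loop(AVFormatContext *fmt_ctx, int video_idx)
{
    AVStream *st = fmt_ctx->streams[video_idx];
    /* One second expressed in the stream's time base. */
    int64_t max_gap = av_rescale_q(1, (AVRational){1, 1}, st->time_base);
    int64_t last_dts = AV_NOPTS_VALUE;
    AVPacket *pkt = av_packet_alloc();

    while (av_read_frame(fmt_ctx, pkt) >= 0) {
        if (pkt->stream_index == video_idx && pkt->dts != AV_NOPTS_VALUE) {
            if (last_dts != AV_NOPTS_VALUE && pkt->dts - last_dts > max_gap)
                av_log(NULL, AV_LOG_WARNING,
                       "DTS gap: %"PRId64" -> %"PRId64"\n",
                       last_dts, pkt->dts);
            /* Point 4: packets without this flag carry only a diff
             * against the last reference frame. */
            if (!(pkt->flags & AV_PKT_FLAG_KEY))
                av_log(NULL, AV_LOG_DEBUG, "non-key packet\n");
            last_dts = pkt->dts;
        }
        av_packet_unref(pkt);
    }
    av_packet_free(&pkt);
}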

Finally, IMHO black frames in place of lost ones would be very
annoying to the viewer: the video would effectively blink!
If the UDP signal is received that badly, better reconsider the
transmission protocol or the network architecture.
In our work we don't do what you have been ordered to do :) It's
better to show a corrupted or frozen frame on data loss than a
blinking black screen, as sketched below.
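
A hypothetical sketch of the "stuck frame" approach; display() stands
in for whatever output sink you use, and the decode loop and gap
detection are assumed to exist elsewhere:

#include <libavutil/frame.h>

void display(const AVFrame *frame); /* hypothetical output sink */

static AVFrame *last_good;

/* Call on every successfully decoded frame. */
static void on_frame(const AVFrame *frame)
{
    av_frame_free(&last_good);
    last_good = av_frame_clone(frame); /* keep a reference to reuse */
    display(frame);
}

/* Call when a gap is detected: redisplay the last good picture
 * instead of inserting a black frame. */
static void on_gap(void)
{
    if (last_good)
        display(last_good);
}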

-- 
Andrey Utkin

