Hello,

I am encoding video data on an Android device, sending the encoded video data to a server via websocket, and decoding the video data on the server. This seems to work fine, except it appears that the timestamps are not set in the decoded video frames.

On the Android client:

    // Encoder time base: one tick per frame, i.e. 1 / frame rate
    AVRational time_base = new AVRational();
    time_base.num(1);
    time_base.den(m_videoFormat.getFrameRate());
    video_c.time_base(time_base);
    frame.pts(0);

    // For each frame:
    frame.pts(frame.pts() + 1);
    int ret = avcodec_encode_video2(video_c, pkt, frame, got_output);
    if (ret < 0) {
        return -1;
    }

    if (got_output) {
        System.out.printf("Write frame %3d, (size=%5d), (milliseconds: %14d), (tstamp: %8d)\n",
                m_iFrames, pkt.size(), time, pkt.pts());

        // Send pkt.data to server via websocket
        :
        :
        :
    }

On the client, the printf appears to display valid timestamps:

    Write frame 37, (size= 1275), (milliseconds: 1415727731456), (tstamp: 20)

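(For reference, with the 1/frame_rate time base set on the client above, a pts of 20 simply means "20 frame intervals". In the underlying C API the conversion to seconds looks roughly like the sketch below; pts_to_seconds is a hypothetical helper, and the 30 fps in the comment is only an example value, not taken from the code above.)

    #include <stdint.h>
    #include <libavutil/rational.h>

    /* Sketch: convert a pts expressed in a 1/frame_rate time base to seconds. */
    static double pts_to_seconds(int64_t pts, int frame_rate)
    {
        AVRational time_base = { 1, frame_rate };   /* same 1/frame_rate base as above */
        return pts * av_q2d(time_base);             /* e.g. 20 * (1/30) ≈ 0.67 s */
    }
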
On the server:

    // For each chunk of video data read from the websocket...

    avpkt.size = iBytes;   // iBytes is the length of the data read from the websocket
    avpkt.data = cBytes;   // cBytes is the data read from the websocket

    int len = avcodec_decode_video2(c, frame, &got_frame, &avpkt);
    if (len < 0) {
        return -1;
    }

    if (got_frame) {
        printf("Frame timestamps: pts: %lld, dts: %lld, base: %f.\n",
               (long long)avpkt.pts, (long long)avpkt.dts,
               av_q2d(c->time_base));

        // Write frame to disk...
    }

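(For what it's worth, the decoder also copies the packet timestamp onto the decoded frame, so it can be read from the AVFrame rather than from the input packet. A minimal sketch against the C API; print_frame_timestamps is a hypothetical helper, and it assumes av_frame_get_best_effort_timestamp() is available in the FFmpeg build:)

    #include <stdio.h>
    #include <libavcodec/avcodec.h>   /* AVFrame, av_frame_get_best_effort_timestamp() */

    /* Sketch: print the timestamps the decoder attached to a decoded frame. */
    static void print_frame_timestamps(const AVFrame *frame)
    {
        printf("frame pkt_pts: %lld, best effort: %lld\n",
               (long long)frame->pkt_pts,
               (long long)av_frame_get_best_effort_timestamp(frame));
    }

These come back as AV_NOPTS_VALUE too if the packet fed to the decoder had no pts set.
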
On the server, the timestamps do not appear to be set:

    Frame timestamps: pts: -9223372036854775808, dts: -9223372036854775808, base: 222879922077.

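(For reference, -9223372036854775808 is INT64_MIN, i.e. AV_NOPTS_VALUE, the sentinel FFmpeg uses for "timestamp not set". A minimal check, with a hypothetical helper name:)

    #include <libavcodec/avcodec.h>   /* AVPacket, AV_NOPTS_VALUE */

    /* Sketch: returns 1 if the packet's pts was never set. */
    static int pkt_has_no_pts(const AVPacket *pkt)
    {
        return pkt->pts == AV_NOPTS_VALUE;
    }

So pkt_has_no_pts(&avpkt) would return 1 for the packets printed above.
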
Any idea what I'm doing wrong? Thanks in advance for any suggestions!

Charles