<div dir="auto">It seems before send to gstreamer, you need de-packetize first, this why you demux by libav to AVFrame, it de-packetize the stream to complete data of frame, then it works with GStreamer. </div><div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Mon, 4 Jan 2021 at 17:45 fre deric <<a href="mailto:frenky.picasso@seznam.cz">frenky.picasso@seznam.cz</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">My goal is to demux video stream in libav and then decode it in GStreamer<br>
pipeline.

My approach is to take an AVPacket from the video stream in the first thread
and send it to the GStreamer pipeline in the second thread. The important parts
of the code are here:

// -- THREAD 1 --
// Take data from AVPacket
img_data = (guchar *)packet.data;
size = packet.size;
// Create GStreamer buffer
buffer = gst_buffer_new_allocate(NULL, size, NULL);
gst_buffer_map(buffer, &map, GST_MAP_WRITE);
memcpy((guchar *)map.data, img_data, gst_buffer_get_size(buffer));
map.size = size;
gst_buffer_unmap(buffer, &map);
// Send the buffer to appsrc element in the pipeline.
gstret = gst_app_src_push_buffer((GstAppSrc *)app_source, buffer);
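
As a side note, the map/memcpy/unmap sequence above can be collapsed into a single
call. A minimal sketch of an equivalent copy, using the same variables as above and
with the same behavior:

// gst_buffer_fill() copies `size` bytes from img_data into the buffer
// starting at offset 0, replacing the manual map/memcpy/unmap.
buffer = gst_buffer_new_allocate(NULL, size, NULL);
gst_buffer_fill(buffer, 0, img_data, size);
gstret = gst_app_src_push_buffer((GstAppSrc *)app_source, buffer);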

// -- THREAD 2 --
// A video cap for appsrc element
const gchar *video_caps = "video/x-theora, width=1920, height=1080, framerate=30/1";
// GStreamer pipeline
string = g_strdup_printf("appsrc name=testsource caps=\"%s\" ! theoradec ! videoconvert ! autovideosink", video_caps);
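
Neither snippet shows how the app_source handle used in thread 1 is obtained; the
original code may do this differently, but a minimal sketch, assuming the pipeline is
built from this description with gst_parse_launch() and the element is looked up by
the name it was given ("testsource"), would be:

// Build the pipeline from the description string and fetch the appsrc
// element by the name given in the description.
GError *error = NULL;
GstElement *pipeline = gst_parse_launch(string, &error);
GstElement *app_source = gst_bin_get_by_name(GST_BIN(pipeline), "testsource");
gst_element_set_state(pipeline, GST_STATE_PLAYING);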

However, I am getting the following error from the GStreamer pipeline:
"
ERROR from element theoradec0: Could not decode stream.
Debugging info: gsttheoradec.c(812): theora_handle_data_packet ():
/GstPipeline:pipeline0/GstTheoraDec:theoradec0: no header sent yet
"

I also tested a version where the AVPacket was decoded to an AVFrame by libav
and then sent to the GStreamer pipeline, and it WORKS:

-- THREAD 1 --
// Decode.
avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
// Take data from AVFrame
img_data = (guchar *)pFrame->data[0];
size = av_image_get_buffer_size(AV_PIX_FMT_BGR24, 1920, 1080, 1);
// same way as before.
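
One step is elided above ("same way as before" presumably covers the buffer copy and
push): the caps below advertise packed BGR, while Theora decodes to a planar YUV
format, so if pFrame is not already converted at this point, a pixel-format conversion
has to happen somewhere. A hypothetical sketch of that step with libswscale; pFrameBGR
and sws_ctx are names introduced here, not taken from the original code:

// Hypothetical conversion step (not in the original snippet): convert the
// decoded frame into a packed BGR24 frame matching the raw caps below.
AVFrame *pFrameBGR = av_frame_alloc();
pFrameBGR->format = AV_PIX_FMT_BGR24;
pFrameBGR->width  = 1920;
pFrameBGR->height = 1080;
av_frame_get_buffer(pFrameBGR, 1);   // tightly packed, 1-byte alignment

struct SwsContext *sws_ctx = sws_getContext(pCodecCtx->width, pCodecCtx->height, pCodecCtx->pix_fmt,
                                            1920, 1080, AV_PIX_FMT_BGR24,
                                            SWS_BILINEAR, NULL, NULL, NULL);
sws_scale(sws_ctx, (const uint8_t * const *)pFrame->data, pFrame->linesize,
          0, pCodecCtx->height, pFrameBGR->data, pFrameBGR->linesize);

img_data = (guchar *)pFrameBGR->data[0];   // contiguous BGR24 pixels to copy into the GstBuffer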

-- THREAD 2 --
const gchar *video_caps = "video/x-raw, format=BGR, width=1920, height=1080, framerate=30/1";
string = g_strdup_printf("appsrc name=testsource caps=\"%s\" ! videoconvert ! autovideosink", video_caps);

All this is tested on this video file:
container: ogg
codec: Theora
dim: 1920x1080
framerate: 30 fps

Why does sending the AVFrame to the GStreamer pipeline work, but not the AVPacket?