[Libav-user] get RGB values from ffmpeg frame

René J.V. Bertin rjvbertin at gmail.com
Tue Nov 20 09:53:30 CET 2012


Have you looked at the tutorial on using FFmpeg with SDL, and/or the ffplay source code? Another thing to check out would be the Motion JPEG encoder - AFAIK JPEG images are in RGB space.
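
Roughly, the conversion those examples do looks like this - an untested sketch only, assuming pCodecCtx and pFrame come from the usual avformat/avcodec setup, and using the same avpicture_*/PIX_FMT_* API as the code quoted below:

    #include <libavcodec/avcodec.h>
    #include <libswscale/swscale.h>

    /* Destination frame backed by a packed-RGB24 buffer. */
    AVFrame *frameRGB = avcodec_alloc_frame();
    int numBytes = avpicture_get_size(PIX_FMT_RGB24, pCodecCtx->width, pCodecCtx->height);
    uint8_t *rgbBuffer = (uint8_t *) av_malloc(numBytes);
    avpicture_fill((AVPicture *) frameRGB, rgbBuffer, PIX_FMT_RGB24,
                   pCodecCtx->width, pCodecCtx->height);

    /* One scaler context for the whole stream: decoder pixel format -> RGB24, same size. */
    struct SwsContext *sws = sws_getContext(pCodecCtx->width, pCodecCtx->height, pCodecCtx->pix_fmt,
                                            pCodecCtx->width, pCodecCtx->height, PIX_FMT_RGB24,
                                            SWS_BILINEAR, NULL, NULL, NULL);

    /* After avcodec_decode_video2() has set frameFinished for pFrame: */
    sws_scale(sws, (uint8_t const * const *) pFrame->data, pFrame->linesize,
              0, pCodecCtx->height, frameRGB->data, frameRGB->linesize);
    /* frameRGB->data[0] now holds the pixels as packed R,G,B bytes,
     * one row every frameRGB->linesize[0] bytes. */

The same SwsContext can be reused for every frame of the stream; free it with sws_freeContext() when you are done.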

R

Navin <nkipe at tatapowersed.com> wrote:

>I've been trying these techniques, but I keep getting crashes. Help please:
>
>         avpicture_fill((AVPicture*) frame2, frame2_buffer, PIX_FMT_RGB24, width2, height2);
>         //printf("data = %d, %d, %d, %d\n", frame2->data[0], frame2->data[1], frame2->data[2], frame2->data[3]);
>         //printf("linesize = %d %d %d\n", frame2->linesize[0], frame2->linesize[1], frame2->linesize[2]);
>         //printf("width = %d\n", pCodecCtx->width);
>         //printf("height = %d\n", pCodecCtx->height);
>         //std::cin.get();
>
>         int linesize = frame2->linesize[0];
>         for(int xx = 0; xx < (linesize * width1)-1; xx += 3)
>         {
>             int r = frame2->data[0][xx];
>             int g = frame2->data[0][xx+1];
>             int b = frame2->data[0][xx+2];
>             printf("xx=%d                 r=%d, g=%d, b=%d \n",xx, r, 
>g, b);
>         }
>         printf("frame%d done----------------",i++);
>         //for(int xx = 0; xx < width1; xx = xx + 3)
>         //{
>         //    for(int yy = 0; yy < height1; ++yy)
>         //    {
>         //        //int p = xx*3 + yy*frame2->linesize[0];
>         //        //int p = xx * 3 + yy * linesize;
>         //        printf("yy=%d xx=%d",yy,xx);
>         //        int p = yy * linesize + xx;
>         //        printf("p=%d\n",p);
>         //        int r = frame2->data[0][p];
>         //        int g = frame2->data[0][p+1];
>         //        int b = frame2->data[0][p+2];
>         //        printf("[r=%d, g=%d, b=%d ]\n", r, g, b);
>         //    }//for
>         //}//for
>
>Nav
>
>On 11/20/2012 8:52 AM, Nav wrote:
>> Hi! Glad to be part of this mailing list.
>> What I wanted to create was a program which would receive streaming
>> video and, when it decodes a frame of the video into either a bitmap
>> format or just pure RGB (perhaps stored in a char array), would notify
>> another program that it has received a frame; the other program
>> would take the RGB values and display them.
>> I've already asked this question here: 
>> http://ffmpeg.zeranoe.com/forum/viewtopic.php?f=15&t=805
>> and rogerdpack told me to post my question on the libav mailing list.
>> I have been through many websites, but they either use img_convert
>> (which doesn't work) or sws_scale, which crashes when I try to use it
>> with RGB.
>> Could anyone help with a complete piece of code which can give me the
>> RGB values of a frame?
>>
>> This is a part of the YUV conversion that I tried initially.
>>
>>   i=0;
>>   while(av_read_frame(pFormatCtx, &packet) >= 0)
>>   {
>>     // Is this a packet from the video stream?
>>     if(packet.stream_index==videoStream)
>>     {
>>     // Decode video frame
>>     avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
>>
>>     // Did we get a video frame?
>>     if(frameFinished)
>>     {
>>         // Convert the image into YUV format that SDL uses
>>         sws_scale( sws_ctx, (uint8_t const * const *)pFrame->data, 
>> pFrame->linesize, 0, pCodecCtx->height, pict.data, pict.linesize );
>>
>>
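One likely reason the loop in the quoted code crashes: an RGB24 picture filled with avpicture_fill() only holds linesize * height bytes, but the loop walks linesize * width1 bytes (and mixes width1 with the width2/height2 that were used to fill frame2), so it can read well past the end of frame2_buffer. A minimal per-row sketch, assuming frame2_buffer was sized with avpicture_get_size(PIX_FMT_RGB24, width2, height2) and frame2 was actually filled with RGB24 data by sws_scale:

    int linesize = frame2->linesize[0];   /* bytes per row; not guaranteed to be 3 * width2 */
    for (int y = 0; y < height2; ++y)
    {
        uint8_t *row = frame2->data[0] + y * linesize;
        for (int x = 0; x < width2; ++x)
        {
            /* packed RGB24: three bytes per pixel, in R, G, B order */
            int r = row[3 * x];
            int g = row[3 * x + 1];
            int b = row[3 * x + 2];
            printf("x=%d y=%d  r=%d g=%d b=%d\n", x, y, r, g, b);
        }
    }

Iterating row by row through linesize, rather than treating the buffer as one flat width-based array, keeps the reads inside the picture even when the rows are padded.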


