[FFmpeg-devel] regression - mpeg2 interlace/topfield flags not set anymore

christophelorenz
Sun Jan 27 01:26:40 CET 2008

Ok, I found the source of the problem...

When flags are parsed, their values are represented internally as a
number but passed around as a string.

The problem is that the flags are serialized as hex values such as
0x20140000, while the other option types use base 10.
So when ff_eval2 wants to parse the flag value back, it has to parse hex.
I didn't go further into ff_eval2, but I can say that under linux it can
parse hex while under win32 it cannot: ff_eval2("0x20140000"....)=0

So the fix is either to make ff_eval2 handle hex under win32, or to
return the flags as base-10 numbers.

The base-10 fix consists of a one-line change in opt.c.
I can't tell whether it has other implications elsewhere in the code,
however.

Index: /ffmpeg/libavcodec/opt.c
--- /ffmpeg/libavcodec/opt.c    (revision 11625)
+++ /ffmpeg/libavcodec/opt.c    (working copy)
@@ -225,7 +225,7 @@
     if(o_out) *o_out= o;
-    case FF_OPT_TYPE_FLAGS:     snprintf(buf, buf_len, "0x%08X",*(int    *)dst);break;
+    case FF_OPT_TYPE_FLAGS:     snprintf(buf, buf_len, "%d" , *(int    *)dst);break;
     case FF_OPT_TYPE_INT:       snprintf(buf, buf_len, "%d" , *(int    *)dst);break;
     case FF_OPT_TYPE_INT64:     snprintf(buf, buf_len, "%"PRId64, *(int64_t*)dst);break;
     case FF_OPT_TYPE_FLOAT:     snprintf(buf, buf_len, "%f" , *(float  *)dst);break;
