[FFmpeg-devel] Unified CPU Endianness for framecrcenc
Sun Jan 27 00:21:19 CET 2008
I am using libavformat/framecrcenc.c for automated testing on the FATE
Server ( http://fate.multimedia.cx ). For those who don't know, '-f
framecrc' is a neat tool that "muxes" a file by simply running the
Adler-32 checksum over incoming packets and printing some frame stats to
stdout. It's great for validating bit accuracy for file/codec types that
are expected to be bit exact.
I am having trouble with certain formats, however. For example, if the
data is decoded as 15- or 16-bit RGB/BGR, the data will be stored in CPU
endianness. Would it be acceptable to have a special "#ifdef BIG_ENDIAN"
case in the muxer that swaps bytes before running the CRC? Or is there a
better solution to the problem? I'm asking before I do the coding work.
There is also the problem with the endian-order palette on PAL8 frames,
but I want to address the 15/16-bit problem first. This also affects