[FFmpeg-devel] Unified CPU Endianness for framecrcenc
Sun Jan 27 05:19:20 CET 2008
On Sun, Jan 27, 2008 at 04:50:51AM +0100, Michael Niedermayer wrote:
> On Sat, Jan 26, 2008 at 03:21:19PM -0800, Mike Melanson wrote:
> > Hi,
> > I am using libavformat/framecrcenc.c for automated testing on the FATE
> > Server ( http://fate.multimedia.cx ). For those who don't know, '-f
> > framecrc' is a neat tool that "muxes" a file by simply running the
> > Adler-32 checksum over incoming packets and printing some frame stats
> > to stdout. It's great for validating bit accuracy for file/codec types
> > that are expected to be bit exact.
> > I am having trouble with certain formats, however. For example, if the
> > data is decoded as 15- or 16-bit RGB/BGR, the data will be stored in CPU
> > endianness. Would it be acceptable to have a special "#ifdef BIG_ENDIAN"
> > case in the muxer that swaps bytes before running the CRC? Or is there a
> > better solution to the problem? I'm asking before I do the coding work.
> convert to rgb24 or bgr24 maybe ...
If the format of frames being sent to the _muxer_ depends on the host
cpu endianness, something is seriously broken. LE/BE versions of
RGB15/16 should be considered separate formats...