[FFmpeg-devel] [RFC] libswscale palette output implementation

Vitor Sessak vitor1001
Sat Jan 2 20:09:53 CET 2010


Michael Niedermayer wrote:
> On Sat, Jan 02, 2010 at 01:16:23AM +0100, Vitor Sessak wrote:
>> Stefano Sabatini wrote:
>>> On date Friday 2010-01-01 21:55:44 +0100, Michael Niedermayer encoded:
>>>> On Fri, Jan 01, 2010 at 02:15:05PM +0100, Stefano Sabatini wrote:
>>>>> On date Thursday 2009-12-31 19:41:44 +0200, Kostya encoded:
>>>>>> On Thu, Dec 31, 2009 at 05:28:24PM +0100, Stefano Sabatini wrote:
>>>>>>> Hi,
>>>>>>>
>>>>>>> related thread:
>>>>>>> http://thread.gmane.org/gmane.comp.video.ffmpeg.devel/80845/focus=82531
>>>>>>>
>>>>>>> Kostya's idea is to use Vitor's ELBG implementation in
>>>>>>> libavcodec/elbg.{h,c}; the first step would be to move it to lavu,
>>>>>>> where it can be used by lsws. This shouldn't cause any ABI / API
>>>>>>> issues, since the API is only internal (it would only require a
>>>>>>> dependency change of lavc on lavu, but I may be wrong here).
>>>>>> Just don't forget one point (I may be wrong here though): the scaling
>>>>>> code may be called on a slice or a single line, so you somehow need to
>>>>>> ensure it processes the whole picture. Maybe just having a single
>>>>>> filter for that is better. It could also be improved to produce
>>>>>> palettes with minimal differences between consecutive frames, etc.
>>>>> Michael, what about the filter option? I'm not even sure the
>>>>> libswscale solution would be viable, since the usage required here
>>>>> conflicts with the slice API.
>>>> It would be very annoying if one pixel format was a special case that
>>>> couldn't be handled by swscale.
>>>> There are projects using swscale that do not use lavfi.
>>> Mmh OK, do you already have some ideas about how to make sws_scale()
>>> manage such a thing?
>>> I mean: sws_scale() is passed a slice as input, but it needs the whole
>>> image in order to compute the quantization palette, while it is
>>> supposed to immediately write the scaled slice to the output.
>>> Also, how would it be possible to request the filter/sws_scale() to
>>> keep the same palette, and/or to use a palette provided by the user
>>> rather than computing one for each frame?
>>> Attached is a lazy implementation of a quantization filter using
>>> ELBG.
>> In case anyone is curious, I've tested it against a few different
>> programs:
>>
>> Original image:
>> http://sites.google.com/site/vsessak2/home/original.jpg
>>
>> This patch:
>> http://sites.google.com/site/vsessak2/home/ffmpeg_output.bmp
>>
>> pnmquant:
>> http://sites.google.com/site/vsessak2/home/pnmquant.bmp
>>
>> ffmpeg -vfilters "format=rgb8":
>> http://sites.google.com/site/vsessak2/home/ffmpeg_rgb8.bmp
>>
>> gimp without dithering:
>> http://sites.google.com/site/vsessak2/home/gimp.bmp
>>
>> gimp with dithering:
>> http://sites.google.com/site/vsessak2/home/gimp_dither.bmp
>>
> 
>> Interestingly, with just clustering (as in this patch) the image suffers
>> less from the lack of dithering than I would have expected.
> 
> This is not unexpected.
> 
> With clustering as done here, the algorithm tries to select colors so as
> to minimize the error between the best-matching available color and what
> should be represented. That is pretty much optimal for the non-dithered
> case, but it is not optimal for the dithered case.
> To see the problem with clustering and dithering, consider a cartoon
> image made of 5% black, 10% dark gray, 40% mid gray, 30% light gray and
> 15% white. Clustering this to 2 colors will give you mid gray and light
> gray, which cannot reasonably reproduce the remaining 30% of the image;
> with dithering, black and white would have been a better choice, since
> they can be mixed to approximate any gray.
> If you implement error-diffusion dithering and find an image where it
> doesn't work too well (maybe use a palette with few colors), then you
> could try to give a stronger weight to the input colors that cannot be
> represented well and run the clustering a second time with the updated
> weights. This should not be hard to implement.

I understand this point (you already made it when Kostya first suggested 
using ELBG to palettize). What I meant is that I expected the image to 
look bad without dithering, no matter which palette is used (with large 
gradients turning into bands of solid color). It turns out 256 colors is 
almost enough to give a good picture, even though I chose a particularly 
colorful sample.

-Vitor


