[FFmpeg-devel] [PATCH] ac3dec: fix non-optimal dithering of zero bit mantissas
Reimar.Doeffinger at gmx.de
Sat Jan 5 11:46:19 CET 2013
On Sat, Jan 05, 2013 at 11:37:55AM +0100, madshi wrote:
> Hey guys,
> the latest revision of the (E-)AC3 spec says:
> > The optimum scaling for the dither words is to take a
> > uniform distribution of values between –1 and +1, and
> > scale this by 0.707, resulting in a uniform distribution
> > between +0.707 and –0.707
> Currently the AC3 decoder applies a dithering of only +0.5 .. -0.5.
> According to the spec this is "also acceptable", but it's not the "optimum
> scaling". The effects of this non-optimal dithering are clearly visible in
> frequency graphs. Basically high frequencies are lower in volume than they
> should be. You can test this yourself by looking at the following files:
> Load the WAV files into Audacity, then for each WAV in the Audacity menu
> choose "analyze -> frequency analysis". You can see that the "azid" and
> "liba52" decoders have a better high frequency response than the "libav"
> decoder. However, look at "libav_patched" which shows the results when
> using the proper amount of dithering in the libav AC3 decoder.
> Attached is a patch which modifies the AC3 decoder to apply the optimum scaling.
> (av_lfg_get() does return values in the range 0..FFFFFFFF, correct?)
Not sure about lfg (though IMHO it should be replaced by an lcg anyway,
to not needlessly waste CPU time), but usually the lower bits contain
more "randomness". Thus you should generally use some kind of modulo
operation instead of a division.
In addition, doing division/modulo by a value that is not a power of 2
will introduce some bias. It might not be relevant, but it should be
analyzed to make sure it truly isn't.