[FFmpeg-devel] [PATCH] Apple RPZA encoder
Thu Apr 16 15:23:28 CEST 2009
> I took some time off to wade through that doc. From what I understand,
> I'll need a metric to represent the distortion and output rate.
> Assuming I use a simple variant like least squares or sum of absolute
> differences or whatever, I still don't understand how I can visualize
> an RD curve for this (rpza) case. Do I fix the output size and
> iterate over possible values for the Lagrangian parameter to minimize
> the bit cost? If I do that, I still don't quite get how I can
> correlate that bit cost to rpza block coding. Any insights appreciated
RD = Lambda * Bits + Distortion
For any given lambda, there is an optimal RD score.
Thus, there are two (simple) ways to encode:
1. Constant lambda; keep lambda the same everywhere and accept whatever
bitrate falls out. Effectively constant quality.
2. Constant bitrate; adjust lambda to get a target bitrate.
In MPEG-like formats, lambda is usually chosen as a per-quantizer
constant; the exact mapping of lambda to quantizer is codec-specific,
but in theory lambda grows roughly with the square of the quantizer
step size.