[FFmpeg-trac] #8902(avcodec:new): VAAPI pixel format support regression between FFmpeg 3.x and 4.x
FFmpeg
trac at avcodec.org
Sat Sep 19 19:13:27 EEST 2020
#8902: VAAPI pixel format support regression between FFmpeg 3.x and 4.x
-------------------------------------+-------------------------------------
Reporter: bmegli | Type: defect
Status: new | Priority: normal
Component: avcodec | Version: 4.2
Keywords: vaapi, | Blocked By:
pixel format |
Blocking: | Reproduced by developer: 0
Analyzed by developer: 0 |
-------------------------------------+-------------------------------------
Summary of the bug:
In 3.x (e.g. 3.4 in Ubuntu 18.04) it is possible to use formats like bgr0
as the frames context sw_format when encoding with libavcodec (e.g. for
4:4:4 input chroma sampling).
In 4.x (e.g. 4.2 in Ubuntu 20.04) this is no longer possible and fails
with an error (e.g. for H.264):
{{{
[h264_vaapi @ 0x55af43d3ee40] No usable encoding profile found.
}}}
In both 3.x and 4.x, av_hwframe_transfer_get_formats reports these
formats as uploadable for VAAPI.
The cause appears to be the dynamic VAAPI profile selection introduced in
[https://github.com/FFmpeg/FFmpeg/commit/3b188666f19a17d15efb7eae590e988832972666
commit], present since FFmpeg 4.1.
The profile auto-detection matches the software format's depth, number of
components, log2_chroma_w and log2_chroma_h against the supported codec
profiles, and then maps the result to a VAAPI profile. Naturally, a format
like bgr0 is out of luck: most VAAPI encoders do not support 4:4:4 chroma
sampling. In 3.4 a surface in such a pixel format would be accepted and
encoded (obviously not as 4:4:4 but as 4:2:0 or 4:2:2).
The relevant vaapi_encode.c fragment from
[https://github.com/FFmpeg/FFmpeg/blob/c67bdd6534a0ee67c0d18aed0e3345e59464254f/libavcodec/vaapi_encode.c#L1330
current master]:
{{{
    desc = av_pix_fmt_desc_get(ctx->input_frames->sw_format);
    if (!desc) {
        av_log(avctx, AV_LOG_ERROR, "Invalid input pixfmt (%d).\n",
               ctx->input_frames->sw_format);
        return AVERROR(EINVAL);
    }
    depth = desc->comp[0].depth;

    // some code omitted for clarity

    for (i = 0; (ctx->codec->profiles[i].av_profile !=
                 FF_PROFILE_UNKNOWN); i++) {
        profile = &ctx->codec->profiles[i];
        if (depth != profile->depth ||
            desc->nb_components != profile->nb_components)
            continue;
        if (desc->nb_components > 1 &&
            (desc->log2_chroma_w != profile->log2_chroma_w ||
             desc->log2_chroma_h != profile->log2_chroma_h))
            continue;
        if (avctx->profile != profile->av_profile &&
            avctx->profile != FF_PROFILE_UNKNOWN)
            continue;

#if VA_CHECK_VERSION(1, 0, 0)
        profile_string = vaProfileStr(profile->va_profile);
#else
        profile_string = "(no profile names)";
#endif

        for (j = 0; j < n; j++) {
            if (va_profiles[j] == profile->va_profile)
                break;
        }
        if (j >= n) {
            av_log(avctx, AV_LOG_VERBOSE, "Compatible profile %s (%d) "
                   "is not supported by driver.\n", profile_string,
                   profile->va_profile);
            continue;
        }

        ctx->profile = profile;
        break;
    }
    if (!ctx->profile) {
        av_log(avctx, AV_LOG_ERROR, "No usable encoding profile found.\n");
        err = AVERROR(ENOSYS);
        goto fail;
    }
}}}
The problem is that there is currently no way to use a pixel format like
bgr0, even if the encoding profile is specified manually.
How to reproduce:
The easiest way to reproduce is to use doc/examples/vaapi_encode.c
with
[https://github.com/FFmpeg/FFmpeg/blob/f30a41a6086eb8c10f66090739a2a4f8491c3c7a/doc/examples/vaapi_encode.c#L56
different sw_format], e.g.:
{{{
//frames_ctx->sw_format = AV_PIX_FMT_NV12;
frames_ctx->sw_format = AV_PIX_FMT_BGR0;
}}}
The rest of the example doesn't need to be modified (it is never reached,
because the automatic profile selection fails first).
Note that with 3.4 this example additionally requires a call to
avcodec_register_all to work (deprecated and a no-op in 4.2).
If this new VAAPI behaviour is intentional, what is now the correct way to
work with bgr0 VAAPI surfaces while keeping hardware format conversions
(as in 3.x)?
--
Ticket URL: <https://trac.ffmpeg.org/ticket/8902>
FFmpeg <https://ffmpeg.org>
FFmpeg issue tracker