[FFmpeg-devel] Shared Thread Pool

Malcolm Bechard malcolm.bechard at gmail.com
Thu Apr 12 19:08:50 EEST 2018


Hey,

I'd like to restart the conversation about a Shared Thread Pool in FFmpeg.
I found a past conversation about it here:

https://ffmpeg.org/pipermail/ffmpeg-devel/2016-January/186770.html.

As far as I can tell, FFmpeg still does not have a shared thread pool, but I
apologize if this has already been solved and I missed the solution.


One of the contentious points in that earlier conversation was that there
were few real-world examples where FFmpeg's current behavior of spawning new
threads for each decoder causes problems. I'll provide another example:


Live music shows often have a VJ (video jockey) that is performing live
along with the performer/band. This is particularly true for electronic
music, but is becoming more and more common for other music styles as well.

The application the VJ uses lets them choose from banks of pre-rendered video
and blend/mix the layers of video in real time. Which video gets played is
selected in real time by the VJ, depending on what they want to show the
audience and what the performer is doing.

To make this possible, the app needs a large bank of videos (50+) already
open and ready to play. These videos cannot be opened at the moment the VJ
decides to play them; the 100ms+ it takes to open a video and allocate its
resources is far too slow for a real-time performance. So although the VJ
isn't playing 50+ videos at the same time, all the resources those videos
need to play at an instant's notice must already be allocated. Using a thread
pool would greatly reduce the number of threads that need to be allocated
here.


That bank of 50 videos is also only one of many banks; the VJ may switch
banks and load another set of 50 videos. Not having to spawn and kill threads
for every video that is opened would make opening and closing videos faster
when switching banks.


Most important, though, is decode performance. Since H.264/H.265 decoding is
not constant-time, it's impossible to know in advance how many threads a
given video needs in order to play back in real time. Some may require 8,
some may require 1, depending on the content. To be safe, the app has to
provision enough threads for a worst-case decode at that particular
resolution. The VJ can mix all kinds of different videos at the same time:
multiple 4K videos, a mix of 1080p and 4K, some even smaller ones. It could
be 2-4 heavy files or 12 light ones, or anything in between. Without a thread
pool this causes heavy oversubscription of the CPU cores, reducing
performance. Avoiding oversubscription is exactly why most modern parallel
applications use thread pools.
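
To make the oversubscription concrete, here is a minimal sketch of how that
worst-case provisioning looks with today's libavcodec API, where every codec
context owns a private pool of worker threads. Error handling is omitted and
the thread counts are only illustrative:

    #include <libavcodec/avcodec.h>

    /* Open one entry of a bank. Each context gets its own worker threads,
     * sized for the heaviest clip it might have to decode, even though only
     * a handful of clips are actually decoding at any one time. */
    AVCodecContext *open_bank_entry(const AVCodec *dec,
                                    const AVCodecParameters *par)
    {
        AVCodecContext *ctx = avcodec_alloc_context3(dec);
        avcodec_parameters_to_context(ctx, par);

        ctx->thread_count = 8;                      /* worst-case guess */
        ctx->thread_type  = FF_THREAD_FRAME | FF_THREAD_SLICE;

        avcodec_open2(ctx, dec, NULL);              /* spawns the threads */
        return ctx;
    }

Fifty such contexts provisioned at 8 threads each means hundreds of mostly
idle worker threads competing for the same handful of cores, which is exactly
the oversubscription described above.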


To be clear, this use case is already happening, and has been for a long
time. Adding thread pool support would push the limits of what it can do even
further; the need is definitely there.


Is there any interest in adding thread pool capabilities to FFmpeg? If I were
to take on this work, would it be accepted into the code base? Ideally the
API would let the host application provide its own thread pool (via
callbacks), to avoid having two thread pools allocated and contending with
each other for the cores.

I'm perfectly fine with this being entirely optional, and with implementing
it so that the existing threading workflow keeps working unchanged, if there
is a desire for that.
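
To make the callback idea concrete, here is a rough sketch of what such a
hook could look like. Every name below (AVExecutorCallbacks,
avcodec_set_executor) is hypothetical and only meant to illustrate the shape
of the API, not to propose the final one:

    #include <libavcodec/avcodec.h>

    /* Hypothetical: callbacks through which libavcodec submits work to the
     * host application's own thread pool instead of spawning threads. */
    typedef struct AVExecutorCallbacks {
        void *opaque;   /* handle to the host application's thread pool */

        /* Submit 'count' independent jobs to the host pool and block until
         * all of them have finished. The pool calls func(opaque, arg, job_nb)
         * for each job_nb in [0, count). Returns 0 on success, a negative
         * AVERROR on failure. */
        int (*execute)(void *opaque,
                       int (*func)(void *opaque, void *arg, int job_nb),
                       void *arg, int count);
    } AVExecutorCallbacks;

    /* Hypothetical: attach the host's executor to a codec context before
     * avcodec_open2(). If no executor is attached, the codec spawns its own
     * worker threads exactly as it does today, so existing behaviour stays
     * untouched. */
    int avcodec_set_executor(AVCodecContext *avctx,
                             const AVExecutorCallbacks *cb);

This is loosely modelled on the existing per-context execute/execute2
callbacks in AVCodecContext, except that one executor would be shared across
all open contexts, so the total number of worker threads is bounded by the
host pool rather than by the number of open videos.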


Thanks!


Malcolm

