[FFmpeg-devel] [PATCH 1/6] Frame-based multithreading framework using pthreads
Sat Jan 22 07:25:08 CET 2011
On Fri, Jan 21, 2011 at 10:18 PM, Rob <robert.swain at gmail.com> wrote:
> On 22 January 2011 07:10, Alex Converse <alex.converse at gmail.com> wrote:
>> On Fri, Jan 21, 2011 at 10:05 PM, Rob <robert.swain at gmail.com> wrote:
>>> On 21 January 2011 23:38, Alexander Strange <astrange at ithinksw.com> wrote:
>>>> On Jan 21, 2011, at 9:58 AM, Ronald S. Bultje wrote:
>>>>> On Fri, Jan 21, 2011 at 5:51 AM, Alexander Strange
>>>>> <astrange at ithinksw.com> wrote:
>>>>>> It really simplified development to do it like this, because it made the frame delay reliable
>>>>>> and made sure all the threads allocated got started. When you're decoding stuff read off
>>>>>> disk, multiple calls to avcodec_decode_video are practically instantaneous, so it doesn't
>>>>>> help speed to return things faster. The only case I can think of where less decode delay
>>>>>> _really_ helps is where it receives new encoded frames rarely.
>>>>> Realtime streams?
>>>> Yeah, but realtime streams where each frame is more time-sensitive than just TV. Security cameras?
>>> Medical cameras?
> Is that just advertising or something more? I can't tell from a quick
> glance at the website. In any case there's OnLive (games rendered on
> servers and streamed as video to an STB).
Gaikai is low-latency streaming tech used to advertise products
(games, applications, etc.) by letting publishers pay for users to
demo their software (on the order of ~1 cent per minute).
They use ffmpeg in their client. Frame-based threading is completely
irrelevant for them because it's far too high-latency; they would at
most use slice-based threading.
(And I work for them.)