[FFmpeg-devel] Trying to implement possibility to pause encoding

L. Rahyen
Sat Aug 29 06:53:02 CEST 2009


	At first the problem seems trivial, but unfortunately it isn't (obvious solutions like the SIGSTOP/SIGCONT signals are not useful for my problem).
	First, I will explain why I'm trying to implement the possibility to pause encoding. I often need to record screencasts. Usually I show how to do something in my video, but I often need to skip some useless parts (for example, parts where the computer is busy and nothing interesting happens for a possibly long time). Of course I could record everything to one file and then remove the unwanted parts later in video-editing software. However, this is slow and time-consuming. Since I almost always know exactly when I need to pause during screencast recording and when I need to continue, it seems natural to have the possibility to pause and resume whenever I want (I'm going to use a simple wrapper around ffmpeg so I can use global shortcuts instead of pressing "p" (for pause) or "q" in the console, but this is irrelevant here). This way I will not need to edit and re-encode my video in most cases (and in the cases where I still need to edit it, my work will be much easier). To record a screencast I use this command:

ffmpeg -f x11grab -r 4 -s 1680x1050 -i :0.0 -vcodec libx264 -vpre screencast \
~/out.avi

	What I tried is to add these lines to ffmpeg.c (please see the attached patch for more information):

+            if (key == 'p')
+                while ((key = read_key()) != 'p')
+                    usleep(100000);
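	For reference, here is a small standalone test program (not FFmpeg code; read_key() below is only a rough approximation of the helper in ffmpeg.c, using a non-blocking terminal read) that shows the same pause loop on its own:

/* Standalone sketch of the pause loop; assumes a POSIX terminal.
 * read_key() here only approximates ffmpeg.c's helper. */
#include <stdio.h>
#include <unistd.h>
#include <termios.h>

static int read_key(void)
{
    struct termios oldt, newt;
    unsigned char ch;
    ssize_t n;

    tcgetattr(STDIN_FILENO, &oldt);
    newt = oldt;
    newt.c_lflag &= ~(ICANON | ECHO);   /* per-character input, no echo */
    newt.c_cc[VMIN]  = 0;               /* do not block if no key is pressed */
    newt.c_cc[VTIME] = 0;
    tcsetattr(STDIN_FILENO, TCSANOW, &newt);

    n = read(STDIN_FILENO, &ch, 1);

    tcsetattr(STDIN_FILENO, TCSANOW, &oldt);
    return n == 1 ? ch : -1;
}

int main(void)
{
    int frame = 0, key;

    for (;;) {
        key = read_key();
        if (key == 'q')
            break;
        if (key == 'p')                 /* first 'p' pauses, second 'p' resumes */
            while ((key = read_key()) != 'p')
                usleep(100000);
        printf("grabbing/encoding frame %d\n", frame++);
        usleep(250000);                 /* pretend we grab at 4 fps */
    }
    return 0;
}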

	And this patch seems to work... for everything except screencast recording! Since I need this precisely for screencast recording, it isn't good enough.
	For example, if (after applying the attached patch and running the command mentioned earlier) I press "p" at frame #30, wait for 15 seconds, press "p" again and record an additional 30 frames (let's assume a frame rate of 1 fps for this example), then I get 60 frames, as expected. However, when I try to play the file I get 75 seconds of video instead of 60: the first 30 frames play as expected, then playback is stuck for 15 seconds and continues normally after that; but the expected result was a video at 1 fps for the whole file. Just imagine how bad the result might be if encoding was paused for dozens of minutes or even a few hours! I perfectly understand the idea behind this behavior: if some frames weren't captured, the captured video's duration still keeps a 1:1 relation with real time. This is a good thing unless I want to pause screencast recording and resume later.
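	To illustrate what seems to be happening, here is a toy program (not FFmpeg code, just mirroring the 1 fps example above) that stamps each frame with wall-clock time the way the grabber appears to:

#include <stdio.h>

int main(void)
{
    const double frame_interval = 1.0;   /* 1 fps, as in the example above */
    const double pause_len      = 15.0;  /* the 15 s pause before frame 31 */
    double wallclock = 0.0;
    double pts[60];
    int i;

    for (i = 0; i < 60; i++) {
        if (i == 30)                     /* nothing is grabbed while paused */
            wallclock += pause_len;
        pts[i] = wallclock;              /* frame stamped with wall-clock time */
        wallclock += frame_interval;
    }
    printf("delay between frame 30 and 31: %.0f s instead of %.0f s\n",
           pts[30] - pts[29], frame_interval);
    printf("total playback time: %.0f s for 60 frames\n",
           pts[59] - pts[0] + frame_interval);
    return 0;
}

	Running it gives a 16 second delay between those two frames (the normal 1 second plus the 15 second pause) and 75 seconds of total playback, which matches what I see.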
	I looked into libavdevice/x11grab.c but I'm not sure what the right way is to implement what I want. As you can see, I just want to limit the maximum time delay to 1/fps between two specific frames: the one right before the pause was activated and the one right after it was deactivated. For example, if the frame rate is 4 fps, then the time delay between these two frames in the recorded video must not be more than 0.25 seconds (for all other frames the current behavior should be used for determining the time delay). This would make it possible to effectively skip unwanted parts during screencast recording.
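	One idea I have (completely untested; the names below are just placeholders, not actual x11grab code) is to let the grabber keep a running total of the time spent paused and subtract it from the wall-clock value used to stamp each frame, so that the first frame after resuming lands exactly 1/fps after the last frame before the pause:

/* Hypothetical sketch, not actual x11grab code: compensate timestamps
 * for the accumulated pause time so the gap across a pause collapses
 * to the normal frame interval. */
#include <stdio.h>
#include <stdint.h>

typedef struct GrabState {
    int64_t frame_interval;   /* 1/fps in microseconds */
    int64_t paused_total;     /* total time spent paused, in microseconds */
    int64_t pause_started;    /* wall-clock time when the current pause began */
    int64_t last_pts;         /* pts assigned to the previous frame */
} GrabState;

static void pause_begin(GrabState *s, int64_t now)   /* first 'p' */
{
    s->pause_started = now;
}

static void pause_end(GrabState *s, int64_t now)     /* second 'p' */
{
    s->paused_total += now - s->pause_started;
    s->pause_started = 0;
}

/* stamp a captured frame: wall-clock time minus everything spent paused */
static int64_t stamp_frame(GrabState *s, int64_t now)
{
    int64_t pts = now - s->paused_total;

    /* safety net: never let two frames end up closer than 1/fps */
    if (s->last_pts && pts < s->last_pts + s->frame_interval)
        pts = s->last_pts + s->frame_interval;
    s->last_pts = pts;
    return pts;
}

int main(void)
{
    GrabState s = { .frame_interval = 250000 };      /* 4 fps, as in my command */
    int64_t now = 0;

    printf("frame at %lld us\n", (long long)stamp_frame(&s, now));
    now += s.frame_interval;
    printf("frame at %lld us\n", (long long)stamp_frame(&s, now));

    pause_begin(&s, now);                            /* user presses 'p' */
    now += 15 * 1000000LL;                           /* 15 seconds pass, nothing grabbed */
    pause_end(&s, now);                              /* user presses 'p' again */

    now += s.frame_interval;
    printf("frame at %lld us (only 1/fps after the previous one)\n",
           (long long)stamp_frame(&s, now));
    return 0;
}

	In the real code the wall-clock value would come from whatever the grabber already uses (av_gettime(), as far as I can tell), and pause_begin()/pause_end() would be triggered by the key handling shown earlier. But I don't know whether this is the right place to do it.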

	Any ideas what is the correct way to implement this?

	Thank you.

	P.S. Please note that this is the first time I have looked into the FFmpeg source code and tried to change something in it. I'm sorry if I'm wrong about something or missing something obvious.




-------------- next part --------------
A non-text attachment was scrubbed...
Name: patch.diff
Type: text/x-patch
Size: 710 bytes
Desc: not available
URL: <http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/attachments/20090829/c57c3267/attachment.bin>


