[FFmpeg-devel] [PATCH 1/1] This change adds an encoder for camera motion metadata. This is a type of sensor data associated with video, such as GPS, acceleration, gyro, and camera orientation. It does not encode video itself, but rather this metadata.

wm4 nfxjfg at googlemail.com
Thu Jul 13 22:40:17 EEST 2017


On Thu, 13 Jul 2017 12:05:15 -0700
"Louis O'Bryan" <louiso-at-google.com at ffmpeg.org> wrote:

> On Thu, Jul 13, 2017 at 1:34 AM, wm4 <nfxjfg at googlemail.com> wrote:
> 
> > On Wed, 12 Jul 2017 22:42:36 +0200
> > Hendrik Leppkes <h.leppkes at gmail.com> wrote:
> >  
> > > On Wed, Jul 12, 2017 at 8:31 PM, Louis O'Bryan
> > > <louiso-at-google.com at ffmpeg.org> wrote:  
> > > > On Wed, Jul 12, 2017 at 9:16 AM, Louis O'Bryan <louiso at google.com>  
> > wrote:  
> > > >  
> > > >> On Wed, Jul 12, 2017 at 12:50 AM, wm4 <nfxjfg at googlemail.com> wrote:
> > > >>  
> > > >>> On Tue, 11 Jul 2017 16:17:33 -0700
> > > >>> "Louis O'Bryan" <louiso-at-google.com at ffmpeg.org> wrote:
> > > >>>  
> > > >>> > If I need to write a new atom under stsd for my stream in the mov muxer
> > > >>> > <https://github.com/FFmpeg/FFmpeg/blob/master/libavformat/movenc.c>
> > > >>> > (mov_write_stsd_tag),
> > > >>> > is it appropriate to indicate that through the AVStream metadata
> > > >>> > rather than the codec_tag?
> > > >>>
> > > >>> It seemed to have lots of unrelated changes, but maybe I'm missing
> > > >>> something. If those codec tag refactors are needed, they should
> > > >>> probably be split into a separate patch.
> > > >>>
> > > >>> But it looks like most of those changes were unintended (Moritz
> > > >>> suspected that too). The tag addition itself is probably fine.
> > > >>>
> > > >>> Also, please don't top post on mailing lists.
> > > >>>  
> > > >>
> > > >> That file had unrelated changes that shouldn't have been there, please
> > > >> ignore them.
> > > >> Now that there is no codec associated with the stream, there shouldn't be
> > > >> a codec tag at all, I would assume. (Another issue I need to deal with is
> > > >> that the MOV muxer also doesn't support streams without a codec, but that
> > > >> is separate.)
> > > >>  
> > > >
> > > > My goal is to modify the MOV/MP4 muxer so that I can mux the new stream
> > > > with video and audio streams. Part of that is writing a new sample entry
> > > > box under the stsd box.
> > > > Since I no longer plan to use an encoder for the stream, I was wondering if
> > > > the AVStream::metadata
> > > > <https://www.ffmpeg.org/doxygen/3.2/structAVStream.html#a50d250a128a3da9ce3d135e84213fb82>
> > > > would be an appropriate way to recognize that stream. Other cases in the
> > > > mov_write_stsd_tag function use the codec tag.
> > > > I have the following sample of that idea here, which allows me to use the
> > > > new stream and write the sample entry box:
> > > >  
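(Illustration only, not the author's sample: a minimal sketch of the metadata-based check described above. The "camm" metadata key and the mov_write_camm_tag() helper are assumptions made for this sketch.)

    #include <libavformat/avformat.h>
    #include <libavutil/dict.h>

    /* Hypothetical helper, assumed for illustration only: would write the
     * new sample entry box under stsd for the camera motion metadata stream. */
    static int mov_write_camm_tag(AVIOContext *pb, AVStream *st)
    {
        /* ... write the sample entry box here ... */
        return 0;
    }

    /* Sketch: pick the sample entry writer from AVStream metadata instead of
     * from the codec tag. */
    static int write_stsd_child(AVIOContext *pb, AVStream *st)
    {
        AVDictionaryEntry *e = av_dict_get(st->metadata, "camm", NULL, 0);

        if (e)                                  /* stream tagged by the application */
            return mov_write_camm_tag(pb, st);  /* write the new sample entry box */

        return AVERROR(EINVAL);                 /* not a recognized stream */
    }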
> > >
> > > You can associate a codec with the stream and put the codec in the data
> > > codec range. You just wouldn't have an encoder for it.
> > > This should probably work without any special hacks, I would guess.  
> >
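(Illustration of the suggestion above: a minimal sketch that gives the stream a codec id from the data codec range without using an encoder. AV_CODEC_ID_BIN_DATA is only a stand-in for whatever dedicated id the metadata format would get.)

    #include <libavformat/avformat.h>

    /* Create a data stream carrying the metadata; no encoder is involved,
     * the application feeds packets directly. */
    static AVStream *add_metadata_stream(AVFormatContext *oc)
    {
        AVStream *st = avformat_new_stream(oc, NULL);

        if (!st)
            return NULL;
        st->codecpar->codec_type = AVMEDIA_TYPE_DATA;
        st->codecpar->codec_id   = AV_CODEC_ID_BIN_DATA;     /* stand-in data codec id */
        st->time_base            = (AVRational){ 1, 1000 };  /* e.g. millisecond timestamps */
        return st;
    }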
> > If it does, I would assume it uses ffmpeg.c's stream copy path?
> >
> > I still don't understand where the data comes from. Having a dummy
> > encoder that consumes dummy AVFrames would also require a dummy
> > decoder. Unless this is for something outside of ffmpeg.c.
> >  
> 
> I was using the C API. I am using this code for muxing
> <https://github.com/lbobryan/FFmpeg/blob/camm/doc/examples/camm_muxing.c>
> and demuxing
> <https://github.com/lbobryan/FFmpeg/blob/camm/doc/examples/camm_demuxing.c>.
> I have not tried the ffmpeg.c copy path.
> For a real use case, the data for the new stream would come from a
> camera's hardware/sensors in real time, as the camera receives it.
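(A rough sketch in the spirit of the linked camm_muxing.c example; write_sensor_sample(), payload, and pts_ms are illustrative assumptions, not code taken from that file. Each sensor reading arriving from the camera becomes one AVPacket on the data stream.)

    #include <string.h>
    #include <libavutil/mathematics.h>
    #include <libavformat/avformat.h>

    static int write_sensor_sample(AVFormatContext *oc, AVStream *st,
                                   const uint8_t *payload, int size, int64_t pts_ms)
    {
        AVPacket pkt;
        int ret = av_new_packet(&pkt, size);    /* allocates and initializes pkt */

        if (ret < 0)
            return ret;
        memcpy(pkt.data, payload, size);

        pkt.stream_index = st->index;
        /* timestamps assumed to arrive in milliseconds in this sketch */
        pkt.pts = pkt.dts = av_rescale_q(pts_ms, (AVRational){ 1, 1000 }, st->time_base);

        /* libavformat takes care of freeing the packet, even on failure */
        return av_interleaved_write_frame(oc, &pkt);
    }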

Oh I see.

For data verification and dumping, a BSF could be used, btw.
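(A rough sketch of that idea using the public BSF API; the "camm_dump" filter name is hypothetical, and such a filter validating/dumping the sensor payload would have to be written first.)

    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>

    /* Run one packet of the metadata stream through a bitstream filter. */
    static int dump_packet_via_bsf(AVStream *st, AVPacket *pkt)
    {
        const AVBitStreamFilter *f = av_bsf_get_by_name("camm_dump"); /* hypothetical */
        AVBSFContext *bsf = NULL;
        int ret;

        if (!f)
            return AVERROR_BSF_NOT_FOUND;
        if ((ret = av_bsf_alloc(f, &bsf)) < 0)
            return ret;

        ret = avcodec_parameters_copy(bsf->par_in, st->codecpar);
        if (ret >= 0)
            ret = av_bsf_init(bsf);
        if (ret >= 0)
            ret = av_bsf_send_packet(bsf, pkt);    /* filter takes the packet reference */

        while (ret >= 0) {
            ret = av_bsf_receive_packet(bsf, pkt); /* AVERROR(EAGAIN) once drained */
            if (ret >= 0)
                av_packet_unref(pkt);              /* filtered output is not kept here */
        }
        if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
            ret = 0;

        av_bsf_free(&bsf);
        return ret;
    }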

