How to create video thumbnails with Python and Gstreamer

Submitted by 核能气质少年 on 2019-11-29 09:56:01

To elaborate on ensonic's answer, here's an example using Python 2 and the GStreamer 0.10 bindings:

import os
import sys

import gst

def get_frame(path, offset=5, caps=gst.Caps('image/png')):
    pipeline = gst.parse_launch('playbin2')
    pipeline.props.uri = 'file://' + os.path.abspath(path)
    pipeline.props.audio_sink = gst.element_factory_make('fakesink')
    pipeline.props.video_sink = gst.element_factory_make('fakesink')
    pipeline.set_state(gst.STATE_PAUSED)
    # Wait for state change to finish.
    pipeline.get_state()
    assert pipeline.seek_simple(
        gst.FORMAT_TIME, gst.SEEK_FLAG_FLUSH, offset * gst.SECOND)
    # Wait for seek to finish.
    pipeline.get_state()
    buffer = pipeline.emit('convert-frame', caps)
    pipeline.set_state(gst.STATE_NULL)
    return buffer

def main():
    buf = get_frame(sys.argv[1])

    # Python 2: str(buf) yields the buffer's raw bytes; write in binary mode.
    with open('frame.png', 'wb') as fh:
        fh.write(str(buf))

if __name__ == '__main__':
    main()

This generates a PNG image. To get raw image data instead, pass caps such as gst.Caps('video/x-raw-rgb,bpp=24,depth=24').
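As a rough sanity check on those raw caps: a packed 24-bpp RGB frame occupies width × height × 3 bytes (real buffers may add per-row stride padding). A minimal sketch, using a hypothetical helper name:

```python
def raw_rgb_size(width, height, bpp=24):
    """Expected byte size of one packed raw RGB frame,
    ignoring any per-row stride/padding the decoder may add."""
    return width * height * (bpp // 8)

# A 320x240 frame at 24 bits per pixel:
print(raw_rgb_size(320, 240))  # 230400
```

Comparing this figure against the size of the buffer you get back is a quick way to confirm the caps negotiated the format you expected.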

Note that in GStreamer 1.0 (as opposed to 0.10), playbin2 has been renamed to playbin and the convert-frame signal is named convert-sample.

The mechanics of seeking are explained in this chapter of the GStreamer Application Development Manual. The 0.10 playbin2 documentation no longer seems to be online, but the documentation for 1.0 is here.
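Seek positions are expressed in nanoseconds; gst.SECOND (Gst.SECOND in 1.0) is simply 10⁹. A small sketch, with a hypothetical helper name, for turning an offset in seconds into the value seek_simple expects:

```python
GST_SECOND = 10 ** 9  # the value of gst.SECOND / Gst.SECOND (nanoseconds)

def to_gst_time(seconds):
    """Convert an offset in seconds (int or float) to GStreamer nanoseconds."""
    return int(seconds * GST_SECOND)

print(to_gst_time(5))    # 5000000000
print(to_gst_time(2.5))  # 2500000000
```

So `offset * gst.SECOND` in the example above is exactly this conversion, and fractional offsets work too.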

An example in Vala, with GStreamer 1.0:

var playbin = Gst.ElementFactory.make ("playbin", null);
playbin.set ("uri", "file:///path/to/file");
// some code here.
var caps = Gst.Caps.from_string("image/png");
Gst.Sample sample;
Signal.emit_by_name(playbin, "convert-sample", caps, out sample);
if(sample == null)
    return;
var sample_caps = sample.get_caps ();
if(sample_caps == null)
    return;
unowned Gst.Structure structure = sample_caps.get_structure(0);
int width = (int)structure.get_value ("width");
int height = (int)structure.get_value ("height");
var memory = sample.get_buffer().get_memory (0);
Gst.MapInfo info;
memory.map (out info, Gst.MapFlags.READ);
uint8[] data = info.data;

It's an old question, but I still haven't found this documented anywhere.
I found that the following works on a playing video with GStreamer 1.0:

import gi
import time
gi.require_version('Gst', '1.0')
from gi.repository import Gst

def get_frame():
    caps = Gst.Caps('image/png')
    pipeline = Gst.ElementFactory.make("playbin", "playbin")
    pipeline.set_property('uri', 'file:///home/rolf/GWPE.mp4')
    pipeline.set_state(Gst.State.PLAYING)
    # Allow time for the pipeline to start.
    time.sleep(0.5)
    # Jump 30 seconds in.
    seek_time = 30 * Gst.SECOND
    pipeline.seek(1.0, Gst.Format.TIME,
                  Gst.SeekFlags.FLUSH | Gst.SeekFlags.ACCURATE,
                  Gst.SeekType.SET, seek_time,
                  Gst.SeekType.NONE, -1)

    # Allow the video to run to prove it's working, then take the snapshot.
    time.sleep(1)
    sample = pipeline.emit('convert-sample', caps)
    pipeline.set_state(Gst.State.NULL)
    if sample is None:
        return None
    buf = sample.get_buffer()
    result, mapinfo = buf.map(Gst.MapFlags.READ)
    if not result:
        return None
    # Copy the data out before unmapping the buffer.
    data = bytes(mapinfo.data)
    buf.unmap(mapinfo)
    return data

if __name__ == '__main__':
    Gst.init(None)
    image = get_frame()
    with open('frame.png', 'wb') as snapshot:
        snapshot.write(image)

The code should run under both Python 2 and Python 3; I hope it helps someone.

Use playbin2: set the uri to the media file, use gst_element_seek_simple to seek to the desired time position, and then use g_signal_emit to invoke the "convert-frame" action signal.
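If you only need the first frame and don't care about seeking, a gst-launch-1.0 one-liner can do the job without any code; pngenc's snapshot property makes it encode a single frame and then end the stream (the path here is a placeholder):

```shell
gst-launch-1.0 uridecodebin uri=file:///path/to/file ! videoconvert \
    ! pngenc snapshot=true ! filesink location=frame.png
```

For a thumbnail at a specific timestamp you still need the programmatic seek-then-convert approach shown above.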
