How to save a frame using QMediaPlayer?

只愿长相守 submitted on 2019-12-29 09:22:05

Question


I want to save an image of a frame from a QMediaPlayer. After reading the documentation, I understood that I should use QVideoProbe. I am using the following code:

QMediaPlayer *player = new QMediaPlayer();
QVideoProbe *probe   = new QVideoProbe;

connect(probe, SIGNAL(videoFrameProbed(QVideoFrame)), this, SLOT(processFrame(QVideoFrame)));

qDebug() << probe->setSource(player); // Returns true, hopefully.

player->setVideoOutput(myVideoSurface);
player->setMedia(QUrl::fromLocalFile("observation.mp4"));
player->play(); // Start receiving frames as they get presented to myVideoSurface

But unfortunately, probe->setSource(player) always returns false for me, and thus my slot processFrame is not triggered.

What am I doing wrong? Does anybody have a working example of QVideoProbe?


Answer 1:


You're not doing anything wrong. As @DYangu pointed out, your media object instance does not support monitoring video. I had the same problem (and the same for QAudioProbe, but that is not relevant here). I found a solution by looking at this answer and this one.

The main idea is to subclass QAbstractVideoSurface. Once you've done that, Qt will call the QAbstractVideoSurface::present(const QVideoFrame &frame) method of your implementation, and you will be able to process the frames of your video.

As it is said here, you usually only need to reimplement two methods (a minimal sketch follows this list):

  1. supportedPixelFormats, so that the producer can select an appropriate format for the QVideoFrame
  2. present, which allows the frame to be displayed (or, in our case, grabbed)
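
For reference, here is a minimal sketch of such a subclass (the class name is just an illustration, not part of the full code below). It only reimplements these two methods and prints the size of every frame it receives:

#include <QAbstractVideoSurface>
#include <QVideoFrame>
#include <QDebug>

// Minimal sketch: only the two required overrides, no painting and no signals.
class MinimalSurface : public QAbstractVideoSurface
{
public:
    QList<QVideoFrame::PixelFormat> supportedPixelFormats(
            QAbstractVideoBuffer::HandleType type = QAbstractVideoBuffer::NoHandle) const override
    {
        Q_UNUSED(type);
        // Accept a single common RGB format; the full implementation below accepts many more.
        return QList<QVideoFrame::PixelFormat>() << QVideoFrame::Format_RGB32;
    }

    bool present(const QVideoFrame &frame) override
    {
        // Every decoded frame arrives here; the full implementation converts it to a QImage.
        qDebug() << "Frame received, size:" << frame.size();
        return frame.isValid();
    }
};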

At the time, I searched the Qt source code and happily found this piece of code, which helped me to produce a full implementation. So here is the full code for a "video frame grabber".

VideoFrameGrabber.cpp:

#include "VideoFrameGrabber.h"

#include <QtWidgets>
#include <qabstractvideosurface.h>
#include <qvideosurfaceformat.h>

VideoFrameGrabber::VideoFrameGrabber(QWidget *widget, QObject *parent)
    : QAbstractVideoSurface(parent)
    , widget(widget)
    , imageFormat(QImage::Format_Invalid)
{
}

QList<QVideoFrame::PixelFormat> VideoFrameGrabber::supportedPixelFormats(QAbstractVideoBuffer::HandleType handleType) const
{
    Q_UNUSED(handleType);
    return QList<QVideoFrame::PixelFormat>()
        << QVideoFrame::Format_ARGB32
        << QVideoFrame::Format_ARGB32_Premultiplied
        << QVideoFrame::Format_RGB32
        << QVideoFrame::Format_RGB24
        << QVideoFrame::Format_RGB565
        << QVideoFrame::Format_RGB555
        << QVideoFrame::Format_ARGB8565_Premultiplied
        << QVideoFrame::Format_BGRA32
        << QVideoFrame::Format_BGRA32_Premultiplied
        << QVideoFrame::Format_BGR32
        << QVideoFrame::Format_BGR24
        << QVideoFrame::Format_BGR565
        << QVideoFrame::Format_BGR555
        << QVideoFrame::Format_BGRA5658_Premultiplied
        << QVideoFrame::Format_AYUV444
        << QVideoFrame::Format_AYUV444_Premultiplied
        << QVideoFrame::Format_YUV444
        << QVideoFrame::Format_YUV420P
        << QVideoFrame::Format_YV12
        << QVideoFrame::Format_UYVY
        << QVideoFrame::Format_YUYV
        << QVideoFrame::Format_NV12
        << QVideoFrame::Format_NV21
        << QVideoFrame::Format_IMC1
        << QVideoFrame::Format_IMC2
        << QVideoFrame::Format_IMC3
        << QVideoFrame::Format_IMC4
        << QVideoFrame::Format_Y8
        << QVideoFrame::Format_Y16
        << QVideoFrame::Format_Jpeg
        << QVideoFrame::Format_CameraRaw
        << QVideoFrame::Format_AdobeDng;
}

bool VideoFrameGrabber::isFormatSupported(const QVideoSurfaceFormat &format) const
{
    const QImage::Format imageFormat = QVideoFrame::imageFormatFromPixelFormat(format.pixelFormat());
    const QSize size = format.frameSize();

    return imageFormat != QImage::Format_Invalid
            && !size.isEmpty()
            && format.handleType() == QAbstractVideoBuffer::NoHandle;
}

bool VideoFrameGrabber::start(const QVideoSurfaceFormat &format)
{
    const QImage::Format imageFormat = QVideoFrame::imageFormatFromPixelFormat(format.pixelFormat());
    const QSize size = format.frameSize();

    if (imageFormat != QImage::Format_Invalid && !size.isEmpty()) {
        this->imageFormat = imageFormat;
        imageSize = size;
        sourceRect = format.viewport();

        QAbstractVideoSurface::start(format);

        widget->updateGeometry();
        updateVideoRect();

        return true;
    } else {
        return false;
    }
}

void VideoFrameGrabber::stop()
{
    currentFrame = QVideoFrame();
    targetRect = QRect();

    QAbstractVideoSurface::stop();

    widget->update();
}

bool VideoFrameGrabber::present(const QVideoFrame &frame)
{
    if (frame.isValid()) 
    {
        QVideoFrame cloneFrame(frame);
        cloneFrame.map(QAbstractVideoBuffer::ReadOnly);
        const QImage image(cloneFrame.bits(),
                           cloneFrame.width(),
                           cloneFrame.height(),
                           cloneFrame.bytesPerLine(),
                           QVideoFrame::imageFormatFromPixelFormat(cloneFrame.pixelFormat()));
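        // Note: this QImage still points into the mapped frame data, so the
        // connected slot must use (or copy) the image before cloneFrame.unmap() below.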
        emit frameAvailable(image); // this is very important
        cloneFrame.unmap();
    }

    if (surfaceFormat().pixelFormat() != frame.pixelFormat()
            || surfaceFormat().frameSize() != frame.size()) {
        setError(IncorrectFormatError);
        stop();

        return false;
    } else {
        currentFrame = frame;

        widget->repaint(targetRect);

        return true;
    }
}

void VideoFrameGrabber::updateVideoRect()
{
    QSize size = surfaceFormat().sizeHint();
    size.scale(widget->size().boundedTo(size), Qt::KeepAspectRatio);

    targetRect = QRect(QPoint(0, 0), size);
    targetRect.moveCenter(widget->rect().center());
}

void VideoFrameGrabber::paint(QPainter *painter)
{
    if (currentFrame.map(QAbstractVideoBuffer::ReadOnly)) {
        const QTransform oldTransform = painter->transform();

        if (surfaceFormat().scanLineDirection() == QVideoSurfaceFormat::BottomToTop) {
           painter->scale(1, -1);
           painter->translate(0, -widget->height());
        }

        QImage image(
                currentFrame.bits(),
                currentFrame.width(),
                currentFrame.height(),
                currentFrame.bytesPerLine(),
                imageFormat);

        painter->drawImage(targetRect, image, sourceRect);

        painter->setTransform(oldTransform);

        currentFrame.unmap();
    }
}

VideoFrameGrabber.h:

#ifndef VIDEOFRAMEGRABBER_H
#define VIDEOFRAMEGRABBER_H

#include <QtWidgets>
#include <QAbstractVideoSurface>
#include <QVideoSurfaceFormat>

class VideoFrameGrabber : public QAbstractVideoSurface
{
    Q_OBJECT

public:
    VideoFrameGrabber(QWidget *widget, QObject *parent = 0);

    QList<QVideoFrame::PixelFormat> supportedPixelFormats(
            QAbstractVideoBuffer::HandleType handleType = QAbstractVideoBuffer::NoHandle) const;
    bool isFormatSupported(const QVideoSurfaceFormat &format) const;

    bool start(const QVideoSurfaceFormat &format);
    void stop();

    bool present(const QVideoFrame &frame);

    QRect videoRect() const { return targetRect; }
    void updateVideoRect();

    void paint(QPainter *painter);

private:
    QWidget *widget;
    QImage::Format imageFormat;
    QRect targetRect;
    QSize imageSize;
    QRect sourceRect;
    QVideoFrame currentFrame;

signals:
    void frameAvailable(QImage frame);
};
#endif //VIDEOFRAMEGRABBER_H

Note: in the .h, you will see that I added a signal taking an image as a parameter. This allows you to process your frame anywhere in your code. In my case this signal takes a QImage as a parameter, but you can of course pass a QVideoFrame instead if you want to.


Now, we are ready to use this video frame grabber:

QMediaPlayer* player = new QMediaPlayer(this);
// no more QVideoProbe 
VideoFrameGrabber* grabber = new VideoFrameGrabber(this);
player->setVideoOutput(grabber);

connect(grabber, SIGNAL(frameAvailable(QImage)), this, SLOT(processFrame(QImage)));

Now you just have to declare a slot processFrame(QImage image), and it will receive a QImage each time the present method of your VideoFrameGrabber is called.
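
For example, a slot like the following would save every grabbed frame to disk, which is what the question asked for (MyClass and the file-name scheme are only an illustration):

// Example slot: save each grabbed frame as a PNG file.
void MyClass::processFrame(QImage image)
{
    static int frameNumber = 0;
    image.save(QStringLiteral("frame_%1.png").arg(frameNumber++));
}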

I hope that this will help you!




Answer 2:


From the Qt QVideoProbe documentation:

bool QVideoProbe::setSource(QMediaObject *mediaObject)

Starts monitoring the given mediaObject.

If there is no media object associated with mediaObject, or if it is zero, this probe will be deactivated and this function will return true.

If the media object instance does not support monitoring video, this function will return false.

Any previously monitored objects will no longer be monitored. Passing in the same object will be ignored, but monitoring will continue.

So it seems that your "media object instance does not support monitoring video".
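
In practice, you can check the return value and fall back to the QAbstractVideoSurface approach from the first answer when the backend does not support probing, for example (a sketch, assuming the same player and a processFrame(QVideoFrame) slot as in the question):

QVideoProbe *probe = new QVideoProbe(this);
if (probe->setSource(player)) {
    // The backend supports monitoring: frames will be delivered to the slot.
    connect(probe, SIGNAL(videoFrameProbed(QVideoFrame)),
            this, SLOT(processFrame(QVideoFrame)));
} else {
    // The backend does not support monitoring video;
    // fall back to the QAbstractVideoSurface approach from the first answer.
}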



Source: https://stackoverflow.com/questions/37724602/how-to-save-a-frame-using-qmediaplayer
