Android Camera Capture using FFmpeg

Submitted by 情到浓时终转凉″ on 2019-12-17 22:53:27

Question


I'm trying to take the preview frames generated by the Android camera and pass the data[] to ffmpeg's input pipe to generate an FLV video. The command I used was:

ffmpeg -f image2pipe -i pipe: -f flv -vcodec libx264 out.flv

I've also tried forcing the input format to yuv4mpegpipe and rawvideo, but with no success. The default format of the preview frames generated by the Android camera is NV21. I invoke ffmpeg through the Process API and write the preview frame data[] to the process's stdin. The onPreviewFrame() definition is as follows:

public void onPreviewFrame(byte[] data, Camera camera)
{
    try
    {
        // Stream the raw NV21 frame straight into ffmpeg's stdin.
        processIn.write(data);
    }
    catch(Exception e)
    {
        Log.e(TAG, FUNCTION + " : " + e.getMessage());
    }
    // Hand a buffer back so the camera can deliver the next frame.
    camera.addCallbackBuffer(new byte[bufferSize]);
}
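The processIn stream used above would come from the Process API mentioned in the question. A minimal sketch of that wiring, assuming a plain `ProcessBuilder` and that an ffmpeg binary is available on the device's PATH (the helper name `openEncoderPipe` is illustrative, not from the original post):

```java
import java.io.IOException;
import java.io.OutputStream;

public class EncoderPipe {
    // Start an external encoder process and return its stdin; frames
    // written to this stream feed ffmpeg's "pipe:" (stdin) input.
    static OutputStream openEncoderPipe(String... command) throws IOException {
        ProcessBuilder pb = new ProcessBuilder(command);
        pb.redirectErrorStream(true); // merge ffmpeg's log output into stdout
        return pb.start().getOutputStream();
    }

    public static void main(String[] args) {
        try {
            // The ffmpeg binary's availability is an assumption here.
            OutputStream processIn = openEncoderPipe(
                    "ffmpeg", "-f", "image2pipe", "-i", "pipe:",
                    "-f", "flv", "-vcodec", "libx264", "out.flv");
            processIn.close();
        } catch (IOException e) {
            System.out.println("could not start ffmpeg: " + e.getMessage());
        }
    }
}
```

The returned `OutputStream` is what `onPreviewFrame()` writes each frame into.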

processIn is connected to the ffmpeg process's stdin, and bufferSize is computed based on the documentation for addCallbackBuffer(). Is there something I'm doing wrong?
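For reference, the addCallbackBuffer() documentation requires a buffer of width × height × bits-per-pixel / 8 for the preview format; NV21 uses 12 bits per pixel (on Android this value comes from ImageFormat.getBitsPerPixel(ImageFormat.NV21)). A sketch of the computation, with placeholder dimensions:

```java
public class Nv21Buffer {
    // NV21 stores a full-resolution Y plane (8 bits/pixel) plus a
    // half-resolution interleaved VU plane (4 bits/pixel) = 12 bits/pixel,
    // matching ImageFormat.getBitsPerPixel(ImageFormat.NV21) on Android.
    static final int NV21_BITS_PER_PIXEL = 12;

    static int nv21BufferSize(int width, int height) {
        return width * height * NV21_BITS_PER_PIXEL / 8;
    }

    public static void main(String[] args) {
        // A 640x480 preview frame needs 460800 bytes.
        System.out.println(nv21BufferSize(640, 480));
    }
}
```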

Thanks...


Answer 1:


I got it working. The mistake was related to the vcodec of the image stream: ffmpeg has no provision for decoding NV21-format images or image streams. I therefore had to convert each NV21 preview frame to JPEG, and since the images had to be streamed to the ffmpeg process in real time, the conversion had to happen on the fly. The closest reliable solution for on-the-fly JPEG conversion was as follows:

public void onPreviewFrame(byte[] data, Camera camera)
{
    if(isFirstFrame)
    {
        // Cache the preview geometry once; it does not change between frames.
        Camera.Parameters cameraParam = camera.getParameters();
        Camera.Size previewSize = cameraParam.getPreviewSize();
        previewFormat = cameraParam.getPreviewFormat();
        frameWidth = previewSize.width;
        frameHeight = previewSize.height;
        frameRect = new Rect(0, 0, frameWidth, frameHeight);
        isFirstFrame = false;
    }

    // Wrap the NV21 frame and compress it as JPEG straight into ffmpeg's stdin.
    previewImage = new YuvImage(data, previewFormat, frameWidth, frameHeight, null);

    if(previewImage.compressToJpeg(frameRect, 50, processIn))
        Log.d(TAG, "Data : " + data.length);

    previewImage = null;

    // Hand a buffer back so the camera can deliver the next frame.
    camera.addCallbackBuffer(new byte[bufferSize]);
}

And the ffmpeg command used was:

ffmpeg -f image2pipe -vcodec mjpeg -i - -f flv -vcodec libx264 out.flv


Source: https://stackoverflow.com/questions/15280722/android-camera-capture-using-ffmpeg
