Capture camera frames in background (Android)


Question


My problem is this: I want a background service, that will obtain frames from the camera in real-time, so that I can analyze them. I've seen a lot of similar topics here that supposedly address this issue, but none of them has really worked in my case.

My first attempt was to create an Activity that started a Service. Inside the Service I created a SurfaceView, got its SurfaceHolder, and implemented a callback on it in which I prepared the camera and everything else. Then, in a PreviewCallback, I could start a new thread to analyze the data I was getting from the onPreviewFrame method.
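
For context, the core of that first attempt looked roughly like this (a simplified sketch only; PreviewService and the field names are placeholders, not the actual code):

import java.io.IOException;

import android.app.Service;
import android.content.Intent;
import android.hardware.Camera;
import android.os.IBinder;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

// Sketch of the first approach: the camera preview is bound to the SurfaceView's holder,
// and frames arrive through onPreviewFrame(). In the real code the SurfaceView was
// presumably attached to a window somewhere, otherwise surfaceCreated() would never fire.
public class PreviewService extends Service implements SurfaceHolder.Callback, Camera.PreviewCallback {

    private Camera mCamera;
    private SurfaceView mSurfaceView;

    @Override
    public void onCreate() {
        super.onCreate();
        mSurfaceView = new SurfaceView(this);
        mSurfaceView.getHolder().addCallback(this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        try {
            mCamera = Camera.open();
            mCamera.setPreviewDisplay(holder);   // preview output is tied to this Surface
            mCamera.setPreviewCallback(this);
            mCamera.startPreview();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // NV21 frame data arrives here for as long as the preview Surface exists;
        // once the Surface is destroyed, these callbacks stop.
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) { }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        if (mCamera != null) {
            mCamera.setPreviewCallback(null);
            mCamera.stopPreview();
            mCamera.release();
            mCamera = null;
        }
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}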

That worked well enough while the service was in the foreground, but as soon as I opened another application (with the service still running in the background), the preview was gone, so I couldn't get frames from it anymore.

Searching the internet, I found that I could perhaps solve this with a SurfaceTexture. So I created an Activity that would start my service, like this:

public class SurfaceTextureActivity extends Activity {

    public static TextureView mTextureView;

    public static Vibrator mVibrator;
    public static GLSurfaceView mGLView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        mGLView = new GLSurfaceView(this);
        mTextureView = new TextureView(this);

        setContentView(mTextureView);

        try {
            Intent intent = new Intent(SurfaceTextureActivity.this, RecorderService.class);
            intent.putExtra(RecorderService.INTENT_VIDEO_PATH, "/folder-path/");
            startService(intent);
            Log.i("ABC", "Start Service " + this.toString() + " + " + mTextureView.toString() + " + " + getWindowManager().toString());
        }
        catch (Exception e) {
            Log.i("ABC", "Exc SurfaceTextureActivity: " + e.getMessage());
        }
    }
}

And then I made the RecorderService implement SurfaceTextureListener, so that I could open the camera and do the other preparations, and then perhaps capture the frames. My RecorderService currently looks like this:

public class RecorderService extends Service implements TextureView.SurfaceTextureListener, SurfaceTexture.OnFrameAvailableListener {

    private Camera mCamera = null;
    private TextureView mTextureView;
    private SurfaceTexture mSurfaceTexture;
    private float[] mTransformMatrix;

    private static IMotionDetection detector = null;
    public static Vibrator mVibrator;

    @Override
    public void onCreate() {
        try {

            mTextureView = SurfaceTextureActivity.mTextureView;
            mTextureView.setSurfaceTextureListener(this);

            Log.i("ABC","onCreate");

//          startForeground(START_STICKY, new Notification()); - doesn't work
//          (startForeground() takes a notification ID plus a user-visible Notification,
//          so passing START_STICKY and an empty Notification() may be why this fails)

        } catch (Exception e) {
            Log.i("ABC","onCreate exception "+e.getMessage());
            e.printStackTrace();
        }

    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) 
    {
        //How do I obtain frames?!
//      SurfaceTextureActivity.mGLView.queueEvent(new Runnable() {
//          
//          @Override
//          public void run() {
//              mSurfaceTexture.updateTexImage();
//              
//          }
//      });
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width,
            int height) {

        mSurfaceTexture = surface;
        mSurfaceTexture.setOnFrameAvailableListener(this);
        mVibrator = (Vibrator)this.getSystemService(VIBRATOR_SERVICE);

         detector = new RgbMotionDetection();

        int cameraId = 0;
        Camera.CameraInfo info = new Camera.CameraInfo();

        for (cameraId = 0; cameraId < Camera.getNumberOfCameras(); cameraId++) {
            Camera.getCameraInfo(cameraId, info);
            if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT)
                break;
        }

        mCamera = Camera.open(cameraId);
        Matrix transform = new Matrix();

        Camera.Size previewSize = mCamera.getParameters().getPreviewSize();
        int rotation = ((WindowManager)(getSystemService(Context.WINDOW_SERVICE))).getDefaultDisplay()
                .getRotation();
        Log.i("ABC", "onSurfaceTextureAvailable(): CameraOrientation(" + cameraId + ")" + info.orientation + " " + previewSize.width + "x" + previewSize.height + " Rotation=" + rotation);

        try {
            switch (rotation) {
                case Surface.ROTATION_0:
                    mCamera.setDisplayOrientation(90);
                    mTextureView.setLayoutParams(new FrameLayout.LayoutParams(
                            previewSize.height, previewSize.width, Gravity.CENTER));
                    transform.setScale(-1, 1, previewSize.height / 2, 0);
                    break;

                case Surface.ROTATION_90:
                    mCamera.setDisplayOrientation(0);
                    mTextureView.setLayoutParams(new FrameLayout.LayoutParams(
                            previewSize.width, previewSize.height, Gravity.CENTER));
                    transform.setScale(-1, 1, previewSize.width / 2, 0);
                    break;

                case Surface.ROTATION_180:
                    mCamera.setDisplayOrientation(270);
                    mTextureView.setLayoutParams(new FrameLayout.LayoutParams(
                            previewSize.height, previewSize.width, Gravity.CENTER));
                    transform.setScale(-1, 1, previewSize.height / 2, 0);
                    break;

                case Surface.ROTATION_270:
                    mCamera.setDisplayOrientation(180);
                    mTextureView.setLayoutParams(new FrameLayout.LayoutParams(
                            previewSize.width, previewSize.height, Gravity.CENTER));
                    transform.setScale(-1, 1, previewSize.width / 2, 0);
                    break;
            }

            mCamera.setPreviewTexture(mSurfaceTexture);

            Log.i("ABC", "onSurfaceTextureAvailable(): Transform: " + transform.toString());

            mCamera.startPreview();
//          mTextureView.setVisibility(0);

            mCamera.setPreviewCallback(new PreviewCallback() {

                @Override
                public void onPreviewFrame(byte[] data, Camera camera) {
                    if (data == null) return;
                    Camera.Size size = mCamera.getParameters().getPreviewSize();
                    if (size == null) return;

                    // This is where I start my thread that analyzes images
                    DetectionThread thread = new DetectionThread(data, size.width, size.height);
                    thread.start();
                }
            });
        }
        catch (Exception t) {
            Log.i("ABC", "onSurfaceTextureAvailable Exception: " + t.getMessage());
        }
    }

    // (The remaining TextureView.SurfaceTextureListener callbacks, onSurfaceTextureSizeChanged,
    // onSurfaceTextureDestroyed and onSurfaceTextureUpdated, are omitted here.)
}

However, much like in the other case, my analyzing thread is started inside onSurfaceTextureAvailable, which only fires while the texture is available, and not when I open up another application, so the frame capturing won't continue when I switch to something else.

Some posts suggest that this is possible, but I just don't know how. One idea was that I could implement SurfaceTexture.OnFrameAvailableListener and, once a new frame is available, queue a Runnable on the render thread (GLSurfaceView.queueEvent(..)) that calls SurfaceTexture.updateTexImage(). That is what I've tried (it's commented out in my code), but it doesn't work; the application crashes if I do that.
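
For reference, the pattern that suggestion describes looks roughly like this (a sketch with placeholder names, not code from the question; it assumes the SurfaceTexture is created on the GLSurfaceView's render thread, because updateTexImage() may only be called on the thread that owns the GL context the texture belongs to):

import java.io.IOException;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.util.Log;

// Sketch: the SurfaceTexture is bound to an OES texture created on the GL thread,
// so updateTexImage() is legal inside onDrawFrame().
public class CameraRenderer implements GLSurfaceView.Renderer, SurfaceTexture.OnFrameAvailableListener {

    private final GLSurfaceView mGLView;
    private SurfaceTexture mSurfaceTexture;
    private Camera mCamera;

    public CameraRenderer(GLSurfaceView glView) {
        mGLView = glView;
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        // Create an external OES texture on the render thread and attach the SurfaceTexture to it.
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);

        mSurfaceTexture = new SurfaceTexture(tex[0]);
        mSurfaceTexture.setOnFrameAvailableListener(this);

        try {
            mCamera = Camera.open();
            mCamera.setPreviewTexture(mSurfaceTexture);
            mCamera.startPreview();
        } catch (IOException e) {
            Log.e("CameraRenderer", "Camera setup failed", e);
        }
    }

    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        // Called on an arbitrary thread; just ask the render thread to draw.
        mGLView.requestRender();
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        // Runs on the GL thread, so this call is legal here.
        mSurfaceTexture.updateTexImage();
        // ... draw the external texture or read pixels back ...
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) { }
}

For onFrameAvailable() to drive rendering this way, the GLSurfaceView would typically be set to RENDERMODE_WHEN_DIRTY, and its render thread and GL context only exist while the view is attached and resumed. In the setup above the SurfaceTexture belongs to the TextureView rather than to the GLSurfaceView's GL context, which may be why the updateTexImage() call crashes.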

What else can I possibly do? I know this can work somehow, because I've seen it done in apps like SpyCameraOS (yes, I know it's open-source, and I've looked at the code, but I couldn't put together a working solution). I feel like I'm missing just a small piece somewhere, but I have no idea what I'm doing wrong. I've been at this for the past 3 days with no success.

Help would be greatly appreciated.


Answer 1:


Summarizing the comments: direct the output of the Camera to a SurfaceTexture that isn't tied to a View object. A TextureView will be destroyed when the activity is paused, freeing its SurfaceTexture, but if you create a separate SurfaceTexture (or detach the one from the TextureView) then it won't be affected by changes in Activity state. The texture can be rendered to an off-screen Surface, from which pixels can be read.
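
One minimal way to apply the first part of that suggestion is sketched below, staying with the pre-Lollipop Camera API the question uses (BackgroundCaptureService, the texture name 10 and the callback body are placeholders): the preview is directed to a SurfaceTexture that the Service creates itself and never attaches to a View, and the frames are pulled out with setPreviewCallback() rather than by rendering the texture.

import android.app.Service;
import android.content.Intent;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.os.IBinder;

// Sketch: the camera output goes to a SurfaceTexture owned by the Service, so pausing or
// destroying the Activity does not take the preview down with it.
public class BackgroundCaptureService extends Service {

    private Camera mCamera;
    private SurfaceTexture mDummyTexture;

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        try {
            // Any unused texture object name works here; nothing ever renders this texture.
            mDummyTexture = new SurfaceTexture(10);

            mCamera = Camera.open();
            mCamera.setPreviewTexture(mDummyTexture);   // not tied to any View
            mCamera.setPreviewCallback(new Camera.PreviewCallback() {
                @Override
                public void onPreviewFrame(byte[] data, Camera camera) {
                    // NV21 frames keep arriving here even while another app is in front
                    // (on the API levels this question targets); analyze them on a worker thread.
                }
            });
            mCamera.startPreview();
        } catch (Exception e) {
            e.printStackTrace();
        }
        return START_STICKY;
    }

    @Override
    public void onDestroy() {
        if (mCamera != null) {
            mCamera.setPreviewCallback(null);
            mCamera.stopPreview();
            mCamera.release();
            mCamera = null;
        }
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}

For the fuller route described above, rendering the external texture to an off-screen Surface and reading the pixels back, the Grafika examples mentioned below include the necessary EGL setup.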

Various examples can be found in Grafika.



Source: https://stackoverflow.com/questions/28222407/capture-camera-frames-in-background-android
