Run multiple AVCaptureSessions or add multiple inputs


Question


I want to display the streams from the front- and back-facing cameras of an iPad 2 in two UIViews next to each other. To stream the image from one device I use the following code:

AVCaptureDeviceInput *captureInputFront = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:nil];

AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session addInput:captureInputFront];
[session setSessionPreset:AVCaptureSessionPresetMedium];
[session startRunning];

AVCaptureVideoPreviewLayer *prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
prevLayer.frame = self.view.frame;
[self.view.layer addSublayer:prevLayer];

which works fine for either camera. To display the streams in parallel I tried to create another session, but as soon as the second session starts running, the first one freezes.

Then I tried to add two AVCaptureDeviceInputs to the same session, but it seems that at most one input is supported at a time.
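For illustration (not part of the original question), this is roughly what that attempt looks like; on hardware of this era the second -canAddInput: check is the one that fails:

NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureSession *probeSession = [[AVCaptureSession alloc] init];

for (AVCaptureDevice *camera in cameras) {
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    // The first camera is accepted; the second one is rejected by -canAddInput:.
    if (input && [probeSession canAddInput:input]) {
        [probeSession addInput:input];
    } else {
        NSLog(@"Could not add %@: %@", [camera localizedName], error);
    }
}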

Any helpful ideas on how to stream from both cameras?


Answer 1:


It is possible to get CMSampleBufferRefs from multiple video devices on Mac OS X. You have to set up the AVCaptureConnection objects manually. For example, assuming you have these objects:

AVCaptureSession *session;
AVCaptureInput *videoInput1;
AVCaptureInput *videoInput2;
AVCaptureVideoDataOutput *videoOutput1;
AVCaptureVideoDataOutput *videoOutput2;
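How those objects get created is not shown in the answer; a minimal sketch, assuming two attached video devices enumerated with +devicesWithMediaType: and ignoring error handling:

NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
// Assumes at least two video devices are attached to the Mac.
AVCaptureDevice *device1 = [devices objectAtIndex:0];
AVCaptureDevice *device2 = [devices objectAtIndex:1];

session      = [[AVCaptureSession alloc] init];
videoInput1  = [AVCaptureDeviceInput deviceInputWithDevice:device1 error:nil];
videoInput2  = [AVCaptureDeviceInput deviceInputWithDevice:device2 error:nil];
videoOutput1 = [[AVCaptureVideoDataOutput alloc] init];
videoOutput2 = [[AVCaptureVideoDataOutput alloc] init];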

Do NOT add the outputs like this:

[session addOutput:videoOutput1];
[session addOutput:videoOutput2];

Instead, add them and tell the session not to make any connections:

[session addOutputWithNoConnections:videoOutput1];
[session addOutputWithNoConnections:videoOutput2];
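The inputs also have to be attached to the session before connections can be made. The original answer doesn't show that step; a sketch, assuming -addInputWithNoConnections: so that the session doesn't form any implicit connections for them either:

if ([session canAddInput:videoInput1]) {
    [session addInputWithNoConnections:videoInput1];
}
if ([session canAddInput:videoInput2]) {
    [session addInputWithNoConnections:videoInput2];
}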

Then for each input/output pair make the connection from the input's video port to the output manually:

for (AVCaptureInputPort *port in [videoInput1 ports]) {
    if ([[port mediaType] isEqualToString:AVMediaTypeVideo]) {
        AVCaptureConnection* cxn = [AVCaptureConnection
            connectionWithInputPorts:[NSArray arrayWithObject:port]
            output:videoOutput1
        ];
        if ([session canAddConnection:cxn]) {
            [session addConnection:cxn];
        }
        break;
    }
}
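The same loop is needed for the second pair (videoInput2 and videoOutput2). As a purely optional refactoring, not part of the original answer, a small helper avoids duplicating it:

// Connect the first video port of an input to the given output.
static void ConnectVideoInputToOutput(AVCaptureSession *session,
                                      AVCaptureInput *input,
                                      AVCaptureOutput *output)
{
    for (AVCaptureInputPort *port in [input ports]) {
        if ([[port mediaType] isEqualToString:AVMediaTypeVideo]) {
            AVCaptureConnection *cxn = [AVCaptureConnection
                connectionWithInputPorts:[NSArray arrayWithObject:port]
                output:output];
            if ([session canAddConnection:cxn]) {
                [session addConnection:cxn];
            }
            break;
        }
    }
}

// Usage:
ConnectVideoInputToOutput(session, videoInput1, videoOutput1);
ConnectVideoInputToOutput(session, videoInput2, videoOutput2);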

Finally, make sure to set sample buffer delegates for both outputs:

[videoOutput1 setSampleBufferDelegate:self queue:someDispatchQueue];
[videoOutput2 setSampleBufferDelegate:self queue:someDispatchQueue];
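Here someDispatchQueue is assumed to be a serial queue you create yourself; giving each output its own queue is one way to keep the two streams from blocking each other, e.g.:

dispatch_queue_t queue1 = dispatch_queue_create("camera1.frames", DISPATCH_QUEUE_SERIAL);
dispatch_queue_t queue2 = dispatch_queue_create("camera2.frames", DISPATCH_QUEUE_SERIAL);

[videoOutput1 setSampleBufferDelegate:self queue:queue1];
[videoOutput2 setSampleBufferDelegate:self queue:queue2];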

and now you should be able to process frames from both devices:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (captureOutput == videoOutput1)
    {
        // handle frames from first device
    }
    else if (captureOutput == videoOutput2)
    {
        // handle frames from second device
    }
}
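Inside the callback the pixel data comes out of the sample buffer in the usual way; a minimal sketch of the first step (the actual processing is up to you):

CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
if (pixelBuffer != NULL) {
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    // ... read or copy the pixels here ...
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    NSLog(@"Received a %zux%zu frame", width, height);
}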

See also the AVVideoWall sample project for an example of combining live previews from multiple video devices.
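The same no-connection trick works for live previews, which is essentially what AVVideoWall does: create each AVCaptureVideoPreviewLayer without a connection and wire it to the input port of the camera it should show. A sketch, assuming videoPort1 is the video AVCaptureInputPort found in the loop above:

AVCaptureVideoPreviewLayer *previewLayer1 = [[AVCaptureVideoPreviewLayer alloc] initWithSessionWithNoConnection:session];
AVCaptureConnection *previewConnection1 = [AVCaptureConnection connectionWithInputPort:videoPort1 videoPreviewLayer:previewLayer1];
if ([session canAddConnection:previewConnection1]) {
    [session addConnection:previewConnection1];
}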



Source: https://stackoverflow.com/questions/11071202/run-multiple-avcapturesessions-or-add-multiple-inputs
