OpenCV command line app can't access camera under macOS Mojave


Question


I'm unable to access the iMac camera from a command line OpenCV program. (I'm compiling and running the program under CodeRunner, not Xcode.) I've read that Mojave requires NSCameraUsageDescription in Info.plist and I think I'm embedding it correctly in the binary. I added -sectcreate __TEXT __info_plist Info.plist (which I learned about here) to the compile flags and when I run otool -X -s __TEXT __info_plist videotest | xxd -r (from the same blog post) it outputs:

-?<?xml ve.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
"http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>NSCameraUsageDescription</key>
    <string>Uses camera to see vision targets</string>
    <key>NSMicrophoneUsageDescription</key>
    <string>This app requires to access your microphone in order to access the camera</string>
</dict>
</plist>

(I added NSMicrophoneUsageDescription in case it was trying to open the microphone along with the camera.)

This is the output when I run the program:

OpenCV version 4.1.0-dev
[ INFO:0] global /Users/steve/Documents/GitHub/ssteve-opencv/modules/videoio/src/videoio_registry.cpp (185) VideoBackendRegistry VIDEOIO: Enabled backends(5, sorted by priority): FFMPEG(1000); GSTREAMER(990); AVFOUNDATION(980); CV_IMAGES(970); CV_MJPEG(960)
[ INFO:0] global /Users/steve/Documents/GitHub/ssteve-opencv/modules/videoio/src/backend_plugin.cpp (248) getPluginCandidates VideoIO pluigin (GSTREAMER): glob is 'libopencv_videoio_gstreamer*.dylib', 1 location(s)
[ INFO:0] global /Users/steve/Documents/GitHub/ssteve-opencv/modules/videoio/src/backend_plugin.cpp (256) getPluginCandidates     - /usr/local/lib: 0
[ INFO:0] global /Users/steve/Documents/GitHub/ssteve-opencv/modules/videoio/src/backend_plugin.cpp (259) getPluginCandidates Found 0 plugin(s) for GSTREAMER
OpenCV: not authorized to capture video (status 0), requesting...
OpenCV: camera failed to properly initialize!
Unable to open camera

It implies it's requesting access, but I never get a dialog and no apps are listed under System Preferences > Security & Privacy > Camera.

Here's the program I'm running:

#include <iostream>

#include "opencv2/core.hpp"
#include "opencv2/imgproc.hpp"
#include "opencv2/highgui.hpp"

using namespace std;
using namespace cv;

int main(int argc, char *argv[]) {
    cout << "OpenCV version " << CV_VERSION << endl;
    VideoCapture cap;
    cap.open(0);
    if (!cap.isOpened()) {
        cerr << "Unable to open camera\n";
        return -1;
    }

    Mat frame;
    for (;;) {
        cap >> frame;
        if (frame.empty()) {
            cerr << "Got blank frame\n";
            return -1;
        }
        imshow("Live", frame);
        if (waitKey(5) >= 0)
            break;
    }

    return 0;
}

This is the compiler invocation:

xcrun clang++ -x c++ -lc++ -o "$out" -std=c++11 -I/usr/local/include/opencv4 -lopencv_core -lopencv_highgui -lopencv_imgproc -lopencv_imgcodecs -lopencv_videoio -lopencv_calib3d -lopencv_aruco -lopencv_xfeatures2d -lopencv_features2d -sectcreate __TEXT __info_plist Info.plist "${files[@]}" "${@:1}"

What piece of the puzzle am I missing?

(I know this is similar to Cannot access camera with opencv on Mac Mojave but that question never went beyond a malformed plist file.)


In response to the suggestion to make sure ffmpeg sees the device:

$ ffmpeg -hide_banner -f avfoundation -list_devices true -i ""
[AVFoundation input device @ 0x7fed77d16dc0] AVFoundation video devices:
[AVFoundation input device @ 0x7fed77d16dc0] [0] FaceTime HD Camera (Built-in)
[AVFoundation input device @ 0x7fed77d16dc0] [1] Capture screen 0
[AVFoundation input device @ 0x7fed77d16dc0] [2] Capture screen 1
[AVFoundation input device @ 0x7fed77d16dc0] [3] Capture screen 2
[AVFoundation input device @ 0x7fed77d16dc0] AVFoundation audio devices:
[AVFoundation input device @ 0x7fed77d16dc0] [0] Built-in Microphone

Answer 1:


The problem was that the C++ program, for whatever reason, wasn't requesting camera access. I took the advice of @gerwin in the comments and gave it a try with Python. Running that program from Terminal resulted in Terminal asking for camera access. Once I granted it, the C++ program was able to access the camera when run from Terminal.

As for CodeRunner, I'm not sure how to get it to run Python programs under a virtual environment, so I haven't been able to run a Python OpenCV program to make it ask for camera access. So at the moment I can't use CodeRunner to run a C++ program that accesses the camera.




Answer 2:


It's not an ideal solution, but I resolved it by installing any terminal application that requests access to the camera. After that, your OpenCV C++ program will be able to access the FaceTime HD camera.

For example, you can install ImageSnap with:

brew install imagesnap

imagesnap -w 1 shot.png

Then grant camera permission through the dialog that pops up.




Answer 3:


A couple of comments here...

The error I'm seeing when trying to run OpenCV from my macOS development environment is:

OpenCV: not authorized to capture video (status 0), requesting...
OpenCV: camera failed to properly initialize!
Error opening video stream or file
Program ended with exit code: 255

I know those words originate from the OpenCV library here. My initial thought was that this was an OpenCV issue, but with a bit more testing I think it's something else. As others have noted, it's a macOS security/permissions issue. But here's the rub.

If I go to the Apple menu (upper left corner) --> System Preferences --> Security & Privacy, I can glean a lot of info.

Click on the Camera icon.

In my case this shows two applications that require additional permissions to access the camera: Terminal and VirtualBox (not sure what happens with browsers or FaceTime?). Notably, Xcode didn't make this list.

When I click over to Microphone, I see different apps listed, INCLUDING Xcode.

How does that even work? I did a whole lot of testing, including modifying the Info.plist inside the Xcode application package (Finder --> Applications folder --> Xcode --> right-click, Show Package Contents; copy Info.plist, save the original elsewhere, modify it via Xcode, and put it back). Note: don't try this without keeping a copy of the original Info.plist. Total fail. Adding the NSCameraUsageDescription key/value was a total bust: Xcode wouldn't open at all. Reminder: DON'T lose the original Info.plist.

This whole thing is baffling. Why does Apple allow us to access the camera via terminal but not in Xcode? What's the logic there?

I sure would like to be able to step through code to understand, frame by frame, possible design issues. This just isn't fun.

So a couple of things to understand.

  1. Yes, you can run an OpenCV project on macOS WITH your camera once the program has been compiled to a Unix executable. You have to ensure permissions for Terminal are set in Security & Privacy as described above. Obviously you build the executable in your development tool (in my case Xcode), then open the executable from the project's Build/Debug folder. The app opens in the terminal window and works just fine, as noted by SSteve.

  2. If you really want to do some video/camera debugging, you do have the option to "pre-record" a video and then open that video in your development environment. At that point you can use the debugger. How do you guys do frame-by-frame analysis? This is the only way I know of that will at least partially work.

  3. (Edit update 5/22/19...) Whoa. I just realized you can attach the debugger to a running (terminal) process. You can totally do frame-by-frame debugging using the camera (as long as the program compiles to a functional executable). Now this is pretty cool, and gets me to 98% functionality. To do this, start the terminal executable, then go to Xcode --> Debug --> Attach to Process. Select the running application, add breakpoints to the source code, and debug/step along. Works well.

I start my OpenCV project with:

int main(int argc, char** argv){
    // Parse command line arguments
    CommandLineParser parser(argc,argv,keys);

    // Create a VideoCapture object & open the input file
    VideoCapture cap;
    if (parser.has("video")){
        cap.open(parser.get<String>("video"));
    }
    else
        cap.open(0);
   ...
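
(The keys string referenced above isn't shown in the answer. Below is a purely hypothetical sketch of what such a CommandLineParser specification could look like; the option names are my assumption, not the original author's.)

// Hypothetical example of the "keys" specification used by CommandLineParser above;
// not part of the original answer. An empty default value makes the video option
// optional, so cap.open(0) falls back to the live camera when no file is given.
const std::string keys =
    "{help h usage ? |      | print this help message      }"
    "{video          |      | path to a pre-recorded video }";

With something along those lines in place, running the executable with --video=somefile.mp4 lets you debug against a pre-recorded file, while running it with no arguments opens the camera.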

It's a hacky workaround, but better than nothing. (Sure wish Apple included the camera in the iOS Simulator; that would be another way to solve this, sigh.) Obviously a lot depends on where you are going with your project. Ultimately I need mine to run on an iPad: prove it out on macOS, then wrap the code in Swift, etc...

For reference, I'm using macOS Mojave 10.14.4 on a 2.7 GHz i7 MacBook.

PS. The security preferences above don't show Chrome with camera access, which seems odd. I just tested the camera at this site... in Chrome, and it asks for permission and works exactly as expected. It's not clear what's going on here.

PS2. Am I the only person to file a bug report on this issue? Link included for your convenience. Thanks.




Answer 4:


Versions: Xcode 10.3, macOS Mojave 10.14.6, OpenCV 4.1.1_2

The OpenCV project is in C++.

Add this class to your project:

Header (.h):

class CameraIssue {
public:
    CameraIssue() {}
    ~CameraIssue() {}

    bool dealWithCamera();
};

.mm file (note it's .mm, not .cpp, because we need to call into AVFoundation):

// Imports added for completeness (they are not shown in the original answer);
// the header file name "CameraIssue.h" is an assumption.
#import <AVFoundation/AVFoundation.h>
#include "CameraIssue.h"

bool CameraIssue::dealWithCamera()
{
    AVAuthorizationStatus st = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    if (st == AVAuthorizationStatusAuthorized) {
        return true;
    }

    dispatch_group_t group = dispatch_group_create();

    __block bool accessGranted = false;

    if (st != AVAuthorizationStatusAuthorized) {
        dispatch_group_enter(group);
        // This call triggers the system permission dialog on first run.
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
            accessGranted = granted;
            NSLog(@"Granted!");
            dispatch_group_leave(group);
        }];
    }

    // Wait up to 5 seconds for the user to respond to the dialog.
    dispatch_group_wait(group, dispatch_time(DISPATCH_TIME_NOW, (int64_t)(5.0 * NSEC_PER_SEC)));

    return accessGranted;
}

And before accessing VideoCapture, call this method like this:

CameraIssue _camIssue;
_camIssue.dealWithCamera(); //do whatever you need with bool return
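
For clarity, here is a minimal usage sketch (my own illustration, not from the original answer) that gates VideoCapture on the permission check. It assumes the header above is saved as CameraIssue.h and that the target links the AVFoundation framework:

// Hypothetical calling code; the file name and helper function are illustrative only.
#include <iostream>
#include "CameraIssue.h"          // assumed name of the header shown above
#include "opencv2/videoio.hpp"

int openCameraWithPermission() {
    CameraIssue camIssue;
    if (!camIssue.dealWithCamera()) {   // triggers the permission prompt if needed
        std::cerr << "Camera access was not granted\n";
        return -1;
    }
    cv::VideoCapture cap(0);            // safe to open the default camera now
    if (!cap.isOpened()) {
        std::cerr << "Unable to open camera\n";
        return -1;
    }
    // ... grab frames as usual
    return 0;
}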

You might wonder: why am I creating a C++ class while using the Objective-C++ extension (.mm)?

To create an Objective-C class, I would need to import the Foundation framework, and importing that gave me a lot of errors about duplicate symbols, because Foundation and the 3rd party libraries I'm using share lots of names. So I created a C++ class, but with the .mm extension, so I can import the AVFoundation framework and request camera access.

The dealWithCamera() method is far from perfect, but it suits my needs exactly. Feel free to extend it, optimize it, add a callback, etc.




Answer 5:


I was finally able to resolve this by following a chain of recommendations across Stack Overflow and GitHub. It was a painful bug that burned a day of trying to get my code working again, even though it worked fine before macOS Mojave.

Solution

Put the Info.plist file with the NSCameraUsageDescription field (as suggested above) in the Products/Build directory of your Target (right-click the Product in the left pane of the Xcode project and click "Show in Finder").

  • Automate the process of copying Info.plist to your build directory (following this suggestion) by adding it to the list of Copy Files under Build Phases of your Target, changing the Destination to "Products Directory" and the Subpath to "."

Outcome

  • The Target's Unix executable binary will then ask for permission to access the camera, and upon consent the binary will be added to the list of applications permitted to access the camera in System Preferences > Privacy > Camera.
    • FYI: To force clear this list, type tccutil reset Camera in Terminal
  • You might need to run the Target a couple times before you are prompted for permission / the Camera is accessed.

Issue

Instantiating the cv::VideoCapture(0) object to access the camera video stream throws the following error, even though the code ran fine on macOS versions before Mojave:

OpenCV: not authorized to capture video (status 0), requesting...
OpenCV: camera failed to properly initialize!

Cause

macOS Mojave has tightened its privacy protection, which now requires applications to explicitly prompt and seek permission from the user before accessing the camera, as explained here.

Suggestions that didn't work

The suggestions below, given in various Stack Overflow posts, did not force the built binary to prompt for permission to access the camera:

  • Adding the Info.plist to your project directory
  • Setting the path to Info.plist under Build Settings > Packaging > Info.plist File
  • Choosing it in General > Identity > Choose Info.plist File... of your Target

Suggestions that might have helped

As indicated in the closed OpenCV GitHub issue, a change was made in libopencv around April '19 that may also have enabled use of an Info.plist present in the build directory to prompt the user for camera permission. So I also upgraded my OpenCV to the latest stable 4.1.0 release using brew upgrade.

P.S. I'm running macOS Mojave 10.14.5, Xcode 10.2.1, and OpenCV 4.1.0.




Answer 6:


I found a workaround for this.

First, reset the camera permissions:

tccutil reset Camera

Next, I ran third-party software to access the camera from the terminal, by running the following:

brew install imagesnap
imagesnap -w 1 snapshot.png

I was asked if I wanted to allow Terminal to access my camera. I clicked "Yes", and now my C++ program can access the camera when run from Terminal.

Note: The pictures ZipZit showed were very similar, except I did not have Terminal listed under Camera.

But after running the third-party program, it was added to the list.




Answer 7:


We were getting this exact issue running OpenCV 4.1.1-pre. We solved it by rolling back to 4.0.1.



Source: https://stackoverflow.com/questions/56084303/opencv-command-line-app-cant-access-camera-under-macos-mojave
