Front Camera recording is MUTE


Question


I am working on a camera recording app. I want to record video with both the front and the back camera. With the back camera the recording works fine, but with the front camera the final video is mute (no audio).

CODE:

- (id)initWithPreviewView:(UIView *)previewView {

    self = [super init];

    if (self) {

        NSError *error;

        self.captureSession = [[AVCaptureSession alloc] init];
        self.captureSession.sessionPreset = AVCaptureSessionPresetHigh;

        //AVCaptureSessionPresetHigh AVCaptureSessionPresetPhoto

        AVCaptureDevice *videoDevice;
//        if (isNeededToSave) {
//            // front camera
//            videoDevice = [self frontCamera];
//        } else {
            // back camera (default)
            videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
//        }

        AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

        if (error) {
            NSLog(@"Video input creation failed");
            return nil;
        }

        if (![self.captureSession canAddInput:videoIn]) {
            NSLog(@"Video input add-to-session failed");
            return nil;
        }
        [self.captureSession addInput:videoIn];

        /* Take photo */
        self.isUsingFrontFacingCamera = 0;
        // Make a still image output
        stillImageOutput = [AVCaptureStillImageOutput new];
        [stillImageOutput addObserver:self forKeyPath:@"capturingStillImage" options:NSKeyValueObservingOptionNew context:(__bridge void *)(AVCaptureStillImageIsCapturingStillImageContext)];
        if ( [self.captureSession canAddOutput:stillImageOutput] )
            [self.captureSession addOutput:stillImageOutput];

        // Make a video data output
        videoDataOutput = [AVCaptureVideoDataOutput new];

        // we want BGRA, both CoreGraphics and OpenGL work well with 'BGRA'
        NSDictionary *rgbOutputSettings = [NSDictionary dictionaryWithObject:
                                           [NSNumber numberWithInt:kCMPixelFormat_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        [videoDataOutput setVideoSettings:rgbOutputSettings];
        [videoDataOutput setAlwaysDiscardsLateVideoFrames:YES]; // discard if the data output queue is blocked (as we process the still image)

        // create a serial dispatch queue used for the sample buffer delegate as well as when a still image is captured
        // a serial dispatch queue must be used to guarantee that video frames will be delivered in order
        // see the header doc for setSampleBufferDelegate:queue: for more information
        videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
        [videoDataOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];

        if ( [self.captureSession canAddOutput:videoDataOutput] )
            [self.captureSession addOutput:videoDataOutput];
        [[videoDataOutput connectionWithMediaType:AVMediaTypeVideo] setEnabled:NO];
        /* End take photo */
        // save the default format
        self.defaultFormat = videoDevice.activeFormat;
        defaultVideoMaxFrameDuration = videoDevice.activeVideoMaxFrameDuration;


        // add the microphone input so that movie recordings include audio
        AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];

        AVCaptureDeviceInput *audioIn = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
        if (audioIn && [self.captureSession canAddInput:audioIn]) {
            [self.captureSession addInput:audioIn];
        }

        self.fileOutput = [[AVCaptureMovieFileOutput alloc] init];
        [self.captureSession addOutput:self.fileOutput];


        self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
        self.previewLayer.frame = previewView.bounds;
        self.previewLayer.contentsGravity = kCAGravityResizeAspectFill;
        self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        [previewView.layer insertSublayer:self.previewLayer atIndex:0];
        [self.captureSession startRunning];
    }
    return self;
}
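
The commented-out branch above calls a frontCamera helper that is not shown in the question. A minimal sketch of what such a helper might look like (assuming it simply returns the first front-facing video device, or nil if none is available) is:

// Hypothetical helper (not part of the original question): returns the
// front-facing camera if one is available, otherwise nil.
- (AVCaptureDevice *)frontCamera {
    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if ([device position] == AVCaptureDevicePositionFront) {
            return device;
        }
    }
    return nil;
}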

- (void)switchCameras
{
    AVCaptureDevicePosition desiredPosition;

    NSInteger isFront = [[NSUserDefaults standardUserDefaults] integerForKey:@"isUsingFrontFacingCamera"];

    if (isFront)
        desiredPosition = AVCaptureDevicePositionBack;
    else
        desiredPosition = AVCaptureDevicePositionFront;

    for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if ([d position] == desiredPosition) {
            [[self.previewLayer session] beginConfiguration];
            AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:d error:nil];
            // remove every existing input (video and audio) from the session
            for (AVCaptureInput *oldInput in [[self.previewLayer session] inputs]) {
                [[self.previewLayer session] removeInput:oldInput];
            }
            // add back only the new video input
            [[self.previewLayer session] addInput:input];
            [[self.previewLayer session] commitConfiguration];
            break;
        }
    }

    // toggle the stored front/back flag
    [[NSUserDefaults standardUserDefaults] setInteger:(isFront ? 0 : 1) forKey:@"isUsingFrontFacingCamera"];
    [[NSUserDefaults standardUserDefaults] synchronize];
}
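
Note that the loop in switchCameras removes every input from the session (audio included) and then adds back only the new video input. For comparison, here is a minimal sketch of a switch that replaces just the video input and leaves the audio input attached; the method name switchToCameraAtPosition: and the use of self.captureSession are illustrative assumptions, not code from the question:

// Sketch only: swap just the video device input, leaving any audio input in place.
- (void)switchToCameraAtPosition:(AVCaptureDevicePosition)position {
    AVCaptureDevice *newDevice = nil;
    for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if ([d position] == position) {
            newDevice = d;
            break;
        }
    }
    if (!newDevice) return;

    NSError *error = nil;
    AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:newDevice error:&error];
    if (!newInput) return;

    [self.captureSession beginConfiguration];
    // remove only the existing video inputs; the audio input stays attached
    for (AVCaptureInput *oldInput in [self.captureSession inputs]) {
        if ([oldInput isKindOfClass:[AVCaptureDeviceInput class]] &&
            [((AVCaptureDeviceInput *)oldInput).device hasMediaType:AVMediaTypeVideo]) {
            [self.captureSession removeInput:oldInput];
        }
    }
    if ([self.captureSession canAddInput:newInput]) {
        [self.captureSession addInput:newInput];
    }
    [self.captureSession commitConfiguration];
}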

Source: https://stackoverflow.com/questions/31457302/front-camera-recording-is-mute
