iOS AVFoundation: Setting Orientation of Video

Backend · Unresolved · 5 answers · 1311 views

Asked by 感情败类 on 2020-12-01 10:13

I've been struggling with several dimensions of the problem of controlling video orientation during and after capture on an iOS device. Thanks to previous answers and docum…

5 Answers
  •  盖世英雄少女心
    2020-12-01 10:40

    Apple's documentation here states:

    Clients may now receive physically rotated CVPixelBuffers in their AVCaptureVideoDataOutput -captureOutput:didOutputSampleBuffer:fromConnection: delegate callback. In previous iOS versions, the front-facing camera would always deliver buffers in AVCaptureVideoOrientationLandscapeLeft and the back-facing camera would always deliver buffers in AVCaptureVideoOrientationLandscapeRight. All 4 AVCaptureVideoOrientations are supported, and rotation is hardware accelerated. To request buffer rotation, a client calls -setVideoOrientation: on the AVCaptureVideoDataOutput's video AVCaptureConnection. Note that physically rotating buffers does come with a performance cost, so only request rotation if it's necessary. If, for instance, you want rotated video written to a QuickTime movie file using AVAssetWriter, it is preferable to set the -transform property on the AVAssetWriterInput rather than physically rotate the buffers in AVCaptureVideoDataOutput.

    So the posted solution by Aaron Vegh that uses an AVAssetExportSession works, but is not needed. As the Apple docs say, if you'd like the orientation to be set correctly so the video plays in non-Apple players such as VLC, or on the web in Chrome, you must set the video orientation on the AVCaptureConnection for the AVCaptureVideoDataOutput. If you instead set it on the AVAssetWriterInput, you will get an incorrect orientation in players like VLC and Chrome.
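    For completeness, Apple's recommended alternative (when you don't need physically rotated buffers) is to record the rotation in the movie's track matrix via the writer input's `transform` property. A minimal sketch, assuming you already have `videoSettings` from `-[AVCaptureVideoDataOutput recommendedVideoSettingsForAssetWriterWithOutputFileType:]`:

    // Alternative per Apple's note above: leave the buffers unrotated and
    // store the orientation as metadata on the output track instead.
    AVAssetWriterInput *writerInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:videoSettings];
    // Rotate 90 degrees clockwise so landscape-right buffers present as portrait.
    writerInput.transform = CGAffineTransformMakeRotation(M_PI_2);

    Note this is cheaper at capture time, but, as discussed above, some third-party players ignore the track matrix, which is why this answer rotates the buffers on the connection instead.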

    Here is my code where I set it during setting up the capture session:

    // DECLARED AS PROPERTIES ABOVE
    @property (strong, nonatomic) AVCaptureSession *session;
    @property (strong, nonatomic) AVCaptureDeviceInput *audioIn;
    @property (strong, nonatomic) AVCaptureAudioDataOutput *audioOut;
    @property (strong, nonatomic) AVCaptureDeviceInput *videoIn;
    @property (strong, nonatomic) AVCaptureVideoDataOutput *videoOut;
    @property (strong, nonatomic) AVCaptureConnection *audioConnection;
    @property (strong, nonatomic) AVCaptureConnection *videoConnection;
    ------------------------------------------------------------------

    - (void)setupCaptureSession {
        // Set up the session
        self.session = [[AVCaptureSession alloc] init];
        [self.session setSessionPreset:AVCaptureSessionPreset640x480];

        // Create audio connection ----------------------------------------
        self.audioIn = [[AVCaptureDeviceInput alloc] initWithDevice:[self getAudioDevice] error:nil];
        if ([self.session canAddInput:self.audioIn]) {
            [self.session addInput:self.audioIn];
        }

        self.audioOut = [[AVCaptureAudioDataOutput alloc] init];
        dispatch_queue_t audioCaptureQueue = dispatch_queue_create("Audio Capture Queue", DISPATCH_QUEUE_SERIAL);
        [self.audioOut setSampleBufferDelegate:self queue:audioCaptureQueue];
        if ([self.session canAddOutput:self.audioOut]) {
            [self.session addOutput:self.audioOut];
        }
        self.audioConnection = [self.audioOut connectionWithMediaType:AVMediaTypeAudio];

        // Create video connection ----------------------------------------
        self.videoIn = [[AVCaptureDeviceInput alloc] initWithDevice:[self videoDeviceWithPosition:AVCaptureDevicePositionBack] error:nil];
        if ([self.session canAddInput:self.videoIn]) {
            [self.session addInput:self.videoIn];
        }

        self.videoOut = [[AVCaptureVideoDataOutput alloc] init];
        [self.videoOut setAlwaysDiscardsLateVideoFrames:NO];
        [self.videoOut setVideoSettings:nil];
        dispatch_queue_t videoCaptureQueue = dispatch_queue_create("Video Capture Queue", DISPATCH_QUEUE_SERIAL);
        [self.videoOut setSampleBufferDelegate:self queue:videoCaptureQueue];
        if ([self.session canAddOutput:self.videoOut]) {
            [self.session addOutput:self.videoOut];
        }

        self.videoConnection = [self.videoOut connectionWithMediaType:AVMediaTypeVideo];
        // SET THE ORIENTATION HERE -------------------------------------------------
        [self.videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
        // --------------------------------------------------------------------------

        // Create preview layer -------------------------------------------
        AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
        CGRect bounds = self.videoView.bounds;
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        previewLayer.bounds = bounds;
        previewLayer.position = CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
        [self.videoView.layer addSublayer:previewLayer];

        // Start the session
        [self.session startRunning];
    }
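
    One defensive addition (an assumption on my part, not part of the original answer): not every connection supports rotation, so it is worth guarding the call with `isVideoOrientationSupported`, and re-applying the orientation when the interface rotates. A hypothetical helper:

    // Sketch: guard before setting, and call this again on interface rotation.
    - (void)updateVideoOrientation:(AVCaptureVideoOrientation)orientation {
        if (self.videoConnection.isVideoOrientationSupported) {
            self.videoConnection.videoOrientation = orientation;
        }
    }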
