AVMutableVideoComposition rotated video captured in portrait mode

BornCoder

By default, when you export video using AVAssetExportSession, the video is rotated away from its original orientation. You have to apply the source track's transform to set the exact orientation. Please try the code below to do that.

- (AVMutableVideoCompositionLayerInstruction *)layerInstructionAfterFixingOrientationForAsset:(AVAsset *)inAsset 
                                                                                     forTrack:(AVMutableCompositionTrack *)inTrack
                                                                                       atTime:(CMTime)inTime
{
    //FIXING ORIENTATION//
    AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:inTrack];
    AVAssetTrack *videoAssetTrack = [[inAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    UIImageOrientation videoAssetOrientation_  = UIImageOrientationUp;
    BOOL  isVideoAssetPortrait_  = NO;
    CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;

    // Portrait
    if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
        videoAssetOrientation_ = UIImageOrientationRight;
        isVideoAssetPortrait_  = YES;
    }
    // Portrait upside down
    if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
        videoAssetOrientation_ = UIImageOrientationLeft;
        isVideoAssetPortrait_  = YES;
    }
    // Landscape right
    if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
        videoAssetOrientation_ = UIImageOrientationUp;
    }
    // Landscape left
    if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
        videoAssetOrientation_ = UIImageOrientationDown;
    }

    // Scale to fit a 320-point-wide output; adjust if your render size differs.
    CGFloat FirstAssetScaleToFitRatio = 320.0 / videoAssetTrack.naturalSize.width;

    if(isVideoAssetPortrait_) {
        FirstAssetScaleToFitRatio = 320.0/videoAssetTrack.naturalSize.height;
        CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(FirstAssetScaleToFitRatio,FirstAssetScaleToFitRatio);
        [videolayerInstruction setTransform:CGAffineTransformConcat(videoAssetTrack.preferredTransform, FirstAssetScaleFactor) atTime:kCMTimeZero];
    }else{
        CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(FirstAssetScaleToFitRatio,FirstAssetScaleToFitRatio);
        [videolayerInstruction setTransform:CGAffineTransformConcat(CGAffineTransformConcat(videoAssetTrack.preferredTransform, FirstAssetScaleFactor),CGAffineTransformMakeTranslation(0, 160)) atTime:kCMTimeZero];
    }
    // Hide this track from inTime onward (useful when stitching multiple assets).
    [videolayerInstruction setOpacity:0.0 atTime:inTime];
    return videolayerInstruction;
}

I hope this will help you.

AVAssetTrack *assetTrack = [[inAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

AVMutableCompositionTrack *mutableTrack = [mergeComposition mutableTrackCompatibleWithTrack:assetTrack];

AVMutableVideoCompositionLayerInstruction *assetInstruction = [self layerInstructionAfterFixingOrientationForAsset:inAsset forTrack:mutableTrack atTime:videoTotalDuration];

Above is the code that calls the method, where inAsset is your video asset, videoTotalDuration is your video's total duration as a CMTime, and mergeComposition is an AVMutableComposition object.

Hope this helps.

EDIT: This is not a callback method or event; you have to call it explicitly with the required parameters as mentioned above.
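
For completeness, here is a minimal sketch (my addition, not part of the original answer) of how the returned layer instruction could be wrapped in a video composition and exported. It reuses assetInstruction, mergeComposition and videoTotalDuration from the snippets above; outputURL, the 30 fps frame duration and the 320x320 render size are assumptions you should replace with your own values.

AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoTotalDuration);
mainInstruction.layerInstructions = @[assetInstruction];

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.instructions = @[mainInstruction];
videoComposition.frameDuration = CMTimeMake(1, 30);     // assumed 30 fps
videoComposition.renderSize = CGSizeMake(320.0, 320.0); // assumed; match the 320-point scaling above

AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mergeComposition presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = outputURL; // your destination file URL (assumed)
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.videoComposition = videoComposition;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    // Check exporter.status and exporter.error here.
}];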

Here's a slightly easier way if you simply want to maintain original rotation.

// Grab the source track from AVURLAsset for example.
AVAssetTrack *assetVideoTrack = [asset tracksWithMediaType:AVMediaTypeVideo].lastObject;

// Grab the composition video track from AVMutableComposition you already made.
AVMutableCompositionTrack *compositionVideoTrack = [composition tracksWithMediaType:AVMediaTypeVideo].lastObject;

// Apply the original transform.    
if (assetVideoTrack && compositionVideoTrack) {
   [compositionVideoTrack setPreferredTransform:assetVideoTrack.preferredTransform];
}

// Export...
Paresh Navadiya

Use the methods below to build an AVMutableVideoComposition with the correct orientation for the video asset.

-(AVMutableVideoComposition *) getVideoComposition:(AVAsset *)asset
{
  AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
  AVMutableComposition *composition = [AVMutableComposition composition];
  AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
  CGSize videoSize = videoTrack.naturalSize;
  BOOL isPortrait_ = [self isVideoPortrait:asset];
  if(isPortrait_) {
      NSLog(@"video is portrait ");
      videoSize = CGSizeMake(videoSize.height, videoSize.width);
  }
  composition.naturalSize     = videoSize;
  videoComposition.renderSize = videoSize;
  // videoComposition.renderSize = videoTrack.naturalSize; //
  videoComposition.frameDuration = CMTimeMakeWithSeconds( 1 / videoTrack.nominalFrameRate, 600);

  AVMutableCompositionTrack *compositionVideoTrack;
  compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
  [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:videoTrack atTime:kCMTimeZero error:nil];
  AVMutableVideoCompositionLayerInstruction *layerInst;
  layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
  [layerInst setTransform:videoTrack.preferredTransform atTime:kCMTimeZero];
  AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
  inst.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
  inst.layerInstructions = [NSArray arrayWithObject:layerInst];
  videoComposition.instructions = [NSArray arrayWithObject:inst];
  return videoComposition;
}


-(BOOL) isVideoPortrait:(AVAsset *)asset
{
  BOOL isPortrait = FALSE;
  NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
  if([tracks count] > 0) {
    AVAssetTrack *videoTrack = [tracks objectAtIndex:0];

    CGAffineTransform t = videoTrack.preferredTransform;
    // Portrait
    if(t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0)
    {
        isPortrait = YES;
    }
    // PortraitUpsideDown
    if(t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0)
    {
        isPortrait = YES;
    }
    // LandscapeRight
    if(t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0)
    {
        isPortrait = NO;
    }
    // LandscapeLeft
    if(t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0)
    {
        isPortrait = NO;
    }
   }
  return isPortrait;
}
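
A short usage sketch (my addition, not from the original answer): pass your asset to getVideoComposition: and assign the result to the videoComposition property of your AVAssetExportSession (or AVPlayerItem). Here asset and exporter are placeholders for objects you already have.

AVMutableVideoComposition *videoComposition = [self getVideoComposition:asset];
exporter.videoComposition = videoComposition; // frames are now rendered with the corrected orientation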

In Swift, dizy's answer above works for me:

  let assetVideoTrack = (sourceAsset.tracksWithMediaType(AVMediaTypeVideo)).last as! AVAssetTrack

  let compositionVideoTrack = (composition.tracksWithMediaType(AVMediaTypeVideo)).last as! AVMutableCompositionTrack

  if (assetVideoTrack.playable && compositionVideoTrack.playable) {
      compositionVideoTrack.preferredTransform = assetVideoTrack.preferredTransform
  }

Swift 2:


do {
    let paths = NSSearchPathForDirectoriesInDomains(
        NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomainMask.UserDomainMask, true)
    let documentsDirectory: AnyObject = paths[0]
    // This will be changed to accommodate dynamic videos.
    let dataPath = documentsDirectory.stringByAppendingPathComponent(videoFileName + ".MOV")
    let videoAsset = AVURLAsset(URL: NSURL(fileURLWithPath: dataPath), options: nil)
    let imgGenerator = AVAssetImageGenerator(asset: videoAsset)
    imgGenerator.appliesPreferredTrackTransform = true
    let cgImage = try imgGenerator.copyCGImageAtTime(CMTimeMake(0, 1), actualTime: nil)
    let uiImage = UIImage(CGImage: cgImage)

    videoThumb.image = uiImage
} catch let err as NSError {
    print("Error generating thumbnail: \(err)")
}

Here is Swift 4 code that works well for merging videos with the correct orientation:

    let mainComposition = AVMutableComposition()

    let compositionVideoTrack = mainComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
    let soundtrackTrack = mainComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)

    var insertTime = kCMTimeZero

    for videoAsset in arrayVideos {
        try! compositionVideoTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: videoAsset.tracks(withMediaType: .video)[0], at: insertTime)

        // Carry the source track's preferred transform over to the composition track.
        let assetVideoTrack = videoAsset.tracks(withMediaType: .video).last!
        let compositionVideoTrack = mainComposition.tracks(withMediaType: .video).last!
        compositionVideoTrack.preferredTransform = assetVideoTrack.preferredTransform

        try! soundtrackTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: videoAsset.tracks(withMediaType: .audio)[0], at: insertTime)

        insertTime = CMTimeAdd(insertTime, videoAsset.duration)
    }

    let outputFileURL = URL(fileURLWithPath: NSTemporaryDirectory() + "merge1uupo.MOV")

    let fileManager = FileManager()
    //fileManager.removeItemIfExisted(outputFileURL)

    let exporter = AVAssetExportSession(asset: mainComposition, presetName: AVAssetExportPresetHighestQuality)
    exporter?.outputURL = outputFileURL
    exporter?.outputFileType = AVFileType.mov
    exporter?.shouldOptimizeForNetworkUse = true

    exporter?.exportAsynchronously {
        DispatchQueue.main.async {
            // Do what you want at the end
        }
    }