merging videos together (AVFoundation)


Question


In my app, I'm recording short videos and adding them to an NSMutableArray as AVAsset objects so that I keep a record of what has been captured. When the user presses a button to merge them, the final result is only the first video taken (for example, if three short videos were taken, the merged result is only the first one; the others do not appear). My code for iterating over the NSMutableArray and stitching the videos together is here:

if (self.capturedVideos.count != 0) {        
    //Create AVMutableComposition object. This object will hold our multiple AVMutableCompositionTracks.
    AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];

    for (AVAsset *asset in self.capturedVideos) {
        //check if the video is the first one captured so that it is placed at time 0.
        if ([self.capturedVideos indexOfObject:asset] == 0) {
            AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
            [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
            previousAsset = asset;
        } else{
            AVMutableCompositionTrack *track = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
            [track insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:previousAsset.duration error:nil];
            previousAsset = asset;
        }
    }

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *myPathDocs =  [documentsDirectory stringByAppendingPathComponent:
    [NSString stringWithFormat:@"mergeVideo-%d.mov",arc4random() % 1000]];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];
    // 5 - Create exporter
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL=url;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            [self exportDidFinish:exporter];
        });
    }];
}

What comes after the for loop exports the video so it can be saved to the camera roll. So where is my mistake? The durations are right (so there is no overlapping). However, I'm doubtful about one thing: there is an instance variable, previousAsset, that I added in braces after @implementation; it tracks the previous asset added, and thus where to place the next one. It is of class AVAsset, and I didn't initialize it, because when I try to, it shows me an error:

previousAsset = [[AVAsset alloc] init];
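
For reference, the instance-variable declaration described above would look roughly like this (the class name here is only a placeholder; an object-typed ivar defaults to nil, so it does not need to be initialized before the first assignment in the loop):

@implementation CaptureViewController { // placeholder class name
    AVAsset *previousAsset; // updated each time an asset is appended to the composition
}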


Answer 1:


Swift version

func merge(arrayVideos:[AVAsset], completion:@escaping (_ exporter: AVAssetExportSession) -> ()) -> Void {

  let mainComposition = AVMutableComposition()
  let compositionVideoTrack = mainComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
  compositionVideoTrack?.preferredTransform = CGAffineTransform(rotationAngle: .pi / 2)

  let soundtrackTrack = mainComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)

  var insertTime = kCMTimeZero

  for videoAsset in arrayVideos {
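    // Note: the try! calls and [0] subscripts below assume every asset has both a video
    // and an audio track; an asset with no audio track would crash here.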
    try! compositionVideoTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: videoAsset.tracks(withMediaType: .video)[0], at: insertTime)
    try! soundtrackTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: videoAsset.tracks(withMediaType: .audio)[0], at: insertTime)

    insertTime = CMTimeAdd(insertTime, videoAsset.duration)
  }

  let outputFileURL = URL(fileURLWithPath: NSTemporaryDirectory() + "merge.mp4")

  let fileManager = FileManager()
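  // removeItemIfExisted(_:) is not a Foundation API; it is assumed to be a small
  // FileManager extension that deletes the file at the given URL if one exists.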
  fileManager.removeItemIfExisted(outputFileURL)

  let exporter = AVAssetExportSession(asset: mainComposition, presetName: AVAssetExportPresetHighestQuality)

  exporter?.outputURL = outputFileURL
  exporter?.outputFileType = AVFileType.mp4
  exporter?.shouldOptimizeForNetworkUse = true

  exporter?.exportAsynchronously {
    DispatchQueue.main.async {
      completion(exporter!)
    }
  }
}



Answer 2:


This will work fine:

      AVMutableComposition *mainComposition = [[AVMutableComposition alloc] init];
      AVMutableCompositionTrack *compositionVideoTrack = [mainComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];


      AVMutableCompositionTrack *soundtrackTrack = [mainComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
      CMTime insertTime = kCMTimeZero;

      for (AVAsset *videoAsset in assets) {

          [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:insertTime error:nil];

          [soundtrackTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:insertTime error:nil];

          // Updating the insertTime for the next insert
          insertTime = CMTimeAdd(insertTime, videoAsset.duration);
      }

      NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
      NSString *documentsDirectory = [paths objectAtIndex:0];

      // Creating a full path and URL for the exported video
      NSString *outputVideoPath = [documentsDirectory stringByAppendingPathComponent:
                              [NSString stringWithFormat:@"mergeVideo-%d.mp4", arc4random() % 1000]];
      NSURL *outputVideoUrl = [NSURL fileURLWithPath:outputVideoPath];

      AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mainComposition presetName:AVAssetExportPreset640x480];

      // Setting attributes of the exporter
      exporter.outputURL = outputVideoUrl;
      exporter.outputFileType = AVFileTypeMPEG4;   // or AVFileTypeQuickTimeMovie with a .mov extension
      exporter.shouldOptimizeForNetworkUse = YES;
      [exporter exportAsynchronouslyWithCompletionHandler:^{
          dispatch_async(dispatch_get_main_queue(), ^{
              //completion(exporter);
              [self exportDidFinish:exporter];
              // [self exportDidFinish:exporter:assets];
          });
      }];

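Neither the question nor the answers show exportDidFinish:, which both completion handlers call. A minimal sketch of such a handler, assuming the merged file is saved to the camera roll with the Photos framework (the asker's actual implementation may differ), could look like this:

#import <Photos/Photos.h>

- (void)exportDidFinish:(AVAssetExportSession *)session {
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSURL *outputURL = session.outputURL;
        // Save the exported movie to the photo library (assumes authorization was already granted).
        [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
            [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:outputURL];
        } completionHandler:^(BOOL success, NSError *error) {
            NSLog(@"Saved to camera roll: %d, error: %@", success, error);
        }];
    } else if (session.status == AVAssetExportSessionStatusFailed) {
        NSLog(@"Export failed: %@", session.error);
    }
}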



Source: https://stackoverflow.com/questions/18540195/merging-videos-together-avfoundation
