iOS - AVAssetExportSession can only export a maximum of 8 tracks after playing with AVPlayer

Submitted by 这一生的挚爱 on 2019-12-06 08:33:53

Question


I'm trying to loop some fragments of a recorded video and merge them into one video. I've successfully merged and exported a composition with up to 16 tracks. But when I try to play the composition using AVPlayer before merging, I can only export a maximum of 8 tracks.

First, I create the AVComposition and AVVideoComposition:

    +(void)previewUserClipDanceWithAudio:(NSURL*)videoURL audioURL:(NSURL*)audioFile loop:(NSArray*)loopTime slowMotion:(NSArray*)slowFactor showInViewController:(UIViewController*)viewController completion:(void(^)(BOOL success, AVVideoComposition* videoComposition, AVComposition* composition))completion{

AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];
NSMutableArray *arrayInstruction = [[NSMutableArray alloc] init];
AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];

AVURLAsset  *audioAsset = [[AVURLAsset alloc]initWithURL:audioFile options:nil];
//NSLog(@"audio File %@",audioFile);

CMTime duration = kCMTimeZero;

AVAsset *currentAsset = [AVAsset assetWithURL:videoURL];
BOOL  isCurrentAssetPortrait  = YES;

for(NSInteger i=0;i< [loopTime count]; i++) {

    //handle looptime array
    NSInteger loopDur = [[loopTime objectAtIndex:i] intValue];
    NSInteger value = labs(loopDur);
    //NSLog(@"loopInfo %d value %d",loopInfo,value);
    //handle slowmotion array
    double slowInfo = [[slowFactor objectAtIndex:i] doubleValue];
    double videoScaleFactor = fabs(slowInfo);

    AVMutableCompositionTrack *currentTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack;
    audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                             preferredTrackID:kCMPersistentTrackID_Invalid];
    if (i==0) {
        [currentTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, currentAsset.duration) ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:duration error:nil];

        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, currentAsset.duration) ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:duration error:nil];

    } else {

        [currentTrack insertTimeRange:CMTimeRangeMake(CMTimeSubtract(currentAsset.duration, CMTimeMake(value, 10)), CMTimeMake(value, 10)) ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:duration error:nil];

        if (videoScaleFactor==1) {

            [audioTrack insertTimeRange:CMTimeRangeMake(CMTimeSubtract(currentAsset.duration, CMTimeMake(value, 10)), CMTimeMake(value, 10)) ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:duration error:nil];
        }
        //slow motion here
        if (videoScaleFactor!=1) {

            [currentTrack scaleTimeRange:CMTimeRangeMake(CMTimeSubtract(currentAsset.duration, CMTimeMake(value, 10)), CMTimeMake(value, 10))
                              toDuration:CMTimeMake(value*videoScaleFactor, 10)];
            NSLog(@"slowmo %f",value*videoScaleFactor);
        }
    }

    AVMutableVideoCompositionLayerInstruction *currentAssetLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:currentTrack];
    AVAssetTrack *currentAssetTrack = [[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    BOOL  isCurrentAssetPortrait  = YES;
    //CGFloat assetScaleToFitRatio;
    //assetScaleToFitRatio = [self getScaleToFitRatioCurrentTrack:currentTrack];

    if(isCurrentAssetPortrait){
        //NSLog(@"portrait");
        if (slowInfo<0) {
            CGRect screenRect = [[UIScreen mainScreen] bounds];
            CGFloat ratio = screenRect.size.height / screenRect.size.width;

            // we have to adjust the ratio for 16:9 screens
            if (ratio == 1.775) ratio = 1.77777777777778;

            CGFloat complimentSize = (currentAssetTrack.naturalSize.height*ratio);
            CGFloat tx = (currentAssetTrack.naturalSize.width-complimentSize)/2;

            // invert translation because of portrait
            tx *= -1;
            // t1: rotate and position video since it may have been cropped to screen ratio
            CGAffineTransform t1 = CGAffineTransformTranslate(currentAssetTrack.preferredTransform, tx, 0);
            // t2/t3: mirror video vertically

            CGAffineTransform t2 = CGAffineTransformTranslate(t1, currentAssetTrack.naturalSize.width, 0);
            CGAffineTransform t3 = CGAffineTransformScale(t2, -1, 1);

            [currentAssetLayerInstruction setTransform:t3 atTime:duration];

        } else if (loopDur<0) {
            CGRect screenRect = [[UIScreen mainScreen] bounds];
            CGFloat ratio = screenRect.size.height / screenRect.size.width;

            // we have to adjust the ratio for 16:9 screens
            if (ratio == 1.775) ratio = 1.77777777777778;

            CGFloat complimentSize = (currentAssetTrack.naturalSize.height*ratio);
            CGFloat tx = (currentAssetTrack.naturalSize.width-complimentSize)/2;

            // invert translation because of portrait
            tx *= -1;
            // t1: rotate and position video since it may have been cropped to screen ratio
            CGAffineTransform t1 = CGAffineTransformTranslate(currentAssetTrack.preferredTransform, tx, 0);
            // t2/t3: mirror video horizontally
            CGAffineTransform t2 = CGAffineTransformTranslate(t1, 0, currentAssetTrack.naturalSize.height);
            CGAffineTransform t3 = CGAffineTransformScale(t2, 1, -1);

            [currentAssetLayerInstruction setTransform:t3 atTime:duration];

        } else {

            [currentAssetLayerInstruction setTransform:currentAssetTrack.preferredTransform atTime:duration];

        }
    }else{
        //            CGFloat translateAxisX = (currentTrack.naturalSize.width > MAX_WIDTH )?(0.0):0.0;// if use <, 640 video will be moved left by 10px. (float)(MAX_WIDTH - currentTrack.naturalSize.width)/(float)4.0
        //            CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(assetScaleToFitRatio,assetScaleToFitRatio);
        //            [currentAssetLayerInstruction setTransform:
        //             CGAffineTransformConcat(CGAffineTransformConcat(currentAssetTrack.preferredTransform, FirstAssetScaleFactor),CGAffineTransformMakeTranslation(translateAxisX, 0)) atTime:duration];
    }
    if (i==0) {
        duration=CMTimeAdd(duration, currentAsset.duration);
    } else  {
        if (videoScaleFactor!=1) {
            duration=CMTimeAdd(duration, CMTimeMake(value*videoScaleFactor, 10));
        } else {
            duration=CMTimeAdd(duration, CMTimeMake(value, 10));
        }
    }

    [currentAssetLayerInstruction setOpacity:0.0 atTime:duration];
    [arrayInstruction addObject:currentAssetLayerInstruction];
}

AVMutableCompositionTrack *AudioBGTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[AudioBGTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration) ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:CMTimeSubtract(duration, audioAsset.duration) error:nil];

videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, duration);
videoCompositionInstruction.layerInstructions = arrayInstruction;

CGSize naturalSize;
if(isCurrentAssetPortrait){
    naturalSize = CGSizeMake(MAX_HEIGHT,MAX_WIDTH);//currentAssetTrack.naturalSize.height,currentAssetTrack.naturalSize.width);
} else {
    naturalSize = CGSizeMake(MAX_WIDTH,MAX_HEIGHT);//currentAssetTrack.naturalSize;
}

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.instructions = [NSArray arrayWithObject:videoCompositionInstruction];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = CGSizeMake(naturalSize.width,naturalSize.height);
NSLog(@"prepared");

AVVideoComposition *composition = [videoComposition copy];
AVComposition *mixedComposition = [mixComposition copy];
completion(YES, composition, mixedComposition);
}

Then, I set up the AVPlayer:

    -(void)playVideoWithComposition:(AVVideoComposition*)videoComposition inMutableComposition:(AVComposition*)composition{

MBProgressHUD *hud = [MBProgressHUD showHUDAddedTo:self.view animated:YES];
hud.label.text = myLanguage(@"kMergeClip");

savedComposition = [composition copy];
savedVideoComposition = [videoComposition copy];
playerItem = [AVPlayerItem playerItemWithAsset:composition];
playerItem.videoComposition = videoComposition;

[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(repeatVideo:) name:AVPlayerItemDidPlayToEndTimeNotification object:playerItem];

if (!player) {
    player = [AVPlayer playerWithPlayerItem:playerItem];
    layer = [AVPlayerLayer playerLayerWithPlayer:player];
    layer.frame = [UIScreen mainScreen].bounds;
    [self.ibPlayerView.layer insertSublayer:layer atIndex:0];
    NSLog(@"create new player");
}

if (player.currentItem != playerItem ) {
    [player replaceCurrentItemWithPlayerItem:playerItem];
}
player.actionAtItemEnd = AVPlayerActionAtItemEndNone;
//[player seekToTime:kCMTimeZero];

[playerItem addObserver:self
             forKeyPath:@"status"
                options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                context:(__bridge void *)@"AVPlayerStatus"]; // bridge cast: KVO contexts are void *, so the string literal needs __bridge under ARC
}

When the user has previewed all the video they want and hits save, I use this method to export:

    +(void)mergeUserCLip:(AVVideoComposition*)videoComposition withAsset:(AVComposition*)mixComposition showInViewController:(UIViewController*)viewController completion:(void(^)(BOOL success, NSURL *fileURL))completion{

MBProgressHUD *hud = [MBProgressHUD showHUDAddedTo:viewController.view animated:YES];
hud.mode = MBProgressHUDModeDeterminateHorizontalBar;
hud.label.text = myLanguage(@"kMergeClip");

//Name merge clip using beat name
//NSString* beatName = [[[NSString stringWithFormat:@"%@",audioFile] lastPathComponent] stringByDeletingPathExtension];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *tmpDir = [[documentsDirectory stringByDeletingLastPathComponent] stringByAppendingPathComponent:@"tmp"];
NSString *myPathDocs =  [tmpDir stringByAppendingPathComponent:[NSString stringWithFormat:@"merge-beat.mp4"]];
//Not remove here, will remove when call previewPlayVC
[[NSFileManager defaultManager] removeItemAtPath:myPathDocs error:nil];

// 1 - set up the overlay
CALayer *overlayLayer = [CALayer layer];
UIImage *overlayImage = [UIImage imageNamed:@"watermark.png"];

[overlayLayer setContents:(id)[overlayImage CGImage]];
overlayLayer.frame = CGRectMake(720-221, 1280-109, 181, 69);
[overlayLayer setMasksToBounds:YES];

//    aLayer  = [CALayer layer];
//    [aLayer addSublayer:labelLogo.layer];
//    aLayer.frame = CGRectMake(MAX_WIDTH- labelLogo.width - 10.0, MAX_HEIGHT-50.0, 20.0, 20.0);
//    aLayer.opacity = 1;

// 2 - set up the parent layer
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, MAX_HEIGHT,MAX_WIDTH);
videoLayer.frame = CGRectMake(0, 0, MAX_HEIGHT,MAX_WIDTH);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:overlayLayer];

// 3 - apply magic
AVMutableVideoComposition *mutableVideoComposition = [videoComposition mutableCopy]; // mutableCopy (not copy), so animationTool can be set below
mutableVideoComposition.animationTool = [AVVideoCompositionCoreAnimationTool
                                  videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];

NSURL *url = [NSURL fileURLWithPath:myPathDocs];
myLog(@"Path: %@", myPathDocs);
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPreset1280x720];
exporter.outputURL = url;
exporter.outputFileType = AVFileTypeMPEG4;
exporter.videoComposition = mutableVideoComposition;
exporter.shouldOptimizeForNetworkUse = NO;

[exporter exportAsynchronouslyWithCompletionHandler:^ {
    //NSLog(@"exporting");
    switch (exporter.status) {
        case AVAssetExportSessionStatusCompleted: {
            NSURL *url = [NSURL fileURLWithPath:myPathDocs];
            hud.progress = 1.0f;
            dispatch_async(dispatch_get_main_queue(), ^{
                [MBProgressHUD hideHUDForView:viewController.view animated:YES];
            });
            [self checkTmpSize];
            if (completion) {
                completion(YES, url);
            }
        }
            break;
        case AVAssetExportSessionStatusExporting:
            myLog(@"Exporting!");
            break;
        case AVAssetExportSessionStatusWaiting:
            myLog(@"Waiting");
            break;
        case AVAssetExportSessionStatusFailed:
            myLog(@"Export failed: %@", exporter.error);
            dispatch_async(dispatch_get_main_queue(), ^{
                [MBProgressHUD hideHUDForView:viewController.view animated:YES];
            });
            if (completion) {
                completion(NO, nil);
            }
            break;
        default:
            break;
    }
}];
}

If I select options that loop fewer than 8 times, the above code works fine. If I select options that loop more than 8 times, the export session freezes and exporter.progress stays at 0.0000000. If I remove this line:

    playerItem.videoComposition = videoComposition;

Then I cannot preview the mixed video, but I am able to export normally (up to 16 tracks).

Or, if I remove this line from the export code:

    exporter.videoComposition = mutableVideoComposition;

Then it's possible to preview the mixed video and export normally, but WITHOUT the video composition applied.

So I guess there's something wrong with AVVideoComposition and/or the way I implement it.

I would appreciate any suggestion. Many thanks.

I suspect the reason for this is that using AVPlayer to preview the video somehow hinders AVAssetExportSession, as described in the posts below:

iOS 5: Error merging 3 videos with AVAssetExportSession

AVPlayerItem fails with AVStatusFailed and error code “Cannot Decode”


Answer 1:


I ran into this issue while attempting to concatenate N videos while playing up to 3 videos in AVPlayer instances inside a UICollectionView. As discussed in the Stack Overflow question you linked, iOS can only handle so many instances of AVPlayer; each instance uses up a "render pipeline". I discovered that each instance of AVMutableCompositionTrack also uses up one of these render pipelines.

Therefore, if you use too many AVPlayer instances or try to create an AVMutableComposition with too many AVMutableCompositionTrack tracks, you can run out of resources to decode H.264 and you will receive the "Cannot Decode" error. I was able to get around the issue by using only two instances of AVMutableCompositionTrack. This way I could "overlap" segments of video while also applying transitions (which requires two video tracks to "play" concurrently).

In short: minimize your usage of AVMutableCompositionTrack as well as AVPlayer. You can check out the AVCustomEdit sample code by Apple for an example of this. Specifically, check out the buildTransitionComposition method inside the APLSimpleEditor class.
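For illustration only, here is a minimal sketch of the two-track idea described above (it is not the Apple sample itself, and sourceVideoTrack and segmentRanges are placeholder names, not code from the question): each looped segment is inserted into one of just two reusable video tracks, alternating between them, so the composition never needs more than two decoder pipelines.

    #import <AVFoundation/AVFoundation.h>

    // Sketch: segmentRanges is an NSArray of NSValue-wrapped CMTimeRanges taken
    // from sourceVideoTrack; both are illustrative placeholders.
    static AVMutableComposition *BuildTwoTrackComposition(AVAssetTrack *sourceVideoTrack,
                                                          NSArray<NSValue *> *segmentRanges) {
        AVMutableComposition *composition = [AVMutableComposition composition];
        AVMutableCompositionTrack *trackA = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                      preferredTrackID:kCMPersistentTrackID_Invalid];
        AVMutableCompositionTrack *trackB = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                      preferredTrackID:kCMPersistentTrackID_Invalid];
        CMTime cursor = kCMTimeZero;
        for (NSUInteger i = 0; i < segmentRanges.count; i++) {
            CMTimeRange range = [segmentRanges[i] CMTimeRangeValue];
            // Even segments go on trackA, odd segments on trackB, so only two
            // video tracks (and two render pipelines) are ever in use.
            AVMutableCompositionTrack *target = (i % 2 == 0) ? trackA : trackB;
            [target insertTimeRange:range ofTrack:sourceVideoTrack atTime:cursor error:nil];
            cursor = CMTimeAdd(cursor, range.duration);
        }
        return composition;
    }

The question's per-segment logic (scaleTimeRange:, layer instructions, transforms) could stay as it is; only the choice of destination track changes.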




Answer 2:


Try this: clear the player item before exporting:

[self.player replaceCurrentItemWithPlayerItem:nil];
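
If that single call is not enough, a slightly fuller teardown might help; the following is a sketch based on the player, playerItem, and observers set up in the question, detaching everything before the export starts:

    // Sketch: release the preview player's hold on the composition before exporting.
    [playerItem removeObserver:self forKeyPath:@"status"];
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:AVPlayerItemDidPlayToEndTimeNotification
                                                  object:playerItem];
    [player pause];
    [player replaceCurrentItemWithPlayerItem:nil];
    playerItem = nil;
    // ...then start the AVAssetExportSession as before.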


Source: https://stackoverflow.com/questions/35475253/ios-avassestexportsession-can-only-export-maximum-8-tracks-after-playing-with
