Mixing Images and Video using AVFoundation

Tom Haygarth

Well, I solved my issue another way. The animation route was not working, so my solution was to compile all of my insertable images into a temporary video file, then use that video to insert the images into my final output video.

Starting with the first link I originally posted, ASSETWriterInput for making Video from UIImages on Iphone Issues, I created the following function to build my temporary video:

void CreateFrameImageVideo(NSString* path)
{
    NSLog(@"Creating writer at path %@", path);
    NSError *error = nil;
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                                  [NSURL fileURLWithPath:path] fileType:AVFileTypeMPEG4
                                                              error:&error];

    NSLog(@"Creating video codec settings");
    NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [NSNumber numberWithInt:gVideoTrack.estimatedDataRate/*128000*/], AVVideoAverageBitRateKey,
                                   [NSNumber numberWithInt:gVideoTrack.nominalFrameRate],AVVideoMaxKeyFrameIntervalKey,
                                   AVVideoProfileLevelH264MainAutoLevel, AVVideoProfileLevelKey,
                                   nil];

    NSLog(@"Creating video settings");
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   codecSettings,AVVideoCompressionPropertiesKey,
                                   [NSNumber numberWithInt:1280], AVVideoWidthKey,
                                   [NSNumber numberWithInt:720], AVVideoHeightKey,
                                   nil];

    NSLog(@"Creating writter input");
    AVAssetWriterInput* writerInput = [[AVAssetWriterInput
                                        assetWriterInputWithMediaType:AVMediaTypeVideo
                                        outputSettings:videoSettings] retain];

    NSLog(@"Creating adaptor");
    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                     sourcePixelBufferAttributes:nil];

    [videoWriter addInput:writerInput];

    NSLog(@"Starting session");
    //Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];


    CMTime timeOffset = kCMTimeZero;//CMTimeMake(0, 600);

    NSLog(@"Video Width %d, Height: %d, writing frame video to file", gWidth, gHeight);

    CVPixelBufferRef buffer;

    for(int i = 0; i< gAnalysisFrames.size(); i++)
    {
        while (adaptor.assetWriterInput.readyForMoreMediaData == FALSE) {
            NSLog(@"Waiting inside a loop");
            NSDate *maxDate = [NSDate dateWithTimeIntervalSinceNow:0.1];
            [[NSRunLoop currentRunLoop] runUntilDate:maxDate];
        }

        //Write samples:
        buffer = pixelBufferFromCGImage(gAnalysisFrames[i].frameImage, gWidth, gHeight);

        [adaptor appendPixelBuffer:buffer withPresentationTime:timeOffset];
        CVPixelBufferRelease(buffer); // the adaptor retains what it needs; release our reference to avoid leaking one buffer per frame

        timeOffset = CMTimeAdd(timeOffset, gAnalysisFrames[i].duration);
    }

    while (adaptor.assetWriterInput.readyForMoreMediaData == FALSE) {
        NSLog(@"Waiting outside a loop");
        NSDate *maxDate = [NSDate dateWithTimeIntervalSinceNow:0.1];
        [[NSRunLoop currentRunLoop] runUntilDate:maxDate];
    }

    // Re-append the final frame at the end timestamp so the movie's duration
    // covers the last frame's full display time.
    buffer = pixelBufferFromCGImage(gAnalysisFrames[gAnalysisFrames.size()-1].frameImage, gWidth, gHeight);
    [adaptor appendPixelBuffer:buffer withPresentationTime:timeOffset];
    CVPixelBufferRelease(buffer);

    NSLog(@"Finishing session");
    //Finish the session:
    [writerInput markAsFinished];
    [videoWriter endSessionAtSourceTime:timeOffset];
    BOOL successfulWrite = [videoWriter finishWriting]; // synchronous; newer SDKs prefer finishWritingWithCompletionHandler:

    // if we failed to write the video
    if(!successfulWrite)
    {

        NSLog(@"Session failed with error: %@", [[videoWriter error] description]);

        // delete the temporary file created
        NSFileManager *fileManager = [NSFileManager defaultManager];
        if ([fileManager fileExistsAtPath:path]) {
            NSError *error;
            if ([fileManager removeItemAtPath:path error:&error] == NO) {
                NSLog(@"removeItemAtPath %@ error:%@", path, error);
            }
        }
    }
    else
    {
        NSLog(@"Session complete");
    }

    [writerInput release];
    [videoWriter release]; // balance the earlier alloc/init (this code is pre-ARC)

}
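
For completeness: the function above relies on a pixelBufferFromCGImage helper that is not shown in this post. Here is a minimal sketch of what it could look like, assuming frameImage is a CGImageRef. It follows the common pattern from the linked question; the exact implementation is my own illustration, not necessarily what the original project used:

static CVPixelBufferRef pixelBufferFromCGImage(CGImageRef image, int width, int height)
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], (NSString*)kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], (NSString*)kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];

    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                          kCVPixelFormatType_32ARGB,
                                          (CFDictionaryRef)options, &pxbuffer);
    if (status != kCVReturnSuccess || pxbuffer == NULL) {
        return NULL;
    }

    CVPixelBufferLockBaseAddress(pxbuffer, 0);

    // Wrap the pixel buffer's memory in a bitmap context and draw the image into it.
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pxbuffer),
                                                 width, height, 8,
                                                 CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace, kCGImageAlphaNoneSkipFirst);

    // Draw the image scaled to fill the video frame.
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);

    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    // The caller owns the returned buffer and must CVPixelBufferRelease() it.
    return pxbuffer;
}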

After the video is created, it is loaded as an AVAsset and its video track is extracted. The temporary video is then inserted by replacing the following line (from the first code block in the original post):

[mutableCompositionTrack insertEmptyTimeRange:CMTimeRangeMake(CMTimeAdd(gFrames[i].startTime, timeOffset), gFrames[i].duration)];

with:

[mutableCompositionTrack insertTimeRange:CMTimeRangeMake(timeOffset, gAnalysisFrames[i].duration)
                                 ofTrack:gFramesTrack
                                  atTime:CMTimeAdd(gAnalysisFrames[i].startTime, timeOffset)
                                   error:&gError];

where gFramesTrack is the AVAssetTrack created from the temporary frame video.
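
For reference, creating the temporary video and pulling gFramesTrack out of it might look something like this (a rough sketch; the temporary file path and the synchronous track access are my own assumptions, not from the original code):

// Build the temporary frame video in the app's temp directory.
NSString *tempPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"frameVideo.mp4"];
CreateFrameImageVideo(tempPath);

// Load it back as an asset and grab its (only) video track.
AVURLAsset *framesAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:tempPath] options:nil];
AVAssetTrack *gFramesTrack = [[framesAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];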

All of the code relating to the CALayer and CABasicAnimation objects has been removed, as it just was not working.

Not the most elegant solution, I'll admit, but one that at least works. I hope someone finds this useful.

This code also works on iOS devices (tested on an iPad 3).

Side note: The DebugLog function from the first post is just a callback to a function that prints out log messages; the calls can be replaced with NSLog() if need be.
