How do I export UIImage array as a movie?


I have a serious problem: I have an NSArray with several UIImage objects. What I now want to do is create a movie from those UIImages.

10 Answers

    Here is the latest working code for iOS 8, in Objective-C.

    We had to make a variety of tweaks to @Zoul's answer above to get it to work with the latest version of Xcode and iOS 8. Here is our complete working code that takes an array of UIImages, writes them into a .mov file, saves it to a temp directory, and then moves it to the camera roll. We assembled code from several different posts to get this working, and the comments highlight the traps we had to work around.

    (1) Create a collection of UIImages

    [self saveMovieToLibrary]
    
    
    - (IBAction)saveMovieToLibrary
    {
        // You just need the height and width of the video here
        // For us, our input and output video was 640 height x 480 width
        // which is what we get from the iOS front camera
        ATHSingleton *singleton = [ATHSingleton singletons];
        int height = singleton.screenHeight;
        int width = singleton.screenWidth;
    
        // You can save a .mov or a .mp4 file        
        //NSString *fileNameOut = @"temp.mp4";
        NSString *fileNameOut = @"temp.mov";
    
        // We chose to save in the tmp/ directory on the device initially
        NSString *directoryOut = @"tmp/";
        NSString *outFile = [NSString stringWithFormat:@"%@%@",directoryOut,fileNameOut];
        NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:outFile];
        NSURL *videoTempURL = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), fileNameOut]];
    
        // WARNING: AVAssetWriter does not overwrite files for us, so remove the destination file if it already exists
        NSFileManager *fileManager = [NSFileManager defaultManager];
        [fileManager removeItemAtPath:[videoTempURL path]  error:NULL];
    
    
        // Create your own array of UIImages        
        NSMutableArray *images = [NSMutableArray array];
        for (int i = 0; i < numberOfFrames; i++)   // numberOfFrames = however many UIImages you have
        {
            // Fetch or create each of your own UIImage objects here
            UIImage *image = [self imageForFrame:i];   // placeholder for your own image source
            [images addObject:image];
        }
    
        // Hand the array of images to the writer method below
        [self writeImageAsMovie:images toPath:path size:CGSizeMake(width, height)];
    }

    This is the main method that creates your AssetWriter and adds images to it for writing.

    (2) Wire up an AVAssetWriter

    -(void)writeImageAsMovie:(NSArray *)array toPath:(NSString*)path size:(CGSize)size
    {
    
        NSError *error = nil;
    
        // FIRST, start up an AVAssetWriter instance to write your video
        // Give it a destination path (for us: tmp/temp.mov)
        AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                               fileType:AVFileTypeQuickTimeMovie
                                                                  error:&error];
    
    
        NSParameterAssert(videoWriter);
    
        NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                       AVVideoCodecH264, AVVideoCodecKey,
                                       [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                       [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                                       nil];
    
        AVAssetWriterInput* writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                             outputSettings:videoSettings];
    
        AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                                                                                         sourcePixelBufferAttributes:nil];
        NSParameterAssert(writerInput);
        NSParameterAssert([videoWriter canAddInput:writerInput]);
        [videoWriter addInput:writerInput];
    

    (3) Start a writing Session (NOTE: the method is continuing from above)

        //Start a SESSION of writing. 
        // After you start a session, you will keep adding image frames 
        // until you are complete - then you will tell it you are done.
        [videoWriter startWriting];
        // This starts your video at time = 0
        [videoWriter startSessionAtSourceTime:kCMTimeZero];
    
        CVPixelBufferRef buffer = NULL;
    
        // This was just our utility class to get screen sizes etc.    
        ATHSingleton *singleton = [ATHSingleton singletons];
    
        int i = 0;
        while (1)
        {
            // Check if the writer is ready for more data, if not, just wait
            if(writerInput.readyForMoreMediaData){
    
                CMTime frameTime = CMTimeMake(150, 600);
                // A CMTime is a value plus a timescale:
                //   timescale = the number of ticks per second
                //   value     = the number of ticks
                // Here each frame lasts 150/600 = 0.25 seconds (4 fps).
                // Apple recommends a timescale of 600 for video because it is a
                // multiple of the standard frame rates (24, 30, 60 fps, etc.).
                CMTime lastTime = CMTimeMake(i * 150, 600);
                CMTime presentTime = CMTimeAdd(lastTime, frameTime);

                if (i == 0) {presentTime = CMTimeMake(0, 600);}
                // This ensures the first frame starts at time 0.
    
    
                if (i >= [array count])
                {
                    buffer = NULL;
                }
                else
                {
                    // This command grabs the next UIImage and converts it to a CGImage
                    buffer = [self pixelBufferFromCGImage:[[array objectAtIndex:i] CGImage]];
                }
    
    
                if (buffer)
                {
                    // Hand the pixel buffer to the adaptor, stamped with its presentation time
                    [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
                    // pixelBufferFromCGImage: returns a +1 retained buffer (CF "Create" rule),
                    // so release it here or you will leak one buffer per frame
                    CVPixelBufferRelease(buffer);
                    i++;
                }
                else
                {
    

    (4) Finish the Session (Note: Method continues from above)

                    //Finish the session:
                    // This is important to be done exactly in this order
                    [writerInput markAsFinished];
                    // WARNING: finishWriting in the solution above is deprecated. 
                    // You now need to give a completion handler.
                    [videoWriter finishWritingWithCompletionHandler:^{
                        NSLog(@"Finished writing...checking completion status...");
                        if (videoWriter.status == AVAssetWriterStatusCompleted)
                        {
                            NSLog(@"Video writing succeeded.");
    
                            // Move video to camera roll
                            // NOTE: You cannot write directly to the camera roll. 
                            // You must first write to an iOS directory then move it!
                            NSURL *videoTempURL = [NSURL fileURLWithPath:path];
                            [self saveToCameraRoll:videoTempURL];
    
                        } else
                        {
                            NSLog(@"Video writing failed: %@", videoWriter.error);
                        }
    
                    }]; // end videoWriter finishWriting Block
    
                    CVPixelBufferPoolRelease(adaptor.pixelBufferPool);
    
                    NSLog (@"Done");
                    break;
                }
            }
        }    
    }
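
    As an aside, instead of spinning in a while(1) loop until readyForMoreMediaData turns YES, you can let AVFoundation call you back whenever the input can accept more frames via -requestMediaDataWhenReadyOnQueue:usingBlock:. Below is a minimal sketch of that approach, not part of our original code; frameCount and pixelBufferForFrame: are placeholders for your own image source.

        dispatch_queue_t mediaQueue = dispatch_queue_create("mediaInputQueue", NULL);
        __block int frameIndex = 0;
    
        [writerInput requestMediaDataWhenReadyOnQueue:mediaQueue usingBlock:^{
            // Invoked whenever the input can accept more buffers
            while (writerInput.readyForMoreMediaData) {
                if (frameIndex >= frameCount) {             // frameCount: placeholder for your image count
                    // Out of frames: close the input, then the writer
                    [writerInput markAsFinished];
                    [videoWriter finishWritingWithCompletionHandler:^{
                        NSLog(@"Writer finished with status %ld", (long)videoWriter.status);
                    }];
                    break;
                }
    
                // pixelBufferForFrame: is a placeholder for your own UIImage -> CVPixelBufferRef conversion
                CVPixelBufferRef buffer = [self pixelBufferForFrame:frameIndex];
                CMTime presentTime = CMTimeMake(frameIndex * 150, 600);   // 1/4 second per frame
                [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
                CVPixelBufferRelease(buffer);
                frameIndex++;
            }
        }];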
    

    (5) Convert your UIImages to a CVPixelBufferRef
    This method returns the CVPixelBufferRef that the AssetWriter needs, built from the CGImageRef you get from each UIImage (above).

    - (CVPixelBufferRef) pixelBufferFromCGImage: (CGImageRef) image
    {
        // This again was just our utility class for the height & width of the
        // incoming video (640 height x 480 width)
        ATHSingleton *singleton = [ATHSingleton singletons];
        int height = singleton.screenHeight;
        int width = singleton.screenWidth;
    
        NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                                 nil];
        CVPixelBufferRef pxbuffer = NULL;
    
        CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, width,
                                              height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options,
                                              &pxbuffer);
    
        NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
    
        CVPixelBufferLockBaseAddress(pxbuffer, 0);
        void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
        NSParameterAssert(pxdata != NULL);
    
        CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    
        CGContextRef context = CGBitmapContextCreate(pxdata, width,
                                                     height, 8, 4*width, rgbColorSpace,
                                                     kCGImageAlphaNoneSkipFirst);
        NSParameterAssert(context);
        CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
        CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                               CGImageGetHeight(image)), image);
        CGColorSpaceRelease(rgbColorSpace);
        CGContextRelease(context);
    
        CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    
        return pxbuffer;
    }
    

    (6) Move Your Video to the Camera Roll

    AVAssetWriter cannot write directly to the camera roll, so this step moves the video from "tmp/temp.mov" (or whatever filename you chose above) to the camera roll.

    - (void) saveToCameraRoll:(NSURL *)srcURL
    {
        NSLog(@"srcURL: %@", srcURL);
    
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        ALAssetsLibraryWriteVideoCompletionBlock videoWriteCompletionBlock =
        ^(NSURL *newURL, NSError *error) {
            if (error) {
                NSLog( @"Error writing image with metadata to Photo Library: %@", error );
            } else {
                NSLog( @"Wrote image with metadata to Photo Library %@", newURL.absoluteString);
            }
        };
    
        if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:srcURL])
        {
            [library writeVideoAtPathToSavedPhotosAlbum:srcURL
                                        completionBlock:videoWriteCompletionBlock];
        }
    }
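
    Note that ALAssetsLibrary has since been deprecated in favor of the Photos framework. If you are targeting iOS 8 or later, a rough sketch of the same save step using PHPhotoLibrary looks like this (the method name is ours, and it assumes you have already obtained photo library authorization):

    #import <Photos/Photos.h>
    
    - (void)saveToCameraRollUsingPhotos:(NSURL *)srcURL
    {
        [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
            // Create a new video asset in the user's photo library from the file on disk
            [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:srcURL];
        } completionHandler:^(BOOL success, NSError *error) {
            if (success) {
                NSLog(@"Video saved to the photo library.");
            } else {
                NSLog(@"Error saving video to photo library: %@", error);
            }
        }];
    }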
    

    Zoul's answer above gives a nice outline of what you will be doing. We commented this code extensively so you can see, with working code, how it was actually done.
