Creating Thumbnail for Video in iOS

Asked 2020-12-05 03:12

I have an application that I am developing for the iPhone. It captures video from the camera and stores the video file on the file system.

I would like to create a thumbnail for this video.

5 Answers
  • 2020-12-05 03:36

    Try this:

    generate.requestedTimeToleranceBefore = kCMTimeZero;
    generate.requestedTimeToleranceAfter = kCMTimeZero;
    

    These need to be added to get the correct frame.
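
    For context, a minimal Swift sketch of where those two lines fit (the `generator` and `videoURL` names here are illustrative, not from the snippet above):

        import AVFoundation

        let generator = AVAssetImageGenerator(asset: AVURLAsset(url: videoURL))
        // Without zero tolerances, AVFoundation is free to return the nearest
        // keyframe rather than the exact frame requested.
        generator.requestedTimeToleranceBefore = kCMTimeZero
        generator.requestedTimeToleranceAfter = kCMTimeZero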

  • 2020-12-05 03:38

    A solution using the AVFoundation framework and Swift 3.0 (the commented-out lines are optional and are discussed below the code; decide whether you need them):

    import AVFoundation
    import UIKit
    
    func generateThumbnailForVideo(at url: URL) -> UIImage? {
        let kPreferredTimescale: Int32 = 1000
        let asset = AVURLAsset(url: url)
        let generator = AVAssetImageGenerator(asset: asset)
        generator.appliesPreferredTrackTransform = true
        //generator.requestedTimeToleranceBefore = kCMTimeZero
        //generator.requestedTimeToleranceAfter = kCMTimeZero
        //generator.maximumSize = CGSize(width: 100, height: 100)
    
        var actualTime = CMTime(seconds: 0, preferredTimescale: kPreferredTimescale)
        // generates the thumbnail at the first second of the video
        let cgImage = try? generator.copyCGImage(at: CMTime(seconds: 1, preferredTimescale: kPreferredTimescale), actualTime: &actualTime)
        return cgImage.map { UIImage(cgImage: $0, scale: UIScreen.main.scale, orientation: .up) }
    }
    

    Note that you may want to run this code on a background thread, as thumbnail creation can be a costly operation.
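
    A sketch of one way to do that with GCD, assuming the `generateThumbnailForVideo(at:)` function above (`videoURL` and `imageView` are illustrative names):

        // Generate the thumbnail on a background queue, then hop back to the
        // main queue before touching UIKit.
        DispatchQueue.global(qos: .userInitiated).async {
            let thumbnail = generateThumbnailForVideo(at: videoURL)
            DispatchQueue.main.async {
                imageView.image = thumbnail
            }
        }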

    Also, please take a look at some of the properties of AVAssetImageGenerator class:

    1. requestedTimeToleranceBefore (Apple's documentation):

    The maximum length of time before a requested time for which an image may be generated.

    The default value is kCMTimePositiveInfinity.

    Set the values of requestedTimeToleranceBefore and requestedTimeToleranceAfter to kCMTimeZero to request frame-accurate image generation; this may incur additional decoding delay.

    2. requestedTimeToleranceAfter (Apple's documentation):

    The maximum length of time after a requested time for which an image may be generated.

    The default value is kCMTimePositiveInfinity.

    Set the values of requestedTimeToleranceBefore and requestedTimeToleranceAfter to kCMTimeZero to request frame-accurate image generation; this may incur additional decoding delay.

    3. maximumSize (Apple's documentation):

    Specifies the maximum dimensions for generated image.

    The default value is CGSizeZero, which specifies the asset’s unscaled dimensions.

    AVAssetImageGenerator scales images such that they fit within the defined bounding box. Images are never scaled up. The aspect ratio of the scaled image is defined by the apertureMode property.

  • 2020-12-05 03:41

    A better solution is actually to use the AVFoundation framework for this. It bypasses the need to construct an MPMoviePlayerController, which causes the problem that the iris of the camera remains closed if it is used in conjunction with UIImagePickerController (at least, that's what I experienced).

    The code I use:

    + (UIImage *)thumbnailFromVideoAtURL:(NSURL *)contentURL {
        UIImage *theImage = nil;
        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:contentURL options:nil];
        AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
        generator.appliesPreferredTrackTransform = YES;
        NSError *err = nil;
        // grab a frame very near the start of the movie (1/60th of a second in)
        CMTime time = CMTimeMake(1, 60);
        CGImageRef imgRef = [generator copyCGImageAtTime:time actualTime:NULL error:&err];
    
        theImage = [[[UIImage alloc] initWithCGImage:imgRef] autorelease];
    
        // manual memory management (pre-ARC); drop these calls under ARC
        CGImageRelease(imgRef);
        [asset release];
        [generator release];
    
        return theImage;
    }
    
  • 2020-12-05 03:46

    Try this (it doesn't actually show the movie player):

    + (UIImage *)imageFromMovie:(NSURL *)movieURL atTime:(NSTimeInterval)time {
      // set up the movie player
      MPMoviePlayerController *mp = [[MPMoviePlayerController alloc] 
        initWithContentURL:movieURL];
      mp.shouldAutoplay = NO;
      mp.initialPlaybackTime = time;
      mp.currentPlaybackTime = time;
      // get the thumbnail
      UIImage *thumbnail = [mp thumbnailImageAtTime:time 
                               timeOption:MPMovieTimeOptionNearestKeyFrame];
      // clean up the movie player
      [mp stop];
      [mp release];
      return(thumbnail);
    }
    

    It's a synchronous call, so it may block the main thread somewhat, but it seems to run almost instantly for me when I use a time near the beginning of the movie. If you're doing this a lot, you can add it as a category on UIImage, which is what I did.

    I see from your question that you want to do this before the movie is saved, and I suspect it may not work without a file URL. However, if you're using UIImagePickerController for camera capture, you can pass this function the URL returned in the info dictionary of imagePickerController:didFinishPickingMediaWithInfo: under the key UIImagePickerControllerMediaURL.
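
    A Swift 3 sketch of that delegate callback, assuming a thumbnail helper like the AVFoundation `generateThumbnailForVideo(at:)` function in another answer (the helper name is illustrative):

        // Pull the recorded movie's file URL out of the picker's info
        // dictionary and hand it to a thumbnail helper.
        func imagePickerController(_ picker: UIImagePickerController,
                                   didFinishPickingMediaWithInfo info: [String : Any]) {
            if let movieURL = info[UIImagePickerControllerMediaURL] as? URL {
                let thumbnail = generateThumbnailForVideo(at: movieURL)
                // use the thumbnail...
            }
            picker.dismiss(animated: true, completion: nil)
        }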

  • 2020-12-05 03:55

    Very simple; try this...

    Step 1: Import header #import <MediaPlayer/MediaPlayer.h>

    Step 2: Get url path

    NSURL *videoURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"Sample" ofType:@"m4v"]];
    

    Step 3: Finally get thumbnail

    - (UIImage *)VideoThumbNail:(NSURL *)videoURL
    {
        MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
        UIImage *thumbnail = [player thumbnailImageAtTime:52.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
        [player stop];
        return thumbnail;
    }
    