iOS: AVPlayer - getting a snapshot of the current frame of a video

Asked by 难免孤独 on 2020-12-14 22:42

I have spent the whole day and went through a lot of SO answers, Apple references, documentations, etc, but no success.

I want a simple thing: I am playing a video with AVPlayer, and I want to get a snapshot (a UIImage) of the frame that is currently displayed.

2 Answers
  • 2020-12-14 23:06

AVAssetImageGenerator is the best way to snapshot a video; this method asynchronously returns a UIImage:

    import AVFoundation
    import UIKit
    
    // ...
    
    var player: AVPlayer? // = ...
    
    func screenshot(handler: @escaping (UIImage) -> Void) {
        guard let player = player,
            let asset = player.currentItem?.asset else {
                return
        }
    
        let imageGenerator = AVAssetImageGenerator(asset: asset)
        // Apply the video track's transform so the image is correctly oriented.
        imageGenerator.appliesPreferredTrackTransform = true
        let times = [NSValue(time: player.currentTime())]
    
        imageGenerator.generateCGImagesAsynchronously(forTimes: times) { _, image, _, _, _ in
            // Avoid force-unwrapping; generation can fail for a given time.
            if let image = image {
                handler(UIImage(cgImage: image))
            }
        }
    }
    

    (It's Swift 4.2)
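    A hypothetical call site might look like the following (the `thumbnailImageView` outlet is illustrative, not from the answer). Note that `generateCGImagesAsynchronously` invokes its handler on a background queue, so dispatch to the main queue before touching UIKit:

    ```swift
    screenshot { image in
        // The completion handler runs off the main thread;
        // UIKit may only be used on the main queue.
        DispatchQueue.main.async {
            self.thumbnailImageView.image = image // hypothetical outlet
        }
    }
    ```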

  • 2020-12-14 23:26

    AVPlayerItemVideoOutput works fine for me with an m3u8 stream. Maybe it's because I don't consult hasNewPixelBufferForItemTime and simply call copyPixelBufferForItemTime. This code produces a CVPixelBuffer instead of a UIImage, but there are answers that describe how to do that conversion.

    This answer mostly cribbed from here

    #import "ViewController.h"
    #import <AVFoundation/AVFoundation.h>
    
    @interface ViewController ()
    
    @property (nonatomic) AVPlayer *player;
    @property (nonatomic) AVPlayerItem *playerItem;
    @property (nonatomic) AVPlayerItemVideoOutput *playerOutput;
    
    @end
    
    @implementation ViewController
    - (void)setupPlayerWithLoadedAsset:(AVAsset *)asset {
        NSDictionary* settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
        self.playerOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
        self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
        [self.playerItem addOutput:self.playerOutput];
        self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
    
        AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
        playerLayer.frame = self.view.frame;
        [self.view.layer addSublayer:playerLayer];
    
        [self.player play];
    }
    
    - (IBAction)grabFrame {
        // Grab the pixel buffer for the frame currently being displayed.
        CVPixelBufferRef buffer = [self.playerOutput copyPixelBufferForItemTime:[self.playerItem currentTime] itemTimeForDisplay:nil];
        NSLog(@"The image: %@", buffer);
        // copyPixelBufferForItemTime follows the Create/Copy rule,
        // so the caller owns the buffer and must release it.
        if (buffer) {
            CVBufferRelease(buffer);
        }
    }
    
    - (void)viewDidLoad {
        [super viewDidLoad];
    
    
        NSURL *someUrl = [NSURL URLWithString:@"http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8"];
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:someUrl options:nil];
    
        [asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
    
            NSError* error = nil;
            AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
            if (status == AVKeyValueStatusLoaded)
            {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self setupPlayerWithLoadedAsset:asset];
                });
            }
            else
            {
                NSLog(@"%@ Failed to load the tracks.", self);
            }
        }];
    }
    
    @end
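    
    The CVPixelBuffer-to-UIImage conversion this answer leaves to other answers could be sketched in Swift via Core Image; this is a minimal sketch under the assumption that a Core Image round-trip is acceptable, not part of the answer's original code:

    ```swift
    import CoreImage
    import UIKit

    // Convert a CVPixelBuffer (e.g. from copyPixelBufferForItemTime) to a UIImage.
    func image(from pixelBuffer: CVPixelBuffer) -> UIImage? {
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        // Creating a CIContext is expensive; in real code, cache and reuse it.
        let context = CIContext()
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
            return nil
        }
        return UIImage(cgImage: cgImage)
    }
    ```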
    