Can I use AVFoundation to stream downloaded video frames into an OpenGL ES texture?


I've been able to use AVFoundation's AVAssetReader class to upload video frames into an OpenGL ES texture. It has a caveat, however, in that it fails when used with an AVURLAsset that points to remote media.

1 Answer

    There's some API that was released with iOS 6 that I've been able to use to make the process a breeze. It doesn't use AVAssetReader at all, and instead relies on a class called AVPlayerItemVideoOutput. An instance of this class can be added to any AVPlayerItem instance via a new -addOutput: method.

    Unlike the AVAssetReader, this class will work fine for AVPlayerItems that are backed by a remote AVURLAsset, and also has the benefit of allowing for a more sophisticated playback interface that supports non-linear playback via -copyPixelBufferForItemTime:itemTimeForDisplay: (instead of AVAssetReader's severely limiting -copyNextSampleBuffer method).


    SAMPLE CODE

    // Initialize the AVFoundation state
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:someUrl options:nil];
    [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^{
    
        NSError* error = nil;
        AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
        if (status == AVKeyValueStatusLoaded)
        {
            NSDictionary* settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] };
            AVPlayerItemVideoOutput* output = [[[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings] autorelease];
            AVPlayerItem* playerItem = [AVPlayerItem playerItemWithAsset:asset];
            [playerItem addOutput:output];
            AVPlayer* player = [AVPlayer playerWithPlayerItem:playerItem];
    
            // Assume some instance variables exist here. You'll need them to control
            // playback of the video (via the AVPlayer) and to copy sample buffers
            // (via the AVPlayerItemVideoOutput).
            [self setPlayer:player];
            [self setPlayerItem:playerItem];
            [self setOutput:output];
        }
        else
        {
            NSLog(@"%@ Failed to load the tracks.", self);
        }
    }];
    
    // Now at any later point in time, you can get a pixel buffer
    // that corresponds to the current AVPlayer state like this:
    CVPixelBufferRef buffer = [[self output] copyPixelBufferForItemTime:[[self playerItem] currentTime] itemTimeForDisplay:nil];
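
    One practical note: -copyPixelBufferForItemTime:itemTimeForDisplay: follows the Core Foundation "copy" rule, so you own the returned buffer and need to release it when you're done. A common way to drive these pulls is a CADisplayLink callback that first asks the output whether a new frame is ready. Here's a minimal sketch of that, assuming a display link has been scheduled elsewhere; the -uploadPixelBuffer: helper is hypothetical (the texture cache sketch below shows what it might do):

    - (void)displayLinkFired:(CADisplayLink *)link
    {
        CMTime itemTime = [[self playerItem] currentTime];

        // Only copy a buffer when the output actually has a new frame for this time.
        if ([[self output] hasNewPixelBufferForItemTime:itemTime])
        {
            CVPixelBufferRef buffer = [[self output] copyPixelBufferForItemTime:itemTime
                                                             itemTimeForDisplay:nil];
            if (buffer != NULL)
            {
                [self uploadPixelBuffer:buffer]; // hypothetical helper
                CVBufferRelease(buffer);         // we own the copied buffer
            }
        }
    }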
    

    Once you've got your buffer, you can upload it to OpenGL however you want. I recommend the horribly documented CVOpenGLESTextureCacheCreateTextureFromImage() function, because you'll get hardware acceleration on all the newer devices, which is much faster than glTexSubImage2D(). See Apple's GLCameraRipple and RosyWriter demos for examples.
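
    For reference, here's a minimal sketch of that texture cache path, assuming a current EAGLContext stored in _context and the kCVPixelFormatType_32BGRA buffers configured above (the instance variables here are assumptions, not part of the original code):

    #import <CoreVideo/CVOpenGLESTextureCache.h>
    #import <OpenGLES/ES2/gl.h>
    #import <OpenGLES/ES2/glext.h>

    // One-time setup: create a texture cache bound to your EAGL context.
    CVOpenGLESTextureCacheRef _textureCache;
    CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, _context, NULL, &_textureCache);

    // Per frame: wrap the pixel buffer in a GL texture (zero-copy on supported devices).
    CVOpenGLESTextureRef texture = NULL;
    CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                                _textureCache,
                                                                buffer,
                                                                NULL,
                                                                GL_TEXTURE_2D,
                                                                GL_RGBA,
                                                                (GLsizei)CVPixelBufferGetWidth(buffer),
                                                                (GLsizei)CVPixelBufferGetHeight(buffer),
                                                                GL_BGRA,
                                                                GL_UNSIGNED_BYTE,
                                                                0, // plane index; 32BGRA buffers are single-plane
                                                                &texture);
    if (err == kCVReturnSuccess)
    {
        glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

        // ... draw with the texture, then let the cache recycle the entry ...
        CFRelease(texture);
        CVOpenGLESTextureCacheFlush(_textureCache, 0);
    }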
