Processing all frames in an AVAsset


Question


I am trying to go through each frame in an AVAsset and process each frame as if it were an image. I have not been able to find anything from my searches.

The task I am trying to accomplish would look like this in pseudo-code

for each frame in asset

      take the frame as an image and convert to a cvMat
      Process and store data of center points
      Store center points in array

The only part of that pseudo-code I do not know how to write is going through each frame and capturing it as an image.

Can anyone help?


Answer 1:


One answer is to use AVAssetImageGenerator.

1) Load the movie file into an AVAsset object.
2) Create an AVAssetImageGenerator object.
3) Pass in the estimated time of the frame you want, and you get an image back from the movie.

Setting the two properties requestedTimeToleranceBefore and requestedTimeToleranceAfter on the AVAssetImageGenerator object to kCMTimeZero improves the chances of getting exactly the frame you asked for rather than a nearby one, but it increases the processing time (see the two extra lines in the code below).

However, this method is slow, and I have not found a faster way.

//Load the Movie from a URL
self.movieAsset = [AVAsset assetWithURL:self.movieURL];
NSArray *movieTracks = [self.movieAsset tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *movieTrack = [movieTracks objectAtIndex:0];

//Make the image Generator    
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:self.movieAsset];
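
//Optionally request exact frames, per the note above (these two lines are an addition, not part of the original answer); more accurate but slower
imageGenerator.requestedTimeToleranceBefore = kCMTimeZero;
imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;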

//Create variables for the time estimation
Float64 durationSeconds = CMTimeGetSeconds(self.movieAsset.duration);
Float64 timePerFrame = 1.0 / (Float64)movieTrack.nominalFrameRate;
Float64 totalFrames = durationSeconds * movieTrack.nominalFrameRate;

//Step through the frames
for (int counter = 0; counter < totalFrames; counter++){
    CMTime actualTime;
    Float64 secondsIn = ((Float64)counter/totalFrames)*durationSeconds;
    CMTime imageTimeEstimate = CMTimeMakeWithSeconds(secondsIn, 600); //600 is a common video timescale
    NSError *error;
    CGImageRef image = [imageGenerator copyCGImageAtTime:imageTimeEstimate actualTime:&actualTime error:&error];

    //...do some processing on the image here (e.g. convert it to a cv::Mat and find center points)

    CGImageRelease(image);
}



Answer 2:


You could simply get each frame using AVAssetReaderTrackOutput:

import AVFoundation

let asset = AVAsset(url: inputUrl)
let reader = try! AVAssetReader(asset: asset)
let videoTrack = asset.tracks(withMediaType: .video).first!

// Ask the reader to decode frames as 32-bit BGRA pixel buffers
let outputSettings = [String(kCVPixelBufferPixelFormatTypeKey): NSNumber(value: kCVPixelFormatType_32BGRA)]
let trackReaderOutput = AVAssetReaderTrackOutput(track: videoTrack,
                                                 outputSettings: outputSettings)

reader.add(trackReaderOutput)
reader.startReading()

// Frames are delivered sequentially; the loop ends when the track is exhausted
while let sampleBuffer = trackReaderOutput.copyNextSampleBuffer() {
    if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
        // do what you want
    }
}
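
For the "take the frame as an image" step in the question, here is a minimal sketch (my own addition, not part of the original answer; the name makeCGImage is made up) that turns each decoded pixel buffer into a CGImage via Core Image, which you could then hand to OpenCV or any other processing code:

import AVFoundation
import CoreImage

let ciContext = CIContext()

// Sketch: convert one decoded CVImageBuffer into a CGImage
func makeCGImage(from imageBuffer: CVImageBuffer) -> CGImage? {
    let ciImage = CIImage(cvImageBuffer: imageBuffer)
    return ciContext.createCGImage(ciImage, from: ciImage.extent)
}

// Possible usage inside the reading loop above:
// if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
//    let cgImage = makeCGImage(from: imageBuffer) {
//     let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
//     // process cgImage and store your center points here
// }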


Source: https://stackoverflow.com/questions/26600979/processing-all-frames-in-an-avasset
