How do I control AVAssetWriter to write at the correct FPS


I'm reaching here, but I think this is where you're going wrong. Think of your video capture as a pipeline.

(1) Capture buffer -> (2) Do Something With buffer -> (3) Write buffer as frames in video.

Sounds like you've successfully completed (1) and (2): you're getting the buffers fast enough and processing them so you can vend them as frames.

The problem is almost certainly in (3) writing the video frames.

https://developer.apple.com/reference/avfoundation/avmutablevideocomposition

Check out the frameDuration property on AVMutableVideoComposition (linked above). You'll want something like CMTimeMake(1, 60) // 60 FPS or CMTimeMake(1, 240) // 240 FPS, which tells the composition to write that many frames per second and encode at that rate.
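
For example, a minimal sketch (assuming a 60 FPS target and an existing AVAsset named asset; the variable names here are illustrative, not from the question):

#import <AVFoundation/AVFoundation.h>

// Sketch: build a video composition whose frame duration gives the frame rate you want.
AVMutableVideoComposition *videoComposition =
    [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:asset];
videoComposition.frameDuration = CMTimeMake(1, 60);   // one frame every 1/60 s = 60 FPS
// videoComposition.frameDuration = CMTimeMake(1, 240); // or 240 FPS

You'd then hand this composition to whatever is doing the export or playback (an AVAssetExportSession or AVPlayerItem, for instance).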

With AVAssetWriter it's exactly the same principle, but you set the expected frame rate in the AVAssetWriterInput outputSettings dictionary by adding AVVideoExpectedSourceFrameRateKey:

NSDictionary *videoCompressionSettings = @{AVVideoCodecKey                   : AVVideoCodecH264,
                                           AVVideoWidthKey                   : @(videoWidth),
                                           AVVideoHeightKey                  : @(videoHeight),
                                           AVVideoExpectedSourceFrameRateKey : @(60),
                                           AVVideoCompressionPropertiesKey   : @{AVVideoAverageBitRateKey      : @(bitsPerSecond),
                                                                                 AVVideoMaxKeyFrameIntervalKey : @(1)}
                                           };
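
For context, here's a hedged sketch of how that dictionary would typically be handed to the writer (assetWriter is assumed to already exist; it isn't from the question):

#import <AVFoundation/AVFoundation.h>

// Sketch: create the video input from the settings above and attach it to the writer.
AVAssetWriterInput *videoInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoCompressionSettings];
videoInput.expectsMediaDataInRealTime = YES; // set this when appending buffers straight from the camera

if ([assetWriter canAddInput:videoInput]) {
    [assetWriter addInput:videoInput];
}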

To expand a little more: you can't strictly control or sync your camera capture to the output / playback rate. The timing just isn't that exact, and the processing pipeline adds overhead. When you capture frames they are time-stamped, as you've seen, but in the writing / compression phase the writer uses only the frames it needs to produce the output specified for the composition.

It goes both ways: you could capture only 30 FPS and write out at 240 FPS; the video would display fine, you'd just have a lot of frames "missing" that get filled in by the encoder. You could even vend only one frame per second and play back at 30 FPS; the two are independent of each other (how fast I capture vs. how many frames per second I present).

As for playing it back at a different speed, you just adjust the playback rate - slow it down as needed.

If you've correctly set the time base (frameDuration), it will always play back "normal" - you're telling it "playback is X frames per second". Of course, your eye may notice a difference (almost certainly between low FPS and high FPS), and the screen may not refresh that fast (above 60 FPS), but regardless the video will be at a "normal" 1x speed for its timebase. By slowing the video - if my timebase is 120 and I slow it to 0.5x - I now effectively see 60 FPS, and one second of playback takes two seconds.

You control the playback speed by setting the rate property on AVPlayer https://developer.apple.com/reference/avfoundation/avplayer
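
For example (a sketch; the file URL is a placeholder for your own recording):

#import <AVFoundation/AVFoundation.h>

// Sketch: slow a 240 FPS recording down so the extra frames become visible on a 60 Hz screen.
AVPlayer *player = [AVPlayer playerWithURL:[NSURL fileURLWithPath:@"/path/to/movie.mov"]];
player.rate = 0.25f; // setting a non-zero rate also starts playback; 240 FPS * 0.25 = 60 FPS shown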

The iOS screen refresh is locked at 60 fps, so the only way to "see" the extra frames is, as you say, to slow down the playback rate, a.k.a. slow motion.

So

  1. yes, you are right
  2. the screen refresh rate (and perhaps limitations of the human visual system, assuming you're human?) means that you cannot perceive 120 & 240fps frame rates. You can play them at normal speed by downsampling to the screen refresh rate. Surely this is what AVPlayer already does, although I'm not sure if that's the answer you're looking for.
  3. you control the framerate of the file when you write it via the CMSampleBuffer presentation timestamps. If your frames are coming from the camera, you're probably passing the timestamps straight through; in that case, check that you really are getting the frame rate you asked for (a log statement in your capture callback should be enough to verify this). If you're procedurally creating frames, then you choose the presentation timestamps so that they're spaced 1.0 / desiredFrameRate seconds apart (see the sketch after this list).
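
As a rough illustration of the procedural case (a sketch only; adaptor, pixelBuffer and totalFrames are assumed to exist and are not from the question):

#import <AVFoundation/AVFoundation.h>

// Sketch: space presentation timestamps 1/desiredFrameRate seconds apart while appending frames.
int32_t desiredFrameRate = 60;
for (int64_t frameIndex = 0; frameIndex < totalFrames; frameIndex++) {
    // frameIndex / desiredFrameRate seconds, i.e. frames exactly 1/60 s apart
    CMTime presentationTime = CMTimeMake(frameIndex, desiredFrameRate);
    while (!adaptor.assetWriterInput.isReadyForMoreMediaData) {
        [NSThread sleepForTimeInterval:0.01]; // crude back-pressure handling, fine for a sketch
    }
    [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];
}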

Is 3. not working for you?

p.s. you can discard & ignore AVVideoMaxKeyFrameIntervalKey - it's a quality setting and has nothing to do with playback framerate.
