avfoundation

Alternatives to deprecated AVCaptureConnection frame-duration properties?

回眸只為那壹抹淺笑 submitted on 2019-12-22 06:55:53
Question: According to this document, the properties and methods relating to the maximum and minimum video-frame duration (supportsVideoMaxFrameDuration, supportsVideoMinFrameDuration, videoMaxFrameDuration, videoMinFrameDuration) have all been deprecated. Are there alternatives? Answer 1: According to the header file (AVCaptureSession.h), this property is deprecated on iOS, where min and max frame rate adjustments are applied exclusively at the AVCaptureDevice using the activeVideoMinFrameDuration and activeVideoMaxFrameDuration properties.
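A minimal sketch of the replacement approach in Swift (the function name and fps parameter are illustrative; assumes you already hold the session's video AVCaptureDevice):

    import AVFoundation

    // Frame-rate limits now live on AVCaptureDevice, not AVCaptureConnection.
    // The device must be locked before its active frame durations change, and
    // the rate should fall within activeFormat.videoSupportedFrameRateRanges.
    func setFrameRate(_ fps: Int32, on device: AVCaptureDevice) {
        do {
            try device.lockForConfiguration()
            let frameDuration = CMTime(value: 1, timescale: fps) // e.g. 1/30 s
            device.activeVideoMinFrameDuration = frameDuration
            device.activeVideoMaxFrameDuration = frameDuration
            device.unlockForConfiguration()
        } catch {
            print("Could not lock device for configuration: \(error)")
        }
    }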

How to add static and dynamic overlays to video with AVAssetWriter?

喜欢而已 submitted on 2019-12-22 06:49:38
Question: What's the right way to add an image overlay to a video created with AVAssetWriter? It's possible to do so with AVAssetExportSession, but this question is about how to do it with AVAssetWriter, which gives more control over quality and output. There are two scenarios: 1) Simple: add a single overlay that is present for the entire duration of the video (similar to a watermark). 2) Complex: add different overlays that animate in and out of the video at different times (similar to using …).
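For the simple watermark case, one common approach (a sketch under assumed conditions, not the asker's code: a BGRA pixel buffer and a UIImage overlay) is to draw the image into each CVPixelBuffer with Core Graphics before appending it through the writer's AVAssetWriterInputPixelBufferAdaptor:

    import AVFoundation
    import UIKit

    // Composite a static overlay into a pixel buffer that is about to be
    // appended to an AVAssetWriterInputPixelBufferAdaptor.
    func draw(overlay: UIImage, into pixelBuffer: CVPixelBuffer) {
        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

        guard let cgOverlay = overlay.cgImage,
              let context = CGContext(
                  data: CVPixelBufferGetBaseAddress(pixelBuffer),
                  width: CVPixelBufferGetWidth(pixelBuffer),
                  height: CVPixelBufferGetHeight(pixelBuffer),
                  bitsPerComponent: 8,
                  bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                  space: CGColorSpaceCreateDeviceRGB(),
                  bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                      | CGBitmapInfo.byteOrder32Little.rawValue) // BGRA layout
        else { return }

        // Position is arbitrary here; a watermark usually sits in a corner.
        context.draw(cgOverlay, in: CGRect(x: 20, y: 20,
                                           width: cgOverlay.width,
                                           height: cgOverlay.height))
    }

For the animated case the same idea applies per frame: compute the overlay's position and alpha from the frame's presentation time and draw accordingly before each append.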

AVAssetExportSession is nil on iPhone 7 / 7 Plus Simulator

社会主义新天地 submitted on 2019-12-22 05:43:13
Question: AVAssetExportSession works fine on the iPhone 6 and earlier simulators, but not on the iPhone 7 and iPhone 7 Plus simulators (Xcode 8.0). The code below returns nil for exportSession when executed on the iPhone 7 / 7 Plus simulator, but not on the iPhone SE, iPhone 6s, etc. simulators. NSURL *inputURL = [[NSBundle mainBundle] URLForResource:@"example" withExtension:@"m4a"]; AVURLAsset *assetAV = [AVURLAsset URLAssetWithURL:inputURL options:nil]; AVAssetExportSession *exportSession = [ …
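A defensive sketch in Swift (the preset choice is assumed, since the question's preset is cut off): before creating the session, ask AVAssetExportSession which presets the current environment actually supports for the asset; some simulator configurations ship without certain presets, and the failable initializer then returns nil:

    import AVFoundation

    let inputURL = Bundle.main.url(forResource: "example", withExtension: "m4a")!
    let asset = AVURLAsset(url: inputURL)

    // List what this simulator/device can actually export for this asset.
    let presets = AVAssetExportSession.exportPresets(compatibleWith: asset)
    print("Compatible presets: \(presets)")

    if presets.contains(AVAssetExportPresetAppleM4A),
       let session = AVAssetExportSession(asset: asset,
                                          presetName: AVAssetExportPresetAppleM4A) {
        // Configure outputURL/outputFileType and export asynchronously here.
    } else {
        // Fall back (e.g. AVAssetExportPresetPassthrough) or verify on a device.
    }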

How to save a UIImage to documents directory?

╄→尐↘猪︶ㄣ submitted on 2019-12-22 05:37:48
Question: I'm trying to save both a recorded video's file path and a thumbnail from the video to the documents directory, then set those two values on an object so I can use it to populate a collection view. With my current code (below), after I record a video, the video path gets saved to the documents directory, the video path and thumbnail get set on my Post object, and the thumbnail appears properly in my collection view. All good so far. However, only …
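A minimal sketch of persisting the thumbnail itself (the function name and JPEG quality are illustrative):

    import UIKit

    // Encode a UIImage as JPEG and write it into the app's Documents
    // directory, returning the file URL on success.
    func saveThumbnail(_ image: UIImage, named fileName: String) -> URL? {
        guard let data = image.jpegData(compressionQuality: 0.8) else { return nil }
        let documents = FileManager.default.urls(for: .documentDirectory,
                                                 in: .userDomainMask)[0]
        let fileURL = documents.appendingPathComponent(fileName)
        do {
            try data.write(to: fileURL, options: .atomic)
            return fileURL
        } catch {
            print("Failed to write thumbnail: \(error)")
            return nil
        }
    }

One detail worth noting: the app sandbox's absolute path can change between launches, so persisting only the file name and rebuilding the full URL at read time is more robust than storing the complete path.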

mp4 video starts at different time on Quicktime/AVplayer vs Chrome/Firefox

南笙酒味 submitted on 2019-12-22 05:09:18
Question: I have a very strange issue. My OSX app generates an mp4 video from a screen cast. For some reason, if I open this video in Quicktime or any OSX-based AVPlayer, it starts about 14-15 frames past frame 0. If I open the mp4 with Chrome or Firefox, it actually starts playing at frame 0. What could cause these beginning frames to be skipped? Here's a screenshot of a timer countdown comparing Quicktime vs Firefox at time zero. Notice how the Firefox player starts at 9:55, …
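One way to investigate (a diagnostic sketch, not a confirmed fix; the file path is a placeholder) is to inspect the track timing that AVFoundation sees, since a discrepancy like this often comes from an edit list or a non-zero track start time that browsers ignore but QuickTime/AVPlayer honor:

    import AVFoundation

    let asset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/screencast.mp4"))
    for track in asset.tracks(withMediaType: .video) {
        // A timeRange that does not start at zero suggests an edit list or
        // composition offset written into the mp4 headers.
        print("track start: \(CMTimeGetSeconds(track.timeRange.start)) s,",
              "duration: \(CMTimeGetSeconds(track.timeRange.duration)) s")
    }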

SKVideoNode as texture for SCNSphere

为君一笑 submitted on 2019-12-22 04:17:17
Question: I'm trying to use an SKVideoNode as a video texture source for an SCNSphere within my SCNView. I'm following this answer: SKVideoNode (embedded in SKScene) as texture for Scene Kit Node not working. With my code (pasted at the end of the question) I do get video and audio playing. The issue is that the mapping only covers a quarter of the sphere (the all-xy-positive quarter). The cameraNode is at (0,0,0) inside the sphere and independent of the sphereNode. I do apply a scale to the sphere …
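A quarter-only mapping is the classic symptom of the SKVideoNode sitting at the SKScene's origin, so that only the positive-xy quadrant of the scene contains video. A sketch of the usual arrangement, with the node centered and sized to the scene (sizes and radius are placeholder values; a ready AVPlayer is assumed):

    import SceneKit
    import SpriteKit
    import AVFoundation

    func videoSphereNode(player: AVPlayer) -> SCNNode {
        // Size the SKScene like the video and center the video node in it,
        // so the whole scene (and therefore the whole texture) is covered.
        let sceneSize = CGSize(width: 1024, height: 512)
        let videoScene = SKScene(size: sceneSize)
        let videoNode = SKVideoNode(avPlayer: player)
        videoNode.position = CGPoint(x: sceneSize.width / 2,
                                     y: sceneSize.height / 2)
        videoNode.size = sceneSize
        videoNode.yScale = -1  // SpriteKit's y-axis is flipped vs. the texture
        videoScene.addChild(videoNode)
        videoNode.play()

        let sphere = SCNSphere(radius: 10)
        sphere.firstMaterial?.isDoubleSided = true  // camera sits inside it
        sphere.firstMaterial?.diffuse.contents = videoScene
        return SCNNode(geometry: sphere)
    }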

AVAudioSession configuration to record and play with others

旧城冷巷雨未停 submitted on 2019-12-22 04:15:13
Question: I want to configure AVAudioSession so that I can record video with audio and also play music from the Music app (or any other app that produces sound, typically internet-radio apps). I configure the session like this: try? AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, with: .mixWithOthers) try? AVAudioSession.sharedInstance().setActive(true, with: .notifyOthersOnDeactivation) This works in the sense that when my recording app is running, I can start playing audio from …
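The same configuration written with explicit error handling and a recording-oriented mode (the .videoRecording mode and .defaultToSpeaker option are additions for illustration, not from the question):

    import AVFoundation

    func configureSessionForMixedRecording() {
        let session = AVAudioSession.sharedInstance()
        do {
            // .mixWithOthers keeps other apps' audio (Music, radio apps)
            // playing while this session is active instead of interrupting it.
            try session.setCategory(.playAndRecord,
                                    mode: .videoRecording,
                                    options: [.mixWithOthers, .defaultToSpeaker])
            try session.setActive(true, options: .notifyOthersOnDeactivation)
        } catch {
            print("Audio session configuration failed: \(error)")
        }
    }

Note that try? as used in the question silently swallows configuration errors, which makes problems like this hard to diagnose; the do/catch form above at least surfaces them.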

Seeking accurately, as opposed to two seconds short, in AVPlayer

99封情书 submitted on 2019-12-22 02:06:11
Question: I'm using AVPlayer in a Cocoa app, and I've implemented a command that jumps to the end of the video. The problem is, AVPlayer doesn't seek to where I told it to. For example, one of the videos I have is 4 minutes and 14 seconds long. When I seek to the end, AVPlayer seeks to 4 minutes and 12 seconds, two seconds short. If I then hit play, the player plays for two seconds and then reaches the end. My first attempt was this: [self.player seekToTime:self.player.currentItem.duration]; I've …
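The usual explanation is that the single-argument seek is allowed to land near, rather than at, the requested time for efficiency (it typically snaps to a nearby keyframe). A sketch of the tolerance-taking variant, shown here in Swift:

    import AVFoundation

    // Passing zero for both tolerances requests a frame-accurate seek
    // instead of letting the player stop at the nearest convenient position.
    if let duration = player.currentItem?.duration {
        player.seek(to: duration, toleranceBefore: .zero, toleranceAfter: .zero)
    }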

Capture picture from video using AVFoundation

a 夏天 submitted on 2019-12-22 01:08:52
Question: I am trying to capture a picture from a video on my iPad. I used Apple's AVCam example as a starting point. I was able to see the video in my application and to take pictures from it. My problem is that the pixel size of the resulting image is wrong. I want a fullscreen picture (1024x768) but I get a smaller one (1024x720). These are my instance variables: @property (retain) AVCaptureStillImageOutput *stillImageOutput; @property (retain) AVCaptureVideoPreviewLayer *previewLayer; @property …
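The still image's dimensions follow the capture session's preset, not the preview layer: with a video preset the preview can be aspect-filled to the 4:3 screen while the underlying buffer has a different shape. A sketch of switching to the photo preset, which delivers the sensor's 4:3 output (assumes an already-configured session):

    import AVFoundation

    // Video presets produce video-shaped buffers; the photo preset yields
    // the sensor's native 4:3 stills, matching the iPad's 1024x768 screen.
    func configureForStills(_ session: AVCaptureSession) {
        session.beginConfiguration()
        if session.canSetSessionPreset(.photo) {
            session.sessionPreset = .photo
        }
        session.commitConfiguration()
    }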