AVFoundation

Strange behaviour after modifying exposure duration and going back to AVCaptureExposureModeContinuousAutoExposure

故事扮演 submitted on 2019-12-10 20:20:02
Question: I am working on an app that exposes manual camera controls using the new APIs introduced in iOS 8, and I am using this sample app from WWDC 2014 as a reference. However, I noticed a strange behaviour (on my iPhone 5s and on an iPhone 6): after setting the exposure mode to "custom" and then back to "auto", the image continues to lag as if the exposure duration were not affected by the change. Here is the code involved in each step (from the sample app, without any modification): - (IBAction
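For context, a minimal Swift sketch (the WWDC sample itself is Objective-C, so the function names here are illustrative) of the usual way to switch to a custom exposure and then back to continuous auto exposure, wrapping each change in lockForConfiguration():

```swift
import AVFoundation

// Hedged sketch: `device` stands in for the session's active AVCaptureDevice;
// error handling is reduced to a print for brevity.
func setCustomExposure(on device: AVCaptureDevice, duration: CMTime, iso: Float) {
    do {
        try device.lockForConfiguration()
        // Switches the device to custom exposure with the given duration and ISO.
        device.setExposureModeCustom(duration: duration, iso: iso, completionHandler: nil)
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}

func restoreContinuousAutoExposure(on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        if device.isExposureModeSupported(.continuousAutoExposure) {
            device.exposureMode = .continuousAutoExposure
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}
```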

Why won't AVFoundation accept my planar pixel buffers on an iOS device?

我的未来我决定 submitted on 2019-12-10 19:42:51
Question: I've been struggling to figure out what the problem is with my code. I'm creating a planar CVPixelBufferRef to write to an AVAssetWriter. This pixel buffer is created manually through some other process (i.e., I'm not getting these samples from the camera or anything like that). On the iOS Simulator it has no problem appending the samples and creating a valid output movie, but on the device it immediately fails at the first sample and provides less than useless error information:
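One frequently suggested direction (a sketch, not the questioner's code): on device, AVAssetWriterInputPixelBufferAdaptor tends to be much happier with IOSurface-backed buffers, either taken from the adaptor's pixelBufferPool or created with kCVPixelBufferIOSurfacePropertiesKey. The bi-planar 420f format and the dimensions below are assumptions for illustration:

```swift
import CoreVideo

// Hedged sketch: create an IOSurface-backed bi-planar (420f) pixel buffer.
// Whether this matches the questioner's plane layout is an assumption.
func makePlanarPixelBuffer(width: Int, height: Int) -> CVPixelBuffer? {
    let attributes: [CFString: Any] = [
        kCVPixelBufferIOSurfacePropertiesKey: [String: Any](),  // request IOSurface backing
        kCVPixelBufferWidthKey: width,
        kCVPixelBufferHeightKey: height,
        kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
    ]
    var pixelBuffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     width,
                                     height,
                                     kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
                                     attributes as CFDictionary,
                                     &pixelBuffer)
    return status == kCVReturnSuccess ? pixelBuffer : nil
}
```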

Play sound with a little delay

柔情痞子 submitted on 2019-12-10 19:21:31
Question: I have a sound in my app that starts automatically when the view appears; but, as the title says, I'd like this sound to start with a little delay, about half a second after the view appears. I tried to use playAtTime, but either it does not work or I have set something up wrong... This is my code: var player: AVAudioPlayer? override func viewDidLoad() { super.viewDidLoad() playAudioWithDelay() } func playAudioWithDelay() { let file = NSBundle.mainBundle().URLForResource("PR1", withExtension:
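A minimal Swift sketch of the play(atTime:) approach, assuming the file is "PR1.mp3" (the extension is cut off in the question, so that part is a guess); the key point is that play(atTime:) expects a time relative to the player's deviceCurrentTime:

```swift
import AVFoundation

var player: AVAudioPlayer?   // keep a strong reference, or playback stops immediately

func playAudioWithDelay() {
    // "mp3" is an assumption; the question truncates before the extension.
    guard let url = Bundle.main.url(forResource: "PR1", withExtension: "mp3") else { return }
    do {
        player = try AVAudioPlayer(contentsOf: url)
        player?.prepareToPlay()
        if let player = player {
            // play(atTime:) is relative to deviceCurrentTime, not an absolute date.
            player.play(atTime: player.deviceCurrentTime + 0.5)
        }
    } catch {
        print("Could not create player: \(error)")
    }
}
```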

iOS Capture high resolution photo while using a low AVCaptureSessionPreset for video output

放肆的年华 submitted on 2019-12-10 17:40:59
Question: I have pretty much the same question as the one below: Switching AVCaptureSession preset when capturing a photo. The issue, however, is that the (self) answer doesn't help me one bit. I am wondering if someone has a clue as to how to do this. I am capturing video frames so I can process them and do something with them. For this, I am using the AVCaptureSessionPreset640x480 preset, as I need all the frame rate I can get while still getting a decent frame for computation. Now, when the user wants to capture the
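One possible approach (a sketch that assumes an AVCapturePhotoOutput, iOS 10+, is acceptable rather than the older AVCaptureStillImageOutput): keep the low preset for the video data output and ask the photo output for high-resolution stills:

```swift
import AVFoundation

// Hedged sketch: low preset for frames, high-resolution stills from the photo output.
func configure(session: AVCaptureSession,
               videoOutput: AVCaptureVideoDataOutput,
               photoOutput: AVCapturePhotoOutput) {
    session.beginConfiguration()
    session.sessionPreset = .vga640x480              // keeps the frame rate up for processing

    if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }
    if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
    photoOutput.isHighResolutionCaptureEnabled = true

    session.commitConfiguration()
}

// When the user taps the shutter (the delegate conformance is omitted here):
func capturePhoto(from photoOutput: AVCapturePhotoOutput,
                  delegate: AVCapturePhotoCaptureDelegate) {
    let settings = AVCapturePhotoSettings()
    settings.isHighResolutionPhotoEnabled = true     // request full sensor resolution
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```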

AVAssetResourceLoaderDelegate not being called

谁说胖子不能爱 submitted on 2019-12-10 17:34:57
Question: I've been trying to get callbacks on the AVAssetResourceLoaderDelegate protocol, but it never seems to get called. I've verified that everything is happening on the main thread: the creation of the AVURLAsset, the creation of the AVPlayerItem, the creation of the delegate, and the delegate queue is set to the main queue. I'm trying to stream web-hosted MP4 content and unencrypted HLS content. My declarations: @property (readwrite, strong) AVPlayer* player; @property (strong) AVPlayerItem*
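A common explanation is that the resource loader delegate is only consulted for URLs AVFoundation cannot load by itself, so plain http(s) MP4/HLS URLs never trigger it. A Swift sketch of the custom-scheme workaround (the "streaming://" scheme and URL are illustrative, not from the question):

```swift
import AVFoundation

// Hedged sketch: the delegate only fires for schemes AVFoundation does not handle natively.
final class LoaderDelegate: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        // Map the custom-scheme URL back to the real one, fetch the data yourself,
        // then feed it to loadingRequest.dataRequest and call finishLoading().
        return true
    }
}

let loaderDelegate = LoaderDelegate()
let customURL = URL(string: "streaming://example.com/video.m3u8")!   // hypothetical URL
let asset = AVURLAsset(url: customURL)
asset.resourceLoader.setDelegate(loaderDelegate, queue: DispatchQueue(label: "resource.loader"))
let playerItem = AVPlayerItem(asset: asset)
let player = AVPlayer(playerItem: playerItem)
```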

AVFoundation sound working on iOS 6 simulator but not device?

不想你离开。 submitted on 2019-12-10 16:43:53
Question: Help! I can play sound on the iOS simulator, but not on my device! Here is my code (yes, the audio file is in the specified location; it definitely works): SystemSoundID hashtag; NSString *path = [[NSBundle mainBundle] pathForResource:@"hashtag" ofType:@"wav"]; NSURL *pathURL = [NSURL fileURLWithPath:path]; AudioServicesCreateSystemSoundID((__bridge CFURLRef) pathURL, &hashtag); AudioServicesPlaySystemSound(hashtag); Answer 1: Two suggestions, and I have had the same problem caused by both! 1) The iOS
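Two device-only pitfalls worth checking (a hedged sketch, not the answer's code verbatim): the device's file system is case sensitive, so the bundle lookup must match the file name exactly, and system sounds are muted by the ring/silent switch, whereas an AVAudioPlayer with a playback session is not:

```swift
import AVFoundation

var player: AVAudioPlayer?   // strong reference so playback isn't cut off

func playHashtagSound() {
    // On device the lookup is case sensitive: "hashtag.wav" must match the file name exactly.
    guard let url = Bundle.main.url(forResource: "hashtag", withExtension: "wav") else {
        print("File not found in bundle - check the exact spelling and case of the file name")
        return
    }
    do {
        // .playback keeps audio audible even with the ring/silent switch set to silent.
        try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
        try AVAudioSession.sharedInstance().setActive(true)
        player = try AVAudioPlayer(contentsOf: url)
        player?.play()
    } catch {
        print("Playback failed: \(error)")
    }
}
```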

UIImagePickerController: converting to low quality video error

余生长醉 submitted on 2019-12-10 15:27:11
Question: I am getting the input URL from [info objectForKey:UIImagePickerControllerMediaURL] in UIImagePickerController's didFinishPickingMediaWithInfo: method. NSURL *inputURL = [NSURL URLWithString:inputurlstring]; I am building the output URL with this code: NSString *documentsDirectory = [paths objectAtIndex:0]; NSString *videoPath = [NSString stringWithFormat:@"%@/%@", documentsDirectory, @"capturedvideo.MOV"]; NSURL *outputURL = [NSURL fileURLWithPath:videoPath]; I used the following code to get low quality
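A Swift sketch of the usual low-quality export path (not the questioner's exact code). One detail worth noting: info[.mediaURL] is already a file URL, so rebuilding it through URL(string:) / URLWithString: is a common source of failures:

```swift
import AVFoundation
import UIKit

// Hedged sketch: export the picked movie with AVAssetExportPresetLowQuality.
func exportLowQuality(from info: [UIImagePickerController.InfoKey: Any],
                      completion: @escaping (URL?) -> Void) {
    guard let inputURL = info[.mediaURL] as? URL else { completion(nil); return }

    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let outputURL = documents.appendingPathComponent("capturedvideo.mov")
    try? FileManager.default.removeItem(at: outputURL)   // export fails if the file already exists

    guard let export = AVAssetExportSession(asset: AVURLAsset(url: inputURL),
                                            presetName: AVAssetExportPresetLowQuality) else {
        completion(nil); return
    }
    export.outputURL = outputURL
    export.outputFileType = .mov
    export.exportAsynchronously {
        completion(export.status == .completed ? outputURL : nil)
    }
}
```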

AVPlayer / AVPlayerLayer not appearing in subview using auto layout

生来就可爱ヽ(ⅴ<●) submitted on 2019-12-10 15:17:28
Question: I'm trying to display a video inside of a subview that contains the video and some text. I used the UIView subclass recommended by Apple to create a UIView to contain my AVPlayer: import Foundation import AVFoundation class JSAVPlayerView : UIView { var player: AVPlayer? { get { return (self.layer as! AVPlayerLayer).player } set(newPlayer) { (self.layer as! AVPlayerLayer).player = newPlayer } } override class func layerClass() -> AnyClass { return AVPlayerLayer.self } } I used
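For comparison, the same player-view pattern in current Swift, where layerClass is a class property rather than a method (a sketch; the constraint values are illustrative). With auto layout, the layer only shows once the view gets a nonzero frame from its constraints:

```swift
import UIKit
import AVFoundation

// Hedged sketch of an AVPlayerLayer-backed view intended for use with auto layout.
final class PlayerView: UIView {
    override class var layerClass: AnyClass { AVPlayerLayer.self }

    var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }

    var player: AVPlayer? {
        get { playerLayer.player }
        set { playerLayer.player = newValue }
    }
}

// Usage, assuming `container` is a view already in the hierarchy:
// let playerView = PlayerView()
// playerView.translatesAutoresizingMaskIntoConstraints = false
// container.addSubview(playerView)
// NSLayoutConstraint.activate([
//     playerView.topAnchor.constraint(equalTo: container.topAnchor),
//     playerView.leadingAnchor.constraint(equalTo: container.leadingAnchor),
//     playerView.trailingAnchor.constraint(equalTo: container.trailingAnchor),
//     playerView.heightAnchor.constraint(equalToConstant: 200)
// ])
```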

AVFoundation UIImage behind video track

南笙酒味 submitted on 2019-12-10 14:39:09
Question: I'm currently rendering a video track that is smaller than the output size, which works fine. I want to draw a UIImage into the background so that the video sits on top, with the image showing in the area the video doesn't cover. I've tried using Core Animation layers along with videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:inLayer:, but layers below the video layer don't seem to show through (ones above show just fine) - just black or whatever background color I set on the
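A hedged Swift sketch of the layer hierarchy typically used with the animation tool: a parent layer the size of the render canvas, the background image layer added first, then a smaller video layer on top, so the image shows wherever the video does not cover the canvas (renderSize and videoRect are illustrative parameters, not values from the question):

```swift
import AVFoundation
import UIKit

// Hedged sketch: build the Core Animation tool with an image layer behind a smaller video layer.
func makeAnimationTool(backgroundImage: UIImage,
                       renderSize: CGSize,
                       videoRect: CGRect) -> AVVideoCompositionCoreAnimationTool {
    let parentLayer = CALayer()
    parentLayer.frame = CGRect(origin: .zero, size: renderSize)

    let backgroundLayer = CALayer()
    backgroundLayer.frame = parentLayer.frame
    backgroundLayer.contents = backgroundImage.cgImage

    let videoLayer = CALayer()
    videoLayer.frame = videoRect                 // smaller than the canvas, leaving the image visible

    parentLayer.addSublayer(backgroundLayer)     // added first, so it sits behind
    parentLayer.addSublayer(videoLayer)          // video frames are composited into this layer

    return AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer,
                                               in: parentLayer)
}
```

The returned tool would then be assigned to the video composition's animationTool property before export.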

AVAssetImageGenerator fails at copying image

不羁岁月 submitted on 2019-12-10 14:23:54
Question: I am using AVAssetImageGenerator to create an image from the last frame of a video. This usually works fine, but every now and then copyCGImageAtTime fails with the error: NSLocalizedDescription = "Cannot Open"; NSLocalizedFailureReason = "This media cannot be used."; NSUnderlyingError = "Error Domain=NSOSStatusErrorDomain Code=-12431"; I am verifying that the AVAsset is not nil, and I'm pulling the CMTime directly from the asset, so I do not understand why this keeps happening. This only
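A hedged Swift sketch of one way people make last-frame extraction more forgiving: request a time slightly before the reported duration and allow some tolerance, instead of the exact end time with zero tolerance (the 0.5 s tolerance and the one-thirtieth-second offset are arbitrary illustrative values):

```swift
import AVFoundation
import UIKit

// Hedged sketch: generate an image near (not exactly at) the last frame.
func lastFrameImage(for asset: AVAsset) -> UIImage? {
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    generator.requestedTimeToleranceBefore = CMTime(seconds: 0.5, preferredTimescale: 600)
    generator.requestedTimeToleranceAfter = .zero

    // Step back roughly one frame from the end rather than asking for the exact duration.
    let nearEnd = CMTimeSubtract(asset.duration, CMTime(value: 1, timescale: 30))
    do {
        let cgImage = try generator.copyCGImage(at: nearEnd, actualTime: nil)
        return UIImage(cgImage: cgImage)
    } catch {
        print("Image generation failed: \(error)")
        return nil
    }
}
```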