AVFoundation

How to use a layer instruction with the videoCompositionWithAsset:applyingCIFiltersWithHandler: method

浪子不回头ぞ submitted on 2019-12-06 10:11:41
Engineering has provided the following information regarding this issue: Core Image filtering and layer-instruction-based composition can't be used simultaneously. Layer instructions won't be run when added to an AVMutableVideoComposition that is initialized with +[AVMutableVideoComposition videoCompositionWithAsset:applyingCIFiltersWithHandler:]. To use layer instructions in this case, move the functionality into the handler instead of adding the layer instructions to the AVMutableVideoComposition. This is what I found from an Apple engineer. It says to use layer instructions in this case, move the functionality into
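
The workaround the engineer describes can be sketched like this: the per-frame work that a layer instruction would have done is expressed inside the handler instead. This is a minimal sketch, assuming a single-filter pipeline; the filter name and transform are illustrative, not taken from the question.

```swift
import AVFoundation
import CoreImage

// Sketch: instead of layer instructions, do the per-frame work inside the handler.
// `asset` is assumed to be an AVAsset you have already loaded.
func makeFilteredComposition(for asset: AVAsset) -> AVMutableVideoComposition {
    return AVMutableVideoComposition(asset: asset) { request in
        // Any transform/compositing a layer instruction would have performed
        // is expressed here as Core Image operations on the source frame.
        let filtered = request.sourceImage
            .applyingFilter("CIPhotoEffectMono")                   // example filter
            .transformed(by: CGAffineTransform(scaleX: 1, y: 1))   // example transform
        request.finish(with: filtered, context: nil)
    }
}
```

The returned composition can then be assigned to an AVPlayerItem or to an AVAssetExportSession's videoComposition property as usual.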

Routing audio input to receive from the TOP microphone on iPhone

天大地大妈咪最大 submitted on 2019-12-06 09:47:23
I am writing a little app to record multiple tracks and play them back over one another. I am using the PlayAndRecord mode and I am routing my output to the main speakers. The problem is that the bottom microphone is still being used for input as well, so when I record I get the output from the other tracks really loudly on the new track. Here is what I have so far: audioSession = [AVAudioSession sharedInstance]; [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:nil]; OSStatus propertySetError = 0; UInt32 allowMixing = true; propertySetError = AudioSessionSetProperty
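
One approach (a sketch, assuming iOS 7+ where AVAudioSession exposes per-port data sources rather than the deprecated AudioSessionSetProperty route) is to select the built-in mic's front-facing data source, which is the microphone at the top of the phone:

```swift
import AVFoundation

// Sketch: prefer the built-in mic's front-facing (top) data source.
// Assumes the session category is already set to PlayAndRecord.
func selectTopMicrophone() throws {
    let session = AVAudioSession.sharedInstance()
    guard let builtInMic = session.availableInputs?
        .first(where: { $0.portType == .builtInMic }) else { return }
    try session.setPreferredInput(builtInMic)
    // Data sources describe the individual mics; .front is the one near the earpiece.
    if let top = builtInMic.dataSources?
        .first(where: { $0.orientation == .front }) {
        try builtInMic.setPreferredDataSource(top)
    }
}
```

The bottom mic reports a .bottom orientation, so the same pattern can be used to route input away from it explicitly.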

AVFoundation crop captured still image according to the preview aspect ratio

时光毁灭记忆、已成空白 submitted on 2019-12-06 08:54:46
My question is mostly similar to this one: Cropping image captured by AVCaptureSession. I have an application which uses AVFoundation for capturing still images. My AVCaptureVideoPreviewLayer has AVLayerVideoGravityResizeAspectFill video gravity, which makes the preview picture shown to the user cropped at the top and bottom. When the user presses the "Capture" button, the image actually captured differs from the preview picture shown to the user. My question is how to crop the captured image accordingly? Thanks in advance. I used the UIImage+Resize category provided here with
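
The crop rect for an AspectFill preview can be computed with plain geometry before touching any image API. A sketch, with illustrative (non-AVFoundation) names:

```swift
import Foundation

// Pure geometry sketch: given the captured image size and the preview layer size,
// compute the sub-rect of the image that an AspectFill preview actually showed.
func aspectFillCropRect(imageSize: CGSize, previewSize: CGSize) -> CGRect {
    let imageAspect = imageSize.width / imageSize.height
    let previewAspect = previewSize.width / previewSize.height
    if imageAspect > previewAspect {
        // Image is wider than the preview: the left/right edges were cropped.
        let visibleWidth = imageSize.height * previewAspect
        return CGRect(x: (imageSize.width - visibleWidth) / 2, y: 0,
                      width: visibleWidth, height: imageSize.height)
    } else {
        // Image is taller: the top/bottom were cropped (the asker's case).
        let visibleHeight = imageSize.width / previewAspect
        return CGRect(x: 0, y: (imageSize.height - visibleHeight) / 2,
                      width: imageSize.width, height: visibleHeight)
    }
}
```

The resulting rect, scaled into pixel coordinates and adjusted for the image orientation, can then be passed to CGImage's cropping(to:).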

iOS - AVAssetExportSession can only export a maximum of 8 tracks after playing with AVPlayer

这一生的挚爱 submitted on 2019-12-06 08:33:53
Question: I'm trying to loop some fragments of a recorded video and merge them into one video. I've successfully merged and exported a composition with up to 16 tracks. But when I try to play the composition using AVPlayer before merging, I can only export a maximum of 8 tracks. First, I create the AVComposition and AVVideoComposition: +(void)previewUserClipDanceWithAudio:(NSURL*)videoURL audioURL:(NSURL*)audioFile loop:(NSArray*)loopTime slowMotion:(NSArray*)slowFactor showInViewController:

How to control video frame rate with AVAssetReader and AVAssetWriter?

一世执手 submitted on 2019-12-06 08:28:51
Question: We are trying to understand how to control/specify the frame rate for videos that we are encoding with AVAssetReader and AVAssetWriter. Specifically, we are using AVAssetReader and AVAssetWriter to transcode/encode/compress a video that we have accessed from the photo/video gallery. We are able to control things like bit rate, aspect ratio changes, etc., but cannot figure out how to control the frame rate. To be specific, we'd like to be able to take as input a 30 FPS video that's 5
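
Neither AVAssetReader nor AVAssetWriter exposes a direct frame-rate setting; one common approach is to drop samples while copying them from reader to writer. The frame-selection logic can be kept in integer math (a sketch; the function name is illustrative):

```swift
import Foundation

// Sketch: pick which source frames to keep when decimating to a lower frame rate.
// A frame is kept when its index maps to a new output slot (integer division
// avoids floating-point drift across long videos).
func framesToKeep(sourceFrameCount: Int, sourceFPS: Int, targetFPS: Int) -> [Int] {
    var kept: [Int] = []
    var lastSlot = -1
    for i in 0..<sourceFrameCount {
        let slot = i * targetFPS / sourceFPS
        if slot > lastSlot {
            kept.append(i)
            lastSlot = slot
        }
    }
    return kept
}
```

Each kept sample would then be appended via the writer input, optionally retimed with CMSampleBufferCreateCopyWithNewTiming so playback speed is unchanged.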

Get an edited photo's URL from PHAsset

元气小坏坏 submitted on 2019-12-06 08:20:06
Question: I'm trying to get a photo's URL from a PHAsset using this code: let options: PHContentEditingInputRequestOptions = PHContentEditingInputRequestOptions() options.canHandleAdjustmentData = {(adjustmeta: PHAdjustmentData) -> Bool in return true } asset.requestContentEditingInput(with: options, completionHandler: { (contentEditingInput, info) in guard let url = contentEditingInput?.fullSizeImageURL else { observer.onError(PHAssetError.imageRequestFailed) return } /// Using this `url` }) Most of

Need hints for using iPhone SDK's AVMutableVideoComposition

橙三吉。 submitted on 2019-12-06 08:07:13
Question: The AVFoundation framework provides the AVMutableVideoComposition class (the mutable variant of AVVideoComposition). It looks like you can render Core Animation layers directly to an instance of this class to create a video, but I don't know how to save the composition to a file, or how to work with it at all, really. The following code, called from a UIViewController, appears to create the composition and the animation, but then I'm stumped as to how to work with the composition. Any
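
Saving such a composition to a file is usually done with AVAssetExportSession, attaching the video composition (which carries the Core Animation tool) to the session. A minimal sketch, assuming the composition and video composition have already been built:

```swift
import AVFoundation

// Sketch: export a composition to a .mov file, with a video composition attached.
func export(composition: AVComposition, videoComposition: AVVideoComposition, to url: URL) {
    guard let session = AVAssetExportSession(asset: composition,
                                             presetName: AVAssetExportPresetHighestQuality)
    else { return }
    session.outputURL = url
    session.outputFileType = .mov
    session.videoComposition = videoComposition  // the animation tool rides along here
    session.exportAsynchronously {
        // Inspect session.status / session.error here to see whether it succeeded.
    }
}
```

Forgetting to set videoComposition on the export session is a common reason the Core Animation content never appears in the output file.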

Cropping AVAsset video with AVFoundation not working iOS 8

本小妞迷上赌 submitted on 2019-12-06 08:04:39
Question: This has been bugging me for the last day. I used to use this method in ObjC to crop videos into a square; it seems to be the only method I've found in a few years that worked, but after recently trying to crop with it in Swift & iOS 8 it doesn't seem to crop the video at all. Hopefully somebody can help? func captureOutput(captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAtURL outputFileURL: NSURL!, fromConnections connections: [AnyObject]!, error: NSError!) { if error != nil
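
Independent of ObjC vs. Swift, the square crop itself reduces to simple geometry: the render size becomes side × side, and the layer instruction translates the frame so the centered square lands at the origin. A sketch with illustrative names:

```swift
import Foundation

// Pure geometry sketch for a centered square crop: returns the square side
// length and the translation offsets for the layer instruction's transform.
func squareCrop(width: CGFloat, height: CGFloat) -> (side: CGFloat, tx: CGFloat, ty: CGFloat) {
    let side = min(width, height)
    return (side, -(width - side) / 2, -(height - side) / 2)
}
```

The offsets would be applied as CGAffineTransform(translationX:y:) on an AVMutableVideoCompositionLayerInstruction, with videoComposition.renderSize set to side × side; if nothing visibly crops, a common culprit is forgetting to assign the video composition to the export session.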

AVFoundation - Get grayscale image from Y plane (kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)

雨燕双飞 submitted on 2019-12-06 08:01:03
Question: I'm using AVFoundation to capture video, and I'm recording in the kCVPixelFormatType_420YpCbCr8BiPlanarFullRange format. I want to make a grayscale image directly from the Y plane of the YpCbCr data. I've tried to create a CGContextRef by calling CGBitmapContextCreate, but the problem is that I don't know which colorspace and pixel format to choose. - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *
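
For the colorspace/pixel-format question, a gray bitmap context works: CGColorSpaceCreateDeviceGray(), 8 bits per component, and CGImageAlphaInfo.none. Because the Y plane's rows are padded out to bytesPerRow, copying them into a tight buffer first keeps the context setup simple. A sketch of that copy (plain Swift, illustrative name):

```swift
import Foundation

// Sketch: the Y plane is 8-bit luma, but each row is padded to bytesPerRow.
// Copying row by row yields a tightly packed grayscale buffer suitable for a
// CGContext created with a device-gray colorspace and no alpha.
func packedGrayscale(yPlane: [UInt8], width: Int, height: Int, bytesPerRow: Int) -> [UInt8] {
    var out = [UInt8]()
    out.reserveCapacity(width * height)
    for row in 0..<height {
        let start = row * bytesPerRow
        out.append(contentsOf: yPlane[start..<(start + width)])
    }
    return out
}
```

Alternatively, the padded plane base address can be handed to the context directly by passing the plane's own bytesPerRow instead of width.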

Capturing a still image from camera in OpenCV Mat format

こ雲淡風輕ζ submitted on 2019-12-06 07:52:13
Question: I am developing an iOS application and trying to get a still-image snapshot from the camera using a capture session, but I'm unable to convert it successfully to an OpenCV Mat. The still image output is created with this code: - (void)createStillImageOutput; { // setup still image output with jpeg codec self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init]; NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil]; [self