AVFoundation

Buffer size of CMSampleBufferRef

南楼画角 · Submitted on 2019-12-11 11:59:24
Question: I am trying to get the size of a CMSampleBufferRef from the AVFoundation callback

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
     didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
            fromConnection:(AVCaptureConnection *)connection

According to the documentation (https://developer.apple.com/library/mac/documentation/CoreMedia/Reference/CMSampleBuffer/index.html#//apple_ref/c/func/CMSampleBufferGetSampleSize):

    size_t CMSampleBufferGetTotalSampleSize ( CMSampleBufferRef sbuf );

If I understand it …
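
For reference, a minimal Swift sketch of calling that function from inside the delegate callback, assuming a configured AVCaptureVideoDataOutput delivering frames. Note that camera video frames are usually backed by a CVPixelBuffer rather than a block buffer, in which case the total sample size is reported as 0 and the pixel data size must be read from the image buffer instead:

    import AVFoundation

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Total size in bytes of all sample data in the buffer.
        let totalSize = CMSampleBufferGetTotalSampleSize(sampleBuffer)

        // Video frames backed by a CVPixelBuffer typically report 0 here;
        // in that case the pixel data size comes from the image buffer.
        if totalSize == 0, let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            print("pixel data size: \(CVPixelBufferGetDataSize(pixelBuffer)) bytes")
        } else {
            print("total sample size: \(totalSize) bytes")
        }
    }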

AVAudioPlayer.play() works but AVAudioPlayerNode.play() fails

扶醉桌前 · Submitted on 2019-12-11 11:45:47
Question: I have the following Swift playground code that plays an audio file using AVAudioPlayerNode.

    import AVFoundation
    import Foundation

    NSSetUncaughtExceptionHandler { exception in
        print("Exception thrown: \(exception)")
    }

    var filePath = "/Users/fractor/Desktop/TestFile.mp3"

    let file : AVAudioFile
    do {
        file = try AVAudioFile(forReading: URL(fileURLWithPath: filePath))
    } catch let error {
        print(error.localizedDescription)
        throw error
    }

    let audioEngine = AVAudioEngine()
    let playerNode = …
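
The snippet cuts off, but the usual cause of AVAudioPlayerNode.play() failing where AVAudioPlayer.play() works is that the node was never attached to a running engine. A minimal sketch of the full setup, reusing the file path from the question:

    import AVFoundation
    import PlaygroundSupport

    let file = try AVAudioFile(forReading: URL(fileURLWithPath: "/Users/fractor/Desktop/TestFile.mp3"))

    let audioEngine = AVAudioEngine()
    let playerNode = AVAudioPlayerNode()

    // The node must be attached to the engine and wired to an output before
    // the engine starts; calling play() on an unattached node throws.
    audioEngine.attach(playerNode)
    audioEngine.connect(playerNode, to: audioEngine.mainMixerNode, format: file.processingFormat)

    try audioEngine.start()
    playerNode.scheduleFile(file, at: nil, completionHandler: nil)
    playerNode.play()

    // Keep the playground alive long enough for playback to be heard.
    PlaygroundPage.current.needsIndefiniteExecution = true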

Playing back different voices

落花浮王杯 · Submitted on 2019-12-11 11:25:20
Question: I am working on an iPhone app and need multiple voices, e.g. turning an adult voice into a child's voice. How can I implement this? I am using AVFoundation to play the audio. Please provide some code.

Answer 1: To implement what you need, you need pitch-shifting capabilities. You can use OpenAL to achieve this. Specifically, in OpenAL, to set the pitch you do

    alSourcef(source, AL_PITCH, 1.2f);

where source is the id of the OpenAL sound source. If you are new to OpenAL, get started here: http://benbritten.com/2008…
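
The answer's OpenAL call sets the pitch directly on a source. As a point of comparison, here is a sketch of the same effect done entirely within AVFoundation using AVAudioUnitTimePitch, a different technique than the answer proposes; the file name is a placeholder:

    import AVFoundation

    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let timePitch = AVAudioUnitTimePitch()
    timePitch.pitch = 1200   // in cents; +1200 is one octave up, toward a child-like voice

    engine.attach(player)
    engine.attach(timePitch)
    engine.connect(player, to: timePitch, format: nil)
    engine.connect(timePitch, to: engine.mainMixerNode, format: nil)

    // "voice.m4a" is a hypothetical recording to transform.
    let file = try AVAudioFile(forReading: URL(fileURLWithPath: "voice.m4a"))
    try engine.start()
    player.scheduleFile(file, at: nil, completionHandler: nil)
    player.play()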

AVAssetExportSession video not saving as portrait orientation

跟風遠走 · Submitted on 2019-12-11 10:09:50
Question: I want to record video in a custom view, so I'm using AVFoundation, as in the code below.

    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL]) {
        /** Below code works fine (saves in portrait orientation)
        [library writeVideoAtPathToSavedPhotosAlbum:outputFileURL completionBlock:^(NSURL *assetURL, NSError *error) {
            if (!error) {
                self.doneButton.userInteractionEnabled = YES;
                [videoAddr addObject:assetURL];
                videoURL = outputFileURL;
            }
        }]; */
        AVMutableComposition *mixComposition = […
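
The code cuts off at the composition setup, but the usual reason an AVAssetExportSession result loses portrait orientation is that the composition track never receives the source track's preferredTransform. A minimal Swift sketch of that step, where outputFileURL is the question's recorded file:

    import AVFoundation

    func makePortraitComposition(from outputFileURL: URL) -> AVMutableComposition {
        let asset = AVAsset(url: outputFileURL)
        let mixComposition = AVMutableComposition()

        if let sourceTrack = asset.tracks(withMediaType: .video).first,
           let compositionTrack = mixComposition.addMutableTrack(
               withMediaType: .video,
               preferredTrackID: kCMPersistentTrackID_Invalid) {
            try? compositionTrack.insertTimeRange(
                CMTimeRange(start: .zero, duration: asset.duration),
                of: sourceTrack, at: .zero)
            // Without this, the exported video defaults to landscape.
            compositionTrack.preferredTransform = sourceTrack.preferredTransform
        }
        return mixComposition
    }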

CMSampleBufferRef to bitmap?

孤人 · Submitted on 2019-12-11 09:46:34
Question: I'm playing around with the AVScreenShack example from Apple's website (an Xcode project), which captures the desktop and displays the capture in a window in quasi real time. I have modified the project a little and inserted this code:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
     didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
            fromConnection:(AVCaptureConnection *)connection
    {
        ...
    }

My question is: how do I convert the CMSampleBufferRef instance to a CGImageRef? Thank…
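
One common conversion path (a sketch, not the only option) goes sample buffer to CVPixelBuffer to CIImage to CGImage via a CIContext:

    import AVFoundation
    import CoreImage

    func cgImage(from sampleBuffer: CMSampleBuffer) -> CGImage? {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
        let ciImage = CIImage(cvImageBuffer: pixelBuffer)
        // In production, create the CIContext once and reuse it; creation is expensive.
        let context = CIContext()
        return context.createCGImage(ciImage, from: ciImage.extent)
    }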

AVCaptureVideoPreviewLayer is not visible on the screenshot

旧时模样 · Submitted on 2019-12-11 09:18:06
Question: I have an application that adds live animations and images on top of the preview view of an AV Foundation camera. I can take a "hardware screenshot" (holding the Side button and the Volume Up button) and that works fine. However, I need a button that takes the screenshot. All the usual methods of taking a screenshot, such as UIGraphicsGetImageFromCurrentImageContext (or view.drawHierarchy()), result in a black area where the video preview is. All other elements are on the screenshot and the images are visible except …
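
The preview layer is composited by the GPU outside of UIKit's drawing pass, which is why software snapshots see black there. A common workaround is to keep the most recent camera frame from an AVCaptureVideoDataOutput and composite the overlay views on top of it manually; a sketch, assuming cameraFrame is the latest frame already converted to a UIImage:

    import UIKit

    func screenshot(cameraFrame: UIImage, overlays: [UIView], bounds: CGRect) -> UIImage {
        let renderer = UIGraphicsImageRenderer(bounds: bounds)
        return renderer.image { ctx in
            // Camera content first; the preview layer itself would render black.
            cameraFrame.draw(in: bounds)
            // Then each overlay view, positioned by its frame.
            for overlay in overlays {
                ctx.cgContext.saveGState()
                ctx.cgContext.translateBy(x: overlay.frame.origin.x,
                                          y: overlay.frame.origin.y)
                overlay.layer.render(in: ctx.cgContext)
                ctx.cgContext.restoreGState()
            }
        }
    }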

Error while recording video on iphone using AVFoundation

∥☆過路亽.° · Submitted on 2019-12-11 08:34:40
Question: I'm trying to record video using AVFoundation. I can save images but not video. When trying to save a video, I got an error saying:

    [AVCaptureMovieFileOutput startRecordingToOutputFileURL:recordingDelegate:] - no active/enabled connections.

And here is my code:

    session = [[AVCaptureSession alloc] init]; // session is a global object
    session.sessionPreset = AVCaptureSessionPresetMedium;
    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession…
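
The snippet cuts off before any input or output is added, and that is typically what this error means: the movie file output has no video connection because it was never added to a session that also has a video input. A sketch of the working order, written in Swift rather than the question's Objective-C:

    import AVFoundation

    let session = AVCaptureSession()
    session.sessionPreset = .medium

    // Both the input and the output must be on the session before recording,
    // otherwise the movie output has no active/enabled connection.
    if let camera = AVCaptureDevice.default(for: .video),
       let input = try? AVCaptureDeviceInput(device: camera),
       session.canAddInput(input) {
        session.addInput(input)
    }

    let movieOutput = AVCaptureMovieFileOutput()
    if session.canAddOutput(movieOutput) {
        session.addOutput(movieOutput)
    }

    session.startRunning()
    // Now startRecording(to:recordingDelegate:) has an enabled connection to use.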

When is the iPhone app cache cleared?

我只是一个虾纸丫 · Submitted on 2019-12-11 07:45:31
Question: I'm working on an app that lets users record voice (among other things) to the Documents directory of the app. While recording the voice, I record into the app's Caches directory, and then, once the user says "Okay, save this one", I copy it to the Documents directory. So far all of this works. But if I try to delete the data file in the cache, or when I try to move it, I get problems. So my question is: shall I just leave the data in the cache so that iOS will handle it, or do I …
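
For the mechanics of the move itself, a sketch with FileManager (the file name is hypothetical); note that iOS may purge the Caches directory under storage pressure, so anything the user has chosen to keep belongs in Documents:

    import Foundation

    func promoteRecording(named fileName: String) throws {
        let fm = FileManager.default
        let caches = fm.urls(for: .cachesDirectory, in: .userDomainMask)[0]
        let documents = fm.urls(for: .documentDirectory, in: .userDomainMask)[0]

        // Move (rather than copy) so no stale copy is left behind in Caches.
        try fm.moveItem(at: caches.appendingPathComponent(fileName),
                        to: documents.appendingPathComponent(fileName))
    }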

Why does observer for status change of AVAsset not work?

馋奶兔 · Submitted on 2019-12-11 07:37:11
Question: I implemented the following code from Apple. It is meant to observe a change in the status of a player item. The problem is that for some reason it does not work: the observe function does not run when the item becomes ready. All the relevant code is below:

    func preloadVideo(media: Media) {
        // setup code, and then:
        media.playerItem1!.addObserver(self,
                                       forKeyPath: #keyPath(AVPlayerItem.status),
                                       options: [.old, .new],
                                       context: &playerItemContext)
    }

Observer method:

    private var playerItemContext = 0

    override func …
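
The code cuts off at the observer method, but two things commonly break this pattern: the observed AVPlayerItem is not kept alive (KVO requires a strong reference to the observed object), and an item's status only transitions once it is attached to an AVPlayer. A sketch of the full pattern, with the observeValue signature Apple documents:

    import AVFoundation

    private var playerItemContext = 0

    final class Preloader: NSObject {
        private var playerItem: AVPlayerItem?
        private var player: AVPlayer?

        func preloadVideo(url: URL) {
            let item = AVPlayerItem(url: url)
            playerItem = item  // keep a strong reference, or KVO never fires
            item.addObserver(self, forKeyPath: #keyPath(AVPlayerItem.status),
                             options: [.old, .new], context: &playerItemContext)
            // status stays .unknown until the item is attached to a player
            player = AVPlayer(playerItem: item)
        }

        override func observeValue(forKeyPath keyPath: String?, of object: Any?,
                                    change: [NSKeyValueChangeKey: Any]?,
                                    context: UnsafeMutableRawPointer?) {
            guard context == &playerItemContext else {
                super.observeValue(forKeyPath: keyPath, of: object,
                                   change: change, context: context)
                return
            }
            if keyPath == #keyPath(AVPlayerItem.status),
               let raw = change?[.newKey] as? NSNumber,
               let status = AVPlayerItem.Status(rawValue: raw.intValue) {
                print("status changed: \(status.rawValue)")
            }
        }
    }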

iPhone: Real-time video color info, focal length, aperture?

一笑奈何 · Submitted on 2019-12-11 06:58:27
Question: Is there any way, using AVFoundation and Core Video, to get color info, aperture, and focal length values in real time? Let me explain: say that while shooting video I want to sample the color in a small portion of the screen and output it as RGB values on screen. I would also like to show what the current aperture is set to. Does anyone know if it is possible to gather these values? Currently I have only seen that this is possible with still images. Ideas?

Answer 1: AVCaptureStillImageOutput …
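
The answer is cut off at AVCaptureStillImageOutput, but for the real-time part a video data output delivering BGRA frames lets you sample pixels directly; the aperture on iPhone hardware is fixed and is exposed as AVCaptureDevice.lensAperture. A sketch of reading one pixel per frame, assuming the output is configured for kCVPixelFormatType_32BGRA and the coordinates are arbitrary:

    import AVFoundation

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return }
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)

        // Sample the BGRA pixel at (x, y) = (100, 100); byte order is B, G, R, A.
        let pixel = base.advanced(by: 100 * bytesPerRow + 100 * 4)
                        .assumingMemoryBound(to: UInt8.self)
        print("RGB: \(pixel[2]), \(pixel[1]), \(pixel[0])")
    }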