avfoundation

Swift3 Microphone Audio Input - play without record

Submitted by 为君一笑 on 2019-12-10 09:31:06

Question: I am trying to play microphone audio input using Swift 3 without recording it. I can record the audio with the following code:

let session = AVAudioSession.sharedInstance()
try! session.setCategory(AVAudioSessionCategoryPlayAndRecord, with: AVAudioSessionCategoryOptions.defaultToSpeaker)
try! audioRecorder = AVAudioRecorder(url: filePath!, settings: [:])
audioRecorder.delegate = self
audioRecorder.isMeteringEnabled = true
audioRecorder.prepareToRecord()
audioRecorder.record()

and then play it back
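A minimal Swift 3 sketch of one common way to monitor the microphone live without writing a file. Using AVAudioEngine is my assumption, not something from the question; treat the names below as placeholders.

import AVFoundation

let engine = AVAudioEngine()   // keep a strong reference (e.g. a property) so audio keeps flowing

func startMonitoring() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(AVAudioSessionCategoryPlayAndRecord, with: .defaultToSpeaker)
    try session.setActive(true)

    // inputNode is optional in the Swift 3 SDK; later SDKs return it directly.
    guard let input = engine.inputNode else { return }
    let format = input.inputFormat(forBus: 0)

    // Route the microphone straight to the main mixer; nothing is recorded to disk.
    engine.connect(input, to: engine.mainMixerNode, format: format)
    try engine.start()
}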

Video orientation issues when exporting with AVMutableVideoComposition

Submitted by 丶灬走出姿态 on 2019-12-10 05:45:13

Question: Here is the function I use to export video:

- (void)videoOutput
{
    // 1 - Early exit if there's no video file selected
    if (!self.videoAsset) {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error"
                                                        message:@"Please Load a Video Asset First"
                                                       delegate:nil
                                              cancelButtonTitle:@"OK"
                                              otherButtonTitles:nil];
        [alert show];
        return;
    }
    // 2 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
    AVMutableComposition *mixComposition = [
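A hedged sketch (in Swift, although the question is Objective-C) of the usual way to preserve orientation when exporting: copy the source track's preferredTransform onto a layer instruction in the video composition. The function and parameter names are placeholders, not the poster's code.

import AVFoundation

func layerInstruction(for compositionTrack: AVMutableCompositionTrack,
                      from sourceTrack: AVAssetTrack) -> AVMutableVideoCompositionLayerInstruction {
    let instruction = AVMutableVideoCompositionLayerInstruction(assetTrack: compositionTrack)
    // Carry the capture orientation over; without this the exported video comes out rotated.
    instruction.setTransform(sourceTrack.preferredTransform, at: kCMTimeZero)
    return instruction
}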

Play audio from AVCaptureAudioDataOutputSampleBufferDelegate

Submitted by 你。 on 2019-12-10 02:54:25

Question: I'm capturing audio using AVCaptureAudioDataOutputSampleBufferDelegate:

_captureSession = [[AVCaptureSession alloc] init];
[self.captureSession setSessionPreset:AVCaptureSessionPresetLow];

// Setup Audio input
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *captureAudioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
if (error) {
    NSLog(@"Error Start capture Audio=%@", error);
} else {
    if ([self
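For context, a Swift 3 sketch of the same kind of setup: a class that owns the session and receives audio sample buffers through the delegate. This is an illustration under my own assumptions, not the poster's code, and actually playing the buffers back (e.g. by converting each CMSampleBuffer and scheduling it on an audio engine) is left out.

import AVFoundation

final class AudioCapture: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()

    func start() throws {
        session.sessionPreset = AVCaptureSessionPresetLow
        guard let device = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio) else { return }
        let input = try AVCaptureDeviceInput(device: device)
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureAudioDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "audio.capture"))
        if session.canAddOutput(output) { session.addOutput(output) }
        session.startRunning()
    }

    // Swift 3 signature; later SDKs rename this to captureOutput(_:didOutput:from:).
    func captureOutput(_ captureOutput: AVCaptureOutput!,
                       didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                       from connection: AVCaptureConnection!) {
        // Each CMSampleBuffer holds a short chunk of PCM audio.
    }
}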

AVCaptureSession and AVAudioSession recording video while background music playing only works once

Submitted by 僤鯓⒐⒋嵵緔 on 2019-12-10 02:19:22

Question: After looking at this question: AVAudioSession AVAudioSessionCategoryPlayAndRecord glitch, I tried to take a stab at getting video recording to work correctly while background music is playing. I'm settling for the audio glitch when recording starts and when it ends, and it works fine the first time a recording happens. But if I try to record again, the music stops. Any ideas why? Here's a snippet of my code:

captureSession = AVCaptureSession()
captureSession?
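One thing often worth checking in this situation, shown as a hedged Swift 3 sketch (my assumption, not taken from the question or an answer): keep the capture session from reconfiguring the app's audio session, and use a mixable category so other audio can continue.

import AVFoundation

let captureSession = AVCaptureSession()

func configureAudio() throws {
    // Stop the capture session from tearing down the app's audio session each time it starts.
    captureSession.automaticallyConfiguresApplicationAudioSession = false

    let audioSession = AVAudioSession.sharedInstance()
    // A mixable category lets the background music keep playing while recording.
    try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord,
                                 with: [.mixWithOthers, .defaultToSpeaker])
    try audioSession.setActive(true)
}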

AVCaptureDeviceOutput not calling delegate method captureOutput

Submitted by ↘锁芯ラ on 2019-12-10 01:08:09

Question: I am building an iOS application (my first) that processes video still frames on the fly. To dive into this, I followed an example from Apple's AV* documentation. The process involves setting up an input (the camera) and an output. The output works with a delegate, which in this case is the controller itself (it conforms to the protocol and implements the required method). The problem I am having is that the delegate method never gets called. The code below is the implementation of the controller and it
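As a point of comparison, a Swift 3 sketch of a setup where the delegate does get called. This is not the poster's code; the usual pitfalls it guards against are letting the session or output be deallocated, and forgetting to set the delegate queue before starting the session.

import AVFoundation

final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()              // strong references: must outlive the capture
    let output = AVCaptureVideoDataOutput()
    private let queue = DispatchQueue(label: "frames")

    func start(with camera: AVCaptureDevice) throws {
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(output) { session.addOutput(output) }
        session.startRunning()
    }

    // Swift 3 signature; later SDKs rename this to captureOutput(_:didOutput:from:).
    func captureOutput(_ captureOutput: AVCaptureOutput!,
                       didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                       from connection: AVCaptureConnection!) {
        // Called once per frame while the session is running.
    }
}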

How fast can an iPhone be programmed to take 2 pictures at one time? [closed]

Submitted by 倖福魔咒の on 2019-12-10 00:48:05

Question (closed as unclear): I wish to write an iPhone app that allows you to take 2 consecutive pictures in a very short time, and I wonder if it is achievable. Many apps on the market seem to only take low-resolution still frames out of

CVPixelBufferLockBaseAddress why? Capture still image using AVFoundation

Submitted by 倖福魔咒の on 2019-12-09 17:21:43

Question: I'm writing an iPhone app that creates still images from the camera using AVFoundation. Reading the programming guide, I found code that does almost what I need, so I'm trying to "reverse engineer" it and understand it. I'm having some difficulty understanding the part that converts a CMSampleBuffer into an image. Here is what I understood, followed by the code. The CMSampleBuffer represents a buffer in memory where the image is stored along with additional data. Later I call the
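For reference, a Swift 3 sketch of the general pattern that code follows (the guide's own sample is Objective-C; this is my illustration, assuming a BGRA pixel buffer): the base address is locked so the CPU can safely read the pixel data, a bitmap context is wrapped around it, and the address is unlocked afterwards.

import AVFoundation
import CoreGraphics

func image(from sampleBuffer: CMSampleBuffer) -> CGImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    // Lock before touching the pixel memory, unlock when done.
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer)
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)

    // Assumes kCVPixelFormatType_32BGRA output from the capture session.
    let bitmapInfo = CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue
    let context = CGContext(data: baseAddress, width: width, height: height,
                            bitsPerComponent: 8, bytesPerRow: bytesPerRow,
                            space: CGColorSpaceCreateDeviceRGB(), bitmapInfo: bitmapInfo)
    return context?.makeImage()
}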

AVCaptureSession for audio in simulator

Submitted by 空扰寡人 on 2019-12-09 17:16:09

Question: I'm trying to capture audio using the method in this question, with AVCaptureSession and AVCaptureAudioDataOutput. This seems to work fine, with one inconvenience: it doesn't work in the simulator. Both AVAudioRecorder and the good old SpeakHere demo app work fine in the simulator, using the internal microphone on my MacBook Pro. The problem is that [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio] returns nil in the simulator, so subsequent code fails with the message (when it tries
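A small hedged Swift 3 sketch of the defensive pattern this implies (my wording, not the poster's code): treat the device lookup as optional so the capture path can be skipped, or replaced with AVAudioRecorder, when running in the simulator.

import AVFoundation

func makeAudioInput() throws -> AVCaptureDeviceInput? {
    guard let device = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio) else {
        // nil on the simulator; fall back to AVAudioRecorder or skip capture entirely.
        return nil
    }
    return try AVCaptureDeviceInput(device: device)
}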

From UnsafePointer<UnsafePointer<CFloat>> to an array of floats in Swift?

Submitted by 醉酒当歌 on 2019-12-09 16:13:41

Question: I'm trying to access AVAudioPCMBuffer.floatChannelData using Swift, but it is of type UnsafePointer<UnsafePointer<CFloat>> (in Objective-C, @property(nonatomic, readonly) float *const *floatChannelData), and any attempt I make to access it results in an execution failure. Sample code to set up a quick AVAudioPCMBuffer in a Swift Playground is included in a previous question: Getting AVAudioPCMBuffer working (AVAudioFile.mm error code -50)

Answer 1: Does this work?

let channels = UnsafeBufferPointer
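The answer is cut off above; as a sketch of the direction it points in (my completion in Swift 3, not the original answer's code), wrapping both pointer levels in UnsafeBufferPointer gives ordinary Swift collections to work with:

import AVFoundation

func firstChannelSamples(from buffer: AVAudioPCMBuffer) -> [Float] {
    guard let channelData = buffer.floatChannelData else { return [] }
    let channelCount = Int(buffer.format.channelCount)
    let frameLength = Int(buffer.frameLength)

    // One pointer per channel; each points at frameLength float samples.
    let channels = UnsafeBufferPointer(start: channelData, count: channelCount)
    return Array(UnsafeBufferPointer(start: channels[0], count: frameLength))
}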

Transform not working in AVMutableVideoComposition while exporting

Submitted by 十年热恋 on 2019-12-09 14:54:59

Question: My goal is to compose a set of clips recorded from the camera and export them at a certain preferred size. Of course, the video orientation needs to be rotated before exporting. I'm doing this by composing an AVMutableComposition from an array of video clips, stored in avAssets below. I am able to compose them fine and export the result. However, the rotation transform I am setting on the AVMutableVideoComposition is not being honored. If I use the same transform and set it on the preferredTransform
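For orientation, a hedged Swift 3 sketch of the wiring the exporter expects (names and structure are my assumptions, not the poster's code): the transform has to sit on a layer instruction inside an AVMutableVideoCompositionInstruction whose time range covers the whole composition, and that video composition is what gets assigned to the exporter.

import AVFoundation
import CoreGraphics

func videoComposition(for composition: AVMutableComposition,
                      track: AVMutableCompositionTrack,
                      transform: CGAffineTransform,
                      renderSize: CGSize) -> AVMutableVideoComposition {
    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    layerInstruction.setTransform(transform, at: kCMTimeZero)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration)
    instruction.layerInstructions = [layerInstruction]

    let videoComp = AVMutableVideoComposition()
    videoComp.instructions = [instruction]
    videoComp.frameDuration = CMTimeMake(1, 30)   // 30 fps
    videoComp.renderSize = renderSize
    return videoComp
}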