core-audio

How to handle UnsafeMutablePointer correctly

…衆ロ難τιáo~ Submitted on 2019-12-01 02:28:37
Question: I am a little confused: when do I have to call free, and when destroy/dealloc? I am working on a short code snippet while learning Core Audio. My understanding is that memory obtained with UnsafeMutablePointer<Type>.alloc(size) should be released with destroy and dealloc, while memory obtained from malloc() or calloc() should be released with free(). In this example from Learning Core Audio, the following snippet makes me wonder:

    var asbds = UnsafeMutablePointer<AudioStreamBasicDescription>.alloc(Int(infoSize))
    audioErr =
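For reference, a minimal sketch of the two pairing rules in current Swift, where the old alloc/destroy/dealloc spelling has since become allocate(capacity:)/deinitialize(count:)/deallocate(). The rule of thumb: release memory with the counterpart of whatever allocated it.

    import CoreAudio
    import Darwin

    let count = 4

    // Swift-allocated memory: pair allocate(capacity:) with
    // deinitialize(count:) followed by deallocate().
    let swiftPtr = UnsafeMutablePointer<AudioStreamBasicDescription>.allocate(capacity: count)
    swiftPtr.initialize(repeating: AudioStreamBasicDescription(), count: count)
    // ... use swiftPtr ...
    swiftPtr.deinitialize(count: count)
    swiftPtr.deallocate()

    // C-allocated memory: pair malloc()/calloc() with free().
    let byteCount = count * MemoryLayout<AudioStreamBasicDescription>.stride
    let raw = malloc(byteCount)!
    let cPtr = raw.bindMemory(to: AudioStreamBasicDescription.self, capacity: count)
    cPtr.initialize(repeating: AudioStreamBasicDescription(), count: count)
    // ... use cPtr ...
    free(raw)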

Split CMSampleBufferRef containing Audio

做~自己de王妃 Submitted on 2019-12-01 00:59:26
I am splitting the recording into different files while recording... The problem is that the video and audio sample buffers delivered to captureOutput don't correspond 1:1 (which is logical):

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection

    AUDIO START: 36796.833236847 | DURATION: 0.02321995464852608 | END: 36796.856456802
    VIDEO START: 36796.842089239 | DURATION: nan | END: nan
    AUDIO START: 36796.856456805 | DURATION: 0.02321995464852608 | END: 36796.87967676
    AUDIO START: 36796.879676764 | DURATION: 0
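Since the two streams never line up exactly, the split decision usually has to be made per buffer from its own timing. A minimal sketch (not the poster's code; `splitTime` is a hypothetical segment boundary) of classifying a buffer by its presentation time:

    import CoreMedia

    // Decide whether a buffer belongs to the next file by comparing its
    // end time against a chosen split boundary.
    func belongsToNextSegment(_ sampleBuffer: CMSampleBuffer, splitTime: CMTime) -> Bool {
        let start = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        let duration = CMSampleBufferGetDuration(sampleBuffer)
        // Audio buffers span many frames; video buffers may report an invalid
        // duration (the NaNs in the log above), so fall back to the start time.
        let end = duration.isNumeric ? CMTimeAdd(start, duration) : start
        return CMTimeCompare(end, splitTime) > 0
    }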

CMSampleBufferSetDataBufferFromAudioBufferList returning error 12731

不羁的心 Submitted on 2019-12-01 00:49:10
I am trying to capture app sound and pass it to AVAssetWriter as input. I set a callback on the audio unit to get an AudioBufferList. The problem starts when converting the AudioBufferList to a CMSampleBufferRef: it always returns error -12731, which indicates that a required parameter is missing. Thanks, Karol

    -(OSStatus) recordingCallbackWithRef:(void*)inRefCon
                                   flags:(AudioUnitRenderActionFlags*)flags
                               timeStamp:(const AudioTimeStamp*)timeStamp
                               busNumber:(UInt32)busNumber
                            framesNumber:(UInt32)numberOfFrames
                                    data:(AudioBufferList*)data {
        AudioBufferList bufferList;
        bufferList.mNumberBuffers = 1;
        bufferList
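-12731 is kCMSampleBufferError_RequiredParameterMissing, and a frequent cause is calling CMSampleBufferSetDataBufferFromAudioBufferList on a sample buffer that was created without a format description or timing info. A hedged sketch (assumed setup, not the poster's full code) of the complete construction:

    import AudioToolbox
    import CoreMedia

    func makeSampleBuffer(from bufferList: UnsafeMutablePointer<AudioBufferList>,
                          asbd: inout AudioStreamBasicDescription,
                          frames: CMItemCount,
                          pts: CMTime) -> CMSampleBuffer? {
        // 1. A format description built from the stream's ASBD is required.
        var format: CMAudioFormatDescription?
        guard CMAudioFormatDescriptionCreate(allocator: kCFAllocatorDefault,
                                             asbd: &asbd,
                                             layoutSize: 0, layout: nil,
                                             magicCookieSize: 0, magicCookie: nil,
                                             extensions: nil,
                                             formatDescriptionOut: &format) == noErr,
              let format = format else { return nil }

        // 2. Timing info is also required: one frame lasts 1/sampleRate seconds.
        var timing = CMSampleTimingInfo(
            duration: CMTime(value: 1, timescale: CMTimeScale(asbd.mSampleRate)),
            presentationTimeStamp: pts,
            decodeTimeStamp: .invalid)
        var sampleBuffer: CMSampleBuffer?
        guard CMSampleBufferCreate(allocator: kCFAllocatorDefault,
                                   dataBuffer: nil, dataReady: false,
                                   makeDataReadyCallback: nil, refcon: nil,
                                   formatDescription: format,
                                   sampleCount: frames,
                                   sampleTimingEntryCount: 1, sampleTimingArray: &timing,
                                   sampleSizeEntryCount: 0, sampleSizeArray: nil,
                                   sampleBufferOut: &sampleBuffer) == noErr,
              let sampleBuffer = sampleBuffer else { return nil }

        // 3. Only now attach the audio data; this also marks the data ready.
        let status = CMSampleBufferSetDataBufferFromAudioBufferList(
            sampleBuffer,
            blockBufferAllocator: kCFAllocatorDefault,
            blockBufferMemoryAllocator: kCFAllocatorDefault,
            flags: 0,
            bufferList: bufferList)
        return status == noErr ? sampleBuffer : nil
    }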

AVAssetReader and Audio Queue streaming problem

假如想象 Submitted on 2019-12-01 00:41:26
I have a problem with AVAssetReader: I want to get samples from the iPod library and stream them via an Audio Queue. I have not been able to find any such example, so I tried to implement my own, but it seems that the asset reader somehow gets "screwed up" in the audio queue's callback function. Specifically, copyNextSampleBuffer fails: it returns NULL even though the reader is not finished yet. I have made sure the pointer exists, so it would be great if anyone could help. Below is the callback function code I have used. This callback function "works" when it is not called by the
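When copyNextSampleBuffer() returns NULL before the end of the track, the reader's status property distinguishes a real failure from normal completion. A minimal sketch (assuming `reader` and `output` are already configured and startReading() has been called):

    import AVFoundation

    func nextBuffer(from reader: AVAssetReader,
                    output: AVAssetReaderTrackOutput) -> CMSampleBuffer? {
        if let buffer = output.copyNextSampleBuffer() {
            return buffer
        }
        // nil has two meanings; the status says which one applies.
        switch reader.status {
        case .completed:
            print("end of stream")
        case .failed:
            print("reader failed: \(String(describing: reader.error))")
        default:
            print("reader status: \(reader.status.rawValue)")
        }
        return nil
    }

Checking status/error at least surfaces the underlying reason; a commonly reported trigger in exactly this setup is the reader transitioning to .failed after an audio session interruption or when driven from the Audio Queue's callback thread.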

OSX: CoreAudio API for setting IO Buffer length?

你说的曾经没有我的故事 Submitted on 2019-12-01 00:08:20
This is a follow-up to a previous question: OSX CoreAudio: Getting inNumberFrames in advance - on initialization? I am trying to figure out which AudioUnit API, if any, sets inNumberFrames or the preferred IO buffer duration of an input callback for a single HAL audio component instance on OSX (not a plug-in!). While there is comprehensive documentation on how this is achieved on iOS by means of the AVAudioSession API, I can neither figure out nor find documentation on setting these values on OSX, in any API. The web is full of expert, yet conflicting
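One commonly cited approach (a hedged sketch, not verified against every macOS version) is to set kAudioDevicePropertyBufferFrameSize through the AUHAL instance itself, which controls how many frames the device hands the input callback per slice:

    import AudioToolbox
    import CoreAudio

    // Hypothetical helper: ask the HAL unit to deliver `frames` per callback.
    // Call before initializing the unit; the device may clamp the value to
    // its supported range.
    func setBufferFrameSize(_ unit: AudioUnit, frames: UInt32) -> OSStatus {
        var value = frames
        return AudioUnitSetProperty(unit,
                                    kAudioDevicePropertyBufferFrameSize,
                                    kAudioUnitScope_Global,
                                    0,
                                    &value,
                                    UInt32(MemoryLayout<UInt32>.size))
    }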

How do I get my sound to play when a remote notification is received?

我的未来我决定 Submitted on 2019-11-30 23:52:47
I'm trying to automatically play a sound file (one that is not part of my app bundle and is not a notification sound) upon receiving a remote notification, whether the app is in the foreground or background at the time. I'm using The Amazing Audio Engine as a wrapper around the Core Audio libraries. In my app delegate's didReceiveRemoteNotification I create an audio controller and add an AEAudioFilePlayer to it like so:

    NSURL *file = [NSURL fileURLWithPath:sourceFilePath];
    AEAudioFilePlayer *notificationPlayer = [AEAudioFilePlayer audioFilePlayerWithURL
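A minimal sketch of the same idea with plain AVFoundation rather than The Amazing Audio Engine (the URL is a hypothetical stand-in for the poster's file). Note that background delivery additionally requires a content-available push and the audio background mode; an active playback session is the part most often missed:

    import AVFoundation

    var notificationPlayer: AVAudioPlayer?  // keep a strong reference, or playback stops

    func playNotificationSound(at url: URL) {
        do {
            // Activate a playback session first; otherwise starting audio
            // from the background is typically rejected.
            let session = AVAudioSession.sharedInstance()
            try session.setCategory(.playback, mode: .default, options: [])
            try session.setActive(true)
            notificationPlayer = try AVAudioPlayer(contentsOf: url)
            notificationPlayer?.play()
        } catch {
            print("Playback failed: \(error)")
        }
    }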

What's the reason for using a Circular Buffer in an iOS audio calling app?

﹥>﹥吖頭↗ Submitted on 2019-11-30 22:52:31
My question is pretty much self-explanatory; sorry if it seems too dumb. I am writing an iOS VoIP dialer and have checked some open-source code (iOS audio calling apps). Almost all of them use a circular buffer for storing recorded and received PCM audio data, so I am wondering why we need a circular buffer in this case. What's the exact reason for using such an audio buffer? Thanks in advance.

Good question. There is another good reason for using a circular buffer. In iOS, if you use callbacks (Audio Unit) for recording and playing audio (in fact you need to use them if you want to create a
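The core reason is decoupling: the recording callback produces samples at the hardware's pace while the playback callback (or the network) consumes them at its own pace, and a ring buffer lets them meet without blocking or reallocating. A toy single-producer/single-consumer sketch of the structure (illustrative only and not thread-safe; real apps use a lock-free implementation such as TPCircularBuffer):

    struct RingBuffer {
        private var storage: [Float]
        private var readIndex = 0
        private var writeIndex = 0
        private var count = 0

        init(capacity: Int) { storage = [Float](repeating: 0, count: capacity) }

        // Producer side (e.g. the recording callback): drop samples when full
        // rather than blocking, which would glitch the audio thread.
        mutating func write(_ samples: [Float]) -> Int {
            var written = 0
            for s in samples where count < storage.count {
                storage[writeIndex] = s
                writeIndex = (writeIndex + 1) % storage.count
                count += 1
                written += 1
            }
            return written
        }

        // Consumer side (e.g. the playback callback): returns however many
        // samples are available, up to out.count.
        mutating func read(into out: inout [Float]) -> Int {
            var readCount = 0
            while readCount < out.count && count > 0 {
                out[readCount] = storage[readIndex]
                readIndex = (readIndex + 1) % storage.count
                count -= 1
                readCount += 1
            }
            return readCount
        }
    }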

iPhone: NSData representation of Audio file for Editing

╄→гoц情女王★ Submitted on 2019-11-30 22:40:57
I have been scratching my head over this for a long time now, but I haven't found a single example of audio editing! I want to insert a new audio file somewhere in the middle of an original audio file and save the result as a new audio file. For this I have written the following code (I got the idea from here):

    NSString *file1 = [[NSBundle mainBundle] pathForResource:@"file1" ofType:@"caf"]; // Using PCM format
    NSString *file2 = [[NSBundle mainBundle] pathForResource:@"file2" ofType:@"caf"];
    NSData *file1Data = [[NSData alloc] initWithContentsOfFile:file1];
    NSData *file2Data = [[NSData alloc]
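One caveat with the NSData route: byte-level splicing only works for headerless raw PCM, and a CAF file carries a header and chunk structure, so naive concatenation produces a corrupt file. A hedged sketch (hypothetical file URLs) of the higher-level route with AVMutableComposition, which handles the container format and whose result can then be written out with AVAssetExportSession:

    import AVFoundation

    func spliceAudio(original: URL, insert: URL, at time: CMTime) throws -> AVMutableComposition {
        let composition = AVMutableComposition()
        let track = composition.addMutableTrack(withMediaType: .audio,
                                                preferredTrackID: kCMPersistentTrackID_Invalid)!
        let originalAsset = AVURLAsset(url: original)
        let insertAsset = AVURLAsset(url: insert)
        let originalTrack = originalAsset.tracks(withMediaType: .audio)[0]
        let insertTrack = insertAsset.tracks(withMediaType: .audio)[0]

        // Lay down the original audio, then insert the second file at `time`;
        // the composition shifts the later portion of the original to make room.
        try track.insertTimeRange(CMTimeRange(start: .zero, duration: originalAsset.duration),
                                  of: originalTrack, at: .zero)
        try track.insertTimeRange(CMTimeRange(start: .zero, duration: insertAsset.duration),
                                  of: insertTrack, at: time)
        return composition
    }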

iOS8 AVAudioEngine how to send microphone data over Multipeer Connectivity?

送分小仙女□ Submitted on 2019-11-30 21:10:51
I want to send microphone audio data over Multipeer Connectivity (iOS 8) and play it through the speaker of the receiving peer. I've also set up the AVAudioEngine, and I can hear the microphone data from the (upper) speaker output, but I don't know how to send an AVAudioPCMBuffer over the network. Here's my code snippet:

    AVAudioInputNode *inputNode = [self.engine inputNode];
    AVAudioMixerNode *mainMixer = [self.engine mainMixerNode];
    [self.engine connect:inputNode to:mainMixer format:[inputNode inputFormatForBus:0]];
    [mainMixer installTapOnBus:0 bufferSize:4096 format:[mainMixer outputFormatForBus:0]
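MCSession only transports Data, so the tap's AVAudioPCMBuffer has to be flattened first. A hedged sketch (the session and peers are assumed to exist; mono float samples for simplicity) of the sending side; the receiver must rebuild an AVAudioPCMBuffer with the same AVAudioFormat before scheduling it on a player node:

    import AVFoundation
    import MultipeerConnectivity

    func send(_ buffer: AVAudioPCMBuffer, over session: MCSession) {
        guard let channelData = buffer.floatChannelData else { return }
        let frames = Int(buffer.frameLength)
        // Channel 0 only; interleave the channels yourself for stereo.
        let data = Data(bytes: channelData[0], count: frames * MemoryLayout<Float>.size)
        // Unreliable delivery suits live audio: late packets are useless anyway.
        try? session.send(data, toPeers: session.connectedPeers, with: .unreliable)
    }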
