core-audio

How to convert AudioBufferList to CMSampleBuffer?

六眼飞鱼酱① Submitted on 2019-11-29 08:13:25
I have an AudioTapProcessor attached to an AVPlayerItem, which calls

    static void tap_ProcessCallback(MTAudioProcessingTapRef tap,
                                    CMItemCount numberFrames,
                                    MTAudioProcessingTapFlags flags,
                                    AudioBufferList *bufferListInOut,
                                    CMItemCount *numberFramesOut,
                                    MTAudioProcessingTapFlags *flagsOut)

during processing. I need to convert the AudioBufferList to a CMSampleBuffer so I can use AVAssetWriterAudioInput.appendSampleBuffer to write it into a movie file. So how do I convert an AudioBufferList to a CMSampleBuffer? I tried this but got a -12731 error from CMSampleBufferSetDataBufferFromAudioBufferList…
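A minimal Swift sketch of one common approach, assuming `asbd` holds the tap's AudioStreamBasicDescription (captured in the tap's prepare callback) and `bufferList`/`frameCount`/`presentationTime` come from the process callback; the helper name is illustrative. Error -12731 corresponds to kCMSampleBufferError_RequiredParameterMissing, which typically means the sample buffer was created without a format description, so create that first and only then attach the AudioBufferList:

    import CoreMedia
    import AudioToolbox

    // Sketch: wrap a tap's AudioBufferList in a CMSampleBuffer for
    // AVAssetWriterAudioInput. All parameters are assumed inputs
    // from the tap callbacks.
    func makeSampleBuffer(bufferList: UnsafeMutablePointer<AudioBufferList>,
                          frameCount: CMItemCount,
                          asbd: inout AudioStreamBasicDescription,
                          presentationTime: CMTime) -> CMSampleBuffer? {
        // 1. A format description is required; omitting it is a common
        //    source of -12731 (kCMSampleBufferError_RequiredParameterMissing).
        var formatDescription: CMAudioFormatDescription?
        guard CMAudioFormatDescriptionCreate(allocator: kCFAllocatorDefault,
                                             asbd: &asbd,
                                             layoutSize: 0, layout: nil,
                                             magicCookieSize: 0, magicCookie: nil,
                                             extensions: nil,
                                             formatDescriptionOut: &formatDescription) == noErr,
              let format = formatDescription else { return nil }

        // 2. Create the (still empty) sample buffer with timing info.
        var timing = CMSampleTimingInfo(
            duration: CMTime(value: 1, timescale: CMTimeScale(asbd.mSampleRate)),
            presentationTimeStamp: presentationTime,
            decodeTimeStamp: .invalid)
        var sampleBuffer: CMSampleBuffer?
        guard CMSampleBufferCreate(allocator: kCFAllocatorDefault,
                                   dataBuffer: nil, dataReady: false,
                                   makeDataReadyCallback: nil, refcon: nil,
                                   formatDescription: format,
                                   sampleCount: frameCount,
                                   sampleTimingEntryCount: 1, sampleTimingArray: &timing,
                                   sampleSizeEntryCount: 0, sampleSizeArray: nil,
                                   sampleBufferOut: &sampleBuffer) == noErr,
              let buffer = sampleBuffer else { return nil }

        // 3. Now the AudioBufferList can be attached.
        guard CMSampleBufferSetDataBufferFromAudioBufferList(
                buffer,
                blockBufferAllocator: kCFAllocatorDefault,
                blockBufferMemoryAllocator: kCFAllocatorDefault,
                flags: 0,
                bufferList: bufferList) == noErr else { return nil }
        return buffer
    }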

AVPlayer playback of single channel audio stereo->mono

生来就可爱ヽ(ⅴ<●) Submitted on 2019-11-29 07:54:20
In my iPad/iPhone app I'm playing back a video using AVPlayer. The video file has a stereo audio track, but I need to play back only one channel of this track in mono. The deployment target is iOS 6. How can I achieve this? Thanks a lot for your help. I finally found an answer to this question, at least for deployment on iOS 6: you can easily add an MTAudioProcessingTap to your existing AVPlayerItem and copy the selected channel's samples to the other channel during your process callback function. Here is a great tutorial explaining the basics: http://chritto.wordpress.com/2013/01/07/processing …
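A minimal Swift sketch of such a process callback, assuming two non-interleaved Float32 buffers (what an unprocessed tap on stereo material typically delivers); it overwrites the right channel with the left channel's samples:

    import MediaToolbox
    import AudioToolbox
    import Foundation

    // Sketch: play the left channel on both outputs (stereo -> mono).
    let tapProcess: MTAudioProcessingTapProcessCallback = {
        tap, numberFrames, flags, bufferListInOut, numberFramesOut, flagsOut in
        // Pull the source samples into bufferListInOut first.
        guard MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                 flagsOut, nil, numberFramesOut) == noErr
        else { return }

        let buffers = UnsafeMutableAudioBufferListPointer(bufferListInOut)
        guard buffers.count == 2,
              let left = buffers[0].mData,
              let right = buffers[1].mData else { return }
        // Overwrite the right channel with the left channel's samples.
        memcpy(right, left, Int(buffers[0].mDataByteSize))
    }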

How to get Audio Device UID to pass into NSSound's setPlaybackDeviceIdentifier:

末鹿安然 Submitted on 2019-11-29 05:06:49
How can I get an audio device UID (USB speaker) to pass into NSSound's setPlaybackDeviceIdentifier: method? Thanks. To avoid the deprecated AudioHardwareGetProperty and AudioDeviceGetProperty calls, replace them with something like this:

    AudioObjectPropertyAddress propertyAddress;
    AudioObjectID *deviceIDs;
    UInt32 propertySize;
    NSInteger numDevices;

    propertyAddress.mSelector = kAudioHardwarePropertyDevices;
    propertyAddress.mScope = kAudioObjectPropertyScopeGlobal;
    propertyAddress.mElement = kAudioObjectPropertyElementMaster;

    if (AudioObjectGetPropertyDataSize(kAudioObjectSystemObject, …
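For reference, the same enumeration as a Swift sketch, extended to read each device's UID via kAudioDevicePropertyDeviceUID, which is the string NSSound's playbackDeviceIdentifier expects (error handling reduced to early returns):

    import CoreAudio

    // Sketch: list the UID of every audio device on the system.
    func audioDeviceUIDs() -> [String] {
        var address = AudioObjectPropertyAddress(
            mSelector: kAudioHardwarePropertyDevices,
            mScope: kAudioObjectPropertyScopeGlobal,
            mElement: kAudioObjectPropertyElementMaster)

        var dataSize: UInt32 = 0
        guard AudioObjectGetPropertyDataSize(AudioObjectID(kAudioObjectSystemObject),
                                             &address, 0, nil, &dataSize) == noErr
        else { return [] }

        var deviceIDs = [AudioObjectID](
            repeating: 0, count: Int(dataSize) / MemoryLayout<AudioObjectID>.size)
        guard AudioObjectGetPropertyData(AudioObjectID(kAudioObjectSystemObject),
                                         &address, 0, nil, &dataSize, &deviceIDs) == noErr
        else { return [] }

        var uidAddress = AudioObjectPropertyAddress(
            mSelector: kAudioDevicePropertyDeviceUID,
            mScope: kAudioObjectPropertyScopeGlobal,
            mElement: kAudioObjectPropertyElementMaster)
        return deviceIDs.compactMap { deviceID in
            // The property returns a retained CFString.
            var uid: Unmanaged<CFString>?
            var uidSize = UInt32(MemoryLayout<Unmanaged<CFString>?>.size)
            guard AudioObjectGetPropertyData(deviceID, &uidAddress, 0, nil,
                                             &uidSize, &uid) == noErr else { return nil }
            return uid?.takeRetainedValue() as String?
        }
    }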

How to use “kAudioUnitSubType_VoiceProcessingIO” subtype of the Core Audio API on macOS?

倾然丶 夕夏残阳落幕 Submitted on 2019-11-29 04:51:50
I'm looking for an example of a simple play-through application using the built-in mic/speaker with the kAudioUnitSubType_VoiceProcessingIO subtype (not kAudioUnitSubType_HALOutput) on macOS. The comments in the Core Audio API say that kAudioUnitSubType_VoiceProcessingIO is available on the desktop and with iPhone 3.0 or greater, so I think there must be an example somewhere for macOS. Do you have any idea where the sample is? Or does anyone know how to use the kAudioUnitSubType_VoiceProcessingIO subtype on macOS? I already tried the same approach that I used on iOS, but it didn't work. I discovered a…
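A minimal Swift sketch of instantiating the unit on macOS. The lookup is the standard AudioComponent dance; only the componentSubType differs from the usual HAL output unit (stream formats and a render callback would still need to be configured where indicated):

    import AudioToolbox

    // Sketch: open and start the voice-processing I/O unit on macOS.
    var description = AudioComponentDescription(
        componentType: kAudioUnitType_Output,
        componentSubType: kAudioUnitSubType_VoiceProcessingIO,
        componentManufacturer: kAudioUnitManufacturer_Apple,
        componentFlags: 0,
        componentFlagsMask: 0)

    var unit: AudioUnit?
    if let component = AudioComponentFindNext(nil, &description),
       AudioComponentInstanceNew(component, &unit) == noErr,
       let vpio = unit {
        // Input (element 1) must be enabled explicitly before initializing.
        var enable: UInt32 = 1
        AudioUnitSetProperty(vpio, kAudioOutputUnitProperty_EnableIO,
                             kAudioUnitScope_Input, 1,
                             &enable, UInt32(MemoryLayout<UInt32>.size))
        // ... set stream formats / render callback here ...
        AudioUnitInitialize(vpio)
        AudioOutputUnitStart(vpio)
    }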

Framework not found AudioUnit

有些话、适合烂在心里 Submitted on 2019-11-29 02:54:28
I've been banging my head against the wall for a while now. My Xcode project went a little haywire while refactoring and refused to build. I've squashed all the other errors except one last link-time error: Framework not found AudioUnit. I have the AudioUnit headers, and AudioUnit.framework is included in my project as it was before (Targets > Get Info > General > Linked Libraries > +), but I cannot figure out why it does not work now. AudioToolbox.framework is also included. One fix: remove AudioUnit.framework and add CoreAudio.framework. What helped for me: removing AudioUnit.framework, then adding AudioToolbox…

iOS Code to Convert m4a to WAV

心已入冬 Submitted on 2019-11-29 02:46:44
Does anyone have any code snippets that show how to convert an M4A file to WAV? I know there are libraries that convert the other way around. Thanks. Just to update for Swift 3:

    func convertAudio(_ url: URL, outputURL: URL) {
        var error: OSStatus = noErr
        var destinationFile: ExtAudioFileRef? = nil
        var sourceFile: ExtAudioFileRef? = nil
        var srcFormat: AudioStreamBasicDescription = AudioStreamBasicDescription()
        var dstFormat: AudioStreamBasicDescription = AudioStreamBasicDescription()

        ExtAudioFileOpenURL(url as CFURL, &sourceFile)

        var thePropertySize: UInt32 = UInt32(MemoryLayout.stride…
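The excerpt above is cut off; below is a self-contained Swift sketch of the same ExtAudioFile approach. The function name, buffer size, and the 16-bit interleaved LPCM target format are illustrative choices, not the answer's exact code:

    import AudioToolbox

    // Sketch: decode any Core Audio-readable file (e.g. .m4a) to a
    // 16-bit LPCM WAV file via ExtAudioFile.
    func convertToWAV(input: URL, output: URL) -> Bool {
        var sourceFile: ExtAudioFileRef?
        guard ExtAudioFileOpenURL(input as CFURL, &sourceFile) == noErr,
              let source = sourceFile else { return false }
        defer { ExtAudioFileDispose(source) }

        var srcFormat = AudioStreamBasicDescription()
        var size = UInt32(MemoryLayout<AudioStreamBasicDescription>.stride)
        ExtAudioFileGetProperty(source, kExtAudioFileProperty_FileDataFormat,
                                &size, &srcFormat)

        // Destination: interleaved 16-bit PCM at the source rate/channels.
        var dstFormat = AudioStreamBasicDescription(
            mSampleRate: srcFormat.mSampleRate,
            mFormatID: kAudioFormatLinearPCM,
            mFormatFlags: kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked,
            mBytesPerPacket: 2 * srcFormat.mChannelsPerFrame,
            mFramesPerPacket: 1,
            mBytesPerFrame: 2 * srcFormat.mChannelsPerFrame,
            mChannelsPerFrame: srcFormat.mChannelsPerFrame,
            mBitsPerChannel: 16,
            mReserved: 0)

        var destFile: ExtAudioFileRef?
        guard ExtAudioFileCreateWithURL(output as CFURL, kAudioFileWAVEType, &dstFormat,
                                        nil, AudioFileFlags.eraseFile.rawValue,
                                        &destFile) == noErr,
              let dest = destFile else { return false }
        defer { ExtAudioFileDispose(dest) }

        // Both files convert to/from the same client format, so each read
        // can be written straight back out.
        ExtAudioFileSetProperty(source, kExtAudioFileProperty_ClientDataFormat,
                                UInt32(MemoryLayout<AudioStreamBasicDescription>.stride),
                                &dstFormat)
        ExtAudioFileSetProperty(dest, kExtAudioFileProperty_ClientDataFormat,
                                UInt32(MemoryLayout<AudioStreamBasicDescription>.stride),
                                &dstFormat)

        let bufferBytes: UInt32 = 32_768
        var data = [UInt8](repeating: 0, count: Int(bufferBytes))
        while true {
            var frames = bufferBytes / dstFormat.mBytesPerFrame
            let status: OSStatus = data.withUnsafeMutableBytes { raw in
                var list = AudioBufferList(
                    mNumberBuffers: 1,
                    mBuffers: AudioBuffer(mNumberChannels: dstFormat.mChannelsPerFrame,
                                          mDataByteSize: bufferBytes,
                                          mData: raw.baseAddress))
                let err = ExtAudioFileRead(source, &frames, &list)
                guard err == noErr, frames > 0 else { return err }
                return ExtAudioFileWrite(dest, frames, &list)
            }
            if status != noErr || frames == 0 { break }
        }
        return true
    }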

iOS: Using Bluetooth audio output (kAudioSessionProperty_OverrideCategoryEnableBluetoothInput) AudioSession

主宰稳场 Submitted on 2019-11-29 02:28:40
I have several questions about the Core Audio AudioSession framework related to several Bluetooth tasks, and I hope someone can help me with these issues or at least confirm my latest findings. The use case is a navigation app that wants to connect to a Bluetooth-enabled radio which supports both HFP and A2DP. I have read the whole Audio Session Programming Guide, but I still have some open issues, especially regarding audio output through Bluetooth. Bluetooth HFP audio output (kAudioSessionOutputRoute_BluetoothHFP) is only possible when the AudioSession category kAudioSessionCategory_PlayAndRecord is…
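For comparison, a Swift sketch of the modern AVAudioSession equivalent: the .allowBluetooth category option corresponds to the old C property kAudioSessionProperty_OverrideCategoryEnableBluetoothInput from the question (the C AudioSession API has since been deprecated):

    import AVFoundation

    let session = AVAudioSession.sharedInstance()
    do {
        // HFP input and output; only valid with .playAndRecord.
        try session.setCategory(.playAndRecord, options: [.allowBluetooth])
        try session.setActive(true)
    } catch {
        print("Audio session setup failed: \(error)")
    }
    // For high-quality A2DP output instead, a playback-only category
    // (.playback) routes to A2DP by default, but offers no microphone path.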

Determine Number of Frames in a Core Audio AudioBuffer

前提是你 Submitted on 2019-11-29 02:24:09
I am trying to access the raw data of an audio file on the iPhone/iPad. I have the following code, which is a basic start down the path I need. However, I am stumped about what to do once I have an AudioBuffer.

    AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:urlAsset error:nil];
    AVAssetReaderTrackOutput *assetReaderOutput =
        [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:[[urlAsset tracks] objectAtIndex:0]
                                                   outputSettings:nil];
    [assetReader addOutput:assetReaderOutput];
    [assetReader startReading];

    CMSampleBufferRef ref;
    NSArray *outputs = assetReader.outputs;
    …
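Once a CMSampleBuffer has been copied from the reader output, the frame count can be read directly or derived from the payload size; a Swift sketch (assuming LPCM output with a valid mBytesPerFrame, which is what an asset reader with PCM output settings delivers):

    import CoreMedia
    import AudioToolbox

    // Sketch: two equivalent ways to get the number of audio frames.
    func frameCount(of sampleBuffer: CMSampleBuffer) -> CMItemCount {
        // 1. Directly: for audio, one "sample" is one frame.
        let frames = CMSampleBufferGetNumSamples(sampleBuffer)

        // 2. Equivalently, total byte size / bytes-per-frame (LPCM only).
        if let desc = CMSampleBufferGetFormatDescription(sampleBuffer),
           let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(desc)?.pointee,
           asbd.mBytesPerFrame > 0 {
            let byteCount = CMSampleBufferGetTotalSampleSize(sampleBuffer)
            assert(byteCount / Int(asbd.mBytesPerFrame) == frames)
        }
        return frames
    }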

Sound not playing with AVAudioPlayer

[亡魂溺海] Submitted on 2019-11-29 02:17:09
I've searched, and I believe my problem is quite unique. I'm aware of the Simulator 5.1 bug when using AVAudioPlayer, which isn't my problem; I'm running on an iOS 5.1 device. Here's my header file:

    #import <UIKit/UIKit.h>
    #import <Foundation/Foundation.h>
    #import <AVFoundation/AVAudioPlayer.h>

    @interface BellViewController : UIViewController
    - (IBAction)pushBell;
    @end

and my implementation file:

    #import "BellViewController.h"

    @interface BellViewController ()
    @end

    @implementation BellViewController

    - (IBAction)pushBell {
        NSString *soundPath = [[NSBundle mainBundle] pathForResource:@"bell1" ofType:@"caf"];
        NSURL *soundURL = [NSURL fileURLWithPath…
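The classic cause of this symptom under ARC is an AVAudioPlayer stored in a local variable, which is deallocated before the sound starts. A Swift sketch of the fix (a hypothetical modern rewrite of the controller above), keeping a strong reference in a property:

    import UIKit
    import AVFoundation

    final class BellViewController: UIViewController {
        private var player: AVAudioPlayer?   // strong reference outlives the action

        @IBAction func pushBell() {
            guard let url = Bundle.main.url(forResource: "bell1",
                                            withExtension: "caf") else { return }
            player = try? AVAudioPlayer(contentsOf: url)
            player?.play()
        }
    }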

AVAudioEngine multiple AVAudioInputNodes do not play in perfect sync

青春壹個敷衍的年華 Submitted on 2019-11-28 23:52:53
I've been trying to use AVAudioEngine to schedule multiple audio files to play in perfect sync, but when listening to the output there seems to be a very slight delay between input nodes. The audio engine is implemented using the following graph:

    // AVAudioPlayerNode1 -->
    // AVAudioPlayerNode2 -->
    // AVAudioPlayerNode3 --> AVAudioMixerNode --> AVAudioUnitVarispeed --> AVAudioOutputNode
    // AVAudioPlayerNode4 -->        |
    // AVAudioPlayerNode5 -->     AudioTap
    //                               |
    //                       AVAudioPCMBuffers

And I am using the following code to load the samples and schedule them at the same time:

    - (void…
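One commonly suggested fix is to schedule everything first, then derive a single start time from the output node's render clock and pass it to every player's play(at:), instead of calling play() back-to-back. A Swift sketch, assuming the engine is already running and the player nodes are attached, connected, and paired with loaded files (the function name and 0.5 s lead time are illustrative):

    import AVFoundation

    func startInSync(_ players: [AVAudioPlayerNode],
                     files: [AVAudioFile],
                     engine: AVAudioEngine) {
        // Schedule now, start later: play(at:) supplies the start time.
        for (player, file) in zip(players, files) {
            player.scheduleFile(file, at: nil, completionHandler: nil)
        }
        guard let render = engine.outputNode.lastRenderTime,
              render.isSampleTimeValid else { return }
        let rate = engine.outputNode.outputFormat(forBus: 0).sampleRate
        // One shared anchor ~0.5 s ahead, so every node meets the same deadline.
        let start = AVAudioTime(sampleTime: render.sampleTime
                                    + AVAudioFramePosition(0.5 * rate),
                                atRate: rate)
        for player in players {
            player.play(at: start)
        }
    }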