core-audio

How can I specify the format of AVAudioEngine Mic-Input?

Posted by 旧街凉风 on 2019-11-29 17:44:06
Question: I'd like to record some audio using AVAudioEngine and the user's microphone. I already have a working sample, but just can't figure out how to specify the format of the output that I want... My requirement is that I need the AVAudioPCMBuffer as I speak, which it currently delivers... Would I need to add a separate node that does some transcoding? I can't find much documentation or sample code on this problem, and I'm also a novice when it comes to audio. I know that I want NSData
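A common way to get raw bytes out of the float32 buffers an AVAudioPCMBuffer tap delivers is to convert them to interleaved 16-bit PCM before wrapping them in NSData. The helper below is a minimal sketch of that conversion in plain C; the function name and layout are illustrative, not part of any Apple API.

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical helper: convert non-interleaved float32 samples (the format an
 * AVAudioPCMBuffer tap typically delivers) into interleaved signed 16-bit PCM,
 * the kind of raw byte buffer you could then wrap in NSData. */
static void float32_to_int16_interleaved(const float *const *channels,
                                         size_t channelCount,
                                         size_t frameCount,
                                         int16_t *out)
{
    for (size_t frame = 0; frame < frameCount; frame++) {
        for (size_t ch = 0; ch < channelCount; ch++) {
            float s = channels[ch][frame];
            if (s > 1.0f)  s = 1.0f;   /* clamp to avoid integer overflow */
            if (s < -1.0f) s = -1.0f;
            out[frame * channelCount + ch] = (int16_t)(s * 32767.0f);
        }
    }
}
```

On iOS/macOS the same reformatting can also be delegated to an AVAudioConverter instead of doing it by hand.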

How to mute the mic's audio input and recognize only the internal audio of the device using aurioTouch

Posted by 让人想犯罪 __ on 2019-11-29 16:42:16
I have used the aurioTouch code in my app, and when I record audio it shows the audio waves. While recording, the mic picks up the audio input and the waves react to whatever sound the mic receives. So far so good. But now, when I click the play button to play the sound I just recorded, the mic's input should be off, so that the waves react only to the audio I recorded before; the waves should not react even if I speak while it plays the previously recorded audio. So it's more or less about muting the mic's input to avoid the recognition
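One simple way to get this behavior is to zero out the mic samples inside the input processing path whenever playback is active, so the waveform display only sees the played-back audio. This is a minimal sketch of that idea in plain C; the names are illustrative and not taken from aurioTouch.

```c
#include <stdbool.h>
#include <string.h>
#include <stddef.h>

/* While previously recorded audio is playing back, silence the mic samples
 * before they reach the waveform display, so the waves react only to the
 * playback. Hypothetical function, not aurioTouch's actual API. */
static void process_mic_input(float *micSamples, size_t frameCount, bool isPlayingBack)
{
    if (isPlayingBack) {
        /* Zero the input buffer: the display sees silence from the mic. */
        memset(micSamples, 0, frameCount * sizeof(float));
    }
    /* otherwise leave the live mic samples untouched */
}
```

In a real render callback this check would run before the samples are handed to the FFT/waveform code.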

iOS - mixing MIDI files, each with its own sound font

Posted by 南笙酒味 on 2019-11-29 15:29:28
Question: I'm looking for a way to mix two or more MIDI files, each with its own sound font file. I've found the following code for one file and tried using multiple music players, but I guess that isn't the right approach. I also get a weird pop sound every second. So is there another way, maybe without the MusicPlayer and MusicSequence methods, using only AU units? Here's the code I found in another thread: -(void) playMusic:(NSString*) name { NSString *presetURLPath = [[NSBundle

Simplest way to capture raw audio from audio input for real time processing on a mac

Posted by 末鹿安然 on 2019-11-29 15:20:50
Question: What is the simplest way to capture audio from the built-in audio input and read the raw sampled values (as in a .wav) in real time as they come in when requested, like reading from a socket? Hopefully using one of Apple's frameworks (Audio Queues). The documentation is not very clear, and what I need is very basic. Answer 1: Try the AudioQueue framework for this. You mainly have to perform three steps: set up an audio format describing how to sample the incoming analog audio, start a new recording
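The first of those steps means filling in an AudioStreamBasicDescription. The sketch below re-declares the struct's fields locally so it compiles without AudioToolbox.h (on macOS/iOS you would include the real header instead), and shows the field relationships for 16-bit linear PCM, where one packet is exactly one frame.

```c
#include <stdint.h>

/* The fields of Core Audio's AudioStreamBasicDescription, re-declared here so
 * the sketch is self-contained; use the real type from AudioToolbox.h on Apple
 * platforms. */
typedef struct {
    double   mSampleRate;
    uint32_t mFormatID;
    uint32_t mFormatFlags;
    uint32_t mBytesPerPacket;
    uint32_t mFramesPerPacket;
    uint32_t mBytesPerFrame;
    uint32_t mChannelsPerFrame;
    uint32_t mBitsPerChannel;
} ASBD;

/* Describe how the incoming analog audio is sampled: linear PCM, where every
 * packet holds exactly one frame. */
static ASBD make_lpcm_format(double sampleRate, uint32_t channels, uint32_t bitsPerChannel)
{
    ASBD fmt = {0};
    fmt.mSampleRate       = sampleRate;
    fmt.mChannelsPerFrame = channels;
    fmt.mBitsPerChannel   = bitsPerChannel;
    fmt.mFramesPerPacket  = 1;                         /* LPCM: 1 frame per packet */
    fmt.mBytesPerFrame    = channels * (bitsPerChannel / 8);
    fmt.mBytesPerPacket   = fmt.mBytesPerFrame;        /* packet == frame for LPCM */
    return fmt;
}
```

This struct is what you would pass to AudioQueueNewInput when creating the recording queue.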

How do I connect an AudioFilePlayer AudioUnit to a 3DMixer?

Posted by 时光毁灭记忆、已成空白 on 2019-11-29 14:55:11
Question: I am trying to connect an AudioFilePlayer AudioUnit to an AU3DMixerEmbedded audio unit, but I'm having no success. Here's what I'm doing: create an AUGraph with NewAUGraph(); open the graph; initialize the graph; add 3 nodes: outputNode (kAudioUnitSubType_RemoteIO), mixerNode (kAudioUnitSubType_AU3DMixerEmbedded), filePlayerNode (kAudioUnitSubType_AudioFilePlayer); connect the nodes: filePlayerNode -> mixerNode, mixerNode -> outputNode; configure the filePlayer audio unit to play the required file; Start

core audio: how can one packet = one byte when clearly one packet = 4 bytes

Posted by 烂漫一生 on 2019-11-29 11:55:20
I was going over the Core Audio conversion services chapter in Learning Core Audio and was struck by this example in the sample code: while(1) { // wrap the destination buffer in an AudioBufferList AudioBufferList convertedData; convertedData.mNumberBuffers = 1; convertedData.mBuffers[0].mNumberChannels = mySettings->outputFormat.mChannelsPerFrame; convertedData.mBuffers[0].mDataByteSize = outputBufferSize; convertedData.mBuffers[0].mData = outputBuffer; UInt32 frameCount = packetsPerBuffer; // read from the ExtAudioFile CheckResult(ExtAudioFileRead(mySettings->inputFile, &frameCount,
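The arithmetic behind the title's confusion: for linear PCM one packet is one frame, so bytes-per-packet is channels times bytes-per-sample (4 bytes for 16-bit stereo), and the packet count for a buffer follows by simple division. A minimal sketch of that calculation:

```c
#include <stdint.h>

/* For linear PCM, one packet == one frame, so a buffer's packet capacity is
 * just its byte size divided by mBytesPerPacket. With 16-bit stereo PCM,
 * bytesPerPacket is 2 channels * 2 bytes = 4. */
static uint32_t packets_per_buffer(uint32_t bufferSizeBytes, uint32_t bytesPerPacket)
{
    return bufferSizeBytes / bytesPerPacket;
}
```

A "one packet = one byte" result usually means the calculation was done against a different (e.g. 8-bit mono, or compressed) format's mBytesPerPacket than the one actually in use.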

iOS Core Audio : Converting between kAudioFormatFlagsCanonical and kAudioFormatFlagsAudioUnitCanonical

Posted by ╄→гoц情女王★ on 2019-11-29 11:48:12
I need to convert between this format: format.mSampleRate = 44100.0; format.mFormatID = kAudioFormatLinearPCM; format.mFormatFlags = kAudioFormatFlagsCanonical | kLinearPCMFormatFlagIsNonInterleaved; format.mBytesPerPacket = sizeof(AudioUnitSampleType); format.mFramesPerPacket = 1; format.mBytesPerFrame = sizeof(AudioUnitSampleType); format.mChannelsPerFrame = 2; format.mBitsPerChannel = sizeof(AudioUnitSampleType)*8; and this format: format.mSampleRate = 44100.0; format.mFormatID = kAudioFormatLinearPCM; format.mFormatFlags = kAudioFormatFlagsAudioUnitCanonical; format.mBytesPerPacket =
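On iOS, kAudioFormatFlagsCanonical historically meant SInt16 samples while kAudioFormatFlagsAudioUnitCanonical meant 8.24 fixed-point SInt32 samples. Converting a single sample between the two is a scale by 2^9 (a 16-bit sample scaled by 2^15 becomes a 24-bit fraction scaled by 2^24). A minimal per-sample sketch, using multiply/divide rather than shifts to stay well-defined for negative values:

```c
#include <stdint.h>

/* SInt16 canonical sample -> 8.24 fixed-point AudioUnit canonical sample.
 * Full scale maps to full scale: s/2^15 == (s*512)/2^24. */
static int32_t int16_to_8_24(int16_t s)      { return (int32_t)s * 512; }

/* 8.24 fixed-point -> SInt16, discarding the extra 9 bits of precision. */
static int16_t fixed8_24_to_int16(int32_t s) { return (int16_t)(s / 512); }
```

For whole buffers, an AudioConverter configured with the two AudioStreamBasicDescriptions does the same job without hand-written loops.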

Change Volume on Mac programmatically

Posted by 最后都变了- on 2019-11-29 10:44:13
I'm looking for a non-AppleScript way to change the system volume on Mac OS X programmatically. I just couldn't find a solution. Anyone have any ideas? Take a look at this class: https://github.com/InerziaSoft/ISSoundAdditions It can change the system volume and makes use of the CoreAudio API. An example of usage looks like this: [NSSound setSystemVolume:0.5] Source: https://stackoverflow.com/questions/6278589/change-volume-on-mac-programmatically

ios - combine/concatenate multiple audio files

Posted by 强颜欢笑 on 2019-11-29 09:01:15
I have a bunch of audio files (mp3, aac, whatever) and I want to concatenate them into one audio file. Has anyone done this before? Aaron Hayman: I have done this. To do the concatenation you first need to load the audio files into AVAssets. Specifically, you'll want to use a subclass of AVAsset called AVURLAsset, which can load from your URL (see "Loading AVAsset"). You can then add each AVAsset into an AVMutableComposition, which is designed to contain multiple AVAssets. Once it's loaded into the AVMutableComposition, you can use AVAssetExportSession to write the composition to a file.
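At the sample level, what the exported composition contains is just the decoded audio of each asset laid end to end. This is a toy model of that end result in plain C, assuming each file has already been decoded to PCM in one common format (which AVAssetExportSession handles for you); it is not the AVFoundation API.

```c
#include <stddef.h>
#include <stdlib.h>
#include <string.h>
#include <stdint.h>

/* Toy model: once two tracks share a PCM format, concatenation is appending
 * their sample buffers back to back. Caller frees the returned buffer. */
static int16_t *concat_pcm(const int16_t *a, size_t aLen,
                           const int16_t *b, size_t bLen,
                           size_t *outLen)
{
    int16_t *out = malloc((aLen + bLen) * sizeof(int16_t));
    if (!out) return NULL;
    memcpy(out, a, aLen * sizeof(int16_t));
    memcpy(out + aLen, b, bLen * sizeof(int16_t));
    *outLen = aLen + bLen;
    return out;
}
```

The value of the AVMutableComposition route is that it does this at the asset level, handling decoding, format differences, and re-encoding on export.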

Callback function of output unit of audio graph - can't read data

Posted by 余生长醉 on 2019-11-29 08:44:46
I am trying to capture the input data stream to the output unit in an audio graph so I can write it to a file. I have registered an input callback function for the output unit (default output) after creating the graph, like so: AudioComponent comp = AudioComponentFindNext(NULL, &cd); if (comp == NULL) { printf("can't get output unit"); exit(-1); } CheckError(AudioComponentInstanceNew(comp, &player->outputUnit), "Couldn't open component for outputUnit"); // outputUnit is of type AudioUnit // register render callback AURenderCallbackStruct input; input.inputProc = MyRenderProc; input
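The shape of the capture step inside such a callback: after the samples have been rendered, copy them aside into a buffer that another thread later writes to the file, since a render callback must never block on file I/O. A minimal sketch with a plain array standing in for a proper ring buffer (names are illustrative):

```c
#include <stddef.h>
#include <string.h>

/* Fixed-size capture buffer standing in for a real lock-free ring buffer. */
typedef struct {
    float  samples[4096];
    size_t used;
} CaptureBuffer;

/* Copy freshly rendered samples aside for later file writing; drops frames
 * if the buffer is full rather than blocking in the callback. */
static size_t capture_rendered_audio(CaptureBuffer *cap,
                                     const float *rendered, size_t frameCount)
{
    size_t capacity = sizeof(cap->samples) / sizeof(cap->samples[0]);
    size_t space = capacity - cap->used;
    size_t n = frameCount < space ? frameCount : space;
    memcpy(cap->samples + cap->used, rendered, n * sizeof(float));
    cap->used += n;
    return n;   /* frames actually captured */
}
```

In the real callback, `rendered` would be `ioData->mBuffers[i].mData` after the call that fills it; the file write (e.g. via ExtAudioFileWrite) happens on a separate thread draining the buffer.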