core-audio

Setting rate on AudioUnit subtype kAudioUnitSubType_NewTimePitch

Submitted by 不问归期 on 2019-12-12 03:59:49
Question: I'm trying to get/set the rate of an AudioUnit with subtype kAudioUnitSubType_NewTimePitch. The audio unit is added to an AUGraph through an AUNode, which has the following component description:

    acd->componentType = kAudioUnitType_Effect;
    acd->componentSubType = kAudioUnitSubType_NewTimePitch;
    acd->componentManufacturer = kAudioUnitManufacturer_Apple;

According to AudioUnitParameters.h, getting/setting the rate should be as simple as getting/setting the rate parameter on the audio unit:

    // rate control …
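For illustration, a minimal sketch of that parameter call, assuming the AUGraph and AUNode variables are named graph and timePitchNode (hypothetical names, not from the question):

    #include <AudioToolbox/AudioToolbox.h>

    // Fetch the AudioUnit behind the AUNode, set the rate, then read it back.
    static OSStatus SetTimePitchRate(AUGraph graph, AUNode timePitchNode, Float32 rate)
    {
        AudioUnit timePitchUnit;
        OSStatus err = AUGraphNodeInfo(graph, timePitchNode, NULL, &timePitchUnit);
        if (err != noErr) return err;

        // kNewTimePitchParam_Rate lives on the global scope, element 0.
        err = AudioUnitSetParameter(timePitchUnit, kNewTimePitchParam_Rate,
                                    kAudioUnitScope_Global, 0, rate, 0);
        if (err != noErr) return err;

        // Read the value back to confirm it took effect.
        AudioUnitParameterValue current;
        return AudioUnitGetParameter(timePitchUnit, kNewTimePitchParam_Rate,
                                     kAudioUnitScope_Global, 0, &current);
    }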

Core audio: file playback render callback function

Submitted by 隐身守侯 on 2019-12-12 03:45:48
Question: I am using a RemoteIO Audio Unit for audio playback in my app with kAudioUnitProperty_ScheduledFileIDs. The audio files are in PCM format. How can I implement a render callback function for this case, so that I can manually modify the buffer samples? Here is my code:

    static AudioComponentInstance audioUnit;

    AudioComponentDescription desc;
    desc.componentType = kAudioUnitType_Output;
    desc.componentSubType = kAudioUnitSubType_RemoteIO;
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;
    desc…
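For reference, a rough sketch of wiring a render callback onto a RemoteIO unit's input scope; the callback name and its contents are hypothetical, and whatever fills ioData here is what ends up being played:

    #include <AudioToolbox/AudioToolbox.h>

    // Hypothetical render callback: fill or modify ioData->mBuffers[i].mData here.
    static OSStatus MyRenderCallback(void *inRefCon,
                                     AudioUnitRenderActionFlags *ioActionFlags,
                                     const AudioTimeStamp *inTimeStamp,
                                     UInt32 inBusNumber,
                                     UInt32 inNumberFrames,
                                     AudioBufferList *ioData)
    {
        // Sample processing goes here (e.g. apply a gain to each buffer).
        return noErr;
    }

    // Attach the callback to the RemoteIO unit's input scope, bus 0.
    static OSStatus AttachRenderCallback(AudioComponentInstance audioUnit)
    {
        AURenderCallbackStruct cb = { .inputProc = MyRenderCallback, .inputProcRefCon = NULL };
        return AudioUnitSetProperty(audioUnit,
                                    kAudioUnitProperty_SetRenderCallback,
                                    kAudioUnitScope_Input, 0,
                                    &cb, sizeof(cb));
    }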

iOS Core audio: AudioFilePlayer Unit render callback

Submitted by 岁酱吖の on 2019-12-12 03:38:02
Question: I am trying to create a render callback for my AudioFilePlayer unit. I created two audio units:

    static AudioComponentInstance audioUnit; // AudioFilePlayer
    static AudioComponentInstance rioUnit;   // RemoteIO unit

Audio unit init code:

    AudioComponentDescription filePlayerDesc;
    filePlayerDesc.componentType = kAudioUnitType_Generator;
    filePlayerDesc.componentSubType = kAudioUnitSubType_AudioFilePlayer;
    filePlayerDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
    filePlayerDesc…
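One possible approach (an assumption on my part, not necessarily what the original poster ended up doing) is to leave the file player connected to the RemoteIO unit and observe the rendered samples with a render-notify callback on rioUnit:

    #include <AudioToolbox/AudioToolbox.h>

    // Hypothetical notify callback: fires before and after each render cycle.
    static OSStatus MyRenderNotify(void *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp *inTimeStamp,
                                   UInt32 inBusNumber,
                                   UInt32 inNumberFrames,
                                   AudioBufferList *ioData)
    {
        if (*ioActionFlags & kAudioUnitRenderAction_PostRender) {
            // ioData now holds the samples the file player produced;
            // inspect or modify them here.
        }
        return noErr;
    }

    // rioUnit is the RemoteIO AudioComponentInstance from the question.
    static OSStatus AddNotify(AudioComponentInstance rioUnit)
    {
        return AudioUnitAddRenderNotify(rioUnit, MyRenderNotify, NULL);
    }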

How to convert a wav or caf audio file on iPhone

Submitted by 你。 on 2019-12-12 02:33:20
Question: I am working on an iPhone IM app which supports audio messages. I've tried the caf and wav formats following "Speak Here", but the files are too large to be sent over the internet. So I'm wondering if I can convert either of them to some smaller format, like mp3 or amr. Thank you for your help in advance.

Answer 1: Core Audio has some great built-in data types for just this: Core Audio Data Types. Use iLBC, short for Internet Low Bitrate Codec, to record your audio. Limit your sample rate to 8 kHz. If you insist on …
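A sketch of how such a recording format might be described, assuming 8 kHz mono iLBC as the answer suggests; for a compressed format most fields can be left at zero and completed by Core Audio:

    #include <AudioToolbox/AudioToolbox.h>

    // Describe an 8 kHz mono iLBC stream; zeroed fields are filled in by Core Audio.
    static AudioStreamBasicDescription MakeILBCFormat(void)
    {
        AudioStreamBasicDescription fmt = {0};
        fmt.mFormatID         = kAudioFormatiLBC;
        fmt.mSampleRate       = 8000.0;
        fmt.mChannelsPerFrame = 1;

        // Ask Core Audio to complete the remaining fields (frames per packet, etc.).
        UInt32 size = sizeof(fmt);
        AudioFormatGetProperty(kAudioFormatProperty_FormatInfo, 0, NULL, &size, &fmt);
        return fmt;
    }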

OSX Core MIDI: Calling MIDIPacketListAdd from NSTimer

Submitted by 岁酱吖の on 2019-12-12 01:52:20
Question: I'm sending a string of MIDI packets using MIDIPacketListAdd. I want to change the packets dynamically during playback, so I'm using an NSTimer to add each packet before the scheduled time it should go out. The code works perfectly: the address of the current packet is updated correctly, and I have verified that all the data is correct. However, no MIDI data is sent when MIDIPacketListAdd is called from the timer. I add the first two packets before starting the timer to make sure …
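For reference, a minimal sketch of building a packet list and sending it from a timer callback; outPort and dest are assumed to be a valid MIDIPortRef and MIDIEndpointRef created elsewhere:

    #include <CoreMIDI/CoreMIDI.h>

    // Build a one-packet list (a note-on here) and send it immediately.
    static void SendNoteOn(MIDIPortRef outPort, MIDIEndpointRef dest)
    {
        Byte noteOn[3] = { 0x90, 60, 100 };   // channel 1, middle C, velocity 100

        Byte buffer[1024];
        MIDIPacketList *packetList = (MIDIPacketList *)buffer;
        MIDIPacket *packet = MIDIPacketListInit(packetList);

        // Timestamp 0 means "now"; a future host time schedules the packet instead.
        packet = MIDIPacketListAdd(packetList, sizeof(buffer), packet,
                                   0, sizeof(noteOn), noteOn);
        if (packet != NULL) {
            MIDISend(outPort, dest, packetList);
        }
    }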

Recording audio output only from the iPhone speaker, excluding the microphone

Submitted by ⅰ亾dé卋堺 on 2019-12-12 00:03:06
Question: I am trying to record the sound from the iPhone speaker. I am able to do that, but I am unable to keep the mic input out of the recorded output. I tried sample code available on different websites with no luck. The sample I used does the recording with audio units. I need to know whether there is any audio unit property to set the mic input volume to zero. Beyond that, I gathered from other posts that Audio Queue Services can do this for me. Can anyone point me to sample code for the …
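One thing worth trying (an assumption on my part, not a confirmed answer from the thread) is to disable the RemoteIO unit's input bus entirely, so the microphone never feeds the unit at all; rioUnit is assumed to be the RemoteIO instance:

    #include <AudioToolbox/AudioToolbox.h>

    // Disable input (bus 1) and keep output (bus 0) enabled on RemoteIO.
    static OSStatus DisableMicInput(AudioComponentInstance rioUnit)
    {
        UInt32 off = 0;
        UInt32 on  = 1;

        OSStatus err = AudioUnitSetProperty(rioUnit,
                                            kAudioOutputUnitProperty_EnableIO,
                                            kAudioUnitScope_Input,
                                            1,                  // input bus
                                            &off, sizeof(off));
        if (err != noErr) return err;

        return AudioUnitSetProperty(rioUnit,
                                    kAudioOutputUnitProperty_EnableIO,
                                    kAudioUnitScope_Output,
                                    0,                  // output bus
                                    &on, sizeof(on));
    }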

General param error retrieving record format with AudioQueueGetProperty

Submitted by 和自甴很熟 on 2019-12-11 18:34:10
Question: I am getting a -50 (general param error) from a call to AudioQueueGetProperty. Please help me, as it has been several months since I've touched Xcode or any iPhone work. This is likely a simple goof on my part, but I cannot resolve it. My code leading to the -50:

    // Set up the format
    AudioStreamBasicDescription recordFormat;
    memset(&recordFormat, 0, sizeof(recordFormat));
    recordFormat.mFormatID = kAudioFormatMPEG4AAC;
    recordFormat.mChannelsPerFrame = 2;
    CCGetDefaultInputDeviceSampleRate(…
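For comparison, a sketch of a correct call reading the queue's stream description back after the queue has been created; the size argument must be passed by address and the queue must already exist. Names here are hypothetical, and this is only one possible shape of the problem, not a confirmed diagnosis:

    #include <AudioToolbox/AudioToolbox.h>

    // After AudioQueueNewInput has created the queue, read back the filled-in format.
    static OSStatus GetQueueFormat(AudioQueueRef queue,
                                   AudioStreamBasicDescription *outFormat)
    {
        UInt32 size = sizeof(*outFormat);
        return AudioQueueGetProperty(queue,
                                     kAudioQueueProperty_StreamDescription,
                                     outFormat, &size);
    }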

Interpreting AudioBuffer.mData to display an audio visualization

Submitted by 馋奶兔 on 2019-12-11 18:31:58
Question: I am trying to process audio data in real time so that I can display an on-screen spectrum analyzer/visualization based on sound input from the microphone. I am using AVFoundation's AVCaptureAudioDataOutputSampleBufferDelegate to capture the audio data, which triggers the delegate function captureOutput. Function below:

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        autoreleasepool {
            guard captureOutput != nil…
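In case it helps, a C-level sketch of getting at the raw samples inside the CMSampleBuffer delivered to that delegate (the same CoreMedia call is reachable from Swift); the function name is hypothetical and the sketch assumes the data fits in a single AudioBuffer:

    #include <CoreMedia/CoreMedia.h>

    // Extract the AudioBufferList from a sample buffer so mData can be inspected.
    static void InspectSamples(CMSampleBufferRef sampleBuffer)
    {
        AudioBufferList bufferList;
        CMBlockBufferRef blockBuffer = NULL;

        OSStatus err = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
            sampleBuffer,
            NULL,                       // size needed (not required here)
            &bufferList,
            sizeof(bufferList),
            NULL, NULL,                 // default allocators
            kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
            &blockBuffer);
        if (err != noErr) return;

        // bufferList.mBuffers[0].mData now points at the raw samples;
        // hand them to an FFT for the spectrum display.

        if (blockBuffer) CFRelease(blockBuffer);
    }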

OpenAL alSourceQueueBuffers & alSourceUnqueueBuffers

Submitted by ♀尐吖头ヾ on 2019-12-11 17:46:43
Question: Everyone, I have a problem with the alSourceUnqueueBuffers API from the OpenAL library. My problem is as follows: 1. I play PCM music through a streaming mechanism. 2. The application queues up one or more buffer names using alSourceQueueBuffers. When a buffer has been processed, I want to refill it with new audio data in my function getSourceState, but when I call alSourceUnqueueBuffers it returns the error AL_INVALID_OPERATION. I do this as the document about the …
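For reference, a sketch of the usual OpenAL streaming pattern, where only buffers the source reports as processed are unqueued and refilled; whether this resolves the AL_INVALID_OPERATION depends on the rest of the code, and FillBuffer is a hypothetical helper:

    #include <OpenAL/al.h>   // <AL/al.h> on some platforms

    // Refill only buffers the source has finished playing.
    static void RefillProcessedBuffers(ALuint source)
    {
        ALint processed = 0;
        alGetSourcei(source, AL_BUFFERS_PROCESSED, &processed);

        while (processed-- > 0) {
            ALuint buffer;
            alSourceUnqueueBuffers(source, 1, &buffer);

            // FillBuffer would call alBufferData with the next chunk of PCM data.
            // FillBuffer(buffer);

            alSourceQueueBuffers(source, 1, &buffer);
        }
    }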

Audio units dynamic registration

Submitted by 血红的双手。 on 2019-12-11 16:49:41
Question: We have developed a custom audio unit and an audio-unit hosting application. We are trying to register the custom audio unit dynamically from the application. The code snippet below is used to register the audio unit dynamically (this snippet is mentioned in Apple Technical Note TN2247):

    #include <AudioUnit/AudioComponent.h>

    extern AudioComponentPlugInInterface* MyExampleAUFactoryFunction(const AudioComponentDescription *inDesc);

    OSStatus RegisterMyExampleAudioUnit() {
        // fill out …
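The body of that function presumably ends up calling AudioComponentRegister; a sketch of what that might look like, with placeholder type/subtype/manufacturer codes and a placeholder name rather than the actual values from TN2247:

    #include <AudioUnit/AudioComponent.h>

    extern AudioComponentPlugInInterface* MyExampleAUFactoryFunction(const AudioComponentDescription *inDesc);

    OSStatus RegisterMyExampleAudioUnit(void)
    {
        // Placeholder description: a custom effect unit with made-up codes.
        AudioComponentDescription desc = {0};
        desc.componentType         = kAudioUnitType_Effect;
        desc.componentSubType      = 'myau';
        desc.componentManufacturer = 'myco';

        AudioComponent comp = AudioComponentRegister(&desc,
                                                     CFSTR("MyCo: MyExample AudioUnit"),
                                                     0x00010000,    // version 1.0.0
                                                     MyExampleAUFactoryFunction);
        return (comp != NULL) ? noErr : -1;
    }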