core-audio

Calling MusicDeviceMIDIEvent from the audio unit's render thread

Submitted by 不羁岁月 on 2020-08-26 05:54:49
Question: There's one thing I don't understand about MusicDeviceMIDIEvent. In every single example I've ever seen (I searched GitHub and the Apple samples), it was always called from the main thread. Now, in order to use the sample-offset parameter, the documentation states: inOffsetSampleFrame: If you are scheduling the MIDI Event from the audio unit's render thread, then you can supply a sample offset that the audio unit may apply when applying that event in its next audio unit render. This allows you to…
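The inOffsetSampleFrame value only makes sense relative to the render cycle currently in flight: it counts frames from the start of the buffer being rendered. A minimal sketch of the arithmetic (the helper name and the clamping policy are this sketch's own choices, not any Apple API):

```c
#include <stdint.h>

/* Hypothetical helper: given the current render cycle's starting sample time
 * (AudioTimeStamp.mSampleTime) and its frame count, convert the absolute
 * sample time at which a MIDI event should sound into the inOffsetSampleFrame
 * value you would pass to MusicDeviceMIDIEvent from the render thread.
 * Events earlier than the buffer start fire immediately (offset 0); events
 * beyond this buffer are clamped to the last frame here for simplicity -- a
 * real scheduler would hold them for a later render cycle instead. */
uint32_t midi_offset_in_buffer(double buffer_start_sample_time,
                               uint32_t frames_in_buffer,
                               double event_sample_time) {
    double offset = event_sample_time - buffer_start_sample_time;
    if (offset < 0.0)
        return 0;
    if (offset >= (double)frames_in_buffer)
        return frames_in_buffer - 1;
    return (uint32_t)offset;
}
```

Called from the main thread instead, there is no current buffer to offset into, which is why every main-thread example simply passes 0.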

NewTimePitch with Mixer

Submitted by 你离开我真会死。 on 2020-08-25 07:31:40
Question: I have a graph working that is very similar to the example app provided by Apple: https://developer.apple.com/library/ios/samplecode/MixerHost/Listings/Classes_MixerHostAudio_m.html#//apple_ref/doc/uid/DTS40010210-Classes_MixerHostAudio_m-DontLinkElementID_6 My mixerNode is fed custom data (rather than guitar/beats), but the setup is similar. Both buses on the mixer are stereo. I am trying to time-shift the content, but so far I have been unsuccessful. I have tried adding a…

What is interleaved audio? [closed]

Submitted by 房东的猫 on 2020-07-03 09:34:24
Question: (Closed as off-topic for Stack Overflow, 6 years ago.) I see interleaved audio mentioned many times in the Core Audio documents. Can someone explain what this property really means? Answer 1: Generally speaking, if you have 2 channels, let's call them L for left and R for right, and you want to transmit or store 20…
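The distinction the answer is heading toward can be shown in a few lines of C (function name made up for this sketch): interleaved stereo stores frames as L R L R …, while non-interleaved keeps each channel in its own contiguous buffer.

```c
#include <stddef.h>
#include <stdint.h>

/* Pack two separate (non-interleaved) channel buffers into one interleaved
 * buffer: frame i occupies out[2i] (left) and out[2i+1] (right). This is the
 * layout Core Audio describes with kAudioFormatFlagIsNonInterleaved cleared. */
void interleave_stereo(const int16_t *left, const int16_t *right,
                       int16_t *out, size_t frames) {
    for (size_t i = 0; i < frames; i++) {
        out[2 * i]     = left[i];   /* even slots: left channel  */
        out[2 * i + 1] = right[i];  /* odd slots: right channel  */
    }
}
```

So for 3 frames with left = {1, 2, 3} and right = {4, 5, 6}, the interleaved buffer reads {1, 4, 2, 5, 3, 6}.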

Does AUGraph deprecation mean no more audio render callbacks?

Submitted by 北城余情 on 2020-06-12 08:17:30
Question: I have an app with an elaborate render callback that I doubt I could reproduce with AVAudioEngine. Is there any way to use my AUGraph render callback (with multiple buses) with AVAudioEngine? Any sample code? Answer 1: The Audio Unit API is not deprecated, only AUGraph, which is presumably built on top of it. Make connections using AudioUnitSetProperty with kAudioUnitProperty_MakeConnection and an AudioUnitConnection struct. Start and stop your output unit with AudioOutputUnitStart and AudioOutputUnitStop. Set…
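The shape of that connection call can be sketched as follows. The typedefs below are local stand-ins so the sketch compiles anywhere; in a real project you would include <AudioToolbox/AudioToolbox.h>, which supplies AudioUnit and AudioUnitConnection, and the helper function is this sketch's own, not an Apple API.

```c
#include <stdint.h>

/* Stand-in typedefs mirroring the AudioToolbox declarations. */
typedef void *AudioUnit;
typedef struct {
    AudioUnit sourceAudioUnit;    /* unit supplying audio              */
    uint32_t  sourceOutputNumber; /* output bus on the source unit     */
    uint32_t  destInputNumber;    /* input bus on the destination unit */
} AudioUnitConnection;

/* Hypothetical helper that fills the struct you would then pass as:
 *   AudioUnitSetProperty(destUnit, kAudioUnitProperty_MakeConnection,
 *                        kAudioUnitScope_Input, destBus,
 *                        &conn, sizeof(conn));
 * one call per edge that the AUGraph used to manage for you. */
AudioUnitConnection make_connection(AudioUnit source,
                                    uint32_t sourceBus,
                                    uint32_t destBus) {
    AudioUnitConnection conn = { source, sourceBus, destBus };
    return conn;
}
```

Render callbacks themselves are untouched by the AUGraph deprecation: kAudioUnitProperty_SetRenderCallback still installs them directly on a unit.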

How to display a red status bar when the home button is pressed?

Submitted by Deadly on 2020-05-27 01:54:18
Question: How do I display a red status bar when the home button is pressed while my recording app is recording? I checked this question: How to hide the red bar under the iOS's status when recording? However, I use Core Audio and Extended Audio File Services to do the recording, and I am unable to find any proper documentation on doing this programmatically. Any help or pointers would be appreciated. Answer 1: I believe by saying "a red status bar when the home button is pressed" you are asking whether you…

Connection of varispeed with RemoteIO in iOS

Submitted by 邮差的信 on 2020-05-14 03:55:06
Question: I am working with Audio Units to play audio and change its playback speed, since AUGraph is deprecated. I have successfully played audio arriving over UDP via Audio Units, with connections like:

converterUnit -> varispeed -> outConverterUnit -> RemoteIO (Out)

Our playback format is Int16 (PCM), but Varispeed requires Float data, so we use converters around varispeed. Here is my code:

var ioFormat = CAStreamBasicDescription(
    sampleRate: 48000.0,
    numChannels: 1,…
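What the converter units in that chain do is, in essence, map signed 16-bit PCM onto the [-1.0, 1.0) float range that Varispeed expects. A sketch of that mapping (the function name and the divide-by-32768 scaling convention are this sketch's choices; in the real app an AUConverter performs this on the render thread):

```c
#include <stddef.h>
#include <stdint.h>

/* Convert signed 16-bit PCM samples to 32-bit floats in [-1.0, 1.0).
 * INT16_MIN (-32768) maps exactly to -1.0; 0 maps to 0.0. */
void int16_to_float(const int16_t *in, float *out, size_t n) {
    for (size_t i = 0; i < n; i++) {
        out[i] = (float)in[i] / 32768.0f;
    }
}
```

The outConverterUnit performs the inverse mapping so RemoteIO can be fed Int16 again (or RemoteIO can simply be given the float format directly).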

AVAudioSinkNode with non-default, but still device-native sample rates

Submitted by 喜夏-厌秋 on 2020-04-17 07:17:09
Question: I've configured an AVAudioSinkNode attached to AVAudioEngine's inputNode like so:

let sinkNode = AVAudioSinkNode() { (timestamp, frames, audioBufferList) -> OSStatus in
    print("SINK: \(timestamp.pointee.mHostTime) - \(frames) - \(audioBufferList.pointee.mNumberBuffers)")
    return noErr
}
audioEngine.attach(sinkNode)
audioEngine.connect(audioEngine.inputNode, to: sinkNode, format: nil)
audioEngine.prepare()
do {
    try audioEngine.start()
    print("AudioEngine started.")
} catch {
    print("AudioEngine did…