core-audio

How to select external microphone

Submitted by 前提是你 on 2021-02-18 08:13:27
Question: I've successfully written a simple recording app for iOS that uses AVAudioRecorder. So far it works with either the internal microphone or an external microphone plugged into the headphone jack. How do I select an audio source that is connected through the USB (Lightning) port? Do I have to dive into Core Audio? Specifically, I'm trying to connect an Apogee Electronics ONE USB audio interface. Answer 1: Using AVAudioSession, get the availableInputs. The return value is an array of…
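A minimal sketch of the answer's approach: enumerate `availableInputs` on the shared AVAudioSession and prefer a port whose type is `.usbAudio`. The function name and simplified error handling are illustrative.

```swift
import AVFoundation

// Sketch: route recording input from a USB audio interface (e.g. an Apogee ONE)
// by picking the matching port from availableInputs. Assumes the session has
// already been granted microphone permission.
func selectUSBInput() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [])
    try session.setActive(true)

    // availableInputs is [AVAudioSessionPortDescription]? — one entry per input port.
    if let usb = session.availableInputs?.first(where: { $0.portType == .usbAudio }) {
        // Ask the session to route input from the USB interface.
        try session.setPreferredInput(usb)
    }
}
```

No Core Audio is needed for routing alone; `setPreferredInput` is a hint, so checking `session.currentRoute` afterwards confirms the switch actually happened.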

Loop AVMutableCompositionTrack

Submitted by 六眼飞鱼酱① on 2021-02-17 17:27:29
Question: I have two audio tracks that I combine with one another like this: AVMutableComposition *composition = [[AVMutableComposition alloc] init]; AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid]; [compositionAudioTrack setPreferredVolume:1.0]; AVAsset *avAsset = [AVURLAsset URLAssetWithURL:originalContentURL options:nil]; AVAssetTrack *clipAudioTrack = [[avAsset tracksWithMediaType…
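The question's code builds a composition track from one asset; the usual way to loop such a track is to insert the asset's time range repeatedly, back to back. A sketch under that assumption (`audioURL` and `loopCount` are placeholders):

```swift
import AVFoundation

// Sketch: loop one audio asset inside an AVMutableComposition by inserting
// its full time range loopCount times, each insertion starting where the
// previous one ended.
func makeLoopedComposition(audioURL: URL, loopCount: Int) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    let asset = AVURLAsset(url: audioURL)
    guard let sourceTrack = asset.tracks(withMediaType: .audio).first,
          let compositionTrack = composition.addMutableTrack(
              withMediaType: .audio,
              preferredTrackID: kCMPersistentTrackID_Invalid) else {
        throw NSError(domain: "Looper", code: -1)
    }
    let fullRange = CMTimeRange(start: .zero, duration: asset.duration)
    var cursor = CMTime.zero
    for _ in 0..<loopCount {
        try compositionTrack.insertTimeRange(fullRange, of: sourceTrack, at: cursor)
        cursor = CMTimeAdd(cursor, asset.duration)
    }
    return composition
}
```

The resulting composition can then be played with AVPlayer or exported with AVAssetExportSession like any other AVAsset.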

iOS: No output when AUGraphConnectNodeInput is set

Submitted by 馋奶兔 on 2021-02-11 14:12:50
Question: I've been stuck with AUGraph for a while now and would really appreciate help resolving my problem. What I am trying to do right now is play data (bytes) coming from UDP. I have successfully worked out how to play the data using an AUGraph, but I can't figure out how to change its playback speed. My current scenario is to get data from UDP and pass it through converter -> newTimePitch -> converter -> ioUnit. Converters: the converters convert the 48000 Hz ASBD to the format the timePitch unit requires and then convert it back to…
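Given the chain above, playback speed on the AUNewTimePitch unit is normally changed by setting its rate parameter rather than reconfiguring the graph. A sketch, assuming `graph` and `timePitchNode` come from the already-built chain:

```swift
import AudioToolbox

// Sketch: change playback speed on an AUNewTimePitch node inside a running
// AUGraph. kNewTimePitchParam_Rate is the speed multiplier: 1.0 = normal,
// 2.0 = double speed (pitch is preserved by the unit).
func setPlaybackRate(_ rate: Float32, graph: AUGraph, timePitchNode: AUNode) {
    var timePitchUnit: AudioUnit?
    // Look up the AudioUnit instance backing the graph node.
    AUGraphNodeInfo(graph, timePitchNode, nil, &timePitchUnit)
    if let unit = timePitchUnit {
        AudioUnitSetParameter(unit,
                              kNewTimePitchParam_Rate,
                              kAudioUnitScope_Global,
                              0,       // element
                              rate,
                              0)       // inBufferOffsetInFrames
    }
}
```

Because the rate is a parameter (not a property), it can be adjusted on the fly without stopping or reinitializing the graph.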

Popping noise between AudioQueueBuffers

Submitted by 99封情书 on 2021-02-10 05:26:11
Question: I'm trying to play a pure sine-wave tone using Core Audio AudioQueues (Swift 3). It plays nicely, but I get popping noises every time my AudioQueueOutputCallback is invoked to fill a new buffer with audio data. My AudioStreamer class looks like: let kNumberBuffers = 3 protocol AudioStreamerDelegate { func requestAudioData() -> [Float] } let sampleRate = 48000.0 let bufferSize = Int(sampleRate) / 50 let bufferByteSize = UInt32(bufferSize * MemoryLayout<Float>.size) // 20 ms of audio class…
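Pops at buffer boundaries are the classic symptom of restarting the sine's phase at zero in each callback, which creates a discontinuity between buffers. A sketch of the usual fix, keeping the phase as persistent state so each buffer continues exactly where the previous one ended (class and parameter names are illustrative):

```swift
import Foundation

// Sketch: phase-continuous sine generation for an AudioQueue output callback.
// The phase survives across calls, so consecutive buffers join seamlessly.
final class SineGenerator {
    private var phase: Double = 0
    let sampleRate: Double
    let frequency: Double

    init(sampleRate: Double = 48_000, frequency: Double = 440) {
        self.sampleRate = sampleRate
        self.frequency = frequency
    }

    // Call from requestAudioData() / the AudioQueueOutputCallback.
    func nextBuffer(frameCount: Int) -> [Float] {
        let increment = 2.0 * .pi * frequency / sampleRate
        var samples = [Float](repeating: 0, count: frameCount)
        for i in 0..<frameCount {
            samples[i] = Float(sin(phase))
            phase += increment
            if phase >= 2.0 * .pi { phase -= 2.0 * .pi }  // keep phase small for precision
        }
        return samples
    }
}
```

If the delegate's `requestAudioData()` instead computes each buffer from sample index 0, every 20 ms buffer starts mid-cycle relative to the last one, which is audible as a 50 Hz click train.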

Play the sound from the mic using an AUGraph

Submitted by 余生颓废 on 2021-02-09 15:28:22
Question: When I use an AUGraph to play sound from the mic, I have a problem: on the device (iPhone 3G) I can only hear sound from the right side of the headset, while in the simulator it works well and I hear both sides. Here is the code I use to connect the input to the output: AUGraphConnectNodeInput(auGraph, remoteIONode, 1, remoteIONode, 0); Can someone help me? Thanks! Answer 1: The output is two-channel and the input is nominally two-channel, but I found that the data from the mic is single…
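The answer's diagnosis is a mono mic feeding a stereo output, so the one real channel lands on a single side. One commonly used fix (a sketch, not the poster's code) is to declare a mono stream format on both the mic side and the speaker side of the RemoteIO unit and let its internal converter duplicate the mono signal into both output channels; `ioUnit` is assumed to be the AudioUnit behind `remoteIONode`:

```swift
import AudioToolbox

// Sketch: set a 16-bit mono PCM format on RemoteIO so the single mic channel
// is rendered to both speakers. Bus 1 = mic (read on the output scope),
// bus 0 = speaker (written on the input scope).
func setMonoFormats(on ioUnit: AudioUnit, sampleRate: Double = 44_100) {
    var mono = AudioStreamBasicDescription(
        mSampleRate: sampleRate,
        mFormatID: kAudioFormatLinearPCM,
        mFormatFlags: kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked,
        mBytesPerPacket: 2,
        mFramesPerPacket: 1,
        mBytesPerFrame: 2,
        mChannelsPerFrame: 1,   // mono: one channel, duplicated on output
        mBitsPerChannel: 16,
        mReserved: 0)
    let size = UInt32(MemoryLayout<AudioStreamBasicDescription>.size)
    AudioUnitSetProperty(ioUnit, kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Output, 1, &mono, size)
    AudioUnitSetProperty(ioUnit, kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Input, 0, &mono, size)
}
```

The simulator masks the problem because the Mac's input device is already stereo, which is why the bug only appears on hardware.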

Connecting AVAudioSourceNode to AVAudioSinkNode does not work

Submitted by …衆ロ難τιáo~ on 2021-02-08 08:26:19
Question: Context: I am writing a signal interpreter using AVAudioEngine that will analyse microphone input. During development, I want to use a default input buffer so I don't have to make noises into the microphone to test my changes. I am developing using Catalyst. Problem: I am using AVAudioSinkNode to get the sound buffer (its performance is allegedly better than using .installTap). I am using (a subclass of) AVAudioSourceNode to generate a sine wave. When I connect these two together, I expect the…
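One workaround for deterministic, mic-free testing (a sketch, not the poster's setup) is to skip the sink node entirely and run the engine in offline manual rendering mode, pulling buffers from the source node yourself. Frequency and format values below are illustrative:

```swift
import AVFoundation

// Sketch: render a 440 Hz sine from an AVAudioSourceNode into a PCM buffer
// with no audio hardware involved, using AVAudioEngine's offline manual
// rendering mode. The returned buffer can be fed to the analysis code.
func renderTestSine() throws -> AVAudioPCMBuffer {
    let engine = AVAudioEngine()
    let format = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 1)!

    var phase = 0.0
    let source = AVAudioSourceNode { _, _, frameCount, audioBufferList -> OSStatus in
        let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
        let increment = 2.0 * .pi * 440.0 / 48_000
        for frame in 0..<Int(frameCount) {
            let value = Float(sin(phase))
            phase += increment
            for buffer in buffers {
                buffer.mData?.assumingMemoryBound(to: Float.self)[frame] = value
            }
        }
        return noErr
    }

    engine.attach(source)
    // Manual rendering must be enabled while the engine is stopped.
    try engine.enableManualRenderingMode(.offline, format: format,
                                         maximumFrameCount: 4096)
    engine.connect(source, to: engine.mainMixerNode, format: format)
    try engine.start()

    let buffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                                  frameCapacity: 4096)!
    _ = try engine.renderOffline(4096, to: buffer)
    return buffer
}
```

This sidesteps the source-to-sink wiring question: AVAudioSinkNode is designed to sit on the engine's input chain, so a source node feeding the mixer plus manual rendering (or a tap) is the more conventional route for synthetic test input.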