core-audio

Playing WAV data with AVAudioEngine

…衆ロ難τιáo~ submitted on 2019-12-10 22:17:33
Question: Currently, I'm getting an EXC_BAD_ACCESS error on the audio thread, and I'm trying to deduce what is going wrong. When converting .wav file data from Data to an AVAudioPCMBuffer, do I need to strip the RIFF header first? import AVFoundation public class Player : NSObject { let engine = AVAudioEngine() public override init() { super.init() do { let _ = engine.mainMixerNode try engine.start() } catch { print("Player error: \(error)") } } @objc public func play(_ data: Data) { let format = …
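Yes, a .wav file is a RIFF container, so the raw Data begins with a header and chunk list, not samples; copying it wholesale into a PCM buffer plays garbage and can overrun. A minimal sketch of locating the PCM payload by walking the RIFF sub-chunks (the function name and layout are illustrative, not part of any Apple API):

```c
#include <stdint.h>
#include <string.h>
#include <stddef.h>

/* Returns the byte offset of the first sample in a WAV blob, or -1 on
   failure, and reports the payload size via data_len. Walking the chunks
   (rather than assuming a fixed 44-byte header) also handles files with
   extra chunks such as LIST between "fmt " and "data". */
static long wav_data_offset(const uint8_t *buf, size_t len, uint32_t *data_len)
{
    if (len < 12 || memcmp(buf, "RIFF", 4) != 0 || memcmp(buf + 8, "WAVE", 4) != 0)
        return -1;                            /* not a RIFF/WAVE file */
    size_t pos = 12;
    while (pos + 8 <= len) {
        /* chunk size is a little-endian uint32 after the 4-byte chunk id */
        uint32_t sz = (uint32_t)buf[pos+4] | ((uint32_t)buf[pos+5] << 8)
                    | ((uint32_t)buf[pos+6] << 16) | ((uint32_t)buf[pos+7] << 24);
        if (memcmp(buf + pos, "data", 4) == 0) {
            *data_len = sz;
            return (long)(pos + 8);           /* samples start after the id+size */
        }
        pos += 8 + sz + (sz & 1);             /* chunks are word-aligned */
    }
    return -1;
}
```

Only the bytes from that offset onward should be copied into the buffer's channel data, and the frame length must be derived from the "data" chunk size, not from the Data's total count.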

AVAudioPCMBuffer built programmatically, not playing back in stereo

眉间皱痕 submitted on 2019-12-10 21:55:32
Question: I'm trying to fill an AVAudioPCMBuffer programmatically in Swift to build a metronome. This is the first real app I'm trying to build, so it's also my first audio app. Right now I'm experimenting with different frameworks and methods of getting the metronome looping accurately. I'm trying to build an AVAudioPCMBuffer with the length of a measure/bar so that I can use the .Loops option of the AVAudioPlayerNode's scheduleBuffer method. I start by loading my file (2 ch, 44100 Hz, Float32, non …
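With a non-interleaved stereo format, the buffer's floatChannelData exposes one sample array per channel, and a common cause of mono (or one-sided) playback is writing only channel 0. A hedged sketch of the fill loop in plain C, with illustrative names:

```c
#include <stddef.h>

/* Copy a mono click/source signal into every channel of a non-interleaved
   buffer, so both left and right actually receive samples. */
static void fill_frames(float **channels, int channel_count,
                        size_t frame_count, const float *mono_source)
{
    for (size_t frame = 0; frame < frame_count; frame++)
        for (int ch = 0; ch < channel_count; ch++)
            channels[ch][frame] = mono_source[frame];  /* write L and R */
}
```

The same shape applies in Swift: loop over `0..<Int(buffer.format.channelCount)` and write into `buffer.floatChannelData![ch]`, then set `buffer.frameLength` to the number of frames written.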

iOS sine wave generation - audible clicking

喜夏-厌秋 submitted on 2019-12-10 21:47:05
Question: I am in the process of creating a synthesiser for iOS. After playing around and attempting to learn Core Audio, I have encountered a problem that I cannot get my head around. My sine wave makes a clicking noise at regular intervals, which I'm guessing is related to the phase. I have looked at several guides and books on the subject, and all suggest that I am doing it correctly. If anybody would be so kind as to look at my code for me it would be greatly appreciated. static OSStatus renderInput …

AudioUnitInitialize returns -10851 (kAudioUnitErr_InvalidPropertyValue)

谁都会走 submitted on 2019-12-10 21:17:47
Question: Suppose the code is: ... status = AudioUnitSetProperty( unit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, element, &format, sizeof(AudioStreamBasicDescription)); ... status = AudioUnitInitialize(unit); The error manifests in AudioUnitInitialize returning kAudioUnitErr_InvalidPropertyValue and the following message being printed in the debugger console: [pool] <aurioc> 806: failed: -10851 (enable 2, outf< 2 ch, 48000 Hz, Int16, inter> inf< 2 ch, 0 Hz, Float32, non-inter>) If you've …
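The log line is the key: "inf< 2 ch, 0 Hz, Float32, non-inter>" says one side of the I/O unit still has a 0 Hz sample rate, i.e. the stream format was never applied to that scope/element (a frequent slip is setting it on the wrong combination, e.g. on element 0 when input, element 1, was the one enabled). A stand-in struct and check that mirror the failing precondition; this is not Apple's API, just an illustration of what AudioUnitInitialize is rejecting:

```c
/* Stand-in for AudioStreamBasicDescription, reduced to the fields visible
   in the -10851 log message. */
struct stream_format {
    double   sample_rate;
    unsigned channels;
    unsigned bits_per_channel;
};

/* Returns 1 if a unit with these two client-facing formats could plausibly
   initialize, 0 if not: both sides need a non-zero sample rate and channel
   count, which is exactly what the quoted log shows is missing. */
static int formats_initializable(struct stream_format outward,
                                 struct stream_format inward)
{
    return outward.sample_rate > 0.0 && inward.sample_rate > 0.0
        && outward.channels > 0 && inward.channels > 0;
}
```

The fix, then, is to set kAudioUnitProperty_StreamFormat with a fully filled-in ASBD on every scope/element the unit has enabled before calling AudioUnitInitialize.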

Audioqueue callback not being called

爷,独闯天下 submitted on 2019-12-10 21:07:28
Question: So, basically I want to play some audio files (mp3 and caf mostly). But the callback never gets called; it only fires when I invoke it myself to prime the queue. Here's my data struct: struct AQPlayerState { CAStreamBasicDescription mDataFormat; AudioQueueRef mQueue; AudioQueueBufferRef mBuffers[kBufferNum]; AudioFileID mAudioFile; UInt32 bufferByteSize; SInt64 mCurrentPacket; UInt32 mNumPacketsToRead; AudioStreamPacketDescription *mPacketDescs; bool mIsRunning; }; Here's my callback function: static void …

MIDI MusicDevice AudioUnit: Playing two notes of same pitch, stop one?

心不动则不痛 submitted on 2019-12-10 17:38:40
Question: I am quite a novice when it comes to AudioUnits, so please forgive me if my question is very basic. I am using the MusicDevice AudioUnit to play back some notes. I am using MusicDeviceMIDIEvent to send the note-on and note-off messages. It works well. Sometimes more than one note should sound simultaneously, so I may send two note-on messages in a row. Sometimes these notes happen to have the same pitch. Then when I want to turn off one of the notes, I send a note-off event for this pitch. But …
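A MIDI note-off carries only a channel and a pitch, so two overlapping note-ons with the same pitch on one channel cannot be ended independently; which sounding instance a synth silences is device-dependent. One common workaround is to reserve a distinct channel for each overlapping same-pitch note, so every note-off is unambiguous. A hypothetical allocator sketching that bookkeeping:

```c
/* active[channel][pitch]: 1 while a note is sounding there. */
static int active[16][128];

/* Returns the MIDI channel to send this note-on on (the first channel
   where this pitch is not already sounding), or -1 if all 16 are busy. */
static int note_on(int pitch)
{
    for (int ch = 0; ch < 16; ch++) {
        if (!active[ch][pitch]) {
            active[ch][pitch] = 1;
            return ch;
        }
    }
    return -1;
}

/* Frees the reservation made by note_on, paired with the note-off send. */
static void note_off(int channel, int pitch)
{
    active[channel][pitch] = 0;
}
```

With MusicDeviceMIDIEvent the returned channel would go into the low nibble of the status byte (0x90 | ch for note-on, 0x80 | ch for note-off); channels beyond 0 may need the same program/patch set up first.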

Haskell audio output on OS X?

こ雲淡風輕ζ submitted on 2019-12-10 13:04:23
Question: I'd like to be able to output audio from Haskell. I'm currently using GHC 6.10 on OS X (Snow Leopard). I've tried building the jack library (using JackOSX) and the PortAudio library, but neither of them seemed effective. Is there a relatively simple way to do live audio output from a Haskell program on a Mac? Edit: Clarity Answer 1: I've been using PortAudio successfully. I took some excerpts from my toy program to make a very simple "echo" example, below: (run with headphones; this is a feedback …

Changing setPreferredIOBufferDuration at Runtime results in Core Audio Error -50

故事扮演 submitted on 2019-12-10 12:07:19
Question: I am writing an Audio Unit (remote IO) based app that displays waveforms at a given buffer size. The app initially starts off with a preferred buffer size of 0.0001, which results in very small buffer frame sizes (I think it's 14 frames). Then at runtime I have a UI element that allows switching buffer frame sizes via AVAudioSession's method setPreferredIOBufferDuration:Error:. Here is the code where the first two cases change from a smaller to a larger sized buffer. 3-5 are not specified yet …
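Two things worth keeping in mind here: -50 is paramErr (an invalid parameter), and setPreferredIOBufferDuration is only a hint; the hardware grants a nearby frame count within its own limits, so the render callback must always honor the inNumberFrames it is handed rather than the requested duration. A rough, illustrative model of that rounding (not Apple's exact policy; the clamp values are assumptions):

```c
/* Model of how a preferred IO buffer duration maps to the frame count the
   hardware actually grants: the duration is converted to frames and rounded
   up to a power of two inside assumed hardware limits. */
static unsigned granted_frames(double preferred_seconds, double sample_rate)
{
    double frames = preferred_seconds * sample_rate;
    unsigned pow2 = 1;
    while (pow2 < frames)
        pow2 <<= 1;        /* round up to the next power of two */
    if (pow2 < 16) pow2 = 16;      /* assumed hardware minimum */
    if (pow2 > 4096) pow2 = 4096;  /* assumed hardware maximum */
    return pow2;
}
```

So a request of 0.0001 s at 44.1 kHz asks for about 4.4 frames and comes back as something far larger, which is consistent with the tiny-but-not-requested sizes the question describes; after each change, the actual duration should be read back from the session rather than assumed.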

How to get and save the mixed of multiple audios in to single audio in swift

折月煮酒 submitted on 2019-12-10 11:48:08
Question: I have multiple audio files (more than 3). Using AVAudioEngine and AVAudioMixerNode, I am playing all the audio tracks as a single track. I want to save the mixed audio in the documents directory. Please suggest how to mix the multiple audio files and save the result in the documents directory. Thank you Answer 1: Try this: func mergeAudioFiles(files: [URL], completion: @escaping (_ succeeded: Bool)->()) { let composition = AVMutableComposition() guard let compositionAudioTrack:AVMutableCompositionTrack = …
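An AVAudioEngine player-to-mixer graph renders in real time, so saving the mix means either installing a tap on the mixer and writing its buffers to an AVAudioFile, or mixing offline. The offline arithmetic itself is just summing time-aligned samples and clamping; a hedged sketch of that core (illustrative helper, separate from the AVMutableComposition approach in the answer):

```c
#include <stddef.h>

/* Sum track_count aligned float tracks into out, clamping to [-1, 1] so
   the summed signal cannot clip past full scale. */
static void mix_tracks(const float **tracks, size_t track_count,
                       size_t frame_count, float *out)
{
    for (size_t i = 0; i < frame_count; i++) {
        float sum = 0.0f;
        for (size_t t = 0; t < track_count; t++)
            sum += tracks[t][i];
        out[i] = sum > 1.0f ? 1.0f : (sum < -1.0f ? -1.0f : sum);
    }
}
```

In practice, dividing by the track count (or applying per-track gains) before the clamp gives a cleaner mix than hard clipping.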

iOS App will not die, writes to console & plays sounds after quit

☆樱花仙子☆ submitted on 2019-12-10 10:54:43
Question: My app has some kind of zombie problem. (Not an NSZombie problem. Like, a coming-back-from-the-dead problem.) I first noticed that after a debugging session, when I would go for a run the music on my iPhone would pause every ~7 minutes, and when I'd unlock the device the app name would be flashing red in the status bar as though it had just crashed. Sometimes there would even be phantom sound from the app, like it was still running in the background somehow. Manually quit the app, continue.