core-audio

Tap Mic Input Using AVAudioEngine in Swift

Submitted by 这一生的挚爱 on 2019-11-30 10:53:43
Question: I'm really excited about the new AVAudioEngine. It seems like a good API wrapper around Audio Units. Unfortunately the documentation is so far nonexistent, and I'm having problems getting a simple graph to work. Using the following simple code to set up an audio engine graph, the tap block is never called. It mimics some of the sample code floating around the web, though that also did not work.

    let inputNode = audioEngine.inputNode
    var error: NSError?
    let bus = 0
    inputNode.installTapOnBus(bus
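
A minimal sketch of a working tap, using the current Swift spellings (installTap(onBus:) replaced the older installTapOnBus). The two usual culprits when the block never fires are an engine that was never started and a format mismatch; an active record-capable AVAudioSession is also assumed here:

    import AVFoundation

    let engine = AVAudioEngine()
    let input = engine.inputNode
    let bus = 0
    // Using the node's own input format avoids a mismatch that can silence the tap.
    let format = input.inputFormat(forBus: bus)
    input.installTap(onBus: bus, bufferSize: 2048, format: format) { buffer, time in
        print("tapped \(buffer.frameLength) frames at \(time.sampleTime)")
    }
    engine.prepare()
    do {
        try engine.start()  // the tap block only fires once the engine is running
    } catch {
        print("engine failed to start: \(error)")
    }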

ios - mixing midi files, each with own sound font

Submitted by 这一生的挚爱 on 2019-11-30 10:07:20
I'm looking for a way to mix two or more MIDI files, each with its own sound font file. I've found the following code for one file and tried creating multiple music players, but I guess that isn't the right approach. I also get a weird pop sound every second. Is there any other way, perhaps without the MusicPlayer and MusicSequence methods, using only Audio Units? Here's the code I found in another thread:

    -(void) playMusic:(NSString*) name {
        NSString *presetURLPath = [[NSBundle mainBundle] pathForResource:@"GortsMiniPianoJ1" ofType:@"SF2"];
        NSURL * presetURL = [NSURL fileURLWithPath
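
One alternative that keeps each file paired with its own sound font, without building an AUGraph by hand, is one AVMIDIPlayer per file. A sketch (the resource names besides GortsMiniPianoJ1 are hypothetical):

    import AVFoundation

    // Each player binds one MIDI file to one sound font.
    func makePlayer(midi: String, bank: String) throws -> AVMIDIPlayer {
        guard let midiURL = Bundle.main.url(forResource: midi, withExtension: "mid"),
              let bankURL = Bundle.main.url(forResource: bank, withExtension: "sf2") else {
            throw NSError(domain: "MIDIMix", code: -1)
        }
        return try AVMIDIPlayer(contentsOf: midiURL, soundBankURL: bankURL)
    }

    let playerA = try makePlayer(midi: "songA", bank: "GortsMiniPianoJ1")
    let playerB = try makePlayer(midi: "songB", bank: "StringsFont")
    playerA.prepareToPlay()
    playerB.prepareToPlay()
    playerA.play(nil)
    playerB.play(nil)

Note that two back-to-back play(nil) calls are not sample-accurately synchronized; for tight sync, the heavier option remains one MusicSequence per AUSampler inside a single AUGraph.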

ExtAudioFileWrite to m4a/aac failing on dual-core devices (ipad 2, iphone 4s)

Submitted by 旧巷老猫 on 2019-11-30 09:49:36
I wrote a loop to encode PCM audio data generated by my app to AAC using Extended Audio File Services. The encoding takes place in a background thread synchronously, not in real time. The encoding works flawlessly on iPad 1 and iPhone 3GS/4 for both iOS 4 and 5. However, on dual-core devices (iPhone 4S, iPad 2) the third call to ExtAudioFileWrite crashes the encoding thread with no stack trace and no error code. Here is the code in question:

The data formats:

    AudioStreamBasicDescription AUCanonicalASBD(Float64 sampleRate, UInt32 channel){
        AudioStreamBasicDescription audioFormat;
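
One workaround commonly reported for this class of device-specific crash (an assumption here, not verified against this exact code) is forcing the software AAC codec on the ExtAudioFile before writing, since the hardware encoder on those devices did not tolerate being driven from arbitrary threads. A sketch, with a hypothetical output path:

    import AudioToolbox

    var dstFormat = AudioStreamBasicDescription()
    dstFormat.mSampleRate = 44100
    dstFormat.mFormatID = kAudioFormatMPEG4AAC
    dstFormat.mChannelsPerFrame = 2

    var extFile: ExtAudioFileRef?
    let url = URL(fileURLWithPath: "/tmp/out.m4a") as CFURL  // hypothetical path
    var status = ExtAudioFileCreateWithURL(url, kAudioFileM4AType, &dstFormat, nil,
                                           AudioFileFlags.eraseFile.rawValue, &extFile)
    guard status == noErr, let file = extFile else { fatalError("create failed: \(status)") }

    // Ask for the software encoder; the hardware AAC codec was the usual suspect
    // in crashes that appeared only on particular device generations.
    var codec = UInt32(kAppleSoftwareAudioCodecManufacturer)
    status = ExtAudioFileSetProperty(file, kExtAudioFileProperty_CodecManufacturer,
                                     UInt32(MemoryLayout<UInt32>.size), &codec)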

How do I connect an AudioFilePlayer AudioUnit to a 3DMixer?

Submitted by 守給你的承諾、 on 2019-11-30 09:26:14
I am trying to connect an AudioFilePlayer AudioUnit to an AU3DMixerEmbedded Audio Unit, but I'm having no success. Here's what I'm doing:

1. Create an AUGraph with NewAUGraph()
2. Open the graph
3. Initialize the graph
4. Add 3 nodes:
   - outputNode: kAudioUnitSubType_RemoteIO
   - mixerNode: kAudioUnitSubType_AU3DMixerEmbedded
   - filePlayerNode: kAudioUnitSubType_AudioFilePlayer
5. Connect the nodes: filePlayerNode -> mixerNode, mixerNode -> outputNode
6. Configure the filePlayer Audio Unit to play the required file
7. Start the graph

This doesn't work: it balks at AUGraphInitialize with error 10868 (kAudioUnitErr
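
Error -10868 is kAudioUnitErr_FormatNotSupported, and a frequently cited cause with the embedded 3D mixer is that its inputs expect mono streams while the file player produces stereo. A sketch of forcing a mono format on the mixer's input bus, to be applied (after fetching the AudioUnit from its node with AUGraphNodeInfo) before AUGraphInitialize; this is an assumption about the failure, not a confirmed diagnosis of this graph:

    import AudioToolbox

    // Sets a mono, non-interleaved 32-bit float format on one mixer input bus.
    func forceMonoInput(on mixer: AudioUnit, bus: AudioUnitElement) -> OSStatus {
        var mono = AudioStreamBasicDescription(
            mSampleRate: 44100,
            mFormatID: kAudioFormatLinearPCM,
            mFormatFlags: kAudioFormatFlagsNativeFloatPacked | kAudioFormatFlagIsNonInterleaved,
            mBytesPerPacket: 4,
            mFramesPerPacket: 1,
            mBytesPerFrame: 4,
            mChannelsPerFrame: 1,
            mBitsPerChannel: 32,
            mReserved: 0)
        return AudioUnitSetProperty(mixer, kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Input, bus, &mono,
                                    UInt32(MemoryLayout<AudioStreamBasicDescription>.size))
    }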

Combine two .wav files in iPhone using Objective C

Submitted by 我们两清 on 2019-11-30 07:46:54
I want to combine two .wav recording files. Can anyone help me figure out how to achieve this? I tried concatenating the data, but the headers are causing problems. Can we combine the files the same way we would combine raw wave data? This is how I am doing the combine:

    NSMutableData *datas = [[NSMutableData alloc] init];
    NSData *data1 = [NSData dataWithContentsOfFile:[recordedTmpFile1 path]];
    NSData *data2 = [NSData dataWithContentsOfFile:[recordedTmpFile2 path]];
    NSLog(@"file1 size : %lu", (unsigned long)[data1 length]);
    NSLog(@"file2 size : %lu", (unsigned long)[data2 length]);
    [datas appendData:data1];
    [datas appendData:data2];
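
Byte-level appending fails because each WAV carries its own RIFF header with absolute chunk sizes, so the second header lands in the middle of the audio. A sketch that sidesteps header arithmetic by decoding and re-writing through AVAudioFile (newer than this Objective-C question, and assuming both recordings share a sample rate and channel count):

    import AVFoundation

    func concatenateWAVs(_ inputs: [URL], to output: URL) throws {
        let first = try AVAudioFile(forReading: inputs[0])
        // The output inherits the first file's format; the header stays consistent for us.
        let outFile = try AVAudioFile(forWriting: output, settings: first.fileFormat.settings)
        for url in inputs {
            let inFile = try AVAudioFile(forReading: url)
            guard let buffer = AVAudioPCMBuffer(pcmFormat: inFile.processingFormat,
                                                frameCapacity: AVAudioFrameCount(inFile.length)) else {
                continue
            }
            try inFile.read(into: buffer)
            try outFile.write(from: buffer)
        }
    }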

iPhone SDK:Saving a streamed audio file to Documents folder

Submitted by 大城市里の小女人 on 2019-11-29 23:36:35
Question: I want to save an audio file to the Documents folder of my application while one of the classes in my app is streaming it. How can I do this? Is it possible to save the streamed audio directly as an MP3 (if the file being streamed is an MP3), or do I have to use CAF? Thanks.

Edit: What if I am running the save on another thread and the user exits the application? I know an app cannot run in the background on an iPhone, but is there any way I can stop the download and remove the partially
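
A sketch of the saving side, assuming the streaming class hands over raw network chunks: append each chunk to a file in Documents with FileHandle. If the stream is MP3, the bytes on the wire are already a valid MP3 bitstream, so no transcoding to CAF is needed. The file name and class name here are hypothetical:

    import Foundation

    final class StreamSaver {
        let fileURL: URL
        private let handle: FileHandle

        init(fileName: String) throws {
            let docs = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
            fileURL = docs.appendingPathComponent(fileName)  // e.g. "stream.mp3"
            FileManager.default.createFile(atPath: fileURL.path, contents: nil)
            handle = try FileHandle(forWritingTo: fileURL)
        }

        // Call from the streamer's data callback as chunks arrive.
        func append(_ chunk: Data) {
            handle.seekToEndOfFile()
            handle.write(chunk)
        }

        // If the user quits mid-stream, discard the incomplete download.
        func discardPartialFile() {
            handle.closeFile()
            try? FileManager.default.removeItem(at: fileURL)
        }
    }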

How to correctly read decoded PCM samples on iOS using AVAssetReader — currently incorrect decoding

Submitted by 我怕爱的太早我们不能终老 on 2019-11-29 23:24:03
I am currently working on an application as part of my Bachelor in Computer Science. The application will correlate data from the iPhone hardware (accelerometer, GPS) with music that is being played. The project is still in its infancy; I have worked on it for only 2 months. Where I am right now, and where I need help, is reading PCM samples from songs in the iTunes library and playing them back using an audio unit. Currently, the implementation I would like working does the following: it chooses a random song from iTunes, reads samples from it when required, and stores in a
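
The usual cause of "incorrect decoding" with AVAssetReader is reading a track with nil outputSettings, which returns the file's native (compressed) packets rather than PCM. A sketch that requests 16-bit interleaved linear PCM explicitly, assuming the asset has at least one audio track:

    import AVFoundation

    func makePCMReader(for asset: AVAsset) throws -> (AVAssetReader, AVAssetReaderTrackOutput) {
        let track = asset.tracks(withMediaType: .audio)[0]
        let settings: [String: Any] = [
            AVFormatIDKey: kAudioFormatLinearPCM,
            AVSampleRateKey: 44_100,
            AVNumberOfChannelsKey: 2,
            AVLinearPCMBitDepthKey: 16,
            AVLinearPCMIsFloatKey: false,
            AVLinearPCMIsBigEndianKey: false,
            AVLinearPCMIsNonInterleaved: false
        ]
        let output = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
        let reader = try AVAssetReader(asset: asset)
        reader.add(output)
        reader.startReading()
        return (reader, output)
    }

From there, output.copyNextSampleBuffer() yields CMSampleBuffers whose bytes can be copied out (e.g. via CMBlockBufferCopyDataBytes) into the ring buffer an audio unit's render callback reads from.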

AVPlayer vs. AVAudioPlayer

Submitted by 心已入冬 on 2019-11-29 20:45:53
The documentation for AVPlayer states the following: "[The] player works equally well with local and remote media files." However, the documentation for AVAudioPlayer states: "Apple recommends that you use this class for audio playback unless you are playing audio captured from a network stream." For the work I am doing I need some of the capabilities of AVAudioPlayer, but all my audio is being streamed. The main thing I need from AVAudioPlayer that AVPlayer does not have is the "playing" property. It is difficult to build a player UI without that property, among others. So what is the
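
A common stand-in for the missing "playing" property, sketched as a convenience extension: a playing AVPlayer has a nonzero rate, so rate (or, on iOS 10+, timeControlStatus) can be derived into a Boolean. This is a workaround, not an official API:

    import AVFoundation

    extension AVPlayer {
        // A nonzero rate with no error is a reasonable proxy for "playing".
        var isPlaying: Bool {
            return rate != 0 && error == nil
        }
    }

For UI updates, key-value observing "rate" (or timeControlStatus) provides the change notifications a play/pause button needs.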

IIR coefficients for peaking EQ, how to pass them to vDSP_deq22?

Submitted by …衆ロ難τιáo~ on 2019-11-29 19:53:01
Question: I have these 6 coefficients for a peaking EQ:

    b0 = 1 + (α ⋅ A)
    b1 = −2 ⋅ ωC
    b2 = 1 − (α ⋅ A)
    a0 = 1 + (α / A)
    a1 = −2 ⋅ ωC
    a2 = 1 − (α / A)

with these intermediate variables:

    ωc = 2 ⋅ π ⋅ fc / fs
    ωS = sin(ωc)
    ωC = cos(ωc)
    A = sqrt(10^(G/20))
    α = ωS / (2Q)

The documentation for vDSP_deq22() states that "5 single-precision inputs, filter coefficients" should be passed, but I have 6 coefficients! Also, in what order do I pass them to vDSP_deq22()?

Update (17/05): I recommend everyone to use my DSP
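
vDSP_deq22 assumes the biquad has been normalized so that a0 = 1, which is why only five coefficients remain: {b0/a0, b1/a0, b2/a0, a1/a0, a2/a0}, in that order. A sketch (note that deq22 treats the first two elements of the input and output buffers as filter history, so the buffers carry two extra samples):

    import Accelerate

    func peakingEQ(_ input: [Float],
                   b0: Float, b1: Float, b2: Float,
                   a0: Float, a1: Float, a2: Float) -> [Float] {
        // Five coefficients, normalized by a0, in vDSP_deq22's expected order.
        var coeffs: [Float] = [b0 / a0, b1 / a0, b2 / a0, a1 / a0, a2 / a0]
        // Two leading zeros stand in for the (empty) history of a fresh filter.
        var x = [Float](repeating: 0, count: 2) + input
        var y = [Float](repeating: 0, count: x.count)
        vDSP_deq22(&x, 1, &coeffs, &y, 1, vDSP_Length(input.count))
        return Array(y.dropFirst(2))
    }

When filtering a stream buffer by buffer, the last two input and output samples of one call should be carried over as the history of the next call instead of zeros.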

error in audio Unit code -remoteIO for iphone

Submitted by 不羁岁月 on 2019-11-29 18:10:29
I have this code to read buffer samples, but I get a strange Mach-O linker error: the Audio Unit framework couldn't be loaded, so I added AudioToolbox and CoreAudio as I read I should. My code is:

    #define kOutputBus 0
    #define kInputBus 1

    AudioComponentInstance audioUnit;

    @implementation remoteIO

    // callback function:
    static OSStatus recordingCallback(void *inRefCon,
                                      AudioUnitRenderActionFlags *ioActionFlags,
                                      const AudioTimeStamp *inTimeStamp,
                                      UInt32 inBusNumber,
                                      UInt32 inNumberFrames,
                                      AudioBufferList *ioData) {
        AudioBuffer buffer;
        buffer.mNumberChannels = 1;
        buffer.mDataByteSize =
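
On modern SDKs the Audio Unit symbols ship inside AudioToolbox, so linking AudioToolbox (plus CoreAudio for the basic types) and importing it is normally enough; there is no separate AudioUnit framework to add for RemoteIO. A minimal Swift sketch of instantiating the unit once linking is fixed:

    import AudioToolbox

    var desc = AudioComponentDescription(
        componentType: kAudioUnitType_Output,
        componentSubType: kAudioUnitSubType_RemoteIO,
        componentManufacturer: kAudioUnitManufacturer_Apple,
        componentFlags: 0,
        componentFlagsMask: 0)

    var audioUnit: AudioComponentInstance?
    if let component = AudioComponentFindNext(nil, &desc) {
        let status = AudioComponentInstanceNew(component, &audioUnit)
        print("RemoteIO instantiated, status \(status)")  // noErr (0) on success
    }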