core-audio

Implementing a post-processed low-pass filter using core audio

可紊 submitted on 2019-12-04 16:56:37
I have implemented a rudimentary low-pass filter using a time-based value. This works, but finding the correct time slice is guesswork, and it gives different results for different input audio files. Here is what I have now:

- (void)processDataWithInBuffer:(const int16_t *)buffer outBuffer:(int16_t *)outBuffer sampleCount:(int)len
{
    BOOL positive;
    for (int i = 0; i < len; i++) {
        positive = (buffer[i] >= 0);
        currentFilteredValueOfSampleAmplitude = LOWPASSFILTERTIMESLICE * (float)abs(buffer[i]) + (1.0 - LOWPASSFILTERTIMESLICE) * previousFilteredValueOfSampleAmplitude;
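One way to take the guesswork out of LOWPASSFILTERTIMESLICE is to derive the one-pole smoothing coefficient from a cutoff frequency and the file's sample rate, so the filter behaves the same regardless of the input file. The sketch below is illustrative only; cutoffHz and sampleRate are assumed names, not from the original code.

#include <math.h>

// One-pole low-pass coefficient: alpha = dt / (RC + dt), with RC = 1 / (2 * pi * fc)
static float onePoleCoefficient(float cutoffHz, float sampleRate)
{
    float dt = 1.0f / sampleRate;
    float rc = 1.0f / (2.0f * (float)M_PI * cutoffHz);
    return dt / (rc + dt);
}

// Usage inside the per-sample loop, e.g. a 200 Hz envelope follower at 44.1 kHz:
//   float alpha = onePoleCoefficient(200.0f, 44100.0f);
//   filtered = alpha * fabsf((float)buffer[i]) + (1.0f - alpha) * filtered;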

iPhone Extended Audio File Services, mp3 -> PCM -> mp3

こ雲淡風輕ζ submitted on 2019-12-04 16:30:42
I would like to use the Core Audio Extended Audio File Services framework to read an mp3 file, process it as PCM, then write the modified file back out as an mp3. I am able to convert the mp3 file to PCM, but am NOT able to write the PCM back out as an mp3. I have followed and analyzed Apple's ExtAudioFileConvertTest sample and also cannot get that to work. The failure point is when I set the client format for the output file (set to a canonical PCM type). This fails with error "fmt?" if the output target type is set to mp3. Is it possible to do mp3 -> PCM -> mp3 on the iPhone? If I remove
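The "fmt?" result is kAudioFormatUnsupportedDataFormatError, and on iOS it is expected here: the system ships mp3 decoders but no mp3 encoder, so an ExtAudioFile cannot be asked to write mp3. A common workaround is to write AAC (M4A) instead, which the system encoders do support. A minimal sketch, assuming outputURL is a CFURLRef and 44.1 kHz stereo 16-bit PCM client data:

// Destination format: AAC, since mp3 encoding is not available on iOS
AudioStreamBasicDescription dstFormat = {0};
dstFormat.mSampleRate       = 44100.0;
dstFormat.mFormatID         = kAudioFormatMPEG4AAC;
dstFormat.mChannelsPerFrame = 2;

ExtAudioFileRef outFile = NULL;
OSStatus err = ExtAudioFileCreateWithURL(outputURL, kAudioFileM4AType, &dstFormat,
                                         NULL, kAudioFileFlags_EraseFile, &outFile);

// Client (in-memory) format stays canonical PCM; ExtAudioFile builds the AAC converter
AudioStreamBasicDescription clientFormat = {0};
clientFormat.mSampleRate       = 44100.0;
clientFormat.mFormatID         = kAudioFormatLinearPCM;
clientFormat.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked;
clientFormat.mChannelsPerFrame = 2;
clientFormat.mBitsPerChannel   = 16;
clientFormat.mFramesPerPacket  = 1;
clientFormat.mBytesPerFrame    = 4;
clientFormat.mBytesPerPacket   = 4;
err = ExtAudioFileSetProperty(outFile, kExtAudioFileProperty_ClientDataFormat,
                              sizeof(clientFormat), &clientFormat);
// ExtAudioFileWrite(outFile, frameCount, &bufferList) then consumes the PCM buffers.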

Headphones unplugged during playback cause bug on iPhone app

此生再无相见时 submitted on 2019-12-04 15:23:39
I am creating an application based on the SpeakHere example app. I want the audio to play through the headphones if they are plugged in, or otherwise default to the speakers. I've used the bit of code at the bottom to make this happen, and it works fine unless the headphones are unplugged during playback. At that point the playback ends, which is okay. The problem is that when I hit play again the playback comes out all weird and the stop button stops working. It also starts playback from where it left off rather than resetting to the beginning as it normally does when you hit the stop button. Likewise
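One approach (not shown in the truncated question) is to listen for the route change explicitly with the old AudioSession C API that the SpeakHere-era samples use, and fully stop and rewind the player when the old device becomes unavailable, so the next play starts from a clean state. A sketch, with the player handling left as a hypothetical placeholder:

static void routeChangeListener(void *inUserData, AudioSessionPropertyID inID,
                                UInt32 inDataSize, const void *inData)
{
    if (inID != kAudioSessionProperty_AudioRouteChange) return;

    CFDictionaryRef routeDict = (CFDictionaryRef)inData;
    CFNumberRef reasonRef = (CFNumberRef)CFDictionaryGetValue(routeDict,
                                CFSTR(kAudioSession_AudioRouteChangeKey_Reason));
    SInt32 reason = 0;
    CFNumberGetValue(reasonRef, kCFNumberSInt32Type, &reason);

    if (reason == kAudioSessionRouteChangeReason_OldDeviceUnavailable) {
        // Headphones were pulled: stop and reset the player so the next Play
        // starts cleanly from the beginning (hypothetical player object).
        // [(MyPlayer *)inUserData stopAndRewind];
    }
}

// Registration, e.g. during setup:
//   AudioSessionAddPropertyListener(kAudioSessionProperty_AudioRouteChange,
//                                   routeChangeListener, self);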

Play an audio file using RemoteIO and Audio Unit

爷,独闯天下 submitted on 2019-12-04 14:35:48
Question: I am looking at Apple's 'aurioTouch' example for the iPhone, and I would like to play an mp3 or wav instead of using the built-in mic. I am very new to the audio portion of iPhone programming, but I think I need to modify the SetupRemoteIO(...) function and replace the AudioComponent named 'comp' with a custom AudioComponent that plays a file. Basically I want the app to function exactly the same as the original, but with an audio file as the input instead of the mic. Answer 1: You just need to
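The answer is truncated above, but the usual pattern is to keep the RemoteIO unit and swap the source of the samples: instead of pulling from the mic inside the render callback, read from PCM you have already decoded from the file into memory. A sketch, assuming mono 16-bit PCM has been preloaded into a FilePlayerState struct (both names are illustrative):

typedef struct {
    SInt16 *samples;      // decoded PCM from the mp3/wav (assumed preloaded)
    UInt32  totalFrames;
    UInt32  readIndex;
} FilePlayerState;

static OSStatus fileRenderCallback(void *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp *inTimeStamp,
                                   UInt32 inBusNumber,
                                   UInt32 inNumberFrames,
                                   AudioBufferList *ioData)
{
    FilePlayerState *state = (FilePlayerState *)inRefCon;
    SInt16 *out = (SInt16 *)ioData->mBuffers[0].mData;
    for (UInt32 i = 0; i < inNumberFrames; i++) {
        out[i] = (state->readIndex < state->totalFrames)
                     ? state->samples[state->readIndex++]
                     : 0;   // pad with silence once the file is exhausted
    }
    return noErr;
}

// Attached with kAudioUnitProperty_SetRenderCallback on the RemoteIO unit's
// input scope, bus 0, in place of feeding it the mic samples.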

core audio guidance / primer

假装没事ソ submitted on 2019-12-04 14:14:30
Question: I've been doing some reading up on Core Audio for iOS 4 with the aim of building a little test app. I'm pretty confused at this point in my research by all the APIs. Ideally, what I want to know is how to extract a number of samples from two mp3s into arrays. Then, in a callback loop, I want to mix these samples together and send them to the speaker. There are examples on the Apple dev site, but I'm finding them difficult to dissect and digest. Is anybody aware of a nice stripped down
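The mixing step itself is small once the samples are in arrays: sum the two tracks sample by sample in a wider integer type and clamp before writing the output. A minimal sketch of that inner loop (array names are illustrative):

static void mixSamples(const SInt16 *trackA, const SInt16 *trackB,
                       SInt16 *out, UInt32 frameCount)
{
    for (UInt32 i = 0; i < frameCount; i++) {
        SInt32 sum = (SInt32)trackA[i] + (SInt32)trackB[i];
        if (sum >  32767) sum =  32767;   // clamp to avoid wrap-around distortion
        if (sum < -32768) sum = -32768;
        out[i] = (SInt16)sum;
    }
}

This is the body you would run inside the render callback before copying the result into the output AudioBufferList.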

How to obtain accurate decibel level with Cocoa?

六月ゝ 毕业季﹏ submitted on 2019-12-04 13:49:57
Question: We are creating an application which records the surrounding sound and takes the necessary action if the sound crosses a specified decibel level. To achieve this we are using the following method from AudioQueueObject.h:

- (void)getAudioLevels:(Float32 *)levels peakLevels:(Float32 *)peakLevels
{
    UInt32 propertySize = audioFormat.mChannelsPerFrame * sizeof(AudioQueueLevelMeterState);
    AudioQueueGetProperty(self.queueObject, (AudioQueuePropertyID)kAudioQueueProperty
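Two things matter for getting meaningful numbers out of this: metering has to be enabled on the queue first, and the _DB meter property reports dBFS (0 dB = digital full scale, negative values below it), not absolute sound-pressure decibels, so a calibration offset is needed if SPL-like values are the goal. A sketch, assuming a running AudioQueueRef named queue with at most two channels:

// Enable metering once, after the queue is created
UInt32 enableMetering = 1;
AudioQueueSetProperty(queue, kAudioQueueProperty_EnableLevelMetering,
                      &enableMetering, sizeof(enableMetering));

// Poll the current levels, e.g. from a timer
AudioQueueLevelMeterState meters[2];
UInt32 metersSize = sizeof(meters);
AudioQueueGetProperty(queue, kAudioQueueProperty_CurrentLevelMeterDB,
                      meters, &metersSize);
// meters[0].mAveragePower and meters[0].mPeakPower are now in dBFS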

AudioQueue ate my buffer (first 15 milliseconds of it)

孤者浪人 submitted on 2019-12-04 13:43:50
Question: I am generating audio programmatically. I hear gaps of silence between my buffers. When I hook my phone up to a scope, I see that the first few samples of each buffer are missing, and in their place is silence. The length of this silence varies from almost nothing to as much as 20 ms. My first thought is that my original callback function takes too much time. I replace it with the shortest one possible: it just re-enqueues the same buffer over and over. I observe the same behavior.

AudioQueueRef aq;
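If the queue ever runs dry between callbacks, the hardware plays silence until the next buffer arrives, which looks exactly like missing leading samples. A common mitigation (offered here as a hedged sketch, not a confirmed diagnosis of this post) is to allocate several buffers, fill and enqueue all of them before calling AudioQueueStart, and keep them all in flight; the buffer count, size, and fill function name below are illustrative:

#define kNumBuffers  3
#define kBufferBytes 4096

AudioQueueBufferRef buffers[kNumBuffers];
for (int i = 0; i < kNumBuffers; i++) {
    AudioQueueAllocateBuffer(aq, kBufferBytes, &buffers[i]);
    fillAndEnqueue(NULL, aq, buffers[i]);   // same signature as the output callback
}
AudioQueueStart(aq, NULL);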

AudioQueue in-memory playback example

你离开我真会死。 submitted on 2019-12-04 13:37:14
Question: Does anybody know of any examples using AudioQueue that play from an in-memory source? All the examples I can find play from files (using AudioFileReadPackets), but in my particular case I am generating the data myself in real time, so ideally I want to enqueue the data myself rather than sucking it out of a file in the callback. Any help much appreciated. Answer 1: I know of an example using Audio Units that you could adapt, as the callbacks are very similar; try here Answer 2: The Audio Queue
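The file-based examples translate almost directly: the output callback just copies from your memory buffer (or generator) into the queue buffer instead of calling AudioFileReadPackets. A sketch for plain PCM, with the MemorySource struct as an illustrative assumption:

typedef struct {
    SInt16 *data;        // PCM you generated yourself
    UInt32  totalBytes;
    UInt32  offset;
} MemorySource;

static void memoryOutputCallback(void *inUserData, AudioQueueRef inAQ,
                                 AudioQueueBufferRef inBuffer)
{
    MemorySource *src = (MemorySource *)inUserData;
    UInt32 bytesToCopy = inBuffer->mAudioDataBytesCapacity;
    if (bytesToCopy > src->totalBytes - src->offset)
        bytesToCopy = src->totalBytes - src->offset;

    memcpy(inBuffer->mAudioData, (UInt8 *)src->data + src->offset, bytesToCopy);
    src->offset += bytesToCopy;
    inBuffer->mAudioDataByteSize = bytesToCopy;

    // No packet descriptions are needed for PCM
    AudioQueueEnqueueBuffer(inAQ, inBuffer, 0, NULL);
}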

How would you connect an iPod library asset to an Audio Queue Service and process with an Audio Unit?

99封情书 submitted on 2019-12-04 12:07:40
I need to process audio that comes from the iPod library. The only way to read an asset from the iPod library is AVAssetReader. To process audio with an Audio Unit it needs to be in a stereo format where I have separate values for the left and right channels. But when I use AVAssetReader to read an asset from the iPod library, it does not allow me to get it out in that format; it comes out in interleaved format, which I do not know how to break into left and right audio channels. To get to where I need to go, I would need to do one of the following: Get AVAssetReader to give me an AudioBufferList in stereo
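Splitting an interleaved buffer yourself is a small loop: interleaved 16-bit stereo is laid out L R L R ..., so the even samples are the left channel and the odd samples are the right channel. A sketch (names are illustrative):

static void deinterleaveStereo(const SInt16 *interleaved, UInt32 frameCount,
                               SInt16 *left, SInt16 *right)
{
    for (UInt32 i = 0; i < frameCount; i++) {
        left[i]  = interleaved[2 * i];        // even index: left channel
        right[i] = interleaved[2 * i + 1];    // odd index: right channel
    }
}

The two output arrays can then be placed into a two-buffer AudioBufferList for a non-interleaved Audio Unit stream format.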

Playback and Recording simultaneously using Core Audio in iOS

寵の児 submitted on 2019-12-04 11:31:39
Question: I need to play and record simultaneously using Core Audio. I really do not want to use the AVFoundation API (AVAudioPlayer + AVAudioRecorder) for this, as I am making a music app and cannot afford any latency issues. I've looked at the following sample code from Apple: aurioTouch and MixerHost. I've already looked into the following posts: "iOS: Sample code for simultaneous record and playback" and "Record and play audio Simultaneously". I am still not clear on how I can do playback and record the same thing
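For low latency the usual answer is a single RemoteIO Audio Unit with both I/O buses enabled, rather than AVFoundation or two separate objects. A sketch of the enable step, assuming rioUnit has already been created with AudioComponentInstanceNew (session setup and callback wiring omitted):

UInt32 one = 1;

// Enable input (mic) on the input scope of bus 1
AudioUnitSetProperty(rioUnit, kAudioOutputUnitProperty_EnableIO,
                     kAudioUnitScope_Input, 1, &one, sizeof(one));

// Output (speaker) on bus 0 is on by default, but being explicit does no harm
AudioUnitSetProperty(rioUnit, kAudioOutputUnitProperty_EnableIO,
                     kAudioUnitScope_Output, 0, &one, sizeof(one));

// A render callback on bus 0 can then call AudioUnitRender(...) against bus 1
// to pull the captured mic frames and mix or copy them into the output buffers.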