core-audio

ExtAudioFileWrite to m4a/aac failing on dual-core devices (ipad 2, iphone 4s)

谁说我不能喝 submitted on 2019-12-18 13:34:00

Question: I wrote a loop to encode PCM audio data generated by my app to AAC using Extended Audio File Services. The encoding takes place synchronously in a background thread, not in real time. The encoding works flawlessly on iPad 1 and iPhone 3GS/4 on both iOS 4 and 5. However, on dual-core devices (iPhone 4S, iPad 2) the third call to ExtAudioFileWrite crashes the encoding thread with no stack trace and no error code. Here is the code in question: The data formats AudioStreamBasicDescription
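Below is a minimal sketch of the general shape of such an offline encode loop (not the asker's code, and not a claimed fix for the dual-core crash); outURL, aacFormat, pcmFormat, pcmSamples and totalFrames are hypothetical variables assumed to be set up elsewhere:

ExtAudioFileRef outFile = NULL;
ExtAudioFileCreateWithURL((CFURLRef)outURL, kAudioFileM4AType,
                          &aacFormat, NULL, kAudioFileFlags_EraseFile, &outFile);
// aacFormat is the on-disk AAC format; the client format below is the PCM
// layout of the generated samples that ExtAudioFileWrite will be handed.
ExtAudioFileSetProperty(outFile, kExtAudioFileProperty_ClientDataFormat,
                        sizeof(pcmFormat), &pcmFormat);

char  *cursor     = (char *)pcmSamples;     // generated PCM, interleaved
UInt32 framesLeft = totalFrames;
while (framesLeft > 0) {
    UInt32 frames = framesLeft < 4096 ? framesLeft : 4096;

    AudioBufferList bufList;
    bufList.mNumberBuffers = 1;
    bufList.mBuffers[0].mNumberChannels = pcmFormat.mChannelsPerFrame;
    bufList.mBuffers[0].mDataByteSize   = frames * pcmFormat.mBytesPerFrame;
    bufList.mBuffers[0].mData           = cursor;

    // Synchronous write on the background thread; check the status on every pass.
    OSStatus err = ExtAudioFileWrite(outFile, frames, &bufList);
    if (err != noErr) break;

    cursor     += frames * pcmFormat.mBytesPerFrame;
    framesLeft -= frames;
}
ExtAudioFileDispose(outFile);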

Combine two .wav files in iPhone using Objective C

匆匆过客 submitted on 2019-12-18 12:42:55

Question: I want to combine two .wav recording files together. Can anyone help me figure out how to achieve this? I tried concatenating the raw data, but the headers are causing problems. Can we combine them the same way wave files are normally combined? This is how I am doing the combine: NSMutableData *datas = [NSMutableData alloc]; NSData *data1 = [NSData dataWithContentsOfFile: [recordedTmpFile1 path]]; NSData *data2 = [NSData dataWithContentsOfFile: [recordedTmpFile2 path]]; NSLog(@"file1 size : %d", [data1
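A minimal sketch of one way to do the combine, assuming both recordings are canonical 44-byte-header PCM WAV files with the same sample rate, channel count and bit depth (combinedPath is a hypothetical output path):

// Assumes both files share one PCM format and use the canonical 44-byte header.
NSData *data1 = [NSData dataWithContentsOfFile:[recordedTmpFile1 path]];
NSData *data2 = [NSData dataWithContentsOfFile:[recordedTmpFile2 path]];

NSMutableData *combined = [NSMutableData dataWithData:data1];
// Append only the sample data of the second file, skipping its 44-byte header.
[combined appendData:[data2 subdataWithRange:NSMakeRange(44, [data2 length] - 44)]];

// Patch the RIFF chunk size (offset 4) and the data chunk size (offset 40)
// so players see the full combined length.
uint32_t riffSize = (uint32_t)[combined length] - 8;
uint32_t dataSize = (uint32_t)[combined length] - 44;
[combined replaceBytesInRange:NSMakeRange(4, 4) withBytes:&riffSize];
[combined replaceBytesInRange:NSMakeRange(40, 4) withBytes:&dataSize];

[combined writeToFile:combinedPath atomically:YES];   // combinedPath: hypothetical output path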

iPhone - convert mp3 to wav?

杀马特。学长 韩版系。学妹 submitted on 2019-12-18 10:57:22

Question: Is there a way I can convert an MP3 file into a WAV/AIFF in my iPhone app? I have an MP3, but I want to combine it with other files, and the only way I know to do that is with PCM formats. Can anyone help me out here? Thanks. Some things I tried: I tried using the AudioConverterFillComplexBuffer() method and its callback to convert the MP3 to CAF, but the bytes per packet and per frame are read as 0. Any ideas how I can, or whether I can, use this function? Thanks again. Answer 1: WAV and
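One hedged alternative worth sketching (not necessarily what the answer goes on to say): let Extended Audio File Services drive the decode, so the MP3 is delivered as linear PCM on read and written back out as a WAV. mp3URL and wavURL are hypothetical NSURLs, and error checking is omitted for brevity:

ExtAudioFileRef srcFile = NULL, dstFile = NULL;
ExtAudioFileOpenURL((CFURLRef)mp3URL, &srcFile);

AudioStreamBasicDescription pcm = {0};
pcm.mSampleRate       = 44100.0;
pcm.mFormatID         = kAudioFormatLinearPCM;
pcm.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
pcm.mChannelsPerFrame = 2;
pcm.mBitsPerChannel   = 16;
pcm.mBytesPerFrame    = 4;      // 2 channels * 2 bytes
pcm.mFramesPerPacket  = 1;
pcm.mBytesPerPacket   = 4;

// Decoding happens on read: the source delivers PCM in this client format.
ExtAudioFileSetProperty(srcFile, kExtAudioFileProperty_ClientDataFormat,
                        sizeof(pcm), &pcm);
ExtAudioFileCreateWithURL((CFURLRef)wavURL, kAudioFileWAVEType, &pcm, NULL,
                          kAudioFileFlags_EraseFile, &dstFile);

char buffer[4096 * 4];
while (1) {
    AudioBufferList bufList;
    bufList.mNumberBuffers = 1;
    bufList.mBuffers[0].mNumberChannels = pcm.mChannelsPerFrame;
    bufList.mBuffers[0].mDataByteSize   = sizeof(buffer);
    bufList.mBuffers[0].mData           = buffer;

    UInt32 frames = 4096;
    ExtAudioFileRead(srcFile, &frames, &bufList);
    if (frames == 0) break;                     // end of the source file
    ExtAudioFileWrite(dstFile, frames, &bufList);
}
ExtAudioFileDispose(srcFile);
ExtAudioFileDispose(dstFile);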

AVPlayer vs. AVAudioPlayer

北城余情 submitted on 2019-12-18 10:14:11

Question: The documentation for AVPlayer states the following: "[The] player works equally well with local and remote media files." However, the documentation for AVAudioPlayer states the following: "Apple recommends that you use this class for audio playback unless you are playing audio captured from a network stream." For the work I am doing I need some of the capabilities of AVAudioPlayer, but all my audio is being streamed. The main thing I need from AVAudioPlayer that AVPlayer does not have is the

Error in Audio Unit (RemoteIO) code for iPhone

核能气质少年 submitted on 2019-12-18 09:48:01

Question: I have this code in order to read buffer samples, but I get a strange Mach-O linker error: the Audio Unit framework couldn't be loaded, so I added AudioToolbox and CoreAudio as I read I should. My code is: #define kOutputBus 0 #define kInputBus 1 AudioComponentInstance audioUnit; @implementation remoteIO //callback function : static OSStatus recordingCallback(void *inRefCon, AudioUnitRenderActionFlags *ioActionFlags, const AudioTimeStamp *inTimeStamp, UInt32 inBusNumber, UInt32 inNumberFrames,
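For reference, a minimal sketch of the complete callback signature the excerpt is building toward, with a plain AudioUnitRender pull. Note that this kind of Mach-O linker error usually means AudioToolbox.framework is missing from the target's Link Binary With Libraries build phase rather than from the imports (an assumption, since the exact error text is not shown):

#import <AudioToolbox/AudioToolbox.h>

#define kOutputBus 0
#define kInputBus  1

static AudioComponentInstance audioUnit;
static SInt16 sampleBuffer[4096];            // mono 16-bit scratch buffer for one render slice

static OSStatus recordingCallback(void                       *inRefCon,
                                  AudioUnitRenderActionFlags *ioActionFlags,
                                  const AudioTimeStamp       *inTimeStamp,
                                  UInt32                      inBusNumber,
                                  UInt32                      inNumberFrames,
                                  AudioBufferList            *ioData)
{
    // Pull the captured samples from the remote I/O unit's input bus.
    // Assumes a 16-bit mono client format was set up for bus 1.
    AudioBufferList bufferList;
    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0].mNumberChannels = 1;
    bufferList.mBuffers[0].mDataByteSize   = inNumberFrames * sizeof(SInt16);
    bufferList.mBuffers[0].mData           = sampleBuffer;

    OSStatus status = AudioUnitRender(audioUnit, ioActionFlags, inTimeStamp,
                                      inBusNumber, inNumberFrames, &bufferList);
    // ... process bufferList.mBuffers[0].mData here ...
    return status;
}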

how to set bitrate correctly for aac encoding OSX

佐手、 submitted on 2019-12-18 07:11:43

Question: I have 1 second of PCM data which I write into an AAC file successfully. However, I cannot control the bitrate of the output file. Here is the configuration of my AAC codec: AudioStreamBasicDescription clientFormat = {0}; clientFormat.mSampleRate = 44100; clientFormat.mFormatID = kAudioFormatMPEG4AAC; clientFormat.mFormatFlags = kMPEG4Object_AAC_Main; clientFormat.mChannelsPerFrame = 2; clientFormat.mBytesPerPacket = 0; clientFormat.mBytesPerFrame = 0; clientFormat.mFramesPerPacket = 1024;
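A minimal sketch of the usual approach, assuming the file is written through Extended Audio File Services with a hypothetical ExtAudioFileRef outFile: the bitrate is not part of the AudioStreamBasicDescription, it is set on the converter that backs the file.

// outFile is a hypothetical ExtAudioFileRef already created with the AAC file
// format and given a PCM client format.
AudioConverterRef converter = NULL;
UInt32 size = sizeof(converter);
ExtAudioFileGetProperty(outFile, kExtAudioFileProperty_AudioConverter, &size, &converter);

UInt32 bitrate = 128000;   // target bitrate in bits per second
AudioConverterSetProperty(converter, kAudioConverterEncodeBitRate, sizeof(bitrate), &bitrate);

// Poking the converter config afterwards tells the ExtAudioFile that the
// converter's settings changed.
CFArrayRef config = NULL;
ExtAudioFileSetProperty(outFile, kExtAudioFileProperty_ConverterConfig, sizeof(config), &config);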

Mac OS X Simple Voice Recorder

泄露秘密 submitted on 2019-12-18 05:23:07

Question: Does anyone have some sample code for a SIMPLE voice recorder for Mac OS X? I would just like to record my voice coming from the internal microphone on my MacBook Pro and save it to a file. That is all. I have been searching for hours, and yes, there are some examples that will record voice and save it to a file, such as http://developer.apple.com/library/mac/#samplecode/MYRecorder/Introduction/Intro.html . The sample code for Mac OS X seems to be about 10 times more complicated than similar
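A minimal sketch of about the simplest option, assuming OS X 10.7 or later where AVAudioRecorder (originally an iOS class) is also available on the Mac; the output path and settings are illustrative only:

#import <AVFoundation/AVFoundation.h>

NSURL *url = [NSURL fileURLWithPath:@"/tmp/voice.m4a"];   // illustrative output path
NSDictionary *settings = @{ AVFormatIDKey         : @(kAudioFormatMPEG4AAC),
                            AVSampleRateKey       : @44100.0,
                            AVNumberOfChannelsKey : @1 };
NSError *error = nil;
AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:url
                                                        settings:settings
                                                           error:&error];
[recorder record];   // starts capturing from the default input device
// ... later ...
[recorder stop];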

How to use “kAudioUnitSubType_VoiceProcessingIO” subtype of core audio API in mac os?

落爺英雄遲暮 submitted on 2019-12-18 04:18:28

Question: I'm looking for an example of a simple play-through application using the built-in mic/speaker with the kAudioUnitSubType_VoiceProcessingIO subtype (not kAudioUnitSubType_HALOutput) on Mac OS X. The comments in the Core Audio API say that kAudioUnitSubType_VoiceProcessingIO is available on the desktop and with iPhone 3.0 or greater, so I think there must be an example somewhere for Mac OS X. Do you have any idea where the sample is? Or does anyone know how to use the kAudioUnitSubType
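A minimal sketch of how the unit is typically located and opened (an assumption about the setup, not a known Apple sample): only the componentSubType differs from the usual default-output / HAL setup, and a render callback still has to be attached to supply and consume audio.

AudioComponentDescription desc = {0};
desc.componentType         = kAudioUnitType_Output;
desc.componentSubType      = kAudioUnitSubType_VoiceProcessingIO;
desc.componentManufacturer = kAudioUnitManufacturer_Apple;

AudioComponent comp = AudioComponentFindNext(NULL, &desc);
AudioUnit vpioUnit = NULL;
AudioComponentInstanceNew(comp, &vpioUnit);

// Enable input on bus 1 and output on bus 0 so the unit can act as a
// play-through path with echo cancellation between mic and speaker.
UInt32 one = 1;
AudioUnitSetProperty(vpioUnit, kAudioOutputUnitProperty_EnableIO,
                     kAudioUnitScope_Input, 1, &one, sizeof(one));
AudioUnitSetProperty(vpioUnit, kAudioOutputUnitProperty_EnableIO,
                     kAudioUnitScope_Output, 0, &one, sizeof(one));

AudioUnitInitialize(vpioUnit);
AudioOutputUnitStart(vpioUnit);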

Sound on simulator but not device

谁说我不能喝 submitted on 2019-12-18 04:16:35

Question: I'm using the following to play an m4a file: NSString *path = [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent: fileName]; SystemSoundID soundID; NSURL *filePath = [NSURL fileURLWithPath:path isDirectory:NO]; AudioServicesCreateSystemSoundID((CFURLRef)filePath, &soundID); AudioServicesPlaySystemSound(soundID); It works fine on the simulator, but I hear nothing on the device. The sound files I'm using all live in the bundle. Here is what filePath looks like from the device:
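A minimal diagnostic sketch, based on two hedged assumptions rather than the thread's accepted answer: the device file system is case-sensitive where the simulator's is not, and System Sound Services is documented for short uncompressed or IMA4 sounds, so an AAC .m4a may need AVAudioPlayer instead. The resource name below is hypothetical:

NSURL *fileURL = [[NSBundle mainBundle] URLForResource:@"mySound"   // hypothetical name
                                         withExtension:@"m4a"];
if (fileURL == nil) {
    // Nil here on the device often means the file name's case does not match.
    NSLog(@"Resource not found in bundle; check the exact file name and case");
} else {
    SystemSoundID soundID;
    AudioServicesCreateSystemSoundID((CFURLRef)fileURL, &soundID);
    AudioServicesPlaySystemSound(soundID);
}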