core-audio

Framework not found AudioUnit

佐手、 submitted on 2019-12-18 03:49:21

Question: I've been banging my head against the wall for a while now. My Xcode project went a little haywire while refactoring, and refused to build. I've squashed all the other errors except one last link-time error: Framework not found AudioUnit. I have the AudioUnit headers, and AudioUnit.framework is included in my project as it was before (Targets > Get Info > General > Linked Libraries > +), but I cannot figure out why it does not work now. AudioToolbox.framework is also included. Answer 1: Remove AudioUnit …
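A note on why that answer works, with a sketch of the relevant build settings (the settings names are Xcode's; treating them as this project's exact configuration is an assumption): on iOS the Audio Unit C API is exported by AudioToolbox.framework, so a target that already links AudioToolbox does not need a separate AudioUnit.framework entry, and a stale one can produce exactly this linker error.

```
# Targets → Build Phases → Link Binary With Libraries:
#   remove the stale AudioUnit.framework reference, keep AudioToolbox.framework
# Equivalent Other Linker Flags entry:
-framework AudioToolbox
```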

Sound not playing with AVAudioPlayer

廉价感情. submitted on 2019-12-18 03:37:12

Question: I've searched and I believe my problem is quite unique. I'm aware of the Simulator 5.1 bug when using AVAudioPlayer, which isn't my problem; I'm running on an iOS 5.1 device. Here's my header file: #import <UIKit/UIKit.h> #import <Foundation/Foundation.h> #import <AVFoundation/AVAudioPlayer.h> -(IBAction)pushBell; @end and my implementation file: #import "BellViewController.h" @interface BellViewController () @end @implementation BellViewController -(IBAction)pushBell { NSString *soundPath =[ …
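The most common cause of this symptom is that the AVAudioPlayer is created as a local variable and released before it finishes playing. A minimal sketch of the action with the player held strongly; the sound file name "bell.caf" and the property are assumptions, not taken from the question:

```objc
// BellViewController.m (sketch)
#import <AVFoundation/AVFoundation.h>
#import "BellViewController.h"

@interface BellViewController ()
// Keep a strong reference: a locally created AVAudioPlayer is deallocated
// before playback starts, producing exactly this silent behavior.
@property (nonatomic, strong) AVAudioPlayer *player;
@end

@implementation BellViewController

- (IBAction)pushBell {
    NSString *soundPath = [[NSBundle mainBundle] pathForResource:@"bell"
                                                          ofType:@"caf"];
    NSURL *soundURL = [NSURL fileURLWithPath:soundPath];
    NSError *error = nil;
    self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:soundURL
                                                         error:&error];
    if (!self.player) {
        NSLog(@"AVAudioPlayer init failed: %@", error);
        return;
    }
    [self.player prepareToPlay];
    [self.player play];
}

@end
```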

Record audio iOS

安稳与你 submitted on 2019-12-18 02:49:11

Question: How does one record audio on iOS? Not input recording from the microphone; I want to be able to capture/record the audio currently playing within my app. So, e.g., I start a recording session, and any sound that plays within my app only, I want to record to a file. I have done research on this, but I am confused about what to use, as it looks like mixing audio frameworks can cause problems. I just want to be able to capture and save the audio playing within my application. Answer 1: Well …
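A sketch of the approach usually taken for this: tap the app's own output in a Remote IO render notification and append each rendered buffer to a file with ExtAudioFileWriteAsync. The unit setup and the output file's format are omitted here; `extFile` is assumed to be an ExtAudioFileRef opened elsewhere with ExtAudioFileCreateWithURL.

```objc
#import <AudioToolbox/AudioToolbox.h>

// Called before and after every render cycle of the output unit; we only
// care about the post-render pass, when ioData holds the mixed app audio.
static OSStatus renderNotify(void *inRefCon,
                             AudioUnitRenderActionFlags *ioActionFlags,
                             const AudioTimeStamp *inTimeStamp,
                             UInt32 inBusNumber,
                             UInt32 inNumberFrames,
                             AudioBufferList *ioData) {
    if (*ioActionFlags & kAudioUnitRenderAction_PostRender) {
        ExtAudioFileRef extFile = (ExtAudioFileRef)inRefCon;
        // The async variant is safe to call from the real-time render thread.
        ExtAudioFileWriteAsync(extFile, inNumberFrames, ioData);
    }
    return noErr;
}

// Registered on the Remote IO unit with:
//   AudioUnitAddRenderNotify(remoteIOUnit, renderNotify, extFile);
```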

iOS: How to read an audio file into a float buffer

情到浓时终转凉″ submitted on 2019-12-18 00:50:06

Question: I have a really short audio file, say a 10th of a second, in (say) .pcm format. I want to use RemoteIO to loop through the file repeatedly to produce a continuous musical tone. So how do I read this into an array of floats? EDIT: while I could probably dig out the file format, extract the file into an NSData and process it manually, I'm guessing there is a more sensible generic approach (one that e.g. copes with different formats). Answer 1: You can use ExtAudioFile to read data from any supported …
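A sketch of the ExtAudioFile route the answer points at: set a float "client format" so the framework does the conversion from whatever the file actually contains, then read the whole file into a float array. The mono/44.1 kHz client format and the reduced error handling are simplifying assumptions.

```objc
#import <AudioToolbox/AudioToolbox.h>

static float *ReadFileIntoFloats(NSURL *url, UInt32 *outFrameCount) {
    ExtAudioFileRef file = NULL;
    ExtAudioFileOpenURL((__bridge CFURLRef)url, &file);

    // Client format: mono, packed 32-bit float. ExtAudioFile converts
    // from the file's native format to this on read.
    AudioStreamBasicDescription clientFormat = {0};
    clientFormat.mSampleRate       = 44100.0;
    clientFormat.mFormatID         = kAudioFormatLinearPCM;
    clientFormat.mFormatFlags      = kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked;
    clientFormat.mChannelsPerFrame = 1;
    clientFormat.mBitsPerChannel   = 32;
    clientFormat.mBytesPerFrame    = 4;
    clientFormat.mFramesPerPacket  = 1;
    clientFormat.mBytesPerPacket   = 4;
    ExtAudioFileSetProperty(file, kExtAudioFileProperty_ClientDataFormat,
                            sizeof(clientFormat), &clientFormat);

    SInt64 fileFrames = 0;
    UInt32 propSize = sizeof(fileFrames);
    ExtAudioFileGetProperty(file, kExtAudioFileProperty_FileLengthFrames,
                            &propSize, &fileFrames);

    float *samples = malloc(sizeof(float) * (size_t)fileFrames);
    AudioBufferList bufferList;
    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0].mNumberChannels = 1;
    bufferList.mBuffers[0].mDataByteSize   = (UInt32)(fileFrames * sizeof(float));
    bufferList.mBuffers[0].mData           = samples;

    UInt32 frames = (UInt32)fileFrames;
    ExtAudioFileRead(file, &frames, &bufferList);
    ExtAudioFileDispose(file);

    *outFrameCount = frames;
    return samples; // caller frees
}
```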

How to control hardware mic input gain/level on iPhone?

只愿长相守 submitted on 2019-12-17 18:33:42

Question: My audio-analysis function responds better on the iPad (2) than on the iPhone (4). It seems sensitive to softer sounds on the iPad, whereas the iPhone requires much louder input to respond properly. Whether this is because of mic placement, different components, different software configurations, or some other factor, I'd like to be able to control for it in my app. Obviously I could just multiply all of my audio samples to apply gain programmatically. Of course that has a software cost too, so: …
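Later iOS releases (iOS 6 and up, so newer than the devices in the question) expose an input-gain control on the audio session; a sketch, assuming that availability is acceptable for the app:

```objc
#import <AVFoundation/AVFoundation.h>

AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
[session setActive:YES error:nil];

// Not every route supports gain adjustment, so check first.
if (session.isInputGainSettable) {
    NSError *error = nil;
    // The gain is normalized to 0.0–1.0, not expressed in dB.
    [session setInputGain:0.8f error:&error];
}
```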

How can I record AMR audio format on the iPhone?

强颜欢笑 submitted on 2019-12-17 18:28:40

Question: A voice recorder doesn't need uncompressed Linear PCM audio; compressed AMR would do fine. The iPhone framework built for recording audio is simple enough, but the only examples I've found for setting up the audio format (which come from Apple) use Linear PCM. I've tried various other combinations of values, but can't seem to get anything to work. Does anybody have any code that actually records AMR? Edit: The AMR format is one of the options for setting the data type, but the other options …

core audio offline rendering GenericOutput

≡放荡痞女 submitted on 2019-12-17 06:31:33

Question: Has anybody successfully done offline rendering using Core Audio? I had to mix two audio files and apply reverb (I used two AudioFilePlayers, a MultiChannelMixer, Reverb2, and RemoteIO). I got it working, and I could save it while it was previewing (in the render callback of RemoteIO). I need to save it without playing it (offline). Thanks in advance. Answer 1: Offline rendering worked for me using the GenericOutput AudioUnit. I am sharing the working code here. The core-audio framework seems a little tough, but small …
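The heart of the GenericOutput approach, sketched: instead of letting Remote IO pull the graph in real time, the app calls AudioUnitRender on a kAudioUnitSubType_GenericOutput unit in its own loop and writes each rendered slice to disk. Graph construction (players → mixer → reverb → generic output), plus `genericOutputUnit`, `outputFile`, and `totalFrames`, are assumed to be set up elsewhere.

```objc
#import <AudioToolbox/AudioToolbox.h>

const UInt32 framesPerSlice = 512;

// Interleaved stereo float buffer matching the output unit's stream format
// (an assumption; size it to whatever ASBD the graph actually uses).
AudioBufferList *bufferList = malloc(sizeof(AudioBufferList));
bufferList->mNumberBuffers = 1;
bufferList->mBuffers[0].mNumberChannels = 2;
bufferList->mBuffers[0].mDataByteSize   = framesPerSlice * 2 * sizeof(float);
bufferList->mBuffers[0].mData           = malloc(bufferList->mBuffers[0].mDataByteSize);

AudioTimeStamp ts = {0};
ts.mFlags = kAudioTimeStampSampleTimeValid;
ts.mSampleTime = 0;

for (SInt64 rendered = 0; rendered < totalFrames; rendered += framesPerSlice) {
    AudioUnitRenderActionFlags flags = 0;
    // Pull one slice through the whole graph; no audio device is involved,
    // so this runs as fast as the CPU allows.
    OSStatus err = AudioUnitRender(genericOutputUnit, &flags, &ts,
                                   0, framesPerSlice, bufferList);
    if (err != noErr) break;

    ExtAudioFileWrite(outputFile, framesPerSlice, bufferList);
    ts.mSampleTime += framesPerSlice;
}
```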

Drawing waveform with AVAssetReader

我的梦境 submitted on 2019-12-17 04:10:49

Question: I'm reading a song from the iPod library using an asset URL (in the code it's named audioUrl). I can play it many ways, I can cut it, I can do some processing with it, but I really don't understand what to do with this CMSampleBufferRef to get data for drawing a waveform! I need info about peak values; how can I get it this (or maybe another) way? AVAssetTrack * songTrack = [audioUrl.tracks objectAtIndex:0]; AVAssetReaderTrackOutput * output = [[AVAssetReaderTrackOutput alloc] initWithTrack:songTrack …
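A sketch of what to do with each CMSampleBufferRef pulled from the AVAssetReaderTrackOutput: copy the raw PCM out of the block buffer and compute a peak per buffer, which is the value you plot for that slice of the waveform. This assumes the reader output was configured to decode to 16-bit signed linear PCM (kAudioFormatLinearPCM in the output settings).

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

CMSampleBufferRef sampleBuffer = [output copyNextSampleBuffer];
CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);

// Copy the sample data out of the (possibly non-contiguous) block buffer.
size_t length = CMBlockBufferGetDataLength(blockBuffer);
SInt16 *samples = malloc(length);
CMBlockBufferCopyDataBytes(blockBuffer, 0, length, samples);

// Track the largest absolute sample value in this buffer.
SInt16 peak = 0;
size_t count = length / sizeof(SInt16);
for (size_t i = 0; i < count; i++) {
    SInt16 v = (samples[i] >= 0) ? samples[i] : -samples[i];
    if (v > peak) peak = v;
}
// peak / 32767.0f gives a normalized 0–1 value to draw for this buffer.

free(samples);
CFRelease(sampleBuffer);
```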
