core-audio

Playback and Recording simultaneously using Core Audio in iOS

丶灬走出姿态 submitted on 2019-12-03 07:28:36
I need to play and record simultaneously using Core Audio. I really do not want to use the AVFoundation API (AVAudioPlayer + AVAudioRecorder) to do this because I am making a music app and cannot have any latency issues. I've looked at the following sample code from Apple: aurioTouch and MixerHost. I've already looked into the following posts: "iOS: Sample code for simultaneous record and playback" and "Record and play audio Simultaneously". I am still not clear on how I can play back and record the same thing simultaneously using Core Audio. Any pointers toward how I can achieve this will be greatly appreciated.
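The usual low-latency route on iOS is a single Remote I/O audio unit with input enabled on bus 1 and output on bus 0, which is the pattern aurioTouch uses. A minimal sketch, assuming iOS and omitting error handling and audio-session setup:

#import <AudioUnit/AudioUnit.h>

static AudioUnit ioUnit;

// Render callback: pull microphone samples from input bus 1 straight into
// the output buffers, so whatever is recorded is heard immediately.
static OSStatus RenderCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData)
{
    return AudioUnitRender(ioUnit, ioActionFlags, inTimeStamp,
                           1, inNumberFrames, ioData);
}

void SetUpIOUnit(void)
{
    AudioComponentDescription desc = { kAudioUnitType_Output,
        kAudioUnitSubType_RemoteIO, kAudioUnitManufacturer_Apple, 0, 0 };
    AudioComponentInstanceNew(AudioComponentFindNext(NULL, &desc), &ioUnit);

    UInt32 one = 1; // enable input on bus 1; output on bus 0 is on by default
    AudioUnitSetProperty(ioUnit, kAudioOutputUnitProperty_EnableIO,
                         kAudioUnitScope_Input, 1, &one, sizeof(one));

    AURenderCallbackStruct cb = { RenderCallback, NULL };
    AudioUnitSetProperty(ioUnit, kAudioUnitProperty_SetRenderCallback,
                         kAudioUnitScope_Input, 0, &cb, sizeof(cb));

    AudioUnitInitialize(ioUnit);
    AudioOutputUnitStart(ioUnit);
}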

Extracting Amplitude Data from Linear PCM on the iPhone

江枫思渺然 submitted on 2019-12-03 05:15:35
Question: I'm having difficulty extracting amplitude data from linear PCM on the iPhone, stored in an audio .caf file. My questions are: Linear PCM stores amplitude samples as 16-bit values. Is this correct? How is amplitude stored in packets returned by AudioFileReadPacketData()? When recording mono linear PCM, isn't each sample (in one frame, in one packet) just an array of SInt16? What is the byte order (big-endian vs. little-endian)? What does each step in linear PCM amplitude mean physically? When…
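For 16-bit mono LPCM, each packet holds one frame containing one SInt16 sample, so the buffer filled by AudioFileReadPacketData() can be read directly as an SInt16 array. A sketch under those assumptions (native-endian samples, error handling omitted):

#import <AudioToolbox/AudioToolbox.h>
#include <stdio.h>
#include <stdlib.h>

void DumpAmplitudes(CFURLRef url)
{
    AudioFileID file;
    AudioFileOpenURL(url, kAudioFileReadPermission, 0, &file);

    UInt32 numPackets = 4096;
    UInt32 numBytes   = numPackets * sizeof(SInt16); // one SInt16 frame per packet
    SInt16 *samples   = malloc(numBytes);

    AudioFileReadPacketData(file, false, &numBytes, NULL, 0, &numPackets, samples);

    for (UInt32 i = 0; i < numPackets; i++) {
        // Normalize the signed 16-bit sample to [-1.0, 1.0].
        float amplitude = samples[i] / 32768.0f;
        printf("%f\n", amplitude);
    }

    free(samples);
    AudioFileClose(file);
}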

iOS Audio Units: When is the use of AUGraphs necessary?

亡梦爱人 submitted on 2019-12-03 04:35:30
Question: I'm totally new to iOS programming (I'm more of an Android guy...) and have to build an application dealing with audio DSP. (I know it's not the easiest way to approach iOS dev. ;) ) The app needs to be able to accept input both from: 1- the built-in microphone, 2- the iPod library. Then filters may be applied to the input sound, and the result is to be output to: 1- the speaker, 2- a recorded file. My question is the following: is an AUGraph necessary in order to be able, for example, to apply multiple…
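An AUGraph is essentially a wiring harness for connecting several audio units; a single Remote I/O unit with a render callback covers simple cases, but chaining effects is where a graph pays off. A minimal sketch, assuming one Apple-supplied effect (a low-pass filter, purely as an example) feeding the hardware output:

#import <AudioToolbox/AudioToolbox.h>

void BuildGraph(void)
{
    AUGraph graph;
    NewAUGraph(&graph);

    AudioComponentDescription ioDesc = { kAudioUnitType_Output,
        kAudioUnitSubType_RemoteIO, kAudioUnitManufacturer_Apple, 0, 0 };
    AudioComponentDescription fxDesc = { kAudioUnitType_Effect,
        kAudioUnitSubType_LowPassFilter, kAudioUnitManufacturer_Apple, 0, 0 };

    AUNode ioNode, fxNode;
    AUGraphAddNode(graph, &ioDesc, &ioNode);
    AUGraphAddNode(graph, &fxDesc, &fxNode);
    AUGraphOpen(graph);

    // Effect output bus 0 feeds Remote I/O input bus 0 (the speaker path).
    AUGraphConnectNodeInput(graph, fxNode, 0, ioNode, 0);

    AUGraphInitialize(graph);
    AUGraphStart(graph);
}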

AudioUnit tone generator is giving me a chirp at the end of each tone generated

≡放荡痞女 submitted on 2019-12-03 04:01:38
I'm creating an old-school music emulator for the old GWBasic PLAY command. To that end I have a tone generator and a music player. Between each of the notes played I'm getting a chirp sound that's mucking things up. Below are both of my classes:

ToneGen.h

#import <Foundation/Foundation.h>

@interface ToneGen : NSObject
@property (nonatomic) id delegate;
@property (nonatomic) double frequency;
@property (nonatomic) double sampleRate;
@property (nonatomic) double theta;
- (void)play:(float)ms;
- (void)play;
- (void)stop;
@end

ToneGen.m

#import <AudioUnit/AudioUnit.h>
#import "ToneGen.h"

OSStatus …
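A chirp or click at a note boundary is usually the waveform being cut off at a non-zero sample, which the speaker renders as a broadband transient. One common fix, sketched here against a generic render routine rather than the ToneGen code above (the fadeFrames count and function shape are assumptions), is to ramp the amplitude to zero over the last few milliseconds of each note:

#include <math.h>

// Fill 'buffer' with a sine tone, linearly fading the last 'fadeFrames'
// samples to zero so the note ends at silence instead of a hard discontinuity.
void RenderNote(float *buffer, unsigned totalFrames, unsigned fadeFrames,
                double frequency, double sampleRate)
{
    double theta = 0.0;
    double delta = 2.0 * M_PI * frequency / sampleRate;
    for (unsigned frame = 0; frame < totalFrames; frame++) {
        float gain = 1.0f;
        if (frame >= totalFrames - fadeFrames)
            gain = (float)(totalFrames - frame) / (float)fadeFrames;
        buffer[frame] = gain * (float)sin(theta);
        theta += delta;
    }
}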

How can I get the current sound level of the current audio output device?

时光毁灭记忆、已成空白 submitted on 2019-12-03 03:46:46
I'm looking for a way to tap into the current audio output on a Mac and then return a value representing the current sound level. By sound level, I mean the amount of noise being generated by the output. I'm NOT asking how to get the current volume level of the output device. The following code is pulled from Apple's sample AVRecorder… this particular bit of code acquires the set of connections from this class's movieFileOutput, gets the AVCaptureAudioChannel for each connection, and calculates decibel power based upon that. I would presume that if you are looking for an…
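Along the lines the excerpt describes, a sketch of averaging per-channel power across a capture output's audio channels (values are in dBFS; the function and parameter names here are assumptions, not the sample's own):

#import <AVFoundation/AVFoundation.h>

// Average the averagePowerLevel (dBFS) of every audio channel reachable
// through a capture output's connections.
float AveragePowerLevel(AVCaptureMovieFileOutput *movieFileOutput)
{
    float sum = 0.0f;
    NSUInteger count = 0;
    for (AVCaptureConnection *connection in movieFileOutput.connections) {
        for (AVCaptureAudioChannel *channel in connection.audioChannels) {
            sum += channel.averagePowerLevel;
            count++;
        }
    }
    return count ? sum / count : -160.0f; // -160 dB: effectively silence
}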

How do I register for a notification for when the sound volume changes?

元气小坏坏 submitted on 2019-12-03 03:40:25
I need my app to be notified when the OS X sound volume has changed. This is for a desktop app, not for iOS. How can I register for this notification? This can be a tiny bit tricky because some audio devices support a master channel, but most don't, so the volume will be a per-channel property. Depending on what you need to do, you could observe only one channel and assume that all other channels the device supports have the same volume. Regardless of how many channels you want to watch, you observe the volume by registering a property listener for the AudioObject in question:

// Some devices…
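The excerpt's code is cut off, so here is a sketch of the registration it describes, assuming the HAL volume property on output channel 1 (per the caveat above, a master-channel device would use the master element instead):

#include <CoreAudio/CoreAudio.h>
#include <stdio.h>

// Called on the HAL's notification thread whenever the observed volume changes.
static OSStatus VolumeListener(AudioObjectID inObjectID,
                               UInt32 inNumberAddresses,
                               const AudioObjectPropertyAddress *inAddresses,
                               void *inClientData)
{
    Float32 volume = 0.0f;
    UInt32 size = sizeof(volume);
    AudioObjectGetPropertyData(inObjectID, inAddresses, 0, NULL, &size, &volume);
    printf("volume is now %f\n", volume);
    return noErr;
}

void ObserveVolume(AudioObjectID deviceID)
{
    AudioObjectPropertyAddress address = {
        kAudioDevicePropertyVolumeScalar,
        kAudioDevicePropertyScopeOutput,
        1 // channel 1; use kAudioObjectPropertyElementMaster for master-channel devices
    };
    AudioObjectAddPropertyListener(deviceID, &address, VolumeListener, NULL);
}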

How do you set the input level (gain) on the built-in input (OSX Core Audio / Audio Unit)?

孤者浪人 submitted on 2019-12-03 03:22:12
I've got an OS X app that records audio data using an Audio Unit. The Audio Unit's input can be set to any available source with inputs, including the built-in input. The problem is, the audio that I get from the built-in input is often clipped, whereas in a program such as Audacity (or even QuickTime) I can turn down the input level and I don't get clipping. Multiplying the sample frames by a fraction, of course, doesn't work, because I get a lower volume, but the samples themselves are still clipped at time of input. How do I set the input level or gain for that built-in input to avoid the…
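One approach, sketched here as an assumption rather than a confirmed fix: set the HAL device's input volume scalar so the hardware attenuates before the samples are digitized, provided the device exposes a settable master input volume:

#include <CoreAudio/CoreAudio.h>

// Set the hardware input gain (0.0-1.0) on a device, e.g. the built-in input.
OSStatus SetInputGain(AudioObjectID deviceID, Float32 gain)
{
    AudioObjectPropertyAddress address = {
        kAudioDevicePropertyVolumeScalar,
        kAudioDevicePropertyScopeInput,
        kAudioObjectPropertyElementMaster
    };
    return AudioObjectSetPropertyData(deviceID, &address, 0, NULL,
                                      sizeof(gain), &gain);
}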

iOS FFT Draw spectrum

寵の児 submitted on 2019-12-03 03:20:08
Question: I've read these questions: "Using the Apple FFT and Accelerate Framework", "How do I set up a buffer when doing an FFT using the Accelerate framework?", and "iOS FFT Accelerate.framework draw spectrum during playback". They all describe how to set up an FFT with the Accelerate framework. With their help I was able to set up the FFT and get a basic spectrum analyzer. Right now, I'm displaying all the values I got from the FFT. However, I only want to show 10-15, or a variable number, of bars representing certain…
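A sketch of the usual reduction: collapse the FFT magnitude bins into a fixed number of display bars by averaging each bar's share of the bins (linear spacing here; logarithmic band edges are common for audio but left out for brevity):

// Average 'binCount' FFT magnitudes into 'barCount' display bars.
void BinsToBars(const float *magnitudes, int binCount,
                float *bars, int barCount)
{
    int binsPerBar = binCount / barCount; // assumes binCount >= barCount
    for (int bar = 0; bar < barCount; bar++) {
        float sum = 0.0f;
        for (int i = 0; i < binsPerBar; i++)
            sum += magnitudes[bar * binsPerBar + i];
        bars[bar] = sum / binsPerBar;
    }
}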

How to use kAudioUnitSubType_LowShelfFilter of kAudioUnitType_Effect, which controls bass, in Core Audio?

断了今生、忘了曾经 submitted on 2019-12-03 03:13:09
I'm back with one more question related to bass. I already posted the question "How Can we control bass of music in iPhone", but it did not get as much attention from you people as it should have. Now I have done some more searching and have read up on Core Audio. I got one piece of sample code which I want to share with you people; here is the link to download it: iPhoneMixerEqGraphTest. Have a look at it. What I have seen in this code is that the developer uses the preset iPod equalizer supplied by Apple. Let's see a code snippet too:

// iPodEQ unit
CAComponentDescription eq_desc(kAudioUnitType_Effect, …
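For direct bass control rather than the iPod EQ presets, a kAudioUnitSubType_LowShelfFilter effect can take the iPodEQ node's place in the graph. A sketch, assuming an already-opened AUGraph (parameter IDs are from AudioUnitParameters.h; the cutoff and gain values are illustrative):

#import <AudioToolbox/AudioToolbox.h>

// Add a low-shelf filter node to an opened AUGraph and boost the bass.
AudioUnit AddBassShelf(AUGraph graph)
{
    AudioComponentDescription desc = { kAudioUnitType_Effect,
        kAudioUnitSubType_LowShelfFilter, kAudioUnitManufacturer_Apple, 0, 0 };

    AUNode node;
    AudioUnit unit;
    AUGraphAddNode(graph, &desc, &node);
    AUGraphNodeInfo(graph, node, NULL, &unit);

    // Shelf cutoff in Hz; gain in dB (a negative value cuts the bass instead).
    AudioUnitSetParameter(unit, kAULowShelfParam_CutoffFrequency,
                          kAudioUnitScope_Global, 0, 80.0f, 0);
    AudioUnitSetParameter(unit, kAULowShelfParam_Gain,
                          kAudioUnitScope_Global, 0, 6.0f, 0);
    return unit;
}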

AVPlayer not synchronized

随声附和 submitted on 2019-12-03 03:10:19
I'm really out of ideas, so I'll have to ask you guys again... I'm building an iPhone application which uses three instances of AVPlayer. They all play at the same time, and it's very important that they do so. I used to run this code:

CMClockRef syncTime = CMClockGetHostTimeClock();
CMTime hostTime = CMClockGetTime(syncTime);
[self.playerOne setRate:1.0f time:kCMTimeInvalid atHostTime:hostTime];
[self.playerTwo setRate:1.0f time:kCMTimeInvalid atHostTime:hostTime];
[self.playerThree setRate:1.0f time:kCMTimeInvalid atHostTime:hostTime];

which worked perfectly. But a few days ago it just stopped…