core-audio

Core Audio render thread and thread signalling

删除回忆录丶 submitted on 2019-12-08 01:48:56

Question: Does iOS have any kind of very low-level condition primitive that does not involve locking? I am looking for a way to signal a waiting thread from within the Core Audio render thread without using locks. I was wondering whether something as low-level as a Mach system call might exist. Right now I have a Core Audio thread that uses a non-blocking, thread-safe message queue to send messages to another thread. The other thread then polls every 100 ms to see whether messages are available in the queue. But
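One commonly suggested option (a sketch, not taken from the question) is a semaphore: `DispatchSemaphore.signal()` does not block the caller, so the producing thread can post it after enqueuing a message, and the consumer blocks on `wait()` instead of polling every 100 ms. Whether `signal()` is strict enough for a hard real-time render callback is debated; the snippet below only illustrates the signalling pattern, with the render thread played by ordinary code and the demo array guarded by a queue that a real lock-free design would not need.

```swift
import Foundation

// Sketch: a non-blocking producer (standing in for the render thread) wakes a
// waiting consumer via a semaphore instead of having the consumer poll.
let guardQueue = DispatchQueue(label: "demo.guard")  // demo plumbing only
var messages: [String] = []
let available = DispatchSemaphore(value: 0)

let consumer = Thread {
    available.wait()                                  // sleeps until signalled; no polling
    guardQueue.sync { print(messages.removeFirst()) }
}
consumer.start()

// "Render thread" side: enqueue, then signal. signal() never blocks the caller.
guardQueue.sync { messages.append("note-on") }
available.signal()

Thread.sleep(forTimeInterval: 0.2)                    // let the consumer run before exit
```

On Darwin the same shape is available one level lower as Mach semaphores (`semaphore_signal` is wait-free on the signalling side), which is what several audio engines use from the render callback.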

Equalizer from audio data

ぃ、小莉子 submitted on 2019-12-08 00:53:53

Question: I have an MP3 file that I need to play while displaying an equalizer (like in the attached image). Playing it is not a problem, but I have no idea how to measure the power in dB of the currently playing sample as a function of frequency. What steps will get me the data for displaying the dB levels of the current sample? As I understand it, I need an array of power values at different frequencies; is that right? Here are examples of what I want to achieve: https://www.youtube.com/watch?v=7oeb-OIOe-0 https://www.youtube.com/watch?v=PwmUhTDr0Y0
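The usual pipeline (an assumed approach, not from the question) is: tap the player for PCM samples, run an FFT (on iOS typically Accelerate's vDSP), group the magnitude bins into a handful of display bands, and convert each band's power to decibels. The FFT itself is platform code, but the bin-to-band-to-dB step is plain math and can be sketched independently:

```swift
import Foundation

// Sketch: convert FFT magnitude bins into per-band dB values for an equalizer
// display. `magnitudes` stands in for the output of a real FFT (e.g. vDSP).
func bandLevelsInDB(magnitudes: [Double], bandCount: Int, floorDB: Double = -60) -> [Double] {
    let binsPerBand = magnitudes.count / bandCount
    return (0..<bandCount).map { band in
        let bins = magnitudes[band * binsPerBand ..< (band + 1) * binsPerBand]
        let rms = (bins.reduce(0) { $0 + $1 * $1 } / Double(bins.count)).squareRoot()
        // 20*log10 converts amplitude to dBFS; clamp silence to a display floor.
        return rms > 0 ? max(20 * log10(rms), floorDB) : floorDB
    }
}

// 8 bins -> 2 bands; first band is full scale (0 dB), second is 40 dB down.
let levels = bandLevelsInDB(magnitudes: [1, 1, 1, 1, 0.01, 0.01, 0.01, 0.01], bandCount: 2)
print(levels.map { ($0 * 10).rounded() / 10 })  // [0.0, -40.0]
```

Each element of the result drives one bar of the display; recomputing per audio buffer gives the animated meter in the linked videos.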

My custom Audio Unit opens in AU Lab but not Garageband or Ableton

旧城冷巷雨未停 submitted on 2019-12-07 21:54:54

Question: I created a filter AU in Xcode as a project for a signals class I'm taking, and I can open it just fine in AU Lab from my Components folder, but I can't seem to open it in a DAW. Can anyone help me get to the bottom of this? Could it be a setting in the .plist file? There is no error in the code, because the AU works perfectly in AU Lab. If anybody has an idea, help would be much appreciated, and I can supply any information you might need. Answer 1: I did the investigation I mentioned in my comments

Sending audio to a bluetooth enabled speaker, IOS

﹥>﹥吖頭↗ submitted on 2019-12-07 20:48:30

Question: I want to add a feature to my app where the user can choose to play the audio on a Bluetooth-enabled speaker. I have a Parrot Easydrive in my car, and this works for phone calls and, for example, the Dictafoon app, among others. I understand that I should use the Core Audio framework. When a Bluetooth device is connected, it is said to be easy to stream the audio to that connection. I am now looking for Core Audio sample code (or a book) where connecting and streaming to a Bluetooth device

HALOutput in AUGraph select and configure specific output device

十年热恋 submitted on 2019-12-07 19:03:20

Question: I successfully managed to build a complex AUGraph that I'm able to reconfigure on the fly, and all is working well. I'm now facing a wall with what seems a very simple task: selecting a specific output device. I'm able to get the device UID and ID thanks to this post: AudioObjectGetPropertyData to get a list of input devices (which I've modified to get output devices) and to the code below (I can't remember where I found it, unfortunately): - (AudioDeviceID) deviceIDWithUID:(NSString *)uid {

multi track mp3 playback for iOS application

偶尔善良 submitted on 2019-12-07 18:17:02

Question: I am building an application that plays back a song in a multi-track format (drums, vocals, guitar, piano, etc.). I don't need to do any fancy audio processing on each track; all I need is the ability to play, pause, and mute/unmute each track. I had been using multiple instances of AVAudioPlayer, but during device testing I noticed that the tracks play very slightly out of sync when they are first started. Furthermore, when I pause and play the tracks they continue
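A common fix for the start-of-playback drift (assumed here, not confirmed by this excerpt) is to schedule every `AVAudioPlayer` against one shared reference time with `play(atTime:)` rather than calling `play()` in a loop, so the loop's own execution time never offsets the tracks. The scheduling logic, with the Apple clock mocked as a plain `Double` so it runs anywhere:

```swift
import Foundation

// Sketch (hypothetical helper): pick one start time far enough in the future
// for the slowest player to arm, then give that same time to every track.
func commonStartTime(now: TimeInterval, prepareLatencies: [TimeInterval]) -> TimeInterval {
    now + (prepareLatencies.max() ?? 0) + 0.25   // extra margin for safety
}

let start = commonStartTime(now: 100.0, prepareLatencies: [0.25, 0.5])
// With real players: let start = players[0].deviceCurrentTime + headroom
//                    players.forEach { $0.play(atTime: start) }
print(start)  // 100.75
```

Because every track receives the identical absolute time, they begin on the same audio clock tick; the same idea applies on resume after a pause.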

Eliminating current playing track sound in recording track

倖福魔咒の submitted on 2019-12-07 15:22:20

Question: I am looking to use RemoteIO for audio recording and playback. My understanding of Core Audio is poor, so I followed the Amazing Audio open-source project. So far I am able to record and play with the same code; now I am trying to record through the microphone and play through the iPhone speaker, to avoid the two audio streams mixing while playing and recording simultaneously. I have seen many posts on Stack Overflow suggesting my question is a duplicate, but I couldn't find an exact answer to my problem. But

Which framework should I use to play an audio file (WAV, MP3, AIFF) in iOS with low latency?

爷,独闯天下 submitted on 2019-12-07 12:06:42

Question: iOS has various audio frameworks, from the higher level, which lets you simply play a specified file, to the lower level, which lets you get at the raw PCM data, and everything in between. For our app, we just need to play external files (WAV, AIFF, MP3), but we need to do so in response to a button press, and we need that latency to be as small as possible. (It's for cueing in live productions.) Now, AVAudioPlayer and the like work for playing simple file assets (via their URL), but its latency in
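Whichever framework is chosen, the latency-critical part is the same: decode each file into memory ahead of time and keep a player armed, so the button press triggers playback only, with no disk I/O or decoding on the hot path. With AVAudioEngine this means scheduling a preloaded `AVAudioPCMBuffer` on an `AVAudioPlayerNode`; the preload pattern itself, sketched with decoding mocked out:

```swift
import Foundation

// Sketch: preload/decode every cue at load time so the button handler does no
// disk or decode work. Decoding is mocked here; on iOS it would produce an
// AVAudioPCMBuffer to hand to AVAudioPlayerNode.scheduleBuffer(_:).
final class CueCache {
    private var buffers: [String: [Float]] = [:]

    func preload(name: String, decode: () -> [Float]) {
        buffers[name] = decode()               // expensive work happens here, once
    }

    // Button-handler path: a dictionary lookup and nothing else.
    func buffer(for name: String) -> [Float]? { buffers[name] }
}

let cache = CueCache()
cache.preload(name: "applause.wav") { [0.1, 0.2, 0.3] }  // stand-in for decoding
print(cache.buffer(for: "applause.wav")!.count)  // 3
```

The residual latency is then set by the audio session's I/O buffer duration rather than by file handling.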

How do I create an AUAudioUnit that implements multiple audio units?

核能气质少年 submitted on 2019-12-07 12:03:49

Question: In Apple's docs for creating an AUAudioUnit (here: https://developer.apple.com/documentation/audiotoolbox/auaudiounit/1387570-initwithcomponentdescription), they claim that "A single audio unit subclass may implement multiple audio units—for example, an effect that can also function as a generator, or a cluster of related effects." There are no examples of this that I can find online. Ideally your answer/solution would use Swift and AVAudioEngine, but I'd happily accept
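Apple ships no sample for this, but the mechanism behind the quoted claim is that `init(componentDescription:options:)` receives the `AudioComponentDescription` the host asked for, and the subclass branches on its `componentSubType` to install the matching DSP. A sketch of that dispatch pattern, with the Apple types mocked so it runs anywhere (only the field name `componentSubType` is the real API):

```swift
import Foundation

// Sketch: one "audio unit" type implementing several units by branching on the
// component subtype it was created with. In the real API this switch would live
// in an AUAudioUnit subclass's init(componentDescription:options:).
func fourCC(_ s: String) -> UInt32 {
    s.unicodeScalars.reduce(0) { ($0 << 8) | UInt32($1.value) }
}

struct ComponentDescription { var componentSubType: UInt32 }

struct MultiEffectUnit {
    let render: (Float) -> Float

    init?(description: ComponentDescription) {
        switch description.componentSubType {
        case fourCC("gain"): render = { $0 * 0.5 }   // behaves as a gain effect
        case fourCC("inv "): render = { -$0 }        // behaves as a phase inverter
        default: return nil                          // a subtype we don't implement
        }
    }
}

let unit = MultiEffectUnit(description: ComponentDescription(componentSubType: fourCC("gain")))!
print(unit.render(1.0))  // 0.5
```

Each variant still needs its own component registration (one entry per subtype pointing at the same class), so the host sees several distinct audio units backed by one implementation.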

Unable to Save Performance Parameters in AUSampler

耗尽温柔 submitted on 2019-12-07 11:37:42

Question: I'm trying to connect a performance parameter to control the amplifier gain of an AUSampler in AU Lab, but I'm unable to save the parameter. When I click over to another tab I get a message that says: "You have a partially created performance parameter. Any changes will be lost. Would you like to continue editing your performance parameter, or discard changes and leave the parameter editor?" Does anyone know how to finalize the parameter? I believe I have connected it properly, but it seems I'm missing the last step. I've also noticed there is a gear icon in Apple's docs that shows the performance