AudioToolbox

iOS: Is there a performance difference between using playInputClick vs the (1104) sound file with AudioToolbox?

半腔热情 submitted on 2019-12-06 10:25:54
Apple recommends using playInputClick in custom keyboards to simulate a click sound. AudioServicesPlaySystemSound(1104); is easier to implement, so my question becomes: does playInputClick provide better performance, or is it the same thing? Apple's reason for recommending this is probably not performance. AudioServicesPlaySystemSound(1104) will probably always play the same sound, whereas playInputClick may play a different sound in the future if Apple decides to change the input click sound. They are the same right now, but if that changes, your app will be the only one playing the old sound.
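For reference, playInputClick only produces sound when the sending view opts in via UIInputViewAudioFeedback inside an input view such as a keyboard extension. A minimal Swift sketch (KeyButton and keyPressed are illustrative names):

```swift
import UIKit

// A keyboard key view that opts in to system input clicks.
// playInputClick() is a no-op unless the sending view (or an
// ancestor) adopts UIInputViewAudioFeedback and lives inside an
// input view, e.g. a custom keyboard extension.
class KeyButton: UIButton, UIInputViewAudioFeedback {
    var enableInputClicksWhenVisible: Bool { true }

    @objc func keyPressed() {
        // Plays whatever the system's current input click is,
        // respecting the user's keyboard sound settings.
        UIDevice.current.playInputClick()
    }
}
```

Wire keyPressed to the button's touch event; the system then handles the sound, including any future changes to the click.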

MonoTouch: Playing sound

血红的双手。 submitted on 2019-12-06 06:48:25
Question: I am trying to play a short sound when the user taps a specific button, but I always receive "Object reference not set to an instance of an object", meaning null! I first tried MonoTouch.AudioToolbox.SystemSound: MonoTouch.AudioToolbox.AudioSession.Initialize(); MonoTouch.AudioToolbox.AudioSession.Category = MonoTouch.AudioToolbox.AudioSessionCategory.MediaPlayback; MonoTouch.AudioToolbox.AudioSession.SetActive(true); var t = MonoTouch.AudioToolbox.SystemSound.FromFile("click.mp3"
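The null reference is usually just a failed file lookup: FromFile returns null when the path doesn't resolve inside the app bundle. The same failure mode in terms of the underlying AudioToolbox call, sketched in Swift ("click"/"wav" are placeholder names; System Sound Services prefers short uncompressed CAF/WAV/AIFF over MP3):

```swift
import AudioToolbox
import Foundation

// Resolve the bundle URL first so a missing or misnamed file
// fails loudly instead of yielding a null sound object.
var soundID: SystemSoundID = 0
if let url = Bundle.main.url(forResource: "click", withExtension: "wav") {
    let status = AudioServicesCreateSystemSoundID(url as CFURL, &soundID)
    if status == kAudioServicesNoError {
        AudioServicesPlaySystemSound(soundID)
    }
} else {
    print("click.wav is not in the bundle — check Build Phases / Copy Bundle Resources")
}
```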

Is AudioServicesDisposeSystemSoundID required?

二次信任 submitted on 2019-12-06 03:54:18
I recently started working with the AudioToolbox framework and noticed a function called AudioServicesDisposeSystemSoundID(). Just to know: is it a memory leak not to call it after calling AudioServicesCreateSystemSoundID() to initialize my SystemSoundID? I am calling it like: AudioServicesCreateSystemSoundID((CFURLRef)filePath, &sound); where filePath is an NSURL and sound a SystemSoundID. Yes. Call it when you're done with sound. Otherwise you may leak any memory associated with sound (which can be significant for A/V files). Source: https://stackoverflow.com/questions
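The full lifecycle, sketched in Swift (the file name is a placeholder):

```swift
import AudioToolbox
import Foundation

// Create, play, and dispose a system sound. Disposing releases the
// memory AudioToolbox holds for the decoded audio; skipping it
// leaks that allocation for the lifetime of the process.
var soundID: SystemSoundID = 0
if let url = Bundle.main.url(forResource: "tap", withExtension: "caf") {
    AudioServicesCreateSystemSoundID(url as CFURL, &soundID)
    AudioServicesPlaySystemSound(soundID)
    // ...later, once the sound is no longer needed:
    AudioServicesDisposeSystemSoundID(soundID)
}
```

A common pattern is to create the sound once, reuse the same SystemSoundID for every play, and dispose it in deinit.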

Audio Output Routes for AirPlay

≡放荡痞女 submitted on 2019-12-06 03:53:52
Question: I have looked but can't find a way to access the audio output routes so I can detect whether audio is coming out via AirPlay. This is what I found in the documentation for iOS 5.0: kAudioSessionOutputRoute_AirPlay. Discussion: These strings are used as values for the kAudioSession_AudioRouteKey_Type key in the dictionary associated with the kAudioSession_AudioRouteKey_Outputs array. I can't find a way to get access to the kAudioSession_AudioRouteKey_Outputs array. Thanks. Answer 1: Even if Bassem
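On iOS 6 and later the kAudioSession_* C API was superseded by AVAudioSession, which exposes the same route information directly; assuming that API is an option, AirPlay detection reduces to a few lines:

```swift
import AVFoundation

// Check whether audio is currently routed to an AirPlay output.
// AVAudioSession.currentRoute replaces the old
// kAudioSession_AudioRouteKey_Outputs dictionary lookup.
func isAirPlayActive() -> Bool {
    let outputs = AVAudioSession.sharedInstance().currentRoute.outputs
    return outputs.contains { $0.portType == .airPlay }
}
```

Observing AVAudioSession.routeChangeNotification keeps the result current as routes come and go.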

Stopping and Quickly Replaying an AudioQueue

ⅰ亾dé卋堺 submitted on 2019-12-05 21:06:37
I've got an audio queue playing, stopping, and pausing correctly, but I'm finding that the AudioQueueStop() function takes a long time to execute. I'd like to stop and then immediately restart a playing audio queue, and was wondering what the quickest way to do so would be. In my project I keep multiple audio queues around to play specific sounds over and over. There is a situation where I must stop some of those sounds and then immediately play them, and many more, at once. It isn't so bad with only a couple of audio queues, but it starts taking a long time with more of them.
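One approach, sketched under the assumption that queue is a valid, started AudioQueueRef: AudioQueueStop(queue, true) halts playback at once but blocks until the queue has fully stopped, whereas pausing and resetting keeps the queue alive so it can restart almost immediately:

```swift
import AudioToolbox

// Cut a queue short without tearing it down. AudioQueueReset
// discards any buffered audio, so the queue restarts cleanly
// without the teardown cost of AudioQueueStop.
func restartQuickly(_ queue: AudioQueueRef) {
    AudioQueuePause(queue)
    AudioQueueReset(queue)          // drop queued buffers
    // ...re-enqueue fresh buffers here...
    AudioQueueStart(queue, nil)     // nil = start as soon as possible
}
```

Whether this beats a synchronous stop for a given buffer size is worth measuring; the return codes (OSStatus) of all three calls should be checked in real code.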

AudioServices.h not found in objective-C iOS project that includes AudioToolbox framework

╄→гoц情女王★ submitted on 2019-12-05 13:19:05
According to the Apple documentation, AudioServices.h should be part of the AudioToolbox framework. Even though I have added the AudioToolbox framework to my Xcode project, when I #import AudioServices I get the error: AudioServices.h file not found. This happens whether I type #import "AudioServices.h" or #import "AudioToolbox/AudioServices.h". Just in case, I tried removing and then re-adding the AudioToolbox framework, which had no effect. Could the AudioServices file be corrupted somehow? (If so, does anyone know where I could download another copy?) I am using Xcode 4.2, but as I am
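One thing worth checking: both imports above use quotes, but framework headers are normally imported with angle brackets, and the umbrella header already pulls in AudioServices.h. A sketch of the working imports (Swift shown; the Objective-C form is in the comment):

```swift
import AudioToolbox  // Objective-C: #import <AudioToolbox/AudioToolbox.h>

// With the umbrella import, the AudioServices functions resolve
// without importing AudioServices.h directly.
AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)
```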

iOS: Airplay picker MPVolumeView alternative

拈花ヽ惹草 submitted on 2019-12-05 12:19:22
I'm using MPVolumeView to pick the AirPlay device for AVPlayer AirPlay playback. Is there any non-private-API alternative for doing this, so I could provide my own UI controls for picking the AirPlay device? By referring to the API, I mean that all I need is: the ability to reroute audio to an AirPlay device's specific audio route, and to retrieve AirPlay device names (get all available audio routes, then get descriptions for the AirPlay ones). I know the AudioToolbox framework provides some additional API to deal with the audio session, but the only way I found to reroute audio is AVAudioSession's: - (BOOL
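If targeting iOS 11 or later is an option, AVKit's AVRoutePickerView is a supported, non-private way to present the AirPlay picker without MPVolumeView, though it still does not expose route names for a fully custom list:

```swift
import AVKit

// A system-provided AirPlay picker button (iOS 11+).
// The frame, tint, and containerView are illustrative.
let picker = AVRoutePickerView(frame: CGRect(x: 0, y: 0, width: 44, height: 44))
picker.activeTintColor = .systemBlue
// containerView.addSubview(picker)
```

The picker's appearance can be tinted, but the device list itself remains system UI; fully custom AirPlay route enumeration has no public API.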

Write array of floats to a wav audio file in swift

做~自己de王妃 submitted on 2019-12-05 09:36:38
I have this flow now: I record audio with AudioEngine, send it to an audio-processing library, and get an audio buffer back; then I really want to write it to a WAV file, but I'm totally confused about how to do that in Swift. I've tried this snippet from another Stack Overflow answer, but it writes an empty, corrupted file (load a pcm into a AVAudioPCMBuffer): //get data from library var len : CLong = 0 let res: UnsafePointer<Double> = getData(CLong(), &len) let bufferPointer: UnsafeBufferPointer = UnsafeBufferPointer(start: res, count: len) //transform it to Data let arrayDouble = Array
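One way that generally works, assuming mono Float32 samples and a known sample rate (both assumptions here): build an AVAudioPCMBuffer in a standard format, set frameLength (leaving it at 0 is a common cause of empty files), and hand it to AVAudioFile, which infers the container from the .wav extension:

```swift
import AVFoundation

// Minimal sketch: write a mono [Float] as a WAV file.
// sampleRate is an assumption; match it to the library's output.
func writeWav(_ samples: [Float], to url: URL, sampleRate: Double = 44_100) throws {
    let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1)!
    let file = try AVAudioFile(forWriting: url, settings: format.settings)
    let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                  frameCapacity: AVAudioFrameCount(samples.count))!
    buffer.frameLength = buffer.frameCapacity   // an empty file results if this stays 0
    let channel = buffer.floatChannelData![0]
    for (i, sample) in samples.enumerated() { channel[i] = sample }
    try file.write(from: buffer)
}
```

Double samples from the library would need converting to Float first (e.g. samples.map(Float.init)).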

Mixing down two files together using Extended Audio File Services

╄→尐↘猪︶ㄣ submitted on 2019-12-05 09:20:19
Question: I am doing some custom audio post-processing using audio units. I have two files that I am merging together (links below), but I am getting some weird noise in the output. What am I doing wrong? I have verified that before this step the two files (workTrack1 and workTrack2) are in a proper state and sound good, and no errors are hit in the process. Buffer processing code: - (BOOL)mixBuffersWithBuffer1:(const int16_t *)buffer1 buffer2:(const int16_t *)buffer2 outBuffer:(int16_t *
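"Weird noise" when mixing 16-bit samples is very often plain integer overflow: two loud samples summed exceed the Int16 range and wrap around. A Swift sketch of the usual fix, widening before the add and clamping on the way back (mirroring the Objective-C method above in spirit):

```swift
// Sum in 32 bits, then clamp back to the Int16 range
// [-32768, 32767] instead of letting the addition wrap.
func mix(_ a: [Int16], _ b: [Int16]) -> [Int16] {
    zip(a, b).map { s1, s2 in
        Int16(clamping: Int32(s1) + Int32(s2))
    }
}
```

Some mixers instead halve each input (or apply per-track gains summing to 1.0) to avoid clipping entirely, at the cost of overall level.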

AudioServicesPlaySystemSound not working on iPad device

大城市里の小女人 submitted on 2019-12-05 07:20:50
I'm in the early stages of developing my first iPad application, and for simplicity I have so far been using AudioServicesPlaySystemSound and the associated functions to play sounds. My code is based on the SoundEffect class from Apple's Metronome example. The specific symptom is that I can hear the sounds in the simulator but not on the device, though I have verified that I can hear sounds in other applications on the device. AudioServicesCreateSystemSoundID is returning valid sound identifiers, so it isn't anything as simple as the name of the sound file having different case, i.e. "sound.mp3"
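Two device-only differences are worth ruling out, since the simulator is forgiving about both: the hardware ring/silent switch mutes system sounds, and System Sound Services is documented to want short linear-PCM or IMA4 audio in .caf/.wav/.aif containers, so an .mp3 can fail quietly on hardware even when creation succeeds. A defensive sketch ("sound"/"caf" are placeholder names):

```swift
import AudioToolbox
import Foundation

// Use a supported container/codec and check the OSStatus; a valid
// sound ID alone does not guarantee the device will decode it.
var soundID: SystemSoundID = 0
if let url = Bundle.main.url(forResource: "sound", withExtension: "caf"),
   AudioServicesCreateSystemSoundID(url as CFURL, &soundID) == kAudioServicesNoError {
    AudioServicesPlaySystemSound(soundID)
}
```

Converting the asset with afconvert (e.g. to IMA4 in a .caf) is a quick way to test the format hypothesis.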