audiotoolbox

Xcode 6 - Press button to play sound

佐手、 submitted on 2019-12-11 03:51:48
Question: Looking for some help trying to get my button to play a sound. I've looked at over 50 tutorials, but all of them are outdated and don't work with the new Xcode. Does anyone have a good resource to learn from? I'm fairly new to Objective-C. I've looked through the Apple documentation for the audio frameworks and my brain cells committed suicide. Any help or pointers in the right direction would be greatly appreciated. Answer 1: In viewController.h: #import <AudioToolbox/AudioToolbox.h> #import <AVFoundation
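In current Xcode, the simplest route is AVAudioPlayer from AVFoundation. A minimal sketch — the file name "tap.wav" and the outlet wiring are assumptions, not from the original answer:

```swift
import UIKit
import AVFoundation

class ViewController: UIViewController {
    // Keep a strong reference, or the player is deallocated before it finishes.
    var player: AVAudioPlayer?

    @IBAction func playSound(_ sender: UIButton) {
        // "tap.wav" is a hypothetical file added to the app bundle.
        guard let url = Bundle.main.url(forResource: "tap", withExtension: "wav") else { return }
        player = try? AVAudioPlayer(contentsOf: url)
        player?.play()
    }
}
```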

Swift 3 LPCM Audio Recorder | Error: kAudioFileInvalidPacketOffsetError

房东的猫 submitted on 2019-12-11 02:18:30
Question: The recorder below works only the first time; if you try recording a second time, it gives the error kAudioFileInvalidPacketOffsetError when calling AudioFileWritePackets. Any idea why this is happening? Thank you in advance. Repository located here. Recorder: import UIKit import CoreAudio import AudioToolbox class SpeechRecorder: NSObject { static let sharedInstance = SpeechRecorder() // MARK:- properties @objc enum Status: Int { case ready case busy case error } internal struct
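A common cause of kAudioFileInvalidPacketOffsetError on the second run is a running packet counter that is never reset between recordings. A hedged sketch of the fix — the property and method names are illustrative, not taken from the linked repository:

```swift
// If the Int64 packet index handed to AudioFileWritePackets as
// inStartingPacket still holds the count from the previous recording,
// the first write of the new session points past the end of the new,
// empty file, and Audio Toolbox returns kAudioFileInvalidPacketOffsetError.
final class Recorder {
    private var currentPacket: Int64 = 0   // illustrative property name

    func start() {
        currentPacket = 0   // reset BEFORE creating the file and starting the queue
        // ... AudioFileCreateWithURL, AudioQueueNewInput, AudioQueueStart ...
    }

    // Inside the input callback, after a successful write:
    //   currentPacket += Int64(numPackets)
}
```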

AudioServices.h not found in objective-C iOS project that includes AudioToolbox framework

南楼画角 submitted on 2019-12-10 09:27:55
Question: According to the Apple documentation, AudioServices.h should be part of the AudioToolbox framework. Even though I have added the AudioToolbox framework to my Xcode project, when I #import AudioServices I get the error: AudioServices.h file not found. This happens whether I type #import "AudioServices.h" or #import "AudioToolbox/AudioServices.h". Just in case, I tried removing and re-adding the AudioToolbox framework, which had no effect. Could the AudioServices file be corrupted somehow
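Framework headers need angle brackets rather than quotes; quoted imports search the project's own directories first and can miss SDK headers entirely. A sketch of the two imports that normally resolve, assuming the AudioToolbox framework is linked:

```objc
// Use angle brackets for framework headers, not quotes:
#import <AudioToolbox/AudioToolbox.h>     // umbrella header, pulls in everything
#import <AudioToolbox/AudioServices.h>    // or just the AudioServices declarations
```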

iOS: Airplay picker MPVolumeView alternative

心不动则不痛 submitted on 2019-12-10 09:20:37
Question: I'm using MPVolumeView to pick an AirPlay device for AVPlayer AirPlay playback. Is there any non-private-API alternative for doing this, so that I could provide my own UI controls for picking an AirPlay device? By the API, I mean that all I need is: the ability to reroute audio to an AirPlay device's specific audio route, and to retrieve AirPlay device names (get all available audio routes, then get descriptions for the AirPlay routes). I know the AudioToolbox framework provides some
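There is no public API to enumerate or select individual AirPlay devices — that selection UI is exactly what MPVolumeView wraps. What AVAudioSession does expose is the currently active route, which at least lets you detect and name an AirPlay output; a sketch:

```swift
import AVFoundation

// Inspect the active route; an AirPlay output shows up with portType .airPlay.
let route = AVAudioSession.sharedInstance().currentRoute
for output in route.outputs where output.portType == .airPlay {
    print("Playing over AirPlay:", output.portName)
}
```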

AudioServicesPlaySystemSound not playing sounds

我是研究僧i submitted on 2019-12-10 03:21:01
Question: I am playing a small .wav file using AudioToolbox: AudioServicesPlaySystemSound(soundFileObject); But sometimes it does not play. What is the reason? Answer 1: If you're in the Simulator, make sure that in System Preferences → Sound, "Play user interface sound effects" is not turned off. If you're on the device, check that the ringer switch is not set to silent. Answer 2: Maybe it's this issue? Calling AudioServicesDisposeSystemSoundID() too early will stop the sound before it plays. Answer 3: System
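For comparison, a minimal sketch of the full create-then-play sequence — "tap.wav" is an assumed bundled file, and system sounds must be short, local, and uncompressed:

```swift
import AudioToolbox
import Foundation

var soundID: SystemSoundID = 0
// "tap.wav" is a hypothetical file in the app bundle; AudioServices only
// plays short (under 30 s) local sound files.
if let url = Bundle.main.url(forResource: "tap", withExtension: "wav") {
    AudioServicesCreateSystemSoundID(url as CFURL, &soundID)
    AudioServicesPlaySystemSound(soundID)
    // Don't call AudioServicesDisposeSystemSoundID(soundID) while the sound
    // is still needed; disposing too early silences it.
}
```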

Audio Session Services: kAudioSessionProperty_OverrideAudioRoute with different routes for input & output

南楼画角 submitted on 2019-12-09 17:21:18
Question: I'm messing around with Audio Session Services. I'm trying to control the audio routes by calling AudioSessionSetProperty with kAudioSessionProperty_OverrideAudioRoute set to kAudioSessionOverrideAudioRoute_Speaker. The problem is that it changes the route for both input and output. What I want is input from the headset's mic and output through the speakers. Any ideas? Thanks! Answer 1: You can do this in iOS 5 with the properties kAudioSessionProperty_InputSource and kAudioSessionProperty_OutputDestination. For
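On current iOS, the C Audio Session Services API is deprecated, and the same split is usually reached through AVAudioSession; a hedged sketch:

```swift
import AVFoundation

// .defaultToSpeaker sends play-and-record output to the built-in speaker
// instead of the receiver, while recording input can still come from an
// attached headset microphone. (Behavior with specific accessories varies.)
let session = AVAudioSession.sharedInstance()
try session.setCategory(.playAndRecord, options: [.defaultToSpeaker])
try session.setActive(true)
```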

Swift: Retrieve audio file marker list from url?

谁说胖子不能爱 submitted on 2019-12-08 15:53:55
Question: I just want to get a list of the markers in an audio file. I thought this would be an easy, common task, but I can barely find any example code or documentation, so I ended up with this: private func getMarkers(_ url: CFURL) -> AudioFileMarkerList { var file: AudioFileID? var size: UInt32 = 0 var markers = AudioFileMarkerList() AudioFileOpenURL(url, .readPermission, kAudioFileWAVEType, &file) AudioFileGetPropertyInfo(file!, kAudioFilePropertyMarkerList,
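The snippet above is cut off, and AudioFileMarkerList is a variable-length struct, so it cannot be read into a plain stack value. A hedged completion that queries the property size first and reads the list into raw memory (error handling abbreviated):

```swift
import AudioToolbox

func getMarkers(_ url: CFURL) -> [AudioFileMarker] {
    var file: AudioFileID?
    guard AudioFileOpenURL(url, .readPermission, kAudioFileWAVEType, &file) == noErr,
          let file = file else { return [] }
    defer { AudioFileClose(file) }

    // The marker list is variable-length: ask for its byte size first.
    var size: UInt32 = 0
    guard AudioFileGetPropertyInfo(file, kAudioFilePropertyMarkerList, &size, nil) == noErr,
          size > 0 else { return [] }

    let raw = UnsafeMutableRawPointer.allocate(
        byteCount: Int(size),
        alignment: MemoryLayout<AudioFileMarkerList>.alignment)
    defer { raw.deallocate() }
    guard AudioFileGetProperty(file, kAudioFilePropertyMarkerList, &size, raw) == noErr
        else { return [] }

    // mMarkers is the first element of a trailing array; copy them all out.
    let list = raw.assumingMemoryBound(to: AudioFileMarkerList.self)
    let count = Int(list.pointee.mNumberMarkers)
    return withUnsafeMutablePointer(to: &list.pointee.mMarkers) {
        $0.withMemoryRebound(to: AudioFileMarker.self, capacity: count) {
            Array(UnsafeBufferPointer(start: $0, count: count))
        }
    }
}
```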

AudioQueueNewInput callback latency

眉间皱痕 submitted on 2019-12-07 16:12:26
Question: Regardless of the size of the buffers I provide, the callback passed to AudioQueueNewInput fires at roughly the same time interval. For example: if you have 0.05-second buffers and are recording at 44 kHz, the callback is first called at about 0.09 seconds, a second call follows almost immediately (0.001 seconds later), and then you wait another ~0.09 seconds. If your buffer size is 0.025, you wait 0.09 seconds and then see 3 more buffers nearly instantly. Changing the sample rate increases the latency.
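That batching pattern tracks the hardware I/O cycle rather than the queue buffer size: the device delivers audio in fixed hardware-sized chunks, and smaller queue buffers simply fill in bursts from whatever has accumulated. The usual way to shrink the interval is to request a smaller I/O buffer duration before starting the queue; a sketch (the value is a hint the system may round):

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()
try session.setCategory(.playAndRecord)
try session.setPreferredIOBufferDuration(0.005)  // ask for ~5 ms hardware chunks
try session.setActive(true)
print("actual I/O buffer duration:", session.ioBufferDuration)
```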

Write array of floats to a wav audio file in swift

大憨熊 submitted on 2019-12-07 04:37:35
Question: My flow is: I record audio with AVAudioEngine, send it to an audio processing library, and get an audio buffer back; now I want to write it to a WAV file, but I'm totally confused about how to do that in Swift. I've tried this snippet from another Stack Overflow answer, but it writes an empty, corrupted file (load a pcm into a AVAudioPCMBuffer). //get data from library var len : CLong = 0 let res: UnsafePointer<Double> = getData(CLong(), &len ) let bufferPointer:
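A hedged sketch of one working route: copy the samples into an AVAudioPCMBuffer and let AVAudioFile write the container. The empty-file symptom often comes from never setting frameLength, which defaults to 0:

```swift
import AVFoundation

// Writes mono Float32 samples to `url`; AVAudioFile infers the container
// from the file extension, so the URL should end in ".wav".
func writeWav(_ samples: [Float], sampleRate: Double, to url: URL) throws {
    let format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                               sampleRate: sampleRate,
                               channels: 1, interleaved: false)!
    let file = try AVAudioFile(forWriting: url, settings: format.settings,
                               commonFormat: .pcmFormatFloat32, interleaved: false)
    let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                  frameCapacity: AVAudioFrameCount(samples.count))!
    buffer.frameLength = buffer.frameCapacity   // without this, 0 frames are written
    samples.withUnsafeBufferPointer { src in
        buffer.floatChannelData![0].update(from: src.baseAddress!, count: samples.count)
    }
    try file.write(from: buffer)
}
```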

Is AUGraph being deprecated on iOS? If so, when?

被刻印的时光 ゝ submitted on 2019-12-06 13:10:25
Question: I've heard rumblings that AUGraph is being deprecated on iOS, for example in this Twitter post: "@marcoarment Your comment on @atpfm about needing to rewrite your audio engine: b/c of the looming AUGraph deprecation, or something else?" Is AUGraph in fact being deprecated, and if so, when? Can somebody point me toward an official Apple document or announcement that clarifies this? Answer 1: Indeed, it will be deprecated, as stated in the WWDC talk. (Note: the picture is from the Core Audio mailing list
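AVAudioEngine is the migration path Apple points to for AUGraph-style node chains; a minimal sketch of the equivalent setup, minus the manual C plumbing:

```swift
import AVFoundation

// Nodes are attached and connected much like AUGraph nodes.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let reverb = AVAudioUnitReverb()
reverb.loadFactoryPreset(.mediumHall)
engine.attach(player)
engine.attach(reverb)
engine.connect(player, to: reverb, format: nil)
engine.connect(reverb, to: engine.mainMixerNode, format: nil)
try engine.start()
```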