core-audio

iPhone SDK: play a sound with Core Audio

北城余情 submitted on 2019-12-04 21:41:49

So far I've been using AudioServices to play sounds in my drum app, which caused horrible lag. I've been told that if I use Core Audio there will be no lag and the performance will be better. The same person also told me that AudioServices is only meant for playing short alert sounds. Any idea where I could start with Core Audio? If you have any code that's helpful too, but tutorials would be better. Thanks in advance! -DD

I really recommend Apple's documentation and the sample apps they provide: the Core Audio Overview, the Audio Unit Hosting Guide for iOS, and the example apps Audio Mixer (MixerHost) and oalTouch.

Unable to get correct frequency value on iphone

∥☆過路亽.° submitted on 2019-12-04 21:22:33

I'm trying to analyze frequency-detection algorithms on the iOS platform, and I found several implementations using FFT and Core Audio (example 1 and example 2). In both cases there is some imprecision in the detected frequency: (1) for A4 (440 Hz) it shows 441.430664 Hz; (1) for C6 (1046.5 Hz) it shows 1518.09082 Hz; (2) for A4 (440 Hz) it shows 440.72 Hz; (2) for C6 (1046.5 Hz) it shows 1042.396606 Hz. Why does this happen, and how can I avoid it and detect the frequency more accurately?

Resolution in the frequency domain is inversely related to the number of FFT bins. You need to either: increase the size of

Realtime audio processing without output

落花浮王杯 submitted on 2019-12-04 20:43:32

Question: I'm looking at this example http://teragonaudio.com/article/How-to-do-realtime-recording-with-effect-processing-on-iOS.html and I want to turn off the output. I tried changing kAudioSessionCategory_PlayAndRecord to kAudioSessionCategory_RecordAudio, but that is not working. I also tried to get rid of: if(AudioUnitSetProperty(*audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 1, &streamDescription, sizeof(streamDescription)) != noErr) { return 1; } because I want to get sound

How to use kAULowShelfParam_CutoffFrequency parameter of kAudioUnitSubType_LowShelfFilter which controls bass in Core Audio?

一笑奈何 submitted on 2019-12-04 20:21:49

You must have gone through this before coming to my question: How to use kAudioUnitSubType_LowShelfFilter of kAudioUnitType_Effect, which controls bass in Core Audio? I am slowly and steadily getting things right for the bass control of music, but I have not yet succeeded in my objective. I now know that I have to change kAULowShelfParam_CutoffFrequency to change the bass. The following code is what I was using 5 to 7 days ago; it plays music properly but doesn't change the bass properly. Have a look at this code snippet: - (void)awakeFromNib { printf("AUGraphController awakeFromNib\n");

Core Audio AudioFileReadPackets… looking for raw audio

陌路散爱 submitted on 2019-12-04 19:42:57

I'm trying to get raw audio data from a file (I'm used to seeing floating-point values between -1 and 1). I'm trying to pull this data out of the buffers in real time so that I can provide some kind of metering for the app. I'm basically reading the whole file into memory using AudioFileReadPackets. I've created a RemoteIO audio unit to do playback, and inside the playbackCallback I'm supplying the mData to the AudioBuffer so that it can be sent to the hardware. The big problem I'm having is that the data being sent to the buffers from my array of data (from AudioFileReadPackets) is UInt32... I

Is AUGraph being deprecated on iOS? If so, when?

风流意气都作罢 submitted on 2019-12-04 19:17:15

I've heard rumblings that AUGraph is being deprecated on iOS, for example in this Twitter post: "@marcoarment Your comment on @atpfm about needing to rewrite your audio engine: b/c of the looming AUGraph deprecation, or something else?" Is AUGraph in fact being deprecated, and if so, when? Can somebody point me toward an official Apple document or announcement that clarifies this?

Indeed, it will be deprecated, as stated in the WWDC talk. (Note: the picture is from the Core Audio mailing list.)

Source: https://stackoverflow.com/questions/44952582/is-augraph-being-deprecated-on-ios-if-so-when

J2ME/Blackberry - get audio signal amplitude level?

前提是你 submitted on 2019-12-04 19:09:25

Is it possible in J2ME to measure the signal amplitude of an audio recording made by a JSR-135 Player? I know I can access the buffer, but then what? The target model is the Bold 9000; the supported formats are PCM and AMR. Which format should I use? See also: Blackberry Audio Recording Sample Code, and "How To - Record Audio on a BlackBerry smartphone". Thank you!

Maksym Gontar: Get the raw PCM signal level. Use the menu and trackwheel to zoom in/out and move left/right within the graph. Audio format: raw 8000 Hz 16-bit mono PCM. Tested on a Bold 9000, RIM OS 4.6. The algorithm should work on any mobile where J2ME and PCM are supported, of course

AVAudioSession / Audio Session Services switching output

假装没事ソ submitted on 2019-12-04 18:42:19

Question: Okay, I have my AVAudioSession defined with the following (yes, a mix of the C and Obj-C calls). Also note that the app has the background audio mode, because if it is recording it must continue to do so while the app is in the background: [(AVAudioSession *)[AVAudioSession sharedInstance] setDelegate: self]; // Allow the app sound to continue to play when the screen is locked. [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil]; //Turn off automatic gain on the

Swift AVAudioEngine crash: player started when in a disconnected state

走远了吗. submitted on 2019-12-04 17:18:05

My code below is supposed to replay the chimes.wav file over and over again with a higher pitch, but it crashes with the error at the bottom. Can anyone find what is causing this error? import UIKit import AVFoundation class aboutViewController: UIViewController { var audioEngine: AVAudioEngine = AVAudioEngine() var audioFilePlayer: AVAudioPlayerNode = AVAudioPlayerNode() override func viewDidLoad() { super.viewDidLoad() // Do any additional setup after loading the view, typically from a nib. var timePitch = AVAudioUnitTimePitch() timePitch.pitch = 2000 let filePath: String = NSBundle

AudioFileReadPacketData returns -50 when passed valid file

喜夏-厌秋 submitted on 2019-12-04 17:10:47

I've spent some time attempting to debug this on my own, but I can't seem to get AudioFileReadPacketData to correctly read the passed-in data. This is based almost directly on Apple's AudioQueueServices guide. class SVNPlayer: SVNPlayback { var queue: AudioQueueRef? var audioFormat: AudioStreamBasicDescription! var playbackFile: AudioFileID? var packetDesc: AudioStreamPacketDescription! var isDone = false var packetPosition: Int64 = 0 var numPacketsToRead = UInt32() private let callback: AudioQueueOutputCallback = { aqData, inAQ, inBuffer in guard let userData = aqData else { return