core-audio

Using Core Audio to extract float from default line out sound device

拈花ヽ惹草 submitted on 2019-12-11 16:05:29
Question: I am in need of some assistance/guidance with using Core Audio to extract floats from the sound output device. I have read similar posts regarding the extraction of floats from AIFF. My end goal is something along the lines of:

1. iTunes is playing a song
2. A C/C++ program using Core Audio extracts float values from the sound device (in real time)
3. Use the resulting float vector to perform a Fourier transform on an array of floats (probably using vDSP from Apple's Accelerate library) - this part I have

Halting a playing sound sample on iPhone using AudioServices

[亡魂溺海] submitted on 2019-12-11 15:59:35
Question: I am implementing a sound effect that plays while a user is dragging a UISlider. Here is the IBAction, called by the UISlider's Value Changed event:

-(IBAction)playTone4 {
    AudioServicesPlaySystemSound(soundID4);
}

I would like the sound to halt when the user is not dragging the slider but has not released it. Is there a way to do that? There doesn't seem to be an AudioServicesStopSystemSound() function.

Answer 1: System sounds cannot be stopped. See the iPhone Programming Guide: section

Why can't I change the number of elements / buses in the input scope of AU multi channel mixer?

核能气质少年 submitted on 2019-12-11 14:07:50
Question: UPDATE: I'm changing my code to illustrate the issue in a more streamlined way. Also, I had a little bug which, while not detracting from the problem, did add some confusion. I'm instantiating a Multi Channel Mixer AU in iOS (kAudioUnitSubType_MultiChannelMixer) and I do the following:

OSStatus status = noErr;
// Set component type:
AudioComponentDescription cd = {0};
cd.componentType = kAudioUnitType_Mixer;
cd.componentSubType = kAudioUnitSubType_MultiChannelMixer;
cd.componentManufacturer =

Simple AudioQueue sine wave—why the distortion?

☆樱花仙子☆ submitted on 2019-12-11 12:33:50
Question: As a learning exercise, I'm using an AudioQueue to generate and play a 300 Hz sine wave. (I understand there are a variety of tools to generate and play audio, but yes, this is just to build up my Core Audio chops, and this task is all about the AudioQueue.) The wave plays, but with distortion. Recording and plotting the sound shows that there is some distortion at the boundary between buffers (every half second), in addition to other short bursts of distortion here and there. I've included my

What is the best practice for updating UI from Core Audio Callback?

两盒软妹~` submitted on 2019-12-11 11:21:23
Question: I am currently wrapping my head around Core Audio and I was hit with the question of how to update the GUI from the AudioQueueInputCallback. To begin with, I want to update a label with the level meter reading from the mic. In my code I am storing the current level meter value in a struct on each callback:

func MyAudioQueueInputCallback(inUserData: UnsafeMutablePointer<Void>,
                               inAQ: AudioQueueRef,
                               inBuffer: AudioQueueBufferRef,
                               inStartTime: UnsafePointer<AudioTimeStamp>,
                               var inNumberPacketDesc:

Play sound without latency iOS

女生的网名这么多〃 submitted on 2019-12-11 10:21:51
Question: I can't find a method to play sound with really low latency. I tried the AVFoundation audio player: huge latency, around 500 ms. So I tried creating a system sound, also without luck: latency around 200 ms. That's not huge, but it's not useful for me; I need 50 ms max. To be clear, my sound sample is a pure tone without leading silence.

SystemSoundID cID;
BOOL spinitialized;

-(IBAction)doInit {
    if (spinitialized) {
        AudioServicesPlaySystemSound(cID);
        return;
    }
    NSURL *uref = [[NSURL alloc] initFileURLWithPath: [NSString

AudioFileOpenURL returns -43 on an existing file

不想你离开。 submitted on 2019-12-11 10:14:58
Question: I have a step in my application where a user repeatedly hears three spoken digits. If I leave this step running for a while (for certain undefined values of "while"), my debug logs show this (irrelevant log entries removed):

2010-03-01 13:44:21.283 iPhoneHearChk[1236:207] AudioFileOpenURL returned 0 (for <file://localhost/var/mobile/Applications/3A28F975-EAD5-4A5B-AFE6-FA1C6EE95732/iPhoneHearChk.app/5b3.ima4>)
2010-03-01 13:44:35.493 iPhoneHearChk[1236:207] AudioFileOpenURL returned 0 (for

AVMIDIPlayer init with MusicSequence

情到浓时终转凉″ submitted on 2019-12-11 09:23:59
Question: AVMIDIPlayer has this initializer:

initWithData:soundBankURL:error:

This is in addition to an initializer for reading from a standard MIDI file. The data is NSData, which I assume is in standard MIDI file format. So, how to get that data? Well, the current way to create a MIDI sequence is the AudioToolbox MusicSequence. (AVAudioEngine even has a member of this type.) MusicSequence has this converter:

MusicSequenceFileCreateData(MusicSequence inSequence,
                            MusicSequenceFileTypeID inFileType

How to read/assign the elements of a pointer that points to an array of structures in C++

瘦欲@ submitted on 2019-12-11 09:18:14
Question: In iOS Core Audio there is the API AudioFileWritePackets, which has an inPacketDescriptions parameter defined as "A pointer to an array of packet descriptions for the audio data." It looks like this in the method signature:

const AudioStreamPacketDescription *inPacketDescriptions,

Now the struct AudioStreamPacketDescription is defined as follows:

struct AudioStreamPacketDescription {
    SInt64 mStartOffset;
    UInt32 mVariableFramesInPacket;
    UInt32 mDataByteSize;
};
typedef struct

iOS Audio Unit playback with constant noise

元气小坏坏 submitted on 2019-12-11 08:56:57
Question: I am using an Audio Unit for audio playback. I downloaded the tone generator from http://cocoawithlove.com/2010/10/ios-tone-generator-introduction-to.html and tried to play around with it. For some reason I need to use ulaw instead of linear PCM. Here is my audio format setup:

AudioStreamBasicDescription streamFormat;
streamFormat.mSampleRate = 8000;
streamFormat.mFormatID = kAudioFormatULaw;
streamFormat.mFormatFlags = 0;
streamFormat.mFramesPerPacket = 1;
streamFormat.mBytesPerFrame = 2;