core-audio

Join multiple audio files into one

*爱你&永不变心* submitted on 2019-12-01 10:58:23
I'm trying to allow selection of words (MP3 audio samples) and add them to a sentence which, on pressing play, plays them all in sequence, with the option of saving that one combined file.

MP3 is a stream format, meaning it doesn't have a bunch of metadata at the front or end of the file. While this has a lot of downsides, one of the upsides is that you can concatenate MP3 files together into a single file and it'll play. This is pretty much what you're doing by concatenating into an NSMutableData, the downside of which is that you might run out of memory. Another option would be to build up the file on

OSX programmatically invoke sound level graphic

China☆狼群 submitted on 2019-12-01 09:28:51
I have an app which can change the volume under OS X. What it lacks is the visual feedback provided when one presses the sound up/down keys. Does anyone know how to programmatically invoke that behavior? Thanks.

Here's a little code from George Warner and Casey Fleser that does this trick. Think carefully about whether this is really the way you want to do things.

// Save as sound_up.m
// Compile: gcc -o sound_up sound_up.m -framework IOKit -framework Cocoa
#import <Cocoa/Cocoa.h>
#import <IOKit/hidsystem/IOHIDLib.h>
#import <IOKit/hidsystem/ev_keymap.h>
static io_connect_t get_event_driver(void) {

iOS audio manipulation - play local .caf file backwards

谁说我不能喝 submitted on 2019-12-01 08:49:23
I want to load a local .caf audio file and reverse the audio (play it backwards). I've gathered from posts like this that I basically need to flip an array of buffer data. However, I'm not sure how to access this buffer data from a given audio file. I have a little experience playing sounds back with AVAudioPlayer and ObjectAL (an Objective-C OpenAL library), but I don't know how to access something lower-level like this buffer data array. Could I please get an example of how I would go about getting access to that array?

William Power: Your problem reduces to the same problem described here,

AVAssetReader to AudioQueueBuffer

南楼画角 submitted on 2019-12-01 08:42:37
Question: Currently, I'm doing a little test project to see if I can get samples from an AVAssetReader to play back using an AudioQueue on iOS. I've read this ( Play raw uncompressed sound with AudioQueue, no sound ) and this ( How to correctly read decoded PCM samples on iOS using AVAssetReader -- currently incorrect decoding ), which both actually did help. Before reading, I was getting no sound at all. Now, I'm getting sound, but the audio is playing SUPER fast. This is my first foray into audio

Change OS X system volume programmatically

醉酒当歌 submitted on 2019-12-01 08:28:11
How can I change the volume programmatically from Objective-C? I found this question, Controlling OS X volume in Snow Leopard, which suggests doing:

Float32 volume = 0.5;
UInt32 size = sizeof(Float32);
AudioObjectPropertyAddress address = {
    kAudioDevicePropertyVolumeScalar,
    kAudioDevicePropertyScopeOutput,
    1 // Use values 1 and 2 here, 0 (master) does not seem to work
};
OSStatus err;
err = AudioObjectSetPropertyData(kAudioObjectSystemObject, &address, 0, NULL, size, &volume);
NSLog(@"status is %i", err);

This does nothing for me, and prints out status is 2003332927. I also tried using values

iOS - updating the media play/pause state in the multitasking bar

三世轮回 submitted on 2019-12-01 06:03:50
Question: We have a working app that uses an AU graph (Core Audio API) to play audio. The graph is always running, and the play/pause state of the various source material is managed in the graph rendering callback functions. We successfully respond to UIEventTypeRemoteControl events, and we successfully update the lock screen with the metadata for the currently playing content using MPNowPlayingInfoCenter. The one missing piece is to update the state of the play/pause button in the iOS multitasking

Reading audio buffer data with AudioQueue

吃可爱长大的小学妹 submitted on 2019-12-01 05:43:51
I am attempting to read audio data via AudioQueue. When I do so, I can verify that the bit depth of the file is 16-bit. But when I get the actual sample data, I'm only seeing values from -128 to 128. I'm also seeing suspicious-looking interleaved data, which makes me pretty sure that I'm just not reading the data correctly. To begin with, I can verify that the source file is a 44,100 Hz, 16-bit, mono WAV file. My buffer is allocated thusly:

char *buffer = NULL;
buffer = malloc(BUFFER_SIZE);
assert(buffer);

All the relevant values are set and used in: AudioFileReadPackets(inAudioFile, false,

How to handle UnsafeMutablePointer correctly

冷暖自知 submitted on 2019-12-01 04:11:54
I am a little confused: when do I have to call free, and when destroy/dealloc? I am working on a short code snippet while learning Core Audio. I thought that if I call UnsafeMutablePointer<Type>.alloc(size) then I should call destroy and dealloc, but if I use malloc() or calloc() I am supposed to call free(). In this example from Learning Core Audio, the following code snippet makes me wonder:

var asbds = UnsafeMutablePointer<AudioStreamBasicDescription>.alloc(Int(infoSize))
audioErr = AudioFileGetGlobalInfo(kAudioFileGlobalInfo_AvailableStreamDescriptionsForFormat, UInt32(sizeof(fileTypeAndFormat