core-audio

Help Fix Memory Leak release

我只是一个虾纸丫 submitted on 2019-12-02 18:03:14
Question:

#import "VTM_AViPodReaderViewController.h"
#import <AudioToolbox/AudioToolbox.h> // for the core audio constants

#define EXPORT_NAME @"exported.caf"

@implementation VTM_AViPodReaderViewController

@synthesize songLabel;
@synthesize artistLabel;
@synthesize sizeLabel;
@synthesize coverArtView;
@synthesize conversionProgress;

#pragma mark init/dealloc
- (void)dealloc {
    [super dealloc];
}

#pragma mark vc lifecycle
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
}

#pragma
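The dealloc above only calls [super dealloc], which is the likely leak: under manual reference counting, every retained property synthesized in the class must be released first. A minimal sketch of the fix, assuming the @synthesize'd properties above are declared retain:

```objc
// Sketch: release every retained property before calling through to super.
// (Assumes these properties are declared with the retain attribute.)
- (void)dealloc {
    [songLabel release];
    [artistLabel release];
    [sizeLabel release];
    [coverArtView release];
    [conversionProgress release];
    [super dealloc];
}
```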

iOS Audio Units : When is usage of AUGraph's necessary?

五迷三道 submitted on 2019-12-02 17:46:11
I'm totally new to iOS programming (I'm more of an Android guy..) and have to build an application dealing with audio DSP. (I know it's not the easiest way to approach iOS dev ;) ) The app needs to be able to accept input both from : 1- the built-in microphone 2- the iPod library Then filters may be applied to the input sound and the result is to be output to : 1- the speaker 2- a recorded file My question is the following : is an AUGraph necessary in order, for example, to apply multiple filters to the input, or can these different effects be applied by processing the samples with different

iOS 7 SDK not abiding background audio

断了今生、忘了曾经 submitted on 2019-12-02 16:57:17
I have done a lot of research, both on Google and StackOverflow. All the answers I found do not work in iOS 7. I started writing a fresh app in iOS 7 SDK with Xcode 5. All I'm trying to do is play audio in the app from a file stored in the app bundle (not from the Music library). I want the audio to play in the background and be controllable when the screen is locked (in addition to Control Center). I set the APPNAME-Info.plist key UIBackgroundModes to audio. Nothing is handled in the app delegate; everything is done inside the ViewController @interface ViewController : UIViewController
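The plist side of the setup described above looks like this (fragment of APPNAME-Info.plist):

```xml
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
```

The plist key alone is not sufficient on iOS 7: the audio session must also be put in the playback category (e.g. setting AVAudioSession's category to AVAudioSessionCategoryPlayback and activating the session) before background playback and lock-screen controls work.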

iOS FFT Draw spectrum

左心房为你撑大大i submitted on 2019-12-02 16:52:16
I've read these questions: Using the Apple FFT and Accelerate Framework; How do I set up a buffer when doing an FFT using the Accelerate framework?; iOS FFT Accerelate.framework draw spectrum during playback. They all describe how to set up an FFT with the Accelerate framework. With their help I was able to set up the FFT and get a basic spectrum analyzer. Right now, I'm displaying all the values I get from the FFT. However, I only want to show 10-15, or a variable number, of bars representing certain frequencies, just like the iTunes or WinAmp level meter. 1. Do I need to average magnitude values from a range

Linking against Apple frameworks with gcc

家住魔仙堡 submitted on 2019-12-02 16:43:36
I've created some wrapper functions that encapsulate working with CoreAudio, and the goal is to create a C library that I can use with some command line C++ tools. So far things are working well. I took a sample project, modified it, and it builds and runs in Xcode. I'd like to skip Xcode altogether and build the library with gcc and a Makefile. How can I link against an Apple framework? Are frameworks just shared libraries that I could include in the -l and -L options of gcc? Here's an example: gcc -framework CoreServices -o test test.c From the man page of Apple's gcc (i686-apple-darwin10
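Frameworks are not passed with -l/-L; Apple's compiler driver takes them with the -framework flag (and -F to add a non-standard framework search path). A sketch of a Makefile along those lines, with hypothetical file names:

```make
# Hypothetical Makefile -- builds a wrapper object and a test tool outside
# Xcode. Frameworks are passed with -framework, not -l/-L; use -F <dir>
# for frameworks outside the default search paths.
CC      = gcc
CFLAGS  = -Wall -O2
LDFLAGS = -framework CoreAudio -framework AudioToolbox -framework CoreServices

test: test.c mywrapper.o
	$(CC) $(CFLAGS) -o test test.c mywrapper.o $(LDFLAGS)

mywrapper.o: mywrapper.c mywrapper.h
	$(CC) $(CFLAGS) -c mywrapper.c
```

Internally a framework bundle does contain a dynamic library plus its headers, which is why linking works; the -framework flag just locates both at once.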

iOS: Audio Units vs OpenAL vs Core Audio

早过忘川 submitted on 2019-12-02 16:13:46
Could someone explain to me how OpenAL fits in with the schema of sound on the iPhone? There seem to be APIs at different levels for handling sound. The higher-level ones are easy enough to understand. But my understanding gets murky towards the bottom. There is Core Audio, Audio Units, OpenAL. What is the connection between these? Is OpenAL the substratum, upon which rests Core Audio (which contains as one of its lower-level objects Audio Units)? OpenAL doesn't seem to be documented by Xcode, yet I can run code that uses its functions. This is what I have figured out: The substratum is Core

implicit conversion of an Objective-C pointer to 'void *' is disallowed with ARC

感情迁移 submitted on 2019-12-02 15:48:40
What does this mean and what alternative do I have? implicit conversion of an Objective-C pointer to 'void *' is disallowed with ARC I am porting an Xcode 3 project to iOS 5 which uses AudioSessionInitialize like this: AudioSessionInitialize(NULL, NULL, NULL, self); where self here is a ViewController. Joshua Weinberg: You can't do implicit casts to void* anymore; AudioSessionInitialize(NULL, NULL, NULL, objc_unretainedPointer(self)); should do the trick. EDIT: Historical point, the answer above was from before the __bridge casts were finalized. In modern code the correct answer is the one provided by
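The modern __bridge form the edit refers to makes the cast explicit without transferring ownership, which matches what the old implicit cast did here:

```objc
// ARC requires an explicit bridge cast. __bridge passes the pointer through
// with no ownership transfer -- self stays managed by ARC.
AudioSessionInitialize(NULL, NULL, NULL, (__bridge void *)self);
```

The other bridge variants (__bridge_retained / __bridge_transfer) move ownership across the ARC boundary and are only needed when the C side is expected to hold onto the pointer.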

How do I synthesize sounds with CoreAudio on iPhone/Mac

生来就可爱ヽ(ⅴ<●) submitted on 2019-12-02 14:32:38
I'd like to play a synthesised sound on an iPhone. Instead of using a pre-recorded sound and SystemSoundID to play an existing binary, I'd like to synthesise it. Partially, that's because I want to be able to play the sound continuously (e.g. while the user's finger is on the screen) instead of as a one-off sample. If I wanted to synthesise Middle A+1 (A4, 440 Hz), I can calculate a sine wave using sin(); what I don't know is how to arrange those bits into a packet which CoreAudio can then play. Most of the tutorials that exist on the net are concerned with simply playing existing

How to program a real-time accurate audio sequencer on the iphone?

半世苍凉 submitted on 2019-12-02 14:18:57
I want to program a simple audio sequencer on the iPhone but I can't get accurate timing. Over the last few days I tried every audio technique available on the iPhone, from AudioServicesPlaySystemSound and AVAudioPlayer and OpenAL to AudioQueues. In my last attempt I tried the CocosDenshion sound engine, which uses OpenAL and allows sounds to be loaded into multiple buffers and then played whenever needed. Here is the basic code:

init:

int channelGroups[1];
channelGroups[0] = 8;
soundEngine = [[CDSoundEngine alloc] init:channelGroups channelGroupTotal:1];
int i = 0;
for (NSString *soundName in

Bluetooth headphone music quality deteriorates when launching iOS simulator

我只是一个虾纸丫 submitted on 2019-12-02 13:50:14
The situation goes a little something like this: I am programming in Xcode whilst concurrently listening to music on my Bluetooth headphones... you know, to block out the world. Then I go to launch my app in the iOS Simulator and BOOM, all of a sudden my crystal-clear music becomes garbled and super low quality, like it is playing in a bathtub 2 blocks away... in the 1940s. Note: the quality deterioration does NOT occur if I am playing music on my laptop or cinema display and I launch the sim. It seems to be exclusively a Sim -> Bluetooth issue. The problem is more than just annoying. Because often