core-audio

iPhone SDK: AVAudioRecorder metering — how to change peakPowerForChannel from decibel into percentage?

拥有回忆 submitted on 2019-11-28 20:54:12
The AVAudioRecorder in the iPhone SDK can be used to get the peak and average power for a channel, in decibels. The range is between -160 dB and 0 dB. What is the calculation used to convert this into a scale between 0 and 10, or something similar, that can be used for an audio level meter?

The range is from -160 dB to 0 dB. You probably want to display it in a meter that goes from -90 dB to 0 dB. Displaying it as decibels is actually more useful than as a linear audio level, because decibels are a logarithmic scale, which means they more closely approximate how loud we perceive a sound. That

How to get array of float audio data from AudioQueueRef in iOS?

天大地大妈咪最大 submitted on 2019-11-28 19:40:41
I'm working on getting audio into the iPhone in a form where I can pass it to a (C++) analysis algorithm. There are, of course, many options: the AudioQueue tutorial at trailsinthesand gets things started. The audio callback, though, gives an AudioQueueRef, and I'm finding Apple's documentation thin on this side of things. There are built-in methods to write to a file, but nothing that lets you actually peer inside the packets to see the data. I need data. I don't want to write anything to a file, which is what all the tutorials, and even Apple's convenience I/O objects, seem to be aiming at. Apple's
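
Inside an AudioQueue input callback you receive a buffer whose `mAudioData` pointer holds the raw PCM. Assuming you recorded 16-bit signed integer samples (an assumption; it depends on the format you set up the queue with), getting floats for an analysis routine is just a scale. A sketch, with my own function name:

```c
#include <stddef.h>
#include <stdint.h>

/* Convert interleaved 16-bit signed PCM (as found at
 * AudioQueueBuffer.mAudioData, if the queue was set up for SInt16)
 * into floats in roughly [-1.0, 1.0) for analysis code. */
void pcm16_to_float(const int16_t *in, float *out, size_t n_samples) {
    for (size_t i = 0; i < n_samples; i++)
        out[i] = in[i] / 32768.0f;   /* map [-32768, 32767] -> ~[-1, 1) */
}
```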

How to make a simple EQ AudioUnit (bass, mid, treble) with iOS?

我只是一个虾纸丫 submitted on 2019-11-28 17:40:39
Does anyone know how to make a simple EQ audio unit (3 bands: low, mid, high) with iOS? I know how to add an iPod EQ Audio Unit to my AU Graph, but it only gives you access to presets, and I need proper control of the EQ. I've looked around for some tutorials or explanations, but no luck. Thanks. André

The iPhone doesn't exactly support custom AudioUnits. Or, more precisely, it doesn't allow you to register an AudioUnit's identifier, so you can't load it in an AUGraph. You can, however, register a render callback, get raw PCM data, and process it accordingly. This is how I've implemented effect

Tone Generation in Cocoa Touch

雨燕双飞 submitted on 2019-11-28 17:40:30
I need to generate a tone whose frequency and waveform I can manipulate. The overall goal is to create a basic piano. Does anyone know how I can achieve this? My development platform is the iPhone 2.x.

You could always start with sin waves. :-)

#include <cmath>

typedef double Sample;
typedef double Time;

class MonoNote {
protected:
    Time start, duration;
    virtual void internalRender(double now, Sample *mono) = 0;
public:
    MonoNote(Time s, Time d) : start(s), duration(d) {}
    virtual ~MonoNote() {}
    void render(double now, Sample *mono) {
        if (start <= now && now < start + duration) {
            internalRender(now,

MPMediaItem and iTunes Match

荒凉一梦 submitted on 2019-11-28 16:29:15
I have an app that uses the iPod Library API to access the song database in iOS. With the release of iTunes Match, any song which is not on the device will fail to load. Is there a way I can request that the song be downloaded? Perhaps using the new iCloud API? Edit: To be clear, I am not asking how to download songs with iTunes Match using the iPhone. The iOS SDK allows access to the iPod Library via MPMediaQuery/MPMediaItems. On an iOS device with iTunes Match enabled, songs which are in

Example of using Audio Queue Services

╄→尐↘猪︶ㄣ submitted on 2019-11-28 16:07:28
I am seeking an example of using Audio Queue Services. I would like to create a sound using a mathematical equation and then hear it.

Here's my code for generating sound from a function. I'm assuming you know how to use AudioQueue services, set up an AudioSession, and properly start and stop an audio output queue. Here's a snippet for setting up and starting an output AudioQueue:

// Get the preferred sample rate (8,000 Hz on iPhone, 44,100 Hz on iPod touch)
size = sizeof(sampleRate);
err = AudioSessionGetProperty(kAudioSessionProperty_CurrentHardwareSampleRate, &size, &sampleRate);
if (err !=
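
The "sound from a mathematical equation" step happens in the output callback, where you evaluate the function at each frame's time and write the result into the buffer before enqueueing it. A minimal sketch of that step (all names are mine; in the real callback the destination is the `mAudioData` of an AudioQueueBufferRef):

```c
#include <stdint.h>

typedef double (*wave_fn)(double t);   /* returns a value in [-1, 1] at time t */

/* Evaluate an arbitrary function of time and render it as 16-bit PCM,
 * clamping to full scale, as an output callback would before enqueueing. */
void render_to_int16(int16_t *out, int n_frames, double sample_rate,
                     double start_time, wave_fn f) {
    for (int i = 0; i < n_frames; i++) {
        double v = f(start_time + i / sample_rate);
        if (v > 1.0)  v = 1.0;
        if (v < -1.0) v = -1.0;
        out[i] = (int16_t)(v * 32767.0);
    }
}

/* Example equation: a 220 Hz square wave. */
double square_220(double t) {
    double phase = t * 220.0 - (long)(t * 220.0);
    return phase < 0.5 ? 1.0 : -1.0;
}
```

Between callbacks you carry `start_time` forward by `n_frames / sample_rate` so the waveform stays continuous across buffers.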

How to use the CoreAudio API in Swift

 ̄綄美尐妖づ submitted on 2019-11-28 11:47:51
I am in the process of migrating my streaming audio engine to Swift. I am finding it difficult to use the C Audio API in Swift. I have a problem with the AudioFileStreamOpen API, which takes 2 C functions as parameters. I don't know how to use this API in Swift.

AudioFileStreamOpen(self as UnsafePointer<()>, propertyProc, packetProc, kAudioFileMP3Type, audioStreamId)

I have defined the callback method as below for this API, but I am getting a compilation error.

func propertyProc(inClientData: UnsafePointer<()>, inFileStreamId: AudioFileStreamID, inPropertyId: AudioFileStreamPropertyID, ioFlags:

Recording from RemoteIO: resulting .caf is pitch shifted slower + distorted

徘徊边缘 submitted on 2019-11-28 11:37:21
So I've cobbled together some routines for recording audio based on some posts here. The posts I've referenced are here and here, along with reading the sites they reference. My setup: I have an existing AUGraph (several AUSamplers -> Mixer -> RemoteIO). The AUSamplers are connected to tracks in a MusicPlayer instance. That all works fine, but I want to add recording to it. Recording is working, but the resulting .caf is pitch/tempo shifted slower and has bad sound quality. Must be something wrong with the format I am specifying? Can someone eyeball this and tell me where I am setting the format
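
A recording that plays back slower and pitched down usually means the sample rate (or the bytes-per-frame arithmetic) written into the file's format doesn't match what RemoteIO is actually producing. For interleaved linear PCM the AudioStreamBasicDescription fields are fully determined by rate, bit depth, and channel count. The sketch below uses a plain-C mirror of those fields just to show the arithmetic that must hold; in a real project you would fill the actual struct from `<CoreAudio/CoreAudioTypes.h>`:

```c
#include <stdint.h>

/* Plain-C mirror of the LPCM-relevant AudioStreamBasicDescription
 * fields (illustration only, not the real CoreAudio struct). */
typedef struct {
    double   mSampleRate;
    uint32_t mBitsPerChannel;
    uint32_t mChannelsPerFrame;
    uint32_t mBytesPerFrame;
    uint32_t mFramesPerPacket;
    uint32_t mBytesPerPacket;
} LPCMFormat;

LPCMFormat make_lpcm(double sample_rate, uint32_t bits, uint32_t channels) {
    LPCMFormat f;
    f.mSampleRate       = sample_rate;        /* must match the hardware rate */
    f.mBitsPerChannel   = bits;
    f.mChannelsPerFrame = channels;
    f.mBytesPerFrame    = (bits / 8) * channels;  /* interleaved */
    f.mFramesPerPacket  = 1;                  /* always 1 for LPCM */
    f.mBytesPerPacket   = f.mBytesPerFrame;
    return f;
}
```

If any of those relationships are broken, or `mSampleRate` is, say, 22050 while the graph runs at 44100, you get exactly the slowed, distorted .caf described above.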

How to record an audio file in .mp3 format?

╄→гoц情女王★ submitted on 2019-11-28 09:23:18
I am using the following settings for recording an audio file in .mp3 format using AVAudioRecorder:

NSDictionary *recordSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
    [NSNumber numberWithFloat: 44100.0], AVSampleRateKey,
    [NSNumber numberWithInt: kAudioFormatMPEGLayer3], AVFormatIDKey,
    [NSNumber numberWithInt: 1], AVNumberOfChannelsKey,
    [NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey,
    nil];

But I am not able to record with these settings. I searched a lot for this but wasn't able to find a relevant post. Some posts say that it is not possible. If it's not possible, then why so? Please

How to develop an iphone app with reverb functionality?

南楼画角 submitted on 2019-11-28 08:47:13
I am developing an iPhone application (audio processing). I have to apply some effects to the audio. For a desktop app there are many options; we can find good examples and full projects, like Audacity. But I want to develop for iPhone. I found an app with a reverb option (take a look at the following link). I only watched the video; I did not test the application on my iPhone device. http://www.appstorehq.com/reverb-iphone-89870/app My question is: how can I develop an app with reverb functionality? Is there any documentation for that? If there is, please share it with us. NOTE: We can use
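
As a starting point for a DIY reverb (a textbook building block, not the linked app's implementation), the classic Schroeder design chains several feedback comb filters with allpass filters. One comb stage, sketched in plain C with my own names, is just "input plus a decayed copy from `delay` samples ago":

```c
#include <stddef.h>

/* Feedback comb filter: out[n] = in[n] + feedback * out[n - delay].
 * delay_line must hold `delay` zero-initialized samples; the write
 * position wraps around, so the line always holds the last `delay`
 * output samples. */
void comb_filter(const float *in, float *out, size_t n,
                 float *delay_line, size_t delay, float feedback) {
    size_t pos = 0;
    for (size_t i = 0; i < n; i++) {
        float delayed = delay_line[pos];
        out[i] = in[i] + feedback * delayed;
        delay_line[pos] = out[i];
        pos = (pos + 1) % delay;
    }
}
```

On iOS you would run this (with delays on the order of 30-50 ms and feedback below 1.0) over the PCM you get from a RemoteIO or AudioQueue callback.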