audio-recording

Mixing down two files together using Extended Audio File Services

╄→尐↘猪︶ㄣ submitted on 2019-12-05 09:20:19
Question: I am doing some custom audio post-processing using Audio Units. I have two files that I am merging together (links below), but I am getting some strange noise in the output. What am I doing wrong? I have verified that before this step the two files (workTrack1 and workTrack2) are in a proper state and sound good, and no errors are hit in the process. Buffer processing code: - (BOOL)mixBuffersWithBuffer1:(const int16_t *)buffer1 buffer2:(const int16_t *)buffer2 outBuffer:(int16_t *
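A common cause of noise when summing two 16-bit PCM buffers is integer overflow: the sum of two int16 samples can exceed the int16 range and wrap around, which sounds like harsh crackling. The question's Objective-C code is truncated, so here is a minimal sketch of saturating (clipped) mixing in plain Java; the class and method names are illustrative, not from the original code:

```java
final class PcmMixer {

    /** Mixes two 16-bit PCM buffers sample-by-sample, clamping to the int16 range. */
    public static short[] mix(short[] a, short[] b) {
        int n = Math.min(a.length, b.length);
        short[] out = new short[n];
        for (int i = 0; i < n; i++) {
            int sum = a[i] + b[i];                            // widen to int so the sum cannot wrap
            if (sum > Short.MAX_VALUE) sum = Short.MAX_VALUE; // clip instead of overflowing
            if (sum < Short.MIN_VALUE) sum = Short.MIN_VALUE;
            out[i] = (short) sum;
        }
        return out;
    }
}
```

Clipping is the simplest guard; attenuating each input by 0.5 before summing avoids distortion at the cost of loudness.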

Audio Recording in ReactJS

北城余情 submitted on 2019-12-05 07:20:21
I am trying to record audio with ReactJS and want to store it on my Node server. I tried the "react-audio-recorder" module, but it fails when recording audio clips continuously one after another. I also tried p5.js for audio recording, but ran into issues configuring it. Please suggest the best way to record audio in React (JavaScript) and to save it on my Node server. The react-mic project can handle the recording. I don't know how to do the server side; I'm still working on it myself. Source: https://stackoverflow.com

IOS Swift read PCM Buffer

喜你入骨 submitted on 2019-12-05 06:00:16
Question: I have an Android project that reads a short[] array of PCM data from the microphone buffer for live analysis, and I need to port this functionality to iOS Swift. In Android it is very simple and looks like this: import android.media.AudioFormat; import android.media.AudioRecord; ... AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.DEFAULT, someSampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, AudioRecord.getMinBufferSize(...)); recorder.startRecording();

record live streaming audio

断了今生、忘了曾经 submitted on 2019-12-05 01:41:54
Question: I'm building an app that has to play and record streaming audio from the internet on iPad. The streaming playback is done; I will get to the recording part very soon and have no idea how to proceed. Could you give me a hint or an idea? It will have to play while simultaneously recording into AAC or MP3. Thanks. Answer 1: You'll need to use the lower-level AudioQueue API, and use the AudioSession API to set up the audio session. Then you'll need to fill out an

Android Media Recording using threads

坚强是说给别人听的谎言 submitted on 2019-12-05 00:54:20
Question: I am developing an Android application that simply starts and stops recording using buttons. I used threads and created three classes: one to start recording, one to stop recording, and the main class. The problem is that I can see the file on my phone, but it is empty, and the phone gives me the message "Unable to play video". I want it to work with the threads; I don't want other methods. This is my code. The main class: public class MediaRecorderSampleActivity extends Activity { Button start;
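An empty output file in a threaded recorder usually means the recording thread was never cleanly stopped and joined before the file was read, so the recorder never finalized its output. Android's MediaRecorder can't run outside a device, but the thread-coordination pattern can be sketched in plain Java with a stand-in byte sink (the class and method names below are illustrative, not from the question's code):

```java
import java.io.ByteArrayOutputStream;

final class ThreadedRecorder {
    private final ByteArrayOutputStream sink = new ByteArrayOutputStream(); // stand-in for the output file
    private volatile boolean recording;
    private Thread worker;

    public void start() {
        recording = true;
        worker = new Thread(() -> {
            while (recording) {
                sink.write(0);                    // stand-in for one chunk of captured audio
                try { Thread.sleep(1); } catch (InterruptedException e) { return; }
            }
        });
        worker.start();
    }

    /** The output holds data only after the worker has fully stopped -- skipping the join is a common cause of an "empty file". */
    public byte[] stopAndGetData() {
        recording = false;                        // signal the worker to finish its loop
        try { worker.join(); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        return sink.toByteArray();
    }
}
```

With MediaRecorder the equivalent of the join-then-read step is calling stop() and release() on the recorder before touching the file.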

Recording Mono on iPhone in IMA4 format

你说的曾经没有我的故事 submitted on 2019-12-04 22:52:23
I'm using the SpeakHear sample app from Apple's developer site to create an audio recording app. I'm attempting to record directly to IMA4 format using the kAudioFormatAppleIMA4 system constant. This is listed as one of the usable formats, but every time I set up my audio format variable and pass it in, I get a 'fmt?' error. Here is the code I use to set up the audio format variable: #define kAudioRecordingFormat kAudioFormatAppleIMA4 #define kAudioRecordingType kAudioFileCAFType #define kAudioRecordingSampleRate 16000.00 #define kAudioRecordingChannelsPerFrame 1 #define

Record audio streaming with Ruby (on Rails)

て烟熏妆下的殇ゞ submitted on 2019-12-04 21:31:29
I need to record some radio programs and make them available for later listening. I have looked into the Shoutcast API for getting the audio stream resources, but I don't have a clue how to record an audio broadcast and save it to an audio file. I'm looking for any Ruby libraries, or even some information on how to get started. You can save the stream to a file, for example: require 'net/http' require 'uri' url = URI.parse('http://your.stream.domain.com/') Net::HTTP.start(url.host, url.port) do |http| f = open("saved_stream.mp3", "w") begin http.request_get('/stream_path.mp3') do |resp| resp

How can I capture audio input from 2 mics of my android phone real time and simultaneously

我的未来我决定 submitted on 2019-12-04 19:37:16
I have a requirement where I need audio from both mics on my Android phone simultaneously in order to do some signal processing using Eclipse. Do you think this is possible? Can you also suggest a method to start recording from both mics in real time, simultaneously? For two instances of the AudioRecord class, if I pass the audio sources MIC and CAMCORDER respectively, will I be able to capture two separate mic inputs simultaneously? I am not sure whether the mics will work in parallel, and I also do not know how to get them to start recording at the same time. Any input regarding this will be

Incorrect peak frequency in JTransform

浪尽此生 submitted on 2019-12-04 19:00:52
I've been trying to calculate the peak frequency from the Android mic buffer as per this: How to get frequency from fft result? Unfortunately I'm getting wrong values. Even when I played an 18 kHz tone, I didn't get the correct peak frequency. This is my code: int sampleRate=44100,bufferSize=4096; AudioRecord audioRec=new AudioRecord(AudioSource.MIC,sampleRate,AudioFormat.CHANNEL_CONFIGURATION_MONO,AudioFormat.ENCODING_PCM_16BIT,bufferSize); audioRec.startRecording(); audioRec.read(bufferByte, 0,bufferSize); for(int i=0;i<bufferByte.length;i++){ bufferDouble2[i]=(double)bufferByte[i]; } //here window techniq
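One likely bug in the snippet above is feeding raw bytes to the FFT one by one: with ENCODING_PCM_16BIT each sample is two little-endian bytes, so adjacent byte pairs must be combined into one 16-bit value before the transform, and the peak bin then maps to Hz as bin * sampleRate / N. A sketch of both steps in plain Java, using a naive DFT in place of JTransforms so it stays self-contained (the class and method names are illustrative):

```java
final class PeakFrequency {

    /** Combines little-endian byte pairs (Android's ENCODING_PCM_16BIT layout) into 16-bit samples. */
    public static double[] bytesToSamples(byte[] buf) {
        double[] samples = new double[buf.length / 2];
        for (int i = 0; i < samples.length; i++) {
            samples[i] = (short) ((buf[2 * i] & 0xFF) | (buf[2 * i + 1] << 8));
        }
        return samples;
    }

    /** Finds the strongest spectral bin via a naive DFT and returns its frequency in Hz. */
    public static double peakHz(double[] x, int sampleRate) {
        int n = x.length;
        int bestBin = 0;
        double bestMag = -1;
        for (int k = 1; k < n / 2; k++) {          // skip DC, stop at Nyquist
            double re = 0, im = 0;
            for (int t = 0; t < n; t++) {
                double ang = 2 * Math.PI * k * t / n;
                re += x[t] * Math.cos(ang);
                im -= x[t] * Math.sin(ang);
            }
            double mag = re * re + im * im;        // squared magnitude is enough for comparison
            if (mag > bestMag) { bestMag = mag; bestBin = k; }
        }
        return (double) bestBin * sampleRate / n;  // bin index -> frequency in Hz
    }
}
```

With bufferSize = 4096 bytes there are only 2048 samples, so the resolution is sampleRate / 2048 ≈ 21.5 Hz per bin; a real implementation would use JTransforms' FFT rather than this O(n²) loop.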