
JTransforms FFT in Android from PCM data

匆匆过客 submitted on 2019-11-30 10:11:27
I've been playing with this for some time now, and I can't work out what I'm meant to be doing here. I am reading PCM audio data into an audioData array: recorder.read(audioData, 0, bufferSize); // read the PCM audio data into the audioData array. I want to use Piotr Wendykier's JTransforms library in order to perform an FFT on my PCM data to obtain the frequency. import edu.emory.mathcs.jtransforms.fft.DoubleFFT_1D; At the moment I have this: DoubleFFT_1D fft = new DoubleFFT_1D(1024); // 1024 is the size of the array for (int i = 0; i < 1023; i++) { a[i] = audioData[i]; if (audioData[i] != 0) Log.v
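With JTransforms, the usual pattern is to copy the 16-bit samples into a double[] and call fft.realForward(a), then scan the resulting bins for the largest magnitude. As a library-free sketch of the same idea — a naive DFT standing in for DoubleFFT_1D, with a hypothetical 44.1 kHz sample rate and a synthetic test tone — finding the dominant frequency bin might look like this:

```java
// Naive DFT sketch of the "FFT the PCM block, pick the strongest bin" idea.
// In real code, JTransforms' DoubleFFT_1D.realForward would replace the inner
// loops; the sample rate and test tone here are assumptions for illustration.
public class DominantFrequency {
    static int dominantBin(short[] pcm) {
        int n = pcm.length;
        double bestMag = -1;
        int bestK = 0;
        for (int k = 1; k < n / 2; k++) {          // skip DC, stop at Nyquist
            double re = 0, im = 0;
            for (int t = 0; t < n; t++) {
                double angle = 2 * Math.PI * k * t / n;
                re += pcm[t] * Math.cos(angle);
                im -= pcm[t] * Math.sin(angle);
            }
            double mag = re * re + im * im;         // squared magnitude
            if (mag > bestMag) { bestMag = mag; bestK = k; }
        }
        return bestK;
    }

    public static void main(String[] args) {
        int n = 1024, sampleRate = 44100;
        short[] pcm = new short[n];
        double freq = 430.66;                       // ~10 bins at 44100/1024 Hz/bin
        for (int t = 0; t < n; t++)
            pcm[t] = (short) (10000 * Math.sin(2 * Math.PI * freq * t / sampleRate));
        int bin = dominantBin(pcm);
        System.out.println("dominant bin = " + bin
                + " (~" + bin * sampleRate / (double) n + " Hz)");
    }
}
```

Bin k corresponds to roughly k * sampleRate / n Hz, which is how the bin index maps back to a frequency.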

Why are an integer's bytes stored backwards? Does this apply to headers only?

与世无争的帅哥 submitted on 2019-11-30 08:50:53
I'm currently trying to decipher WAV files, from the headers to the PCM data. I've found a PDF (http://www.tdt.com/T2Support/technical_notes/tn0132.pdf) detailing the anatomy of a WAV file, and I've been able to extract and make sense of the appropriate header data using Ghex2. But my questions are: Why are the integer's bytes stored backwards? I.e. dec. 20 is stored as 0x14000000 instead of 0x00000014. Are the integers of the PCM data also stored backwards? FixerMark: WAV files are little-endian (least significant bytes first) because the format originated for operating systems running on Intel
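Little-endian means the least significant byte comes first in the file, so the header bytes 0x14 0x00 0x00 0x00 read as decimal 20 — and yes, the 16-bit PCM samples in the data chunk use the same byte order, not just the headers. A minimal Java sketch of decoding such a field (the byte values are taken from the question; the method name is hypothetical):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class LittleEndianDemo {
    // Read a 32-bit little-endian integer from 4 bytes,
    // the way WAV headers store their size fields.
    static int readLeInt(byte[] b, int off) {
        return ByteBuffer.wrap(b, off, 4).order(ByteOrder.LITTLE_ENDIAN).getInt();
    }

    public static void main(String[] args) {
        byte[] header = {0x14, 0x00, 0x00, 0x00}; // decimal 20, little-endian
        System.out.println(readLeInt(header, 0)); // prints 20
    }
}
```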

AudioTrack - short array to byte array distortion using jLayer (Java MP3 decoder)

一曲冷凌霜 submitted on 2019-11-30 03:57:01
Question: I'm using jLayer to decode MP3 data, with this call: SampleBuffer output = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream); This call returns the decoded data as a short[] array: output.getBuffer(); When I call AudioTrack write() with that buffer, it plays fine as I loop through the file: at.write(output.getBuffer(), 0, output.getBuffer().length); However, when I convert the short[] array to a byte[] array using any of the methods in this answer: https://stackoverflow
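A common culprit for this kind of distortion is byte order: AudioTrack's byte[] write() with ENCODING_PCM_16BIT expects little-endian sample bytes, so a conversion that emits the high byte first comes out as noise. A minimal sketch of a little-endian conversion (the method name is hypothetical):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ShortsToBytes {
    // Convert decoded 16-bit samples to bytes in the order AudioTrack's
    // byte[] write() expects for ENCODING_PCM_16BIT: low byte first.
    static byte[] toLittleEndianBytes(short[] samples) {
        ByteBuffer buf = ByteBuffer.allocate(samples.length * 2)
                                   .order(ByteOrder.LITTLE_ENDIAN);
        for (short s : samples) buf.putShort(s);
        return buf.array();
    }

    public static void main(String[] args) {
        byte[] out = toLittleEndianBytes(new short[]{0x1234});
        // Low byte (0x34) first, then high byte (0x12).
        System.out.printf("%02x %02x%n", out[0], out[1]); // prints 34 12
    }
}
```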

Playing PCM stream from Web Audio API on Node.js

≡放荡痞女 submitted on 2019-11-30 03:19:09
I'm streaming recorded PCM audio from a browser with the Web Audio API. I'm streaming it with binaryJS (a WebSocket connection) to a Node.js server, and I'm trying to play that stream on the server using the speaker npm module. This is my client. The audio buffers are at first non-interleaved IEEE 32-bit linear PCM with a nominal range between -1 and +1. I take one of the two PCM channels to start off and stream it below. var client = new BinaryClient('ws://localhost:9000'); var Stream = client.send(); recorder.onaudioprocess = function(AudioBuffer){ var leftChannel = AudioBuffer.inputBuffer
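The speaker module defaults to 16-bit signed little-endian PCM, so each float sample has to be clamped to [-1, 1] and scaled to the 16-bit range before it reaches the output stream. A sketch of that conversion, shown here in Java for illustration (the client or server would do the equivalent in JavaScript; the method name is hypothetical):

```java
public class FloatToPcm16 {
    // Clamp and scale IEEE float samples in [-1, 1] to signed 16-bit PCM,
    // little-endian — the default format the `speaker` module expects.
    static byte[] floatToPcm16le(float[] samples) {
        byte[] out = new byte[samples.length * 2];
        for (int i = 0; i < samples.length; i++) {
            float x = Math.max(-1f, Math.min(1f, samples[i])); // clamp
            short s = (short) Math.round(x * 32767);           // scale
            out[2 * i] = (byte) (s & 0xff);                    // low byte first
            out[2 * i + 1] = (byte) ((s >> 8) & 0xff);
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] b = floatToPcm16le(new float[]{1f, -1f});
        System.out.printf("%02x %02x %02x %02x%n",
                b[0] & 0xff, b[1] & 0xff, b[2] & 0xff, b[3] & 0xff); // ff 7f 01 80
    }
}
```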

PCM Raw byte[] to Audio on Android

岁酱吖の submitted on 2019-11-30 00:14:26
I currently have PCM audio in the form of a byte array. The format is signed 16-bit little-endian. I would like to convert this to some playable format on Android, preferably version 3.2 or higher. Does anyone have suggestions on how this can be done? I have done some research and tried the approaches below, but none were successful. It would be much appreciated if anyone could suggest a better way or indicate where I have gone wrong. I have tried creating an AudioFormat with the correct audio settings, however Android does not support the javax.sound.sampled library. I have also tried
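One route — a sketch, not the only option — is to prepend a minimal 44-byte RIFF/WAVE header to the raw bytes, after which MediaPlayer (or any standard player) can handle the result; AudioTrack can also play signed 16-bit little-endian bytes directly with no header at all. The class and method names below are hypothetical:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class WavWrapper {
    // Prepend a minimal 44-byte RIFF/WAVE header to raw signed 16-bit
    // little-endian PCM so standard players will accept it.
    static byte[] wrapAsWav(byte[] pcm, int sampleRate, int channels) {
        int byteRate = sampleRate * channels * 2;
        ByteBuffer b = ByteBuffer.allocate(44 + pcm.length)
                                 .order(ByteOrder.LITTLE_ENDIAN);
        b.put("RIFF".getBytes()).putInt(36 + pcm.length).put("WAVE".getBytes());
        b.put("fmt ".getBytes()).putInt(16)
         .putShort((short) 1)               // audio format 1 = PCM
         .putShort((short) channels)
         .putInt(sampleRate)
         .putInt(byteRate)
         .putShort((short) (channels * 2))  // block align
         .putShort((short) 16);             // bits per sample
        b.put("data".getBytes()).putInt(pcm.length).put(pcm);
        return b.array();
    }

    public static void main(String[] args) {
        byte[] wav = wrapAsWav(new byte[4], 44100, 1);
        System.out.println(wav.length); // 44-byte header + 4 bytes of PCM = 48
    }
}
```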

How to correctly read decoded PCM samples on iOS using AVAssetReader — currently incorrect decoding

我怕爱的太早我们不能终老 submitted on 2019-11-29 23:24:03
I am currently working on an application as part of my Bachelor in Computer Science. The application will correlate data from the iPhone hardware (accelerometer, GPS) and music that is being played. The project is still in its infancy, having worked on it for only 2 months. Where I am right now, and where I need help, is reading PCM samples from songs from the iTunes library and playing them back using an audio unit. Currently, the implementation I would like working does the following: it chooses a random song from iTunes, reads samples from it when required, and stores them in a

Can ffmpeg convert audio to raw PCM? If so, how?

本秂侑毒 submitted on 2019-11-29 19:39:55
I'm currently using ffmpeg to convert FLV/Speex to WAV/pcm_s16le, successfully. However, I now need the output format to be RAW, that is, PCM signed 16-bit little-endian, without the WAV header. I tried the following: ffmpeg -y -i input.flv -vn -acodec pcm_s16le output.raw But ffmpeg responds with: Unable to find a suitable output format for 'output.raw' I also tried using output.pcm and output as output file names, with the same result. I also tried the -f flag to specify raw format, but that gives: Unknown input or output format: raw Is this possible with FFmpeg? If so, how? Chris Haas: Give
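In ffmpeg, the raw PCM muxer is selected by naming the sample format rather than "raw"; for signed 16-bit little-endian that is -f s16le. A sketch of the command, reusing the file names from the question (assumes a reasonably recent ffmpeg build):

```shell
# Select the s16le muxer explicitly: raw PCM samples, no WAV header.
ffmpeg -y -i input.flv -vn -f s16le -acodec pcm_s16le output.raw
```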

Audio File FFT in an OS X environment

久未见 submitted on 2019-11-29 10:55:24
Question: I'm looking to perform an FFT on a linear PCM audio file (with potentially more than one audio channel) on OS X. What is the best way to go about this? Several sources have indicated that Apple's Accelerate framework is what I need. If so, how should I extract and properly prepare the floating-point data for use in those FFT functions? Answer 1: Here's roughly what you want to do. Fill in your own input and output functions. // Stick new data into inData, a (float*) array fetchFreshData(inData); /

How to play PCM raw data in Java [closed]

心不动则不痛 submitted on 2019-11-29 04:09:10
I have PCM samples in a short array. What is the best way to play this out? The format is 8000 Hz, mono, 16-bit, big-endian. (The PCM samples are generated in the code and not read from a file.) Thanks. With the javax.sound.sampled package it's pretty straightforward, but you have to use some boilerplate. Here's a good tutorial on that: www.wikijava.org/wiki/Play_a_wave_sound_in_Java Basically you have to create an InputStream from your array and use that to create an AudioInputStream. There you have to specify the format of your audio data. Then you open an output stream (
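That boilerplate might be sketched as follows, assuming the question's format (8000 Hz, mono, 16-bit, signed, big-endian); the toStream helper name is hypothetical, and actually hearing the audio additionally needs a SourceDataLine:

```java
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import java.io.ByteArrayInputStream;

public class PlayPcm {
    // Wrap big-endian 16-bit mono 8000 Hz samples (the question's format)
    // in an AudioInputStream that a SourceDataLine could then play.
    static AudioInputStream toStream(short[] samples) {
        byte[] bytes = new byte[samples.length * 2];
        for (int i = 0; i < samples.length; i++) {
            bytes[2 * i] = (byte) (samples[i] >> 8);   // high byte first
            bytes[2 * i + 1] = (byte) samples[i];      // then low byte
        }
        AudioFormat fmt = new AudioFormat(8000f, 16, 1, true, true); // signed, big-endian
        return new AudioInputStream(new ByteArrayInputStream(bytes), fmt, samples.length);
    }

    public static void main(String[] args) {
        AudioInputStream in = toStream(new short[8000]); // one second of silence
        System.out.println(in.getFrameLength()); // prints 8000
    }
}
```

To actually play it, obtain a line for the same AudioFormat with AudioSystem.getSourceDataLine(fmt), open and start it, then write the stream's bytes to the line.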