lame

How can I compile LAME as a static library (.a) for armv6 and armv7 on iPhone?

时光怂恿深爱的人放手 submitted on 2019-12-17 15:32:36
Question: LAME (http://lame.sourceforge.net/) is a library written in the C language. It can convert PCM sound files to MP3 files. I use it to convert sound files to MP3 files on iPhone. The source PCM sound files are recorded by the microphone. In order to include LAME in my Xcode project, I need to compile LAME into 3 static libraries (.a): for i386 (iOS Simulator), armv6, and armv7. After a lot of searching, I have compiled a static library for the i386 version (iOS Simulator) successfully. Here are the commands:

C or C++ code for using the LAME API to convert an M4A (MPEG-4 audio) file to MP3

↘锁芯ラ submitted on 2019-12-13 08:12:10
Question: I'm using native LAME code for part of my Android application. This code is supposed to take a string pointing to the input file (M4A) and a string pointing to the output file (MP3). I found some code that seems, from what I can gather, to do such a thing. However, when I play back the MP3 files, all I hear is a "ZIP!" sound. No matter how long the recording is, I get the same sound. I was thinking that this had something to do with the sample rate, so I've tried all of the standard ones, and get
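
A likely explanation, not stated in the excerpt: LAME encodes raw PCM only and does not parse M4A/AAC containers, so feeding the bytes of an .m4a file straight into the encoder yields noise like the short "ZIP!" described here. The M4A must first be decoded to PCM by a separate decoder, and only that PCM handed to LAME. Below is a minimal sketch of the PCM-to-MP3 half, assuming 16-bit interleaved stereo PCM at 44.1 kHz is already available on disk; the header path, file names, sample rate, and bitrate are illustrative assumptions.

/* Minimal sketch: encode already-decoded 16-bit interleaved stereo PCM to MP3
 * with the LAME API. LAME does not decode M4A/AAC itself; that step needs a
 * separate decoder before this function is called. */
#include <stdio.h>
#include <lame/lame.h>   /* header path depends on how LAME is installed */

int encode_pcm_to_mp3(const char *pcm_path, const char *mp3_path)
{
    FILE *pcm = fopen(pcm_path, "rb");
    FILE *mp3 = fopen(mp3_path, "wb");
    if (!pcm || !mp3) return -1;

    lame_t gf = lame_init();
    lame_set_in_samplerate(gf, 44100);   /* must match the decoded PCM */
    lame_set_num_channels(gf, 2);
    lame_set_brate(gf, 128);             /* 128 kbit/s */
    lame_init_params(gf);

    short pcm_buf[2 * 1152];             /* interleaved L/R samples */
    unsigned char mp3_buf[16384];
    size_t frames;
    while ((frames = fread(pcm_buf, 2 * sizeof(short), 1152, pcm)) > 0) {
        int n = lame_encode_buffer_interleaved(gf, pcm_buf, (int)frames,
                                               mp3_buf, sizeof(mp3_buf));
        if (n < 0) break;
        fwrite(mp3_buf, 1, (size_t)n, mp3);
    }
    int n = lame_encode_flush(gf, mp3_buf, sizeof(mp3_buf));
    fwrite(mp3_buf, 1, (size_t)n, mp3);

    lame_close(gf);
    fclose(mp3);
    fclose(pcm);
    return 0;
}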

Undefined LAME decode functions

痴心易碎 submitted on 2019-12-13 01:44:58
Question: I'm trying to add MP3 read and write capabilities to my Android app. I'm using the lame4android app as a starting point. Encoding a file works for me, but I'm having a problem with the decode functions -- I'm getting undefined references to them. Here are excerpts from my wrapper.c:

#include "libmp3lame/lame.h"
#include "jni.h"

lame_t lame;

jint Java_com_intonia_dandy_WavStream_initEncoder(JNIEnv *env, jobject jobj,
                                                  jint in_num_channels,
                                                  jint in_samplerate)
{
    lame = lame_init(
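
A note on the likely cause, not from the original post: LAME's decode functions belong to the "hip" API and are implemented by LAME's mpglib-based decoder, so if the NDK build only compiles the libmp3lame encoder sources, the encoder links fine while the decode symbols stay undefined. The sketch below shows what the decode side typically looks like once the decoder sources are part of the native build; buffer handling and function wrappers are illustrative.

/* Sketch of the LAME decode ("hip") API. These symbols come from LAME's
 * mpglib-based decoder, so those sources must be compiled into the native
 * library as well, or the references remain undefined. */
#include "libmp3lame/lame.h"

static hip_t hip;
static mp3data_struct mp3data;

int init_decoder(void)
{
    hip = hip_decode_init();
    return hip != NULL ? 0 : -1;
}

/* Feed a chunk of the MP3 stream, receive decoded left/right 16-bit PCM.
 * Returns samples per channel, 0 if more input is needed, -1 on error. */
int decode_chunk(unsigned char *mp3_buf, size_t len,
                 short pcm_l[], short pcm_r[])
{
    return hip_decode1_headers(hip, mp3_buf, len, pcm_l, pcm_r, &mp3data);
}

void close_decoder(void)
{
    hip_decode_exit(hip);
}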

Converting a 32-bit waveform to a 16-bit waveform

為{幸葍}努か submitted on 2019-12-12 09:09:09
Question: I've been capturing audio using loopback capture mode. The captured waveform is a 32-bit waveform. I'm struggling with converting this to a 16-bit waveform so that encoders like LAME can deal with it (it says "Unsupported data format: 0x0003"). I've tried shifting the bits (not my strong point) in the wave stream itself from 32-bit to 16-bit, but the result still sounds distorted. The Wave32To16Stream class seems to blow up on this case: if (sourceStream.WaveFormat.Encoding != WaveFormatEncoding
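
Format tag 0x0003 is WAVE_FORMAT_IEEE_FLOAT, i.e. loopback capture delivers 32-bit floating-point samples, so bit-shifting (which assumes 32-bit integer PCM) distorts the audio. The usual conversion is to scale each float from the nominal -1.0..1.0 range into the 16-bit integer range with clipping. The original question is about NAudio/C#; the C sketch below shows only the underlying sample conversion, not that library's API.

/* Sketch: convert 32-bit IEEE-float samples (WAVE format tag 0x0003) to
 * 16-bit signed PCM by scaling and clipping. */
#include <stdint.h>
#include <stddef.h>

void float_to_int16(const float *in, int16_t *out, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        float s = in[i];
        if (s > 1.0f)  s = 1.0f;      /* clip to nominal full scale */
        if (s < -1.0f) s = -1.0f;
        out[i] = (int16_t)(s * 32767.0f);
    }
}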

Flex - Java bytes to MP3

柔情痞子 submitted on 2019-12-11 13:35:55
Question: I have a problem converting bytes to an .mp3 sound file. In my case I do it with FileOutputStream, using its write(bytes) method, but that just creates a data file with an .mp3 extension which I cannot play in any player on my PC. Note: I'm recording from the Flex Microphone and sending a ByteArray to Java. Which libraries should I use to add MP3 sound file headers etc. in Java? UPDATE: I couldn't even convert my raw data to WAVE format, which is supported by the Java Sound API. It creates for me sound with
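
Writing raw PCM bytes into a file named .mp3 does not produce an MP3; the data has to go through an encoder such as LAME. For the WAV part of the update, raw PCM only becomes playable once a 44-byte RIFF/WAVE header describing the format is prepended. The sketch below shows that header in C (the same layout applies regardless of language); sample rate, channel count, and bit depth are assumptions and must match what the Flex side actually sends.

/* Sketch: write a canonical 44-byte RIFF/WAVE header in front of raw PCM.
 * A little-endian host is assumed for the fwrite of integer fields. */
#include <stdio.h>
#include <stdint.h>

static void write_u32(FILE *f, uint32_t v) { fwrite(&v, 4, 1, f); }
static void write_u16(FILE *f, uint16_t v) { fwrite(&v, 2, 1, f); }

void write_wav_header(FILE *f, uint32_t data_bytes,
                      uint32_t sample_rate, uint16_t channels, uint16_t bits)
{
    uint16_t block_align = channels * bits / 8;
    fwrite("RIFF", 1, 4, f);  write_u32(f, 36 + data_bytes);
    fwrite("WAVE", 1, 4, f);
    fwrite("fmt ", 1, 4, f);  write_u32(f, 16);
    write_u16(f, 1);                          /* PCM */
    write_u16(f, channels);
    write_u32(f, sample_rate);
    write_u32(f, sample_rate * block_align);  /* byte rate */
    write_u16(f, block_align);
    write_u16(f, bits);
    fwrite("data", 1, 4, f);  write_u32(f, data_bytes);
    /* ...followed by data_bytes of raw PCM */
}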

Getting an MP3 audio signal as an array in Java

冷暖自知 submitted on 2019-12-11 12:12:21
Question: I have been trying to get the audio stream of an MP3 as an array of floating-point values. I got the array with the sample code below. I am not sure whether I can use this array for applying an FFT, because this array does not match (or even resemble) the one I got from C++ code which uses LAME.

import java.io.File;
import java.io.IOException;
import java.io.PrintStream;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import
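
One plausible reason the two arrays differ is scaling and layout rather than the decoded audio itself: AudioInputStream hands back raw frame bytes (commonly 16-bit little-endian, channels interleaved), while LAME's decoder returns 16-bit shorts per channel, and both must be normalized the same way before comparison or an FFT. A small sketch of that normalization, assuming 16-bit little-endian interleaved PCM:

/* Sketch: convert 16-bit little-endian interleaved PCM bytes into
 * normalized floats in -1..1 for one selected channel. Returns the number
 * of samples written to out. */
#include <stdint.h>
#include <stddef.h>

size_t pcm16le_to_float(const uint8_t *bytes, size_t n_bytes,
                        int channels, int channel, float *out)
{
    size_t frames = n_bytes / (2u * (size_t)channels);
    for (size_t i = 0; i < frames; i++) {
        const uint8_t *p = bytes + (i * (size_t)channels + (size_t)channel) * 2;
        int16_t s = (int16_t)(p[0] | (p[1] << 8));   /* little-endian */
        out[i] = s / 32768.0f;
    }
    return frames;
}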

LAME-encoded MP3 audio slowed down - Android

社会主义新天地 submitted on 2019-12-11 08:34:27
Question: I have been following this tutorial on using LAME MP3 on Android with JNI. Recording seems to be working and I am getting MP3 output, but on playback the audio is slowed down and pitched down. I've tried to put all pertinent code below. Any guidance on why this is happening? Thanks in advance for your help. Edit: OK, just to check, I imported the raw data into Audacity, and that plays back fine, so this must be an issue at the encoding stage. Java class: public class Record
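
Audio that comes out slowed down and pitched down after encoding is classically a sample-rate or channel-count mismatch: the recorder captures at one rate or channel layout while LAME is initialized for another, so each second of input gets stretched over more than a second of output. A hedged sketch of the native initialization that keeps the two sides consistent; the function name and values are illustrative, not taken from the tutorial.

/* Sketch: initialize LAME so the input rate and channel count exactly match
 * what the Java AudioRecord side captures. Recording mono 44100 Hz but
 * encoding as stereo or as 22050 Hz is a classic cause of slowed-down,
 * pitched-down output. */
#include "libmp3lame/lame.h"

static lame_t gf;

int init_lame(int sample_rate, int channels)
{
    gf = lame_init();
    if (!gf) return -1;

    lame_set_in_samplerate(gf, sample_rate);  /* must equal AudioRecord's rate */
    lame_set_num_channels(gf, channels);      /* 1 for mono AudioRecord data   */
    lame_set_out_samplerate(gf, sample_rate); /* keep the output rate the same */
    lame_set_brate(gf, 128);
    return lame_init_params(gf);
}

The encode call has to match the same layout as well (mono vs. interleaved stereo buffers), otherwise the duration is again doubled or halved.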

How to link the “lame” MP3 encoder shared object to an Android Studio project

孤者浪人 submitted on 2019-12-11 07:14:14
Question: I am trying to write an Android app that uses the LAME MP3 encoder. My development environment is Android Studio 1.1. Following the hints under "Lame MP3 Encoder compile for Android", I managed to install the Android NDK and compile LAME. Under /app/src/main/libs/armeabi I obtained the shared object "libmp3lame.so", with: libmp3lame.so: ELF 32-bit LSB shared object, ARM, version 1 (SYSV), dynamically linked (uses shared libs), stripped. However, trying to load this file in a simple Android Studio

Recording with arecord stops after 1h 33m under Fedora 23

ぐ巨炮叔叔 submitted on 2019-12-11 04:42:53
Question: I'm using this command to record audio on Linux Fedora 23:

/usr/bin/arecord -d 11400 -D hw:1,0 -f S32_LE -c2 -r48000 -t wav | lame -b 192 - longrec.mp3 >> output.txt 2>&1 & echo $!

Basically I want an MP3 recording of 3 hours and 10 minutes (11400 seconds) from the input sound card. Everything works fine when started, but it always stops after 1h33m12s. The file output.txt shows nothing of interest:

LAME 3.99.5 64bits (http://lame.sf.net)
Using polyphase lowpass filter, transition band: 18774 Hz -
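
A plausible explanation, not stated in the excerpt: at S32_LE, 2 channels, 48,000 Hz the stream is 48,000 x 2 x 4 = 384,000 bytes per second, and a 2 GiB (2^31 byte) WAV size limit is reached after roughly 5,592 s, i.e. 1 h 33 m 12 s, which matches exactly where the recording stops. That points at arecord's WAV output hitting a file-size limit rather than anything in LAME; the usual workarounds are to record raw PCM instead of WAV (telling lame the raw format) or to split the capture into smaller files, with the exact options checked against the arecord and lame man pages. The arithmetic:

/* Back-of-the-envelope check (an assumption, not from the original post):
 * time at which a 2^31-byte WAV stream of S32_LE stereo 48 kHz audio ends. */
#include <stdio.h>

int main(void)
{
    const double bytes_per_sec = 48000.0 * 2 /* channels */ * 4 /* bytes S32_LE */;
    const double limit = 2147483648.0;          /* 2^31 bytes */
    int secs = (int)(limit / bytes_per_sec);    /* ~5592 s */
    printf("%d s = %dh %dm %ds\n",
           secs, secs / 3600, (secs % 3600) / 60, secs % 60);
    return 0;                                   /* prints 5592 s = 1h 33m 12s */
}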

Configure the LAME MP3 encoder in a DirectShow application using IAudioEncoderProperties

廉价感情. submitted on 2019-12-10 15:29:00
Question: I'm writing a .NET DirectShow application which captures an audio stream from any capture device, encodes it to MP3 using the LAME DirectShow filter, and finally writes the stream to a file. This is my DirectShow graph: capture source -> LAME AUDIO ENCODER (audio compressor) -> WAV DEST (wave muxer, compiled from SDK sources) -> File writer. The problem is that I'd like to configure the encoder (bitrate, channels, VBR/CBR, etc.) programmatically and not through the property pages