aac

ExtAudioFileWrite to M4A/AAC failing on dual-core devices (iPad 2, iPhone 4S)

旧巷老猫 submitted on 2019-11-30 09:49:36
I wrote a loop to encode PCM audio data generated by my app to AAC using Extended Audio File Services. The encoding takes place synchronously in a background thread, not in real time. The encoding works flawlessly on the iPad 1 and iPhone 3GS/4 for both iOS 4 and 5. However, on dual-core devices (iPhone 4S, iPad 2) the third call to ExtAudioFileWrite crashes the encoding thread with no stack trace and no error code. Here is the code in question. The data formats: AudioStreamBasicDescription AUCanonicalASBD(Float64 sampleRate, UInt32 channel){ AudioStreamBasicDescription audioFormat;

What is the difference between M4A and AAC Audio Files?

泄露秘密 submitted on 2019-11-30 06:27:32
Is there a difference between M4A audio files and AAC audio files, or are they exactly the same thing with a different file extension? .M4A files typically contain audio only and are formatted as MPEG-4 Part 14 files (.MP4 container). .AAC is not a container format; instead it is a raw MPEG-4 Part 3 bitstream with the audio stream encoded. Note that an M4A file does not have to contain AAC audio; there are other valid options as well. There are raw video and audio streams; these streams cannot be played directly by most video/audio players, they need to be "encapsulated" in a transport, a
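As a concrete illustration of the container-versus-codec split described above, here is a small sketch using Android's MediaExtractor (my choice of API for the example; the question itself is platform-neutral). Whether the AAC stream sits inside an .m4a/.mp4 container or comes from a raw ADTS .aac file, the codec of the audio track is reported the same way:

    import android.media.MediaExtractor;
    import android.media.MediaFormat;
    import java.io.IOException;

    public class ContainerProbe {
        // Prints the codec MIME type of each track in a media file.
        // For an .m4a file (MPEG-4 Part 14 container) and for a raw ADTS .aac
        // stream alike, the audio track is typically reported as "audio/mp4a-latm".
        public static void printTracks(String path) throws IOException {
            MediaExtractor extractor = new MediaExtractor();
            extractor.setDataSource(path);
            for (int i = 0; i < extractor.getTrackCount(); i++) {
                MediaFormat format = extractor.getTrackFormat(i);
                System.out.println("track " + i + ": " + format.getString(MediaFormat.KEY_MIME));
            }
            extractor.release();
        }
    }

In other words, the extension and container change, but the underlying MPEG-4 Part 3 audio bitstream is the same kind of data.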

How do I programmatically convert MP3 to an iTunes-playable AAC/M4A file?

偶尔善良 submitted on 2019-11-29 21:04:50
Question: I've been looking for a way to convert an MP3 to AAC programmatically or via the command line, with no luck. Ideally, I'd have a snippet of code that I could call from my Rails app that converts an MP3 to an AAC. I installed ffmpeg and libfaac and was able to create an AAC file with the following command: ffmpeg -i test.mp3 -acodec libfaac -ab 163840 dest.aac When I change the output file's name to dest.m4a, it doesn't play in iTunes. Thanks! Answer 1: FFmpeg provides AAC encoding facilities if you
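For what it's worth, ffmpeg chooses the output container from the destination extension (or an explicit -f flag), so asking it to write dest.m4a directly, rather than renaming dest.aac afterwards, is what yields an MPEG-4 container that iTunes can open. The sketch below simply shells out to ffmpeg; the original question is about Rails, so treat this Java wrapper, the paths, and the bit rate purely as illustrative assumptions:

    import java.io.IOException;

    public class Mp3ToM4a {
        // Invokes ffmpeg so the AAC stream is written into an MPEG-4 (.m4a)
        // container directly, instead of producing a raw .aac file and renaming it.
        public static void convert(String input, String output) throws IOException, InterruptedException {
            ProcessBuilder pb = new ProcessBuilder(
                    "ffmpeg", "-i", input,
                    "-acodec", "libfaac",   // or "aac" on builds without libfaac
                    "-ab", "160k",          // illustrative bit rate
                    output);                // e.g. "dest.m4a": container chosen by extension
            pb.inheritIO();
            int exit = pb.start().waitFor();
            if (exit != 0) {
                throw new IOException("ffmpeg exited with code " + exit);
            }
        }
    }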

Streaming AAC audio with Android

廉价感情. submitted on 2019-11-29 20:43:34
As I understand it, Android will only play AAC format audio if it's encoded as MPEG-4 or 3GPP. I'm able to play AAC audio encoded as M4A when it's local to the app, but it fails when obtaining it from a server. The following works, as the m4a file is held locally in the res/raw directory. MediaPlayer mp = MediaPlayer.create(this, R.raw.*file*); mp.start(); The following doesn't work (but does with MP3s). Uri uri = Uri.parse("http://*example.com*/blah.m4a"); MediaPlayer mp = MediaPlayer.create(this, uri); mp.start(); Can anyone shed any light on why it fails when the m4a audio file is not
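As an aside, here is a minimal sketch of streaming a remote URL without the MediaPlayer.create() shortcut, using setDataSource() plus prepareAsync() (the URL is a placeholder). This only separates the network prepare step from construction; whether the server's Content-Type and the MP4's layout allow progressive playback is a separate question:

    import android.media.AudioManager;
    import android.media.MediaPlayer;
    import java.io.IOException;

    public class RemoteAacPlayer {
        // Streams a remote .m4a/.aac URL. prepareAsync() avoids blocking the
        // calling thread while the server is contacted; playback starts once prepared.
        public static MediaPlayer play(String url) throws IOException {
            MediaPlayer mp = new MediaPlayer();
            mp.setAudioStreamType(AudioManager.STREAM_MUSIC);
            mp.setDataSource(url);   // e.g. "http://example.com/blah.m4a" (placeholder)
            mp.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                @Override
                public void onPrepared(MediaPlayer player) {
                    player.start();
                }
            });
            mp.prepareAsync();
            return mp;
        }
    }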

How to record an audio file with better quality in Android?

核能气质少年 submitted on 2019-11-29 17:34:43
Question: I am creating an application that plays files recorded on Android on an iPhone and vice versa. Right now I am using: audioRecorder = new MediaRecorder(); audioRecorder.setAudioSource(MediaRecorder.AudioSource.MIC); audioRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4); audioRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC); A file recorded with this code is about 85 KB per 15 seconds and has very poor quality. If I use: audioRecorder = new MediaRecorder(); audioRecorder.setAudioSource
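A sketch of the MediaRecorder knobs that most directly affect recorded AAC quality, the sampling rate and the encoding bit rate (both available since API 8); the specific values below are illustrative, not taken from the original question:

    import android.media.MediaRecorder;
    import java.io.IOException;

    public class HighQualityRecorder {
        // AAC in an MPEG-4 container with an explicit sample rate and bit rate.
        // Without setAudioSamplingRate/setAudioEncodingBitRate, many devices
        // fall back to low defaults, which is what tends to sound poor.
        public static MediaRecorder start(String outputPath) throws IOException {
            MediaRecorder recorder = new MediaRecorder();
            recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
            recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
            recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
            recorder.setAudioSamplingRate(44100);      // illustrative values
            recorder.setAudioEncodingBitRate(128000);
            recorder.setAudioChannels(1);
            recorder.setOutputFile(outputPath);
            recorder.prepare();
            recorder.start();
            return recorder;
        }
    }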

Encode audio to AAC with libavcodec

江枫思渺然 submitted on 2019-11-28 20:45:12
I'm using libavcodec (latest git as of 3/3/10) to encode raw PCM to AAC (libfaac support enabled). I do this by calling avcodec_encode_audio repeatedly with codec_context->frame_size samples each time. The first four calls return successfully, but the fifth call never returns. When I use gdb to break, the stack is corrupt. If I use Audacity to export the PCM data to a .wav file, I can then use command-line ffmpeg to convert to AAC without any issues, so I'm sure it's something I'm doing wrong. I've written a small test program that duplicates my problem. It reads the test data from a file,

Developing a client for the Icecast server

断了今生、忘了曾经 submitted on 2019-11-28 19:51:47
I am developing a client for the Icecast server (www.icecast.org). Can anybody tell me what format they are using for streaming the content? I looked on their pages, but there is no information about the stream format at all. I then checked a Wireshark trace, and from my understanding the audio data I am receiving within the 200 OK response to the GET request is just plain binary audio data without any metadata included, so compared to SHOUTcast or HTTP Live Streaming (HLS) it is a relatively simple approach. Is that right? Any experience with it?
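For reference, by default an Icecast mount point relays the source encoder's bitstream (MP3, AAC/ADTS, Ogg, and so on) as the plain HTTP response body, with the codec advertised in the Content-Type header; ICY-style in-band metadata only appears if the client asks for it. A small probe sketch with HttpURLConnection (the mount URL is a placeholder):

    import java.io.IOException;
    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class IcecastProbe {
        // Connects to an Icecast mount point and prints the headers that describe
        // the stream. "Content-Type" carries the codec (e.g. audio/mpeg, audio/aac);
        // "icy-metaint", if present, is the interval in bytes at which ICY metadata
        // blocks are interleaved with the raw audio.
        public static void probe(String mountUrl) throws IOException {
            HttpURLConnection conn = (HttpURLConnection) new URL(mountUrl).openConnection();
            conn.setRequestProperty("Icy-MetaData", "1"); // opt in to in-band metadata
            conn.connect();
            System.out.println("Content-Type: " + conn.getHeaderField("Content-Type"));
            System.out.println("icy-metaint:  " + conn.getHeaderField("icy-metaint"));
            try (InputStream audio = conn.getInputStream()) {
                byte[] buf = new byte[4096];
                int n = audio.read(buf);   // raw (or metadata-interleaved) audio bytes
                System.out.println("read " + n + " bytes of stream data");
            }
            conn.disconnect();
        }
    }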

Encoding AAC Audio using AudioRecord and MediaCodec on Android

China☆狼群 submitted on 2019-11-28 17:19:22
I am trying to encode AAC audio using Android's AudioRecord and MediaCodec. I have created an encoder class very similar to (Encoding H.264 from camera with Android MediaCodec). With this class, I create an instance of AudioRecord and tell it to read off its byte[] data to the AudioEncoder (audioEncoder.offerEncoder(Data)). while(isRecording) { audioRecord.read(Data, 0, Data.length); audioEncoder.offerEncoder(Data); } Here are the settings for my AudioRecord: int audioSource = MediaRecorder.AudioSource.MIC; int sampleRateInHz = 44100; int channelConfig = AudioFormat.CHANNEL_IN_MONO; int
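For reference, a minimal sketch of configuring MediaCodec as an AAC-LC encoder to match a 44.1 kHz mono AudioRecord, with an offerEncoder-style method; the bit rate and buffer handling are illustrative, not the asker's actual class. Note that the encoder emits raw AAC frames, so writing a playable file or stream still requires a container (MediaMuxer on API 18+) or per-frame ADTS headers:

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import java.io.IOException;
    import java.nio.ByteBuffer;

    public class AacEncoder {
        private final MediaCodec codec;

        public AacEncoder() throws IOException {
            MediaFormat format = MediaFormat.createAudioFormat("audio/mp4a-latm", 44100, 1);
            format.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
            format.setInteger(MediaFormat.KEY_BIT_RATE, 64000); // illustrative bit rate
            codec = MediaCodec.createEncoderByType("audio/mp4a-latm");
            codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            codec.start();
        }

        // Feed one buffer of 16-bit PCM from AudioRecord and drain any encoded output.
        public void offerEncoder(byte[] pcm) {
            int inIndex = codec.dequeueInputBuffer(10000);
            if (inIndex >= 0) {
                ByteBuffer in = codec.getInputBuffers()[inIndex];
                in.clear();
                in.put(pcm);
                codec.queueInputBuffer(inIndex, 0, pcm.length, System.nanoTime() / 1000, 0);
            }
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int outIndex = codec.dequeueOutputBuffer(info, 0);
            while (outIndex >= 0) {
                // A buffer flagged BUFFER_FLAG_CODEC_CONFIG carries codec-specific
                // data, not audio; real code should treat it separately.
                ByteBuffer out = codec.getOutputBuffers()[outIndex];
                byte[] aacFrame = new byte[info.size];
                out.position(info.offset);
                out.get(aacFrame);   // raw AAC frame: mux it or prepend an ADTS header
                codec.releaseOutputBuffer(outIndex, false);
                outIndex = codec.dequeueOutputBuffer(info, 0);
            }
        }
    }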

Android - Include native StageFright features in my own project

淺唱寂寞╮ submitted on 2019-11-28 16:45:24
I am currently developing an application that needs to record audio, encode it as AAC, stream it, and do the same in reverse - receive a stream, decode AAC, and play the audio. I successfully recorded AAC (wrapped in an MP4 container) using MediaRecorder, and successfully up-streamed audio using the AudioRecord class. But I need to be able to encode the audio as I stream it, and none of these classes seem to help me do that. I researched a bit and found that most people who have this problem end up using a native library like ffmpeg. But I was wondering, since Android already includes
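Since API 16, MediaCodec is essentially the public Java-level interface to the same platform codecs that StageFright uses, so on newer devices both directions of this pipeline can be done without a native library. Below is a rough sketch of just the setup for the receive side, an AAC-LC decoder plus an AudioTrack sink; the 2-byte csd-0 value assumes 44.1 kHz stereo AAC-LC and, like the rest of the configuration, is an illustrative assumption:

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;
    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import java.io.IOException;
    import java.nio.ByteBuffer;

    public class AacDecoderSetup {
        // Configures an AAC decoder and a matching AudioTrack for the decoded PCM.
        // Decoding itself is the usual dequeueInputBuffer/dequeueOutputBuffer loop:
        // feed received AAC frames in, write decoded PCM out to the AudioTrack.
        public static MediaCodec createDecoder() throws IOException {
            MediaFormat format = MediaFormat.createAudioFormat("audio/mp4a-latm", 44100, 2);
            // AudioSpecificConfig: AAC-LC, 44.1 kHz, 2 channels (assumed stream parameters).
            format.setByteBuffer("csd-0", ByteBuffer.wrap(new byte[] {(byte) 0x12, (byte) 0x10}));
            MediaCodec decoder = MediaCodec.createDecoderByType("audio/mp4a-latm");
            decoder.configure(format, null, null, 0);
            decoder.start();
            return decoder;
        }

        public static AudioTrack createSink() {
            int minBuf = AudioTrack.getMinBufferSize(44100,
                    AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
            return new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
                    AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
                    minBuf * 2, AudioTrack.MODE_STREAM);
        }
    }

On devices older than API 16, the native/ffmpeg route mentioned in the question remains the usual fallback.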