How to stream data from MediaCodec to AudioTrack with Xamarin for Android


Question


I'm trying to decode an mp3 file and stream it to AudioTrack. It all works fine, but it causes a lot of GC on the Java side. I've made sure not to allocate memory in my play/stream loop, and I suspect the ByteBuffer.Get(byte[], int, int) binding of allocating a temporary Java array on every call. Can anyone confirm this and/or show a better way of feeding data from MediaCodec to AudioTrack? (I know API 21 introduced AudioTrack.write(ByteBuffer, ...).) Thanks

Here is what I do:

byte[] audioBuffer = new byte[...];

...

ByteBuffer codecOutputBuffer = codecOutputBuffers[outputIndex];

// The next line seems to be the source of a lot of GC during playback
codecOutputBuffer.Get(audioBuffer, 0, bufInfo.Size);

audioTrack.Write(audioBuffer, 0, bufInfo.Size);
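
For reference, on API 21+ the ByteBuffer overload of AudioTrack.Write should make the intermediate managed copy unnecessary. A minimal sketch, assuming the Xamarin binding Write(ByteBuffer, int, WriteMode) (added in API 21) and the same codecOutputBuffer/bufInfo as above:

// Hedged sketch (API 21+): write straight from the codec's direct output
// buffer, skipping the managed byte[] and the copy through Get(...).
codecOutputBuffer.Position(bufInfo.Offset);
codecOutputBuffer.Limit(bufInfo.Offset + bufInfo.Size);
audioTrack.Write(codecOutputBuffer, bufInfo.Size, WriteMode.Blocking);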

UPDATE 1: I used the Allocation Tracker to confirm the allocation site. It turns out the allocated objects are 8 KB byte arrays. Unfortunately, the Allocation Tracker does not show an allocation-site stack trace for them:

#   Size    Allocated class                         Thread  Allocated in
1   32      org.apache.harmony.dalvik.ddmc.Chunk    6       org.apache.harmony.dalvik.ddmc.DdmServer.dispatch
2   16      java.lang.Integer                       6       java.lang.Integer.valueOf
3   16      byte[]                                  6
4   8192    byte[]                                  20
5   8192    byte[]                                  20
6   8192    byte[]                                  20

To make sure it was ByteBuffer.Get(byte[], int, int) that allocates the arrays, I re-ran the app with:

  1. audioTrack.Write(...) commented out - no change

  2. codecOutputBuffer.Get(audioBuffer, 0, bufInfo.Size) commented out - the allocations are gone

I'm going to rewrite it in Java to check if I get the same results in a native app.

UPDATE 2: I have rewritten the code in Java, and now I get a perfectly flat graph in the Memory Monitor - no allocations during playback.

My conclusion/guess is that the ByteBuffer.Get(byte[], int, int) binding from Mono to Java allocates a temporary array. I'm not really sure why it is 8 KB, since my audioBuffer only ever grows to slightly above 4 KB.
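
If the binding really is copying through a temporary Java array, one way to sidestep it is to copy from the direct buffer's native address yourself. A hedged sketch, not verified against the binding internals: it assumes the codec's output buffers are direct byte buffers (they normally are for MediaCodec) and needs Android.Runtime (JNIEnv) and System.Runtime.InteropServices (Marshal):

// Hedged workaround sketch: read from the direct output buffer's native
// address instead of going through the Get(byte[], int, int) binding.
IntPtr src = JNIEnv.GetDirectBufferAddress(codecOutputBuffer.Handle);
Marshal.Copy(IntPtr.Add(src, bufInfo.Offset), audioBuffer, 0, bufInfo.Size);
audioTrack.Write(audioBuffer, 0, bufInfo.Size);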

UPDATE 3: My ultimate goal is a cross-platform app (with more complex functionality on top of an mp3 player), so I went ahead and created another experiment. I now have a Java component with the audio streaming/decoding/playing functionality that exposes only play() and pause() methods, which I consume from C#. This way I avoid the allocation problem but can still drive the player from my (hopefully reusable) C# code. Source code below (this is just research, not production code).

Java:

import android.content.Context;
import android.content.res.AssetFileDescriptor;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;    
import java.nio.ByteBuffer;

public class AudioPlayer {

    public void play(Context aContext, final int resourceId){

        final Context context = aContext;

        new Thread()
        {
            @Override
            public void run() {

                try {
                    AssetFileDescriptor fd = context.getResources().openRawResourceFd(resourceId);

                    MediaExtractor extractor = new MediaExtractor();
                    extractor.setDataSource(fd.getFileDescriptor(), fd.getStartOffset(), fd.getLength());
                    extractor.selectTrack(0);

                    MediaFormat trackFormat = extractor.getTrackFormat(0);

                    MediaCodec decoder = MediaCodec.createDecoderByType(trackFormat.getString(MediaFormat.KEY_MIME));
                    decoder.configure(trackFormat, null, null, 0);

                    decoder.start();
                    ByteBuffer[] decoderInputBuffers = decoder.getInputBuffers();
                    ByteBuffer[] decoderOutputBuffers = decoder.getOutputBuffers();

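                    // A timeout of -1 blocks until a buffer is available.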
                    int inputIndex = decoder.dequeueInputBuffer(-1);
                    ByteBuffer inputBuffer = decoderInputBuffers[inputIndex];
                    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
                    byte[] audioBuffer = null;
                    AudioTrack audioTrack = null;

                    int read = extractor.readSampleData(inputBuffer, 0);
                    while (read > 0) {
                        decoder.queueInputBuffer(inputIndex, 0, read, extractor.getSampleTime(), 0);

                        extractor.advance();

                        int outputIndex = decoder.dequeueOutputBuffer(bufferInfo, -1);
                        if (outputIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {

                            trackFormat = decoder.getOutputFormat();

                        } else if (outputIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {

                            // The codec swapped its output buffers; refresh our reference.
                            decoderOutputBuffers = decoder.getOutputBuffers();

                        } else if (outputIndex >= 0) {

                            if (bufferInfo.size > 0) {

                                ByteBuffer outputBuffer = decoderOutputBuffers[outputIndex];
                                if (audioBuffer == null || audioBuffer.length < bufferInfo.size) {
                                    audioBuffer = new byte[bufferInfo.size];
                                }

                                outputBuffer.rewind();
                                outputBuffer.get(audioBuffer, 0, bufferInfo.size);
                                decoder.releaseOutputBuffer(outputIndex, false);

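                                // Create the AudioTrack lazily, once the decoded
                                // format's sample rate and channel count are known.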
                                if (audioTrack == null) {
                                    int sampleRateInHz = trackFormat.getInteger(MediaFormat.KEY_SAMPLE_RATE);
                                    int channelCount = trackFormat.getInteger(MediaFormat.KEY_CHANNEL_COUNT);
                                    int channelConfig = channelCount == 1 ? AudioFormat.CHANNEL_OUT_MONO : AudioFormat.CHANNEL_OUT_STEREO;

                                    audioTrack = new AudioTrack(
                                            AudioManager.STREAM_MUSIC,
                                            sampleRateInHz,
                                            channelConfig,
                                            AudioFormat.ENCODING_PCM_16BIT,
                                            AudioTrack.getMinBufferSize(sampleRateInHz, channelConfig, AudioFormat.ENCODING_PCM_16BIT) * 2,
                                            AudioTrack.MODE_STREAM);

                                    audioTrack.play();
                                }

                                audioTrack.write(audioBuffer, 0, bufferInfo.size);
                            }
                        }

                        inputIndex = decoder.dequeueInputBuffer(-1);
                        inputBuffer = decoderInputBuffers[inputIndex];

                        read = extractor.readSampleData(inputBuffer, 0);
                    }

                    // Release codec, extractor and audio resources when playback ends.
                    if (audioTrack != null) {
                        audioTrack.stop();
                        audioTrack.release();
                    }
                    decoder.stop();
                    decoder.release();
                    extractor.release();
                    fd.close();
                } catch (Exception e) {
                    // Swallowing exceptions silently makes debugging hard; at least log them.
                    e.printStackTrace();
                }
            }
        }.start();    
    }    
}

C#:

[Activity(Label = "AndroidAudioTest", MainLauncher = true, Icon = "@drawable/icon")]
public class MainActivity : Activity
{
    protected override void OnCreate(Bundle bundle)
    {
        base.OnCreate(bundle);

        SetContentView(Resource.Layout.Main);

        var play = FindViewById<Button>(Resource.Id.Play);
        play.Click += (s, e) =>
        {
            new AudioPlayer().Play(this, Resource.Raw.PianoInsideMics);
        };
    }
}
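
A note on wiring this together: calling new AudioPlayer().Play(...) as above assumes the Java class is exposed to C#, e.g. through a Xamarin Java Binding Library around the compiled .jar. If you want to avoid a separate binding project, a hedged alternative is to call it through the low-level JNIEnv API. The JNI class name com/example/AudioPlayer below is hypothetical; substitute the package your Java class actually lives in:

// Hedged sketch: drive the Java AudioPlayer via JNIEnv, no binding project.
// Requires using Android.Runtime;
IntPtr playerClass = JNIEnv.FindClass("com/example/AudioPlayer");
IntPtr ctor = JNIEnv.GetMethodID(playerClass, "<init>", "()V");
IntPtr player = JNIEnv.NewObject(playerClass, ctor);
IntPtr play = JNIEnv.GetMethodID(playerClass, "play", "(Landroid/content/Context;I)V");
JNIEnv.CallVoidMethod(player, play,
    new JValue(this),                        // the Activity doubles as Context
    new JValue(Resource.Raw.PianoInsideMics));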

Source: https://stackoverflow.com/questions/28701102/how-to-stream-data-from-mediacodec-to-audiotrack-with-xamarin-for-android
