MediaRecorder

MediaRecorder start fail -2147483648

Submitted by 醉酒当歌 on 2019-11-29 15:47:19
I intend to record calls with this application. But when I set the audioSource to MediaRecorder.AudioSource.VOICE_CALL it gives an error, whereas with MediaRecorder.AudioSource.MIC it works perfectly fine. I am not sure where the problem is. The logcat of the problem is below. Any help is greatly appreciated. Thanks.

public class IncomingCallReceiver extends BroadcastReceiver {
    private MediaRecorder mRecorder;

    @Override
    public void onReceive(Context context, Intent intent) {
        Bundle bundle = intent.getExtras();
        if (null == bundle) return;
        String state = bundle
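On most retail Android builds, AudioSource.VOICE_CALL is gated behind the system-only CAPTURE_AUDIO_OUTPUT permission, which third-party apps cannot hold, so the recorder fails during setup rather than because of a coding mistake. A minimal sketch of a fallback chain, assuming illustrative output settings and method name (Android-only code, not runnable off-device):

```java
import java.io.IOException;
import android.media.MediaRecorder;

// Sketch: try VOICE_CALL first, then fall back. VOICE_CALL requires the
// system-only CAPTURE_AUDIO_OUTPUT permission on most retail builds, so
// third-party apps usually end up on MIC. Output settings are examples.
MediaRecorder createRecorder(String outputPath) throws IOException {
    int[] sources = {
        MediaRecorder.AudioSource.VOICE_CALL,
        MediaRecorder.AudioSource.VOICE_COMMUNICATION,
        MediaRecorder.AudioSource.MIC,
    };
    for (int source : sources) {
        MediaRecorder recorder = new MediaRecorder();
        try {
            recorder.setAudioSource(source);  // throws for forbidden sources
            recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
            recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
            recorder.setOutputFile(outputPath);
            recorder.prepare();
            return recorder;
        } catch (RuntimeException | IOException e) {
            recorder.release();  // clean up and try the next source
        }
    }
    throw new IOException("no usable audio source");
}
```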

Camera2 video recording without preview on Android: mp4 output file not fully playable

Submitted by 天涯浪子 on 2019-11-29 13:59:48
Question: I am trying to record video from the back camera (the one that faces away from the user) on my Samsung Galaxy S6 (which supports 1920x1080 at about 30 fps). I do not want to use any surface for previewing if I can avoid it, as this should just happen in the background. I seem to have it working, but the output files are not actually playable correctly. On my Windows 10 PC, Windows Media Player shows the first frame and then plays the audio, and VLC does not show any of the frames.
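A frequent cause of "first frame plus audio" MP4s is tearing down the camera session without letting MediaRecorder.stop() complete, since the MP4 index (moov atom) is only written at stop. A sketch of the teardown order, assuming a Camera2 CameraCaptureSession feeding the recorder surface (Android-only, illustrative method name):

```java
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.media.MediaRecorder;

// Sketch: the MP4 index (moov atom) is written only when stop() completes,
// so tear down in this order; killing the session mid-record leaves a file
// that shows one frame and plays only audio.
void finishRecording(CameraCaptureSession session, MediaRecorder recorder)
        throws CameraAccessException {
    session.stopRepeating();  // stop feeding the recorder surface
    recorder.stop();          // flushes the encoder and writes the moov atom
    recorder.reset();
    recorder.release();
    session.close();
}
```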

Video recording and onPreviewFrame callback at the same time

Submitted by 自古美人都是妖i on 2019-11-29 12:00:06
I'm trying to record video using MediaRecorder and get raw frames (byte arrays) from the onPreviewFrame callback at the same time. It seems it's not that easy; maybe it's not even possible, I don't know. But I found some answers to similar questions saying you should reconnect the camera instance (Camera.reconnect()) after calling MediaRecorder.start() and set the preview callback again. I tried something like this, but it doesn't work (recording works, but onPreviewFrame is never called). I also tried calling the Camera's stopPreview and startPreview methods after MediaRecorder.start(), but it seems we should not
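For reference, the workaround the asker describes looks like the following sketch. Whether onPreviewFrame keeps firing during recording is device- and vendor-dependent, which matches the asker's experience; this is an illustration of the suggested approach, not a guaranteed fix:

```java
import java.io.IOException;
import android.hardware.Camera;
import android.media.MediaRecorder;

// Sketch of the workaround: after start(), the MediaRecorder owns the
// camera, so reclaim it with reconnect() and re-register the callback.
void startRecordingWithFrames(MediaRecorder recorder, Camera camera)
        throws IOException {
    recorder.start();
    camera.reconnect();  // take the camera back from the recorder
    camera.setPreviewCallback(new Camera.PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] data, Camera cam) {
            // Raw NV21 preview bytes; only some devices keep delivering
            // these while MediaRecorder is recording.
        }
    });
}
```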

Android Camera usage summary

Submitted by 泄露秘密 on 2019-11-29 09:26:48
Camera use on an Android phone comes down to two things: taking photos and recording video. Because Android provides powerful component support, there are two ways to develop Camera features on an Android phone: one is to use an Intent together with MediaStore to invoke the system Camera app for taking photos and videos; the other is to write your own Camera program against the Camera API. Writing your own Camera code requires a thorough understanding of the Camera API, and for generic photo and video needs invoking the system Camera app is already sufficient, so this summary starts from invoking the system Camera app.

Invoking the system Camera app for photos and video. For apps that are not dedicated camera applications, the usual need is simply to obtain a photo or a video, for example for a microblog post or a quick expense note. On Symbian this could not be achieved by simply invoking the built-in Camera app, but Android's component model means a developer only needs an Intent to conveniently open the built-in Camera app, and can then obtain the photo or video file path through MediaStore. Let the code do the talking:

Example 1: Taking a photo. In a menu or button handler, call the following code to launch the built-in Camera app and pass it a storage path for the photo, like so: imgPath = "/sdcard
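Example 1 from the article can be sketched as follows; the request code, method name, and file handling are illustrative (Android-only code):

```java
import java.io.File;
import android.app.Activity;
import android.content.Intent;
import android.net.Uri;
import android.provider.MediaStore;

// Sketch of Example 1: launch the system Camera app and pass it the
// storage path for the photo via EXTRA_OUTPUT.
static final int REQUEST_TAKE_PHOTO = 1;

void dispatchTakePicture(Activity activity, File photoFile) {
    Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
    intent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(photoFile));
    // On API 24+ a FileProvider content:// URI is required instead of file://.
    activity.startActivityForResult(intent, REQUEST_TAKE_PHOTO);
}
```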

MediaRecorder start failed -19

Submitted by 北战南征 on 2019-11-29 07:59:15
I am getting this error when running start() for MediaRecorder:

06-28 18:46:22.570: E/MediaRecorder(9540): start failed: -19
06-28 18:46:22.570: W/System.err(9540): java.lang.RuntimeException: start failed.

I am extending the MediaRecorder class. My code:

camera = Camera.open(cameraId);
super.setCamera(camera);
super.setVideoSource(MediaRecorder.VideoSource.CAMERA);
super.setOutputFormat(MediaRecorder.OutputFormat.DEFAULT);
if (mode == MODE_DEFAULT) {
    super.setMaxDuration(1000);
    super.setMaxFileSize(Integer.MAX_VALUE);
} else {
    // On some phones a RuntimeException might be thrown :/
    try {
        super
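Error -19 at start() commonly traces back to handing the recorder a camera that was never unlocked, or to requesting a frame size the encoder does not support. A sketch of a safer setup, using a CamcorderProfile so the size is one the device advertises (QUALITY_LOW is an example; cameraId is as in the question):

```java
import android.hardware.Camera;
import android.media.CamcorderProfile;
import android.media.MediaRecorder;

// Sketch: unlock the camera before setCamera(), and take size/rate from
// a CamcorderProfile instead of hard-coding values.
Camera camera = Camera.open(cameraId);
camera.unlock();  // required before setCamera(); the recorder takes over
MediaRecorder recorder = new MediaRecorder();
recorder.setCamera(camera);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
CamcorderProfile profile =
        CamcorderProfile.get(cameraId, CamcorderProfile.QUALITY_LOW);
recorder.setVideoFrameRate(profile.videoFrameRate);
recorder.setVideoSize(profile.videoFrameWidth, profile.videoFrameHeight);
recorder.setVideoEncoder(profile.videoCodec);
```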

Using android MediaRecorder

Submitted by 不羁岁月 on 2019-11-29 07:54:32
Question: Below is the structure of my working code to record video and audio. Questions: 1) Why is the CamcorderProfile needed? setProfile(...) appears to set the dimensions to whatever QUALITY_HIGH gives, but later I set the dimensions I want with setVideoSize(...), which overrides this. However, when I remove the two CamcorderProfile lines, the app crashes at setVideoSize(...) with LogCat: E/MediaRecorder(19526): setVideoSize called in an invalid state: 2. 2) How do I not record audio? The
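Both questions have short answers: setProfile() bundles the output-format and encoder calls that put the recorder into the state setVideoSize() requires, and audio is omitted simply by never calling setAudioSource(). A sketch applying only the video half of a profile (the 640x480 size is an example override):

```java
import android.media.CamcorderProfile;
import android.media.MediaRecorder;

// Sketch: setProfile() bundles the setOutputFormat()/encoder calls that
// put the recorder in the state setVideoSize() requires; skipping
// setAudioSource() is how you get a video-only recording.
CamcorderProfile p = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH);
MediaRecorder recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
// No setAudioSource() call -> no audio track in the output.
recorder.setOutputFormat(p.fileFormat);
recorder.setVideoFrameRate(p.videoFrameRate);
recorder.setVideoSize(640, 480);  // example: override the profile's size
recorder.setVideoEncodingBitRate(p.videoBitRate);
recorder.setVideoEncoder(p.videoCodec);
```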

How to pause/resume a recording created with mediarecorder?

Submitted by 不想你离开。 on 2019-11-29 05:16:41
I'm trying to pause a recording on an incoming call and resume it later. I'm using the Android MediaRecorder and trying to record in MPEG4. I tried pause/resume by resetting/stopping a recording and restarting it with setOutputFile(fd), fd being the file descriptor of the audio file that was stopped/paused, hoping it would append, but I had no luck. Is there a way to achieve this or to append two recordings, or should I give up on MediaRecorder? Code:

private MediaRecorder media_recorder;
private String file_path = null;

public void startRecording(String path) {
    file_path = path;
    media_recorder =
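Since API 24, MediaRecorder has native pause()/resume(), which sidesteps the append problem entirely; on older devices the usual approach is to record separate segments and merge them afterwards. A guarded sketch (method names are illustrative):

```java
import android.media.MediaRecorder;
import android.os.Build;

// Sketch: API 24+ can pause and resume into the same output file;
// pre-N devices must close the segment and start a new file instead.
void onIncomingCall(MediaRecorder recorder) {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.N) {
        recorder.pause();    // recording later continues in the same file
    } else {
        recorder.stop();     // pre-N: close this segment, merge files later
    }
}

void onCallEnded(MediaRecorder recorder) {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.N) {
        recorder.resume();   // appends to the paused recording
    }
}
```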

how to switch between front and back camera when using MediaRecorder android

Submitted by 不打扰是莪最后的温柔 on 2019-11-29 00:24:47
Does anyone have an idea of how to switch between the front and back camera while using MediaRecorder? I define a button for this function, but have no idea how to write its OnClickListener. The whole activity is the following:

import java.io.File;
import java.io.IOException;
import android.app.Activity;
import android.content.Intent;
import android.content.pm.ActivityInfo;
import android.hardware.Camera;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.os.Environment;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import
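MediaRecorder cannot switch cameras mid-recording, so a click handler generally stops and releases both objects, opens the other camera id, and rebuilds the pipeline. A sketch of that shape (field names camera, mediaRecorder, and currentCameraId are illustrative):

```java
import android.hardware.Camera;
import android.media.MediaRecorder;

// Sketch of a switch-camera click handler: stop and release the current
// recorder and camera, then reopen with the other camera id.
void onSwitchClicked() {
    mediaRecorder.stop();
    mediaRecorder.release();
    camera.stopPreview();
    camera.release();
    currentCameraId = (currentCameraId == Camera.CameraInfo.CAMERA_FACING_BACK)
            ? Camera.CameraInfo.CAMERA_FACING_FRONT
            : Camera.CameraInfo.CAMERA_FACING_BACK;
    camera = Camera.open(currentCameraId);
    // ...re-attach the preview surface, reconfigure the recorder, restart.
}
```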

How to record video on Android into Stream

Submitted by 倖福魔咒の on 2019-11-28 23:04:44
Question: Android MediaRecorder allows saving video to a file (or socket):

setOutputFile(FileDescriptor fd);
setOutputFile(String path);

How do I save video data to an OutputStream? It will be used for streaming video recording.

Answer 1: You can do it using ParcelFileDescriptor.fromSocket():

String hostname = "example.com";
int port = 1234;
Socket socket = new Socket(InetAddress.getByName(hostname), port);
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);
MediaRecorder recorder = new
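The truncated answer can be completed along these lines. One caveat worth adding: MPEG-4 output is not usable when streamed this way, because the moov index is written only at stop(), so streaming setups generally pick a streamable container such as THREE_GPP. Host, port, and codec choices here are placeholders:

```java
import java.net.InetAddress;
import java.net.Socket;
import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;

// Sketch completing the answer: wrap a TCP socket in a file descriptor
// and hand it to the recorder as the output "file".
Socket socket = new Socket(InetAddress.getByName("example.com"), 1234);
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
// MPEG_4 is unseekable over a stream (moov is written at stop), so use
// a streamable container instead.
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile(pfd.getFileDescriptor());
recorder.prepare();
recorder.start();
```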

Is it possible to merge multiple webm blobs/clips into one sequential video clientside?

Submitted by 戏子无情 on 2019-11-28 22:07:28
I already looked at this question, "Concatenate parts of two or more webm video blobs", and tried the sample code at https://developer.mozilla.org/en-US/docs/Web/API/MediaSource (without modifications) in hopes of transforming the blobs into ArrayBuffers and appending those to a SourceBuffer for the MediaSource Web API, but even the sample code wasn't working in my Chrome browser, for which it is said to be compatible. The crux of my problem is that I can't combine multiple webm blob clips into one without incorrect playback after the first time it plays. To go straight to the problem