MediaRecorder

Android MediaRecorder in streaming

寵の児 submitted on 2019-11-30 13:43:00
Is it possible to "stream" the result of MediaRecorder? The only method I can see is mediaRecorder.setOutputFile, which receives a FileDescriptor, so I can either write the result to a file or send it via socket to a receiver. I tried the second solution, but the resulting video is corrupted because it is not "seekable" in a stream. The idea is to use the camera of an Android device to publish the result to Red5. Yes, it is possible; there are many examples of that. You can check out the sipdroid example, or even Android IP camera, which is much simpler. Good luck. Md. Sulayman: Yes, it is possible. Here is the sample code with
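The corruption described above happens because MP4 writes its index (the moov atom) only when recording stops, so a live MP4 stream is unplayable. A hedged sketch of one workaround, not taken from the original answer: on API 26+ the MPEG-2 TS container needs no seeking. The `TsStreamer` class, host, and port here are hypothetical placeholders.

```java
import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;
import java.net.InetAddress;
import java.net.Socket;

// Hypothetical helper; host/port are placeholders, not from the question.
public class TsStreamer {
    public static MediaRecorder startStreaming(String host, int port) throws Exception {
        Socket socket = new Socket(InetAddress.getByName(host), port);
        ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);

        MediaRecorder recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        // MP4 writes its index (moov atom) only on stop(), so a live MP4
        // stream is unplayable; MPEG-2 TS (API 26+) requires no seeking.
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_2_TS);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
        recorder.setOutputFile(pfd.getFileDescriptor());
        recorder.prepare();
        recorder.start();
        return recorder;
    }
}
```

Note that Red5 speaks RTMP, so a raw TS-over-TCP stream like this would still need a repackaging step on the receiving side.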

Camera2 video recording without preview on Android: mp4 output file not fully playable

回眸只為那壹抹淺笑 submitted on 2019-11-30 13:08:52
I am trying to record video from the back camera on my Samsung Galaxy S6 (which supports 1920x1080 at about 30 fps). I do not want to use any surface for previewing if I can avoid it, as this should just happen in the background. I seem to have it working, but the output files are not playable correctly. On my Windows 10 PC, Windows Media Player shows the first frame and then plays the audio; VLC does not show any of the frames. On my phone, the recorded file is playable, but not completely: it holds the first frame for 5-8
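The "first frame only" symptom is typically what an MP4 looks like when the recorder was never stopped cleanly, since the moov index is written by stop(). A hedged sketch of a safe stop sequence (the helper class name is invented for illustration):

```java
import android.media.MediaRecorder;

public class RecorderStopper {
    // If stop() is never reached (or throws), the MP4's moov index is not
    // written, and players show only the first frame, as described above.
    public static void stopSafely(MediaRecorder recorder) {
        try {
            recorder.stop();           // finalizes the file, writes the moov atom
        } catch (RuntimeException e) { // thrown when no valid data was captured
            // the output file is unusable in this case
        } finally {
            recorder.reset();
            recorder.release();
        }
    }
}
```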

How to record the video using camera preview on TextureView

北战南征 submitted on 2019-11-30 12:08:09
I have been playing with TextureView on some Android 4.0 devices. For example, I tried to develop a simple app which can record video and uses TextureView for its preview. However, as far as I can tell from the Android API documents, the standard MediaRecorder requires a certain surface in order to perform video recording; on the other hand, TextureView does not have its own surface, so an incompatibility between TextureView and MediaRecorder seems to exist. MediaRecorder.setPreviewDisplay TextureView Does anyone know how to record video using the standard MediaRecorder with TextureView as its
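A hedged sketch of one resolution, assuming the deprecated android.hardware.Camera API of that era: the camera's preview can be sent to the TextureView's SurfaceTexture, so MediaRecorder itself needs no preview display. The `TextureRecorder` wrapper is hypothetical.

```java
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.media.MediaRecorder;

public class TextureRecorder {
    // Sketch using the deprecated android.hardware.Camera API.
    public static MediaRecorder record(SurfaceTexture texture, String path) throws Exception {
        Camera camera = Camera.open();
        // TextureView has no Surface, but its SurfaceTexture can receive
        // the camera preview directly.
        camera.setPreviewTexture(texture);
        camera.startPreview();
        camera.unlock(); // hand the camera over to MediaRecorder

        MediaRecorder recorder = new MediaRecorder();
        recorder.setCamera(camera);
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setOutputFile(path);
        recorder.prepare();
        recorder.start();
        return recorder;
    }
}
```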

Couldn't hear incoming voice in recorded calls in android 7?

萝らか妹 submitted on 2019-11-30 07:22:34
I am developing an Android app for recording calls. This is my code snippet:

    recorder = new MediaRecorder();
    recorder.setAudioSource(MediaRecorder.AudioSource.DEFAULT);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.DEFAULT);
    recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
    recorder.setOutputFile(file_path);

This works perfectly on devices below Android 7, but on Android 7 devices I can hear only the outgoing voice, not the incoming voice. Can anyone help me fix it? Use VOICE_COMMUNICATION as the AudioSource, as it is a microphone audio source tuned for
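The suggested fix above, written out as a hedged sketch (the wrapper class is invented; `filePath` stands in for the question's file_path):

```java
import android.media.MediaRecorder;

public class CallRecorderSketch {
    public static MediaRecorder start(String filePath) throws Exception {
        MediaRecorder recorder = new MediaRecorder();
        // VOICE_COMMUNICATION is the microphone source tuned for calls
        // (echo/noise processed). Whether the downlink (incoming) voice is
        // captured at all on Android 7+ still depends on the OEM, so this
        // is not guaranteed to work on every device.
        recorder.setAudioSource(MediaRecorder.AudioSource.VOICE_COMMUNICATION);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
        recorder.setOutputFile(filePath);
        recorder.prepare();
        recorder.start();
        return recorder;
    }
}
```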

Android mediarecording error start failed -19 runTimeException

夙愿已清 submitted on 2019-11-30 03:50:13
Question: I'm experiencing a problem with my media recording. I'm trying to use the front camera to record, and this gives me an error (though the preview works). Whenever I use the back camera, everything works just fine, which I find very strange. What could be the problem, and what could be the solution? My code and errors are shown below. Edit: recording with a VGA front camera seems not to work. How is this possible? Recording with the HTC camera app IS possible, though. Thanking you in advance.
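The -19 start failure is commonly caused by asking the camera for a size or profile it does not support; front cameras often advertise fewer sizes than back cameras. A hedged sketch (not from the question) that picks a profile the front camera actually reports:

```java
import android.media.CamcorderProfile;
import android.media.MediaRecorder;

public class FrontCameraRecorder {
    // frontCameraId is typically 1, but should be looked up via
    // Camera.getCameraInfo(). setProfile() must be called after the
    // audio/video sources are set.
    public static void configure(MediaRecorder recorder, int frontCameraId) {
        int quality = CamcorderProfile.hasProfile(frontCameraId, CamcorderProfile.QUALITY_480P)
                ? CamcorderProfile.QUALITY_480P
                : CamcorderProfile.QUALITY_LOW; // QUALITY_LOW always exists
        recorder.setProfile(CamcorderProfile.get(frontCameraId, quality));
    }
}
```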

How to record video on Android into Stream

我与影子孤独终老i submitted on 2019-11-30 02:10:17
Android MediaRecorder allows saving video to a file (or socket):

    setOutputFile(FileDescriptor fd);
    setOutputFile(String path)

How do I save the video data to an OutputStream? It will be used for streaming video recording. You can do it using ParcelFileDescriptor.fromSocket():

    String hostname = "example.com";
    int port = 1234;
    Socket socket = new Socket(InetAddress.getByName(hostname), port);
    ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);
    MediaRecorder recorder = new MediaRecorder();
    recorder.setOutputFile(pfd.getFileDescriptor());
    recorder.prepare();
    recorder.start();

If you
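For an arbitrary OutputStream rather than a socket, a pipe works: the recorder writes to one end and a background thread copies the other end wherever needed. A hedged sketch (the helper class is invented; the MP4 seekability caveat from the entries above still applies to whatever is streamed):

```java
import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;
import java.io.InputStream;
import java.io.OutputStream;

public class RecorderPipe {
    // Gives the recorder a FileDescriptor and pumps the recorded bytes
    // into any OutputStream.
    public static void recordInto(MediaRecorder recorder, OutputStream out) throws Exception {
        ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
        recorder.setOutputFile(pipe[1].getFileDescriptor()); // write side

        InputStream in = new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]);
        new Thread(() -> { // pump on a background thread
            byte[] buf = new byte[8192];
            try (InputStream src = in) {
                int n;
                while ((n = src.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
            } catch (Exception ignored) {
            }
        }).start();
    }
}
```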

Android MediaRecorder crashes on stop when using MP4/H264 and a resolution bigger than 720p

时光怂恿深爱的人放手 submitted on 2019-11-29 23:24:25
Question: I'm trying to write a screen recorder for Android, and I use this as a base: https://github.com/commonsguy/cw-omnibus/tree/master/MediaProjection/andcorder My problem is: if I select a resolution bigger than 720p with MP4/H264 settings, it crashes on stop. Examples: (Example 1) does NOT work. Code in RecordingSession.java:

    void start() {
        recorder = new MediaRecorder();
        recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder
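A crash on stop() with larger resolutions often means the device's H.264 encoder does not support the requested size, so nothing valid was recorded. A hedged sketch (not from the linked project) that probes the encoder before configuring the recorder:

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.media.MediaFormat;

public class ResolutionCheck {
    // Asks the device's H.264 encoder whether it supports the requested
    // size; an unsupported size is a common cause of the crash on stop().
    public static boolean supportsH264Size(int width, int height) {
        MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            if (!info.isEncoder()) continue;
            for (String type : info.getSupportedTypes()) {
                if (type.equalsIgnoreCase(MediaFormat.MIMETYPE_VIDEO_AVC)) {
                    return info.getCapabilitiesForType(type)
                               .getVideoCapabilities()
                               .isSizeSupported(width, height);
                }
            }
        }
        return false;
    }
}
```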

MediaRecorder - record calls application

眉间皱痕 submitted on 2019-11-29 17:46:21
I'm trying to develop an application that records calls. When I'm recording, the output sounds very weird: electronic sounds instead of the other person's voice. Here is my code:

    public class MainActivity extends Activity implements OnClickListener {
        private Boolean Recording;
        private Button btn_REC;
        private MediaRecorder mrec;
        private File audiofile = null;
        private static final String TAG = "SoundRecordingDemo";

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.main);
            Recording = false;
            mrec = new MediaRecorder();
            btn_REC

Record as Ogg using MediaRecorder in Chrome

不羁岁月 submitted on 2019-11-29 15:55:44
Is there a way to record in Ogg format in Chrome while working with MediaRecorder? I believe Chrome by default supports WebM. The following is what I do:

    navigator.mediaDevices.getUserMedia({ audio: true })
      .then(stream => {
        rec = new MediaRecorder(stream);
        rec.ondataavailable = e => {
          audioChunks.push(e.data);
          if (rec.state == "inactive") {
            let blob = new Blob(audioChunks, { 'type': 'audio/ogg; codecs=opus' });
          }
        };
      })
      .catch(e => console.log(e));

You just missed starting the recorder, like: rec.start(timeslice) Code like this works; I added a player every time we record audio, so we can
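Beyond starting the recorder, the Blob's type should match what the browser actually records. A hedged sketch (not from the original answer) that probes support with MediaRecorder.isTypeSupported and falls back to WebM, which also carries Opus:

```javascript
// Chrome's MediaRecorder generally does not support audio/ogg; probe
// before constructing the recorder and fall back to WebM.
const preferred = ['audio/ogg;codecs=opus', 'audio/webm;codecs=opus'];
const mimeType = preferred.find(t => MediaRecorder.isTypeSupported(t));

navigator.mediaDevices.getUserMedia({ audio: true }).then(stream => {
  const rec = new MediaRecorder(stream, { mimeType });
  const chunks = [];
  rec.ondataavailable = e => chunks.push(e.data);
  rec.onstop = () => {
    // The Blob type must match what was actually recorded.
    const blob = new Blob(chunks, { type: mimeType });
  };
  rec.start(1000); // emit a chunk every second
});
```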