libstreaming

Getting ERROR_CAMERA_ALREADY_IN_USE when USB debugging on my Nexus 5 with libstreaming

你离开我真会死。 Submitted on 2020-01-06 14:45:18
Question: I am trying to use the libstreaming library to stream my phone's camera output. I built my app based on Example 3, but I keep getting an ERROR_CAMERA_ALREADY_IN_USE exception when USB debugging on my Nexus 5 (Android 6.0.1). I tried killing other apps and rebooting my phone, but the exception persists. I checked online and saw this, but it doesn't give me the correct answer. So I am asking for help and am grateful for any replies. Here is my MainActivity: import android…
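ERROR_CAMERA_ALREADY_IN_USE usually means another process, or a camera instance leaked by a previous debug run, still holds the device; the reliable fix is releasing the camera in onPause() (e.g. Camera.release(), or stopping the libstreaming session), but a short retry also helps while the OS reclaims a leaked handle. A minimal plain-Java sketch of that retry pattern; the openCamera callable stands in for android.hardware.Camera.open(), which only runs on-device:

```java
import java.util.concurrent.Callable;

// Retry pattern for acquiring a busy camera. Camera.open() throws a
// RuntimeException while another holder still owns the device, so we
// back off briefly and try again before giving up.
public class CameraRetry {
    static <T> T openWithRetry(Callable<T> openCamera, int attempts, long delayMs)
            throws Exception {
        Exception last = null;
        for (int i = 0; i < attempts; i++) {
            try {
                return openCamera.call();          // e.g. () -> Camera.open()
            } catch (Exception e) {                // "camera already in use"
                last = e;
                Thread.sleep(delayMs);             // give the OS time to reclaim it
            }
        }
        throw last;                                // still busy after all attempts
    }
}
```

This does not replace proper lifecycle handling; if the activity never releases the camera, every retry budget will eventually be exhausted.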

Android libstreaming with MediaCodec API: "buffer size not big enough" error

折月煮酒 Submitted on 2019-12-24 14:28:25
Question: I am using the libstreaming library and trying to stream with the RtspClient and the MediaCodec API. I am testing on a Galaxy S3 with Android 4.4. The problem is that no matter whether I use buffer-to-buffer or surface-to-buffer, I get this error: java.lang.IllegalStateException: The decoder input buffer is not big enough (nal=181322, capacity=65536). and java.lang.RuntimeException: The decoder did not decode anything. The MediaRecorder API works fine, but the quality is so low I can't tell if I have a…
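The capacity=65536 in the first exception says the decoder was handed a 64 KiB input buffer, while a single key-frame NAL here is about 181 KB. On Android the usual defence is to declare a larger input size before MediaCodec.configure(), via format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, …); that setter and key are real Android API, but the sizing heuristic below is an assumption. Sketched in plain Java so it runs off-device:

```java
// Heuristic sizing for a codec input buffer: one uncompressed YUV420 frame
// (width * height * 3/2 bytes) is a comfortable upper bound for any single
// compressed NAL at that resolution. On-device this value would be passed as
//   format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, maxInputSize(w, h));
// before MediaCodec.configure().
public class InputBufferSize {
    static int maxInputSize(int width, int height) {
        return width * height * 3 / 2;   // YUV420 frame size in bytes
    }

    public static void main(String[] args) {
        // 720p capture: far above both the 64 KiB default and the 181322-byte NAL
        System.out.println(maxInputSize(1280, 720));
    }
}
```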

Exception when streaming with libstreaming to VLC 2.2.4

半城伤御伤魂 Submitted on 2019-12-23 04:45:47
Question: I set up libstreaming to stream video from the Android camera to VLC 2.2.4 on my macOS machine. SharedPreferences.Editor editor = PreferenceManager.getDefaultSharedPreferences(context).edit(); editor.putString(RtspServer.KEY_PORT, String.valueOf(1777)); editor.apply(); SessionBuilder.getInstance() .setContext(context) .setCallback(this) .setAudioEncoder(SessionBuilder.AUDIO_NONE) .setVideoEncoder(SessionBuilder.VIDEO_H264) .setVideoQuality(new VideoQuality(640, 480, 15, 500000)); context.startService(new…
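One detail worth checking on the VLC side: the snippet binds RtspServer.KEY_PORT to 1777, so the Open Network Stream URL must use that port rather than libstreaming's usual default. A trivial sketch of building the URL (the IP address below is a placeholder, not a value from the question):

```java
// Build the RTSP URL that VLC should open, derived from the same port that
// was written into RtspServer.KEY_PORT on the Android side.
public class RtspUrl {
    static String streamUrl(String phoneIp, int port) {
        return "rtsp://" + phoneIp + ":" + port;
    }

    public static void main(String[] args) {
        // Placeholder address; use the phone's actual LAN IP.
        System.out.println(streamUrl("192.168.1.10", 1777));
    }
}
```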

Decrease delay during streaming and live streaming methods

喜你入骨 Submitted on 2019-12-03 21:53:09
I am currently using an app that uses the method exemplified in libstreaming-example-1 (libstreaming) to stream the camera from an Android device to an Ubuntu server (using OpenCV and libVLC). This way, my Android device acts as a server and waits for the client (the Ubuntu server) to send the play signal over RTSP, then starts streaming over UDP. The problem I am facing is that I get a delay of approximately 1.1 s during transmission, and I want to get it down to 150 ms maximum. I tried to implement libstreaming-example-2 from libstreaming-examples, but I…
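Much of a ~1.1 s delay is typically client-side jitter buffering rather than encoding: VLC and libVLC default --network-caching to 1000 ms. Shrinking that buffer is the first lever to try; whether 150 ms is sustainable depends on the network. A small sketch assembling the client invocation, with a placeholder address and port:

```java
// Assemble the VLC command the Ubuntu client could use to shrink its
// receive-side jitter buffer from the 1000 ms default toward the 150 ms
// target. The host and port below are placeholders, not values from the
// question; --network-caching is VLC's network buffer in milliseconds.
public class VlcLowLatency {
    static String buildCommand(String host, int port, int cachingMs) {
        return String.format("vlc --network-caching=%d rtsp://%s:%d",
                cachingMs, host, port);
    }

    public static void main(String[] args) {
        System.out.println(buildCommand("192.168.1.10", 8086, 150));
    }
}
```

Going too low simply trades delay for stutter on a lossy link, so the value is usually tuned upward from the target until playback is stable.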