video-streaming

Playing webm chunks as standalone video

情到浓时终转凉″ submitted on 2019-12-22 11:02:32
Question: I've built some code that gets the MediaRecorder API to capture audio and video, then uses the ondataavailable callback to send the resulting WebM blobs to a server over WebSockets. The server forwards those blobs to a client, which assembles the video in a buffer using the Media Source Extensions API. This works well, except that if I want to start a stream partway through, I can't just send the latest blob, because a blob by itself is unplayable. Also, …
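The likely reason only the first blob is playable on its own is that MediaRecorder writes the WebM initialization segment (EBML header plus track metadata) once, at the start; later blobs contain only media clusters. A minimal server-side sketch of one common fix, with invented names (StreamRelay, on_blob) and fake byte payloads purely for illustration:

```python
# Hypothetical relay: remember the first MediaRecorder blob (the WebM
# initialization segment) so a late-joining viewer receives it before
# any media clusters, making the stream playable from mid-broadcast.

class StreamRelay:
    def __init__(self):
        self.init_segment = None   # first blob: EBML header + Tracks
        self.recent_clusters = []  # later blobs: media-only clusters
        self.max_clusters = 10     # keep a short tail for new viewers

    def on_blob(self, blob: bytes) -> None:
        if self.init_segment is None:
            self.init_segment = blob
        else:
            self.recent_clusters.append(blob)
            self.recent_clusters = self.recent_clusters[-self.max_clusters:]

    def catch_up_payload(self) -> bytes:
        """Bytes a late joiner should append to its SourceBuffer first."""
        if self.init_segment is None:
            return b""
        return self.init_segment + b"".join(self.recent_clusters)

relay = StreamRelay()
relay.on_blob(b"\x1aE\xdf\xa3INIT")   # fake init segment for the demo
relay.on_blob(b"CLUSTER1")
relay.on_blob(b"CLUSTER2")
payload = relay.catch_up_payload()
```

A real implementation would also need cluster blobs to start on keyframe boundaries (e.g. by choosing a suitable timeslice for MediaRecorder.start), otherwise the first frames after joining may not decode.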

How to continuously extract video frames from streaming RTMP using avconv / ffmpeg?

大憨熊 submitted on 2019-12-22 10:29:45
Question: We're dealing with streaming video over RTMP, and my goal is to extract frames from the stream at a given interval, e.g. every second. Currently I run a command in a loop which grabs one frame and exports it as a base64 JPEG: avconv -i <URL> -y -f image2 -ss 3 -vcodec mjpeg -vframes 1 -s sqcif /dev/stdout 2>/dev/null | base64 -w 0 But each of these processes is slow (it takes a few seconds, which adds even more delay to a stream that already lags behind real time). I am wondering if there is a …
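One alternative to spawning a new avconv process per frame is a single long-lived process with an fps filter, splitting JPEGs out of its stdout. A sketch under that assumption (the flags shown are standard ffmpeg/avconv options; the stream URL is a placeholder, and in practice you would read the pipe incrementally with subprocess.Popen):

```python
# Build a single long-lived frame-grab command instead of one process per
# frame, then split complete JPEGs (SOI..EOI markers) out of its output.

def frame_grab_command(url: str, fps: int = 1) -> list:
    return [
        "ffmpeg", "-i", url,
        "-vf", f"fps={fps}",     # emit one frame per second
        "-f", "image2pipe",      # stream images to stdout
        "-vcodec", "mjpeg", "-",
    ]

def split_jpegs(buffer: bytes) -> tuple:
    """Split complete JPEGs out of a byte buffer; return
    (frames, leftover_bytes) so partial frames carry over."""
    frames = []
    while True:
        start = buffer.find(b"\xff\xd8")           # JPEG start-of-image
        end = buffer.find(b"\xff\xd9", start + 2)  # JPEG end-of-image
        if start == -1 or end == -1:
            break
        frames.append(buffer[start:end + 2])
        buffer = buffer[end + 2:]
    return frames, buffer

cmd = frame_grab_command("rtmp://example.com/live/stream")
# Demo on fake bytes: one complete JPEG plus the start of a second one.
frames, rest = split_jpegs(b"\xff\xd8AA\xff\xd9\xff\xd8BB")
```

This removes the per-frame connection and seek overhead of the loop, at the cost of keeping one decoder running continuously.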

Setting larger GOP size in MediaFoundation hardware MFT

天大地大妈咪最大 submitted on 2019-12-22 10:09:36
Question: I'm trying to live-stream a desktop captured through the Desktop Duplication API. H.264 encoding works fine, except that the Desktop Duplication API delivers frames only when the screen changes, while video encoders expect frames at a constant frame rate. So I'm forced to save the previous sample and feed it to the encoder at a constant rate whenever no screen change is triggered. This works; I can see live output at the other end. One problem, though: the …
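The pacing scheme the question describes can be sketched in a few lines. This is an illustrative simulation, not Media Foundation code: the function name is invented, and the only MF-specific detail carried over is that sample timestamps are expressed in 100-nanosecond units:

```python
# Simulate feeding an encoder at a constant rate from a change-driven
# capture source: when no new frame arrives for a tick, resubmit the
# previous sample with a fresh timestamp.

HNS_PER_SECOND = 10_000_000  # Media Foundation timestamps: 100-ns units

def pace_frames(events, fps, duration_s):
    """events: {tick_index: frame_bytes} for ticks where the screen changed.
    Returns (timestamp_hns, frame) pairs at a constant frame rate."""
    interval = HNS_PER_SECOND // fps
    out = []
    last = None
    for tick in range(duration_s * fps):
        frame = events.get(tick, last)  # reuse previous sample if no change
        if frame is None:
            continue                    # nothing captured yet at all
        out.append((tick * interval, frame))
        last = frame
    return out

# Screen changed only at ticks 0 and 3, but at 1 fps over 5 seconds the
# encoder still receives five evenly spaced samples.
samples = pace_frames({0: b"A", 3: b"B"}, fps=1, duration_s=5)
```

Note the trade-off this creates: the repeated duplicate samples are what make GOP/keyframe settings matter, since every forced keyframe on an unchanged screen costs bandwidth for no new information.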

FFmpeg iOS sample code for video streaming

徘徊边缘 submitted on 2019-12-22 09:27:14
Question: I have finished video encoding using the AVFoundation framework in iOS. Now I want to stream that video to an RTMP server using FFmpeg. It would be a great help if any of you could post a link or sample code for achieving this. Any other solution is also welcome. Thank you in advance. Answer 1: Here's some sample code to get you started. Source: https://stackoverflow.com/questions/23357134/ffmpeg-ios-sample-code-for-video-streaming
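For orientation, this is the general shape of an FFmpeg RTMP push, shown here as a command builder. Treat it purely as documentation of the flags: on iOS you would drive FFmpeg through its C libraries rather than a subprocess, and the file name and server URL are placeholders:

```python
# Sketch of a typical FFmpeg invocation for pushing a file to an RTMP
# server. RTMP carries an FLV container, hence "-f flv".

def rtmp_push_command(input_path: str, rtmp_url: str) -> list:
    return [
        "ffmpeg",
        "-re",              # read input at its native frame rate (live-ish)
        "-i", input_path,
        "-c:v", "libx264",  # re-encode video (or "copy" if already H.264)
        "-c:a", "aac",
        "-f", "flv",        # RTMP transports an FLV stream
        rtmp_url,
    ]

cmd = rtmp_push_command("capture.mp4", "rtmp://example.com/live/streamkey")
```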

Generate thumbnail from a server video link in Android

匆匆过客 submitted on 2019-12-22 09:14:22
Question: Is it possible in Android to get a thumbnail of any kind of video when someone has only a URL link to that video, where the video can come from any source, such as YouTube or anywhere else? Please tell me whether it is possible. Here is my Java code, with which I am trying to get a thumbnail of a YouTube video: public class MainActivity extends Activity { String path = "http://www.youtube.com/watch?v=HMMEODhZUfA"; Bitmap bm; @Override protected void onCreate(Bundle savedInstanceState) { super.onCreate…
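For the YouTube case specifically, a common shortcut is that YouTube serves thumbnails at a predictable URL derived from the video id, so no frame extraction is needed at all. A pure-Python sketch of that derivation (the img.youtube.com URL pattern is a widely observed convention, not a documented API guarantee; for direct video-file URLs you would instead use Android's MediaMetadataRetriever):

```python
# Derive a YouTube thumbnail URL from a watch URL by extracting the
# video id from the "v" query parameter.

from urllib.parse import urlparse, parse_qs

def youtube_thumbnail_url(watch_url: str, quality: str = "hqdefault") -> str:
    query = parse_qs(urlparse(watch_url).query)
    video_id = query["v"][0]
    return f"https://img.youtube.com/vi/{video_id}/{quality}.jpg"

# Using the URL from the question above:
thumb = youtube_thumbnail_url("http://www.youtube.com/watch?v=HMMEODhZUfA")
```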

How do I convert a video or a sequence of images to a bag file?

别来无恙 submitted on 2019-12-22 06:53:42
Question: I am new to ROS. I need to convert a preexisting video file, or a large set of images that can be concatenated into a video stream, into a .bag file in ROS. I found this code online: http://answers.ros.org/question/11537/creating-a-bag-file-out-of-a-image-sequence/, but it says it is for camera calibration, so I'm not sure it fits my purpose. Could someone with good knowledge of ROS confirm that I can use the code in that link for my purposes, or, if anyone actually has the code I …
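The linked calibration script is essentially the right shape for this task: it writes each image into the bag as a sensor_msgs/Image with an increasing time stamp. The ROS-specific parts (rosbag.Bag, cv_bridge) require a ROS installation, so this sketch shows only the timestamp bookkeeping you would feed them; the function name and topic are illustrative:

```python
# Compute ROS-style (secs, nsecs) time stamps for a sequence of video
# frames at a constant frame rate, as you would pass to rospy.Time when
# writing each sensor_msgs/Image into the bag.

def frame_stamps(n_frames: int, fps: float, start_secs: int = 0):
    """ROS time stamps (secs, nsecs) for n_frames at a constant fps."""
    stamps = []
    for i in range(n_frames):
        t = start_secs + i / fps
        secs = int(t)
        nsecs = int(round((t - secs) * 1e9))
        stamps.append((secs, nsecs))
    return stamps

# With rosbag, roughly (hypothetical topic name):
#   bag.write("/camera/image_raw", img_msg, t=rospy.Time(secs, nsecs))
stamps = frame_stamps(4, fps=2.0)
```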

Local Video Renderer in Android WebRTC

左心房为你撑大大i submitted on 2019-12-22 06:37:48
Question: I am using this library: https://bintray.com/google/webrtc/google-webrtc What I want to achieve (at least at the beginning of my project) is to render video locally. I am following this tutorial (the only one I could find): https://vivekc.xyz/getting-started-with-webrtc-for-android-daab1e268ff4. Unfortunately, the last line of code is no longer up to date. The constructor needs a callback which I have no idea how to implement: localVideoTrack.addRenderer(new VideoRenderer(i420Frame…

How do I record video stream data as MP4 in WebRTC Android?

亡梦爱人 submitted on 2019-12-22 01:31:54
Question: Please help me! I used the example in https://github.com/pchab/AndroidRTC to stream video and audio from one Android device to another. In that example, they use two libraries, libjingle_peerConnection and a Socket.IO client, but I don't know how to save the streaming data in H.264 format. Answer 1: This project has a class VideoFileRenderer; you can use that renderer to save the video to a file: https://github.com/Piasy/AppRTC-Android Answer 2: After a lot of tries and hard work on this …

How to convert mpeg dash (MPD) with DRM license to MP4?

杀马特。学长 韩版系。学妹 submitted on 2019-12-22 00:17:18
Question: I am trying to convert an MPEG-DASH MPD file that has DRM protection into an MP4 file. I do have the URL of the DRM license, so I tried to do this with the ffmpeg library, but there is no option to pass the license URL along with the ffmpeg command for decryption. Answer 1: DRM is designed and built precisely to stop you from doing this. A DRM system can use a software or a hardware reader. A hardware reader will give you neither the decryption key nor the decrypted content. A software reader will do it in the most obfuscated …

Live streaming with Silverlight 4

倖福魔咒の submitted on 2019-12-21 23:35:06
Question: Greetings. Is there a live streaming server for Silverlight 4 from Microsoft, the way Flash Media Server from Adobe supports live streaming for Flash? I know there are many open-source live streaming servers that support Silverlight 4, but I have not found one as good as Flash Media Server is for Flash. Please tell me whether Microsoft has a media server for live streaming, or whether there is a good open-source server for that. I'm building a web conference system using …