video-streaming

HTML5 video for iPhone / iPad. How to detect connection speed?

微笑、不失礼 submitted on 2019-12-17 19:55:29
Question: I need to stream a video in Safari on iPhone/iPad at the best possible quality. I created two video files: one low-quality for slow 3G connections and one high-quality for Wi-Fi broadband streaming. I noticed that some apps (YouTube, for example) can detect whether the mobile device is on 3G or Wi-Fi, and select the small video rather than the high-quality one accordingly. Here is my DOM / JavaScript code; the $v value is replaced by PHP and contains the video filename: <video id="thevideo"
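A minimal sketch of the selection logic being asked about, expressed in Python as server-side pseudologic (the filenames and the 1 Mbps threshold are assumptions for illustration, not part of the question): the client would time a small probe download, report an approximate downlink rate, and the server picks the matching file to substitute into $v.

```python
def pick_variant(kbps, low="video_3g.mp4", high="video_wifi.mp4",
                 threshold_kbps=1000):
    """Pick a video file based on a measured downlink rate.

    `kbps` would come from timing a small test download on the
    client (e.g. a 100 KB probe file); the filenames and the
    1 Mbps cutoff here are illustrative assumptions.
    """
    return high if kbps >= threshold_kbps else low
```

For example, a device measuring roughly 250 kbps would be served the low-quality file, while a Wi-Fi device measuring several Mbps gets the high-quality one.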

Intel graphics hardware H264 MFT ProcessInput call fails after feeding few input samples, the same works fine with Nvidia hardware MFT

为君一笑 submitted on 2019-12-17 19:23:19
Question: I'm capturing the desktop using the DesktopDuplication API, converting the samples from RGBA to NV12 on the GPU, and feeding them to the Media Foundation hardware H264 MFT. This works fine with Nvidia graphics, and also with software encoders, but fails when only the Intel graphics hardware MFT is available. The code works fine on the same Intel graphics machine if I fall back to the software MFT. I have also verified that encoding is actually done in hardware on Nvidia graphics machines. On Intel graphics
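One thing worth ruling out when ProcessInput starts failing only on one vendor's MFT is a buffer-size or stride mismatch in the converted NV12 samples: NV12 stores a full-resolution Y plane followed by an interleaved half-resolution UV plane, so a tightly packed frame must hold exactly width × height × 3/2 bytes. A small reference check (a standalone sketch, not part of the question's C++ code, and assuming no stride padding):

```python
def nv12_size(width, height):
    """Byte size of a tightly packed NV12 frame:
    Y plane (w*h bytes) + interleaved UV plane (w*h/2 bytes).
    """
    if width % 2 or height % 2:
        # NV12 subsamples chroma 2x2, so odd dimensions are invalid
        raise ValueError("NV12 requires even dimensions")
    return width * height * 3 // 2
```

Comparing this against the actual buffer length of each sample fed to the encoder can quickly confirm whether the GPU color conversion is producing frames of the size the MFT negotiated.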

Getting green screen in ffplay: Streaming desktop (DirectX surface) as H264 video over RTP stream using Live555

青春壹個敷衍的年華 submitted on 2019-12-17 18:56:20
Question: I'm trying to stream the desktop (a DirectX surface in NV12 format) as H264 video over an RTP stream using Live555 and Windows Media Foundation's hardware encoder on Windows 10, expecting it to be rendered by ffplay (ffmpeg 4.2). But I'm only getting a green screen like the one below. I referred to the MFWebCamToRTP Media Foundation sample and "Encoding DirectX surface using hardware MFT" to implement Live555's FramedSource, changing the input source to a DirectX surface instead of a webcam. Here is an excerpt of my
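A green screen in ffplay often indicates that the decoder never received the H264 parameter sets (SPS/PPS) in-band, or that the packetized data is not the byte stream the receiver expects. A quick diagnostic is to scan the first outgoing Annex-B data for start codes and list the NAL unit types present (a standalone sketch for inspecting captured bytes, not part of the Live555 code):

```python
def nal_types(annexb: bytes):
    """Return the NAL unit types found in an Annex-B H264 byte
    stream. Type 7 = SPS, 8 = PPS, 5 = IDR slice.
    """
    types, i = [], 0
    while True:
        # both 3-byte and 4-byte start codes end in 00 00 01
        i = annexb.find(b"\x00\x00\x01", i)
        if i == -1:
            break
        i += 3
        if i < len(annexb):
            types.append(annexb[i] & 0x1F)  # low 5 bits = NAL type
    return types
```

If types 7 and 8 never appear ahead of the first IDR slice (type 5), ffplay has nothing to initialize the decoder with, which commonly produces exactly this green-frame symptom.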

RTSP stream and OpenCV (Python)

一世执手 submitted on 2019-12-17 16:17:05
Question: I have an IP camera streaming on Linux over the RTSP protocol with the h264 Linux driver. I am able to see the video in VLC at the following address and port: rtsp://192.168.1.2:8080/out.h264 However, if I try to get the same video for OpenCV processing in Python 2.7.5 (Mac OS X 10.9): import cv video = cv.CaptureFromFile('rtsp://192.168.1.2:8080/out.h264') I get the following error: WARNING: Couldn't read movie file rtsp://192.168.1.2:8080/out.h264 It seems like something rather simple, but I am stuck
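The legacy `cv` module's `CaptureFromFile` has no network demuxer; the modern `cv2` binding, when built with FFmpeg support, is what can open RTSP URLs. A hedged sketch of the usual approach (the host, port, and path are the ones from the question; `opencv-python` with FFmpeg support is assumed):

```python
def rtsp_url(host, port, path):
    """Assemble the camera URL used below."""
    return "rtsp://%s:%d/%s" % (host, port, path.lstrip("/"))

def open_stream(url):
    """Open an RTSP stream with the modern OpenCV binding."""
    import cv2  # imported lazily so the URL helper works without OpenCV
    cap = cv2.VideoCapture(url)
    if not cap.isOpened():
        raise IOError("could not open %s" % url)
    return cap

# Typical usage:
# cap = open_stream(rtsp_url("192.168.1.2", 8080, "out.h264"))
# ok, frame = cap.read()
```

If `cv2.VideoCapture` still fails, checking whether the OpenCV build was compiled with FFmpeg is the usual next step, since that backend provides the RTSP support.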

Flush & Latency Issue with Fragmented MP4 Creation in FFMPEG

为君一笑 submitted on 2019-12-17 15:47:08
Question: I'm creating a fragmented MP4 for HTML5 streaming, using the following command: -i rtsp://172.20.28.52:554/h264 -vcodec copy -an -f mp4 -reset_timestamps 1 -movflags empty_moov+default_base_moof+frag_keyframe -loglevel quiet - "-i rtsp://172.20.28.52:554/h264" because the source is an h264-in-RTP packet stream from an IP camera. For the sake of testing, the camera is set to a GOP of 1 (i.e. all frames are keyframes). "-vcodec copy" because I don't need transcoding, only remuxing to MP4. "
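Assembled programmatically so the pieces are easy to vary, the full command line looks like this (the flags are exactly the ones from the question; only the `ffmpeg` binary name and the stdout output target are assumptions):

```python
def fmp4_command(src, out="-"):
    """ffmpeg arguments for remuxing an RTSP h264 feed into a
    fragmented MP4 suitable for HTML5/MSE playback, mirroring the
    flags given in the question.
    """
    return [
        "ffmpeg",
        "-i", src,                 # h264-in-RTP source from the IP camera
        "-vcodec", "copy",         # remux only, no transcoding
        "-an",                     # drop audio
        "-f", "mp4",
        "-reset_timestamps", "1",
        "-movflags", "empty_moov+default_base_moof+frag_keyframe",
        "-loglevel", "quiet",
        out,                       # "-" writes the fMP4 to stdout
    ]
```

The `empty_moov+frag_keyframe` combination is what makes the muxer emit an initial moov with no samples and then start a new fragment at every keyframe, which is why the GOP-of-1 test setup produces one fragment per frame.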

Stream live video from phone to phone using socket fd

穿精又带淫゛_ submitted on 2019-12-17 15:23:08
Question: I am new to Android programming and have found myself stuck. I have been researching various ways to stream live video from phone to phone and seem to have it mostly functional, except, of course, for the most important part: playing the stream. One phone appears to be sending the stream, but the second phone is not able to play it. Here is the code for the playback side: public class VideoPlayback extends Activity implements Callback { MediaPlayer mp; private SurfaceView mPreview;

How to stop and exit a video stream in WebRTC navigator getUserMedia (JavaScript)

佐手、 submitted on 2019-12-17 12:37:11
Question: How do I stop and exit a webcam stream in pure JS with the WebRTC getUserMedia API? I have the following script in my code: <script type="text/javascript"> $(document).ready(function() { $("#abrirModal").click(function() { navigator.getUserMedia = navigator.getUserMedia || navigator.webkitGetUserMedia || navigator.mozGetUserMedia; var constraints = { audio: false, video: true }; var live = document.getElementById("live"); function successCallback(stream) { window.stream = stream; // stream available to console

How to transmit live video from within a Java application?

安稳与你 submitted on 2019-12-17 10:33:49
Question: I'm trying to find a way to stream live video generated in a Java application. The application needs to take screenshots of itself, encode them into a video stream, and publish the stream. So far I have been using Xuggler (a Java library on top of FFMPEG) to encode the screenshots into a video file, and this works great. Xuggler claims to be able to transmit live video via RTMP, but I have not found any documentation on how to do this programmatically. Does anyone know how to stream RTMP video
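An alternative to Xuggler's sparsely documented RTMP path is to pipe raw frames to an external ffmpeg process that publishes the RTMP stream; the Java app would spawn ffmpeg with ProcessBuilder and write each screenshot to its stdin. A sketch of how the command could be assembled (the resolution, frame rate, and RTMP URL are placeholders, and this is a swapped-in technique rather than the Xuggler API):

```python
def rtmp_push_command(width, height, fps, rtmp_url):
    """ffmpeg arguments that read raw RGB frames on stdin and
    publish them as an RTMP live stream. A Java app could spawn
    this process and write one width*height*3-byte frame per tick.
    """
    return [
        "ffmpeg",
        "-f", "rawvideo",          # uncompressed frames on stdin
        "-pix_fmt", "rgb24",
        "-s", "%dx%d" % (width, height),
        "-r", str(fps),
        "-i", "-",                 # "-" = read input from stdin
        "-c:v", "libx264",
        "-preset", "veryfast",     # favor encoding speed for live use
        "-f", "flv",               # RTMP transports the FLV container
        rtmp_url,
    ]
```

The key detail is the `-f flv` output format: RTMP servers expect FLV-wrapped H264, so whatever encoder is used, the container must be FLV for the publish to succeed.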

Video Streaming and Android

好久不见. submitted on 2019-12-17 10:19:15
Question: Today, for one of my apps (Android 2.1), I wanted to stream a video from a URL. As far as I have explored the Android SDK, it's quite good and I love almost every piece of it. But now that it comes to video streaming, I am somewhat lost. For almost anything you need to know about the Android SDK there are thousands of blogs telling you how to do it; when it comes to video streaming, it's different, and information is not that abundant. Everyone does it their own way, tricking here and there. Is there a well-known procedure that