video-streaming

convert gstreamer pipeline to opencv in python

谁说胖子不能爱 submitted on 2019-12-20 04:45:08
Question: I have created a network stream with the following gstreamer commands: sender: gst-launch-1.0 -v videotestsrc ! video/x-raw,framerate=20/1 ! videoscale ! videoconvert ! x264enc tune=zerolatency bitrate=500 speed-preset=superfast ! rtph264pay ! udpsink host=X.X.X.X port=5000 receiver: gst-launch-1.0 -v udpsrc port=5000 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink This
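A common way to move that receiver pipeline into Python is to hand the same GStreamer description to cv2.VideoCapture with the CAP_GSTREAMER backend, terminating it in an appsink instead of autovideosink. This is only a sketch: it assumes an OpenCV build compiled with GStreamer support, and build_receiver_pipeline is a helper name introduced here for illustration, not part of either API.

```python
def build_receiver_pipeline(port=5000):
    # Mirror the gst-launch receiver, but end in an appsink that hands
    # BGR frames to OpenCV instead of rendering to an autovideosink window.
    caps = ("application/x-rtp, media=(string)video, clock-rate=(int)90000, "
            "encoding-name=(string)H264, payload=(int)96")
    return (f'udpsrc port={port} caps="{caps}" ! rtph264depay ! decodebin ! '
            "videoconvert ! video/x-raw,format=BGR ! appsink drop=true")

if __name__ == "__main__":
    import cv2  # assumes OpenCV was built with GStreamer support

    cap = cv2.VideoCapture(build_receiver_pipeline(5000), cv2.CAP_GSTREAMER)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("udp stream", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()
```

If cap.isOpened() returns False immediately, the usual culprit is an OpenCV wheel built without the GStreamer backend; check cv2.getBuildInformation() for "GStreamer: YES".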

red5 media server and protect video from being embedded?

半城伤御伤魂 submitted on 2019-12-20 04:31:26
Question: I'm trying to protect my videos as thoroughly as possible, so I know putting them up on a red5 media server will make them streamed, meaning the flv file won't be downloaded to the user's cache. But my main concern is: is it possible to protect the video from being embedded, so it will run only from domains that I specify? And also, is it possible to somehow encrypt a streaming video file, and decrypt it in my flash player while it's downloading? Answer 1: To secure your streams you can edit the file:

Is Android Youtube API Using Official Youtube App for playing video?

给你一囗甜甜゛ submitted on 2019-12-20 04:19:53
Question: I used the YouTube API for Android to develop an app. Even though I created the app and it works fine, I still was not able to find out what this API really does. 1) When I run it on an Android 2.2 device for the first time, it forces me to download new updates for the official YouTube app from the Google Play store. After that it works fine and doesn't ask for further updates or downloads of the YouTube app. So does the YouTube API use the YouTube app for playing video? By going through the API code I found out something public

using ffmpeg to display video on iPhone

风流意气都作罢 submitted on 2019-12-19 11:37:16
Question: Can anyone help me with this? I have this API: ret = avRecvFrameData(avIndex, buf, VIDEO_BUF_SIZE, (char *)&frameInfo, sizeof(FRAMEINFO_t), &frmNo); the buffer will be filled with content from the video thread, the codec is H264, and frameInfo contains the related information. If I want to display this on iPhone, how do I do it with ffmpeg? Your help is much appreciated. Answer 1: You should not be using ffmpeg in an iOS app for a number of reasons. First, there are real license issues that put including ffmpeg in

Capturing a Multicast UDP Video stream using OpenCV

风流意气都作罢 submitted on 2019-12-19 10:33:12
Question: I have a multicast UDP video stream that I need my OpenCV (Emgu) 2.4.x app to capture and process ("client"). On the client, I can capture the stream using VLC (udp://xx.yy.zz.aaa:1234). However, my app fails to capture this UDP stream. My code is quite simple: Capture cap = new Capture("udp://@212.1.1.1:1234"); P.S. I have tried with and without the @, and also tried rtp on that address. No luck :-/ Does OpenCV directly allow "capture" of UDP streams? Or do I need to run VLC on the client to re
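For reference, plain OpenCV in Python can often open a multicast UDP URL directly through its FFMPEG backend; the sketch below assumes the same 212.1.1.1:1234 address from the question, and uses Python rather than the Emgu (C#) wrapper. The helper name build_udp_url is made up here; whether the leading "@" is needed depends on the backend in use.

```python
def build_udp_url(address, port, ffmpeg_style=True):
    # FFMPEG-style multicast URLs typically use udp://@group:port;
    # some backends instead want the form without the leading "@".
    prefix = "@" if ffmpeg_style else ""
    return f"udp://{prefix}{address}:{port}"

if __name__ == "__main__":
    import cv2  # assumes an OpenCV build with the FFMPEG backend

    cap = cv2.VideoCapture(build_udp_url("212.1.1.1", 1234), cv2.CAP_FFMPEG)
    ok, frame = cap.read()
    print("opened:", cap.isOpened(), "got frame:", ok)
    cap.release()
```

If the FFMPEG backend cannot open the raw UDP stream, a common fallback is to have VLC or GStreamer restream it to a protocol OpenCV handles more reliably (e.g. RTSP).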

How to capture multiple camera streams with OpenCV?

拜拜、爱过 submitted on 2019-12-19 09:58:20
Question: I have to stitch the images captured from many (9) cameras. Initially, I tried to capture frames from 2 cameras at 15 FPS. Then I connected 4 cameras (I also used an externally powered USB hub to provide enough power) but I could only see one stream. For testing, I used the following script: import numpy as np import cv2 import imutils index = 0 arr = [] while True: cap = cv2.VideoCapture(index) if not cap.read()[0]: break else: arr.append(index) cap.release() index += 1 video
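The probing loop from that script can be written as a small self-contained function. Two hedged changes from the original are made here: it scans a fixed index range instead of stopping at the first failure (USB device numbering can leave gaps, especially behind hubs), and the capture constructor is injectable so the logic can be exercised without hardware. The name probe_camera_indices is invented for this sketch.

```python
def probe_camera_indices(max_index=10, open_camera=None):
    """Return the device indices that produce at least one frame."""
    if open_camera is None:
        import cv2
        open_camera = cv2.VideoCapture  # default to real hardware
    found = []
    for index in range(max_index):
        cap = open_camera(index)
        ok = cap.read()[0]  # read() returns (success_flag, frame)
        cap.release()
        if ok:
            found.append(index)
    return found

if __name__ == "__main__":
    print("working camera indices:", probe_camera_indices())
```

Note that seeing only one stream with several cameras on one hub is frequently a USB bandwidth limit rather than a code problem; requesting a lower resolution or MJPEG from each camera often helps.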

How to save a RTSP video stream to MP4 file via gstreamer?

泄露秘密 submitted on 2019-12-19 05:48:20
Question: I need to get a video stream from my camera via RTSP and save it to a file. All of this needs to be done via gstreamer. After some google searching, I tried the following: gst-launch-1.0 rtspsrc location=rtsp://192.168.1.184/live2.sdp ! queue ! rtph264depay ! avdec_h264 ! mp4mux ! filesink location=result3.mp4 but it gives the error: "Erroneous pipeline: could not link avdec_h264-0 to mp4mux0" gst-launch-1.0 rtspsrc location=rtsp://192.168.1.184/live2.sdp ! queue ! rtph264depay ! h264parse !
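The link error comes from decoding the H.264 stream with avdec_h264 and then feeding raw video to mp4mux, which expects encoded input. The usual fix, consistent with the question's second attempt, is to drop the decoder and let h264parse prepare the depayloaded stream for mp4mux. Below is a sketch that builds that pipeline string and runs it via gst-launch-1.0; build_rtsp_to_mp4_pipeline is a helper name made up here, and the camera URL is the one from the question.

```python
def build_rtsp_to_mp4_pipeline(rtsp_url, out_file):
    # No avdec_h264: keep the H.264 elementary stream and let
    # h264parse convert it into the form mp4mux expects.
    return (f"rtspsrc location={rtsp_url} ! queue ! rtph264depay ! "
            f"h264parse ! mp4mux ! filesink location={out_file}")

if __name__ == "__main__":
    import subprocess

    pipeline = build_rtsp_to_mp4_pipeline(
        "rtsp://192.168.1.184/live2.sdp", "result3.mp4")
    # -e: forward EOS on shutdown (Ctrl+C) so mp4mux can write the
    # MP4 index; without it the resulting file is often unplayable.
    subprocess.run(["gst-launch-1.0", "-e"] + pipeline.split())
```

The -e flag matters in practice: mp4mux finalizes the file only on end-of-stream, so interrupting a pipeline launched without it tends to leave a truncated MP4.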

Detect <embed> tag failing to load video

耗尽温柔 submitted on 2019-12-19 04:37:54
Question: I am trying to catch an error with the following embed tag (on iPad/iOS): <embed width="320" height="240" src="path/to/my/live/stream/playlist.m3u8" type="application/vnd.apple.mpegurl" postdomevents="true" id="movie1" /> I tried to catch it with the following: $("#movie1").on('onerror', function() { alert('error!') } ); I also tried onabort, onstalled, onended, and onsuspend, none of which generates an event when the video fails to load. Answer 1: You'll need to make a separate HTTP request

Set rate at which AVSampleBufferDisplayLayer renders sample buffers

做~自己de王妃 submitted on 2019-12-19 04:23:43
Question: I am using an AVSampleBufferDisplayLayer to display CMSampleBuffers that arrive over a network connection in the h.264 format. Video playback is smooth and working correctly, but I cannot seem to control the frame rate. Specifically, if I enqueue 60 frames per second into the AVSampleBufferDisplayLayer, it displays those 60 frames, even though the video was recorded at 30 FPS. When creating sample buffers, it is possible to set the presentation time stamp by passing a timing info