video-streaming

How to save files on my server with the ScriptCam plugin

Submitted by 偶尔善良 on 2019-12-04 05:31:48
Question: I want the users of my website to stream video on the website and preview it; then they can click save and the file is saved on the website's server. I found this plugin and it seems to work fine, but the problem is that after it does the file conversion it gives me a link to download the file, which is supposedly kept on ScriptCam's own server. I also noticed that their server URL is prepended to the "fileName" variable; I don't know where it's coming from. Any help is welcome, Mike Answer 1:

UDP Video Streaming on Android

Submitted by 被刻印的时光 ゝ on 2019-12-04 05:24:10
I have an Android project where I need to build a client app to receive UDP or RTP unicast video streams and play them back. Unfortunately, I cannot seem to get this working and have searched extensively for a solution! I have been testing on a Xoom (Android 3.2) and a Nexus S (Android 2.3.6) and know that they can play the content when using MX Player (a third-party media player app) as the client, but I can't get the native media player to play back the content. I have tried using both a simple VideoView and a MediaPlayer, but both fail with the same error code and I can't really find any
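Below is a minimal sketch (not the asker's code) of the stock VideoView path this question describes. It assumes the stream can be exposed as an RTSP session URL, which the framework player understands; the address is a placeholder, and a bare udp:// URI typically fails with the generic error mentioned above.

```java
import android.app.Activity;
import android.media.MediaPlayer;
import android.net.Uri;
import android.os.Bundle;
import android.util.Log;
import android.widget.MediaController;
import android.widget.VideoView;

public class StreamActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        VideoView videoView = new VideoView(this);
        setContentView(videoView);
        videoView.setMediaController(new MediaController(this));

        // Placeholder address: the framework player accepts RTSP/RTP session URLs,
        // while a raw udp://ip:port source is generally rejected.
        videoView.setVideoURI(Uri.parse("rtsp://192.168.1.10:5544/stream"));

        videoView.setOnErrorListener(new MediaPlayer.OnErrorListener() {
            @Override
            public boolean onError(MediaPlayer mp, int what, int extra) {
                // Log the (what, extra) pair to see exactly what the player rejected.
                Log.e("StreamActivity", "MediaPlayer error: what=" + what + " extra=" + extra);
                return true; // handled; suppress the default error dialog
            }
        });
        videoView.start();
    }
}
```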

Progressive Video Download on iOS

Submitted by 徘徊边缘 on 2019-12-04 04:45:36
I am trying to implement progressive downloading of a video in my iOS application so that it can be played through AVPlayer. I have already implemented a downloader module that can download the files to the iPad. However, I have discovered that I cannot play a file that is still being written to. So, as far as I can tell, my only solution would be to download a list of file 'chunks' and then keep playing through each file as it becomes ready (i.e., downloaded), probably using HLS. Searching, I have come across this question, which implements progressive download through HLS, but other than that, I

Decoding a live RTSP stream: large video lag using MediaPlayer on Android

Submitted by 好久不见. on 2019-12-04 04:16:20
I'm playing a live RTSP stream from VLC on a PC to the Android MediaPlayer class (both on the same local network). It plays smoothly with no errors; the problem is that the decoded video on screen is around 5 to 7 seconds behind live. From debug output and callbacks I can see that the live data is arriving on the device less than 1 s after starting mMediaPlayer.prepareAsync(). This is when the MediaPlayer class begins to work out what format the stream is, what its dimensions are, etc. Then, just before video is shown on screen (between 5 and 7 seconds later), onPrepared() is called, where I call mMediaPlayer
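As a rough illustration of the timing being described (placeholder URL, not the asker's code): prepareAsync() returns almost immediately, but onPrepared(), and therefore start(), only fires once the player has buffered several seconds of the live stream; that buffered data shows up as end-to-end lag.

```java
import android.media.MediaPlayer;
import android.view.SurfaceHolder;
import java.io.IOException;

public class RtspPlayer {
    public void play(SurfaceHolder holder) throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setDisplay(holder);
        player.setDataSource("rtsp://192.168.1.20:8554/live"); // placeholder address

        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                // In the scenario above this callback arrives 5-7 s after
                // prepareAsync(); everything buffered in between becomes lag.
                mp.start();
            }
        });

        // Returns quickly; the stream data starts arriving well before onPrepared() fires.
        player.prepareAsync();
    }
}
```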

Draw overlay (HUD) on Android VideoView?

Submitted by ◇◆丶佛笑我妖孽 on 2019-12-04 04:04:57
I have a custom view that draws a HUD. Here is my layout:

<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    android:orientation="vertical" >
    <VideoView
        android:id="@+id/videoView1"
        android:layout_gravity="center"
        android:layout_width="match_parent"
        android:layout_height="wrap_content" />
    <com.widgets.HUD
        android:id="@+id/hud"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent" />
</FrameLayout>

public View onCreateView(LayoutInflater inflater,
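For context, a com.widgets.HUD class along these lines would work with that layout. This is only a hypothetical sketch, not the asker's implementation: a View with no background stacked above the VideoView inside the FrameLayout, drawing its overlay in onDraw().

```java
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.util.AttributeSet;
import android.view.View;

public class HUD extends View {
    private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);

    public HUD(Context context, AttributeSet attrs) {
        super(context, attrs); // this constructor is required for XML inflation
        paint.setColor(Color.WHITE);
        paint.setTextSize(48f);
        // No background is set, so the VideoView underneath remains visible.
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        // Example overlay: a label in the top-left corner of the video.
        canvas.drawText("LIVE", 32f, 64f, paint);
    }
}
```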

How do I play YouTube videos using the phonegap-videoplayer-plugin?

Submitted by 你离开我真会死。 on 2019-12-04 03:58:31
Question: I am using a Phonegap plugin for playing a video in my iOS app. I'm able to play a video with a URL format like http://easyhtml5video.com/images/happyfit2.mp4. How do I play YouTube videos using the phonegap-videoplayer-plugin? Answer 1: There are some methods that will give you a direct link to YouTube videos. Use the "gdata" option to find all possible video formats, then parse the result to get the desired link. Hope this might be useful. Answer 2: YouTube Terms of Service: "You agree not to access Content

ffmpeg transcode to live stream

Submitted by 佐手、 on 2019-12-04 01:55:51
Question: I need to display an IP camera stream in an HTML video tag. I have figured out how to transcode the RTSP stream to a file like this:

ffmpeg -i "rtsp://user:password@ip" -s 640x480 /tmp/output.mp4

Now I need to be able to live-stream the RTSP input in a video tag like this:

<video id="video" src="http://domain:port/output.mp4" autoplay="autoplay" />

I was trying to do something like this on my server (an Ubuntu micro instance on Amazon) in order to reproduce the video in the video

Streaming large video files in .NET

Submitted by 隐身守侯 on 2019-12-04 01:26:23
Question: I am trying to stream a large file in WebForms from an HttpHandler. It doesn't seem to work because it's not streaming the file; instead it reads the file into memory and then sends it back to the client. I have looked all over for a solution, and the answers claim to stream the file while doing the same thing. My solution that streams is this: using (Stream fileStream = File.OpenRead(path)) { context.Response.Cache.SetExpires(DateTime.UtcNow.AddMinutes(360.0)); context.Response
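The asker's handler is .NET and is cut off above; purely to illustrate the chunked-streaming idea (read a small buffer, write it, flush, repeat, so the whole file is never held in memory), here is a rough Java servlet analogue with placeholder path and content type.

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class VideoStreamServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        resp.setContentType("video/mp4");     // placeholder content type
        resp.setBufferSize(64 * 1024);        // keep the container's response buffer small

        try (InputStream in = new FileInputStream("/data/videos/large.mp4"); // placeholder path
             OutputStream out = resp.getOutputStream()) {
            byte[] chunk = new byte[64 * 1024];
            int read;
            while ((read = in.read(chunk)) != -1) {
                out.write(chunk, 0, read); // send each chunk as soon as it is read
                out.flush();               // push it to the client instead of buffering
            }
        }
    }
}
```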

Calculate .m4s segment file suffix in HTML5 video streaming when user seeks to another time

Submitted by 心已入冬 on 2019-12-04 00:46:26
I have created fixed-length segments for a long MP4 video using MP4Box. MP4Box creates a meta-info file mv_init.mp4 and segments like mv_1.m4s, mv_2.m4s, … I stream the video using HTML5 Media Source Extensions and the streaming works properly. The problem is that I am unable to use the time-seeking feature of my HTML5 player. When a user uses the seek bar to jump to another time point, I need to fetch the correct segment file (mv_{number}.m4s) for that currentTime. For example:
video duration: 2 hours
segment size: 10 seconds
user seeks to time: 25 minutes
25 minutes = 25 × 60
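The arithmetic the question is heading toward is simply the seek time divided by the fixed segment duration. A minimal sketch, assuming the 1-based numbering shown above (so mv_1.m4s covers 0-10 s); the class and method names are illustrative:

```java
public final class SegmentIndex {

    /** Segment number for a seek position, assuming fixed-length, 1-based segments. */
    public static int segmentNumberFor(double currentTimeSeconds, double segmentDurationSeconds) {
        return (int) Math.floor(currentTimeSeconds / segmentDurationSeconds) + 1;
    }

    public static void main(String[] args) {
        // Example from the question: 10-second segments, user seeks to 25 minutes.
        double seekSeconds = 25 * 60;                      // 1500 s
        int number = segmentNumberFor(seekSeconds, 10.0);  // floor(1500 / 10) + 1 = 151
        System.out.println("mv_" + number + ".m4s");       // fetch this segment for the seek
    }
}
```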

How to keep a live MediaSource video stream in-sync?

Submitted by 杀马特。学长 韩版系。学妹 on 2019-12-03 23:38:01
I have a server application which renders a 30 FPS video stream, then encodes and muxes it in real time into a WebM byte stream. On the client side, an HTML5 page opens a WebSocket to the server, which starts generating the stream when the connection is accepted. After the header is delivered, each subsequent WebSocket frame consists of a single WebM SimpleBlock. A keyframe occurs every 15 frames, and when this happens a new Cluster is started. The client also creates a MediaSource and, on receiving a frame from the WS, appends the content to its active buffer. The <video> starts playback