video-streaming

How to pause ExoPlayer 2 playback and resume (PlayerControl was removed)

﹥>﹥吖頭↗ submitted on 2019-12-18 11:45:17
Question: In ExoPlayer versions before 2.x there was a PlayerControl class with pause() and resume() functions, but it was removed. I can't find a way to do this in ExoPlayer 2. How can I pause and resume playback?

Answer 1: You can use void setPlayWhenReady(boolean playWhenReady). If the player is ready, passing false pauses playback, and passing true resumes it. You can check the player's state using getPlaybackState().

Answer 2: This is my way. Create two methods and call them when needed. private void …
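The setPlayWhenReady() pattern from Answer 1 can be sketched as follows. The PlayerLike interface below is a hypothetical stand-in for ExoPlayer's player interface (the real player exposes setPlayWhenReady/getPlayWhenReady with the same shapes); only the pause/resume logic is the point.

```java
// Hypothetical stand-in for the ExoPlayer player interface; the real
// player exposes setPlayWhenReady(boolean) and getPlayWhenReady().
interface PlayerLike {
    void setPlayWhenReady(boolean playWhenReady);
    boolean getPlayWhenReady();
}

final class PlaybackControl {
    private final PlayerLike player;

    PlaybackControl(PlayerLike player) { this.player = player; }

    // Pausing is just "stop playing when ready" -- no teardown involved.
    void pause()  { player.setPlayWhenReady(false); }

    // Resuming re-enables playback; the player continues from its position.
    void resume() { player.setPlayWhenReady(true); }

    boolean isPlaying() { return player.getPlayWhenReady(); }
}
```

Unlike the old PlayerControl, this keeps all buffered data, so resuming is instantaneous.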

How to display a YouTube-like video player in a website?

穿精又带淫゛_ submitted on 2019-12-18 11:34:22
Question: I'm working on a website where I want camera-recorded videos to be uploaded and be viewable (but not downloadable) by logged-in users only. I'd also like to edit the videos, have certain images appear in the background, and possibly note the time at which a user last stopped watching a video (i.e. if they stopped watching after 30 minutes, I'd like to start the video from 30:00 the next time they view it). My question is: 1) Is there a way to dynamically add the selected images/animation as the …
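The resume-position feature described above boils down to persisting the last playback offset per (user, video) pair and seeking to it on the next load. A minimal in-memory sketch of that bookkeeping (a real site would persist this server-side, but keyed the same way; all names here are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

// Tracks the last watched position (in seconds) per user and video.
final class ResumePositions {
    private final Map<String, Long> positions = new HashMap<>();

    private static String key(String userId, String videoId) {
        return userId + "/" + videoId;
    }

    // Called periodically (or on pause/page unload) from the player.
    void save(String userId, String videoId, long seconds) {
        positions.put(key(userId, videoId), seconds);
    }

    // Where to seek when the user opens the video again; 0 if never watched.
    long resumeAt(String userId, String videoId) {
        return positions.getOrDefault(key(userId, videoId), 0L);
    }
}
```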

Caching with AVPlayer and AVAssetExportSession

纵饮孤独 submitted on 2019-12-18 11:13:55
Question: I would like to cache progressive-download videos using AVPlayer. How can I save an AVPlayer's item to disk? I'm trying to use AVAssetExportSession on the player's currentItem (which is fully loaded). This code gives me "AVAssetExportSessionStatusFailed (The operation could not be completed)":

AVAsset *mediaAsset = self.player.currentItem.asset;
AVAssetExportSession *es = [[AVAssetExportSession alloc] initWithAsset:mediaAsset presetName:AVAssetExportPresetLowQuality];
NSString *outPath …

ffmpeg restream rtsp to mjpeg

五迷三道 submitted on 2019-12-18 11:13:30
Question: I have a few IP cameras that stream 720p x264 video over RTSP. The streams are really unreliable when viewed on Android, and they also fail if more than 2 connections are made. I have an Ubuntu server that I can use to connect and restream as MJPEG or something else. There are tons of different commands out there, but they all seem to involve transcoding the video. How can I simply restream the live RTSP feed as MJPEG without doing anything to the video itself? There's no audio, so no worries there …
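One point worth noting: MJPEG is a different codec than H.264, so producing real MJPEG output always re-encodes. To truly avoid touching the video, the H.264 stream can be copied as-is into another container/protocol with ffmpeg's -c copy. A sketch that assembles such a command (the URLs are placeholders, not values from the question):

```java
import java.util.Arrays;
import java.util.List;

// Builds an ffmpeg command that restreams an RTSP feed WITHOUT transcoding:
// "-c copy" passes the encoded stream through untouched and only repackages
// it, here as MPEG-TS. (True MJPEG output would require re-encoding.)
final class RestreamCommand {
    static List<String> build(String rtspUrl, String outputUrl) {
        return Arrays.asList(
                "ffmpeg",
                "-rtsp_transport", "tcp",   // TCP is usually more reliable than UDP
                "-i", rtspUrl,
                "-c", "copy",               // no transcoding: copy streams as-is
                "-f", "mpegts",
                outputUrl);
    }
}
```

The command could then be launched with new ProcessBuilder(RestreamCommand.build(...)).start(); a single ffmpeg process holds the one connection to the camera, and any number of clients read the restreamed output instead.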

Android Camera RTSP/RTP Stream?

你离开我真会死。 submitted on 2019-12-18 10:57:26
Question: How can I send Android camera video using RTP/RTSP and play it on a PC (using VLC or any other player)? I googled this and found two approaches:

1) using MediaRecorder (http://sipdroid.org/, via VideoCamera.java). I tried it, but got no result. :(

2) using the PreviewCallback's onPreviewFrame(data, camera) method. By using sipdroid's RtpPacket, RtpSocket and SipdroidSocket classes I am able to send RTP packets containing each frame as data, and I can capture them with Wireshark. But I am not able …
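A common reason players reject such packets is that each one must start with a valid RTP header and the payload must follow the codec's RTP payload format (plus the player needs an SDP description of the stream). As a reference point, the fixed 12-byte RTP header from RFC 3550 can be built like this (payload type 96 is a typical dynamic type for video; the SSRC is an arbitrary sender-chosen stream ID):

```java
import java.nio.ByteBuffer;

// Builds the fixed 12-byte RTP header defined in RFC 3550.
final class RtpHeader {
    static byte[] build(boolean marker, int payloadType,
                        int seq, long timestamp, long ssrc) {
        ByteBuffer buf = ByteBuffer.allocate(12);
        buf.put((byte) 0x80);                                  // V=2, P=0, X=0, CC=0
        buf.put((byte) ((marker ? 0x80 : 0) | (payloadType & 0x7F)));
        buf.putShort((short) seq);                             // sequence number
        buf.putInt((int) timestamp);                           // media timestamp
        buf.putInt((int) ssrc);                                // stream identifier
        return buf.array();
    }
}
```

The marker bit is conventionally set on the packet carrying the last fragment of a video frame, and the sequence number increments by one per packet so the receiver can detect loss.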

iOS video streaming and storing on device afterwards

旧城冷巷雨未停 submitted on 2019-12-18 10:27:24
Question: So far I know how to stream a video, and how to download it and stream it afterwards, but here's the tricky bit: streaming it once, storing it on the device, and in the future playing it from the device. Is that possible?

Answer 1: Not quite sure here how you get your stream, but look into AVAssetWriter, AVAssetWriterInput and AVAssetWriterPixelBufferAdaptor; as soon as you receive data you should be able to append it to the pixel buffer adaptor using appendPixelBuffer …
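Independent of the iOS-specific classes above, the general pattern is a "tee": every chunk read from the network is handed to the player and also appended to a local file, so after one complete playback the file holds the whole video. A platform-neutral sketch of that idea (the class and names are illustrative, not an iOS API):

```java
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Wraps a network stream so every byte the player consumes is also
// written to a cache (e.g. a file). After the first full playback the
// cache is complete and can be played offline.
class TeeInputStream extends FilterInputStream {
    private final OutputStream cache;

    TeeInputStream(InputStream source, OutputStream cache) {
        super(source);
        this.cache = cache;
    }

    @Override
    public int read(byte[] b, int off, int len) throws IOException {
        int n = super.read(b, off, len);
        if (n > 0) cache.write(b, off, n);   // mirror exactly what was consumed
        return n;
    }

    @Override
    public int read() throws IOException {
        int c = super.read();
        if (c >= 0) cache.write(c);
        return c;
    }
}
```

The caveat, as with the AVAssetWriter approach, is handling seeks: if the player skips ahead, the cache has a hole, so a real implementation must track which byte ranges have been written.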

How AVSampleBufferDisplayLayer displays H.264

不羁的心 submitted on 2019-12-18 10:26:27
Question: I want to share the knowledge I worked out over several days about this; there isn't a lot to find on it. I am still puzzling over the sound. Comments and tips are welcome. ;-)

Answer 1: Here are my code snippets. Declare it:

@property (nonatomic, retain) AVSampleBufferDisplayLayer *videoLayer;

First, set up the video layer:

self.videoLayer = [[AVSampleBufferDisplayLayer alloc] init];
self.videoLayer.bounds = self.bounds;
self.videoLayer.position = CGPointMake(CGRectGetMidX(self.bounds), …
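A key step when feeding H.264 to AVSampleBufferDisplayLayer is the NAL unit framing: streams usually arrive in Annex-B form (NAL units separated by 0x00 0x00 0x00 0x01 start codes), while the layer's sample buffers expect AVCC form, where each NAL unit is prefixed by its 4-byte big-endian length. A platform-neutral sketch of that conversion (handling only the common 4-byte start code, for brevity):

```java
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

// Splits an Annex-B H.264 byte stream into raw NAL units and converts a
// NAL unit to AVCC framing (4-byte big-endian length prefix instead of a
// start code). Only 4-byte start codes (00 00 00 01) are handled here.
final class AnnexB {
    static List<byte[]> splitNalUnits(byte[] stream) {
        List<Integer> starts = new ArrayList<>();
        for (int i = 0; i + 3 < stream.length; i++) {
            if (stream[i] == 0 && stream[i + 1] == 0
                    && stream[i + 2] == 0 && stream[i + 3] == 1) {
                starts.add(i + 4);   // NAL data begins after the start code
                i += 3;
            }
        }
        List<byte[]> nals = new ArrayList<>();
        for (int s = 0; s < starts.size(); s++) {
            int from = starts.get(s);
            int to = (s + 1 < starts.size()) ? starts.get(s + 1) - 4 : stream.length;
            byte[] nal = new byte[to - from];
            System.arraycopy(stream, from, nal, 0, nal.length);
            nals.add(nal);
        }
        return nals;
    }

    static byte[] toAvcc(byte[] nal) {
        ByteBuffer out = ByteBuffer.allocate(4 + nal.length);
        out.putInt(nal.length);   // 4-byte big-endian length prefix
        out.put(nal);
        return out.array();
    }
}
```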

Stream video content through Web API 2

不问归期 submitted on 2019-12-18 10:24:36
Question: I'm in the process of working out the best way to do the following: I have a bunch of CCTV footage files (MP4 files, ranging from 4 MB to 50 MB in size) which I want to make available through a web portal. My first thought was to stream the files through Web API, so I found the link below:

http://www.strathweb.com/2013/01/asynchronously-streaming-video-with-asp-net-web-api/

After implementing a sample project, I realised that the example was based on Web API 1, and not Web API …
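Whichever framework ends up serving the files, seeking in the browser's video player only works if the endpoint honours HTTP range requests (RFC 7233): the client sends "Range: bytes=start-end" and the server replies 206 Partial Content with just that slice. A framework-neutral sketch of parsing such a header (single-range form only; suffix ranges like "bytes=-500" and multi-range requests are omitted):

```java
// Parses a single-range "Range: bytes=start-end" header per RFC 7233.
// Returns {start, endInclusive} clamped to the file length, or null when
// the header is absent or unsupported (the server then sends 200 + full body).
final class ByteRange {
    static long[] parse(String header, long fileLength) {
        if (header == null || !header.startsWith("bytes=")) return null;
        String[] parts = header.substring(6).split("-", 2);
        if (parts.length != 2 || parts[0].isEmpty()) return null;
        long start = Long.parseLong(parts[0]);
        long end = parts[1].isEmpty() ? fileLength - 1 : Long.parseLong(parts[1]);
        if (start >= fileLength || start > end) return null;
        return new long[] { start, Math.min(end, fileLength - 1) };
    }
}
```

For 4-50 MB MP4 files this matters mostly for seeking; without range support, every seek forces the player to re-download the file from the start.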

Video Streaming in vlcj

最后都变了- submitted on 2019-12-18 09:51:29
Question: I want to stream a video from a server to the client. I found code for streaming the video from the server to the client side, but I get an error when running it:

Streaming 'vlcj-speed-run.flv' to ':sout=#duplicate{dst=std{access=http,mux=ts,dst=127.0.0.1:5000}}'
[018ec020] access_output_http access out: Consider passing --http-host=IP on the command line instead.
[018b4978] main mux error: cannot add this stream
[05493078] main decoder error: cannot create packetizer output (FLV1)

Answer 1: I …
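The "main mux error: cannot add this stream" line points at the mux: the file's FLV1 video cannot be placed into an MPEG-TS container as-is, so the sout chain needs either a transcode step or an input whose codec TS can carry. Building the sout option string programmatically also keeps the braces balanced; a sketch using VLC's standard transcode module options (host and port are placeholders):

```java
// Builds a VLC/vlcj ":sout" option for HTTP streaming with MPEG-TS muxing.
// The transcode step converts the video to H.264 and audio to MP4A, both of
// which TS can mux -- avoiding the "cannot add this stream" failure that
// raw FLV1 input produces.
final class SoutOptions {
    static String httpTs(String host, int port) {
        return String.format(
                ":sout=#transcode{vcodec=h264,acodec=mp4a}"
                + ":std{access=http,mux=ts,dst=%s:%d}",
                host, port);
    }
}
```

The resulting string is passed to vlcj when starting playback of the media to be streamed; clients then open http://host:port as a plain network stream.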

Android local video server

你说的曾经没有我的故事 submitted on 2019-12-18 09:36:26
Question: I am trying to make a local web server using a socket that will stream a video. Following is my code for the server:

class VideoStreamServer {
    public void startServer() {
        outFile = new File(outFilePath);
        Runnable videoStreamTask = new Runnable() {
            @Override
            public void run() {
                try {
                    ServerSocket socket = new ServerSocket(port);
                    StringBuilder sb = new StringBuilder();
                    sb.append("HTTP/1.1 200 OK\r\n");
                    sb.append("Content-Type: audio/mpeg\r\n");
                    sb.append("Connection: close\r\n");
                    sb.append( …
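Two details in the snippet above are worth flagging: the Content-Type is "audio/mpeg" even though a video is being served (players are happier with the real type, e.g. "video/mp4"), and a Content-Length header helps the client know where the resource ends. A small helper assembling a correct response head (the exact media type depends on the file being served):

```java
// Builds the HTTP response head for serving a local media file. The blank
// line at the end separates the headers from the body; the file bytes are
// written to the socket immediately after this string.
final class HttpHead {
    static String forVideo(String contentType, long contentLength) {
        return "HTTP/1.1 200 OK\r\n"
                + "Content-Type: " + contentType + "\r\n"
                + "Content-Length: " + contentLength + "\r\n"
                + "Connection: close\r\n"
                + "\r\n";
    }
}
```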