http-live-streaming

How to play m3u8 playlist in all PC browsers?

偶尔善良 submitted on 2019-11-29 01:48:24
Question: By default, m3u8 files can be played in Safari on the Mac, but not in any other desktop browser. What needs to be done to play them in all browsers, both those that support HTML5 and those that do not? Answer 1: Unfortunately, HTML5 support for video is so fragmented that it is, to all intents and purposes, useless (at least as a primary focus) at this point in time. M3U8 playlists are Apple HTTP Live Streaming, and as you can tell from the name, they are (or at least started out as) an Apple standard; no other
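Since that answer was written, the usual cross-browser approach has become the open-source hls.js library, which demuxes HLS in JavaScript and feeds it to Media Source Extensions, with native playback as the Safari path. A minimal sketch (the stream URL is a placeholder, and the CDN path assumes the hls.js package layout):

```html
<!-- Sketch only: assumes the hls.js CDN bundle; stream URL is a placeholder. -->
<video id="video" controls></video>
<script src="https://cdn.jsdelivr.net/npm/hls.js@1"></script>
<script>
  var video = document.getElementById('video');
  var src = 'https://example.com/stream.m3u8'; // placeholder
  if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // Safari and iOS can play HLS natively.
    video.src = src;
  } else if (Hls.isSupported()) {
    // Other browsers: demux the HLS segments in JS and feed them to MSE.
    var hls = new Hls();
    hls.loadSource(src);
    hls.attachMedia(video);
  }
</script>
```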

ExoPlayer adaptive HLS streaming

老子叫甜甜 submitted on 2019-11-29 01:31:15
Question: I am looking for a good and simple example/explanation of how to implement ExoPlayer for HLS adaptive streaming. I am a newbie and do not have the experience and knowledge to figure this out from the code example on git. There are too many 'moving parts' for a beginner to understand and reuse it in their own projects. Can somebody help me learn and understand how to use/implement ExoPlayer in order to achieve this functionality? Thanks! Answer 1: The easiest way to get started using ExoPlayer is to add
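For reference, getting ExoPlayer into a project starts with Gradle dependencies; the coordinates below are illustrative (the version number is an assumption, so check the current release), and the separate HLS module is what provides adaptive HLS support:

```groovy
// build.gradle (app module) - version number is an assumption; use the latest release.
dependencies {
    implementation 'com.google.android.exoplayer:exoplayer-core:2.18.1'
    implementation 'com.google.android.exoplayer:exoplayer-hls:2.18.1'
}
```

From there, handing the player a media item that points at the .m3u8 master playlist lets ExoPlayer switch between variant streams automatically, which is the "adaptive" part of the question.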

HTTP Live Streaming with AVPlayer in iOS 4.0?

大憨熊 submitted on 2019-11-29 00:38:18
Is it possible to use HTTP Live Streaming with AVPlayer on iOS 4.0? This was clearly a documented feature of 4.0. However, if I run Apple's StitchedStreamPlayer sample code on my 3GS running iOS 4.0.1, clicking "Load Movie" does not play the stream but gives an error: 2011-06-21 13:14:49.428 StitchedStreamPlayer[680:307] The asset's tracks were not loaded due to an error: Cannot Open. MPMediaPlayer is able to play the same stream on the same device. However, I need a working solution with AVPlayer. Does anyone know how to get Apple's StitchedStreamPlayer code to work on 4.0? The Runtime

How to let AVPlayer retrieve playlist secured by SSL?

江枫思渺然 submitted on 2019-11-28 18:29:17
We're developing an HTTP-streaming iOS app that requires us to receive playlists from a secured site. This site requires us to authenticate using a self-signed SSL certificate. We read the credentials from a .p12 file before we use NSURLConnection with a delegate to react to the authorization challenge.

- (void)connection:(NSURLConnection *)connection didReceiveAuthenticationChallenge:(NSURLAuthenticationChallenge *)challenge
{
    [[challenge sender] useCredential:self.credentials forAuthenticationChallenge:challenge];
}

- (BOOL)connection:(NSURLConnection *)connection

HTML5 video: How to test for HLS playing capability? (video.canPlayType)

喜夏-厌秋 submitted on 2019-11-28 17:12:32
Question: I have video which is delivered over HLS. Now I'd like to test in JavaScript whether the device can actually play HLS video in HTML5. Usually in JavaScript I did something like document.createElement('video').canPlayType('video/mp4'). However, I can't figure out which 'type' is the right one for HLS. Apple's Safari HTML5 Audio and Video Guide seems to suggest "vnd.apple.mpegURL" ("Listing 1-7 Falling back to a plug-in for IE"): <video controls> <source src="HttpLiveStream.m3u8" type="vnd.apple
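A sketch of the feature test: the MIME strings below are the variants commonly listed for HLS (application/vnd.apple.mpegurl is the one in Apple's guide), and the stub object stands in for a real document.createElement('video') so the logic can run outside a browser.

```javascript
// Feature-detect native HLS support via canPlayType.
// canPlayType returns "", "maybe", or "probably"; anything non-empty counts.
function canPlayHlsNatively(video) {
  return ['application/vnd.apple.mpegurl', 'audio/mpegurl']
    .some(function (type) { return video.canPlayType(type) !== ''; });
}

// Stub standing in for a real <video> element outside a browser,
// mimicking Safari, which reports "maybe" for the Apple HLS MIME type.
var safariLikeStub = {
  canPlayType: function (type) {
    return type === 'application/vnd.apple.mpegurl' ? 'maybe' : '';
  }
};

console.log(canPlayHlsNatively(safariLikeStub)); // prints true
```

In a browser, pass document.createElement('video') instead of the stub; a false result means HLS needs a JavaScript/MSE player or a plug-in fallback.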

Is Android 2.2 HTTP progressive streaming = HTTP Live Streaming?

拈花ヽ惹草 submitted on 2019-11-28 16:26:20
The Stagefright media framework (Android 2.2) supports HTTP progressive streaming. What does that mean? I.e., is this a realization of the HTTP Live Streaming protocol? And how does one use HTTP Live Streaming on Android: what is the client (the web browser, MediaPlayer, or some "in-SDK" realization where I have to inherit from some class)? One big practical difference is that the Stagefright media framework supports MP3 streaming, which the old engine didn't. So you can use (Shoutcast) MP3 streams, for example. Here is a simple example of an implementation which streams a Shoutcast URL: http://fr3.ah.fm:9000 . Note
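The MediaPlayer route the answer refers to can be sketched roughly as below. This is a hedged Android fragment, not runnable standalone: it assumes it sits inside an Activity, that the manifest declares the INTERNET permission, and that the IOException from setDataSource is handled by the caller; the URL is the one from the answer.

```java
// Android sketch: stream a Shoutcast/MP3 URL with the stock MediaPlayer.
MediaPlayer player = new MediaPlayer();
player.setAudioStreamType(AudioManager.STREAM_MUSIC);
player.setDataSource("http://fr3.ah.fm:9000"); // throws IOException; handle in caller
player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.start(); // begin playback once buffering is ready
    }
});
player.prepareAsync(); // network source: never call blocking prepare() on the main thread
```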

transcode and segment with ffmpeg

本秂侑毒 submitted on 2019-11-28 15:35:58
It appears that ffmpeg now has a segmenter in it, or at least there is a command-line option -f segment in the documentation. Does this mean I can use ffmpeg to transcode a video into H.264 in real time and deliver segmented, iOS-compatible .m3u8 streams using ffmpeg alone? If so, what would the command be to transcode an arbitrary video file to a segmented, iOS-compatible H.264/AAC 640x480 stream? Absolutely - you can use -f segment to chop video into pieces and serve them to iOS devices. ffmpeg will create .ts segment files, and you can serve those with any web server. Working example (with
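A command along those lines is sketched below. The filenames are placeholders and the flags assume an ffmpeg build with libx264 and a native AAC encoder; the script only assembles and prints the command so the flags can be inspected before running it against a real input file.

```shell
# Build the transcode-and-segment command; input/output names are placeholders.
# -f segment writes MPEG-TS chunks plus an m3u8 index that iOS players accept.
CMD="ffmpeg -i input.mp4 \
  -c:v libx264 -s 640x480 -c:a aac -b:a 128k \
  -f segment -segment_time 10 \
  -segment_list prog_index.m3u8 \
  -segment_format mpegts fileSequence%03d.ts"
echo "$CMD"
```

Serving prog_index.m3u8 and the fileSequenceNNN.ts chunks from any static web server is then enough for an iOS client.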

Implementation of HTTP Live Streaming in iOS

人盡茶涼 submitted on 2019-11-28 13:21:05
Question: I want to write a little iOS video client and have to use HTTP Live Streaming. The videos come from a Wowza Media Server which supports HTTP Live Streaming, so the server-side implementation is not my problem. I have already watched the WWDC videos and read the Apple documentation about HTTP Live Streaming. But nowhere is it explained how to play back the videos on an iOS device. In a WWDC talk it was mentioned that there are three possibilities to display the videos: UIWebView
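One of those routes, AVPlayer (the one used elsewhere in this digest), reduces to very little code. A hedged Objective-C sketch, assuming it runs inside a view controller and with a placeholder stream URL:

```objectivec
// Sketch: play an HLS stream with AVPlayer. The URL is a placeholder.
#import <AVFoundation/AVFoundation.h>

NSURL *url = [NSURL URLWithString:@"https://example.com/stream.m3u8"];
AVPlayer *player = [AVPlayer playerWithURL:url];

// Display the video by attaching a player layer to some existing view.
AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
layer.frame = self.view.bounds; // assumes a view controller context
[self.view.layer addSublayer:layer];

[player play];
```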

stream live video to Android

不打扰是莪最后的温柔 submitted on 2019-11-28 04:11:34
How can I stream live video to Android (2.1 and higher)? I have two links: m3u8 and f4m (as far as I know, f4m is not supported). From what I saw on Stack Overflow, there is a way to stream m3u8 with Vitamio (but the link is not working). Is there any other way to stream m3u8 video? Maybe there is another format that I can use? Thanks. Butters: Because no one answered my question, I will do it myself. If you want to perform HLS (HTTP Live Streaming) on Android 2.1 and higher, you may use the Vitamio library. Site at: http://www.vitamio.org/ . Here is a code example: The main layout: <?xml version="1.0"

FairPlay Streaming: Calling copyPixelBufferForItemTime on AVPlayerItemVideoOutput returns NULL

半城伤御伤魂 submitted on 2019-11-28 01:45:59
Has anybody had experience using HLS with FairPlay and succeeded in retrieving the pixel buffer? I'm using an AVURLAsset with its resourceLoader delegate set. My AVAssetResourceLoaderDelegate takes care of dealing with the FairPlay process. It displays fine on an AVPlayerLayer; however, when I try to use an AVPlayerItemVideoOutput attached to the AVPlayerItem and call copyPixelBufferForItemTime on it, the pixel buffer returned is always NULL. On the other hand, when I use a non-encrypted stream and do not use the resourceLoader, copyPixelBufferForItemTime returns a pixel buffer as expected.