http-live-streaming

trying to live stream from AWS/FMS to iPhone (using HLS)

喜欢而已 submitted on 2019-12-08 07:04:40
问题 Question: I've set up an instance on Amazon AWS running Flash Media Server (FMS), which is broadcasting HTTP Live Streaming (HLS) following these instructions, so I know I'm streaming using the right format for iPhone. Further, using the same instructions I've confirmed that the server is up and running, and I've successfully set up a Flash client to read its HDS stream (HTTP Dynamic Streaming, for Flash devices). I wrote this iPhone client code to play the stream (stolen from a tutorial that makes
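For reference, a minimal Swift sketch of an iOS client that plays an HLS playlist with AVPlayer; the playlist URL (an FMS livepkgr-style path) and the class name are placeholders, not the asker's actual setup:

    import AVFoundation
    import AVKit
    import UIKit

    final class LiveStreamViewController: UIViewController {
        // Placeholder: the exact .m3u8 path depends on how FMS/livepkgr is configured.
        private let streamURL = URL(string: "http://my-aws-host/hls-live/livepkgr/_definst_/liveevent/livestream.m3u8")!

        override func viewDidAppear(_ animated: Bool) {
            super.viewDidAppear(animated)
            let player = AVPlayer(url: streamURL)
            let controller = AVPlayerViewController()
            controller.player = player
            // Present the system player UI, then start playback once it is on screen.
            present(controller, animated: true) {
                player.play()
            }
        }
    }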

Can you stream from an HTTPS server using HLS?

可紊 submitted on 2019-12-08 06:47:49
问题 Question: I'm wondering whether or not it's possible to stream from an HTTPS server using HLS, using the following code - let url = NSURL(string:"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8") let player = AVPlayer(URL: url!) let playerController = AVPlayerViewController() playerController.player = player self.addChildViewController(playerController) self.view.addSubview(playerController.view) playerController.view.frame = self.view.frame player.play() I can stream from an HTTP server,
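AVPlayer handles https:// playlist URLs the same way as http:// ones, so in principle the same code works; a minimal sketch, assuming a hypothetical TLS-served playlist (on iOS 9+ it is plain-HTTP URLs, like the Apple sample above, that additionally need an App Transport Security exception in Info.plist):

    import AVFoundation
    import AVKit

    // Hypothetical HTTPS playlist URL; replace with a reachable stream.
    let url = URL(string: "https://example.com/live/master.m3u8")!
    let player = AVPlayer(url: url)
    let playerController = AVPlayerViewController()
    playerController.player = player
    // Present or embed playerController from a view controller, then:
    player.play()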

MPMoviePlayerController plays only when called twice. Only occurs in iOS 4

試著忘記壹切 submitted on 2019-12-08 06:36:13
问题 Question: I have an app for iOS 4.0-5.1 that uses HTTP Live Streaming to play videos. I have a simple setup with a button in a view that starts playing the stream when it is tapped. This works fine in iOS 5, but the button needs to be tapped twice before the stream begins playing in iOS 4. Does anybody know why this is happening and how to make the video play on the first tap of the button? Here is what I'm doing: .h #import <UIKit/UIKit.h> #import <MediaPlayer/MediaPlayer.h> @interface ViewController :
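A common workaround for this class of problem is to call prepareToPlay first and only call play once the player reports a playable load state, rather than calling play directly on the tap. A rough sketch of that pattern, written in Swift for consistency with the other examples here (the question itself targets Objective-C on iOS 4), with a placeholder URL:

    import MediaPlayer

    // Placeholder stream URL.
    let streamURL = URL(string: "https://example.com/live/stream.m3u8")!
    let moviePlayer = MPMoviePlayerController(contentURL: streamURL)

    // Wait for the "playable" load state instead of calling play() immediately,
    // so a single tap is enough even when the stream is still buffering.
    NotificationCenter.default.addObserver(
        forName: .MPMoviePlayerLoadStateDidChange,
        object: moviePlayer,
        queue: .main
    ) { _ in
        if moviePlayer.loadState.contains(.playable) {
            moviePlayer.play()
        }
    }
    // Add moviePlayer.view to the view hierarchy as usual, then:
    moviePlayer.prepareToPlay()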

Resume AVPlayer stream playback at the last sample

若如初见. submitted on 2019-12-08 03:59:52
问题 Question: I am trying to use the native player (AVPlayer) to play a live stream on iOS. However, I have trouble resuming playback. When I stop playback and resume it after a few seconds, playback starts from the moment I paused instead of playing the current (last) sample of the live stream. Is there a way to get the last sample, or to configure AVPlayer to resume from the last sample when tapping the Play button? 回答1: Answer 1: My solution is based on not letting the user keep the player paused. This is
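One way to jump back to the live edge rather than the paused position is to seek to the end of the item's last seekable time range before resuming; a minimal sketch (the helper name is mine, not from the question):

    import AVFoundation

    /// Jump an AVPlayer that was paused on a live HLS stream back to the live edge.
    /// Assumes `player` already has a live .m3u8 item loaded.
    func resumeAtLiveEdge(_ player: AVPlayer) {
        guard let item = player.currentItem,
              let lastRange = item.seekableTimeRanges.last?.timeRangeValue else {
            player.play()
            return
        }
        // The end of the last seekable range is (approximately) the live edge.
        player.seek(to: lastRange.end) { _ in
            player.play()
        }
    }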

ffmpeg segments only the first part of my audio file

亡梦爱人 submitted on 2019-12-07 19:25:00
问题 Question: I'm implementing an HTTP Live Streaming server to send audio files to iOS devices. There's no problem with Apple's tool, mediafilesegmenter: my files are valid and it works fine. I'm now trying to segment the same file using ffmpeg. I've downloaded the latest stable version, which is 0.10.2 for now. Here is how I try to segment my mp3 file: ./ffmpeg -re -i input.mp3 -f segment -segment_time 10 -segment_list outputList.m3u8 -acodec libmp3lame -map 0 output%03d.mp3 It starts the mapping as expected

JWPlayer and HLS streaming - “Error loading player: No playable sources found”

好久不见. submitted on 2019-12-07 17:02:28
问题 Question: The problem: I have a server (nginx-rtmp-module) that streams from an IP camera to HLS. I want to embed the live stream in popular browsers: Chrome, Firefox and IE. The stream is not working on some desktop browsers. Test player: https://content.jwplatform.com/previews/KCpvutTz-FfTLdraP
What I tried: tested devices and browsers:
- Firefox on PC - "Error loading player: No playable sources found"
- IE 11 - OK
- Chrome on PC - OK
- Chrome on Android - OK
- iPhone - OK
The questions: How to resolve these issues

Error while implementing AVAssetDownloadURLSession to download HLS stream

我与影子孤独终老i submitted on 2019-12-07 16:57:25
问题 Question: I'm trying to implement an offline mode for a streaming application. The goal is to be able to download an HLS stream to the user's device so the stream can be watched even while the user is offline. I recently stumbled on this tutorial. It seems to answer the exact requirements of what I was trying to implement, but I'm facing a problem while trying to make it work. I've created a little DownloadManager to apply the logic of the tutorial. Here is my singleton class: import
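For comparison, a minimal sketch of a singleton download manager built on AVAssetDownloadURLSession; the session identifier, bitrate option and persistence code are placeholder choices, not the tutorial's exact code (note that asset downloads only work on a real device, not the simulator):

    import AVFoundation
    import Foundation

    final class DownloadManager: NSObject, AVAssetDownloadDelegate {
        static let shared = DownloadManager()

        private lazy var session: AVAssetDownloadURLSession = {
            // Background configuration so downloads can continue when the app is suspended.
            let config = URLSessionConfiguration.background(withIdentifier: "hls-downloads")
            return AVAssetDownloadURLSession(configuration: config,
                                             assetDownloadDelegate: self,
                                             delegateQueue: .main)
        }()

        func download(streamURL: URL, title: String) {
            let asset = AVURLAsset(url: streamURL)
            // Picks the lowest variant at or above the given bitrate (value is arbitrary here).
            let task = session.makeAssetDownloadTask(asset: asset,
                                                     assetTitle: title,
                                                     assetArtworkData: nil,
                                                     options: [AVAssetDownloadTaskMinimumRequiredMediaBitrateKey: 265_000])
            task?.resume()
        }

        // Called when the asset has been written to disk; persist this location for offline playback.
        func urlSession(_ session: URLSession,
                        assetDownloadTask: AVAssetDownloadTask,
                        didFinishDownloadingTo location: URL) {
            UserDefaults.standard.set(location.relativePath,
                                      forKey: assetDownloadTask.taskDescription ?? "lastDownload")
        }
    }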

MediaPlayer, only video m3u8 HTML streams work

梦想的初衷 submitted on 2019-12-07 13:18:51
问题 Question: I'm using MediaPlayer with an m3u8 audio stream. This results in a log error message: Error(1, -1010). The first argument seems OK when I look at the error codes: https://github.com/android/platform_external_opencore/blob/master/pvmi/pvmf/include/pvmf_return_codes.h Only -1010 is strange. When I use the Apple video m3u8 URL it works great! This is the URL: http://devimages.apple.com/iphone/samples/bipbop/gear1/prog_index.m3u8 The code I'm using is simple: MediaPlayer mediaPlayer = new

Android VideoView live tv stream (HLS)

只谈情不闲聊 submitted on 2019-12-07 07:28:46
问题 Question: I'm trying to develop an app for TV streaming (HLS). Using the code below I tested the stream on Android 2.3.3, 3.0 and 4.0.1 devices, but encountered several problems. On Android 2.3.3 the stream plays for >1 minute and then just stops. On Android 3.0 it plays well, and on Android 4.0.3 it displays the message 'This file cannot be played' (if I remember correctly). So my question would be: how can I play the stream on either of these devices without playback problems? Or where can I read more

Android Widevine HLS/DRM support

﹥>﹥吖頭↗ submitted on 2019-12-07 05:23:47
问题 Question: It will soon be 2 years since Google acquired Widevine, the company that provides DRM support for protecting e.g. HLS H.264/AAC streams. According to http://www.widevine.com/, not only Android but also iPhone/iPad and game consoles like the Wii or PS3 are supported. Does anybody have experience with the Android Widevine DRM? Regards, STeN 回答1: Answer 1: You must be certified by Google to work with the Widevine APIs. The certification is called CWIP and requires paying a substantial sum and going