video-streaming

How to play a YouTube video in iOS, not in fullscreen?

落花浮王杯 submitted on 2019-12-11 10:22:59
Question: I've been trying to get YouTube music videos to play in my iOS app in a small window (not full screen) and can't get it to work. I've tried the following: I first used the YouTube API and created a YTPlayerView as instructed here: https://developers.google.com/youtube/v3/guides/ios_youtube_helper#adding_ytplayerview. This allowed me to play videos in-line, but many videos could not be played because of licensing issues, stating "This video contains content from *. It is restricted from …
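For the inline-playback half of the question, here is a minimal sketch using the web IFrame Player API as a stand-in for YTPlayerView (the helper wraps the same underlying player); the 'playsinline' player variable is what requests inline rather than fullscreen playback on iOS. The element id and video id below are placeholders, and this does not address the content-restriction message, which depends on the video owner's embedding settings.

// A minimal sketch; assumes https://www.youtube.com/iframe_api has been
// loaded on the page. 'player' and the videoId are placeholders.
var player = new YT.Player('player', {
  videoId: 'M7lc1UVf-VE',            // any embeddable video id
  playerVars: { playsinline: 1 },    // request inline playback on iOS
  events: {
    onReady: function (event) { event.target.playVideo(); }
  }
});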

Wowza VOD App with JW Player and SMIL file - not working on mobile

◇◆丶佛笑我妖孽 submitted on 2019-12-11 10:07:45
Question: I posted this in the Wowza forums but have gotten a total of 0 responses, so I wanted to post here as well. I haven't done a lot of media-streaming work, and I'm trying to close out some details of a VOD project that basically streams video to a website. I've run into an issue with sending the correct stream to the correct consumer (e.g. Android browser, Chrome desktop browser, iOS, etc.). I've cycled through a bunch of tutorials and forums and can't find the right fix, including …
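Since the excerpt is cut off before the details, here is only a hedged sketch of the usual pattern for this setup: give JW Player one playlist entry with multiple sources (HLS for mobile, RTMP for Flash-capable desktops) and let it play the first source the environment supports. Hostnames and paths are placeholders; the Wowza SMIL file would normally sit behind the HLS URL to provide adaptive renditions.

// A minimal sketch; URLs are placeholders. JW Player walks the sources
// in order and plays the first one the current browser supports.
jwplayer("playerDiv").setup({
  playlist: [{
    sources: [
      // HLS, which works on iOS/Android (SMIL behind the Wowza URL)
      { file: "http://wowza.example.com:1935/vod/smil:sample.smil/playlist.m3u8" },
      // RTMP fallback for Flash-capable desktop browsers
      { file: "rtmp://wowza.example.com:1935/vod/mp4:sample.mp4" }
    ]
  }],
  width: 640,
  height: 360
});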

Why does RTP/RTSP meddle with my H.264 NALs?

人盡茶涼 submitted on 2019-12-11 09:25:27
Question: I looked in the RFC and nothing could explain why the following happens (though the decoder can still reproduce the original movie). I transmitted the H.264/AVC NALs using the VSS H.264 encoder; the byte stream looked something like this: E5 46 0E 4F FF A0 23... When I read the movie data on the receiver side, after the RTP broadcaster/RTSP receiver, I get extra unknown data, but always in the same places: 8 bytes are added before the start code prefix (0x00000001), and 2 bytes are added after the start code …
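Without the capture it is hard to say what those fixed-size extras are; bytes at fixed positions around each start code usually suggest transport framing (for example RTP or interleaving headers) rather than corruption. A hedged sketch of a receiver-side workaround: resynchronize on the Annex-B start codes and slice out everything between consecutive ones, so any fixed-size framing lands at the tail of a slice, where it is easy to trim once identified.

// A minimal sketch (Node.js, hypothetical buffer source): split a byte
// buffer on the 4-byte Annex-B start code 0x00000001. Each returned
// slice is everything between two consecutive start codes.
function extractNals(buf) {
  var nals = [];
  var start = -1;
  for (var i = 0; i + 3 < buf.length; i++) {
    if (buf[i] === 0 && buf[i + 1] === 0 && buf[i + 2] === 0 && buf[i + 3] === 1) {
      if (start >= 0) nals.push(buf.slice(start, i)); // close previous NAL
      start = i + 4; // payload begins right after the start code
      i += 3;
    }
  }
  if (start >= 0) nals.push(buf.slice(start)); // last NAL runs to the end
  return nals;
}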

Stream Video from Mobile

﹥>﹥吖頭↗ submitted on 2019-12-11 09:23:56
Question: I'm looking for the best solution for video streaming from mobile devices. As far as I understand, the most efficient way is using the RTMP/RTSP/UDP protocols, or TCP/WebSockets. So far I've found a few options: HTTP Live Streaming (iOS), but it's iOS-only; HTML5 Live Video Streaming via WebSockets, which works on Node.js but with no audio; BinaryJS, bidirectional realtime binary data over binary WebSockets (also WebSockets on Node.js); WebRTC, for the client side. The thing is, I don't really …
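As a hedged illustration of the WebSockets option mentioned above: a minimal Node.js relay using the 'ws' module, where one client pushes binary video chunks and every other connected client receives them. The port and roles are placeholders; audio, codecs, and client-side decoding (e.g. via Media Source Extensions) are out of scope here.

// A minimal sketch using the 'ws' module: relay each binary chunk a
// sender pushes to all other connected clients.
var WebSocket = require('ws');
var wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', function (ws) {
  ws.on('message', function (chunk) {
    wss.clients.forEach(function (client) {
      if (client !== ws && client.readyState === WebSocket.OPEN) {
        client.send(chunk, { binary: true });
      }
    });
  });
});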

Draw a graph of a video's encoded bit rate vs. play location

China☆狼群 submitted on 2019-12-11 08:55:29
Question: I am trying to measure the variation in the bandwidth required when a video is played over the network. For this purpose, I need to make a graph of the bandwidth required to play the video continuously at any point during the video. I tried processing the video with GStreamer, but it gives me the bit rate of the decoded (not encoded) video, which is more or less constant. Is there a way to get the encoded bit rate of a video over time? Answer 1: Since I got no answers here, I will post the solution …
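The poster's own solution is truncated above; as an alternative sketch, ffprobe can dump per-packet timestamps and sizes for the encoded video stream, which can then be summed into one-second buckets to plot bit rate against play position. The file name is a placeholder.

// A minimal sketch (Node.js): have ffprobe emit pts_time,size per packet
// for the video stream, then sum bits into one-second buckets.
var execFile = require('child_process').execFile;

execFile('ffprobe', [
  '-v', 'error',
  '-select_streams', 'v:0',
  '-show_entries', 'packet=pts_time,size',
  '-of', 'csv=p=0',
  'input.mp4'
], { maxBuffer: 64 * 1024 * 1024 }, function (err, stdout) {
  if (err) throw err;
  var bitsPerSecond = {};
  stdout.trim().split('\n').forEach(function (line) {
    var parts = line.split(',');
    var sec = Math.floor(parseFloat(parts[0]));
    if (isNaN(sec)) return; // skip packets without a pts_time
    bitsPerSecond[sec] = (bitsPerSecond[sec] || 0) + parseInt(parts[1], 10) * 8;
  });
  Object.keys(bitsPerSecond).forEach(function (sec) {
    console.log(sec + 's\t' + bitsPerSecond[sec] + ' bits');
  });
});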

How to know how long a token was streaming its video to a session in OpenTok

你离开我真会死。 submitted on 2019-12-11 08:52:21
Question: I connected and subscribed to a session in OpenTok with a token and streamed my video. Now, how do I get information such as how long my video streamed, and other details that could be useful for data analysis? var apiKey = "*****"; var sessionId = "**************************"; var token = "************"; var publisher = TB.initPublisher(apiKey); var session = TB.initSession(sessionId); session.connect(apiKey, token); session.addEventListener("sessionConnected", …
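A hedged sketch of one client-side approach, written against the newer OpenTok.js event style (session.on rather than the addEventListener style in the excerpt): stamp each stream at streamCreated and compute the duration at streamDestroyed. For authoritative analytics, OpenTok's server-side monitoring is the better source; this only approximates what this particular page observed.

// A minimal sketch; 'session' is the session object from the excerpt.
var startedAt = {};

session.on('streamCreated', function (event) {
  startedAt[event.stream.streamId] = Date.now();
});

session.on('streamDestroyed', function (event) {
  var t0 = startedAt[event.stream.streamId];
  if (t0) {
    console.log('Stream ' + event.stream.streamId + ' lasted ' +
                Math.round((Date.now() - t0) / 1000) + ' s');
  }
});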

Encoding raw H.264 data to the browser via DASH

喜夏-厌秋 submitted on 2019-12-11 08:08:22
Question: I have a live stream of raw H.264 (no container) coming from a remote webcam. I want to stream it live in the browser using DASH. DASH requires creating an MPD file (and segmentation). I found tools (such as MP4Box) that accomplish that for static files, but I'm struggling to find a solution for live streams. Any suggestions, preferably using Node.js modules? Threads I have checked: MP4Box: on the one hand, I saw this comment that states "You cannot feed MP4Box with some live content. You need to feed …
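A hedged sketch of one common route, assuming a reasonably recent ffmpeg alongside Node.js is acceptable: pipe the raw Annex-B feed into ffmpeg and let its dash muxer maintain a rolling live manifest and segments on disk, which any web server can then expose. Flags and paths are illustrative.

// A minimal sketch (Node.js): repackage (no re-encode) raw H.264 from
// stdin into live DASH segments plus a rolling manifest.
var spawn = require('child_process').spawn;

var ffmpeg = spawn('ffmpeg', [
  '-f', 'h264', '-i', 'pipe:0',   // raw Annex-B H.264 from stdin
  '-c:v', 'copy',                 // just repackage, no transcode
  '-f', 'dash',
  '-seg_duration', '4',
  '-window_size', '5',            // keep a rolling window of segments
  '-streaming', '1',
  '/var/www/live/manifest.mpd'
]);

someH264Source.pipe(ffmpeg.stdin); // hypothetical readable stream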

Multiple libstreaming streams recognized only as the session of the first RTSP client

我的梦境 submitted on 2019-12-11 07:49:29
Question: I am using Live555 as an RTSP client to get the RTSP H.264 video stream from the Android libstreaming (MajorKernelPanic) server. The problem I face is displaying more than one video stream from this kind of Android RTSP server, with each stream handled by a different RTSP client. The problem also shows up when using VLC, which likewise gets the RTSP H.264 frames via Live555. The first VLC (RTSP client) displays video correctly. The other VLCs (also RTSP clients) display nothing, but their frames are displayed in the …

WebRTC cannot show video from a peer

若如初见. submitted on 2019-12-11 07:15:14
Question: I've tried to implement code from this sample in my app, but something went wrong. The problem is that my second peer doesn't play the stream in its <video>. A short description of the algorithm: the "host" peer gets a stream from the camera and shows it in its own <video>, then creates a connection with the second peer and sends it the stream. The second peer receives the stream and shows it in its <video>. The problem is that my second peer receives the stream but can't play it in the <video>. Why? I don't know. Here there …
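The usual culprits in this situation are forgetting to attach the remote stream to the element, or the browser's autoplay policy silently blocking an unmuted video. A hedged sketch, assuming a hypothetical RTCPeerConnection named pc and a <video id="remoteVideo"> on the receiving page, using the current ontrack API (older samples used onaddstream):

// A minimal sketch: attach the remote stream via srcObject and mute the
// element so autoplay is allowed without a user gesture.
var remoteVideo = document.getElementById('remoteVideo');

pc.ontrack = function (event) {
  if (remoteVideo.srcObject !== event.streams[0]) {
    remoteVideo.srcObject = event.streams[0];
  }
};

remoteVideo.muted = true;     // autoplay policies allow muted playback
remoteVideo.autoplay = true;
remoteVideo.play().catch(function (e) { console.warn('play() blocked:', e); });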

Playing a stream of video data using QTKit on Mac OS X

扶醉桌前 submitted on 2019-12-11 06:34:03
Question: I've been playing with QTKit for a couple of days, and I'm successfully able to record video data to a file from the camera using a QTCaptureSession, QTCaptureDeviceInput, etc. However, what I want to do is send the data to another location, either over the network or to a different object within the same app (it doesn't matter), and then play the video data as if it were a stream. I have a QTCaptureMovieFileOutput, and I am passing nil as the file URL so that it doesn't actually record the …