video-streaming

Multiple Video Files Simultaneously Playing in Android

非 Y 不嫁゛ submitted on 2019-12-21 14:38:49
Question: I had asked the same question for iOS on iPad, but now I am trying to see if it's possible on Android. The response I received so far is that it is not possible on iOS. If it is possible on Android, please explain which API is used. Here's my original question for reference: Original Posting on iOS for Multiple Videos Playing Simultaneously on an iPad

Answer 1: I tried to do so (two VideoViews), but only one video played. This is because of the Linux decoder, which may be used as a single instance only (based on the stack trace info).
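
For reference, a minimal Kotlin sketch of the kind of two-player setup being attempted is shown below. The layout, view ids, and raw resources are placeholders, not taken from the original question; on devices whose hardware decoder only supports a single instance, only one of the two views will actually render.

    import android.net.Uri
    import android.os.Bundle
    import android.widget.VideoView
    import androidx.appcompat.app.AppCompatActivity

    // Hypothetical activity: R.layout.dual_video, the two VideoView ids and the two
    // raw clips are placeholders for whatever the real project uses.
    class DualVideoActivity : AppCompatActivity() {
        override fun onCreate(savedInstanceState: Bundle?) {
            super.onCreate(savedInstanceState)
            setContentView(R.layout.dual_video)

            val first = findViewById<VideoView>(R.id.videoView1)
            val second = findViewById<VideoView>(R.id.videoView2)

            first.setVideoURI(Uri.parse("android.resource://$packageName/${R.raw.clip_one}"))
            second.setVideoURI(Uri.parse("android.resource://$packageName/${R.raw.clip_two}"))

            // Both views are started; whether both actually play depends on how many
            // concurrent decoder instances the device's codec supports.
            first.start()
            second.start()
        }
    }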

Progressive Video Download on iOS

你。 submitted on 2019-12-21 12:41:01
Question: I am trying to implement progressive downloading of a video in my iOS application so that it can be played through AVPlayer. I have already implemented a downloader module that can download the files to the iPad. However, I have discovered that I cannot play a file that is still being written to. So, as far as I can tell, my only solution would be to download a list of file 'chunks' and then keep playing through each file as it becomes ready (i.e. downloaded), probably using HLS. Searching, I have

Draw overlay (HUD) on Android VideoView?

孤者浪人 submitted on 2019-12-21 12:00:15
Question: I have a custom view that draws a HUD. Here is my layout:

    <?xml version="1.0" encoding="utf-8"?>
    <FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        android:orientation="vertical" >
        <VideoView
            android:id="@+id/videoView1"
            android:layout_gravity="center"
            android:layout_width="match_parent"
            android:layout_height="wrap_content" />
        <com.widgets.HUD
            android:id="@+id/hud"
            android:layout_width="fill_parent"
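
The com.widgets.HUD implementation is not shown in the question, but a rough Kotlin sketch of such an overlay could look like the following (the drawing content is purely illustrative). Because the custom view is declared after the VideoView inside the FrameLayout and keeps its default transparent background, whatever onDraw() paints appears on top of the playing video.

    import android.content.Context
    import android.graphics.Canvas
    import android.graphics.Color
    import android.graphics.Paint
    import android.util.AttributeSet
    import android.view.View

    // Illustrative stand-in for com.widgets.HUD: a transparent view stacked over the
    // VideoView that draws its HUD elements in onDraw().
    class HUD @JvmOverloads constructor(
        context: Context,
        attrs: AttributeSet? = null
    ) : View(context, attrs) {

        private val paint = Paint(Paint.ANTI_ALIAS_FLAG).apply {
            color = Color.WHITE
            textSize = 48f
        }

        override fun onDraw(canvas: Canvas) {
            super.onDraw(canvas)
            // The view's background stays transparent, so only this text is overlaid.
            canvas.drawText("REC", 32f, 64f, paint)
        }
    }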

How to keep a live MediaSource video stream in-sync?

假装没事ソ submitted on 2019-12-21 07:14:04
Question: I have a server application which renders a 30 FPS video stream, then encodes and muxes it in real time into a WebM byte stream. On the client side, an HTML5 page opens a WebSocket to the server, which starts generating the stream when the connection is accepted. After the header is delivered, each subsequent WebSocket frame consists of a single WebM SimpleBlock. A keyframe occurs every 15 frames, and when this happens a new Cluster is started. The client also creates a MediaSource, and on

Adding video call and voice call functionality to my PHP site [closed]

一曲冷凌霜 submitted on 2019-12-21 06:40:45
Question: This question is ambiguous, vague, incomplete, overly broad, or rhetorical and cannot be reasonably answered in its current form; it was closed 7 years ago. I am creating a social network for a certain audience. There, I need to add video calling and voice calling functionality (like Facebook or Skype) to this site. I need to do this with PHP. Is there any API or help menu

Live video streaming, how to play it on iPhone?

核能气质少年 submitted on 2019-12-21 06:28:22
Question: I am wondering what is available to play a live video feed on my iPhone, from a developer's perspective. A few apps exist that play live streams, such as: http://qik.com/ http://www.ustream.tv http://orb.com/en/orblive Do you have an idea how they achieve this? Thanks a lot. Thierry

Answer 1: The iPhone 3.0 software includes new APIs for video streaming. Unfortunately, it's under NDA at the moment, so no one can really talk about it on these forums.

Answer 2: I did work on live video streaming. Unfortunately, I didn't

iPhone: HTTP live streaming without any server side processing

爱⌒轻易说出口 submitted on 2019-12-21 06:17:42
Question: I want to be able to (live) stream the frames/video FROM the iPhone camera to the internet. I've seen in a thread (streaming video FROM an iPhone) that it's possible using AVCaptureSession's beginConfiguration and commitConfiguration, but I don't know how to start designing this task. There are already a lot of tutorials about how to stream video TO the iPhone, which is not actually what I am searching for. Could you give me any ideas that could help me further?

Answer 1: That's a tricky

Can profile-level-id and sprop-parameter-sets be extracted from an RTP stream?

天大地大妈咪最大 submitted on 2019-12-21 05:21:55
Question: I'm trying to stream live video from my Android phone to a desktop RTSP server on my PC, so the streamed video can then be played on another device. I'm using the H.264 video encoder, so the SDP returned by the server (as the reply to the DESCRIBE request) should contain the profile-level-id and sprop-parameter-sets fields. The Spydroid project shows how to extract this info from a dummy file recorded to the SD card by parsing it (from the avcC block), but I cannot do it like that. In Spydroid, the media
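
As a hedged illustration of what the two SDP fields contain, the Kotlin sketch below builds them from raw SPS and PPS NAL units, however those are obtained (for example from MediaCodec's "csd-0"/"csd-1" buffers on API 16+, with the Annex-B start codes stripped): profile-level-id is the hex of the three bytes that follow the SPS NAL header, and sprop-parameter-sets is the comma-separated Base64 of the SPS and the PPS.

    import android.util.Base64

    // sps and pps are the raw NAL units without the 00 00 00 01 start codes.
    fun buildH264SdpFields(sps: ByteArray, pps: ByteArray): Pair<String, String> {
        // profile_idc, the constraint flags, and level_idc are the three bytes after
        // the SPS NAL header byte; their hex representation is the profile-level-id.
        val profileLevelId = sps.copyOfRange(1, 4)
            .joinToString("") { "%02x".format(it) }

        // sprop-parameter-sets is Base64(SPS),Base64(PPS) on a single line.
        val spropParameterSets =
            Base64.encodeToString(sps, Base64.NO_WRAP) + "," +
            Base64.encodeToString(pps, Base64.NO_WRAP)

        return profileLevelId to spropParameterSets
    }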

How to make video calls with ejabberd?

一世执手 submitted on 2019-12-21 04:57:21
Question: How do I make video calls with ejabberd (like MSN and Skype)?

Answer 1: ejabberd doesn't handle audio/video natively. Audio and video are handled through Jingle (XEP-0166), which is client-to-client. If you want to place audio or video calls, you should make sure both clients support Jingle through normal service discovery means (see section 11 of XEP-0166). There aren't a lot of clients that do this right now, but Psi, at least, supports it in more recent builds.

Answer 2: You can try using Jingle Nodes

What is RTSP and WebRTC for streaming?

元气小坏坏 submitted on 2019-12-21 04:38:11
Question: I'm very new to streaming, but I have to build a user-based streaming system with IP cameras. It will be like security cameras: one user will have one stream. My team is thinking of working with RTSP, and they want to know how we will do it and what RTSP, WebRTC, and RTP are. I'm researching, and I want to ask you: what exactly is RTSP? Some IP cameras claim to support WebRTC; what is that? Is it compatible with RTSP? Which is the best protocol for user-based streaming?

Answer 1: RTSP is a streaming
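
To give a feel for the protocol, RTSP is a text-based control protocol, similar in shape to HTTP, that negotiates how the actual media (usually RTP packets) will be delivered. A small Kotlin sketch that sends a bare DESCRIBE request to a hypothetical camera address and prints the SDP reply might look like this; error handling and the follow-up SETUP/PLAY exchange are omitted.

    import java.net.InetSocketAddress
    import java.net.Socket

    fun main() {
        val host = "192.168.1.10"                // placeholder IP camera address
        val url = "rtsp://$host:554/stream1"     // placeholder stream path

        Socket().use { socket ->
            socket.connect(InetSocketAddress(host, 554), 5_000)
            socket.soTimeout = 5_000

            // DESCRIBE asks the server what media it offers; the reply body is SDP,
            // the same kind of description WebRTC exchanges during its offer/answer.
            socket.getOutputStream().apply {
                write(("DESCRIBE $url RTSP/1.0\r\n" +
                       "CSeq: 1\r\n" +
                       "Accept: application/sdp\r\n\r\n").toByteArray())
                flush()
            }

            val buffer = ByteArray(8192)
            val read = socket.getInputStream().read(buffer)
            if (read > 0) println(String(buffer, 0, read))
        }
    }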