video-streaming

H.264 packetization mode for FU-A

空扰寡人 submitted on 2019-12-03 16:44:28
We have run into a couple of interop issues where the video mode required by a few endpoints on the market is slightly different: they only understand the H.264 FU-A packetization (i.e. the FU-A NAL unit type), while others do not play the video on receiving an FU-A NAL payload. Does anyone know what this FU-A packetization mode is? How is it different from packetization modes 0, 1, and 2 as defined in RFC 3984? If the video encoder/decoder supports it, how can it be appropriately signalled in the SIP SDP session, wherein the attributes do not get changed even when traversing through
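To make FU-A concrete: it fragments a single large NAL unit across several RTP packets, each carrying a one-byte FU indicator and a one-byte FU header. A minimal parsing sketch per RFC 3984 (the function name and sample bytes are illustrative, not from the question):

```python
def parse_fu_a(payload: bytes):
    """Parse an RFC 3984 FU-A payload (NAL unit type 28).
    Returns (is_first_fragment, is_last_fragment,
             reconstructed_nal_header, fragment_data)."""
    fu_indicator, fu_header = payload[0], payload[1]
    assert fu_indicator & 0x1F == 28, "not an FU-A payload"
    start = bool(fu_header & 0x80)   # S bit: first fragment of the NAL
    end = bool(fu_header & 0x40)     # E bit: last fragment of the NAL
    # Rebuild the original NAL header: F and NRI bits come from the
    # FU indicator, the original NAL type from the FU header.
    nal_header = (fu_indicator & 0xE0) | (fu_header & 0x1F)
    return start, end, nal_header, payload[2:]
```

A receiver concatenates the fragment data of all packets from S to E, prepending the reconstructed header, to recover the original NAL unit. This is what distinguishes it from packetization mode 0 (single NAL unit per packet); modes 1 and 2 permit FU-A/FU-B and aggregation packets.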

How to make a video call with ejabberd?

冷暖自知 submitted on 2019-12-03 16:22:17
How do you make a video call with ejabberd (like MSN and Skype)? ejabberd doesn't handle audio/video natively. Audio and video are handled through Jingle (XEP-0166), which is client-to-client. If you want to place audio or video calls, you should make sure both clients support Jingle through the normal service discovery means (see section 11 of XEP-0166). There aren't a lot of clients that do this right now, but Psi, at least, supports it in more recent builds. You can also try using Jingle Nodes ( http://jinglenodes.org ) in combination with SIP Communicator, which supports audio and video via Jingle.

Can profile-level-id and sprop-parameter-sets be extracted from an RTP stream?

假如想象 submitted on 2019-12-03 16:13:45
I'm trying to stream live video from my Android phone to a desktop RTSP server on my PC, so that the streamed video can be played on another device. I'm using the H.264 video encoder, so the SDP returned by the server (as the reply to the DESCRIBE request) should contain the profile-level-id and sprop-parameter-sets fields. The Spydroid project shows how to extract this information from a dummy file recorded to the SD card by parsing it (from the avcC block), but I cannot do it like that. In Spydroid, the media recorder and the RTSP server are on the same device, so the server can always record a test file with the
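If you can get at the raw SPS and PPS NAL units by any other route (e.g. intercepted from the encoder output), both SDP fields can be built directly from them. A sketch of the RFC 6184 rules, with hypothetical sample bytes:

```python
import base64

def sdp_fields_from_sps_pps(sps: bytes, pps: bytes):
    """Build the two SDP fmtp fields from raw SPS/PPS NAL units
    (no start codes). Per RFC 6184, profile-level-id is the hex of
    the SPS's profile_idc, constraint-flags, and level_idc bytes;
    sprop-parameter-sets is the base64 of each NAL, comma-separated."""
    profile_level_id = sps[1:4].hex().upper()
    sprop = b",".join(base64.b64encode(n) for n in (sps, pps)).decode()
    return profile_level_id, sprop
```

The result plugs straight into the fmtp line, e.g. `a=fmtp:96 profile-level-id=42001E; sprop-parameter-sets=...` (payload type 96 is an example).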

Count frames in H.264 bitstream

只谈情不闲聊 submitted on 2019-12-03 15:35:46
How can I count/detect frames (pictures) in a raw H.264 bitstream? I know there are 5 VCL NALU types, but I don't know how to recognize a sequence of them as an access unit. I suppose detecting a frame means detecting an access unit, as an access unit is "a set of NAL units that are consecutive in decoding order and contain exactly one primary coded picture. In addition to the primary coded picture, an access unit may also contain one or more redundant coded pictures, one auxiliary coded picture, or other NAL units not containing slices or slice data partitions of a coded picture." The decoding of an access unit
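One common heuristic: a new primary coded picture starts at a VCL NAL unit (types 1-5) whose first_mb_in_slice is 0. Since first_mb_in_slice is ue(v)-coded, the value 0 means the first RBSP bit is 1. A rough counting sketch under those assumptions (it ignores redundant slices, interlaced fields, and emulation-prevention bytes):

```python
def count_frames(annexb: bytes) -> int:
    """Approximate frame count for an Annex B H.264 stream: count
    VCL NALs (types 1-5) whose first RBSP bit is 1, i.e. whose
    first_mb_in_slice ue(v) decodes to 0 (start of a new picture)."""
    frames = 0
    i = 0
    while True:
        # Find the next start code (00 00 01; a 4-byte 00 00 00 01
        # start code contains this sequence as well).
        j = annexb.find(b"\x00\x00\x01", i)
        if j < 0:
            break
        nal_start = j + 3
        if nal_start >= len(annexb):
            break
        nal_type = annexb[nal_start] & 0x1F
        if 1 <= nal_type <= 5 and nal_start + 1 < len(annexb):
            if annexb[nal_start + 1] & 0x80:  # first_mb_in_slice == 0
                frames += 1
        i = nal_start
    return frames
```

A fully conformant access-unit detector would apply the boundary rules of ITU-T H.264 clause 7.4.1.2.4 (comparing frame_num, pic_parameter_set_id, POC fields, etc.), but the sketch above is often sufficient for counting.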

How to live stream video using a C program. What should the HTTP reply be? How can I use chunked encoding if possible?

∥☆過路亽.° submitted on 2019-12-03 15:19:15
Question (the actual question has been edited because I was successful in doing live streaming, but now I do not understand the communication between the client and my C code). OK, I finally did live streaming using my C code, but I could not understand how HTTP is working here. I studied the communication between my browser and the server at the link http://www.flumotion.com/demosite/webm/ using Wireshark. I found that the client first sends this GET request: GET /ahiasfhsasfsafsgfg.webm HTTP/1.1 Host:
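For reference, a chunked HTTP/1.1 reply has a simple wire format: status line and headers announcing `Transfer-Encoding: chunked`, then each chunk as its size in hex, CRLF, the data, CRLF, ended by a zero-length chunk. A Python sketch of the encoding (Python rather than C purely for illustration; the header values are examples):

```python
# Example response headers announcing a chunked body:
HEADERS = (b"HTTP/1.1 200 OK\r\n"
           b"Content-Type: video/webm\r\n"
           b"Transfer-Encoding: chunked\r\n\r\n")

def chunked_body(chunks):
    """Encode an iterable of byte strings as an HTTP/1.1 chunked body
    (RFC 7230 section 4.1): hex size, CRLF, data, CRLF per chunk,
    then a terminating zero-length chunk."""
    out = b"".join(b"%x\r\n%s\r\n" % (len(c), c) for c in chunks)
    return out + b"0\r\n\r\n"
```

In a live-streaming server you would send `HEADERS` once, then keep writing chunks as media data arrives, since the total length is unknown in advance; that is exactly why chunked encoding (or simply closing the connection at end of stream) is used instead of Content-Length.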

How to use RajawaliVR or Rajawali to play a 360 Video

余生颓废 submitted on 2019-12-03 15:03:19
Question: I am having a hard time figuring out how to use Rajawali to play a 360 video. To achieve this, I tried every solution I could find on the Internet, but I failed. First, I used the RajawaliCardboard and let MainActivity extend from CardboardActivity . At the same time, I let my MyRenderer class extend from the RajawaliCardboardRenderer class. In MyRenderer, I overrode the initScene() function: protected void initScene() { StreamingTexture mTexture = null

Videos no longer streaming with mediaelement.js in Chrome

混江龙づ霸主 submitted on 2019-12-03 14:38:52
Over the past few days, we noticed that our videos quit streaming when using MediaElement.js version 2.11.3 and Chrome version 50.0.2661.94 (64-bit). Videos still play in Firefox and Safari without a problem. The error we receive in the Chrome Dev Tools is: Uncaught (in promise) DOMException: The element has no supported sources. The markup on the rendered page looks like this: <video height="150" poster="https://xxxxxxxxxx.cloudfront.net/123423_1_thumb.jpg" preload="auto" width="200" src="" hidden-source="https://xxxxxxx.cloudfront.net/123423_1_wm.webm"> <object data="flashmediaelement.swf" height=

Video streaming from Android device to LAMP Server

安稳与你 submitted on 2019-12-03 14:38:14
Question: starting from this point: http://www.mattakis.com/blog/kisg/20090708/broadcasting-video-with-android-without-writing-to-the-file-system I'm trying to create an application to save a video stream from the mobile camera to a remote server. (I found several examples in Google Code for the Android part: ipcamera-for-android, spydroid-ipcamera, etc.) I have read some answers here and around the network, but cannot find a solution for how to "read" and save the stream of data on the server side. My

How to implement video chat on iPhone

瘦欲@ submitted on 2019-12-03 14:21:11
Can anyone please tell me the way to do video chat on iPhone? I tried to search for it on many websites, but in vain. I found this link: http://code.google.com/p/xmppframework/wiki/iPhone but I am not sure if this works for video chat too. Thanks, Naveed. I don't have experience with XMPP, but I think you would have to add your own video solution on top of it. This is definitely not a straightforward task to accomplish, but these might be useful for you: 1) http://code.google.com/p/idoubs/ - an open-source 3GPP IMS client for iOS 2) http://labs.adobe.com/technologies/cirrus/ - the RTMFP protocol - works on

Is it possible to do Flash pseudo streaming with S3?

≯℡__Kan透↙ submitted on 2019-12-03 14:19:48
I've been using S3 to store and serve FLV and MP4 videos. It works great, but the content is progressively downloaded. I was wondering if it is possible to get so-called "pseudo streaming" to work with S3. Pseudo streaming allows viewers to seek ahead in a video before the full video has downloaded, as well as send only the bits necessary to the Flash player. I'm aware of lighttpd's pseudo-streaming plugin, and I know I can use keyframed FLV files with an XMOOV script, but I'd like to set this up with S3 as opposed to running my own server. Any help is appreciated. No. No, you can't do