live-streaming

Read RTMP live streaming video data using Java code

Submitted by こ雲淡風輕ζ on 2019-12-11 19:52:42
Question: My requirement is to read a live stream video URL and create an MP4 file from it. I have done a lot of R&D on this topic but failed to find an answer. The following link was also not useful to me. When I run the ffmpeg command given in that question, it gives me an exception. I want to use FFmpeg, not the Xuggle library. Thanks. Answer 1: Following is the Java method you can use to read data from an RTMP URL with the integration of the FFmpeg library. public static void liveRtmpFeed() throws IOException, …
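The excerpt cuts off before the method body. As a rough illustration of the same idea only, here is a minimal, hypothetical Java sketch that shells out to an ffmpeg binary (assumed to be installed and on the PATH) to record an RTMP feed into an MP4 file; the RTMP URL and output file name are placeholders, not the original poster's code.

```java
import java.io.IOException;

/**
 * Minimal sketch: record an RTMP feed to an MP4 file by running an external
 * ffmpeg process. The RTMP URL and output path below are placeholders.
 */
public class RtmpRecorder {

    public static void liveRtmpFeed(String rtmpUrl, String outputFile)
            throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg",
                "-i", rtmpUrl,          // read the live RTMP feed
                "-c", "copy",           // remux without re-encoding
                "-f", "mp4",
                outputFile);
        pb.inheritIO();                  // show ffmpeg's console output
        Process ffmpeg = pb.start();
        int exitCode = ffmpeg.waitFor(); // blocks until the stream ends or ffmpeg is stopped
        if (exitCode != 0) {
            throw new IOException("ffmpeg exited with code " + exitCode);
        }
    }

    public static void main(String[] args) throws Exception {
        liveRtmpFeed("rtmp://example.com/live/stream", "capture.mp4");
    }
}
```

Using `-c copy` avoids re-encoding and keeps CPU use low; if the source codecs are not MP4-compatible, explicit video and audio codecs would have to be specified instead.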

Could not open output container for live stream

Submitted by 不打扰是莪最后的温柔 on 2019-12-11 19:34:04
Question: I am working on a YouTube broadcasting Java program. So far, I can create a live event using this program: https://github.com/youtube/api-samples/tree/master/java For more detail on what I have so far, please see my other question: https://stackoverflow.com/questions/30449366/how-to-send-video-stream-for-live-event-using-youtube-broadcast-in-java Now, the next thing is that I want to create a video stream that will be passed to the YouTube live streaming APIs so that my video will be broadcast as …
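The excerpt ends before the stream itself is described. One common way to feed such a live event, sketched below purely as an assumption-laden illustration, is to push the video with ffmpeg from Java once the RTMP ingestion address and stream key of the created live stream are known; the ingest URL, stream key and source file below are placeholders.

```java
import java.io.IOException;

/**
 * Sketch under these assumptions: the live event and its stream have already
 * been created with the YouTube API sample, and the RTMP ingestion address
 * plus stream key have been read from that stream resource. ffmpeg is
 * assumed to be installed; all values below are placeholders.
 */
public class YouTubeIngestPush {

    public static void pushToYouTube(String ingestionAddress,
                                     String streamKey,
                                     String sourceFile)
            throws IOException, InterruptedException {
        String target = ingestionAddress + "/" + streamKey;
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg",
                "-re",                   // read the source at its native frame rate
                "-i", sourceFile,
                "-c:v", "libx264",       // H.264 video for RTMP ingest
                "-c:a", "aac",
                "-f", "flv",             // RTMP carries FLV-muxed streams
                target);
        pb.inheritIO();
        Process ffmpeg = pb.start();
        ffmpeg.waitFor();
    }

    public static void main(String[] args) throws Exception {
        pushToYouTube("rtmp://a.rtmp.youtube.com/live2",   // placeholder ingest address
                      "your-stream-key",                   // placeholder stream key
                      "sample.mp4");
    }
}
```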

Re-stream live video feed

Submitted by 我与影子孤独终老i on 2019-12-11 19:10:50
Question: I currently have a device running Ubuntu with an Asus Xtion Pro Live attached. What I am trying to do is capture the video and push it to a server, and then have the server re-stream it so that other clients that connect to my server can view the stream. The server is running Windows and has a public IP. What I have at the moment is that, while the device is on the same network as a client, the client can connect directly to the device to view the stream. For example, if the device has an IP of …
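In practice the Windows server would usually run a proper media server (RTMP/HLS) that the device pushes to and clients pull from. Purely to illustrate the relay architecture, here is a toy Java sketch of a server that accepts one publisher connection and copies its bytes to every connected viewer; the port numbers are arbitrary and no real streaming protocol is implemented.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.Set;
import java.util.concurrent.CopyOnWriteArraySet;

/**
 * Toy relay: one publisher connects on port 9000, viewers connect on port
 * 9001, and every chunk received from the publisher is copied to all viewers.
 */
public class StreamRelay {

    private static final Set<OutputStream> viewers = new CopyOnWriteArraySet<>();

    public static void main(String[] args) throws IOException {
        // Accept viewer connections on a background thread.
        new Thread(StreamRelay::acceptViewers).start();

        try (ServerSocket publisherPort = new ServerSocket(9000);
             Socket publisher = publisherPort.accept()) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = publisher.getInputStream().read(buffer)) != -1) {
                for (OutputStream viewer : viewers) {
                    try {
                        viewer.write(buffer, 0, read);   // fan out the chunk
                    } catch (IOException e) {
                        viewers.remove(viewer);          // drop disconnected viewers
                    }
                }
            }
        }
    }

    private static void acceptViewers() {
        try (ServerSocket viewerPort = new ServerSocket(9001)) {
            while (true) {
                Socket viewer = viewerPort.accept();
                viewers.add(viewer.getOutputStream());
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```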

Server architecture: websocket multicast server?

Submitted by 不问归期 on 2019-12-11 13:59:44
Question: What would be the simplest way to build a server that receives incoming connections via a WebSocket and streams the data flowing in on that socket out to n subscribers on other WebSockets? Think, for example, of a streaming application where one person is broadcasting to n consumers. Neglecting things like authentication, what would be the simplest way to build a server that can achieve this? I am a little confused about what would happen when a chunk of data hits the server. It would go into a …
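One simple shape for this is a single WebSocket endpoint that keeps a set of open sessions and forwards every chunk it receives to all the other sessions. Below is a minimal sketch using the standard Java WebSocket API (JSR 356), deployable in any compliant container; the endpoint path and the convention that "whoever sends is the broadcaster" are assumptions, and buffering and backpressure are ignored.

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.Set;
import java.util.concurrent.CopyOnWriteArraySet;

import javax.websocket.OnClose;
import javax.websocket.OnMessage;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.server.ServerEndpoint;

/**
 * Every binary message received on this endpoint is forwarded to all other
 * connected sessions. Path and fan-out policy are illustrative assumptions.
 */
@ServerEndpoint("/stream")
public class MulticastEndpoint {

    private static final Set<Session> sessions = new CopyOnWriteArraySet<>();

    @OnOpen
    public void onOpen(Session session) {
        sessions.add(session);
    }

    @OnMessage
    public void onMessage(ByteBuffer chunk, Session sender) {
        // Fan the incoming chunk out to every subscriber except the sender.
        for (Session s : sessions) {
            if (s.isOpen() && !s.getId().equals(sender.getId())) {
                try {
                    s.getBasicRemote().sendBinary(chunk.duplicate());
                } catch (IOException e) {
                    sessions.remove(s);   // drop sessions we can no longer write to
                }
            }
        }
    }

    @OnClose
    public void onClose(Session session) {
        sessions.remove(session);
    }
}
```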

Setting source of an Image control from Memory Stream using Non-UI thread in WPF

Submitted by 独自空忆成欢 on 2019-12-11 11:35:59
Question: I am capturing an image from a fingerprint scanner and I want to display the captured image live in an Image control. // On click of a Button: Thread WorkerThread = new Thread(new ThreadStart(CaptureThread)); WorkerThread.Start(); So I created a thread as above and called the method that captures the image from the device and sets the source of the Image control as follows: private void CaptureThread() { m_bScanning = true; while (!m_bCancelOperation) { GetFrame(); if (m_Frame != null) { …
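The usual problem with this pattern is that the worker thread touches a UI control directly; in WPF the fix is to marshal the update back onto the UI thread (for example via the Dispatcher) and to freeze the bitmap so it can cross threads. Purely as an analogous illustration in Java/Swing rather than C#, the sketch below keeps the capture loop on a worker thread and only updates the control on the UI thread; the frame source is a placeholder.

```java
import java.awt.image.BufferedImage;
import javax.swing.ImageIcon;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.SwingUtilities;

/**
 * Analogous sketch of the pattern: capture on a worker thread, update the
 * UI only on the UI thread. The frame source is faked.
 */
public class LivePreview {

    public static void main(String[] args) {
        JFrame frame = new JFrame("Live preview");
        JLabel imageView = new JLabel();
        frame.add(imageView);
        frame.setSize(320, 240);
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setVisible(true);

        Thread workerThread = new Thread(() -> {
            while (true) {
                BufferedImage captured = grabFrame();   // capture off the UI thread
                // Marshal the UI update back onto the event dispatch thread.
                SwingUtilities.invokeLater(
                        () -> imageView.setIcon(new ImageIcon(captured)));
                try {
                    Thread.sleep(100);                  // ~10 fps polling
                } catch (InterruptedException e) {
                    return;
                }
            }
        });
        workerThread.setDaemon(true);
        workerThread.start();
    }

    // Placeholder for the real scanner SDK call; returns a blank frame here.
    private static BufferedImage grabFrame() {
        return new BufferedImage(300, 200, BufferedImage.TYPE_INT_RGB);
    }
}
```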

How to implement Live Streaming on Windows Phone

Submitted by 我与影子孤独终老i on 2019-12-11 11:21:37
Question: I want to implement audio live streaming on Windows Phone. Most sources say the Windows Phone 7 SDK does not support streaming, so I upgraded and installed Windows Phone 7.1. Which class should I use to implement live streaming? Please give some guidelines and materials for this task. Thanks in advance. Answer 1: You may want to consider using IIS 7.x for the streaming portion. The following link shows how you can build the client app for Windows Phone 7: http://www.iis.net/community/default.aspx

What technologies should I use to produce a WebM live stream from a series of in-memory bitmaps?

Submitted by 限于喜欢 on 2019-12-11 06:15:49
Question: My boss handed me a challenge that is a bit out of my usual ballpark, and I am having trouble identifying which technologies/projects I should use. (I don't mind, I asked for something 'new'. :) Job: build a .NET server-side process that can pick up a bitmap from a buffer 10 times per second and produce/serve a 10 fps video stream for display in a modern HTML5-enabled browser. What Lego blocks should I be looking for here? Dave. Answer 1: You'll want to use FFmpeg. Here's the basic flow: Your …
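The answer's flow is built around piping frames into FFmpeg. The sketch below shows the same idea in Java rather than .NET (in .NET the equivalent is writing to a child process's standard input): raw bitmaps are written to ffmpeg's stdin at 10 fps and encoded to a WebM (VP8) stream. Frame size, frame source and the output target are placeholders.

```java
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;
import java.io.IOException;
import java.io.OutputStream;

/**
 * Sketch: feed raw BGR frames to an ffmpeg child process over stdin and let
 * it encode a WebM stream. All names and sizes below are placeholders.
 */
public class BitmapToWebm {

    private static final int WIDTH = 640, HEIGHT = 480, FPS = 10;

    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg",
                "-f", "rawvideo",            // raw pixel data arrives on stdin
                "-pix_fmt", "bgr24",
                "-s", WIDTH + "x" + HEIGHT,
                "-r", String.valueOf(FPS),
                "-i", "-",                   // "-" = read frames from stdin
                "-c:v", "libvpx",            // VP8 encoder for WebM
                "-f", "webm",
                "output.webm");              // could instead be piped to an HTTP response
        pb.redirectOutput(ProcessBuilder.Redirect.INHERIT);
        pb.redirectError(ProcessBuilder.Redirect.INHERIT);
        Process ffmpeg = pb.start();

        try (OutputStream ffmpegInput = ffmpeg.getOutputStream()) {
            for (int i = 0; i < 100; i++) {              // 10 seconds of frames
                BufferedImage frame = grabBitmap();
                byte[] pixels = ((DataBufferByte) frame.getRaster()
                        .getDataBuffer()).getData();     // BGR bytes for TYPE_3BYTE_BGR
                ffmpegInput.write(pixels);
                Thread.sleep(1000 / FPS);
            }
        }
        ffmpeg.waitFor();
    }

    // Placeholder for "pick up a bitmap from a buffer".
    private static BufferedImage grabBitmap() {
        return new BufferedImage(WIDTH, HEIGHT, BufferedImage.TYPE_3BYTE_BGR);
    }
}
```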

C: reading from a live file (the file keeps growing in size)

Submitted by ╄→гoц情女王★ on 2019-12-11 01:54:23
Question: I have an application that is recording live, and the capture file keeps growing in size. I read it with fread() and feof(), but feof() breaks the loop early. So what is the best technique to keep reading from the stream? Should I wait and then advance the file stream? Should I reopen the file and seek to the position given by the total bytes read so far? Maybe something else? The code has to read the file, build a packet and send it; packaging and sending work well with fixed …
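A common technique is to treat a short read as "no new data yet" rather than a final end of file: wait briefly, clear the EOF state (in C, typically clearerr() or an fseek() on the stream), and retry from the remembered offset. Below is an analogous sketch in Java rather than C; the file name and packet size are placeholders.

```java
import java.io.IOException;
import java.io.RandomAccessFile;

/**
 * Tail a growing capture file: read a full packet whenever one is available,
 * otherwise sleep and retry. File name and packet size are placeholders.
 */
public class GrowingFileReader {

    public static void main(String[] args) throws IOException, InterruptedException {
        byte[] packet = new byte[1316];          // example fixed packet size
        long offset = 0;

        try (RandomAccessFile file = new RandomAccessFile("capture.ts", "r")) {
            while (true) {
                if (file.length() - offset >= packet.length) {
                    file.seek(offset);
                    file.readFully(packet);       // a full packet is available
                    offset += packet.length;
                    sendPacket(packet);
                } else {
                    Thread.sleep(50);             // at EOF for now: wait for the writer
                }
            }
        }
    }

    // Placeholder for the real "build a packet and send it" step.
    private static void sendPacket(byte[] packet) {
        // ...
    }
}
```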

360 live stitched local camera feed at lowest latency + Unity / iOS / Android

Submitted by 对着背影说爱祢 on 2019-12-11 01:22:55
Question: I'm looking for recommendations on hardware and software to render a live stitched feed from a local 360 camera in Unity (alternatively native iOS or Android) at absolute minimum latency. There is high flexibility regarding the hardware and setup. It can be a mobile or a desktop app. It can be any reasonably priced 360 camera. It can be a wired connection or WiFi Direct. The only thing that matters is to have access to the stitched live video feed in a custom app (SDK?) and be able to render it …