live-streaming

(FFmpeg) How to play live audio in the browser from received UDP packets using FFmpeg?

筅森魡賤 submitted on 2020-06-29 05:20:33

Question: I have a .NET Core console application which acts as both a UDP server and a UDP client: a UDP client in that it receives audio packets, and a UDP server in that it sends each received packet on. Here's a sample of the console app's code:

    static UdpClient udpListener = new UdpClient();
    static IPEndPoint endPoint = new IPEndPoint(IPAddress.Parse("192.168.1.230"), 6980);
    static IAudioSender audioSender = new UdpAudioSender(new IPEndPoint(IPAddress.Parse("192.168.1.230"), 65535));

    static void Main(string[] args)
    {
        udpListener.Client
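
The snippet above is cut off, but the receive-and-forward loop it describes is small enough to sketch. Below is a minimal Python version of the same relay pattern (the original is C#); the two 192.168.1.230 endpoints are taken from the question, everything else is an assumption. Note that browsers will not play raw UDP audio directly: the relayed packets would still have to be repackaged into something a browser accepts, e.g. by piping them through FFmpeg.

    import socket

    LISTEN_ADDR = ("192.168.1.230", 6980)    # where audio packets arrive (from the question)
    FORWARD_ADDR = ("192.168.1.230", 65535)  # where each packet is sent on (from the question)

    recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    recv_sock.bind(LISTEN_ADDR)
    send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    while True:
        # UDP is datagram-oriented: each recvfrom() returns exactly one packet
        packet, _ = recv_sock.recvfrom(4096)
        send_sock.sendto(packet, FORWARD_ADDR)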

Server node.js for livestreaming

这一生的挚爱 submitted on 2020-06-01 07:24:47

Question: I'm trying to create a server in Node.js that receives RTMP packets, converts them into HLS packets, and then sends the packets back out. I'm doing this to create a livestream service compatible with every device, since iOS doesn't support RTMP. This is my code, but I'm stuck on what I should put into the callback. Sorry for the mess, but I'm not a JS programmer and these are my first steps in a JS project. Thanks in advance! My streaming client will be OBS.

    import { Server } from
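
A common way to fill in the missing conversion step is to let FFmpeg do the RTMP-to-HLS repackaging instead of writing it by hand. Here is a hedged Python sketch of that idea (the ingest URL rtmp://localhost/live/stream and the output path are placeholders for wherever OBS publishes and wherever your HTTP server serves static files):

    import subprocess

    cmd = [
        "ffmpeg",
        "-i", "rtmp://localhost/live/stream",  # pull from the RTMP ingest OBS publishes to
        "-c:v", "copy",                        # repackage H.264 video without re-encoding
        "-c:a", "aac",                         # make sure audio is HLS-friendly
        "-f", "hls",
        "-hls_time", "2",                      # 2-second segments
        "-hls_list_size", "5",                 # rolling playlist of 5 segments
        "-hls_flags", "delete_segments",       # drop old segments (live, not VOD)
        "public/stream.m3u8",                  # playlist any static HTTP server can serve
    ]
    subprocess.run(cmd, check=True)

Players on iOS can then consume public/stream.m3u8 natively, and other browsers can play it via hls.js.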

How to send OpenCV output to the browser with Python?

房东的猫 submitted on 2020-05-26 08:06:50

Question: I have a simple Python script with OpenCV which takes in a video and does object detection on it using YOLO. My question is: how can I display the output on my website as a live stream? Here is the Python code, saving to output.avi:

    import cv2
    from darkflow.net.build import TFNet
    import numpy as np
    import time
    import pafy

    options = {
        'model': 'cfg/tiny-yolo.cfg',
        'load': 'bin/yolov2-tiny.weights',
        'threshold': 0.2,
        'gpu': 0.75
    }
    tfnet = TFNet(options)
    colors = [tuple(255 * np.random.rand(3)
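
One widely used answer here is to serve the processed frames as an MJPEG stream over HTTP instead of writing to output.avi. The sketch below is not the asker's YOLO pipeline: it is a minimal Flask example (Flask, the /video route and the capture source are assumptions) that JPEG-encodes each OpenCV frame and pushes it with multipart/x-mixed-replace, which browsers render as live video inside an <img> tag.

    import cv2
    from flask import Flask, Response

    app = Flask(__name__)
    cap = cv2.VideoCapture(0)  # replace with the video source fed to the detector

    def frames():
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # run detection and draw boxes on `frame` here, then JPEG-encode it
            ok, jpeg = cv2.imencode(".jpg", frame)
            if not ok:
                continue
            yield (b"--frame\r\n"
                   b"Content-Type: image/jpeg\r\n\r\n" + jpeg.tobytes() + b"\r\n")

    @app.route("/video")
    def video():
        # multipart/x-mixed-replace makes the browser keep replacing the image
        return Response(frames(),
                        mimetype="multipart/x-mixed-replace; boundary=frame")

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=5000)

On the page, <img src="/video"> is enough to display the stream; the detection step would replace the comment in frames().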

How to find if a YouTube channel is currently live streaming without using search?

故事扮演 submitted on 2020-05-24 05:04:08

Question: I'm working on a website that loads the live streams of multiple YouTube channels. At first I was trying to figure out a way to do this without using YouTube's API, but I have decided to give in. To find whether a channel is live streaming, and to get the live stream links, I've been using: https://www.googleapis.com/youtube/v3/search?part=snippet&channelId={CHANNEL_ID}&eventType=live&maxResults=10&type=video&key={API_KEY} However, with the minimum quota being 10,000 and each search being worth 100, I'm
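
The question is cut off at the quota arithmetic, but it is worth spelling out: with a daily quota of 10,000 units and search.list costing 100 units per call, the endpoint above allows only about 100 liveness checks per day. For reference, a minimal Python sketch of that same call (API_KEY and CHANNEL_ID are placeholders):

    import requests

    API_KEY = "YOUR_API_KEY"   # placeholder
    CHANNEL_ID = "UC..."       # placeholder channel ID

    # The search.list call from the question; each invocation costs 100 quota
    # units, so 10,000 units/day allows roughly 100 checks per day.
    resp = requests.get(
        "https://www.googleapis.com/youtube/v3/search",
        params={
            "part": "snippet",
            "channelId": CHANNEL_ID,
            "eventType": "live",
            "type": "video",
            "maxResults": 10,
            "key": API_KEY,
        },
    )
    items = resp.json().get("items", [])
    print("live now" if items else "not live")

By contrast, videos.list costs 1 unit per call, so once a live video ID is known, polling its status is far cheaper than repeating the search.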

How to livestream a webcam to YouTube with FFmpeg?

廉价感情. submitted on 2020-05-16 21:42:08

Question: I want to send the live stream of my webcam to YouTube. I can follow YouTube's guide up to step 8. "Stream Connection" tells me there is "No data" and the "Go Live" button remains unclickable. A screenshot of this situation can be seen at As encoding software I was planning on using FFmpeg, because it can run on the target platform, a Raspberry Pi with Raspbian. A USB webcam supported by video4linux2 is used. FFmpeg's wiki shows that streaming a file can be done with the following: ffmpeg
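
A frequently cited invocation for exactly this setup sends the video4linux2 webcam to YouTube's RTMP ingest. Below is a hedged Python sketch of it (device path, resolution and bitrates are assumptions to adapt; the stream key comes from YouTube Studio). One common cause of the "No data" status is a stream with no audio track at all, which is why a silent anullsrc input is included:

    import subprocess

    STREAM_KEY = "xxxx-xxxx-xxxx-xxxx"  # placeholder; copy from YouTube Studio

    cmd = [
        "ffmpeg",
        "-f", "v4l2", "-framerate", "30", "-video_size", "640x480",
        "-i", "/dev/video0",                       # the USB webcam
        # YouTube expects an audio track; anullsrc supplies silence
        "-f", "lavfi", "-i", "anullsrc=channel_layout=stereo:sample_rate=44100",
        "-c:v", "libx264", "-preset", "veryfast", "-pix_fmt", "yuv420p",
        "-g", "60",                                # keyframe every 2 s at 30 fps
        "-b:v", "2500k",
        "-c:a", "aac", "-b:a", "128k",
        "-f", "flv", "rtmp://a.rtmp.youtube.com/live2/" + STREAM_KEY,
    ]
    subprocess.run(cmd, check=True)

Once FFmpeg is sending, the "Stream Connection" indicator should switch away from "No data" within a few seconds.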

Live streaming: node-media-server + Dash.js configured for real-time low latency

不羁的心 submitted on 2020-04-10 03:33:06

Question: We're working on an app that enables live monitoring of your backyard. Each client has a camera connected to the internet, streaming to our public Node.js server. I'm trying to use node-media-server to publish an MPEG-DASH (or HLS) stream to be available to our app's clients, on different networks, bandwidths and resolutions around the world. Our goal is to get as close as possible to live "real time", so you can monitor what happens in your backyard instantly. The technical flow already
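
The flow description is cut off, but the segmenting half of such a pipeline can be sketched. One option (an assumption, not necessarily what node-media-server does internally) is to point FFmpeg at the RTMP ingest and produce low-latency MPEG-DASH with short, incrementally written segments; dash.js then plays the resulting .mpd with its low-latency settings enabled. The ingest URL and output path are placeholders:

    import subprocess

    cmd = [
        "ffmpeg",
        "-i", "rtmp://localhost/live/yard",  # node-media-server's RTMP ingest (placeholder)
        "-c:v", "copy", "-c:a", "aac",
        "-f", "dash",
        "-seg_duration", "1",                # short segments cut end-to-end delay
        "-streaming", "1",                   # write each segment incrementally
        "-use_template", "1", "-use_timeline", "0",
        "-window_size", "5",                 # keep a rolling window of 5 segments
        "-remove_at_exit", "1",
        "public/yard.mpd",
    ]
    subprocess.run(cmd, check=True)

The shorter the segments, the closer the player can sit to the live edge, at the cost of more requests and lower encoding efficiency.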

Sending periodic metadata in fragmented live MP4 stream?

独自空忆成欢 submitted on 2020-03-13 18:31:22

Question: As suggested by the topic, I'm wondering if it's possible to send metadata about the stream contents periodically in a fragmented MP4 live stream. I'm using the following command (1) to get fragmented MP4:

    ffmpeg -i rtsp://admin:12345@192.168.0.157 -c:v copy -an -movflags empty_moov+omit_tfhd_offset+frag_keyframe+default_base_moof -f mp4 ...

My main program reads the fragments from this command from either stdout or from a (Unix domain) socket and gets: ftyp moov moof mdat moof mdat moof mdat
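
Whatever metadata scheme ends up being used, the first step is splitting that byte stream on box boundaries: every top-level box starts with a 4-byte big-endian size followed by a 4-byte type. A hedged Python sketch of a reader for the exact ffmpeg command quoted above (it stops on the rare size < 8 cases, i.e. 64-bit "largesize" and to-end-of-file boxes, rather than handling them):

    import struct
    import subprocess

    # The ffmpeg command from the question, writing fragmented MP4 to stdout.
    proc = subprocess.Popen(
        ["ffmpeg", "-i", "rtsp://admin:12345@192.168.0.157",
         "-c:v", "copy", "-an",
         "-movflags", "empty_moov+omit_tfhd_offset+frag_keyframe+default_base_moof",
         "-f", "mp4", "pipe:1"],
        stdout=subprocess.PIPE,
    )

    def read_exact(stream, n):
        buf = b""
        while len(buf) < n:
            chunk = stream.read(n - len(buf))
            if not chunk:
                return None
            buf += chunk
        return buf

    while True:
        header = read_exact(proc.stdout, 8)
        if header is None:
            break
        size, box_type = struct.unpack(">I4s", header)  # 32-bit size, 4-char type
        if size < 8:
            break  # size==1 (largesize) or size==0 (to EOF): not handled here
        payload = read_exact(proc.stdout, size - 8)
        if payload is None:
            break
        print(box_type.decode("ascii", "replace"), size)  # ftyp, moov, moof, mdat, ...

Once the stream is split this way, one conservative design is to deliver the periodic metadata between fragments on a separate channel, keyed to the fragment sequence, without touching the MP4 boxes themselves.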

Live audio streaming container formats

纵然是瞬间 submitted on 2020-02-20 06:11:07

Question: When I start receiving a live audio (radio) stream (e.g. MP3 or AAC), I think the received data is not a raw bitstream (i.e. raw encoder output), but is always wrapped in some container format. If this assumption is correct, then I guess I cannot start streaming from an arbitrary place in the stream, but have to wait for some sync byte. Is that right? Is it usual to have sync bytes? Is there any header following the sync byte from which I can guess the codec used, the number
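
For AAC the answer is broadly yes when the stream is wrapped in ADTS: each frame begins with a 12-bit 0xFFF syncword, and the fixed header right after it encodes the profile, the sampling-frequency index and the channel configuration. Below is a hedged Python sketch of scanning a buffered chunk for such a header; in practice one should also check that a second valid header appears where the frame length predicts, to rule out a false sync on random 0xFF bytes.

    # ADTS sampling_frequency_index -> sample rate (ISO/IEC 13818-7 table)
    SAMPLE_RATES = [96000, 88200, 64000, 48000, 44100, 32000,
                    24000, 22050, 16000, 12000, 11025, 8000, 7350]

    def find_adts_header(buf: bytes):
        """Return (offset, profile, sample_rate, channels) of the first
        plausible ADTS header in buf, or None."""
        for i in range(len(buf) - 7):          # ADTS header is at least 7 bytes
            b0, b1, b2, b3 = buf[i], buf[i + 1], buf[i + 2], buf[i + 3]
            if b0 != 0xFF or (b1 & 0xF0) != 0xF0:
                continue                        # no 12-bit syncword here
            if (b1 & 0x06) != 0:
                continue                        # layer bits must be 00 for ADTS
            profile = ((b2 >> 6) & 0x03) + 1    # MPEG-4 audio object type
            sr_index = (b2 >> 2) & 0x0F
            channels = ((b2 & 0x01) << 2) | (b3 >> 6)
            if sr_index < len(SAMPLE_RATES):
                return i, profile, SAMPLE_RATES[sr_index], channels
        return None

MP3 streams behave similarly: frames start with an 11-bit all-ones sync pattern, and the rest of the 4-byte header carries the bitrate, sample rate and channel mode, so a receiver can likewise lock on mid-stream.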