live-streaming

Sending live video stream to wowza streaming engine with Android devices

我是研究僧i Submitted on 2019-12-12 12:22:52

Question: I want to send a live video stream from my Android device to Wowza Streaming Engine. I am using the sample in this blog, but I cannot see the result on the Test Players page. Do I need to have a web server serving a page with a video player pointed at this video/app on Wowza? Answer 1: I found this little (but very useful) library with three examples: libstreaming. It works like a charm! Easy to install and develop with. Answer 2: The main point is to look at the Wowza logs to check whether the stream was successfully published.

Live streaming audio from iPhone

♀尐吖头ヾ Submitted on 2019-12-12 09:06:53

Question: I would like to stream audio from my iPhone to a remote server, but I don't really know what my best bet is. I tried some code here for sending small chunks, but I get audio gaps between chunks. So I am thinking about FFmpeg or, as suggested here, writing my own AAC parser. Any code samples or advice would be appreciated, because I am having a hard time getting started. Answer 1: Another Core Audio based audio player: https://github.com/douban/DOUAudioStreamer. Just see the examples for usage. In my opinion,

cross platform sound API [closed]

廉价感情. Submitted on 2019-12-12 08:30:44

Question: Closed. This question is off-topic. It is not currently accepting answers. Want to improve this question? Update the question so it's on-topic for Stack Overflow. Closed 4 years ago. I'm looking into developing an application that will require live streaming of audio. I would prefer to use a cross-platform (Windows/Linux/BSD) open-source library written in C or C++, even though writing it against the respective OSs' sound APIs is still an option. I have read a bit about various sound

How can I implement YouTube LiveStream player in iOS and android?

两盒软妹~` Submitted on 2019-12-12 04:59:39

Question: I want to add a simple livestream to my iOS and Android apps. I can use youtube.com to record the livestream and broadcast it as a regular video in my apps. I don't want to integrate heavy SDKs or build a whole platform to do so. What is the best choice? I have come across kickflip.io, livestreamsdk.com, ustream.tv, etc. With the first two, you have to set up the whole thing yourself, while I was thinking I could simply embed something like a YouTube player that shows the broadcast. Answer 1: The YouTube Live

ffmpeg livestream from static image and audio

走远了吗. Submitted on 2019-12-12 04:49:32

Question: I'm trying to livestream with ffmpeg using a static image and an audio file. The ffmpeg command looks like this: ffmpeg -re -loop 1 -f image2 -i '/tmp/11.jpg' -f lavfi -i amovie=/tmp/5117.mp3:loop=999 -video_size 600x480 -c:v libx264 -x264-params keyint=60 -bufsize 500k -c:a aac -ar 44100 -b:a 128k -r 30 -g 60 -pix_fmt yuv420p -f flv "rtmp://" /tmp/11.jpg is generated by another process and kept updated twice per second. The ffmpeg command doesn't look right; first, it shows status like this: frame= 85 fps
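A cleaned-up version of the command above might look like the following. This is only a sketch: the RTMP destination was elided in the original post and is left as "rtmp://", the -framerate 2 input rate is an assumption matching the image being rewritten twice per second, and -s replaces -video_size, which is an input option for capture devices rather than an output scaling option.

```shell
# Hedged rework of the original command (the rtmp:// URL was elided in
# the post and is left as-is). With the image2 demuxer, -loop 1 makes
# ffmpeg re-read /tmp/11.jpg each cycle, so on-disk updates appear in
# the stream.
ffmpeg -re -loop 1 -framerate 2 -f image2 -i /tmp/11.jpg \
       -f lavfi -i "amovie=/tmp/5117.mp3:loop=999" \
       -c:v libx264 -x264-params keyint=60 -s 600x480 \
       -r 30 -g 60 -pix_fmt yuv420p -bufsize 500k \
       -c:a aac -ar 44100 -b:a 128k \
       -f flv "rtmp://"
```

The main fixes are quoting the lavfi filter argument and keeping every output option between the last input and the -f flv output. If frames still stall, the next thing to rule out is a race while the image file is being replaced mid-read by the other process.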

Android Vitamio 5.0.0 crash

丶灬走出姿态 Submitted on 2019-12-12 04:17:58

Question: I can't get Vitamio to work. Everything seems right, but the application stops and I receive this error. What did I do wrong? Exactly the same code with Vitamio 4.2 works perfectly. I guess I did something wrong. > import android.support.v7.app.AppCompatActivity; > import android.os.Bundle; > import io.vov.vitamio.MediaPlayer; > import io.vov.vitamio.Vitamio; > import io.vov.vitamio.widget.MediaController; > import io.vov.vitamio.widget.VideoView; > > > public class Rtmp_player extends

GStreamer AAC audio stream delay in iOS

我怕爱的太早我们不能终老 Submitted on 2019-12-12 03:24:30

Question: I'm playing an AAC audio stream on my iOS device using the GStreamer SDK. It's working fine, but the delay is above 2.0 seconds. Can I make this delay lower than 2.0 seconds? There may be some buffering issue. This is how I'm creating the pipeline: pipeline = gst_parse_launch("playbin2", &error); Answer 1: Try setting the latency like this: g_object_set(G_OBJECT(pipeline.source), "latency", 250, NULL); Source: https://stackoverflow.com/questions/32865653/gstreamer-aac-audio-stream-delay-in-ios

iOS RTP live audio receiving

坚强是说给别人听的谎言 Submitted on 2019-12-12 03:23:25

Question: I'm trying to receive a live RTP audio stream on my iPhone, but I don't know how to start. I'm looking for samples but can't find any. I have a Windows desktop app which captures audio from the selected audio interface and streams it as µ-law or A-law. This app works as an audio server that serves any incoming connection with that stream. I should say that I've developed an Android app that receives the stream and works, so I want to replicate this functionality on iOS. In
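Since the desktop server sends µ-law (G.711), the receiving client's core task after depacketizing RTP is expanding each byte to linear PCM before playback. A minimal sketch of the standard G.711 µ-law expansion follows; the function name is mine, and it is shown in JavaScript only so the logic is easy to read (the same table-free math ports directly to Swift or Objective-C):

```javascript
// Standard G.711 µ-law -> 16-bit linear PCM expansion, one byte per
// sample. µ-law bytes are stored bit-inverted on the wire.
function ulawToPcm16(byte) {
  const u = ~byte & 0xff;          // undo the on-the-wire inversion
  const sign = u & 0x80;           // top bit: sign
  const exponent = (u >> 4) & 0x07; // next 3 bits: segment (exponent)
  const mantissa = u & 0x0f;       // low 4 bits: step within segment
  // Rebuild the magnitude, then remove the 0x84 encoding bias.
  const sample = (((mantissa << 3) + 0x84) << exponent) - 0x84;
  return sign ? -sample : sample;
}
```

Feeding the resulting 16-bit samples (at the stream's 8 kHz rate) into an Audio Queue or AVAudioEngine buffer is then the platform-specific part.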

Streaming Live audio to the browser - Alternatives to the Web Audio API?

心已入冬 Submitted on 2019-12-12 02:09:08

Question: I am attempting to stream live audio from an iOS device to a web browser. The iOS device sends small, mono WAV files (as they are recorded) through a WebSocket. Once the client receives the WAV files, I have the Web Audio API decode and schedule them accordingly. This gets me about 99% of the way there, except I can hear clicks between audio chunks. After some reading around, I have realized the likely source of my problem: the audio is being recorded at a sample rate of only 4 kHz and this
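Clicks between chunks usually come from starting each buffer at "now" instead of on a shared clock. A minimal sketch of gapless scheduling, with names of my own choosing: keep a running nextTime on the AudioContext clock and start every decoded buffer exactly where the previous one ends.

```javascript
// Pure helper: given the running clock, the context's currentTime and a
// chunk's duration, compute when to start the chunk and the new clock.
function scheduleTimes(nextTime, currentTime, duration) {
  // If the clock has fallen behind real time (e.g. a network stall),
  // resync slightly in the future rather than starting in the past.
  const start = nextTime > currentTime ? nextTime : currentTime + 0.05;
  return { start, nextTime: start + duration };
}

// In the browser it would be used roughly like this (sketch):
//   const src = ctx.createBufferSource();
//   src.buffer = decodedChunk;
//   src.connect(ctx.destination);
//   const t = scheduleTimes(nextTime, ctx.currentTime, decodedChunk.duration);
//   src.start(t.start);
//   nextTime = t.nextTime;
```

Even with exact scheduling, a 4 kHz source resampled up by the browser can still produce edge artifacts at chunk boundaries, so a short crossfade or a higher capture rate may also be needed.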

How do I get the output from an ASIO device to IceCast2 or FFMpeg?

泪湿孤枕 Submitted on 2019-12-11 23:04:56

Question: I have an ASIO device (PreSonus FireStudio 2626). I am using it to mix and create different outputs on all of its provided outputs (about 9 outputs, like ADT1, ADT2). I need some way to stream these outputs using either Icecast or FFmpeg RTP. One of the problems is that I am restricted to using a Mac or a Windows machine, as my ASIO device does not provide drivers for Ubuntu. What are the ways I can connect the ASIO device outputs to Icecast or FFmpeg? I've tried the following.