audio-streaming

How to stream audio from one Android device to another Android device via Bluetooth?

帅比萌擦擦* submitted on 2019-11-28 16:24:19
Is it possible to stream audio over Bluetooth? During my research I found that it is only possible using A2DP (Advanced Audio Distribution Profile). Does every Android device support A2DP? If not, is it possible to stream audio between two Android devices using Bluetooth? Please help me understand this. I've looked through the following links: Receive audio via Bluetooth in Android, Google confirms bluetooth audio streaming fix for next version of Android 4.2, How can I stream audio from another device via Bluetooth?, Is it possible to stream audio over Bluetooth? The thread below sort of says

What is the best way to stream an audio file to website users/listeners [closed]

巧了我就是萌 submitted on 2019-11-28 16:15:45
I'm developing a music site that will stream audio files stored on a server to users; the audio files will be played through a Flash player embedded in a web page. From what I've heard, I need to use a streaming media server to stream audio files (roughly 2 MB to 3 MB in size). Do I need to use one? I found some streaming media server software such as http://www.icecast.org - but according to its documentation it is intended for streaming radio stations and live streaming, whereas I just need to stream audio files quickly and at low bandwidth with good quality. I heard I need to encode the audio files
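The excerpt cuts off before any answer, but as a point of reference, files of 2-3 MB can be delivered by a plain HTTP server as a progressive download, with no dedicated streaming server; Icecast is aimed at live radio. A minimal sketch using only the Python standard library (the port and directory layout are illustrative, not from the question):

    import http.server
    import socketserver

    PORT = 8000  # illustrative port; run this in the directory holding the MP3 files

    class AudioHandler(http.server.SimpleHTTPRequestHandler):
        # Make sure .mp3 files are served with an audio MIME type so a web
        # player treats the response as audio rather than a download.
        extensions_map = {**http.server.SimpleHTTPRequestHandler.extensions_map,
                          ".mp3": "audio/mpeg"}

    with socketserver.TCPServer(("", PORT), AudioHandler) as httpd:
        # Clients start playing as soon as enough data has arrived
        # (progressive download); seeking would additionally need Range support.
        httpd.serve_forever()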

AVPlayer “freezes” the app at the start of buffering an audio stream

梦想与她 submitted on 2019-11-28 15:47:52
I am using a subclass of AVQueuePlayer, and when I add a new AVPlayerItem with a streaming URL the app freezes for about a second or two. By freezing I mean that it doesn't respond to touches on the UI. Also, if I already have a song playing and then add another one to the queue, AVQueuePlayer automatically starts preloading the new song while it is still streaming the first one. This makes the app unresponsive to touches on the UI for two seconds, just like when adding the first song, even though the song keeps playing. So that means AVQueuePlayer is doing something on the main thread that is causing the

Streaming audio from a Node.js server to an HTML5 <audio> tag

孤街浪徒 submitted on 2019-11-28 15:05:11
I've been experimenting with binary streams in Node.js, and much to my amazement I actually have a working demo of taking a SHOUTcast stream using node-radio-stream and pushing it into an HTML5 <audio> element using chunked encoding. But it only works in Safari! Here is my server code: var radio = require("radio-stream"); var http = require('http'); var url = "http://67.205.85.183:7714"; var stream = radio.createReadStream(url); var clients = []; stream.on("connect", function() { console.error("Radio Stream connected!"); console.error(stream.headers); }); // When a chunk of data is received on the
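For comparison, here is the same relay idea sketched in Python with Flask and requests rather than Node.js. The /radio route and port are made up; the upstream URL is the one from the question, the stream is assumed to be MP3 (hence audio/mpeg), and the upstream server is assumed to answer with ordinary HTTP rather than an old-style ICY status line:

    import requests
    from flask import Flask, Response, stream_with_context

    app = Flask(__name__)
    UPSTREAM = "http://67.205.85.183:7714"  # stream URL from the question

    @app.route("/radio")
    def radio():
        # Open the upstream radio stream and forward it chunk by chunk;
        # with no Content-Length set, Flask sends a chunked response.
        upstream = requests.get(UPSTREAM, stream=True, timeout=10)

        def generate():
            for chunk in upstream.iter_content(chunk_size=4096):
                if chunk:
                    yield chunk

        return Response(stream_with_context(generate()), mimetype="audio/mpeg")

    if __name__ == "__main__":
        app.run(port=8000, threaded=True)

An <audio> tag pointed at /radio would then receive the relayed chunks, which is the same behaviour the Node.js code above implements.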

Creating an RTSP client for live audio and video broadcasting in Objective-C

生来就可爱ヽ(ⅴ<●) submitted on 2019-11-28 11:40:29
I am trying to create an RTSP client which live-broadcasts audio and video. I modified the iOS code from http://www.gdcl.co.uk/downloads.htm and am able to broadcast the video to the server properly. But now I am facing issues with broadcasting the audio part. In the linked example the code is written in such a way that it writes the video data to a file, then reads the data back from the file and uploads the video NALU packets to the RTSP server. For the audio part I am not sure how to proceed. Right now what I have tried is to get the audio buffer from the mic and then broadcast it to the server directly by

Android: playing a PCM byte array converted from a Base64 string sounds slow

こ雲淡風輕ζ submitted on 2019-11-28 10:21:10
Question: As the very long title suggests, I'm having trouble playing back audio that I send over the network through PubNub. What I do is send the audio while recording from AudioRecord using this code: AudioConfig audioConfig = getValidSampleRates(AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT); buffer = new byte[audioConfig.getBufferSize()]; recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, audioConfig.getSampleSize(), AudioFormat.CHANNEL_IN_MONO, AUDIO_FORMAT,
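The excerpt stops before any answer, but the classic cause of PCM that plays back "slow" is a sample-rate mismatch between the recorder and the player. A small standard-library sketch of the effect (the tone, rates and file names are purely illustrative, not taken from the question):

    import math
    import struct
    import wave

    RECORD_RATE = 44100          # rate the samples are actually generated at
    samples = [int(32767 * math.sin(2 * math.pi * 440 * n / RECORD_RATE))
               for n in range(RECORD_RATE)]      # 1 second of a 440 Hz tone
    pcm = struct.pack("<%dh" % len(samples), *samples)

    for playback_rate in (44100, 22050):         # correct rate vs. half rate
        with wave.open("tone_%d.wav" % playback_rate, "wb") as w:
            w.setnchannels(1)                    # mono
            w.setsampwidth(2)                    # 16-bit PCM
            w.setframerate(playback_rate)        # rate the player will use
            w.writeframes(pcm)
    # tone_22050.wav holds identical samples but lasts 2 s and sounds an
    # octave lower -- the "slow" effect described in the question title.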

Playing remote audio files in Python? [closed]

我怕爱的太早我们不能终老 submitted on 2019-11-28 08:45:04
I'm looking for a solution to easily play remote .mp3 files. I have looked at the pyglet module, which works on local files, but it seems it can't handle remote files. I could temporarily download the .mp3 file, but that's not recommended given how large the .mp3 files could be. I'd also rather have it be cross-platform instead of Windows-only, etc. For example, playing an audio file from: http://example.com/sound.mp3 Just stream the file as it downloads; my idea is an MP3 player in Python which opens SoundCloud songs. You can use GStreamer with Python bindings (requires PyGTK). Then you can
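The answer is cut off above; below is a minimal sketch of the GStreamer suggestion using the current PyGObject bindings (gi) rather than the older PyGTK ones it mentions, with the example URL from the question:

    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst

    Gst.init(None)

    # playbin handles HTTP fetching, MP3 decoding and audio output in one element,
    # so the remote file plays while it downloads instead of being saved first.
    player = Gst.ElementFactory.make("playbin", "player")
    player.set_property("uri", "http://example.com/sound.mp3")
    player.set_state(Gst.State.PLAYING)

    # Block until the stream ends or an error is reported, then clean up.
    bus = player.get_bus()
    bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                           Gst.MessageType.EOS | Gst.MessageType.ERROR)
    player.set_state(Gst.State.NULL)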

How can I play an audio stream without saving it to a file with pyglet?

浪尽此生 submitted on 2019-11-28 05:54:26
Question: I have these libraries: requests, pyglet, pyaudio. How can I play an audio stream with them, for example from this site, without saving it to a file (i.e. using buffering)? There is some confusing information in this library's documentation about the StreamingSource class. When I push the bytes into a StreamingSource object (source.get_audio_data(DATA)) and after that push it into a Player (pyglet.media.Player()), it throws an exception that says that the StreamingSource hasn
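The linked site is not shown here, so as an illustration only, here is a sketch that plays a stream with two of the listed libraries (requests and pyaudio) without writing it to disk. It assumes the hypothetical URL delivers raw 16-bit mono PCM at 44.1 kHz; a compressed stream such as MP3 would first need a decoder, which is what pyglet's source/StreamingSource machinery provides:

    import pyaudio
    import requests

    STREAM_URL = "http://example.com/live.pcm"  # hypothetical raw-PCM stream

    pa = pyaudio.PyAudio()
    out = pa.open(format=pyaudio.paInt16, channels=1, rate=44100, output=True)

    resp = requests.get(STREAM_URL, stream=True)
    try:
        # Play each network chunk as it arrives instead of saving the stream.
        for chunk in resp.iter_content(chunk_size=4096):
            if chunk:
                out.write(chunk)
    finally:
        out.stop_stream()
        out.close()
        pa.terminate()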

How could I play a shoutcast/icecast stream using HTML5?

微笑、不失礼 submitted on 2019-11-28 03:29:29
Is it possible to play a SHOUTcast/Icecast stream using HTML5? If so, how should I implement it? Add a semicolon to the end of the HTTP request. It IS the convention set forth by SHOUTcast to override its browser detection. Like this: <audio controls src="http://shoutcast.internet-radio.org.uk:10272/;"></audio> There is a big problem with SHOUTcast, which I suspect is responsible for it not working even in Chrome, which is supposed to support MP3. SHOUTcast can serve three different types of response: a native-SHOUTcast “ICY” protocol streaming audio response. It decides to do this if the player

iOS AVPlayer: trigger when the stream runs out of buffer

三世轮回 submitted on 2019-11-28 03:24:42
I want to reconnect to the server when the streaming buffer is empty. How can I trigger a method when the AVPlayer or AVPlayerItem buffer is empty? I know there are playbackLikelyToKeepUp, playbackBufferEmpty and playbackBufferFull properties to check the buffer status, but those are not callbacks. Are there any callback functions, or any observers I should add? You can add observers for those keys: [playerItem addObserver:self forKeyPath:@"playbackBufferEmpty" options:NSKeyValueObservingOptionNew context:nil]; [playerItem addObserver:self forKeyPath:@"playbackLikelyToKeepUp" options