audio-streaming

How to convert a getUserMedia audio stream into a blob or buffer?

落爺英雄遲暮 submitted on 2019-12-03 17:18:46
I am getting an audio stream from getUserMedia and then want to convert it into a blob or buffer and send it to the server as the audio comes in. I am using socket.io to emit it to the server. How can I convert the audio MediaStream into a buffer? The following is the code I have written so far:

    navigator.getUserMedia({audio: true, video: false}, function (stream) {
        webcamstream = stream;
        var media = stream.getAudioTracks();
        socket.emit("sendaudio", media);
    }, function (e) {
        console.log(e);
    });

How do I convert the stream into a buffer and emit it to the Node.js server as it comes from the getUserMedia function? Per @MuazKhan's comment,
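One common approach (a sketch, not the asker's final solution) is to tap the stream with the Web Audio API and convert each block of Float32 samples into 16-bit PCM before emitting it. The `socket` variable and the ScriptProcessor wiring below are assumptions; only the sample conversion is shown as runnable code.

```javascript
// Convert Float32 samples (range -1..1) from the Web Audio API into
// 16-bit signed PCM, which is easy to ship as a binary socket payload.
function floatTo16BitPCM(float32Samples) {
  var out = new Int16Array(float32Samples.length);
  for (var i = 0; i < float32Samples.length; i++) {
    var s = Math.max(-1, Math.min(1, float32Samples[i])); // clamp
    out[i] = s < 0 ? s * 0x8000 : s * 0x7fff;
  }
  return out;
}

// Browser-side wiring (sketch; assumes a connected socket.io `socket`):
// navigator.mediaDevices.getUserMedia({audio: true}).then(function (stream) {
//   var ctx = new AudioContext();
//   var source = ctx.createMediaStreamSource(stream);
//   var processor = ctx.createScriptProcessor(4096, 1, 1);
//   source.connect(processor);
//   processor.connect(ctx.destination);
//   processor.onaudioprocess = function (e) {
//     var pcm = floatTo16BitPCM(e.inputBuffer.getChannelData(0));
//     socket.emit('sendaudio', pcm.buffer); // ArrayBuffer goes over the wire
//   };
// });
```

Emitting the track objects themselves (as in the question) cannot work, because a MediaStreamTrack is a live handle, not serializable data.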

Transcoding and streaming audio - how to send content-range headers

痴心易碎 submitted on 2019-12-03 16:49:18
Quick version: how do I send correct Content-Range headers when I don't know the body length? I have a FLAC file. I want to transcode it to MP3 and stream it to the user immediately. I have something like this so far:

    function transcode(file) {
        var spawn = require('child_process').spawn
        var decode = spawn('flac', ['--decode', '--stdout', file])
        var encode = spawn('lame', ['-V0', '-', '-'])
        decode.stdout.pipe(encode.stdin)
        return encode
    }

    var express = require('express')
    var app = express()

    app.get('/somefile.mp3', function (req, res) {
        res.setHeader('Accept-Ranges', 'bytes')
        res.setHeader(

WebRTC Play Audio Input as Microphone

≡放荡痞女 submitted on 2019-12-03 15:50:16
I want to play my audio file as microphone input (sending my audio file rather than my live voice) to the connected WebRTC user. Can anybody tell me how this could be done? I have tried a few things in the JS code, like:

1. base64 Audio

    <script>
    var base64string = "T2dnUwACAAAAAAA..";
    var snd = new Audio("data:audio/wav;base64," + base64string);
    snd.play();

    var Sound = (function () {
        var df = document.createDocumentFragment();
        return function Sound(src) {
            var snd = new Audio(src);
            df.appendChild(snd);
            snd.addEventListener('ended', function () { df.removeChild(snd); });
            snd.play();
            return
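Playing the file locally (as the excerpt does) is not enough; the audio has to become a MediaStream that can be attached to the peer connection in place of the microphone track. A common pattern for this is `createMediaElementSource` plus `createMediaStreamDestination`. The sketch below assumes an existing `RTCPeerConnection` named `pc`; only the data-URI helper is runnable outside a browser.

```javascript
// Helper: wrap a base64 WAV payload as a data: URI for an <audio> element.
function wavDataUri(base64string) {
  return 'data:audio/wav;base64,' + base64string;
}

// Sketch: route the file through Web Audio and hand the resulting
// MediaStream to the peer connection instead of the microphone.
// (Browser-only; assumes an existing RTCPeerConnection `pc`.)
// var ctx = new AudioContext();
// var audioEl = new Audio(wavDataUri(base64string));
// var source = ctx.createMediaElementSource(audioEl);
// var dest = ctx.createMediaStreamDestination();
// source.connect(dest);
// audioEl.play();
// dest.stream.getAudioTracks().forEach(function (track) {
//   pc.addTrack(track, dest.stream);
// });
```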

How to stream media (audio/video) across multiple Android devices over Wi-Fi or a Wi-Fi hotspot?

非 Y 不嫁゛ submitted on 2019-12-03 13:55:37
Question: I am currently working on a project where, when you play a song, it automatically plays on another Android device connected through Wi-Fi or a Wi-Fi mobile hotspot. How can I stream audio from an Android device to a VLC player? I got some help from [Here's a link!] and have gone through many links, but none of them helped me. I want to build functionality like SoundSeeder. If someone has a solution, please help me. Thanks..!!
Answer 1: For Video Streaming Using Wi-Fi

Capture system audio output with Nodejs

不打扰是莪最后的温柔 submitted on 2019-12-03 13:27:37
Question: Is there a way in JavaScript, or a Node.js module, that I can use to capture the audio output of a system (Windows/OS X)? For example, if a user is playing something via iTunes/MPlayer (any music player), can I capture the audio stream that is going to the speakers (the output) and send it over the web?
Answer 1: This might go some way to doing what you want: https://www.npmjs.com/package/node-core-audio
Answer 2: I'm about to start some dev work on a similar project but was having issues getting node-core

Play mp3 file while downloading?

廉价感情. submitted on 2019-12-03 13:21:15
I'd like my Android application to download an MP3 file from the internet and play it like a stream while downloading it. Is this even possible? How would I go about doing it? Essentially I want the user to be able to listen to the file instantly, but have it keep downloading to the SD card even if he stops listening, so the whole MP3 file will end up on the SD card either way.
I don't believe Android provides the functionality you're asking for, but there's one workaround I know of that might work. http://code.google.com/p/android/issues/detail?id=739 is an open ticket with a lot of

understanding getByteTimeDomainData and getByteFrequencyData in web audio

狂风中的少年 submitted on 2019-12-03 11:53:31
Question: The documentation for both of these methods is very generic everywhere I look. I would like to know what exactly I'm looking at in the arrays returned by each method. For getByteTimeDomainData, what time period is covered by each pass? I believe most oscilloscopes cover a 32-millisecond span per pass. Is that what is covered here as well? For the actual element values themselves, the range seems to be 0-255. Is this equivalent to -1 to +1 volts? For getByteFrequencyData
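The two sub-questions in the excerpt have concrete answers: each `getByteTimeDomainData()` call fills `fftSize` samples, so the window it covers is `fftSize / sampleRate` seconds (not a fixed 32 ms), and the 0-255 bytes are not volts but a linear mapping of the [-1, 1] sample range, with 128 representing zero.

```javascript
// Time window covered by one getByteTimeDomainData() pass, in ms.
function windowMs(fftSize, sampleRate) {
  return fftSize / sampleRate * 1000;
}

// Map a 0-255 byte from the analyser back to a [-1, 1] sample value;
// 128 is silence (zero), not a voltage.
function byteToSample(b) {
  return (b - 128) / 128;
}

// e.g. windowMs(2048, 44100) is about 46.4 ms -- the same order of
// magnitude as an oscilloscope sweep, but determined by fftSize.
```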

Sound card detection for web

匆匆过客 submitted on 2019-12-03 09:31:59
We need a heads-up for a hobby web project. At this stage we want to detect the client's sound card and direct whatever comes from it to the server to process the audio. Low latency is an important issue for us, so we need your suggestions for the language, library, etc. to use. If you can give us some information about the big picture, we can study on our own.
Capturing Audio Client-Side
You can use the Web Audio API along with getUserMedia (generally considered part of the WebRTC feature set) to capture audio (and video if you want it) from the user. Here is a code example from the
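For the low-latency transport side, one sketch (an assumption, not part of the answer above) is to send small PCM chunks over a WebSocket and prefix each with a sequence number so the server can detect drops and reorders; the framing helpers below are runnable, the `wss` wiring is illustrative only.

```javascript
// Prefix each PCM chunk with a 4-byte big-endian sequence number.
function frameChunk(seq, pcmBuffer) {
  var header = Buffer.alloc(4);
  header.writeUInt32BE(seq, 0);
  return Buffer.concat([header, pcmBuffer]);
}

// Split a received frame back into its sequence number and payload.
function parseFrame(frame) {
  return { seq: frame.readUInt32BE(0), pcm: frame.slice(4) };
}

// Server sketch (assumes a `ws` WebSocketServer instance named wss):
// wss.on('connection', function (sock) {
//   sock.on('message', function (data) {
//     var f = parseFrame(data);
//     // feed f.pcm into the processing pipeline in f.seq order
//   });
// });
```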

Android Record raw bytes into WAVE file for Http Streaming

柔情痞子 submitted on 2019-12-03 09:03:56
So I am using AudioRecord from Android to record raw bytes and to write them into a .wav file. Since Android has no built-in support for this, I had to write the .wav file headers manually with the following code:

    randomAccessWriter.writeBytes("RIFF");
    randomAccessWriter.writeInt(0); // Final file size not known yet, write 0
    randomAccessWriter.writeBytes("WAVE");
    randomAccessWriter.writeBytes("fmt ");
    randomAccessWriter.writeInt(Integer.reverseBytes(16)); // Sub-chunk size, 16 for PCM
    randomAccessWriter.writeShort(Short.reverseBytes((short) 1)); // AudioFormat, 1 for PCM
    randomAccessWriter.writeShort
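For reference, the same canonical 44-byte WAV header can be built in one place, which makes the field layout the Java code is writing piecemeal easier to check. This is a sketch in Node (little-endian writes replace Java's `reverseBytes` dance); as in the Java version, the two size fields can be patched once the final data length is known.

```javascript
// Build the 44-byte canonical WAV header for 16-bit PCM.
function wavHeader(dataLength, sampleRate, channels) {
  var h = Buffer.alloc(44);
  h.write('RIFF', 0);
  h.writeUInt32LE(36 + dataLength, 4);            // file size minus 8
  h.write('WAVE', 8);
  h.write('fmt ', 12);
  h.writeUInt32LE(16, 16);                        // fmt sub-chunk size, 16 for PCM
  h.writeUInt16LE(1, 20);                         // AudioFormat, 1 for PCM
  h.writeUInt16LE(channels, 22);
  h.writeUInt32LE(sampleRate, 24);
  h.writeUInt32LE(sampleRate * channels * 2, 28); // byte rate
  h.writeUInt16LE(channels * 2, 32);              // block align
  h.writeUInt16LE(16, 34);                        // bits per sample
  h.write('data', 36);
  h.writeUInt32LE(dataLength, 40);
  return h;
}
```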

Why does Icecast2 not serve the stream over HTTPS?

落花浮王杯 submitted on 2019-12-03 08:35:56
On a server running Ubuntu 14.04 LTS I installed Icecast2 2.4.1 with SSL support. An HTTPS website also runs on this server. I want to insert into the page an HTML5 player that will also take the stream through SSL (otherwise there is a mixed-content error). The site has a commercial SSL certificate; Icecast has a self-signed one. Icecast config file:

    <icecast>
        <location>****</location>
        <admin>admin@*************</admin>
        <limits>
            <clients>1000</clients>
            <sources>2</sources>
            <threadpool>5</threadpool>
            <queue-size>524288</queue-size>
            <source-timeout>10</source-timeout>
            <burst-on-connect>0</burst-on-connect>
            <burst-size
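For an HTTPS page to load the stream without mixed-content errors, the browser must also trust the stream's certificate, so a self-signed one will not do; the usual fix is to point Icecast at the site's commercial certificate. A sketch of the relevant `icecast.xml` fragments for Icecast 2.4.x follows (the port and path are example values; Icecast expects a single PEM file containing the certificate followed by the private key):

```xml
<!-- Sketch: an SSL-enabled listener in icecast.xml (example values). -->
<paths>
    <ssl-certificate>/etc/icecast2/icecast.pem</ssl-certificate>
</paths>
<listen-socket>
    <port>8443</port>
    <ssl>1</ssl>
</listen-socket>
```

An alternative, if reconfiguring Icecast is not an option, is to proxy the mount through the site's existing HTTPS web server so the stream shares the commercial certificate.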