audio-streaming

Audio Recording and Streaming in Android

泄露秘密 submitted on 2019-11-29 08:35:29
I am developing an Android app and want to accomplish the following: use the phone's built-in mic to record, and at the same time play the recorded audio through either the phone's speakers or headphones. Is that feasible? If yes, please help me with it.

Answer (Saurabh Meshram): Here is a simple recording and playback application. It uses Android's AudioRecord and AudioTrack.

Design: The recorded audio is written to a buffer and played back from the same buffer. This mechanism runs in a loop (on an Android thread) controlled by buttons.

Code:

    private String TAG = "AUDIO_RECORD_PLAYBACK";
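The record-then-play loop the answer describes can be sketched language-agnostically. In this Python mock, `read_mic`, `play`, and `is_recording` are hypothetical stand-ins for `AudioRecord.read()`, `AudioTrack.write()`, and the button-controlled flag; the buffer size is an assumption (on Android it would come from `AudioRecord.getMinBufferSize()`):

```python
BUFFER_SIZE = 1024  # assumed; AudioRecord.getMinBufferSize() supplies this on Android

def loopback(read_mic, play, is_recording):
    """Copy audio from the mic callback to the playback callback in a loop,
    mirroring the AudioRecord -> buffer -> AudioTrack design above."""
    frames_copied = 0
    while is_recording():
        buf = read_mic(BUFFER_SIZE)  # blocks until BUFFER_SIZE samples arrive
        play(buf)                    # write the same buffer straight back out
        frames_copied += len(buf)
    return frames_copied
```

The key design point is that record and playback share one buffer, so latency stays bounded by `BUFFER_SIZE` rather than growing over time.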

How to Call SPEEX Audio Decode/Encode in HTML5 / JavaScript (Without Flash)

試著忘記壹切 submitted on 2019-11-29 08:29:39
Question: I'm working on a project that requires audio data to be streamed over HTTP to/from a server. We need to compress the audio data using SPEEX. In Flash we can use Alchemy and LibSpeex, but how can we do it in JavaScript? HTML5 can be used. Thanks. Peter

Answer 1: I recently implemented a successful HTML5/VOIP client using the following JS port of Speex, truly awesome stuff: https://github.com/jpemartins/speex.js For now you'll need to either wait for MediaStreamRecorder or jump the gun like I did, and

Trying to stream audio from microphone to another phone via multipeer connectivity

假装没事ソ submitted on 2019-11-29 07:01:48
I am trying to stream audio from the microphone to another iPhone via Apple's Multipeer Connectivity framework. For the audio capture and playback I am using AVAudioEngine (many thanks to Rhythmic Fistman's answer here). I receive data from the microphone by installing a tap on the input node; from this I get an AVAudioPCMBuffer, which I convert to an array of UInt8 and then stream to the other phone. But when I convert the array back to an AVAudioPCMBuffer, I get an EXC_BAD_ACCESS exception, with the compiler pointing to the method where I am converting the byte array to
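The round trip this question attempts — flattening a PCM buffer to bytes and rebuilding it — can be illustrated in Python. This is a hypothetical sketch, not the Swift fix: it assumes the buffer's default 32-bit float format, and the size check models the invariant (byte count = frames × 4) whose violation typically surfaces as EXC_BAD_ACCESS when done through raw pointers:

```python
import struct

BYTES_PER_SAMPLE = 4  # assuming AVAudioPCMBuffer's default 32-bit float format

def pcm_to_bytes(samples):
    """Serialize float samples the way the tap's buffer gets flattened for sending."""
    return struct.pack(f"<{len(samples)}f", *samples)

def bytes_to_pcm(data):
    """Rebuild samples on the receiving side. A byte count that is not a whole
    number of samples is exactly the kind of mismatch that crashes with raw
    pointer copies in Swift; here it raises instead."""
    if len(data) % BYTES_PER_SAMPLE != 0:
        raise ValueError("byte count is not a whole number of float samples")
    return list(struct.unpack(f"<{len(data) // BYTES_PER_SAMPLE}f", data))
```

In the real Swift code the same invariant applies: the reconstructed buffer's `frameLength` must match the received byte count divided by the stream format's `bytesPerFrame`.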

HTML5 Audio Streaming

橙三吉。 submitted on 2019-11-29 03:27:32
Question: There has been some talk of this around Stack Overflow before, but nothing has really answered this question from what I have seen. I am trying to implement a streaming-audio web application, almost identical to what WFMU has done with their player (http://wfmu.org/html5/player.php). All I have been able to figure out from their stream is that they pipe the stream into PHP (I don't know in what format) and then feed it to jPlayer for HTML5 presentation to the client. They have this working

Streaming audio in Node.js with Content-Range

≡放荡痞女 submitted on 2019-11-29 02:45:57
I'm using a streaming server in Node.js to stream MP3 files. Streaming the whole file works, but I cannot use the Content-Range header to stream the file from a start position to an end position. I calculate the start and end bytes from seconds using ffprobe, like:

    ffprobe -i /audio/12380187.mp3 -show_frames -show_entries frame=pkt_pos -of default=noprint_wrappers=1:nokey=1 -hide_banner -loglevel panic -read_intervals 20%+#1

That gives me the exact bytes from 10 seconds (in this case) to the first following packet. In Node.js this becomes as simple as:

    const args = [ '-hide_banner',
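Once ffprobe has produced the byte offsets, the response-header arithmetic is the part that usually goes wrong. A minimal sketch (in Python for illustration; the same math applies verbatim in the Node.js handler) of the headers a 206 Partial Content response needs — note the range is inclusive on both ends:

```python
def content_range_headers(start, end, total):
    """Headers for an HTTP 206 response serving bytes start..end of a file
    of `total` bytes. The Content-Range end index is inclusive, so the body
    length is end - start + 1, not end - start."""
    return {
        "Content-Range": f"bytes {start}-{end}/{total}",
        "Content-Length": str(end - start + 1),
        "Accept-Ranges": "bytes",
    }
```

Sending a Content-Length that is off by one (forgetting the inclusive end) is a common reason players refuse to seek within the stream.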

Live audio stream java

可紊 submitted on 2019-11-28 23:45:06
I am implementing live streaming from the mic to a Java server on another PC, but I am only hearing white noise. I have attached both the client and server programs.

Client:

    import java.io.IOException;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;
    import java.net.SocketException;
    import java.net.UnknownHostException;
    import javax.sound.sampled.AudioFormat;
    import javax.sound.sampled.AudioInputStream;
    import javax.sound.sampled.AudioSystem;
    import javax.sound.sampled.DataLine;
    import javax.sound.sampled.LineUnavailableException;
    import javax.sound.sampled
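One classic cause of white-noise playback in setups like this is an AudioFormat mismatch between sender and receiver — for example, one side packing 16-bit samples big-endian and the other unpacking them little-endian. This Python sketch (a hypothetical illustration, not a diagnosis of the attached code) shows how such a mismatch scrambles every sample value:

```python
import struct

def reinterpret_endianness(samples):
    """Pack 16-bit signed samples big-endian, then unpack them little-endian,
    mimicking a sender/receiver AudioFormat endianness mismatch. The returned
    values bear no audible relation to the originals, which plays as noise."""
    raw = struct.pack(f">{len(samples)}h", *samples)
    return list(struct.unpack(f"<{len(raw) // 2}h", raw))
```

The fix is to construct the exact same `AudioFormat` (sample rate, sample size, signedness, endianness) on both the capturing `TargetDataLine` and the playing `SourceDataLine`.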

stream media FROM iphone

给你一囗甜甜゛ submitted on 2019-11-28 20:40:14
Question: I need to stream audio from the mic to an HTTP server. These recording settings are what I need:

    NSDictionary *audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:kAudioFormatULaw], AVFormatIDKey,
        [NSNumber numberWithFloat:8000.0], AVSampleRateKey, // was 44100.0
        [NSData dataWithBytes:&acl length:sizeof(AudioChannelLayout)], AVChannelLayoutKey,
        [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
        [NSNumber numberWithInt:64000], AVEncoderBitRateKey,
        nil];

API
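The settings above are internally consistent, which is worth checking when changing them: μ-law stores 8 bits per sample, so 8000 Hz mono works out to exactly the 64000 bit/s given as AVEncoderBitRateKey. A one-line sanity check:

```python
def ulaw_bitrate(sample_rate_hz, channels=1, bits_per_sample=8):
    """Bit rate of a mu-law stream: 8 bits per sample per channel.
    8000 Hz mono -> 64000 bit/s, matching AVEncoderBitRateKey above."""
    return sample_rate_hz * channels * bits_per_sample
```

If the sample rate were left at the commented-out 44100.0, the matching bit rate would be 352800, so the 64000 value only makes sense together with the 8 kHz rate.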

Is it possible to get a byte buffer directly from an audio asset in OpenSL ES (for Android)?

雨燕双飞 submitted on 2019-11-28 20:10:51
I would like to get a byte buffer from an audio asset using the OpenSL ES FileDescriptor object, so I can enqueue it repeatedly to a SimpleBufferQueue instead of using the SL interfaces to play/stop/seek the file. There are three main reasons why I would like to manage the sample bytes directly:

1. OpenSL uses an AudioTrack layer to play/stop/etc. for player objects. This not only introduces unwanted overhead, but it also has several bugs, and rapid starts/stops of the player cause lots of problems.
2. I need to manipulate the byte buffer directly for custom DSP effects.
3. The clips I'm going to be

record output sound in python

放肆的年华 submitted on 2019-11-28 19:52:42
I want to programmatically record the sound coming out of my laptop in Python. I found PyAudio and came up with the following program that accomplishes the task:

    import pyaudio, wave, sys

    chunk = 1024
    FORMAT = pyaudio.paInt16
    CHANNELS = 1
    RATE = 44100
    RECORD_SECONDS = 5
    WAVE_OUTPUT_FILENAME = sys.argv[1]

    p = pyaudio.PyAudio()
    channel_map = (0, 1)
    stream_info = pyaudio.PaMacCoreStreamInfo(
        flags=pyaudio.PaMacCoreStreamInfo.paMacCorePlayNice,
        channel_map=channel_map)
    stream = p.open(format=FORMAT,
                    rate=RATE,
                    input=True,
                    input_host_api_specific_stream_info=stream_info,
                    channels=CHANNELS)
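The read loop that usually follows a setup like this reads `chunk` frames at a time, so the number of reads needed to cover the recording is determined by the constants above. A small helper (hypothetical, not part of the original program) makes the arithmetic explicit:

```python
def num_reads(rate, chunk, seconds):
    """How many chunk-sized stream.read(chunk) calls cover `seconds` of audio.
    int() truncates, matching the common `for i in range(int(rate / chunk * seconds))`
    loop, which therefore records slightly less than the requested duration."""
    return int(rate / chunk * seconds)
```

With the values above, 44100 / 1024 * 5 truncates to 215 reads, i.e. about 4.99 s of audio rather than a full 5 s.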

Can I use Firebase Storage for online music streaming?

◇◆丶佛笑我妖孽 submitted on 2019-11-28 18:49:58
What I want is to save MP3 files in Firebase Storage and then stream them to an Android device. All the Firebase tutorials discuss image file upload and download. If any other cloud service makes it easier than Firebase to store and stream audio for Android, please suggest it.

Answer: Firebase provides StreamDownloadTask, as Mike has suggested, for getting an InputStream, but unfortunately MediaPlayer doesn't accept a direct stream. We now have two options if we strictly want to continue with an InputStream, as far as I know:

a. Write the stream to a temporary file and pass it to the media