audio-streaming

Getting metadata from an audio stream

可紊 submitted on 2019-12-02 16:51:59
I would like to get the file name and, if possible, the album image from a streaming URL in an AVPlayerItem that I am playing with AVQueuePlayer, but I don't know how to go about doing this. Also, if it turns out that my streaming URL doesn't have any metadata, can I put metadata in my NSURL* before passing it to the AVPlayerItem? Thanks. Well, I am surprised no one has answered this question. In fact, no one has answered any of my other questions, which makes me wonder how much knowledge people in here truly have. Anyway, I will go ahead and answer my own question. I found out how to get the metadata by

How to customize MPVolumeView?

这一生的挚爱 submitted on 2019-12-02 16:21:10
I have tried many methods to implement a regular UISlider and control the device volume, but they all rely on native C functions, which results in many untraceable bugs. I tried MPVolumeView and it works like a charm; it even controls the device volume after you close the app, just like the iPod app. My question is: is there any way to customize the MPVolumeView with specific colors and images, just like a UISlider? NOTE: I want a legal method without using private, undocumented APIs. UPDATE: As per @Alexsander Akers's answer, since the subviews are hidden in MPVolumeView I had to cycle through subviews,

Bluetooth audio streaming between android devices

╄→гoц情女王★ submitted on 2019-12-02 15:57:25
I did some research on the same topic and found that Android devices are A2DP sources, and that audio can only be streamed from an A2DP source to an A2DP sink. An A2DP sink can be a Bluetooth headset or a Bluetooth speaker. But my question is: how does the Android app named "Bluetooth Music Player" work? It allows streaming from one mobile to another, so in this case the listening mobile device must act as a sink. How is this possible? Are they using some other profile instead of A2DP? OK, it may be a different profile they are using, because the application needs to be installed in the
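For context, the public A2DP API on stock Android only covers the source role: an app can enumerate and stream to connected sinks, but cannot turn the phone into a sink through this API. A small sketch of querying that source-side proxy (class name and log tag are illustrative, not from the question):

    import android.bluetooth.BluetoothA2dp;
    import android.bluetooth.BluetoothAdapter;
    import android.bluetooth.BluetoothDevice;
    import android.bluetooth.BluetoothProfile;
    import android.content.Context;
    import android.util.Log;

    public class A2dpSourceCheck {
        public static void listConnectedSinks(Context context) {
            final BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
            if (adapter == null) return; // device has no Bluetooth

            adapter.getProfileProxy(context, new BluetoothProfile.ServiceListener() {
                @Override
                public void onServiceConnected(int profile, BluetoothProfile proxy) {
                    BluetoothA2dp a2dp = (BluetoothA2dp) proxy;
                    // The devices listed here are sinks (headsets, speakers) the phone
                    // streams TO; the framework does not expose a public sink role.
                    for (BluetoothDevice device : a2dp.getConnectedDevices()) {
                        Log.d("A2dpSourceCheck", "Connected sink: " + device.getName());
                    }
                    adapter.closeProfileProxy(BluetoothProfile.A2DP, proxy);
                }

                @Override
                public void onServiceDisconnected(int profile) { }
            }, BluetoothProfile.A2DP);
        }
    }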

Web Audio API for live streaming?

两盒软妹~` submitted on 2019-12-02 15:07:28
We need to stream live audio (from a medical device) to web browsers with no more than 3-5 s of end-to-end delay (assume 200 ms or less of network latency). Today we use a browser plugin (NPAPI) for decoding, filtering (high, low, band), and playback of the audio stream (delivered via WebSockets). We want to replace the plugin. I was looking at various Web Audio API demos, and most of our required functionality (playback, gain control, filtering) appears to be available in the Web Audio API. However, it is not clear to me if the Web Audio API can be used for streamed sources, as most of the Web

How to play sound from tcp stream in java

被刻印的时光 ゝ submitted on 2019-12-02 13:09:04
Question: There is another app that writes a raw WAV file to this socket. The client starts and begins listening to the song which is currently playing.

    Socket clientSocket = new Socket("localhost", 9595);
    AudioInputStream stream = AudioSystem.getAudioInputStream(clientSocket.getInputStream());

I get javax.sound.sampled.UnsupportedAudioFileException: could not get audio input stream from input stream. The documentation for AudioSystem.getAudioInputStream says: "Obtains an audio input stream from the provided input stream.
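AudioSystem.getAudioInputStream has to mark and reset the stream while it sniffs the file header, which a bare socket stream does not support. A minimal sketch of one common workaround (assuming the data really begins with a WAV header; host, port, and class name are placeholders): wrap the socket stream in a BufferedInputStream and play the result through a SourceDataLine.

    import javax.sound.sampled.AudioFormat;
    import javax.sound.sampled.AudioInputStream;
    import javax.sound.sampled.AudioSystem;
    import javax.sound.sampled.SourceDataLine;
    import java.io.BufferedInputStream;
    import java.net.Socket;

    public class SocketWavPlayer {
        public static void main(String[] args) throws Exception {
            Socket socket = new Socket("localhost", 9595);

            // BufferedInputStream adds the mark/reset support the header parsers need.
            AudioInputStream audio = AudioSystem.getAudioInputStream(
                    new BufferedInputStream(socket.getInputStream()));

            AudioFormat format = audio.getFormat();
            SourceDataLine line = AudioSystem.getSourceDataLine(format);
            line.open(format);
            line.start();
            try {
                byte[] buffer = new byte[4096];
                int read;
                while ((read = audio.read(buffer, 0, buffer.length)) != -1) {
                    line.write(buffer, 0, read); // blocks until the data is queued for playback
                }
                line.drain(); // let the last buffered audio finish
            } finally {
                line.close();
                audio.close();
                socket.close();
            }
        }
    }

If the sender actually writes header-less PCM rather than a WAV file, there is nothing for the parser to recognize, and the format would have to be described manually with an explicit AudioFormat instead.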

MediaPlayer() audio stuttering(android)

左心房为你撑大大i submitted on 2019-12-02 10:00:32
I am using the MediaPlayer class to stream live audio from a remote server in my Android app, but the audio is choppy and stutters. The problem is not my internet connection, as the feed plays perfectly when I play it on the computer. What could be the problem? Note: the streams are live. This is the code I'm using:

    MediaPlayer mp = new MediaPlayer();
    try {
        mp.setDataSource("http://radiotool:80/feed 342.mp3"); // hardcoded for testing purposes
        mp.prepare();
        mp.start();
    } catch (Exception e) {
        Log.d("Error came up man", ", check the internet connection and stuff..");
    }

Brad: The stream you are
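One thing worth checking regardless of the stream itself: prepare() blocks on network sources, and streaming setups usually use the asynchronous variant and start playback only once the player reports it is ready. A minimal sketch (the URL is a placeholder, and this is not a guaranteed fix for every stutter):

    import android.media.AudioManager;
    import android.media.MediaPlayer;
    import android.util.Log;

    public class StreamPlayer {
        private static final String TAG = "StreamPlayer";

        public static MediaPlayer play(String url) {
            MediaPlayer mp = new MediaPlayer();
            try {
                mp.setAudioStreamType(AudioManager.STREAM_MUSIC);
                mp.setDataSource(url); // e.g. a live MP3/AAC mount point
                mp.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                    @Override
                    public void onPrepared(MediaPlayer player) {
                        player.start(); // start only once enough data has been buffered
                    }
                });
                mp.setOnErrorListener(new MediaPlayer.OnErrorListener() {
                    @Override
                    public boolean onError(MediaPlayer player, int what, int extra) {
                        Log.d(TAG, "MediaPlayer error what=" + what + " extra=" + extra);
                        return true; // error handled
                    }
                });
                mp.prepareAsync(); // non-blocking, unlike prepare()
            } catch (Exception e) {
                Log.d(TAG, "Failed to set up stream", e);
            }
            return mp;
        }
    }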

How to stream an .pls audio file in android 2.2

怎甘沉沦 submitted on 2019-12-02 09:41:51
I am trying to play a .pls file (available at http://stream.radiosai.net:8002/listen.pls ) in Android 2.2, but it isn't working. Playing MP3 works fine. A .pls (playlist) file is basically a text file with the path of each track in each entry, kind of like an .ini file; take a look at the Wikipedia entry. I don't think the built-in MediaPlayer supports these formats, so you'll have to download the file, parse it, and play the actual MP3s one by one. Rajdeep Dua: Android doesn't support the .pls format. Link Source: https://stackoverflow.com/questions/8266424/how-to-stream-an-pls-audio-file-in
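A .pls playlist is INI-style text with entries such as File1=http://..., so the download-and-parse approach suggested above can be sketched roughly as follows (error handling trimmed; the URL inside the .pls still has to be a format MediaPlayer can decode, and the class/method names here are illustrative):

    import android.media.MediaPlayer;
    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.URL;
    import java.util.ArrayList;
    import java.util.List;

    public class PlsHelper {
        // Download the .pls file and collect every "FileN=" entry.
        public static List<String> parsePls(String plsUrl) throws Exception {
            List<String> tracks = new ArrayList<String>();
            BufferedReader reader = new BufferedReader(
                    new InputStreamReader(new URL(plsUrl).openStream()));
            try {
                String line;
                while ((line = reader.readLine()) != null) {
                    line = line.trim();
                    if (line.startsWith("File") && line.contains("=")) {
                        tracks.add(line.substring(line.indexOf('=') + 1).trim());
                    }
                }
            } finally {
                reader.close();
            }
            return tracks;
        }

        // Play the first track of the playlist.
        public static MediaPlayer playFirst(String plsUrl) throws Exception {
            List<String> tracks = parsePls(plsUrl); // network I/O: run off the main thread
            MediaPlayer mp = new MediaPlayer();
            mp.setDataSource(tracks.get(0)); // must point to a stream MediaPlayer can decode
            mp.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                @Override
                public void onPrepared(MediaPlayer player) {
                    player.start();
                }
            });
            mp.prepareAsync();
            return mp;
        }
    }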

MediaExtractor does not understand audio/aacp streams

岁酱吖の submitted on 2019-12-02 09:29:46
I have my own MediaDataSource:

    class MyDataSource extends MediaDataSource {
        private static final String TAG = "MyDataSource";

        private HttpURLConnection connection;
        private BufferedInputStream inputStream;

        MyDataSource(@NonNull URL streamURL) throws Throwable {
            this.connection = (HttpURLConnection) streamURL.openConnection();
            this.connection.setRequestMethod("GET");
            this.connection.addRequestProperty("Icy-Metadata", "0");
            this.connection.connect();
            int responseCode = this.connection.getResponseCode();
            if (responseCode != 200) throw new IOException("http response code " + responseCode);
            for
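The snippet is cut off before the MediaDataSource overrides themselves. For reference, a subclass has to implement readAt, getSize, and close; a minimal sketch of those methods (not the asker's code, and assuming a live, non-seekable stream of unknown length, which is typical for ICY/AAC+ radio):

    import android.media.MediaDataSource;
    import java.io.BufferedInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    public class StreamingDataSource extends MediaDataSource {
        private final BufferedInputStream inputStream;

        public StreamingDataSource(InputStream rawStream) {
            this.inputStream = new BufferedInputStream(rawStream);
        }

        @Override
        public int readAt(long position, byte[] buffer, int offset, int size) throws IOException {
            // A live radio stream can only be read forward; a seekable source
            // would have to honour `position` here.
            return inputStream.read(buffer, offset, size); // -1 signals end of stream
        }

        @Override
        public long getSize() throws IOException {
            return -1; // unknown / unbounded length
        }

        @Override
        public void close() throws IOException {
            inputStream.close();
        }
    }

The instance would then typically be handed to MediaExtractor.setDataSource(MediaDataSource) (API 23+), although whether the extractor accepts a given container or codec such as audio/aacp is a separate question.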

how to speed up google cloud speech

℡╲_俬逩灬. submitted on 2019-12-02 09:09:49
I am using a microphone which records sound through a browser, converts it into a file, and sends the file to a Java server. Then my Java server sends the file to the Cloud Speech API and gives me the transcription. The problem is that the transcription takes very long (around 3.7 s for 2 s of dialog), so I would like to speed it up. The first thing to do is to stream the data (i.e., start the transcription at the beginning of the recording). The problem is that I don't really understand the API. For instance, if I want to transcribe my audio stream from the source (browser/microphone
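For the streaming route, the google-cloud-speech Java client exposes a bidirectional streaming call. A rough sketch of its usual shape (a sketch only: it assumes the v1 client library, raw LINEAR16 PCM at 16 kHz, and English; exact details can differ across client versions):

    import com.google.api.gax.rpc.ClientStream;
    import com.google.api.gax.rpc.ResponseObserver;
    import com.google.api.gax.rpc.StreamController;
    import com.google.cloud.speech.v1.RecognitionConfig;
    import com.google.cloud.speech.v1.SpeechClient;
    import com.google.cloud.speech.v1.StreamingRecognitionConfig;
    import com.google.cloud.speech.v1.StreamingRecognizeRequest;
    import com.google.cloud.speech.v1.StreamingRecognizeResponse;
    import com.google.protobuf.ByteString;

    public class StreamingTranscriber {
        public static void transcribe(Iterable<byte[]> audioChunks) throws Exception {
            try (SpeechClient client = SpeechClient.create()) {
                ResponseObserver<StreamingRecognizeResponse> observer =
                        new ResponseObserver<StreamingRecognizeResponse>() {
                            @Override public void onStart(StreamController controller) { }
                            @Override public void onResponse(StreamingRecognizeResponse response) {
                                // Interim and final results arrive while audio is still being sent.
                                response.getResultsList().forEach(r ->
                                        System.out.println(r.getAlternatives(0).getTranscript()));
                            }
                            @Override public void onError(Throwable t) { t.printStackTrace(); }
                            @Override public void onComplete() { }
                        };

                ClientStream<StreamingRecognizeRequest> stream =
                        client.streamingRecognizeCallable().splitCall(observer);

                RecognitionConfig config = RecognitionConfig.newBuilder()
                        .setEncoding(RecognitionConfig.AudioEncoding.LINEAR16) // assumed encoding
                        .setSampleRateHertz(16000)                             // assumed sample rate
                        .setLanguageCode("en-US")
                        .build();

                // The first request carries only the configuration.
                stream.send(StreamingRecognizeRequest.newBuilder()
                        .setStreamingConfig(StreamingRecognitionConfig.newBuilder()
                                .setConfig(config)
                                .setInterimResults(true)
                                .build())
                        .build());

                // Subsequent requests carry audio as it is captured.
                for (byte[] chunk : audioChunks) {
                    stream.send(StreamingRecognizeRequest.newBuilder()
                            .setAudioContent(ByteString.copyFrom(chunk))
                            .build());
                }
                stream.closeSend();
            }
        }
    }

The point of this shape is that audio chunks are sent as they arrive from the browser instead of waiting for a complete file, so results start coming back while the user is still speaking.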