audio-streaming

Get Audio Session Id from Google Meet

淺唱寂寞╮ submitted on 2020-12-23 13:50:20
Question: I am playing with DynamicsProcessing. I want to process the audio from an external application, and I just need the audioSessionId for that. I have no problems with Play Music, for example: I have used a BroadcastReceiver listening for android.media.action.OPEN_AUDIO_EFFECT_CONTROL_SESSION and everything works like a charm. <receiver android:name=".framework.AudioSessionReceiver"> <intent-filter> <action android:name="android.media.action.OPEN_AUDIO_EFFECT_CONTROL_SESSION"/> </intent-filter>
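For reference, a minimal sketch of the receiver the question describes, assuming API 28+ for DynamicsProcessing. Note that apps using VoIP audio paths (which is likely how Google Meet renders its audio) may never send this broadcast, in which case the receiver is simply never invoked for them:

```java
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.media.audiofx.AudioEffect;
import android.media.audiofx.DynamicsProcessing;

public class AudioSessionReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        // Session id broadcast by well-behaved media players when they open a session.
        int sessionId = intent.getIntExtra(AudioEffect.EXTRA_AUDIO_SESSION, AudioEffect.ERROR);
        String pkg = intent.getStringExtra(AudioEffect.EXTRA_PACKAGE_NAME);
        if (sessionId == AudioEffect.ERROR) return;

        // Attach the effect to the external app's session (API 28+).
        // Keep a reference to the effect somewhere long-lived; letting it be
        // garbage-collected detaches the processing.
        DynamicsProcessing dp = new DynamicsProcessing(sessionId);
        dp.setEnabled(true);
    }
}
```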

audio/mp4; codecs="mp4a.40.2" not playing in Chrome and Firefox

喜你入骨 submitted on 2020-11-25 04:08:30
Question: I want to convert the audio files I stream on my website to audio/mp4; codecs="mp4a.40.2". Using ffmpeg-cli-wrapper, I am converting my uploaded audio files with this command: ffmpeg -i /tmp/input.any -acodec aac -b:a 256000 /tmp/output.aac On the client I am creating a SourceBuffer like this: this.sourceBuffer = this.mediaSource.addSourceBuffer('audio/mp4; codecs="mp4a.40.2"'); The errors are: Chrome: NotSupportedError: Failed to load because no supported source was
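A likely cause: the command above writes a raw ADTS .aac stream, while a SourceBuffer declared as audio/mp4 expects a fragmented MP4 container. A hedged sketch of the fix using ffmpeg-cli-wrapper (the binary paths and the exact -movflags set are assumptions to verify against your ffmpeg build):

```java
import net.bramp.ffmpeg.FFmpeg;
import net.bramp.ffmpeg.FFmpegExecutor;
import net.bramp.ffmpeg.FFprobe;
import net.bramp.ffmpeg.builder.FFmpegBuilder;

public class TranscodeForMse {
    public static void main(String[] args) throws Exception {
        FFmpeg ffmpeg = new FFmpeg("/usr/bin/ffmpeg");    // assumed install path
        FFprobe ffprobe = new FFprobe("/usr/bin/ffprobe");

        FFmpegBuilder builder = new FFmpegBuilder()
                .setInput("/tmp/input.any")
                .overrideOutputFiles(true)
                .addOutput("/tmp/output.mp4")             // MP4 container, not raw .aac
                .setFormat("mp4")
                .setAudioCodec("aac")
                .setAudioBitRate(256_000)
                // fragmented-MP4 layout required by Media Source Extensions
                .addExtraArgs("-movflags", "+frag_keyframe+empty_moov+default_base_moof")
                .done();

        new FFmpegExecutor(ffmpeg, ffprobe).createJob(builder).run();
    }
}
```

The resulting file should then be appendable to the 'audio/mp4; codecs="mp4a.40.2"' SourceBuffer in both Chrome and Firefox.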

HTTP realtime audio streaming server

ぐ巨炮叔叔 submitted on 2020-07-19 18:50:13
Question: As a proof of concept I need to create an HTTP server which, on a GET request, should start a continuous stream of non-encoded/non-compressed audio data - WAV, PCM16. Let's assume the audio data are chunks of 4096 randomly generated mono audio samples at a 44.1 kHz sampling rate. What should I put in the HTTP response header so that the browser on the other end starts a player in its UI for the user to listen in real time? I was reading about "Transfer-Encoding: chunked", "multipart", mimetype="audio/x-wav"
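One workable combination, sketched below with the JDK's built-in com.sun.net.httpserver (port and path are arbitrary choices): serve Content-Type: audio/wav with chunked transfer encoding, and send a WAV header whose size fields are set to the maximum so the player treats the stream as endless:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.Random;

public class WavStreamServer {
    static final int RATE = 44100, CHUNK = 4096;

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/stream.wav", exchange -> {
            exchange.getResponseHeaders().set("Content-Type", "audio/wav");
            // Response length 0 means "unknown": the JDK server then uses
            // Transfer-Encoding: chunked automatically.
            exchange.sendResponseHeaders(200, 0);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(wavHeader());
                Random rnd = new Random();
                byte[] pcm = new byte[CHUNK * 2]; // 4096 16-bit mono samples
                while (true) {
                    rnd.nextBytes(pcm);           // random noise, per the question
                    out.write(pcm);
                    out.flush();
                }
            } catch (Exception clientGone) {
                // client disconnected; stop streaming to it
            }
        });
        server.setExecutor(null); // default executor
        server.start();
    }

    // 44-byte RIFF/WAVE header, little-endian, with maximal size fields
    // so players treat the stream as effectively unbounded.
    static byte[] wavHeader() {
        ByteBuffer b = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
        b.put("RIFF".getBytes()).putInt(0xFFFFFFFF)
         .put("WAVE".getBytes()).put("fmt ".getBytes())
         .putInt(16).putShort((short) 1)           // PCM
         .putShort((short) 1)                      // mono
         .putInt(RATE).putInt(RATE * 2)            // byte rate = rate * block align
         .putShort((short) 2).putShort((short) 16) // block align, bits per sample
         .put("data".getBytes()).putInt(0xFFFFFFFF);
        return b.array();
    }
}
```

Pointing an <audio> element (or the browser address bar) at http://host:8080/stream.wav should start playback within a buffer's worth of latency.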

(FFmpeg) How to play live audio in the browser from received UDP packets using FFmpeg?

筅森魡賤 submitted on 2020-06-29 05:20:33
Question: I have a .NET Core console application which acts as both a UDP server and a UDP client: a UDP client by receiving audio packets, and a UDP server by sending on each received packet. Here's a sample of the console app: static UdpClient udpListener = new UdpClient(); static IPEndPoint endPoint = new IPEndPoint(IPAddress.Parse("192.168.1.230"), 6980); static IAudioSender audioSender = new UdpAudioSender(new IPEndPoint(IPAddress.Parse("192.168.1.230"), 65535)); static void Main(string[] args) { udpListener.Client
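One possible bridge, sketched below as a hedged relay: forward each received datagram into an ffmpeg child process that encodes to MP3 and serves it over HTTP using ffmpeg's single-client "-listen 1" HTTP output. This assumes the packets carry raw PCM (s16le, mono, 44.1 kHz) arriving on port 6980 as in the question; the "-f s16le -ar -ac" input options must be adjusted to the real payload format:

```java
import java.io.OutputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;

public class UdpToBrowser {
    public static void main(String[] args) throws Exception {
        // ffmpeg reads raw PCM from stdin and serves MP3 over HTTP.
        Process ffmpeg = new ProcessBuilder(
                "ffmpeg",
                "-f", "s16le", "-ar", "44100", "-ac", "1", "-i", "pipe:0",
                "-f", "mp3", "-listen", "1", "http://0.0.0.0:8081/live.mp3")
                .redirectOutput(ProcessBuilder.Redirect.INHERIT)
                .redirectError(ProcessBuilder.Redirect.INHERIT)
                .start();

        try (DatagramSocket socket = new DatagramSocket(6980);
             OutputStream toFfmpeg = ffmpeg.getOutputStream()) {
            byte[] buf = new byte[65536];
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            while (true) {
                socket.receive(packet);
                // Forward the raw audio payload into the encoder.
                toFfmpeg.write(packet.getData(), 0, packet.getLength());
                toFfmpeg.flush();
            }
        }
    }
}
```

The browser can then play the stream with a plain <audio src="http://host:8081/live.mp3"> element; for more than one listener, an HLS or WebSocket fan-out would replace the "-listen 1" output.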