web-audio-api

No metadata when recording an audio webm with MediaRecorder

若如初见. Submitted on 2021-02-20 08:31:52
Question: For my project I record user audio using MediaRecorder, and it almost works fine. My problem arises when I wish to display a waveform of the user recording using Wavesurfer.js, which doesn't load my recording. Playing the recording with an Audio element works fine, though. After trying different sources, it seems that it is because the final .webm file doesn't have much metadata, not even a duration or bitrate (even though I set it in the MediaRecorder options). Here is the output from ffprobe…
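A minimal sketch of the recording setup in question: Chrome's MediaRecorder writes a streaming WebM without a Duration element, so one common workaround is to track the elapsed time yourself and pass it alongside the blob. `onRecordingReady` is a hypothetical callback, not part of any API.

```javascript
// Sketch: record audio with MediaRecorder and track the duration manually,
// since the produced WebM file itself carries no duration metadata.
function createRecorder(stream, onRecordingReady) {
  const recorder = new MediaRecorder(stream, {
    mimeType: 'audio/webm',
    audioBitsPerSecond: 128000, // requested, but not written into the file's metadata
  });
  const chunks = [];
  let startedAt = 0;

  recorder.onstart = () => { startedAt = Date.now(); };
  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.onstop = () => {
    const durationMs = Date.now() - startedAt; // the file itself won't carry this
    onRecordingReady(new Blob(chunks, { type: 'audio/webm' }), durationMs);
  };
  return recorder;
}
```

The measured duration can then be handed to whatever consumes the blob (a waveform renderer, a server, etc.) since ffprobe-style tools won't find it in the container.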

How to make AudioWorklets work with vue-cli/webpack/babel? (getting illegal invocation error)

半腔热情 Submitted on 2021-02-19 05:18:46
Question: I'm trying to create a web app with vue-cli that uses AudioWorklets, but I'm getting a bunch of errors when trying to access any property of my AudioWorkletNode, like port or channelCount, etc.: TypeError: Illegal invocation at MyWorkletNode.invokeGetter. After hours of googling and debugging, I think it's somehow related to classes; AudioWorklet seems to only work with ES6 classes, but one of vue-cli/babel/webpack does its magic (which I don't understand — where and what it does) and transpiles the…
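One workaround for this transpilation problem, sketched below under the assumption that the subclass is not strictly needed: construct a plain AudioWorkletNode instead of extending it, so Babel never rewrites a class that extends a native constructor. `'my-processor'` and the module URL are placeholders for your own worklet.

```javascript
// Sketch: avoid subclassing AudioWorkletNode so Babel's class transform
// cannot break the native getters (the "Illegal invocation" source).
async function createWorkletNode(audioContext, moduleUrl, processorName) {
  await audioContext.audioWorklet.addModule(moduleUrl);
  const node = new AudioWorkletNode(audioContext, processorName, {
    numberOfInputs: 1,
    numberOfOutputs: 1,
  });
  node.port.onmessage = (e) => console.log('from processor:', e.data);
  return node;
}
```

If the subclass must stay, the alternative is to exclude that file from Babel's transform (e.g. via the build config), so the `extends` of the native constructor reaches the browser untouched.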

How to modulate the pulsewidth of the Web Audio API Square OscillatorNode?

自古美人都是妖i Submitted on 2021-02-11 17:26:37
Question: I want to modulate the square waveform of the Web Audio API OscillatorNode by connecting other OscillatorNodes to it, but I cannot find the parameter among the AudioParams. Is this possible at all, or is there a workaround? I thought about creating a "custom" wavetable oscillator with the "audioContext.createWaveTable()" function. This wavetable could contain different pulses with sweeping pulse widths. But then again I have no idea how to control the position of the wavetable pointer via…
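The OscillatorNode indeed exposes no pulse-width AudioParam, but a known workaround (sketched here with illustrative parameter values) builds a pulse wave as the difference of two sawtooths, saw(t) − saw(t − d): the delay d sets the pulse width, and because DelayNode.delayTime is an AudioParam, it can be modulated by an LFO.

```javascript
// Sketch: pulse-width-modulated square wave from two sawtooth oscillators.
// One saw is inverted and delayed; the delay time controls the duty cycle.
function createPwmVoice(ctx, frequency) {
  const saw1 = new OscillatorNode(ctx, { type: 'sawtooth', frequency });
  const saw2 = new OscillatorNode(ctx, { type: 'sawtooth', frequency });
  const invert = new GainNode(ctx, { gain: -1 });
  const delay = new DelayNode(ctx, { delayTime: 0.5 / frequency }); // 50% duty cycle
  const out = new GainNode(ctx);

  saw1.connect(out);
  saw2.connect(invert).connect(delay).connect(out);

  // LFO sweeping the pulse width by modulating the delay time
  const lfo = new OscillatorNode(ctx, { frequency: 0.5 });
  const lfoDepth = new GainNode(ctx, { gain: 0.4 / frequency });
  lfo.connect(lfoDepth).connect(delay.delayTime);

  saw1.start(); saw2.start(); lfo.start();
  return out; // connect this to ctx.destination or further processing
}
```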

How to stream audio file with opentok?

时间秒杀一切 Submitted on 2021-02-10 05:11:03
Question: In OpenTok, with OT.initPublisher, you can only pass a deviceId to the audioSource. Does someone know a method to stream an audio file? For example, I have done this: navigator.getUserMedia({audio: true, video: false}, function(stream) { var context = new AudioContext(); var microphone = context.createMediaStreamSource(stream); var backgroundMusic = context.createMediaElementSource(document.getElementById("song")); var mixedOutput = context.createMediaStreamDestination(); microphone.connect…
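Completing the mixing idea from the excerpt, as a sketch: route both the microphone and an `<audio>` element into a MediaStreamAudioDestinationNode, then hand the resulting audio track to OT.initPublisher. Recent OpenTok SDKs accept a MediaStreamTrack as `audioSource`; the element id `"song"` and container id `"publisher"` are taken from/assumed for this example.

```javascript
// Sketch: mix mic + an <audio id="song"> element and publish the mix via OpenTok.
function publishMixedAudio(micStream) {
  const context = new AudioContext();
  const microphone = context.createMediaStreamSource(micStream);
  const backgroundMusic = context.createMediaElementSource(
    document.getElementById('song')
  );
  const mixedOutput = context.createMediaStreamDestination();

  microphone.connect(mixedOutput);
  backgroundMusic.connect(mixedOutput);

  // Pass the mixed track instead of a deviceId.
  return OT.initPublisher('publisher', {
    audioSource: mixedOutput.stream.getAudioTracks()[0],
    videoSource: null,
  });
}
```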

Chrome extension: Can't make chrome.desktopCapture.chooseDesktopMedia capture window audio

此生再无相见时 Submitted on 2021-02-08 05:48:23
Question: I'm trying to use the chrome.desktopCapture.chooseDesktopMedia API in order to capture audio from the extension window. I'm sending the capture request from the popup.js page. Manifest: { "background": { "scripts": [ "background.js" ] }, "browser_action": { "default_icon": "style/icons/icon16.png", "default_title": "__MSG_name__" }, "default_locale": "en", "description": "__MSG_description__", "icons": { "128": "style/icons/icon128.png" }, "manifest_version": 2, "name": "__MSG_extName__",…
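A sketch of requesting desktop capture with audio from an extension page, under the caveat that Chrome only delivers audio for certain source types (notably `'tab'`, and screen capture on some platforms) — which is a common reason window audio comes back silent. `onStream` is a hypothetical callback.

```javascript
// Sketch: chooseDesktopMedia with the 'audio' source type, then getUserMedia
// with the Chrome-specific desktop constraints using the returned streamId.
function captureWithAudio(onStream) {
  chrome.desktopCapture.chooseDesktopMedia(['tab', 'audio'], (streamId) => {
    if (!streamId) return; // user cancelled the picker
    navigator.mediaDevices.getUserMedia({
      audio: {
        mandatory: { chromeMediaSource: 'desktop', chromeMediaSourceId: streamId },
      },
      video: {
        mandatory: { chromeMediaSource: 'desktop', chromeMediaSourceId: streamId },
      },
    }).then(onStream);
  });
}
```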

Record internal audio of a website via javascript

生来就可爱ヽ(ⅴ<●) Submitted on 2021-02-08 05:41:36
Question: I made this web app to compose music, and I wanted to add a feature to download the composition as .mp3/wav/whateverFileFormatPossible. I've searched many times for how to do this and always gave up, as I couldn't find any examples of how to do it; the only things I found were microphone recorders, but I want to record the final audio destination of the website. I play audio in this way: const a_ctx = new(window.AudioContext || window.webkitAudioContext)() function playAudio(buf){ const source…
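One way to record what the page itself plays, sketched here: route every source node into a MediaStreamAudioDestinationNode in addition to the speakers, and feed its stream to a MediaRecorder. Note that in most browsers the result is WebM/Opus, not mp3 or wav; converting would need an extra encoding step.

```javascript
// Sketch: capture the page's own audio output with a MediaRecorder.
// Each source should connect to recordDest as well as a_ctx.destination.
function createOutputRecorder(a_ctx) {
  const recordDest = a_ctx.createMediaStreamDestination();
  const recorder = new MediaRecorder(recordDest.stream);
  const chunks = [];

  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.onstop = () => {
    const blob = new Blob(chunks, { type: 'audio/webm' });
    // Hypothetical download helper: offer the recording as a file.
    const a = document.createElement('a');
    a.href = URL.createObjectURL(blob);
    a.download = 'composition.webm';
    a.click();
  };
  return { recordDest, recorder };
}
```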

Extracting audio from a video file

时光毁灭记忆、已成空白 Submitted on 2021-02-05 20:36:17
Question: Edit: This post is no duplicate of mine. I am trying to extract the audio data as binary; I have no problems with playing the audio file separately, as I mentioned before. I am trying to extract audio from a video file on the client side by using the Web Audio API. var audioContext = new(window.AudioContext || window.webkitAudioContext)(); fileData = new Blob([input.files[0]]); var videoFileAsBuffer = new Promise(getBuffer); videoFileAsBuffer.then(function (data) { audioContext.decodeAudioData(data)…
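The flow from the excerpt can be sketched without the custom Promise wrapper: read the chosen file as an ArrayBuffer and let decodeAudioData pull out the audio track. Note the decoded result is a raw AudioBuffer (PCM samples), not a standalone audio file.

```javascript
// Sketch: extract the audio track of a video File as an AudioBuffer.
function extractAudio(file) {
  const audioContext = new (window.AudioContext || window.webkitAudioContext)();
  return file.arrayBuffer() // File inherits arrayBuffer() from Blob
    .then((data) => audioContext.decodeAudioData(data))
    .then((audioBuffer) => {
      console.log('channels:', audioBuffer.numberOfChannels,
                  'duration:', audioBuffer.duration);
      return audioBuffer;
    });
}
```

Getting binary audio out of the AudioBuffer (e.g. as WAV) would then be a separate encoding step over its channel data.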

Encode AudioBuffer with Opus (or other codec) in Browser

痞子三分冷 Submitted on 2021-01-29 13:21:38
Question: I am trying to stream audio via WebSocket. I can get an AudioBuffer from the microphone (or another source) via the Web Audio API and stream the raw audio buffer, but I think this would not be very efficient. So I looked around for ways to encode the AudioBuffer somehow. If the Opus codec is not practicable, I am open to alternatives and thankful for any hints in the right direction. I have tried to use the MediaRecorder (from the MediaStream Recording API), but it seems not possible to stream with that…
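Despite that impression, MediaRecorder is the simplest built-in Opus encoder: request `'audio/webm;codecs=opus'` and pass a timeslice to `start()` so encoded chunks arrive continuously, each of which can be pushed over a WebSocket. A sketch, with the endpoint URL as a placeholder:

```javascript
// Sketch: stream Opus-encoded chunks from a MediaStream over a WebSocket.
function streamEncodedAudio(stream, wsUrl) {
  const mimeType = 'audio/webm;codecs=opus';
  if (!MediaRecorder.isTypeSupported(mimeType)) {
    throw new Error('Opus in WebM not supported in this browser');
  }
  const ws = new WebSocket(wsUrl);
  const recorder = new MediaRecorder(stream, { mimeType });

  recorder.ondataavailable = (e) => {
    if (e.data.size > 0 && ws.readyState === WebSocket.OPEN) ws.send(e.data);
  };
  ws.onopen = () => recorder.start(250); // emit an encoded chunk every 250 ms
  return recorder;
}
```

One caveat: only the chunk stream as a whole forms a valid WebM file; individual chunks after the first are not self-contained, so the receiver must either concatenate them in order or remux them.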