web-audio-api

WebAudio API: Is It Possible to Export an AudioBuffer with StereoPanner Node Data?

天大地大妈咪最大 Submitted on 2021-01-29 13:00:29
Question: I'm looking to export an AudioBuffer to a wav file with a StereoPanner node, i.e. I pan a sound all the way to the left and export it panned to the left. I'm wondering if it is possible to export the StereoPanner data associated with an AudioContext? I have built an AudioSource from an AudioContext, and I have attached a StereoPanner to my AudioSource. I'm able to pan my sound in-browser without issue, and I'm also able to export my AudioBuffer to a file (wav). Unfortunately, when I export my
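A common approach (not taken from the question itself) is to render the graph offline so the pan is baked into the buffer that gets exported. A minimal sketch, assuming `sourceBuffer` is the decoded AudioBuffer and the existing WAV-encoding step is then applied to the rendered result:

```javascript
// Sketch only: render the pan into a new AudioBuffer offline, then feed that
// rendered buffer to the existing WAV-export step. `sourceBuffer` and
// `panValue` are assumptions standing in for the asker's own variables.
async function renderPanned(sourceBuffer, panValue) {
  const offline = new OfflineAudioContext(
    2,                          // stereo output
    sourceBuffer.length,
    sourceBuffer.sampleRate
  );
  const src = offline.createBufferSource();
  src.buffer = sourceBuffer;
  const panner = new StereoPannerNode(offline, { pan: panValue }); // -1 = hard left
  src.connect(panner).connect(offline.destination);
  src.start(0);
  return offline.startRendering();   // resolves with the panned AudioBuffer
}
```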

How can I mix multiple stereo signals to one with WebAudio?

北慕城南 Submitted on 2021-01-29 08:29:22
Question: I'm writing a web app which needs to combine a number of stereo sounds into one stereo output, so I want an equivalent of gstreamer's audiomixer element, but there doesn't seem to be one in WebAudio. ChannelMerger doesn't do quite the same thing - it combines multiple mono signals into one multi-channel signal. The documentation for AudioNode.connect says that you can connect an output to multiple inputs of other nodes and that attempts to connect the same output to the same input more than
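In the Web Audio API, connecting several outputs to the same input sums them, so a plain GainNode can serve as the mix bus. A minimal sketch, assuming `ctx` is an AudioContext and `sources` is an array of stereo nodes:

```javascript
// Sketch, assuming `ctx` is an AudioContext and `sources` is an array of
// stereo AudioNodes: multiple connections into one input are summed, so a
// single GainNode works as the mix bus (no ChannelMerger needed).
const mixBus = ctx.createGain();
mixBus.gain.value = 1 / sources.length;   // optional: scale down to avoid clipping
for (const src of sources) {
  src.connect(mixBus);
}
mixBus.connect(ctx.destination);
```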

Alternate for Web Audio API

故事扮演 Submitted on 2021-01-28 12:24:44
Question: I have a web program which makes use of the Web Audio API. The issue here is that I want to make it compatible with IE. Is there any alternative to the Web Audio API so that I can make the same code run on IE specifically? Answer 1: What are your needs? If you need to do dynamic synthesis, audio routing, etc., you will only be able to achieve that with the Web Audio API, so your IE users are out of luck. However, if all you need to do is play audio files, then I would recommend that you use howler.js.
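For the simple-playback case the answer points at, a minimal howler.js sketch looks like the following (assuming the library is loaded; the file name is a placeholder). howler.js falls back to HTML5 Audio on browsers without Web Audio support:

```javascript
// Minimal howler.js playback; 'sound.mp3' is a hypothetical file name.
const sound = new Howl({
  src: ['sound.mp3'],
  html5: true          // force the HTML5 Audio path if desired
});
sound.play();
```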

Web Audio Api precise looping in different browsers

余生颓废 Submitted on 2021-01-28 11:16:08
Question: So what I want is constant looping that interchanges between different audio sources. For demo purposes I made a little puzzle game - you align numbers in order from 0 to 8, and depending on how you align them, different loops play. I managed to get the result I want in Chrome, but not in Safari or Firefox. I tried adding a different audio destination or multiple audio contexts, but no matter what, the loop just stops after one iteration in Safari and other browsers except for Chrome.
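One browser-independent pattern (an assumption here, not taken from the question) is to schedule every iteration against the AudioContext clock instead of reacting to onended. A rough sketch, assuming `ctx` is an AudioContext and `pickBuffer()` returns the decoded AudioBuffer that should play next:

```javascript
// Rough sketch: each iteration is scheduled on the audio clock, so the loop
// boundary is sample-accurate regardless of browser timer behavior.
let nextStart = ctx.currentTime + 0.1;          // small lead-in

function scheduleLoop(pickBuffer) {
  const src = ctx.createBufferSource();
  src.buffer = pickBuffer();                    // choose which loop plays next
  src.connect(ctx.destination);
  src.start(nextStart);                         // exact start time
  nextStart += src.buffer.duration;             // the next loop begins right here
  const msUntilNext = (nextStart - ctx.currentTime) * 1000;
  setTimeout(() => scheduleLoop(pickBuffer), Math.max(0, msUntilNext - 200));
}
```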

Web Audio API: Layout to Achieve Panning for an Arbitrary Number of Sources

只愿长相守 Submitted on 2021-01-28 09:20:30
Question: I am trying to achieve user-controlled panning for any number of simultaneous web audio sources. The sources themselves are mono. I'm working in JavaScript with the Web Audio API (https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API). Currently, the problem I'm running into is that I'm trying to use a multi-channel output (one for each source), but the channel interpretation is overriding my attempts at panning (see https://developer.mozilla.org/en-US/docs/Web/API/AudioNode
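A layout that sidesteps the channel-interpretation rules is to give each mono source its own StereoPannerNode and sum everything into the ordinary stereo destination. A sketch, under the assumption that `ctx` is an AudioContext and `sourceNodes` is an array of mono nodes:

```javascript
// Sketch: one StereoPannerNode per source, all summed at the stereo destination.
const panners = sourceNodes.map((src) => {
  const pan = ctx.createStereoPanner();
  src.connect(pan);
  pan.connect(ctx.destination);
  return pan;            // keep the handle so UI code can set pan.pan.value
});
```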

How to play multiple AudioBufferSourceNode synchronized?

被刻印的时光 ゝ Submitted on 2021-01-28 03:56:03
Question: I have multiple audio files that must be played in sync. I have read that the Web Audio API is the best solution for this. But I can't find any document that shows how to achieve this. Almost all articles I have read do this to start playback. //Let's say I have AudioBufferSourceNodes connected to two buffers var source1, source2; source1.start(0); source2.start(0); Shouldn't this cause source2 to start playing slightly later than source1? Also, what makes the sources stay in sync? I cannot find
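The usual answer is that both sources share the same AudioContext sample clock, so passing one explicit start time keeps them aligned. A minimal sketch, assuming `source1` and `source2` live on the same context `ctx`:

```javascript
// Sketch: one explicit start time on the shared audio clock.
const when = ctx.currentTime + 0.1;   // small safety margin for scheduling
source1.start(when);
source2.start(when);
```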

Fetch external audio file into Web Audio API buffer - CORS error?

假装没事ソ Submitted on 2021-01-28 01:34:07
Question: I'm working on a project which I hoped would take random selections from the xeno-canto archive to create a 'virtual dawn chorus'. I'm doing this in JavaScript with the Web Audio API. I have a list of samples and their URLs, such as 'xeno-canto.org/sounds/uploaded/YQNGFTBRRT/XC144576-ABTO_BWRNWR_15Apr2013_Harter.mp3', which I'm trying to load into an audio buffer. This is my 'loadAudio' function in JavaScript. I've successfully used this in other projects to get my audio - although that was from
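A typical loader looks like the sketch below (assuming `ctx` is an AudioContext). Note that CORS cannot be worked around client-side: the remote server must send Access-Control-Allow-Origin, or the request has to go through a proxy you control:

```javascript
// Sketch of a loader: fetch the file, then decode it into an AudioBuffer.
// `ctx` is assumed to be an existing AudioContext.
async function loadAudio(url) {
  const response = await fetch(url, { mode: 'cors' });
  const arrayBuffer = await response.arrayBuffer();
  return ctx.decodeAudioData(arrayBuffer);   // resolves with an AudioBuffer
}
```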

Understanding AudioBuffer to ArrayBuffer conversion

早过忘川 Submitted on 2021-01-27 14:42:12
Question: I have an AudioBuffer client-side that I'd like to AJAX to an Express server. This link shows how an XMLHttpRequest can send/receive binary data - ArrayBuffer. An ArrayBuffer is different from an AudioBuffer (or so I believe), as I decoded an ArrayBuffer to make the AudioBuffer in the first place. This was done using decodeAudioData() as part of the Web Audio API. So my question is, can I convert an AudioBuffer back to an ArrayBuffer? If this is possible, I'd like to then send the ArrayBuffer to
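You cannot recover the original compressed ArrayBuffer, but the decoded PCM inside the AudioBuffer can be copied into a plain ArrayBuffer and sent. A minimal, single-channel sketch (a simplification, not a full WAV encoder):

```javascript
// Sketch: ship the decoded PCM of one channel as a plain ArrayBuffer.
// This is raw Float32 data, not the original encoded file.
function audioBufferToArrayBuffer(audioBuffer) {
  const channel = audioBuffer.getChannelData(0);  // Float32Array view
  return channel.slice().buffer;                  // copy into a fresh ArrayBuffer
}
```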

Chrome extension: Prevent chrome.tabCapture.capture choppy sound by increasing buffer size?

ⅰ亾dé卋堺 Submitted on 2021-01-27 13:48:07
Question: It seems like audio capturing using chrome.tabCapture.capture can produce some choppy sounds. There is already a bug report for this. Is it possible to increase the buffer that receives the captured stream in order to prevent the stutter, or does the tabCapture method already define a buffer? Basic capturing: chrome.tabCapture.capture({ audio: true, video: false }, function (stream) { var ctx = new AudioContext(); var output = ctx.createMediaStreamSource(stream); output.connect(ctx
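chrome.tabCapture itself exposes no buffer-size option. One thing worth trying (an assumption, not a documented fix for the linked bug) is asking the playback AudioContext for larger internal buffers via the latencyHint option:

```javascript
// Sketch (an assumption, not a documented fix): prefer larger buffers over
// low latency on the playback side.
chrome.tabCapture.capture({ audio: true, video: false }, function (stream) {
  const ctx = new AudioContext({ latencyHint: 'playback' });
  const output = ctx.createMediaStreamSource(stream);
  output.connect(ctx.destination);
});
```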