html5-audio

Load audio data into an AudioBufferSourceNode from an <audio/> element via createMediaElementSource?

两盒软妹~` submitted on 2019-11-30 01:51:42
Question: Is it possible to have an audio file loaded from an <audio/> element via createMediaElementSource and then load the audio data into an AudioBufferSourceNode? Using the audio element as a source (MediaElementSource) seems not to be an option, as I want to use buffer methods like noteOn and noteGrainOn. Loading the audio file directly into the buffer via XHR unfortunately isn't an option either (see Open stream_url of a Soundcloud Track via Client-Side XHR?). Loading the buffer contents from the
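Worth noting for anyone reading this today: noteOn/noteGrainOn were renamed in the Web Audio spec, and modern browsers expose them as start() on AudioBufferSourceNode. A minimal sketch of playing a decoded AudioBuffer, assuming you already have an AudioContext and a decoded buffer (the helper name playBuffer is just for illustration):

```javascript
// Minimal sketch: wire a decoded AudioBuffer to the output and start it.
// Assumes `ctx` is an AudioContext and `buffer` is a decoded AudioBuffer.
function playBuffer(ctx, buffer, when) {
  var source = ctx.createBufferSource(); // one-shot source node
  source.buffer = buffer;
  source.connect(ctx.destination);
  source.start(when || 0); // start() is the modern name for noteOn()
  return source; // keep a reference if you need stop() later
}
```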

How to get microphone input volume value with web audio api?

◇◆丶佛笑我妖孽 submitted on 2019-11-29 21:35:05
I am using the microphone input with the Web Audio API and need to get the volume value. Right now I have already got the microphone to work: http://updates.html5rocks.com/2012/09/Live-Web-Audio-Input-Enabled Also, I know there's a method for manipulating the volume of an audio file: http://www.html5rocks.com/en/tutorials/webaudio/intro/

```javascript
// Create a gain node.
var gainNode = context.createGain();
// Connect the source to the gain node.
source.connect(gainNode);
// Connect the gain node to the destination.
gainNode.connect(context.destination);
// Reduce the volume.
gainNode.gain.value = 0.5;
```

But how to
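A common way to read the microphone level (a sketch, not the only approach): route the mic source through an AnalyserNode, grab the time-domain bytes each frame, and compute their RMS. The math is plain JavaScript; only the analyser wiring (shown in comments) needs a browser:

```javascript
// getByteTimeDomainData() fills a Uint8Array with samples in 0..255,
// where 128 is silence. RMS of the centered samples gives a 0..1 level.
function rmsVolume(timeDomainBytes) {
  var sumSquares = 0;
  for (var i = 0; i < timeDomainBytes.length; i++) {
    var centered = (timeDomainBytes[i] - 128) / 128; // map to -1..1
    sumSquares += centered * centered;
  }
  return Math.sqrt(sumSquares / timeDomainBytes.length);
}

// In the browser (sketch):
//   var analyser = context.createAnalyser();
//   source.connect(analyser);
//   var data = new Uint8Array(analyser.fftSize);
//   analyser.getByteTimeDomainData(data);
//   var level = rmsVolume(data); // poll this from requestAnimationFrame
```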

WebRTC Live Audio Streaming/Broadcast [closed]

倾然丶 夕夏残阳落幕 submitted on 2019-11-29 21:08:34
I'm trying to get my head round WebRTC. I need to be able to capture and stream live audio through a web browser. I'm just having difficulty finding code examples that I can understand or that are up to date. If anyone could help me with just capturing and playing audio in the same browser with HTML5/WebRTC, I think that would help me get started and along my way. Note: I'm only concerned about getting this to work in Chrome (or Chrome Canary, for that matter!). Thanks for any help! The HTML5 Rocks article on WebRTC is probably the best intro article that explains everything in layman's

What does the FFT data in the Web Audio API correspond to?

陌路散爱 submitted on 2019-11-29 20:18:53
I've used the FFT data from the AnalyserNode's getByteFrequencyData method in the Web Audio API to create a spectrum visualizer, as shown below. In this instance I have 256 bins of data. What exactly do the numbers correspond to? Is it the decibel level of each frequency component? If so, how do I know what frequency each bin corresponds to? I would like to know so I can experiment with building a graphic EQ, and so would like to know at which points to indicate the filter bands. Ideally I'd like to represent frequencies from 20Hz to 20kHz and plot intervals
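On the two axes: the byte values are each bin's magnitude in decibels, scaled linearly from the analyser's minDecibels..maxDecibels range (defaults -100 and -30 dB) into 0..255. On the frequency axis, getByteFrequencyData returns frequencyBinCount = fftSize/2 bins spanning 0 Hz up to the Nyquist frequency (half the sample rate), so 256 bins implies an FFT size of 512. A sketch of the mapping (a 44100 Hz sample rate is an assumption here; read context.sampleRate in practice):

```javascript
// Center frequency (Hz) of FFT bin i for a given sample rate and FFT size.
// The analyser exposes fftSize/2 bins covering 0 Hz .. sampleRate/2.
function binFrequency(i, sampleRate, fftSize) {
  return i * sampleRate / fftSize;
}

// Example: with sampleRate 44100 and fftSize 512 (256 bins), each bin is
// about 86.1 Hz wide — bin 0 sits at 0 Hz, bin 255 near 21964 Hz.
```

Note that for a 20Hz–20kHz graphic EQ you would normally place the band markers on a logarithmic scale, so the low bands fall inside the first few linear bins.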

HTML5 audio duration issue

半世苍凉 submitted on 2019-11-29 18:49:23
I am using jPlayer to play audio and video files on iPad, but the issue is that if I change the video or audio URL, I sometimes get a duration of NaN. Please help me. seteh: The metadata is available after the loadedmetadata event has fired. (c) https://stackoverflow.com/a/7275714/492641 Hope it helps. Check that your server has Range requests enabled. From this page on the jPlayer site: Your server must enable Range requests. This is easy to check for by seeing whether your server's response includes Accept-Ranges in its headers. Most HTML5 browsers enable seeking to new file positions during a download, so the server
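The loadedmetadata advice can be sketched as a small helper: don't read duration until the metadata has arrived, or until readyState already reports HAVE_METADATA. The helper name onDurationReady is just for illustration:

```javascript
// Call cb(duration) once the element's metadata is available.
// readyState >= 1 means HAVE_METADATA, i.e. duration is already known.
function onDurationReady(audio, cb) {
  if (audio.readyState >= 1 && !isNaN(audio.duration)) {
    cb(audio.duration); // metadata already loaded
  } else {
    audio.addEventListener('loadedmetadata', function handler() {
      audio.removeEventListener('loadedmetadata', handler);
      cb(audio.duration);
    });
  }
}
```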

How to set the duration of audio

落爺英雄遲暮 submitted on 2019-11-29 16:35:07
I am trying to set the duration of an audio tag using the HTML DOM duration property. I have tried the following, but it doesn't seem to work: $('audio')[0].duration = 1; I've gone through other answers but couldn't see any that make use of the duration property. If the duration property is read-only, what other method does that leave me with? You cannot change the duration, as it is locked to the original data (which cannot be changed via the audio element). You can achieve the illusion of a different duration, though, by restricting playback: monitor the time and pause the audio
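The "illusion of a shorter duration" idea from the answer can be sketched as a timeupdate watcher that pauses playback once a limit is reached (limitDuration is an illustrative name):

```javascript
// Pause the element once playback reaches maxSeconds, faking a shorter clip.
function limitDuration(audio, maxSeconds) {
  function onTime() {
    if (audio.currentTime >= maxSeconds) {
      audio.pause();
      audio.removeEventListener('timeupdate', onTime); // one-shot limit
    }
  }
  audio.addEventListener('timeupdate', onTime);
}
```

Keep in mind timeupdate only fires a few times per second, so the cutoff is approximate; for sample-accurate control you would need the Web Audio API instead.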

Recording audio in Chrome for Android using web audio API and navigator.getUserMedia

我是研究僧i submitted on 2019-11-29 16:11:41
Chrome for Android versions 30 and 31 beta on Android 4.1 do not appear to correctly record audio using HTML5 Web Audio and navigator.webkitGetUserMedia. (Chrome 30+ on Android is supposed to support these APIs.) The symptom is that the code appears to work correctly, including displaying the prompt asking whether or not to allow microphone access, but the recorded data contains nothing but zeros. I created a simplified test case (go to http://jsfiddle.net/JCFtK/ , click the Record button, then choose the appropriate option to allow it to access your microphone). The key part of the code is below (

HTML5 audio ended function [duplicate]

老子叫甜甜 submitted on 2019-11-29 15:45:27
Possible duplicate: HTML5 audio element with dynamic source. I'm trying to get the player to reload a new track once the current one ends, but I think I might have done something wrong. This is the player:

```html
<audio id="audio" autoplay controls="controls" onclick="start()">
  <source src="song.php" type="audio/mpeg" />
</audio>
```

Here's the script:

```javascript
function start() {
  var audio = document.getElementById('audio');
  audio.play();
  audio.addEventListener('ended', function () {
    $.post("song.php", function (result) {
      audio.src = result;
      audio.pause();
      audio.load();
      audio.play();
    });
  });
}
```

If I change the script to
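Two likely problems in the script above: the ended listener is attached inside start(), so every click stacks another listener, and pause() is called right after setting the new source. A sketch of attaching the handler once (setupAutoAdvance and getNextSrc are illustrative names; in the real page the next URL would come from the $.post callback):

```javascript
// Attach once: when a track ends, swap in the next source and play it.
function setupAutoAdvance(audio, getNextSrc) {
  audio.addEventListener('ended', function () {
    audio.src = getNextSrc(); // e.g. the result of your song.php request
    audio.load();             // pick up the new src
    audio.play();
  });
}
```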

HTML (or R Shiny) Audio Caching [closed]

三世轮回 submitted on 2019-11-29 14:55:00
After figuring out how to make a Shiny server play a wav file, I created a Shiny server that dynamically creates wav files based on reactive input. However, the first wav file gets cached, and despite changes to the file, or even renaming the file, only the first wav is played until a full page refresh. How can I play a changed wav file after it is changed through an HTML-based Shiny app? I understand that this question has been asked (and solved) for HTML developers in a few places, using JavaScript or jQuery/PHP or a server-side solution, but I haven't figured out how to make any of these
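The usual JavaScript-side fix for this kind of caching is to make each URL unique, e.g. by appending a timestamp query parameter, so the browser treats the regenerated wav as a new resource (a sketch; bustCache is an illustrative name):

```javascript
// Append a cache-busting query parameter so the browser refetches the file.
function bustCache(url, stamp) {
  var sep = url.indexOf('?') === -1 ? '?' : '&';
  return url + sep + 'v=' + stamp;
}

// In the page (sketch):
//   audio.src = bustCache('sound.wav', Date.now());
//   audio.load();
```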

Cross-browser and cross-device audio

霸气de小男生 submitted on 2019-11-29 14:46:59
Question: Is there any way to use audio in a web app so that it will work in most browsers (Chrome, Firefox, Safari and IE9+) and on most devices (Android/iOS for mobile would do)? My requirements: I would need basic preloading, playing multiple sounds at once, muting them on click, and perhaps looping (probably not seamlessly, right? ;) ). What I've learned so far: the Web Audio API won't work on IE or any Android (link); with the audio tag I get different durations for the same audio on Safari and Chrome; I checked SoundManager2 but it doesn't work everywhere (throws HTML5 error code 4 for me); SoundJS seems to work