html5-audio

How to get audio element?

浪尽此生 submitted on 2019-12-03 14:49:18
Using React, I wish to get the audio element.

    var AudioPlayer = React.createClass({
      componentDidMount: function () {
        console.info('Audio: component did mount');
        var audio = React.findDOMNode('audio');
        console.info('audio', audio);
      },
      render: function() {
        return (
          <audio src="/static/music/foo.mp3" controls />
        );
      }
    });

But I keep receiving the error:

    Error: Invariant Violation: Element appears to be neither ReactComponent nor DOMNode (keys: 0,1,2,3,4)

Surely lowered components are React classes? It works using the component references:

    var AudioPlayer = React.createClass({
      componentDidMount:
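
The error comes from passing the string 'audio' to React.findDOMNode, which expects a component instance or a DOM node. A minimal sketch of a ref-based alternative, assuming a React 0.13-era API to match the createClass style above (the ref name is illustrative):

    // A sketch, not the asker's final code: use a ref so findDOMNode gets a component, not a string.
    var AudioPlayer = React.createClass({
      componentDidMount: function () {
        // this.refs.audio is the rendered <audio> component; findDOMNode resolves it to the DOM node.
        var audio = React.findDOMNode(this.refs.audio);
        console.info('audio element:', audio);
      },
      render: function () {
        return <audio ref="audio" src="/static/music/foo.mp3" controls />;
      }
    });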

Play audio using base64 data in html5

南楼画角 submitted on 2019-12-03 14:25:45
I have created an application that records audio with the native MediaPlayer, converts the audio file into base64 data, and passes it to an HTML5 audio tag as below:

    File file = new File(Environment.getExternalStorageDirectory() + "/" + "audiofile" + "/" + "myAudio.mp3");
    byte[] FileBytes = getBytesFromFile(file);
    String base64 = Base64.encodeToString(FileBytes, Base64.NO_WRAP).toString();

    public static byte[] getBytesFromFile(java.io.File file) throws IOException {
        InputStream is = new FileInputStream(file);
        long length = file.length();
        if (length > Integer.MAX_VALUE) { }
        byte[] bytes = new byte[(int
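
On the web side, a base64 string can be handed to an audio element through a data: URI. A minimal sketch, assuming the encoded MP3 ends up in a JavaScript variable (base64Audio is an illustrative name, not from the question):

    // Sketch: embed the base64-encoded bytes directly in the element's src.
    var audio = document.createElement('audio');
    audio.src = 'data:audio/mpeg;base64,' + base64Audio;
    audio.controls = true;
    document.body.appendChild(audio);
    audio.play();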

Overlay two audio buffers into one buffer source

末鹿安然 submitted on 2019-12-03 12:48:06
Trying to merge two buffers into one; I have been able to create the two buffers from the audio files and load and play them. Now I need to merge the two buffers into one buffer. How can they be merged?

    context = new webkitAudioContext();
    bufferLoader = new BufferLoader(
      context,
      [
        'audio1.mp3',
        'audio2.mp3',
      ],
      finishedLoading
    );
    bufferLoader.load();

    function finishedLoading(bufferList) {
      // Create the two buffer sources and play them both together.
      var source1 = context.createBufferSource();
      var source2 = context.createBufferSource();
      source1.buffer = bufferList[0];
      source2.buffer =
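
One way to end up with a single buffer, rather than two sources started together, is to sum the decoded sample data into a new AudioBuffer. A sketch under that assumption (function and variable names are illustrative, and both inputs are assumed to share a sample rate):

    // Sketch: additively mix two decoded AudioBuffers into one.
    function mixBuffers(context, buffer1, buffer2) {
      var channels = Math.min(buffer1.numberOfChannels, buffer2.numberOfChannels);
      var length = Math.max(buffer1.length, buffer2.length);
      var mixed = context.createBuffer(channels, length, context.sampleRate);
      for (var ch = 0; ch < channels; ch++) {
        var out = mixed.getChannelData(ch);
        var a = buffer1.getChannelData(ch);
        var b = buffer2.getChannelData(ch);
        for (var i = 0; i < length; i++) {
          // Simple additive mix; clipping is possible if both signals are loud.
          out[i] = (i < a.length ? a[i] : 0) + (i < b.length ? b[i] : 0);
        }
      }
      return mixed;
    }

    // Usage: play the mixed result through a single source node.
    var source = context.createBufferSource();
    source.buffer = mixBuffers(context, bufferList[0], bufferList[1]);
    source.connect(context.destination);
    source.start(0);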

Play music from bytearray in html5

余生颓废 submitted on 2019-12-03 12:47:10
Is there any way to play music from bytes instead of a file in HTML5? I need to stream music bytes and play them live. Please check this:

    var dogBarkingBuffer = null;

    // Fix up prefixing
    window.AudioContext = window.AudioContext || window.webkitAudioContext;
    var context = new AudioContext();

    function loadDogSound(url) {
      var request = new XMLHttpRequest();
      request.open('GET', url, true);
      request.responseType = 'arraybuffer';

      // Decode asynchronously
      request.onload = function() {
        context.decodeAudioData(request.response, function(buffer) {
          dogBarkingBuffer = buffer;
        }, onError);
      }
      request.send()
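
If the bytes are already in memory (for example received over a WebSocket) rather than fetched by URL, they can be decoded directly. A minimal sketch, assuming the data arrives as a Uint8Array (rawBytes is an illustrative name):

    // Sketch: decode in-memory bytes and play them, instead of XHR-ing a file.
    window.AudioContext = window.AudioContext || window.webkitAudioContext;
    var context = new AudioContext();

    function playBytes(rawBytes) {
      // decodeAudioData expects an ArrayBuffer, so pass the view's underlying buffer.
      context.decodeAudioData(rawBytes.buffer, function (decoded) {
        var source = context.createBufferSource();
        source.buffer = decoded;
        source.connect(context.destination);
        source.start(0);
      }, function (err) {
        console.error('decode failed', err);
      });
    }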

Playing audio file returns “Uncaught (in promise)” but works in console

不羁的心 submitted on 2019-12-03 12:32:54
I'm trying to play audio files (I've tried many). All of them are MP3s. I've tested the following on both MAMP localhost and also by just running it in the browser. I use the following JavaScript:

    var testSound = new Audio();
    testSound.src = "a.mp3"
    setTimeout(testSound.play.bind(testSound), 100)

This returns the error: Uncaught (in promise). Trying to catch it:

    var testSound = new Audio();
    testSound.src = "a.mp3"
    setTimeout(playSound, 100)

    function playSound () {
      testSound.play().then(response => {
      }).catch(e => {
        console.log(e);
      })
    }

returns nothing ( "" ). But if I now turn to the console and
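
This is typically the browser's autoplay policy rejecting a play() call that was not triggered by a user gesture, which is also why the same call works when typed into the console. A sketch of starting playback from a gesture and logging the rejection reason, under that assumption:

    // Sketch: start playback from a click and inspect why play() was rejected.
    var testSound = new Audio('a.mp3');

    document.addEventListener('click', function startOnGesture() {
      testSound.play()
        .then(function () {
          console.log('playback started');
        })
        .catch(function (err) {
          // err.name is typically "NotAllowedError" (no user gesture yet)
          // or "NotSupportedError" (the source could not be loaded).
          console.log('play() rejected:', err.name, err.message);
        });
      document.removeEventListener('click', startOnGesture);
    });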

How do I play a sound when an element changes, like SO Chat does?

夙愿已清 submitted on 2019-12-03 12:01:59
I want a sound to play when an element changes on a page. I know how to do this, but I can't get it to play only on the first change and not again until the user focuses the window (tab) and then blurs it again. My current code:

    var notif = new Audio('http://cycle1500.com/sounds/infbego.wav');
    if (window.innerHeight === window.outerHeight) {
      $(window).bind('DOMNodeInserted', function() {
        notif.play();
      });
    }

Use a variable to represent whether the sound should be played or not.

    var shouldPlayAlertSound = true,
        notif = new Audio('http://cycle1500.com/sounds/infbego.wav');
    if (window
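
A sketch of how the flag suggested above could be wired up, assuming jQuery as in the question; resetting the flag on blur is one interpretation of "re-arm after the user focuses the tab and blurs it again":

    // Sketch: a flag gates the alert sound so only the first change makes noise.
    var shouldPlayAlertSound = true;
    var notif = new Audio('http://cycle1500.com/sounds/infbego.wav');

    $(window).on('DOMNodeInserted', function () {
      if (shouldPlayAlertSound) {
        notif.play();
        shouldPlayAlertSound = false;   // only the first change plays a sound
      }
    });

    // Re-arm the alert once the user has visited the tab and left it again.
    $(window).on('blur', function () {
      shouldPlayAlertSound = true;
    });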

understanding getByteTimeDomainData and getByteFrequencyData in web audio

狂风中的少年 submitted on 2019-12-03 11:53:31
Question: The documentation for both of these methods is very generic wherever I look. I would like to know what exactly I'm looking at in the arrays returned by each method. For getByteTimeDomainData, what time period is covered by each pass? I believe most oscilloscopes cover a 32 millisecond span per pass. Is that what is covered here as well? For the actual element values themselves, the range seems to be 0 - 255. Is this equivalent to -1 - +1 volts? For getByteFrequencyData
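
A sketch of how the two calls relate to fftSize and the byte range, assuming a default AnalyserNode; the scaling figures in the comments follow the Web Audio spec, and the 46 ms example assumes a 44.1 kHz sample rate:

    // Sketch: poll both arrays from an AnalyserNode (a source is assumed to be connected to it).
    var context = new (window.AudioContext || window.webkitAudioContext)();
    var analyser = context.createAnalyser();
    analyser.fftSize = 2048;                                     // 2048 samples per pass

    var timeData = new Uint8Array(analyser.fftSize);
    var freqData = new Uint8Array(analyser.frequencyBinCount);   // fftSize / 2 bins

    function poll() {
      // Time domain: each pass covers fftSize / sampleRate seconds
      // (2048 / 44100 ≈ 46 ms here). Byte values map the waveform's
      // -1..+1 range onto 0..255, with 128 representing silence.
      analyser.getByteTimeDomainData(timeData);

      // Frequency domain: byte values map the decibel range
      // [minDecibels, maxDecibels] (default -100..-30 dB) onto 0..255,
      // with one bin per sampleRate / fftSize Hz.
      analyser.getByteFrequencyData(freqData);

      requestAnimationFrame(poll);
    }
    poll();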

Firefox WebAudio createMediaElementSource not working

99封情书 submitted on 2019-12-03 11:02:19
I'm using the Web Audio API with a new Audio() object as the source. The following is a simplified version of what I am doing. This, however, doesn't play any sound in Firefox 25.0.1.

    var context;
    if (window.webkitAudioContext) {
      context = new webkitAudioContext();
    } else {
      context = new AudioContext();
    }

    var audio = new Audio();
    // This file does seem to have a CORS header
    audio.src = "http://upload.wikimedia.org/wikipedia/en/4/45/ACDC_-_Back_In_Black-sample.ogg";

    var source;
    function onCanPlay() {
      console.log("can play called");
      source = context.createMediaElementSource(audio);
      source.connect(context
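
A sketch of a Firefox-friendly setup, under the assumption that the silence comes from the cross-origin source: request CORS on the element and wire the graph up before calling play() (the URL is the one from the question):

    // Sketch: mark the element as a CORS request and connect the graph before playback.
    var context = new (window.AudioContext || window.webkitAudioContext)();

    var audio = new Audio();
    audio.crossOrigin = "anonymous";   // without CORS, a cross-origin element source outputs silence
    audio.src = "http://upload.wikimedia.org/wikipedia/en/4/45/ACDC_-_Back_In_Black-sample.ogg";

    var source = context.createMediaElementSource(audio);
    source.connect(context.destination);

    audio.addEventListener("canplay", function () {
      audio.play();
    });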

How to design customized audio player with HTML

有些话、适合烂在心里 submitted on 2019-12-03 10:01:17
Question: I have a layout for an audio player that I'd like to use with the HTML audio player element. I was trying <audio></audio>, and it's giving me the default player. Is there any way to change the style of the player to use the layout that I want?

Answer 1: You can whip up a very nice looking set of custom audio controls for the HTML5 audio player pretty quickly. (Mostly) basic HTML and CSS, with some light JavaScript event handling, is all that's required. This solution is a fully
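
A minimal sketch of the event wiring such custom controls need, assuming an <audio> element without the controls attribute plus a button and a range input for seeking (the element ids are illustrative, not from the answer):

    // Sketch: drive a bare <audio id="player"> from your own styled controls.
    var player = document.getElementById('player');
    var playBtn = document.getElementById('play');
    var seekBar = document.getElementById('seek');   // an <input type="range" min="0" max="100">

    playBtn.addEventListener('click', function () {
      if (player.paused) {
        player.play();
        playBtn.textContent = 'Pause';
      } else {
        player.pause();
        playBtn.textContent = 'Play';
      }
    });

    // Keep the range input in sync with playback progress.
    player.addEventListener('timeupdate', function () {
      seekBar.value = (player.currentTime / player.duration) * 100 || 0;
    });

    // Let the user scrub by dragging the range input.
    seekBar.addEventListener('input', function () {
      if (player.duration) {
        player.currentTime = (seekBar.value / 100) * player.duration;
      }
    });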

Automatic audio sprite generator for web/javascript?

霸气de小男生 submitted on 2019-12-03 08:43:51
Audio sprites (several audio bites concatenated into one audio file) are getting common in JavaScript control of audio on the web. However, it takes quite a lot of "stupid" work to create and implement an audio sprite. Is there a tool or method by which you could do it automatically instead of "manually"? E.g. given a folder with audio files, I want a tool that generates:

- An audio file with all the contents, preferably separated by a bit of silence.
- The onset and offset timings (in milliseconds) of each soundbite in the audio file.

Preferably, it would output the JavaScript sprite code itself!
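
For context on what the generated timings are used for, here is a sketch of the consuming side, assuming the tool emits a map of { name: [onsetMs, offsetMs] } pairs (the file name, clip names, and timings below are illustrative):

    // Sketch: play one clip out of a concatenated sprite file using onset/offset timings.
    var sprite = new Audio('sprites.mp3');
    var spriteMap = {
      bark: [0, 850],
      meow: [1350, 2100]
    };

    function playClip(name) {
      var clip = spriteMap[name];
      // Assumes the sprite's metadata has loaded so seeking is allowed.
      sprite.currentTime = clip[0] / 1000;          // seek to the clip's onset
      sprite.play();

      var stopAt = clip[1] / 1000;
      function checkTime() {
        // timeupdate fires a few times per second, so the cut-off is approximate.
        if (sprite.currentTime >= stopAt) {
          sprite.pause();
          sprite.removeEventListener('timeupdate', checkTime);
        }
      }
      sprite.addEventListener('timeupdate', checkTime);
    }

    playClip('bark');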