webm

Merge WAV audio and WebM video

拟墨画扇 submitted on 2019-12-04 12:30:52
Is there a way to merge an audio file (WAV) and a video (WebM) on a Node.js server? Since WebM is a container format, I hope it is possible to add an audio track to an existing WebM file. Am I right? Does anyone know a Node.js package for doing this? Found a solution, but it is not really simple to do. Doing this requires ffmpeg (or similar). To install it I did these steps (macOS only): install Homebrew, then run the installation of ffmpeg with all the dependencies it requires: sudo brew install ffmpeg --with-libvpx --with-theora --with-libogg --with-libvorbis. Now we can merge an audio and a video file with a single ffmpeg command.
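The excerpt stops before the actual command, so the following is only a rough sketch of how such a merge is commonly invoked from Node (ffmpeg on the PATH is assumed, and video.webm / audio.wav are placeholder file names). The video stream is copied as-is and the WAV is re-encoded to Vorbis, since WebM cannot carry raw PCM:

    const { execFile } = require('child_process');

    // Mux an existing WebM video with a WAV audio track into a new WebM file.
    execFile('ffmpeg', [
      '-i', 'video.webm',
      '-i', 'audio.wav',
      '-map', '0:v:0',        // take video from the first input
      '-map', '1:a:0',        // take audio from the second input
      '-c:v', 'copy',         // keep the VP8/VP9 video untouched
      '-c:a', 'libvorbis',    // encode the WAV to Vorbis for the WebM container
      '-shortest',            // stop when the shorter input ends
      'output.webm'
    ], (err) => {
      if (err) throw err;
      console.log('merged into output.webm');
    });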

Why does FFMPEG always make large WebM files?

我的未来我决定 submitted on 2019-12-04 10:45:30
Question: I'm trying to encode my movies into WebM: ffmpeg -i input.MOV -codec:v libvpx -quality good -cpu-used 0 -b:v 10k -qmin 10 -qmax 42 -maxrate 10k -bufsize 20k -threads 8 -vf scale=-1:1080 -codec:a libvorbis -b:a 192k output.webm I want to encode at a couple of different bit rates (video and audio combined): 2192 kbps, 1692 kbps and 1000 kbps. The problem is that no matter which bit rates I enter, I always get a file with a bit rate higher than 1900 kbps (1914 kbps with the command above).
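For orientation only, since the answer to this question is not included in the excerpt: below is a minimal sketch of a plain average-bitrate VP8 encode driven from Node, with the -qmin/-qmax bounds from the question left out so the rate control is free to aim at the requested average. Whether that addresses the poster's issue is not established here; file names and the 2000k target are placeholders:

    const { execFile } = require('child_process');

    // Roughly 2000k video + 192k audio for a ~2.2 Mbps combined target.
    execFile('ffmpeg', [
      '-i', 'input.MOV',
      '-c:v', 'libvpx',
      '-b:v', '2000k',
      '-vf', 'scale=-1:1080',
      '-c:a', 'libvorbis',
      '-b:a', '192k',
      'output.webm'
    ], (err) => {
      if (err) throw err;
      console.log('encode finished');
    });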

Video element disappears in Chrome when not using controls

一世执手 submitted on 2019-12-04 04:21:20
So - I think this is a browser bug. It came up in a much more complicated design/site, but I've had a good solid fiddle around, simplified my code and designs, etc., and have found the following: when embedding <video> without a controls attribute in Chrome, triggering the video to play using JavaScript causes the video element to go blank. http://jsfiddle.net/trolleymusic/2fHTv/ The blankness is a bit random: sometimes, by rolling out of the element, it'll reappear; sometimes you need to click/focus on something else; and most of the time pausing the video will cause it to reappear.
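A minimal reproduction of the behaviour described above (the element IDs are illustrative, not from the fiddle): a controls-less video started from script. Per the report, re-adding the controls attribute, or pausing, makes the element render again.

    // Markup: <video id="clip" src="movie.webm"></video>  (note: no "controls")
    var video = document.getElementById('clip');

    document.getElementById('playBtn').addEventListener('click', function () {
      video.play();   // in the affected Chrome builds the element may go blank here
    });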

webm / vp8 player for java

ぐ巨炮叔叔 submitted on 2019-12-04 03:18:36
Does anyone know of a Java library that plays VP8 or WebM videos? Thanks! VLC can play WebM and VP8 videos since version 1.1.0, and there are Java bindings available for it. Have a look at: jVLC: http://wiki.videolan.org/Java_bindings VLCJ: http://code.google.com/p/vlcj/ I've used jVLC and it works, but it is not actively maintained anymore. VLCJ looks very good. http://sourceforge.net/projects/javavp8decoder/ is beta, but maybe it's a start. I don't know of any native implementations (yet), but there are plugins for gstreamer and ffmpeg; both have very good Java wrappers.

Video with transparency on Android

时光毁灭记忆、已成空白 submitted on 2019-12-04 02:50:20
Question: Is there any way to have Android play video with transparent areas? When I try to play a WebM video containing transparent areas in VideoView, the background of the view remains black. Instead of black I'd expect to see the background of the parent view show through on the transparent areas. The only working solution I've found so far is to create a drawable animation out of the video frames, which isn't very memory efficient. Answer 1: I know it's a bit late, but perhaps it can help nevertheless.

vue-video-player

眉间皱痕 submitted on 2019-12-03 22:42:43
yarn add vue-video-player

main.js:

    import VideoPlayer from 'vue-video-player'
    require('video.js/dist/video-js.css')
    require('vue-video-player/src/custom-theme.css')
    Vue.use(VideoPlayer)

The full code is as follows:

    <template>
      <div>
        <!-- https://github.surmon.me/vue-video-player/ -->
        <video-player class="vjs-custom-skin"
                      ref="videoPlayer"
                      :options="playerOptions"
                      :playsinline="true"
                      @play="onPlayerPlay($event)"
                      @pause="onPlayerPause($event)"
                      @ended="onPlayerEnded($event)"
                      @loadeddata="onPlayerLoadeddata($event)"
                      @waiting="onPlayerWaiting($event)"
                      @playing="onPlayerPlaying($event)"
                      @timeupdate="onPlayerTimeupdate($event
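The playerOptions object referenced by :options is cut off in the excerpt above. As a hypothetical example only (the option names follow video.js conventions; the URLs are placeholders), an entry in the component's data() for a WebM source might look like:

    playerOptions: {
      autoplay: false,
      muted: false,
      language: 'en',
      playbackRates: [0.5, 1.0, 1.5, 2.0],
      sources: [{
        type: 'video/webm',                      // use 'video/mp4' for MP4 files
        src: 'https://example.com/demo.webm'     // placeholder URL
      }],
      poster: 'https://example.com/poster.png'   // placeholder URL
    }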

Media Source Extensions appendBuffer of WebM stream in random order

雨燕双飞 submitted on 2019-12-03 21:40:12
I am trying to achieve video downloading in parallel from multiple sources. However, the MSE appendBuffer method always fails when not following the sequential order of the video file. I would like to append parts in random order and play the video "as soon as possible". I was exploring the SourceBuffer mode property as well as timestampOffset. Neither of those was helpful. I am wondering whether the source WebM file I have could be in a "not supported format" for such a task (the sequential approach works fine). source video file Thank you for any advice. UPDATE: I tried to analyse a well-known example video file and I figured out
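The update above is cut off. For reference, here is a bare-bones sketch of the sequential MSE flow the question says does work, showing where the mode property mentioned above plugs in; the MIME string and part URLs are assumptions, and the first part is expected to contain the WebM initialization data:

    var video = document.querySelector('video');
    var mediaSource = new MediaSource();
    video.src = URL.createObjectURL(mediaSource);

    mediaSource.addEventListener('sourceopen', function () {
      var sb = mediaSource.addSourceBuffer('video/webm; codecs="vp8,vorbis"');
      sb.mode = 'segments';   // 'sequence' would play chunks in append order instead of by timestamp

      var parts = ['/video/part0.webm', '/video/part1.webm', '/video/part2.webm'];
      var i = 0;

      function appendNext() {
        if (i >= parts.length) { mediaSource.endOfStream(); return; }
        fetch(parts[i++])
          .then(function (r) { return r.arrayBuffer(); })
          .then(function (buf) { sb.appendBuffer(buf); });   // one append per 'updateend'
      }

      sb.addEventListener('updateend', appendNext);
      appendNext();
    });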

Convert WebM as MP4 on the fly

笑着哭i submitted on 2019-12-03 20:39:39
I am trying to convert a remote WebM file on the fly to MP4. This should happen without writing anything to disk. Furthermore, it would be great to be able to stream out the result as soon as possible. This is my Flask function without the actual conversion, so you get an idea of the streaming:

    @app.route("/stream/mp4")
    def as_mp4():
        url = "http://video.webmfiles.org/big-buck-bunny_trailer.webm"
        r = requests.get(url, stream=True)

        def stream():
            # convert it here
            for chunk in r.iter_content(chunk_size=1024):
                yield chunk
            # end for
        # end def

        return Response(stream(), mimetype="video/mp4")
    # end def
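The excerpt stops before the conversion step. Purely as an illustration of the general pipe-through-ffmpeg idea (shown here in Node rather than Python; ffmpeg on the PATH, the libx264/aac codecs, and the fragmented-MP4 flag are all assumptions, not the poster's solution):

    const http = require('http');
    const { spawn } = require('child_process');

    http.createServer((req, res) => {
      // ffmpeg reads WebM from stdin and writes a streamable (fragmented) MP4 to stdout.
      const ff = spawn('ffmpeg', [
        '-i', 'pipe:0',
        '-c:v', 'libx264',
        '-c:a', 'aac',
        '-movflags', 'frag_keyframe+empty_moov',   // required when MP4 output is not seekable
        '-f', 'mp4',
        'pipe:1'
      ], { stdio: ['pipe', 'pipe', 'ignore'] });

      res.writeHead(200, { 'Content-Type': 'video/mp4' });
      http.get('http://video.webmfiles.org/big-buck-bunny_trailer.webm', (src) => {
        src.pipe(ff.stdin);    // download flows straight into ffmpeg
      });
      ff.stdout.pipe(res);     // converted bytes flow straight out to the client
    }).listen(3000);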

need to create a webm video from RGB frames

心不动则不痛 submitted on 2019-12-03 16:18:09
I have an app that generates a bunch of JPGs that I need to turn into a WebM video. I'm trying to get my RGB data from the JPEGs into the vpxenc sample. I can see the basic shapes from the original JPGs in the output video, but everything is tinted green (even pixels that should be black are about halfway green) and every other scanline has some garbage in it. I'm trying to feed it VPX_IMG_FMT_YV12 data, which I'm assuming is structured like so: for each frame, 8-bit Y data, then 8-bit averages of each 2x2 V block, then 8-bit averages of each 2x2 U block. Here is a source image and a screenshot of the video.
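Not from the thread, just an illustration of the plane layout the poster describes, written in JavaScript for brevity (the coefficients are the usual full-range BT.601 ones and may differ from what vpxenc expects). The two points worth checking are that the chroma planes are only (width/2) x (height/2), and that in YV12 the V plane comes before the U plane; getting either wrong is a classic way to end up with a green tint or per-scanline garbage:

    // Pack an RGB frame (width and height assumed even) into a YV12 buffer:
    // full-size Y plane, then quarter-size V plane, then quarter-size U plane.
    function rgbToYV12(rgb, width, height) {
      const ySize = width * height;
      const cSize = (width / 2) * (height / 2);
      const out = new Uint8Array(ySize + 2 * cSize);
      const yPlane = out.subarray(0, ySize);
      const vPlane = out.subarray(ySize, ySize + cSize);          // V first in YV12
      const uPlane = out.subarray(ySize + cSize, ySize + 2 * cSize);

      for (let y = 0; y < height; y++) {
        for (let x = 0; x < width; x++) {
          const i = (y * width + x) * 3;
          const r = rgb[i], g = rgb[i + 1], b = rgb[i + 2];
          yPlane[y * width + x] = (0.299 * r + 0.587 * g + 0.114 * b) | 0;
        }
      }

      // One chroma sample per 2x2 block, taken from the block's top-left pixel
      // (averaging all four pixels, as the poster intends, would be slightly better).
      for (let y = 0; y < height; y += 2) {
        for (let x = 0; x < width; x += 2) {
          const i = (y * width + x) * 3;
          const r = rgb[i], g = rgb[i + 1], b = rgb[i + 2];
          const c = (y / 2) * (width / 2) + (x / 2);
          uPlane[c] = (-0.169 * r - 0.331 * g + 0.5 * b + 128) | 0;
          vPlane[c] = (0.5 * r - 0.419 * g - 0.081 * b + 128) | 0;
        }
      }
      return out;
    }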

Live-Streaming webcam webm stream (using getUserMedia) by recording chunks with MediaRecorder over WEB API with WebSockets and MediaSource

我们两清 submitted on 2019-12-03 13:05:36
I'm trying to broadcast a webcam's video to other clients in real time, but I encounter some problems when viewers start watching in the middle. For this purpose, I get the webcam's stream using getUserMedia (and all its siblings). Then, on a button click, I start recording the stream and send each segment/chunk/whatever you call it to the broadcaster's websocket backend:

    var mediaRecorder = new MediaRecorder(stream);
    mediaRecorder.start(1000);
    mediaRecorder.ondataavailable = function (event) {
        uploadVideoSegment(event); // wrap with a blob and call socket.send(...)
    };

On the server side (Web
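The excerpt ends before the server and viewer sides, so here is only a sketch of the receiving end (the WebSocket URL and the codec string are assumptions and must match what MediaRecorder actually produced). One commonly reported cause of the "joining in the middle" problem is that only MediaRecorder's first blob carries the WebM initialization data; a late viewer must still receive that header (for example, cached and re-sent by the server) or appendBuffer will fail:

    // Viewer side: feed chunks arriving over a WebSocket into a MediaSource.
    var ws = new WebSocket('wss://example.com/watch');   // placeholder URL
    ws.binaryType = 'arraybuffer';

    var video = document.querySelector('video');
    var mediaSource = new MediaSource();
    video.src = URL.createObjectURL(mediaSource);

    var queue = [];
    var sourceBuffer = null;

    mediaSource.addEventListener('sourceopen', function () {
      sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8,opus"');
      sourceBuffer.addEventListener('updateend', flush);
      flush();
    });

    ws.onmessage = function (event) {
      queue.push(event.data);   // raw WebM bytes relayed by the server
      flush();
    };

    function flush() {
      if (sourceBuffer && !sourceBuffer.updating && queue.length) {
        sourceBuffer.appendBuffer(queue.shift());
      }
    }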