VP8

Combining an implementation of Autobahn WebSockets, GStreamer, and the HTML5 MediaSource API

无人久伴 submitted on 2020-01-02 16:24:09
Question: I am running a websocket server using autobahn|python. On the server side I also have a GStreamer pipeline running, which I use to capture WebM frames with an "appsink". The pipeline is: gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480 ! videoconvert ! vp8enc ! webmmux ! appsink name="sink" Every time I receive a buffer in the appsink, I send it over the websocket as a binary message using sendMessage. def on_new_buffer(appsink): global once gstsample
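One detail worth noting when relaying webmmux output over WebSockets to the MediaSource API: the first buffers the muxer emits contain the EBML/Segment headers, and a client that connects later still needs those headers before any cluster data. Below is a stdlib-only sketch of that idea (the class and method names are illustrative, not from the original post): the server remembers the initialization chunk, identified by the WebM EBML magic bytes, and replays it to each new client before forwarding live buffers.

```python
EBML_MAGIC = b"\x1a\x45\xdf\xa3"  # ID of the EBML header element that opens every WebM stream


def is_init_chunk(chunk: bytes) -> bool:
    """Return True if this appsink buffer begins a new WebM stream (EBML header)."""
    return chunk.startswith(EBML_MAGIC)


class ChunkRelay:
    """Caches the initialization chunk so late-joining clients receive the
    WebM headers before any cluster data (illustrative sketch; the list
    append stands in for sendMessage(chunk, isBinary=True))."""

    def __init__(self):
        self.init_chunk = None
        self.clients = []

    def on_new_buffer(self, chunk: bytes):
        if is_init_chunk(chunk):
            self.init_chunk = chunk        # remember the headers for future clients
        for outbox in self.clients:
            outbox.append(chunk)           # forward the live buffer

    def add_client(self):
        outbox = []
        if self.init_chunk is not None:
            outbox.append(self.init_chunk)  # replay the headers first
        self.clients.append(outbox)
        return outbox
```

Without this replay step, a browser that opens the websocket mid-stream hands MediaSource raw clusters with no headers and appendBuffer fails.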

build vp8 on android

旧街凉风 submitted on 2019-12-30 01:35:28
Question: I'm trying to build the VP8 codec for Android. I ran the configure.sh script and the makefile for ARMv6 with Sourcery G++, which successfully produced libvpx.so. After that I wrote a JNI wrapper and compiled it successfully with ndk-build. When I run this on a Gingerbread smartphone I get an UnsatisfiedLinkError: "libpthread.so.0 not found". How can I get rid of this error? Answer 1: From http://git.chromium.org/gitweb/?p=webm/bindings.git;a=blob_plain;f=JNI/README.Android with some adjustments for
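The libpthread.so.0 error usually means the library was linked with a desktop glibc toolchain (such as Sourcery G++) rather than against Android's Bionic libc, where the pthread symbols live inside libc itself and no separate libpthread exists. A hedged sketch of a build configuration that avoids this, using libvpx's own Android target (the NDK path is a placeholder):

```
# Configure libvpx with its Android target so it links against Bionic;
# pthread symbols are then resolved from libc, with no libpthread.so.0 dependency.
cd libvpx
./configure --target=armv7-android-gcc \
            --sdk-path=/path/to/android-ndk \
            --disable-examples --disable-docs
make
```

If you keep a custom toolchain instead, the equivalent fix is to drop any -lpthread flag from the link step, since Bionic provides those symbols by default.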

Webm (VP8 / Opus) file read and write back

十年热恋 submitted on 2019-12-25 16:54:37
Question: I am trying to develop a WebRTC simulator in C/C++. For media handling I plan to use libav. I am thinking of the steps below to realize media exchange between two WebRTC simulators. Say I have two simulators, A and B. Read media at A from an input WebM file using the av_read_frame API. I assume I will get the encoded media (audio/video) data. Am I correct here? Send the encoded media data to simulator B over a UDP socket. Simulator B receives the media data on the UDP socket as RTP packets.
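On the first step: yes, av_read_frame returns demuxed but still-encoded packets (AVPacket), so the assumption in the question is correct. For the UDP/RTP step, each encoded frame needs at minimum the fixed 12-byte RTP header from RFC 3550 in front of it. A stdlib-only sketch of that framing (the payload type and SSRC values are arbitrary placeholders, and real RTP also requires payload-format-specific packetization for VP8):

```python
import struct


def rtp_packetize(payload: bytes, seq: int, timestamp: int,
                  payload_type: int = 96, ssrc: int = 0x12345678,
                  marker: bool = False) -> bytes:
    """Prepend a minimal 12-byte RTP header (RFC 3550) to an encoded frame:
    version 2, no padding, no extension, no CSRCs."""
    first = 0x80                                     # V=2 in the top two bits
    second = (0x80 if marker else 0) | (payload_type & 0x7F)
    header = struct.pack("!BBHII", first, second,
                         seq & 0xFFFF, timestamp & 0xFFFFFFFF, ssrc)
    return header + payload


def rtp_depacketize(packet: bytes):
    """Split an RTP packet back into (seq, timestamp, payload)."""
    _, _, seq, timestamp, _ = struct.unpack("!BBHII", packet[:12])
    return seq, timestamp, packet[12:]
```

Simulator B would run rtp_depacketize on each received datagram, reorder by sequence number, and feed the payloads to the decoder.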

How to encode series of images into VP8 using WebM VP8 Encoder API? (C/C++)

早过忘川 submitted on 2019-12-23 12:20:09
Question: How do I transcode RGB images into VP8 frames (a keyframe plus some dependent frames)? I have created some images; how do I turn them into VP8 now? Answer 1: First, you need a codec library for VP8: http://www.webmproject.org/code/build-prerequisites/ Using the libvpx API you can then encode your RGB frames into VP8 frames. Answer 2: The easiest way to go is to use ffmpeg. The latest release of ffmpeg (0.6) now supports the VP8 codec, and building it is now easy. Then, ffmpeg makes it simple to gather individual
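Answer 2's ffmpeg route can be sketched as a single command; the VP8 encoder is exposed as libvpx, and the option values here (frame rate, bitrate, input pattern) are illustrative:

```
# Encode a numbered PNG sequence into a VP8/WebM file with ffmpeg.
# -g sets the keyframe interval, giving one keyframe followed by
# dependent inter frames, as the question asks for.
ffmpeg -framerate 25 -i frame%04d.png -c:v libvpx -b:v 1M -g 25 output.webm
```

ffmpeg converts the RGB input to the YUV planar format libvpx expects automatically; with the raw libvpx API you would have to do that color conversion yourself before calling the encoder.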

webm / vp8 player for java

巧了我就是萌 submitted on 2019-12-21 09:26:57
Question: Does anyone know of a Java library that plays VP8 or WebM videos? Thanks! Answer 1: VLC can play WebM and VP8 videos since version 1.1.0, and there are Java bindings available for it. Have a look at: jVLC: http://wiki.videolan.org/Java_bindings VLCJ: http://code.google.com/p/vlcj/ I've used jVLC and it works, but it is not actively maintained anymore. VLCJ looks very good. Answer 2: http://sourceforge.net/projects/javavp8decoder/ It's beta, but maybe it's a start. Answer 3: I don't know of any native

VP8 Encoding Nexus 5 returns empty/0-Frames

依然范特西╮ submitted on 2019-12-18 07:18:12
Question: I'm trying to encode my camera feed to VP8. The problem is: when I get the frame from the output buffer, the byte array is always a different size, but all entries are 0. Here's the code where I grab the frame and print it: while (true) { try { encoderIndex = mEncoder.dequeueOutputBuffer(encoderOutputInfo, timeOut); } catch (Exception e) { e.printStackTrace(); } switch (encoderIndex) { case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED: // something break; case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED: //

VP8 Encoding results in grayscale image on Google Glass

陌路散爱 submitted on 2019-12-13 05:04:09
Question: The application I am working on is developed for Google Glass but runs on Android tablets as well. It uses VP8 encoding to transfer camera images to a remote application. The preview format parameter on the camera is set to ImageFormat.YV12, and the VP8 encoder is initialized with the VPX_IMG_FMT_YV12 parameter. When the application .apk file is installed and run from the Glass, the image is displayed in grayscale on the remote application. When the same .apk file is installed on a tablet or a phone,
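One plausible explanation for grayscale output on only some devices is a stride mismatch: Android's ImageFormat.YV12 requires row strides rounded up to 16 bytes, so on a device whose preview width is not already 16-aligned the chroma planes start at different offsets than a tightly-packed-planes encoder assumes, and the encoder reads flat garbage chroma (hence a gray image) while luma still lines up. A stdlib sketch of the plane layout Android documents for YV12 (Y plane first, then V, then U):

```python
def yv12_layout(width: int, height: int):
    """Plane strides, sizes, and offsets for Android ImageFormat.YV12,
    where every row stride is rounded up to a multiple of 16 bytes."""
    def align16(x: int) -> int:
        return (x + 15) // 16 * 16

    y_stride = align16(width)
    c_stride = align16(y_stride // 2)
    y_size = y_stride * height
    c_size = c_stride * (height // 2)
    return {
        "y_stride": y_stride, "c_stride": c_stride,
        "total": y_size + 2 * c_size,
        # Offsets an encoder must use to find the real chroma data:
        "v_offset": y_size,
        "u_offset": y_size + c_size,
    }
```

For a 640x480 preview the strides happen to equal the packed values, so the bug stays hidden; with an unaligned width the computed v_offset and u_offset diverge from the packed assumption, which would match the symptom of one device working and another going gray.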

Best HTML5 video format for Safari on Windows (or getting VP8 to play in Safari on Windows)

懵懂的女人 submitted on 2019-12-11 12:17:00
Question: Here's the deal: through a huge series of events, I am stuck using Safari on Windows for HTML5 video playback. I can't use any other browser; Chrome is out of the question. It must be Safari, and it has to be on Windows for hardware compatibility. The best format I've found is an H.264 QuickTime file, but I'm still getting some dropped frames and a bit of tearing. The video is played at 1920x1080 resolution, and I have tried down-sampling to 720p, which causes noticeable quality loss

Concatenate parts of two or more webm video blobs

断了今生、忘了曾经 submitted on 2019-12-11 09:24:49
Question: Is it possible to concatenate parts of two or more video blobs encoded in WebM format using just client-side JavaScript? Answer 1: What you are looking for is the Media Source Extensions API, as defined here: http://www.w3.org/TR/media-source/ Firefox (partially), Opera, and Chrome already support this; IE11 only on Windows 8+. Also, take a look at DASH (https://en.wikipedia.org/wiki/Dynamic_Adaptive_Streaming_over_HTTP) and HLS (https://en.wikipedia.org/wiki/HTTP_Live_Streaming)
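The reason naive Blob concatenation fails is structural: every independently recorded WebM blob starts with its own EBML header and Segment, while a player (or a MediaSource SourceBuffer) expects one initialization segment followed by media clusters. A stdlib-only byte-level illustration of that structure, which locates each embedded stream by its EBML magic (illustration only; actually joining the parts requires remuxing, not just splitting):

```python
EBML_HEADER_ID = b"\x1a\x45\xdf\xa3"  # magic that opens every WebM stream


def split_on_headers(data: bytes):
    """Split a byte stream at every EBML header, yielding one chunk per
    embedded WebM stream. Concatenated recorder blobs produce several
    chunks here, which is exactly why players reject the joined bytes."""
    positions = []
    i = data.find(EBML_HEADER_ID)
    while i != -1:
        positions.append(i)
        i = data.find(EBML_HEADER_ID, i + 1)
    chunks = []
    for n, start in enumerate(positions):
        end = positions[n + 1] if n + 1 < len(positions) else len(data)
        chunks.append(data[start:end])
    return chunks
```

In the browser the practical route is the one the answer points at: append each part to a MediaSource SourceBuffer, which handles the per-segment headers, instead of gluing the raw blob bytes together.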