libvpx

Compiling and installing Nginx+MySQL+PHP on CentOS 7

二次信任 submitted on 2020-08-10 22:45:20
I. Configure the firewall and open ports 80 and 3306. CentOS 7.0 uses firewalld as its firewall by default; here we switch to iptables. 1. Stop firewalld: systemctl stop firewalld.service #stop firewalld systemctl disable firewalld.service #disable firewalld at boot 2. Install the iptables firewall: yum install iptables-services #install vi /etc/sysconfig/iptables #edit the firewall configuration file # Firewall configuration written by system-config-firewall # Manual customization of this file is not recommended. *filter :INPUT ACCEPT [0:0] :FORWARD ACCEPT [0:0] :OUTPUT ACCEPT [0:0] -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT -A INPUT -p icmp -j ACCEPT -A INPUT -i lo -j ACCEPT -A INPUT -m state --state NEW -m tcp
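The excerpt above breaks off mid-rule. As a hedged sketch (these lines are assumptions about what such a setup typically adds for ports 80 and 3306, not the article's verbatim file), the remaining rules and the commands to apply them look roughly like this:

# Typical /etc/sysconfig/iptables rules for HTTP (80) and MySQL (3306); illustrative only.
-A INPUT -m state --state NEW -m tcp -p tcp --dport 80 -j ACCEPT
-A INPUT -m state --state NEW -m tcp -p tcp --dport 3306 -j ACCEPT
# Restart and enable the iptables service so the rules take effect and persist across reboots.
systemctl restart iptables.service
systemctl enable iptables.service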

Cygwin 编译 ffmpeg

流过昼夜 submitted on 2020-04-11 19:49:19
1. Download the Linux source archive from the official site: https://ffmpeg.zeranoe.com/builds/source/ffmpeg/ffmpeg-3.2.4.tar.xz 2. Enter Cygwin. Assuming the archive was placed under Cygwin's bin\ffmpeg-3.2.4.tar directory, run the following commands to unpack it: cd /bin cd ffmpeg-3.2.4.tar/ xz -d ffmpeg-3.2.4.tar.xz tar -xvf ffmpeg-3.2.4.tar cd ffmpeg-3.2.4/ 3. Build in shared mode: ./configure --disable-static --enable-shared --enable-gpl --enable-version3 --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-nvenc --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca -
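The configure command above is cut off after --enable-libcaca. As a hedged sketch (the optional --enable-* library flags are trimmed here, so this is not the author's exact command), a minimal shared-library build and install under Cygwin looks roughly like:

# From the unpacked source directory created by the steps above.
cd /bin/ffmpeg-3.2.4.tar/ffmpeg-3.2.4
# Minimal shared build; add the optional --enable-lib* flags only if those libraries are installed.
./configure --disable-static --enable-shared --enable-gpl --enable-version3
make
make install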

ffmpeg codec conversion; can't configure encoder

吃可爱长大的小学妹 submitted on 2020-01-25 08:16:46
Question I am simply trying to convert a vp9 webm I have into a vp8 webm. This is the command I'm using: ffmpeg -i in.webm -c:v vp8 out.webm The vp8 encoder returns a strange error: Input #0, matroska,webm, from 'in.webm': Metadata: encoder : google Duration: 00:02:34.60, start: 0.000000, bitrate: 404 kb/s Stream #0:0(eng): Video: vp9 (Profile 0), yuv420p(tv, bt709/unknown/unknown), 640x360, SAR 1:1 DAR 16:9, 30 fps, 30 tbr, 1k tbn, 1k tbc (default) Stream mapping: Stream #0:0 -> #0:0 (vp9 (native) ->
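The excerpt is truncated before the actual error text, so the following is only a hedged sketch of how this conversion is commonly written, not the accepted fix for this particular question: ffmpeg's VP8 encoder is provided by libvpx, so it can be selected explicitly and given a target bitrate (the bitrate value and output name are illustrative assumptions).

# Select the libvpx VP8 encoder explicitly; 1M is an illustrative target bitrate.
ffmpeg -i in.webm -c:v libvpx -b:v 1M out_vp8.webm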

VP8 Encoding results in grayscale image on Google Glass

陌路散爱 submitted on 2019-12-13 05:04:09
Question The application I am working on is developed for Google Glass but runs on Android tablets as well. It uses VP8 encoding to transfer camera images to a remote application. The preview format parameter on the camera is set to ImageFormat.YV12. The VP8 encoder is initialized with the VPX_IMG_FMT_YV12 parameter. When the application .apk file is installed and run from the Glass, the image is displayed in grayscale on the remote application. When the same .apk file is installed on a tablet or a phone,

need to create a webm video from RGB frames

心不动则不痛 submitted on 2019-12-03 16:18:09
I have an app that generates a bunch of jpgs that I need to turn into a webm video. I'm trying to get my rgb data from the jpegs into the vpxenc sample. I can see the basic shapes from the original jpgs in the output video, but everything is tinted green (even pixels that should be black are about halfway green) and every other scanline has some garbage in it. I'm trying to feed it VPX_IMG_FMT_YV12 data, which I'm assuming is structured like so: for each frame 8-bit Y data 8-bit averages of each 2x2 V block 8-bit averages of each 2x2 U block Here is a source image and a screenshot of the video
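If the goal is just to get a webm out of the JPEG frames, one way to sidestep hand-rolling the RGB-to-YV12 conversion (and the plane-order mistakes that typically produce a green tint) is to let ffmpeg handle both the pixel-format conversion and the VP8 encode. This is a sketch of an alternative approach rather than a fix for the vpxenc sample itself; the frame filename pattern, frame rate, and bitrate are assumptions.

# Assumes frames are named frame0001.jpg, frame0002.jpg, ... at 30 fps; adjust to the real layout.
ffmpeg -framerate 30 -i frame%04d.jpg -c:v libvpx -pix_fmt yuv420p -b:v 1M out.webm

If the raw vpxenc path is kept instead, note that YV12 stores the V plane before the U plane, while I420 stores U before V; swapping the two chroma planes is a classic cause of wrongly tinted output.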

Building an FLV/MP4 streaming server over RTMP or HTTP with nginx

不问归期 submitted on 2019-11-29 07:24:56
I. Streaming playback methods. 1. HTTP: the FLV video file is downloaded to the local machine for playback; once the download finishes it no longer consumes server resources or bandwidth, but seeking is weaker than with RTMP streaming. Many video sites are implemented this way, e.g. YouTube, Tudou, Ku6. 2. RTMP streaming: the FLV file is not downloaded locally; it can be played in real time and the progress bar can be dragged freely, but it consumes more server resources. II. Using nginx to build an FLV streaming server. 1. Install git: yum install git 2. Install dependency packages: yum -y install gcc glibc glibc-devel make nasm pkgconfig lib-devel openssl-devel expat-devel gettext-devel libtool mhash.x86_64 perl-Digest-SHA1.x86_64 3. Install ffmpeg and its dependency packages (I install the dependencies under /usr/local/src, but the location is a matter of personal preference). Install whatever you can via yum; fetch the rest with wget. #wget http://www.tortall.net/projects/yasm/releases/yasm-1.2.0.tar.gz #tar xzvf yasm-1.2.0
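The excerpt stops in the middle of unpacking yasm. As a hedged sketch of how a setup like this typically continues (the module choice, repository URL, and paths below are assumptions based on the title, not the article's exact steps), yasm is built from source and nginx is then compiled with RTMP/FLV support:

# Finish building yasm from source (standard autotools steps).
tar xzvf yasm-1.2.0.tar.gz
cd yasm-1.2.0
./configure && make && make install

# Fetch an RTMP module for nginx (nginx-rtmp-module is the commonly used one; the path is assumed).
git clone https://github.com/arut/nginx-rtmp-module.git /usr/local/src/nginx-rtmp-module

# From inside the nginx source directory (obtained separately), compile nginx with RTMP, FLV and MP4 support.
./configure --add-module=/usr/local/src/nginx-rtmp-module --with-http_flv_module --with-http_mp4_module
make && make install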