x264

Porting FFmpeg + x264 to ARM Linux

别等时光非礼了梦想. Submitted on 2020-01-14 17:57:35
Background: FFmpeg ships with an H.264 decoder but no encoder, so x264 needs to be added. libx264 is a free H.264 encoding library and part of the x264 project; it is widely used, and FFmpeg's H.264 encoding is implemented with libx264. FFmpeg is an open-source program suite that can record and convert digital audio and video and turn it into streams; it is licensed under the LGPL or GPL and provides a complete solution for recording, transcoding, and streaming audio and video. librtmp is used to receive and publish data in the RTMP protocol format. FFmpeg supports the RTMP protocol natively, and compiling librtmp in adds support for the RTMPE, RTMPTE, and RTMPS variants; here I use FFmpeg's built-in RTMP support directly.

Host platform: Ubuntu 18.04
ARM platform: S5P6818
x264: 20171212
ffmpeg: 3.4.1
arm-gcc: 4.8.1

##
# Copyright By Schips, All Rights Reserved
# https://gitee.com/schips/
# File Name: make.sh
# Created : Mon 02 Sep 2019 08:05:53 PM HKT
##
#!/bin/sh
BASE=`pwd`
BUILD_HOST=arm-linux
OUTPUT_PATH=${BASE}/install
X264_DIR=$
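The excerpt cuts off at `X264_DIR=$`, and the truncated variable is left as-is above. As a rough sketch only, an x264 cross-compile step in a make.sh like this one typically continues along the following lines; the source directory name and the exact configure flags here are my illustrative assumptions, not the author's script:

```shell
#!/bin/sh
# Assumed layout: the x264 snapshot is unpacked next to this script.
BASE=`pwd`
BUILD_HOST=arm-linux
OUTPUT_PATH=${BASE}/install

cd x264-snapshot-20171212-2245
./configure \
    --prefix=${OUTPUT_PATH} \
    --host=${BUILD_HOST} \
    --cross-prefix=${BUILD_HOST}- \
    --enable-static \
    --enable-shared \
    --disable-asm     # skip NEON assembly unless the toolchain supports it
make && make install
```

FFmpeg is then configured with `--enable-libx264 --enable-gpl` and pointed at `${OUTPUT_PATH}` via its `--extra-cflags`/`--extra-ldflags` options.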

Building 64-bit FFmpeg on Windows with libx264 and libmp3lame

时光毁灭记忆、已成空白 Submitted on 2020-01-10 00:59:14
A good memory is no match for notes on paper: every build takes some time to fight through, so I am writing down the build process for those who come after. This article describes how to bring libx264 and libmp3lame (for MP3 encoding) into a 64-bit FFmpeg build (version 4.0.2) on Windows. The build environment is MinGW64; for how to install MinGW64, see my earlier article: https://www.cnblogs.com/wanggang123/p/9896564.html

1. Build the x264 library; if you need FFmpeg to support H.264 encoding, it must be added at build time. Building x264 is relatively easy and works on the first try. First download x264 (the latest version is fine) from https://www.videolan.org/developers/x264.html and then run configure with the parameters shown below.

Figure 1. x264 configure example

Configuring x264 requires nasm; you can download the executable and put it in the bin directory of the MinGW64 installation, as shown in Figure 2. http://www.linuxfromscratch.org/blfs/view/8.2/general/nasm.html

Figure 2. nasm.exe install path

Unfinished; I will continue writing tomorrow.

Source: https://www.cnblogs.com/wanggang123/p/12174126.html
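The configure screenshots (Figures 1 and 2) did not survive the scrape. As a hedged sketch, a typical MinGW64 build uses flags along these lines; they are illustrative assumptions, not the post's exact parameters:

```shell
# Typical x264 configure under MSYS2/MinGW64 (illustrative; the post's
# exact flags were in a screenshot that is no longer available).
./configure --prefix=/mingw64 --enable-static --enable-shared

# FFmpeg can later pick both libraries up with:
./configure --prefix=/mingw64 \
    --enable-gpl \
    --enable-libx264 \
    --enable-libmp3lame
```

`--enable-gpl` is required because libx264 is GPL-licensed; without it FFmpeg's configure refuses `--enable-libx264`.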

Auto launch the video player in Android from the browser like an iPhone does

北城以北 Submitted on 2020-01-01 14:21:15
Question: I have just created an iPhone web app which has some x264 (MP4) video files on it. When I link directly to a file on the iPhone and the user taps the link, the video player is loaded and the video starts playing. Using the app on an Android phone causes the browser to download the video instead of just playing it. Is there a way to force a video player to just boot up and play the video, not download it? Thanks in advance. Answer 1: You should know that Android is quite strict regarding the

H264: decode series of nal units with ffmpeg

我的未来我决定 Submitted on 2019-12-31 17:23:50
Question: I tried to decode a series of NAL units with ffmpeg (libavcodec) but I get a "no frame" error. I produced the NAL units with the guideline at How does one encode a series of images into H264 using the x264 C API?. I tried the following strategy for decoding:

avcodec_init();
avcodec_register_all();
AVCodec* pCodec;
pCodec = avcodec_find_decoder(CODEC_ID_H264);
AVCodecContext* pCodecContext;
pCodecContext = avcodec_alloc_context();
avcodec_open(pCodecContext, pCodec);
AVFrame *pFrame;
pFrame

x264 IDR access unit with a SPS and a PPS

ぐ巨炮叔叔 Submitted on 2019-12-24 16:26:24
Question: I am trying to encode video in H.264 so that, when split with Apple's HTTP Live Streaming tools media file segmenter, it will pass the media file validator. I am getting two errors on the split MPEG-TS file:

WARNING: Media segment contains a video track but does not contain any IDR access unit with a SPS and a PPS.
WARNING: 7 samples (17.073 %) do not have timestamps in track 257 (avc1).

After hours of research I think the "IDR" warning relates to not having keyframes in the right place on the segmented
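These warnings usually go away once every segment starts with an IDR access unit preceded by SPS/PPS. A hedged sketch of how this is commonly arranged with the ffmpeg CLI; the GOP values below are my illustrative assumptions for 2-second segments at 30 fps, not taken from the question:

```shell
# Force a keyframe every 60 frames (2 s at 30 fps) and disable
# scene-cut keyframes, so each 2-second HLS segment can begin on an
# IDR access unit; libx264 repeats SPS/PPS before IDR frames in its
# Annex B (MPEG-TS) output.
ffmpeg -i input.mov \
    -c:v libx264 -g 60 -keyint_min 60 -sc_threshold 0 \
    -f mpegts output.ts
```

The segmenter can then cut cleanly at every GOP boundary.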

openCV VideoCapture doesn't work with gstreamer x264

…衆ロ難τιáo~ Submitted on 2019-12-23 18:08:49
Question: I'd like to display an RTP/VP8 video stream that comes from GStreamer in OpenCV. I already have a working solution, implemented like this:

gst-launch-0.10 udpsrc port=6666 ! "application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)VP8-DRAFT-IETF-01,payload=(int)120" ! rtpvp8depay ! vp8dec ! ffmpegcolorspace ! ffenc_mpeg4 ! filesink location=videoStream

Basically it grabs incoming data from a UDP socket, depacketizes the RTP, decodes the VP8, passes it to ffmpegcolorspace

How to use gstreamer to save webcam video to file?

六眼飞鱼酱① Submitted on 2019-12-20 04:53:58
Question: I've been trying to get Emgu to save webcam video to a file. The problem is OpenCV only supports AVI, and AVI does not seem to suit a format like x264 very well. Could I use GStreamer to do this for me in C? It would be good if I could choose the file format and container type too, and if I could use a format like Schrödinger Dirac. I'm new to GStreamer, so I'm not quite sure if I'm on the right track here. EDIT: I've managed to capture the webcam video using gst-launch-0.10
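A hedged sketch of a GStreamer 0.10-era pipeline that records a webcam to an x264 stream in a Matroska container; the device path is an illustrative assumption:

```shell
# Capture from a V4L2 webcam, encode with x264, and mux into MKV.
# gst-launch-0.10 syntax to match the question's era; with GStreamer 1.x
# use gst-launch-1.0 and replace ffmpegcolorspace with videoconvert.
gst-launch-0.10 v4l2src device=/dev/video0 \
    ! ffmpegcolorspace \
    ! x264enc \
    ! matroskamux \
    ! filesink location=webcam.mkv
```

Swapping `matroskamux` for another muxer element changes the container, which addresses the question's wish to choose the file format independently of the codec.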

X264 Error message when capturing video

强颜欢笑 Submitted on 2019-12-20 03:57:12
Question: I'm writing a program to save some webcam video to a file. I'm using the x264 codec found here: x264. When I try writing frames to a file, I get this error message popping up:

x264vfw [warning]: Few frames probably would be lost. Ways to fix this:
x264vfw [warning]: -if you use VirtualDub or its fork than you can enable 'VirtualDub Hack' option
x264vfw [warning]: -you can enable 'File' output mode
x264vfw [warning]: -you can enable 'Zero Latency' option

I found this VirtualDub Hack but then I'm

H264 Encoders other than ffmpeg x264

寵の児 Submitted on 2019-12-18 21:56:45
Question: The iPhone app I am working on captures images in series at a user-defined time interval, and I am looking for a way to combine these images into H.264-encoded videos. I have done some research on Google; it looks like I will have to use something like ffmpeg/mencoder on the iPhone? (I also found that someone ported ffmpeg to the iPhone: ffmpeg4iPhone.) However, I found that x264 is under the GPL license, which requires me to open-source my project if I use ffmpeg. Also, some people suggested using Ogg

h264 RTP timestamp

孤街浪徒 Submitted on 2019-12-18 12:33:17
Question: I am confused about the timestamp of an h264 RTP packet. I know the clock rate for video is 90 kHz, which I defined in the SIP SDP. The frame rate of my encoder is not exactly 30 FPS; it is variable, varying from 15 FPS to 30 FPS on the fly, so I cannot use any fixed timestamp increment. Could anyone tell me the timestamp of the following encoded packets?

After 0 milliseconds, encoded RTP timestamp = 0 (let the starting timestamp be 0)
After 50 milliseconds, encoded RTP timestamp = ?
After 40 milliseconds
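Since the RTP clock runs at 90 kHz regardless of frame rate, the timestamp simply advances by 90000 × elapsed_ms / 1000 = 90 × elapsed_ms per packet, whatever the instantaneous frame rate is. A small sketch working through the question's example intervals:

```shell
# RTP timestamp increments for a 90 kHz video clock:
#   increment = 90000 * elapsed_ms / 1000 = 90 * elapsed_ms
rtp_ts=0
for elapsed_ms in 50 40; do
    rtp_ts=$(( rtp_ts + 90 * elapsed_ms ))
    echo "after ${elapsed_ms} ms frame gap: timestamp = ${rtp_ts}"
done
```

So the packet 50 ms after the first carries timestamp 4500, and the one 40 ms after that carries 4500 + 3600 = 8100; no fixed per-frame increment is needed.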