ffmpeg

From multiple video files to single output

两盒软妹~` · Submitted 2021-02-18 18:54:00
Question: Let's say that I have a list of hundreds of possible video files. Using ffmpeg it's pretty easy to take multiple files and stitch them together into a single video output, but that's where things become tricky. I'm looking for a way to stream them live and dynamically add videos to the queue as the stream goes on. Think of something like SSAI, but for the whole video. Streaming live is there so we don't have a delay while waiting for ffmpeg to finish the whole video, but rather start as soon as
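One common sketch for this kind of playlist-to-live-stream setup uses ffmpeg's concat demuxer. The filenames, list file, and RTMP endpoint below are placeholders, not from the question; note also that the concat demuxer reads the whole list when it starts, so truly dynamic queueing usually needs a workaround such as feeding segments through a pipe or restarting per segment.

```shell
# Placeholder playlist; clip names and the RTMP URL are assumptions.
cat > list.txt <<'EOF'
file 'clip1.mp4'
file 'clip2.mp4'
EOF

# -re reads input at its native frame rate, so the output behaves
# like a live stream instead of encoding as fast as possible.
ffmpeg -re -f concat -safe 0 -i list.txt \
       -c:v libx264 -c:a aac \
       -f flv rtmp://example.com/live/stream
```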

Reverse video playback through ffmpeg

ぃ、小莉子 · Submitted 2021-02-18 18:02:14
Question: I am implementing a video player using the ffmpeg multimedia framework. I am able to play, pause, increase and decrease speed, and seek forward and backward. But reverse playback of the video is not smooth; it stutters a lot. Please help me understand reverse video playback. What is the better approach for this? Is there any other multimedia framework which supports reverse video playback? Many thanks in advance. Answer 1: So, first, some framing of this issue. FFmpeg
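For a one-shot file conversion (as opposed to interactive playback inside a player), ffmpeg's `reverse` and `areverse` filters can produce a reversed clip. This is not how a smooth player would do it, but it illustrates the filters; the filenames are placeholders.

```shell
# Reverse both video and audio of a short clip.
# Both filters buffer the ENTIRE stream in memory, so this only
# suits short inputs; a player typically instead seeks backwards
# one GOP at a time and decodes each group of pictures forward.
ffmpeg -i input.mp4 -vf reverse -af areverse reversed.mp4
```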

ffplay on Android

对着背影说爱祢 · Submitted 2021-02-18 17:52:29
Question: I've gone through this: http://writingminds.github.io/ffmpeg-android-java/ I've got FFmpeg on Android, in that my app is able to load the FFmpeg binary. However, ffplay commands do not work. Is it possible to port ffplay into my app? Answer 1: Take a look at this lib: http://androidwarzone.blogspot.com.br/2011/12/ffmpeg4android.html FFmpeg4Android is a way your application can run FFmpeg commands from Java only, with no need for C code or the NDK. You can use any player that supports streaming, on the

ffmpeg, dash manifest cannot be created due to unspecified pixel format

社会主义新天地 · Submitted 2021-02-18 17:37:13
Question: I am using ffmpeg 2.8 on OSX. I am trying to convert a short mp4 video to webm for adaptive streaming, as suggested here: http://wiki.webmproject.org/adaptive-streaming/instructions-to-playback-adaptive-webm-using-dash like this: VP9_DASH_PARAMS="-tile-columns 6 -frame-parallel 1" ffmpeg -i t2.mp4 -c:v libvpx-vp9 -s 160x90 -b:v 250k -keyint_min 150 -g 150 ${VP9_DASH_PARAMS} -an -f webm -dash 1 video_160x90_250k.webm ffmpeg -i t2.mp4 -c:a libvorbis -b:a 128k -vn -f webm -dash 1 audio_128k.webm
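A likely fix for an "unspecified pixel format" complaint, assuming the source stream does not declare one, is to set the pixel format explicitly on the video encode. This is a hedged variant of the command from the question, with `-pix_fmt yuv420p` added:

```shell
VP9_DASH_PARAMS="-tile-columns 6 -frame-parallel 1"

# Forcing yuv420p gives the DASH muxer a concrete pixel format
# to write into the manifest.
ffmpeg -i t2.mp4 -c:v libvpx-vp9 -pix_fmt yuv420p \
       -s 160x90 -b:v 250k -keyint_min 150 -g 150 ${VP9_DASH_PARAMS} \
       -an -f webm -dash 1 video_160x90_250k.webm
```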

combine two audio files with a command line tool

这一生的挚爱 · Submitted 2021-02-18 17:36:21
Question: I have to merge two (or more) audio files (like a guitar and a drum track) into a single file. I'm running Linux CentOS, and I need a command line tool to do it, because it has to run as part of a background process, triggered via crontab or a custom bash script. I also need to be able to change the pan, volume, trim and start time (i.e. I want the guitar track to start 1.25ms after the drum track so that they are both in sync with each other). My first choice would be
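One way to sketch this with ffmpeg alone is the `adelay`, `volume`, and `amix` filters. The filenames, delay, and gain below are illustrative assumptions, not values from the question:

```shell
# Delay the guitar (adelay takes milliseconds, one value per
# channel), lower its volume, then mix it with the drum track.
# The pan filter could be chained in the same way if needed.
ffmpeg -i drums.wav -i guitar.wav -filter_complex \
  "[1:a]adelay=1250|1250,volume=0.8[g];[0:a][g]amix=inputs=2:duration=longest[out]" \
  -map "[out]" mix.wav
```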

Using FFMPEG to add a single frame to end of MP4

被刻印的时光 ゝ · Submitted 2021-02-18 16:35:21
Question: I have written some image acquisition software, and as I acquire these images I want to append the last image to the end of a video file (a time-lapse of all images acquired so far). This will give a video view of all the images. However, I am struggling to add the single frame. I have generated the time-lapse fine: I wait until I have gathered 10 images, then I generate the time-lapse. The command I have used to generate the time-lapse I will be
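MP4 files generally cannot be appended to in place, so one common sketch is to encode the new still as a one-frame clip and then re-mux with the concat demuxer. All filenames, the frame rate, and the pixel format below are placeholder assumptions:

```shell
# Turn the newest still into a single-frame clip
# (-t 0.1 at 10 fps yields exactly one frame).
ffmpeg -loop 1 -framerate 10 -t 0.1 -i newest.png \
       -c:v libx264 -pix_fmt yuv420p frame.mp4

# Append it to the existing time-lapse; -c copy avoids
# re-encoding, but only works if codec, resolution, and
# timebase match between the two inputs.
printf "file 'timelapse.mp4'\nfile 'frame.mp4'\n" > parts.txt
ffmpeg -f concat -safe 0 -i parts.txt -c copy extended.mp4
```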
