h.264

Mediacodec and camera, color space incorrect

て烟熏妆下的殇ゞ submitted on 2019-12-03 08:43:59
Question: By referring to Aegonis's work 1 and work 2, I also got the H.264 stream, but the color is not correct. I am using an HTC Butterfly for development. Here is part of my code: Camera: parameters.setPreviewSize(width, height); parameters.setPreviewFormat(ImageFormat.YV12); parameters.setPreviewFrameRate(frameRate); MediaCodec: mediaCodec = MediaCodec.createEncoderByType("video/avc"); MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 320, 240); mediaFormat.setInteger(MediaFormat.KEY
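A common cause of wrong colors in this setup is a plane-layout mismatch: the camera delivers YV12 (Y plane, then the whole V plane, then the whole U plane), while many device encoders only accept a semi-planar layout (NV12: Y plane followed by interleaved U/V). A minimal sketch of the plane shuffle, assuming that mismatch is the problem (class and method names are illustrative):

```java
public class Yv12ToNv12 {
    // Convert one YV12 frame (Y plane, then V plane, then U plane)
    // to NV12 (Y plane, then interleaved U,V,U,V...), the layout that
    // COLOR_FormatYUV420SemiPlanar encoders expect.
    public static byte[] convert(byte[] yv12, int width, int height) {
        int ySize = width * height;
        int cSize = ySize / 4;                      // each chroma plane is quarter size
        byte[] nv12 = new byte[ySize + 2 * cSize];
        System.arraycopy(yv12, 0, nv12, 0, ySize);  // Y plane is unchanged
        for (int i = 0; i < cSize; i++) {
            nv12[ySize + 2 * i]     = yv12[ySize + cSize + i]; // U (stored after V in YV12)
            nv12[ySize + 2 * i + 1] = yv12[ySize + i];         // V
        }
        return nv12;
    }
}
```

Whether the encoder wants NV12 or NV21 (U/V swapped) varies by chipset, so it is worth trying both orderings.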

Decoding H264 streaming using android low level api

烂漫一生 submitted on 2019-12-03 07:39:42
Question: I am using the MediaCodec low-level API on Android to decode a raw H264 stream received from an IP camera over a TCP/IP connection. To decode the stream, my code is: @Override protected void onCreate(Bundle savedInstanceState) { MediaCodec mCodecc; MediaFormat mFormat; BufferInfo mInfo; ByteBuffer[] inputBuffers; ByteBuffer[] outputBuffers; } protected void Init_Codec() { mCodecc = MediaCodec.createDecoderByType("video/avc"); mFormat = MediaFormat.createVideoFormat(
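One detail that trips up raw-TCP decoding: MediaCodec input buffers should contain whole NAL units (or access units), but a TCP stream arrives as arbitrary-sized chunks, so the receiver has to locate the Annex-B start codes itself. A sketch of the scanning step, assuming the camera sends an Annex-B stream (helper names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class NalSplitter {
    // Return the byte offset of every Annex-B start code (00 00 01 or
    // 00 00 00 01) in the buffer; each NAL unit runs from one start
    // code to the next (or to the end of the buffered data).
    public static List<Integer> startCodeOffsets(byte[] buf) {
        List<Integer> offsets = new ArrayList<>();
        for (int i = 0; i + 2 < buf.length; i++) {
            if (buf[i] == 0 && buf[i + 1] == 0 && buf[i + 2] == 1) {
                // a preceding zero makes this the 4-byte form 00 00 00 01
                offsets.add((i > 0 && buf[i - 1] == 0) ? i - 1 : i);
                i += 2; // skip past the start code
            }
        }
        return offsets;
    }
}
```

The SPS and PPS NAL units found this way are also what you would pass to the decoder as csd-0/csd-1 when configuring the MediaFormat.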

Using Live555 to Stream Live Video from an IP camera connected to an H264 encoder

假如想象 submitted on 2019-12-03 07:36:09
I am using a custom Texas Instruments OMAP-L138 based board that basically consists of an ARM9-based SoC and a DSP processor, connected to a camera lens. What I'm trying to do is capture a live video stream, send it to the DSP processor for H264 encoding, and receive the encoded output over uPP in packets of 8192 bytes. I want to use the testH264VideoStreamer example supplied with Live555 to live-stream the H264-encoded video over RTSP. The code I have modified is shown below: #include <liveMedia.hh> #include <BasicUsageEnvironment.hh> #include <GroupsockHelper.hh> #include <stdio.h> #include <unistd.h>

H.264 RTSP Absolute TIMESTAMP

放肆的年华 submitted on 2019-12-03 07:34:46
Is it possible to read an absolute timestamp from an H.264 stream sent through RTSP from an Axis camera? It is necessary to know when each frame was captured by the camera. Thanks, Andrea. As Ralf already said, the RTP timestamps are relative to a random clock; they are only useful for computing the difference between two frames (or RTP packets in general). For synchronizing these relative values to a wall clock you can use the RTCP sender reports; just have a look at the links Ralf provided. For Axis products using H.264 this works pretty well. In case you're also using MPEG4, the Axis firmware
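The mapping the answer describes is simple arithmetic: each RTCP sender report carries an NTP wall-clock time together with the RTP timestamp that corresponds to it, and H.264 over RTP uses a 90 kHz media clock. A sketch of the computation, assuming you have already parsed those two fields out of the last sender report (class and parameter names are illustrative):

```java
public class RtcpMapper {
    static final int VIDEO_CLOCK_RATE = 90_000; // H.264 RTP media clock, ticks/second

    // Given the wall-clock time (milliseconds) and RTP timestamp carried
    // by the most recent RTCP sender report, compute the wall-clock time
    // of any other RTP timestamp from the same stream.
    public static long wallClockMillis(long srNtpMillis, long srRtpTs, long rtpTs) {
        long deltaTicks = rtpTs - srRtpTs;               // 90 kHz ticks since the SR
        return srNtpMillis + deltaTicks * 1000L / VIDEO_CLOCK_RATE;
    }
}
```

For example, a frame whose RTP timestamp is 90,000 ticks past the sender report's reference is exactly one second later in wall-clock time.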

How to write a Live555 FramedSource to allow me to stream H.264 live

[亡魂溺海] submitted on 2019-12-03 07:08:19
Question: I've been trying to write a class that derives from FramedSource in Live555 that will allow me to stream live data from my D3D9 application to an MP4 or similar. What I do each frame is grab the backbuffer into system memory as a texture, convert it from RGB to YUV420P, encode it using x264, and then ideally pass the NAL packets on to Live555. I made a class called H264FramedSource that derives from FramedSource, basically by copying the DeviceSource file. Instead of the input being an

extracting h264 raw video stream from mp4 or flv with ffmpeg generate an invalid stream

放肆的年华 submitted on 2019-12-03 06:02:40
I'm trying to extract the video stream from an mp4 or flv h264 video (a YouTube video) using ffmpeg. The original video (test.flv) plays without trouble in ffplay, but ffprobe gives an error as follows: ffprobe version N-55515-gbbbd959 Copyright (c) 2007-2013 the FFmpeg developers built on Aug 13 2013 18:06:32 with gcc 4.7.3 (GCC) configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libcaca --enable-libfreetype --enable-libgsm --enable
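A frequent reason an extracted stream comes out "invalid" is that MP4 and FLV store H.264 in length-prefixed AVCC form, while a raw .h264 file is expected to be in start-code (Annex-B) form; with ffmpeg the usual fix is to copy the stream through the h264_mp4toannexb bitstream filter (`-c:v copy -bsf:v h264_mp4toannexb`). The first few bytes of the output distinguish the two layouts, as in this illustrative check:

```java
public class H264Probe {
    // Annex-B streams begin with a 00 00 01 or 00 00 00 01 start code;
    // AVCC streams begin with a big-endian NAL-unit length instead, so
    // the fourth byte is almost never exactly 1.
    public static boolean looksLikeAnnexB(byte[] head) {
        if (head.length >= 3 && head[0] == 0 && head[1] == 0 && head[2] == 1) return true;
        return head.length >= 4 && head[0] == 0 && head[1] == 0
                && head[2] == 0 && head[3] == 1;
    }
}
```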

RTSP h.264 in google chrome browser

淺唱寂寞╮ submitted on 2019-12-03 05:32:56
Now that we have moved over to HTML5, many browsers like Google Chrome have banned the VLC web plugin. ... Is there any way to play h.264 / h.265 RTSP streams in browsers these days? Direct RTSP streaming is still not supported by browsers; if you have to play an RTSP stream in the browser, you need a proxy server that converts RTSP to an HTTP stream. There are many open-source projects that will do the work of RTSP-to-HTTP conversion, or you can use FFmpeg (used by VLC) to convert RTSP to HTTP and then stream it to the browser. These guys put together an RTSP-over-WebSocket player. https:/

H264 frame viewer

风格不统一 submitted on 2019-12-03 05:13:50
Question: Do you know any application that will display all the headers/parameters of a single H264 frame? I don't need to decode it, I just want to see how it is built up. Answer 1: Three ways come to mind (if you are looking for something free; otherwise google "h264 analysis" for paid options): download the h.264 parser (from this thread @ doom9 forums); download the h.264 reference software; libh264bitstream provides h.264 bitstream reading/writing. This should get you started. By the way, the h
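Even without a dedicated analyzer, the first layer of structure is easy to inspect by hand: the single byte that follows each start code is the NAL unit header, whose fields tell you what kind of payload follows (7 = SPS, 8 = PPS, 5 = IDR slice, 1 = non-IDR slice). A small decoder for that byte, following the field layout in the H.264 spec:

```java
public class NalHeader {
    public final int forbiddenZeroBit; // must be 0 in a valid stream
    public final int nalRefIdc;        // 0 = disposable, >0 = used as reference
    public final int nalUnitType;      // e.g. 7 = SPS, 8 = PPS, 5 = IDR slice

    // Decode the one-byte NAL unit header that follows each start code.
    public NalHeader(int headerByte) {
        forbiddenZeroBit = (headerByte >> 7) & 0x1;
        nalRefIdc        = (headerByte >> 5) & 0x3;
        nalUnitType      = headerByte & 0x1F;
    }
}
```

For instance, the byte 0x67 that typically opens a stream decodes to nal_unit_type 7, i.e. the sequence parameter set.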

Decode android's hardware encoded H264 camera feed using ffmpeg in real time

一世执手 submitted on 2019-12-03 04:24:32
I'm trying to use the hardware H264 encoder on Android to create video from the camera, and use FFmpeg to mux in audio (all on the Android phone itself). What I've accomplished so far is packetizing the H264 video into RTSP packets and decoding it with VLC (over UDP), so I know the video is at least correctly formatted. However, I'm having trouble getting the video data to ffmpeg in a format it can understand. I've tried sending the same RTSP packets to port 5006 on localhost (over UDP), then providing ffmpeg with the SDP file that tells it which local port the video stream is coming in on
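ffmpeg cannot probe a bare RTP stream on a UDP port by itself; the SDP file is what tells it the port, payload type, and codec. A minimal hedged example for H.264 RTP arriving on localhost port 5006 (the payload type 96 and packetization-mode value are illustrative and must match what the sender actually emits):

```
v=0
o=- 0 0 IN IP4 127.0.0.1
s=Android H264 feed
c=IN IP4 127.0.0.1
t=0 0
m=video 5006 RTP/AVP 96
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1
```

Newer ffmpeg builds also require whitelisting the protocols when reading such a file, e.g. `ffmpeg -protocol_whitelist file,udp,rtp -i stream.sdp ...`.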

convert H264 video to raw YUV format

十年热恋 submitted on 2019-12-03 04:14:36
Question: Is it possible to create a raw YUV video from an H264-encoded video using ffmpeg? I want to open the video with MATLAB and access the Luma, Cb and Cr components frame by frame. Answer 1: Yes you can, you just have to specify the pixel format. To get the whole list of formats: ffmpeg -pix_fmts | grep -i pixel_format_name For example, if you want to save the 1st video track of an mp4 file as a yuv420p (p means planar) file: ffmpeg -i video.mp4 -c:v rawvideo -pix_fmt yuv420p out.yuv Source: https:/
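To read the resulting file frame by frame (in MATLAB or anywhere else), the only arithmetic you need is the frame size: each yuv420p frame is a full-resolution Y plane followed by quarter-resolution Cb and Cr planes, i.e. width x height x 3/2 bytes, and frame n starts at n times that size. A small sketch of the layout math (helper names are illustrative):

```java
public class Yuv420pLayout {
    // Byte size of one yuv420p frame: a full-resolution Y plane plus
    // quarter-resolution Cb and Cr planes (width*height*3/2 total).
    public static long frameSize(int width, int height) {
        long y = (long) width * height;
        return y + y / 2;
    }

    // Byte offset of frame n in a raw .yuv file.
    public static long frameOffset(int width, int height, long n) {
        return n * frameSize(width, height);
    }
}
```

For a 320x240 clip this gives 115,200 bytes per frame: the first 76,800 bytes are Y, the next 19,200 are Cb, and the last 19,200 are Cr.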