H.264

Access libstagefright.so directly to decode an H.264 stream from the JNI layer in Android

こ雲淡風輕ζ submitted on 2019-11-29 00:17:01
Is there a way to access libstagefright.so directly to decode an H.264 stream from the JNI layer on Android 2.3 or above?

If your objective is to decode an elementary H.264 stream, your code will have to ensure that the stream is extracted, that the codec-specific data (primarily the SPS and PPS) is provided to the codec, and that frame data is provided along with timestamps. Across all Android versions, the most common interface would be OMXCodec, which is an abstraction over an underlying OMX component. In Gingerbread (Android 2.3) and ICS (Android 4.0.0), if you would like to create…
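The answer above says the stream must be "extracted" before feeding the codec. For an elementary (Annex-B) stream that means splitting on 00 00 01 / 00 00 00 01 start codes and pulling out the SPS/PPS NAL units. A minimal sketch of that step (Python used here for illustration only; the sample stream bytes are hypothetical):

```python
def split_annex_b(stream: bytes):
    """Split an Annex-B H.264 elementary stream into NAL units
    by scanning for 00 00 01 start codes (also matches the last
    three bytes of a 00 00 00 01 start code)."""
    starts, i, n = [], 0, len(stream)
    while i < n - 2:
        if stream[i] == 0 and stream[i + 1] == 0 and stream[i + 2] == 1:
            starts.append(i + 3)  # payload begins after the start code
            i += 3
        else:
            i += 1
    nals = []
    for j, s in enumerate(starts):
        end = (starts[j + 1] - 3) if j + 1 < len(starts) else n
        # trim trailing zero bytes: a NAL unit never ends in 0x00
        # (rbsp_stop_one_bit), so these belong to the next start code
        while end > s and stream[end - 1] == 0:
            end -= 1
        nals.append(stream[s:end])
    return nals

# hypothetical stream: SPS, PPS, then an IDR slice fragment
stream = bytes.fromhex(
    "00000001" "67420029"
    "00000001" "68ce3c80"
    "000001"   "6588"
)
for nal in split_annex_b(stream):
    print(nal[0] & 0x1F)  # NAL unit type: 7 (SPS), 8 (PPS), 5 (IDR)
```

The type-7 and type-8 units are what would be handed to the decoder as codec-specific data before any frame data.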

How to decode sprop-parameter-sets in an H.264 SDP?

时间秒杀一切 submitted on 2019-11-28 23:31:49
What is the meaning of the Base64-decoded bytes in sprop-parameter-sets in the SDP for an H.264 stream? How can I work out the video size from this example?

SDP example: sprop-parameter-sets=Z0IAKeNQFAe2AtwEBAaQeJEV,aM48gA==
First part, decoded from Base64 to Base16: 67 42 00 29 E3 50 14 07 B6 02 DC 04 04 06 90 78 91 15
Second part (comma-separated): 68 CE 3C 80

Answer (Jonathan Websdale, "Fetching the dimensions of an H.264 video stream"): The spec you require is available for free download from the ITU website: H.264 (03/10). Select the freely downloadable PDF and you'll find the format detailed in section 7…
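The two comma-separated fields decode mechanically: the first is the SPS NAL unit (type 7) and the second the PPS (type 8), which is how you can tell the video dimensions live in the first field. A quick sketch:

```python
import base64

# the sprop-parameter-sets value from the SDP example above
sprop = "Z0IAKeNQFAe2AtwEBAaQeJEV,aM48gA=="
sps_b64, pps_b64 = sprop.split(",")

sps = base64.b64decode(sps_b64)
pps = base64.b64decode(pps_b64)

print(sps.hex())       # 67420029e350... as quoted in the question
print(sps[0] & 0x1F)   # low 5 bits of the first byte = NAL type 7 (SPS)
print(pps[0] & 0x1F)   # NAL type 8 (PPS)
```

Parsing the width and height out of the SPS then requires the exp-Golomb bitstream walk described in section 7 of the spec.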

Capturing an H.264 stream from a camera with GStreamer

半腔热情 submitted on 2019-11-28 21:34:41
I'm trying to capture an H.264 stream from a locally installed Logitech C920 camera at /dev/video0 with the GStreamer 1.0 v4l2src element. v4l2-ctl --list-formats shows that the camera can deliver the H264 video format:

# v4l2-ctl --list-formats
ioctl: VIDIOC_ENUM_FMT
...
Index : 1
Type : Video Capture
Pixel Format: 'H264' (compressed)
Name : H.264
...

But the pipeline

# gst-launch-1.0 -vvv v4l2src device=/dev/video0 ! video/x-h264, width=800, height=448, framerate=30/1 ! fakesink

keeps giving me a not-negotiated (-4) error:

/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-h264, width=…

Android encoder muxer: raw H.264 to MP4 container

∥☆過路亽.° submitted on 2019-11-28 20:57:38
I created a raw H.264 video file, and I was able to mux it with Android MediaMuxer on Android 4.3 and up. Now I need to support Android 4.1 and 4.2, so I found JCodec, which has an example for doing this: https://github.com/jcodec/jcodec/blob/master/samples/main/java/org/jcodec/samples/mux/AVCMP4Mux.java But I'm getting a java.nio.ReadOnlyBufferException at line 70: H264Utils.encodeMOVPacket(data); I guess this code is not meant for Android? How do I fix this? Can someone familiar with JCodec help?

Answer (xy uber.com): I gave up on JCodec. It exposes too much codec-internal stuff, and…

Get the width/height of the video from an H.264 NALU

拥有回忆 submitted on 2019-11-28 20:55:34
I have gotten the SPS NALU (from the AVC decoder configuration record) and am trying to parse the video width/height from it:

67 64 00 15 ac c8 60 20 09 6c 04 40 00 00 03 00 40 00 00 07 a3 c5 8b 67 80

This is my code to parse the SPS, but it gets the wrong values: pic_width_in_mbs_minus1 comes out as 5 and pic_height_in_map_units_minus1 as 1, while the video is actually 512 x 288 px.

typedef struct _SequenceParameterSet
{
private:
    const unsigned char * m_pStart;
    unsigned short m_nLength;
    int m_nCurrentBit;

    unsigned int ReadBit()
    {
        ATLASSERT(m_nCurrentBit <= m_nLength * 8);
        int nIndex = m_nCurrentBit / 8;
        int nOffset = m…
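This SPS starts with profile_idc 0x64 (High profile, 100), so the parser must read chroma_format_idc, the bit-depth fields, and the scaling-matrix flag before it reaches the width/height syntax elements; skipping them is the usual cause of the wrong values reported in the question. It must also strip 00 00 03 emulation-prevention bytes. A minimal reference sketch in Python (not the asker's C++ code) that handles both and recovers 512 x 288 from the bytes above:

```python
def strip_emulation_prevention(nal: bytes) -> bytes:
    """Remove 0x03 emulation-prevention bytes (00 00 03 -> 00 00)."""
    out, zeros = bytearray(), 0
    for b in nal:
        if zeros >= 2 and b == 3:
            zeros = 0
            continue
        out.append(b)
        zeros = zeros + 1 if b == 0 else 0
    return bytes(out)

class BitReader:
    def __init__(self, data: bytes):
        self.data, self.pos = data, 0
    def u(self, n):  # read n bits, MSB first
        val = 0
        for _ in range(n):
            val = (val << 1) | ((self.data[self.pos // 8] >> (7 - self.pos % 8)) & 1)
            self.pos += 1
        return val
    def ue(self):  # unsigned exp-Golomb
        zeros = 0
        while self.u(1) == 0:
            zeros += 1
        return (1 << zeros) - 1 + self.u(zeros)
    def se(self):  # signed exp-Golomb
        k = self.ue()
        return (k + 1) // 2 if k % 2 else -(k // 2)

def parse_sps_dimensions(sps: bytes):
    r = BitReader(strip_emulation_prevention(sps))
    r.u(8)                       # NAL header (0x67 here)
    profile_idc = r.u(8)
    r.u(8); r.u(8)               # constraint flags, level_idc
    r.ue()                       # seq_parameter_set_id
    if profile_idc in (100, 110, 122, 244, 44, 83, 86, 118, 128):
        if r.ue() == 3:          # chroma_format_idc
            r.u(1)               # separate_colour_plane_flag
        r.ue(); r.ue()           # bit_depth_luma/chroma_minus8
        r.u(1)                   # qpprime_y_zero_transform_bypass_flag
        if r.u(1):               # seq_scaling_matrix_present_flag
            raise NotImplementedError("scaling lists not handled in this sketch")
    r.ue()                       # log2_max_frame_num_minus4
    poc_type = r.ue()
    if poc_type == 0:
        r.ue()                   # log2_max_pic_order_cnt_lsb_minus4
    elif poc_type == 1:
        r.u(1); r.se(); r.se()
        for _ in range(r.ue()):
            r.se()
    r.ue()                       # max_num_ref_frames
    r.u(1)                       # gaps_in_frame_num_value_allowed_flag
    width = (r.ue() + 1) * 16    # pic_width_in_mbs_minus1
    height_units = r.ue() + 1    # pic_height_in_map_units_minus1
    frame_mbs_only = r.u(1)
    if not frame_mbs_only:
        r.u(1)                   # mb_adaptive_frame_field_flag
    r.u(1)                       # direct_8x8_inference_flag
    height = height_units * 16 * (2 - frame_mbs_only)
    if r.u(1):                   # frame_cropping_flag (4:2:0 crop units assumed)
        left, right, top, bottom = r.ue(), r.ue(), r.ue(), r.ue()
        width -= (left + right) * 2
        height -= (top + bottom) * 2 * (2 - frame_mbs_only)
    return width, height

sps = bytes.fromhex("67640015acc86020096c04400000030040000007a3c58b6780")
print(parse_sps_dimensions(sps))  # (512, 288)
```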

MediaCodec H264 Encoder not working on Snapdragon 800 devices

一笑奈何 submitted on 2019-11-28 19:21:27
Question: I have written an H.264 stream encoder using the MediaCodec API of Android. I tested it on about ten different devices with different processors, and it worked on all of them except Snapdragon 800-powered ones (Google Nexus 5 and Sony Xperia Z1). On those devices I get the SPS, PPS, and the first keyframe, but after that mEncoder.dequeueOutputBuffer(mBufferInfo, 0) only returns MediaCodec.INFO_TRY_AGAIN_LATER. I have already experimented with different timeouts, bitrates, resolutions, and other…

How do I use hardware-accelerated video/H.264 decoding with DirectX 11 on Windows 7?

混江龙づ霸主 submitted on 2019-11-28 19:18:35
Question: I've been researching all day and haven't gotten very far. I'm on Windows 7, using DirectX 11. (My final output is to be a frame of video on a DX11 texture.) I want to decode some very large H.264 video files, and the CPU (using libav) doesn't cut it. I've looked at the hwaccel capabilities of libav using DXVA2, but hit a roadblock when I need to create an IDirectXVideoDecoder, which can only be created with a D3D9 interface (which I don't have, using DX11). Whenever I've looked up DXVA…

Stream H.264 video over RTP using GStreamer

主宰稳场 submitted on 2019-11-28 18:09:44
Question: I am a newbie with GStreamer and am trying to get used to it. My first target is to create a simple RTP stream of H.264 video between two devices. I am using these two pipelines:

Sender: gst-launch-1.0 -v filesrc location=c:\\tmp\\sample_h264.mov ! x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5000
Receiver: gst-launch-1.0 -v udpsrc port=5000 ! rtpmp2tdepay ! decodebin ! autovideosink

But with the first one (the sender) I get the following error:

Setting pipeline to PAUSED ...
Pipeline…

Extracting H.264 from a CMBlockBuffer

a 夏天 submitted on 2019-11-28 17:59:40
Question: I am using the Apple Video Toolbox (iOS) to compress raw frames captured by the device camera. My callback is called with a CMSampleBufferRef object that contains a CMBlockBuffer. The CMBlockBuffer object contains the H.264 elementary stream, but I haven't found any way to get a pointer to that stream. When I printed the CMSampleBufferRef object to the console I got:

(lldb) po blockBufferRef
CMBlockBuffer 0x1701193e0 totalDataLength: 4264 retainCount: 1 allocator: 0x1957c2c80…
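A common approach (not shown in the truncated question) is to read the raw bytes out with CMBlockBufferGetDataPointer; what comes back is AVCC format, i.e. each NAL unit prefixed with a big-endian length rather than an Annex-B start code. The byte-level rewrite from one to the other can be sketched like this (Python for illustration; the sample buffer is hypothetical, and a 4-byte length prefix is assumed):

```python
import struct

def avcc_to_annexb(avcc: bytes, length_size: int = 4) -> bytes:
    """Rewrite AVCC length-prefixed NAL units into Annex-B start-code form."""
    out = bytearray()
    i = 0
    while i + length_size <= len(avcc):
        nal_len = int.from_bytes(avcc[i:i + length_size], "big")
        i += length_size
        out += b"\x00\x00\x00\x01" + avcc[i:i + nal_len]  # start code + payload
        i += nal_len
    return bytes(out)

# hypothetical buffer: two NALs with 4-byte big-endian length prefixes
avcc = struct.pack(">I", 2) + b"\x65\x88" + struct.pack(">I", 3) + b"\x41\x9a\x10"
print(avcc_to_annexb(avcc).hex())  # 00000001658800000001419a10
```

The actual length-prefix size should be taken from the stream's avcC configuration record rather than assumed.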

How to decode H.264 video frames in a Java environment

 ̄綄美尐妖づ submitted on 2019-11-28 17:35:34
Does anyone know how to decode H.264 video frames in a Java environment? My network camera products support RTP/RTSP streaming. The standard RTP/RTSP service from my network camera is provided, and it also supports "RTP/RTSP over HTTP". RTSP: TCP 554; RTP start port: UDP 5000.

Answer: Or use Xuggler. It works with RTP, RTMP, HTTP, and other protocols, and can decode and encode H.264 and most other codecs. It is also actively maintained, free, and open source (LGPL).

Another answer: I think the best solution is "JNI + ffmpeg". In my current project, I need to play several full-screen videos at the same time in a Java OpenGL…