openmax

Gstreamer with gst-omx Raspberry Pi

半腔热情 submitted on 2020-01-12 08:40:11
Question: I compiled GStreamer with gst-omx following this tutorial: http://www.onepitwopi.com/raspberry-pi/gstreamer-1-2-on-the-raspberry-pi/ Everything went fine, and at the end, when I ran gst-inspect-1.0 | grep omx, I got:

omx: omxmpeg2videodec: OpenMAX MPEG2 Video Decoder
omx: omxmpeg4videodec: OpenMAX MPEG4 Video Decoder
omx: omxh263dec: OpenMAX H.263 Video Decoder
omx: omxh264dec: OpenMAX H.264 Video Decoder
omx: omxtheoradec: OpenMAX Theora Video Decoder
omx: omxvp8dec: OpenMAX VP8 Video …
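
With those elements registered, a quick sanity check is to run an H.264 file through the hardware decoder. This is only a minimal sketch; the file name test.mp4 and the assumption that it holds H.264 video in an MP4 container are illustrative, not taken from the question:

gst-launch-1.0 filesrc location=test.mp4 ! qtdemux ! h264parse ! omxh264dec ! autovideosink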

how do you build gstreamer's gst-launch pipelines?

不问归期 submitted on 2019-12-25 01:44:27
Question: Let's say you have a video file. As far as I have been able to find out, you first need to know what container it uses, via the mediainfo command:

$ mediainfo your_path_to_a_video.file

You then need to find a demuxer for that container, so you run:

$ gst-inspect-1.0 | grep your_container_name_such_as_ogg

Now that you have a proper demuxer, such as oggdemux, you can split the video and audio. If you want to display the video, you first need to know the codec name, and you will need to decode it to output it to the screen. …
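
Putting those steps together for an Ogg file, a hand-built pipeline might look like the sketch below. It assumes the container holds Theora video and Vorbis audio, and the file name is a placeholder; in practice, decodebin (or simply playbin) can pick the demuxer and decoders for you.

$ gst-launch-1.0 filesrc location=your_video.ogg ! oggdemux name=demux \
    demux. ! queue ! theoradec ! videoconvert ! autovideosink \
    demux. ! queue ! vorbisdec ! audioconvert ! autoaudiosink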

Enabling Hardware Encoder in Jelly bean 4.1.1 rowboat DM3730

雨燕双飞 submitted on 2019-12-24 23:39:06
Question: Please excuse the rather long description of the problem. I have a custom board with a DM3730 processor and am building Android rowboat from http://code.google.com/p/rowboat/wiki/JellybeanOnBeagleboard_WithSGX

OBJECTIVE: ENABLING HARDWARE DECODER.

2.1) For that, I need the OpenMAX IL interface, so I looked at the source code downloaded from TI. But I do not find an omap3/ directory under hardware/ti/, which holds the OMX implementation.

2.2) Hence I downloaded the AOSP Jelly Bean code by: git …

How MediaCodec finds the codec inside the framework in Android?

别来无恙 submitted on 2019-12-20 23:25:29
Question: I am trying to understand how MediaCodec is used for hardware decoding. My knowledge of Android internals is very limited. Here are my findings: there is an XML file that describes the codecs available on an Android system, device/ti/omap3evm/media_codecs.xml for example. That means that if we create a codec from a Java application with MediaCodec,

MediaCodec codec = MediaCodec.createDecoderByType(type);

it should find the respective codec with the help of that XML file. What am I …
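
For reference, media_codecs.xml maps MIME types to OMX component names. A trimmed, illustrative excerpt is shown below; the component names are placeholders, not taken from an actual omap3evm file:

<MediaCodecs>
    <Decoders>
        <MediaCodec name="OMX.TI.Video.Decoder" type="video/avc" />
        <MediaCodec name="OMX.google.h264.decoder" type="video/avc" />
    </Decoders>
</MediaCodecs>

Roughly speaking, createDecoderByType("video/avc") walks the decoder list built from this file and instantiates the first component whose type attribute matches.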

Access camera via OpenMAX in Android

孤街浪徒 submitted on 2019-12-13 11:54:02
Question: I am currently trying to figure out how to access the camera via OpenMAX AL in Android 4.0. The documentation is not sufficient for me, so I am struggling with how to obtain the correct XADataSource for the following call:

(*_engine)->CreateMediaRecorder(_engine,
    &_mediaRecorder,        // pRecorder
    nullptr,                // pAudioSrc
    XADataSource *,         // pImageVideoSrc
    XADataSink *,           // pDataSnk
    XAuint32,               // numInterfaces
    const XAInterfaceID *,  // pInterfaceIds
    const XAboolean *       // pInterfaceRequired
);

And please spare …
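
In the OpenMAX AL specification, a camera source is expressed as an I/O-device data locator rather than a URI or MIME format. A minimal sketch of what pImageVideoSrc could look like is below; this is the spec-level pattern, not a verified answer for Android 4.0's implementation:

XADataLocator_IODevice camLocator;
camLocator.locatorType = XA_DATALOCATOR_IODEVICE;
camLocator.deviceType  = XA_IODEVICE_CAMERA;
camLocator.deviceID    = XA_DEFAULTDEVICEID_CAMERA;
camLocator.device      = NULL;   /* NULL means use the device identified by deviceID */

XADataSource imageVideoSrc;
imageVideoSrc.pLocator = &camLocator;
imageVideoSrc.pFormat  = NULL;   /* a device locator needs no MIME format */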

Creating Android app using OpenMAX library in GB, but showing not found?

北城余情 submitted on 2019-12-12 03:04:34
Question: I'm trying to develop an Android app on Gingerbread using OpenMAX AL. The OpenMAX AL library is basically not supported on Gingerbread, so I am using the OpenMAX AL library from ICS (by building it from source). I was able to compile my sample application against that library without errors, but after installing it on Gingerbread it shows the error below. I also tried pushing the library into my application's lib folder, with the same error. What should I do to use the OpenMAX AL library on Gingerbread? Can anyone suggest …

Variable length structures

冷暖自知 submitted on 2019-12-11 08:14:49
Question: OMX provides a struct with the following definition:

/* Parameter specifying the content URI to use. */
typedef struct OMX_PARAM_CONTENTURITYPE {
    OMX_U32 nSize;            /**< size of the structure in bytes */
    OMX_VERSIONTYPE nVersion; /**< OMX specification version information */
    OMX_U8 contentURI[1];     /**< The URI name */
} OMX_PARAM_CONTENTURITYPE;

OMX_IndexParamContentURI, /**< The URI that identifies the target content. Data type is OMX_PARAM_CONTENTURITYPE. */

I have got a constant char array to set. …
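
Because contentURI[1] is only a placeholder for a variable-length string, the usual pattern is to over-allocate the structure so the URI (plus its terminator) fits, and record the total size in nSize. A sketch of that pattern, assuming hComp is an already-obtained OMX_HANDLETYPE (the helper name is illustrative; error checking and exact include paths depend on your OMX headers):

#include <stdlib.h>
#include <string.h>
#include <OMX_Core.h>
#include <OMX_Component.h>

static OMX_ERRORTYPE set_content_uri(OMX_HANDLETYPE hComp, const char *uri)
{
    /* contentURI[1] already reserves one byte, so strlen(uri) extra bytes
       leave room for the terminating NUL. */
    size_t total = sizeof(OMX_PARAM_CONTENTURITYPE) + strlen(uri);
    OMX_PARAM_CONTENTURITYPE *param = calloc(1, total);

    param->nSize = (OMX_U32) total;
    param->nVersion.s.nVersionMajor = 1;   /* set to match the component's OMX version */
    param->nVersion.s.nVersionMinor = 1;
    strcpy((char *) param->contentURI, uri);

    OMX_ERRORTYPE err = OMX_SetParameter(hComp, OMX_IndexParamContentURI, param);
    free(param);
    return err;
}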

Android: MPEG4Writer fails to start when using OMXCodec as MediaSource

筅森魡賤 submitted on 2019-12-09 18:44:26
Question: I'm trying to encode a video from a byte-array buffer, and to do so I'm using the MPEG4Writer API from native code. I have created my custom MediaSource class to provide the data, and I'm wrapping it with OMXCodec to give it to MPEG4Writer:

sp<MediaSource> mVideoEncoder = OMXCodec::Create(client.interface(), omxEncMeta, true, mVideoOutSource);
mVideoEncoder->start();

mVideoOutSource is my custom MediaSource class; omxEncMeta is the following:

int32_t colorFormat = OMX_COLOR_FormatYUV420SemiPlanar; …
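
The excerpt cuts off while defining omxEncMeta. For orientation, an encoder MetaData block in stagefright of that era typically looks roughly like the sketch below; the width, height, bit rate and frame rate values are placeholders, not taken from the question, and a commonly reported reason for MPEG4Writer failing to start is missing mandatory keys (MIME type, dimensions) in the metadata the source or encoder reports.

// assumed headers (Jelly Bean era paths): <media/stagefright/MetaData.h>,
// <media/stagefright/MediaDefs.h>, <OMX_IVCommon.h>
sp<MetaData> omxEncMeta = new MetaData;
omxEncMeta->setCString(kKeyMIMEType, MEDIA_MIMETYPE_VIDEO_AVC);          // H.264 output
omxEncMeta->setInt32(kKeyWidth, 640);                                    // placeholder
omxEncMeta->setInt32(kKeyHeight, 480);                                   // placeholder
omxEncMeta->setInt32(kKeyColorFormat, OMX_COLOR_FormatYUV420SemiPlanar); // input buffer layout
omxEncMeta->setInt32(kKeyFrameRate, 30);                                 // placeholder
omxEncMeta->setInt32(kKeyBitRate, 2000000);                              // placeholder
omxEncMeta->setInt32(kKeyIFramesInterval, 1);                            // I-frame every second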

OpenMAX recorder on Android

爷,独闯天下 submitted on 2019-12-07 20:18:30
Question: I am trying to record my video buffer from memory to flash in H.264 format, and I am using this code to initialize the recorder. What format should I use for dataSrc?

XADataLocator_URI locUri;
locUri.locatorType = XA_DATALOCATOR_URI;
locUri.URI = (XAchar *) "/sdcard/test.ts";

XADataFormat_MIME format_mime = {
    XA_DATAFORMAT_MIME, XA_ANDROID_MIME_MP2TS, XA_CONTAINERTYPE_MPEG_TS
};

XADataSource dataDst = {&locUri, &format_mime};
XADataSource dataSrc = {&locUri, &format_mime};
XADataSink …
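
In the OpenMAX AL spec, the URI/MIME pair built above describes where the recording goes (the sink side), while a recorder's source is normally an I/O device such as the camera, not a URI. A spec-level sketch of dataSrc, offered as the generic OpenMAX AL pattern rather than a verified working configuration on Android 4.x:

/* The camera as an I/O-device source; a device locator needs no MIME format. */
XADataLocator_IODevice camLoc = {
    XA_DATALOCATOR_IODEVICE, XA_IODEVICE_CAMERA, XA_DEFAULTDEVICEID_CAMERA, NULL
};
XADataSource dataSrc = { &camLoc, NULL };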

Capturing jpegs from an h264 stream with gstreamer on a Raspberry Pi

浪尽此生 submitted on 2019-12-06 13:40:47
Question: I have one of the new camera add-ons for a Raspberry Pi. It doesn't have video4linux support yet, but it comes with a small program that spits out a 1080p H.264 stream. I have verified this works and got it pushing the video to stdout with:

raspivid -n -t 1000000 -vf -b 2000000 -fps 25 -o -

I would like to process this stream so that I end up with a snapshot of the video taken once a second. Since it's 1080p, I will need to use the RPi's hardware support for H.264 decoding. I believe gstreamer is …
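
One way to do this is to pipe raspivid into a gst-launch pipeline that decodes with the hardware omxh264dec element, drops the rate to one frame per second, and writes numbered JPEGs. A sketch, assuming gst-omx is installed; the output file pattern is a placeholder:

raspivid -n -t 1000000 -vf -b 2000000 -fps 25 -o - | \
gst-launch-1.0 fdsrc ! h264parse ! omxh264dec ! videorate ! \
    video/x-raw,framerate=1/1 ! videoconvert ! jpegenc ! \
    multifilesink location=snapshot_%05d.jpg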