h.264

Slow H264 1080P@60fps Decoding on Android Lollipop 5.0.2

冷暖自知 submitted on 2019-12-08 06:38:25
Question: I'm developing a Java RTP streaming app for a company project, which should be capable of joining a multicast server and receiving RTP packets. Later I use the H264 depacketizer to recreate a complete frame from the NAL FUs (keep appending data until the End bit and Marker bit are set). I want to decode and display a raw h264 video byte stream in Android, and therefore I'm currently using the MediaCodec classes configured with a hardware decoder. The application is up and running on Jelly Bean…
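
The excerpt cuts off before reaching the decoding itself, so here is a minimal sketch of the feed/drain cycle being described. It uses the NDK AMediaCodec API (available from Android 5.0, matching the Lollipop target of the title) rather than the Java MediaCodec class, so it stays in C++ like the other sketches on this page; the codec creation/configuration and the names au/auSize/ptsUs are assumptions, not code from the post.

    #include <cstdint>
    #include <cstring>
    #include <media/NdkMediaCodec.h>

    // Feed one complete H264 access unit (already reassembled from FU-A
    // fragments, as the question describes) and drain any ready output.
    // 'codec' was created with AMediaCodec_createDecoderByType("video/avc")
    // and configured with an output Surface elsewhere. Link with -lmediandk.
    void decodeAccessUnit(AMediaCodec* codec, const uint8_t* au, size_t auSize,
                          int64_t ptsUs) {
        ssize_t inIdx = AMediaCodec_dequeueInputBuffer(codec, 10000 /*us*/);
        if (inIdx >= 0) {
            size_t cap = 0;
            uint8_t* buf = AMediaCodec_getInputBuffer(codec, inIdx, &cap);
            if (buf && auSize <= cap) {
                memcpy(buf, au, auSize);
                AMediaCodec_queueInputBuffer(codec, inIdx, 0, auSize, ptsUs, 0);
            }
        }
        AMediaCodecBufferInfo info;
        ssize_t outIdx = AMediaCodec_dequeueOutputBuffer(codec, &info, 0);
        if (outIdx >= 0) {
            // 'true' renders the decoded frame to the configured Surface.
            AMediaCodec_releaseOutputBuffer(codec, outIdx, true);
        }
    }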

Media Foundation H264 decoder not working properly

馋奶兔 submitted on 2019-12-08 06:20:48
Question: I'm creating an application for video conferencing using Media Foundation, and I'm having an issue decoding the H264 video frames I receive over the network. The design: Currently my network source queues a token on every sample request, unless a stored sample is available. If a sample arrives over the network and no token is available, the sample is stored in a linked list; otherwise it is queued with the MEMediaSample event. I also have the decoder set to low latency. My issue: When…
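
The post mentions having "the decoder set to low latency"; for reference, this is typically done on the Windows 8+ H264 decoder MFT through the MF_LOW_LATENCY attribute. A minimal sketch, with error handling trimmed and the surrounding setup assumed:

    #include <mfapi.h>
    #include <mfidl.h>
    #include <mftransform.h>

    // Ask the decoder MFT to emit each frame as soon as it is decodable
    // instead of buffering frames for reordering (MF_LOW_LATENCY, Win8+).
    HRESULT EnableLowLatency(IMFTransform* decoder) {
        IMFAttributes* attrs = nullptr;
        HRESULT hr = decoder->GetAttributes(&attrs);
        if (SUCCEEDED(hr)) {
            hr = attrs->SetUINT32(MF_LOW_LATENCY, TRUE);
            attrs->Release();
        }
        return hr;
    }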

H264 RTP and packetization mode (no STAP-A in baseline H264 RTP)

喜欢而已 submitted on 2019-12-08 04:14:06
Question: There is a spec that literally requires this: "When the SDP negotiation results in the use of the Baseline Profile (BP), a client shall not send Single-Time Aggregation Packet type A (STAP-A) packets, even when the packetization-mode 1 has been negotiated. When accepting the use of the Constrained Baseline Profile (CBP) a client shall support the use of STAP-A packets when packetization-mode 1 was negotiated." Can anybody comment on that? Doesn't that sound like complete nonsense? How is that even…
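
Whatever one thinks of the requirement, a receiver still has to recognize all three payload structures. As background, here is a sketch of how an RFC 6184 receiver dispatches on the payload's NAL unit type; the two handler functions are hypothetical:

    #include <cstddef>
    #include <cstdint>

    // Hypothetical per-NAL handlers, implemented elsewhere.
    void handleSingleNal(const uint8_t* nal, size_t len);
    void handleFuA(const uint8_t* payload, size_t len);

    // RFC 6184: types 1-23 are single NAL units (the only structures allowed
    // in packetization-mode 0); 24 is STAP-A and 28 is FU-A, both of which
    // require packetization-mode 1.
    void handleH264Payload(const uint8_t* p, size_t len) {
        if (len < 1) return;
        uint8_t nalType = p[0] & 0x1F;            // low 5 bits of first byte
        if (nalType >= 1 && nalType <= 23) {
            handleSingleNal(p, len);
        } else if (nalType == 24) {               // STAP-A: [16-bit size][NALU]...
            size_t off = 1;
            while (off + 2 <= len) {
                size_t sz = (size_t(p[off]) << 8) | p[off + 1];
                off += 2;
                if (off + sz > len) break;
                handleSingleNal(p + off, sz);
                off += sz;
            }
        } else if (nalType == 28) {               // FU-A: reassemble fragments
            handleFuA(p, len);
        }
    }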

Change h.264 quality when using SinkWriter to encode video

早过忘川 submitted on 2019-12-08 02:36:01
Question: I am using Microsoft Media Foundation to encode an H.264 video file. I am using the SinkWriter to create the video file. The input is a buffer (MFVideoFormat_RGB32) into which I draw the frames, and the output is MFVideoFormat_H264. The encoding works and it creates a video file with my frames in it. But I want to set the quality of that video file. More specifically, I want to set the CODECAPI_AVEncCommonQuality property on the H.264 encoder. In order to get a handle to the H.264 encoder, I…
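
The excerpt stops mid-sentence, but the usual route to the encoder from a SinkWriter is GetServiceForStream, which hands back the encoder's ICodecAPI once the input media type has been set. A sketch under that assumption (Windows 8+; on Windows 7 the property reportedly has to be supplied up front via the MF_SINK_WRITER_ENCODER_CONFIG attribute instead):

    #include <windows.h>
    #include <mfreadwrite.h>
    #include <codecapi.h>
    #include <icodecapi.h>

    // Switch the H.264 encoder behind a SinkWriter stream to quality-based
    // VBR and set CODECAPI_AVEncCommonQuality (0-100).
    HRESULT SetH264Quality(IMFSinkWriter* writer, DWORD streamIndex, UINT32 quality) {
        ICodecAPI* codecApi = nullptr;
        HRESULT hr = writer->GetServiceForStream(streamIndex, GUID_NULL,
                                                 IID_PPV_ARGS(&codecApi));
        if (FAILED(hr)) return hr;

        VARIANT v;
        VariantInit(&v);
        v.vt = VT_UI4;
        v.ulVal = eAVEncCommonRateControlMode_Quality;
        hr = codecApi->SetValue(&CODECAPI_AVEncCommonRateControlMode, &v);
        if (SUCCEEDED(hr)) {
            v.ulVal = quality;
            hr = codecApi->SetValue(&CODECAPI_AVEncCommonQuality, &v);
        }
        codecApi->Release();
        return hr;
    }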

How to map a decoded buffer from ffmpeg into QVideoFrame?

心不动则不痛 submitted on 2019-12-07 21:07:53
Question: I'm trying to put my decoded ffmpeg buffer into a QVideoFrame so I can put this frame into a QAbstractVideoBuffer and then feed that buffer to a QMediaPlayer. Here's the code for the VideoSurface. According to Qt's documentation, I just have to implement these two functions: the constructor and bool present(), which processes the frame passed in as the QVideoFrame named frame:

    QList<QVideoFrame::PixelFormat> VideoSurface::supportedPixelFormats(
        QAbstractVideoBuffer::HandleType handleType = QAbstractVideoBuffer…
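
The truncated code above is only the surface's format list; the part the question is really about is filling the QVideoFrame from the decoder output. A sketch of one way to do it with Qt 5's API, assuming a decoded YUV420P AVFrame with even dimensions; none of this is from the post:

    #include <cstring>
    #include <QAbstractVideoBuffer>
    #include <QSize>
    #include <QVideoFrame>
    extern "C" {
    #include <libavutil/frame.h>
    }

    // Copy a decoded YUV420P AVFrame into a CPU-backed QVideoFrame. The copy
    // is row by row because ffmpeg pads each plane's rows (linesize >= width).
    QVideoFrame toQVideoFrame(const AVFrame* src) {
        QSize size(src->width, src->height);
        QVideoFrame frame(src->width * src->height * 3 / 2, size,
                          src->width, QVideoFrame::Format_YUV420P);
        if (!frame.map(QAbstractVideoBuffer::WriteOnly))
            return QVideoFrame();
        uchar* dst = frame.bits();
        for (int plane = 0; plane < 3; ++plane) {
            int w = plane ? src->width / 2 : src->width;
            int h = plane ? src->height / 2 : src->height;
            for (int y = 0; y < h; ++y) {
                std::memcpy(dst, src->data[plane] + y * src->linesize[plane], w);
                dst += w;
            }
        }
        frame.unmap();
        return frame;
    }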

OpenMAX recorder on Android

爷，独闯天下 submitted on 2019-12-07 20:18:30
Question: I am trying to record my video buffer from memory to flash in h.264 format and am using this code to initialize the recorder. What format should I use for dataSrc?

    XADataLocator_URI locUri;
    locUri.locatorType = XA_DATALOCATOR_URI;
    locUri.URI = (XAchar *) "/sdcard/test.ts";
    XADataFormat_MIME format_mime = {
        XA_DATAFORMAT_MIME, XA_ANDROID_MIME_MP2TS, XA_CONTAINERTYPE_MPEG_TS
    };
    XADataSource dataDst = {&locUri, &format_mime};
    XADataSource dataSrc = {&locUri, &format_mime};
    XADataSink …
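
For what it's worth, in the OpenMAX AL specification a recorder's dataSrc is an I/O-device locator (here, the camera) rather than a URI/MIME pair; the URI/MIME pair above belongs on the sink side. A spec-level sketch, not verified on Android, where as far as I know the NDK's OpenMAX AL subset implements playback only:

    #include <OMXAL/OpenMAXAL.h>

    // Camera as the recorder's source; the format pointer is unused for
    // I/O-device locators. locUri/format_mime are the ones defined above.
    XADataLocator_IODevice locCam;
    locCam.locatorType = XA_DATALOCATOR_IODEVICE;
    locCam.deviceType  = XA_IODEVICE_CAMERA;
    locCam.deviceID    = XA_DEFAULTDEVICEID_CAMERA;
    locCam.device      = NULL;

    XADataSource dataSrc = { &locCam, NULL };
    XADataSink   dataDst = { &locUri, &format_mime };   // file sink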

How to use AVAssetWriter to write an h264 stream into video?

孤人 submitted on 2019-12-07 14:29:41
Question: I want to write the h.264 stream from the server to a video file, but when I call assetWriter.finishWriting, Xcode reports:

    Video /var/mobile/Applications/DE4196F1-BB77-4B7D-8C20-7A5D6223C64D/Documents/test.mov
    cannot be saved to the saved photos album: Error Domain=NSOSStatusErrorDomain
    Code=-12847 "This movie format is not supported."
    UserInfo=0x5334830 {NSLocalizedDescription=This movie format is not supported.}

Below is my code: data is the h.264 frame, just one frame; it might be an I-frame or…
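
The writer code itself is Objective-C and is cut off here, but one framing detail often bites when wrapping raw network streams: MOV/MP4 samples must be AVCC-framed (length-prefixed), while streams off the wire are Annex-B-framed (start codes). Whether or not that is the cause of this particular -12847, here is a sketch of the conversion for one access unit; SPS/PPS additionally belong in the track's format description (the avcC box), not in the sample data:

    #include <cstdint>
    #include <utility>
    #include <vector>

    // Replace Annex-B start codes (00 00 01 / 00 00 00 01) with 4-byte
    // big-endian NAL length prefixes, as MP4/MOV samples require.
    std::vector<uint8_t> annexBToAvcc(const uint8_t* p, size_t len) {
        std::vector<std::pair<size_t, size_t>> nals;   // (payload begin, end)
        size_t i = 0, cur = SIZE_MAX;
        while (i + 2 < len) {
            bool sc3 = p[i] == 0 && p[i + 1] == 0 && p[i + 2] == 1;
            bool sc4 = i + 3 < len && p[i] == 0 && p[i + 1] == 0 &&
                       p[i + 2] == 0 && p[i + 3] == 1;
            if (sc3 || sc4) {
                if (cur != SIZE_MAX) nals.push_back({cur, i});
                cur = i + (sc4 ? 4 : 3);
                i = cur;
            } else {
                ++i;
            }
        }
        if (cur != SIZE_MAX) nals.push_back({cur, len});

        std::vector<uint8_t> out;
        for (const auto& n : nals) {
            uint32_t sz = uint32_t(n.second - n.first);
            out.push_back(uint8_t(sz >> 24));
            out.push_back(uint8_t(sz >> 16));
            out.push_back(uint8_t(sz >> 8));
            out.push_back(uint8_t(sz));
            out.insert(out.end(), p + n.first, p + n.second);
        }
        return out;
    }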

Getting video width/height from RTP packets or an RTSP server

醉酒当歌 submitted on 2019-12-07 13:44:56
Question: I have to get the width and height of a video stream coming from an RTSP server. Third-party servers give the following info in response to the RTSP DESCRIBE request. One RTSP server gives me the width and height:

    RTSP/1.0 200 OK
    ....
    Content-Type: application/sdp
    Content-Length: 376
    ....
    a=x-dimensions:1280,1024   // GET WIDTH HEIGHT

But the other does not give me width/height info.... It seems that it supports ONVIF...

    RTSP/1.0 200 OK
    x-Accept-Dynamic-Rate: 1
    ...
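
When the SDP does not carry x-dimensions (it is a nonstandard attribute), the dependable source of the resolution is the SPS NAL itself: base64-decode the first element of sprop-parameter-sets from the SDP, or take the SPS from the stream. Below is a deliberately simplified sketch: it skips scaling lists, assumes 4:2:0 chroma in the cropping math, and expects the caller to have removed the NAL header byte and the 0x03 emulation-prevention bytes:

    #include <cstddef>
    #include <cstdint>

    // Minimal MSB-first bit reader with Exp-Golomb helpers.
    struct BitReader {
        const uint8_t* d; size_t n; size_t pos = 0;   // pos in bits
        uint32_t bit()  { uint32_t b = (d[pos >> 3] >> (7 - (pos & 7))) & 1; ++pos; return b; }
        uint32_t bits(int k) { uint32_t v = 0; while (k--) v = (v << 1) | bit(); return v; }
        uint32_t ue()   { int z = 0; while (bit() == 0) ++z; return (1u << z) - 1 + bits(z); }
        int32_t  se()   { uint32_t k = ue(); return (k & 1) ? int32_t((k + 1) / 2) : -int32_t(k / 2); }
    };

    // Extract width/height from an H264 SPS RBSP (NAL type 7).
    bool spsDimensions(const uint8_t* rbsp, size_t len, int& width, int& height) {
        BitReader r{rbsp, len};
        uint32_t profile = r.bits(8);
        r.bits(8); r.bits(8);          // constraint flags + level_idc
        r.ue();                        // seq_parameter_set_id
        if (profile == 100 || profile == 110 || profile == 122 || profile == 244 ||
            profile == 44  || profile == 83  || profile == 86  || profile == 118 ||
            profile == 128) {
            if (r.ue() == 3) r.bit();  // chroma_format_idc, separate_colour_plane_flag
            r.ue(); r.ue(); r.bit();   // bit depths, transform bypass
            if (r.bit()) return false; // scaling lists present: not handled here
        }
        r.ue();                        // log2_max_frame_num_minus4
        uint32_t poc = r.ue();         // pic_order_cnt_type
        if (poc == 0) {
            r.ue();
        } else if (poc == 1) {
            r.bit(); r.se(); r.se();
            for (uint32_t i = 0, c = r.ue(); i < c; ++i) r.se();
        }
        r.ue(); r.bit();               // max_num_ref_frames, gaps_allowed
        uint32_t wMbs   = r.ue() + 1;  // pic_width_in_mbs
        uint32_t hUnits = r.ue() + 1;  // pic_height_in_map_units
        uint32_t frameOnly = r.bit();  // frame_mbs_only_flag
        if (!frameOnly) r.bit();       // mb_adaptive_frame_field_flag
        r.bit();                       // direct_8x8_inference_flag
        uint32_t cl = 0, cr = 0, ct = 0, cb = 0;
        if (r.bit()) {                 // frame_cropping_flag
            cl = r.ue(); cr = r.ue(); ct = r.ue(); cb = r.ue();
        }
        width  = int(wMbs * 16 - 2 * (cl + cr));
        height = int((2 - frameOnly) * hUnits * 16 - 2 * (2 - frameOnly) * (ct + cb));
        return true;
    }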

H264 HW Decoding on Android Using FFmpeg-10

穿精又带淫゛_ submitted on 2019-12-07 09:59:18
Question: I've noticed that ffmpeg already includes libavcodec/libstagefright.cpp and claims to support H264 hardware decoding through the Stagefright framework. I've built the shared library according to tools/build_libstagefright, but when doing real H264 frame decoding it seems to fail at Stagefright_init(). Has anybody succeeded in using this new feature? Thank you in advance.

Source: https://stackoverflow.com/questions/9702503/h264-hw-decoding-on-android-using-ffmpeg-10
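
The wrapper is not selected by codec ID; it has to be requested by name, and ffmpeg must have been configured with --enable-libstagefright-h264. A sketch against the ffmpeg 0.10-era API the question refers to; Stagefright also needs SPS/PPS, so extradata must be populated before opening:

    extern "C" {
    #include <libavcodec/avcodec.h>
    }

    // Open the Stagefright-backed H264 decoder explicitly by name.
    AVCodecContext* openStagefrightDecoder(int width, int height) {
        avcodec_register_all();
        AVCodec* codec = avcodec_find_decoder_by_name("libstagefright_h264");
        if (!codec) return nullptr;              // build lacks the wrapper
        AVCodecContext* ctx = avcodec_alloc_context3(codec);
        ctx->width  = width;
        ctx->height = height;
        // ctx->extradata / ctx->extradata_size: SPS+PPS go here first.
        if (avcodec_open2(ctx, codec, nullptr) < 0) {   // Stagefright_init runs here
            av_free(ctx);
            return nullptr;
        }
        return ctx;
    }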

Mux raw h.264 to an mp4 file, some odd errors

不羁的心 submitted on 2019-12-07 07:43:27

What I'm doing is an iOS app with Xcode 7.3. I get h264 data from an IP camera over UDP; the data can be decoded and displayed correctly (decoded by ffmpeg). Now I want to mux the raw H264 data into an mp4 file (some users may want to record what they are watching on their cell phone), using ffmpeg. Nothing goes wrong while the code runs, and the resulting file plays normally with QuickTime on my computer. But when played on an iPhone with the iPhone's default video player, it can't be played normally. Here is my code; I wish someone could tell me what I should do, thanks!

    init AVFormatContext…
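
Two things are commonly suggested when an ffmpeg-muxed MP4 plays on the desktop but not on the iPhone: make sure SPS/PPS end up in the track's global extradata (the avcC box) rather than only in-band, and write the moov atom at the front with the faststart flag. A sketch of the header-writing step under those assumptions, using the ffmpeg 3.0-era API that matches the Xcode 7.3 time frame; sps_pps/sps_pps_len are placeholders for the stream's parameter sets:

    #include <cstring>
    extern "C" {
    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>
    #include <libavutil/dict.h>
    #include <libavutil/mem.h>
    }

    // Put SPS/PPS into global extradata and request a front-loaded moov atom.
    int writeMp4Header(AVFormatContext* oc, AVStream* vst,
                       const uint8_t* sps_pps, int sps_pps_len) {
        vst->codec->flags |= CODEC_FLAG_GLOBAL_HEADER;
        vst->codec->extradata = (uint8_t*)av_mallocz(sps_pps_len +
                                                     FF_INPUT_BUFFER_PADDING_SIZE);
        std::memcpy(vst->codec->extradata, sps_pps, sps_pps_len);
        vst->codec->extradata_size = sps_pps_len;

        AVDictionary* opts = nullptr;
        av_dict_set(&opts, "movflags", "+faststart", 0);  // moov before mdat
        int ret = avformat_write_header(oc, &opts);
        av_dict_free(&opts);
        return ret;
    }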
What I'm doing is on an IOS app with Xcode 7.3. I got h264 data from an ip camera using UDP,the data can be decoded and displayed rightly (decoded by ffmpeg). Now I want to mux the raw H264 data to an mp4 file(some users may want to record what they are watching on his cell-phone), using ffmpeg. Nothing wrong happened when the code is running, and the result file can be played normally with QuickTime on my computer. But when played on iphone with iphone's default video player, it can't be played normally.Here is my code. Wish someone could tell me what should I do, thanks! init AVFormatContext