yuv

convert format from yuvj420p to yuv420p

99封情书 submitted on 2020-07-17 06:54:18

Question: I'm trying to write an algorithm to convert from yuvj420p to yuv420p. The difference between the two formats is the sample range: yuvj420p is full range [0-255], while yuv420p is limited range (luma [16-235], chroma [16-240]). I want to know how to map the values to the new range. Answer 1: A bit late to this, but for future reference, in case it helps anyone, here is how to deal with this problem with FFmpeg. When exporting, say, an uncompressed AVI from After Effects, sometimes the FFmpeg conversion seems to lack contrast, as if the range was being
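For the range mapping itself: full range maps linearly onto limited range (luma 0-255 → 16-235, chroma 0-255 → 16-240). Below is a minimal C++ sketch of that per-sample mapping; in FFmpeg, the same remapping can be requested via the scale filter's in_range/out_range options rather than done by hand.

```cpp
#include <cstdint>
#include <cstddef>

// Map one full-range (yuvj420p, 0..255) sample to limited range (yuv420p).
// Luma: 0..255 -> 16..235 (span 219); chroma: 0..255 -> 16..240 (span 224).
inline uint8_t fullToLimitedY(uint8_t y) {
    return static_cast<uint8_t>(16 + (219 * y + 127) / 255);  // rounded scale
}

inline uint8_t fullToLimitedC(uint8_t c) {
    return static_cast<uint8_t>(16 + (224 * c + 127) / 255);
}

// Apply the mapping to a whole plane (isLuma selects the luma/chroma span).
void convertPlane(const uint8_t* src, uint8_t* dst, size_t n, bool isLuma) {
    for (size_t i = 0; i < n; ++i)
        dst[i] = isLuma ? fullToLimitedY(src[i]) : fullToLimitedC(src[i]);
}
```

The inverse direction (limited → full) is the same formula solved the other way; clamp before widening if the source may contain out-of-range samples.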

cpp rgb to yuv422 conversion

僤鯓⒐⒋嵵緔 submitted on 2020-07-07 11:48:05

Question: I'm trying to convert an image (originally a QImage) in an RGB/RGBA format (this can be changed) to a YUV422 format. My initial intention was to use OpenCV's cvtColor to do the work, but it does not support converting RGB/RGBA to a 422 format. I searched for alternatives and even considered writing my own conversion according to this, but it would not be fast enough. I searched for another library to use and found this post, but it is really old and not so relevant. So my question is what good
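If a library route is unavailable (libyuv's packed-format converters are one well-optimized option), a plain BT.601 RGB-to-YUYV conversion is straightforward to write by hand. The sketch below uses the usual fixed-point full-range coefficient approximations and packs pairs of pixels as Y0 U Y1 V; treat it as an illustration, not a replacement for a SIMD routine.

```cpp
#include <cstdint>
#include <algorithm>

inline uint8_t clamp8(int v) { return static_cast<uint8_t>(std::min(255, std::max(0, v))); }

// BT.601 full-range RGB -> Y/U/V, 8.8 fixed point (coefficients sum to 256).
inline uint8_t rgbToY(int r, int g, int b) {
    return clamp8((77 * r + 150 * g + 29 * b) >> 8);
}
inline uint8_t rgbToU(int r, int g, int b) {
    return clamp8(((-43 * r - 85 * g + 128 * b) >> 8) + 128);
}
inline uint8_t rgbToV(int r, int g, int b) {
    return clamp8(((128 * r - 107 * g - 21 * b) >> 8) + 128);
}

// rgb: packed RGB888, width must be even; out: YUYV (YUY2), 2 bytes/pixel.
void rgbToYuyv(const uint8_t* rgb, uint8_t* out, int width, int height) {
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; x += 2) {
            const uint8_t* p0 = rgb + (y * width + x) * 3;
            const uint8_t* p1 = p0 + 3;
            // 4:2:2 subsamples chroma horizontally: average each pixel pair.
            int ar = (p0[0] + p1[0]) / 2;
            int ag = (p0[1] + p1[1]) / 2;
            int ab = (p0[2] + p1[2]) / 2;
            uint8_t* q = out + (y * width + x) * 2;
            q[0] = rgbToY(p0[0], p0[1], p0[2]);  // Y0
            q[1] = rgbToU(ar, ag, ab);           // U (shared)
            q[2] = rgbToY(p1[0], p1[1], p1[2]);  // Y1
            q[3] = rgbToV(ar, ag, ab);           // V (shared)
        }
    }
}
```

For speed, the inner loop vectorizes well, and the three weighted sums per pixel are exactly what libyuv implements with SSE/NEON.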

Android camera2 jpeg framerate

南楼画角 submitted on 2020-07-05 07:56:05

Question: I am trying to save image sequences at a fixed framerate (preferably up to 30 fps) on an Android device with FULL capability for camera2 (Galaxy S7), but I am unable to (a) get a steady framerate or (b) reach even 20 fps (with JPEG encoding). I have already incorporated the suggestions from "Android camera2 capture burst is too slow". The minimum frame duration for JPEG is 33.33 milliseconds (for resolutions below 1920x1080) according to characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP)
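Stalls like this are often caused by the JPEG encode cost sitting on the capture path; one common remedy is to capture cheap YUV frames at a steady cadence and hand the expensive encoding to a worker thread through a bounded queue that drops the oldest frame instead of blocking. A generic sketch of that decoupling, in C++ for illustration (the FrameQueue class and its drop-oldest policy are my own, not part of the camera2 API):

```cpp
#include <condition_variable>
#include <cstdint>
#include <deque>
#include <mutex>
#include <vector>

// Bounded producer/consumer queue: the capture callback only enqueues (cheap);
// a worker thread dequeues and performs the JPEG encoding. When full, the
// oldest frame is dropped so the capture cadence is never blocked.
class FrameQueue {
public:
    explicit FrameQueue(size_t capacity) : capacity_(capacity) {}

    void push(std::vector<uint8_t> frame) {
        std::lock_guard<std::mutex> lock(mutex_);
        if (frames_.size() == capacity_)
            frames_.pop_front();               // drop oldest, keep cadence
        frames_.push_back(std::move(frame));
        cv_.notify_one();
    }

    std::vector<uint8_t> pop() {               // blocks until a frame arrives
        std::unique_lock<std::mutex> lock(mutex_);
        cv_.wait(lock, [this] { return !frames_.empty(); });
        std::vector<uint8_t> f = std::move(frames_.front());
        frames_.pop_front();
        return f;
    }

    size_t size() {
        std::lock_guard<std::mutex> lock(mutex_);
        return frames_.size();
    }

private:
    size_t capacity_;
    std::deque<std::vector<uint8_t>> frames_;
    std::mutex mutex_;
    std::condition_variable cv_;
};
```

On the Android side this corresponds to requesting YUV_420_888 from the ImageReader and encoding to JPEG off the camera thread, rather than requesting JPEG output directly.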

Combining two YV12 image buffers into a single side-by-side image

我只是一个虾纸丫 submitted on 2020-06-28 09:04:39

Question: I have two image buffers in YV12 format that I need to combine into a single side-by-side image: (1920x1080) + (1920x1080) = (3840x1080). YV12 is split into 3 separate planes: YYYYYYYY VV UU The pixel format is 12 bits per pixel. I have created a method that memcpys one buffer (1920x1080) into a larger buffer (3840x1080), but it isn't working. Here is my C++: BYTE* source = buffer; BYTE* destination = convertBuffer3D; // copy over the Y for (int x = 0; x < height; x++) { memcpy(destination,
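For reference, a full three-plane version of the copy can be sketched as follows. The key point is that each destination row is twice as wide as the source row, so every plane (Y, then V, then U at half resolution) must be copied row by row at the correct horizontal offset. It assumes tightly packed planes (stride == width); copyPlaneToHalf and combineYV12SideBySide are illustrative names, not library calls.

```cpp
#include <cstdint>
#include <cstring>

// Copy one plane into the left or right half of a double-width plane.
// xOffset is 0 for the left image, `width` for the right image.
static void copyPlaneToHalf(const uint8_t* src, uint8_t* dst,
                            int width, int height, int xOffset) {
    for (int row = 0; row < height; ++row)
        std::memcpy(dst + row * width * 2 + xOffset, src + row * width, width);
}

// Combine two WxH YV12 frames into one 2WxH frame (plane order: Y, V, U).
void combineYV12SideBySide(const uint8_t* left, const uint8_t* right,
                           uint8_t* dst, int width, int height) {
    const int ySize = width * height;
    const int cSize = (width / 2) * (height / 2);  // chroma is quarter size
    const uint8_t* srcs[2] = { left, right };
    for (int i = 0; i < 2; ++i) {
        const uint8_t* s = srcs[i];
        copyPlaneToHalf(s, dst, width, height, i * width);                 // Y
        copyPlaneToHalf(s + ySize, dst + ySize * 2,                        // V
                        width / 2, height / 2, i * (width / 2));
        copyPlaneToHalf(s + ySize + cSize, dst + ySize * 2 + cSize * 2,    // U
                        width / 2, height / 2, i * (width / 2));
    }
}
```

A single whole-buffer memcpy cannot work here, because the left image's rows must be interleaved with the right image's rows in the destination.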

Converting a Bitmap to a WebRTC VideoFrame

我怕爱的太早我们不能终老 submitted on 2020-06-26 12:06:23

Question: I'm working on a WebRTC-based app for Android using the native implementation (org.webrtc:google-webrtc:1.0.24064), and I need to send a series of bitmaps along with the camera stream. From what I understand, I can derive from org.webrtc.VideoCapturer, do my rendering in a separate thread, and send video frames to the observer; however, it expects them to be YUV420, and I'm not sure I'm doing the conversion correctly. This is what I currently have: CustomCapturer.java Are there any examples I
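The core of the conversion, independent of the WebRTC classes, is mapping ARGB pixels to studio-swing (BT.601) I420 planes with 2x2 chroma subsampling. A sketch, shown in C++ for illustration and assuming a packed A,R,G,B byte order and even dimensions (argbToI420 is a hypothetical helper, not a WebRTC API; libyuv ships an optimized equivalent):

```cpp
#include <cstdint>
#include <algorithm>

inline uint8_t clamp8(int v) { return static_cast<uint8_t>(std::min(255, std::max(0, v))); }

// ARGB -> I420 (studio swing: Y in 16..235, U/V in 16..240 around 128).
// Chroma is sampled once per 2x2 block (top-left pixel, for simplicity).
void argbToI420(const uint8_t* argb, int width, int height,
                uint8_t* yPlane, uint8_t* uPlane, uint8_t* vPlane) {
    for (int j = 0; j < height; ++j) {
        for (int i = 0; i < width; ++i) {
            const uint8_t* p = argb + (j * width + i) * 4;  // A,R,G,B
            int r = p[1], g = p[2], b = p[3];
            yPlane[j * width + i] =
                clamp8(((66 * r + 129 * g + 25 * b) >> 8) + 16);
            if ((j % 2 == 0) && (i % 2 == 0)) {
                int ci = (j / 2) * (width / 2) + (i / 2);
                uPlane[ci] = clamp8(((-38 * r - 74 * g + 112 * b) >> 8) + 128);
                vPlane[ci] = clamp8(((112 * r - 94 * g - 18 * b) >> 8) + 128);
            }
        }
    }
}
```

On the Java side, the resulting planes would be wrapped in the buffer type the VideoCapturer observer expects; averaging all four pixels of each 2x2 block instead of taking the top-left one gives slightly better chroma quality.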

Android org.webrtc.VideoRenderer.I420Frame arrays to PreviewCallback.onPreviewFrame byte[]

穿精又带淫゛_ submitted on 2020-06-22 22:42:11

Question: I keep hoping some code will appear on the internet, but I'm getting nowhere ;) A WebRTC incoming I420Frame object seems to have 3 arrays of yuvPlanes, while a typical Android camera app gets PreviewCallback.onPreviewFrame byte[] as a single array of bytes. Can someone help me convert the I420Frame's yuvPlanes to a single byte[] array like PreviewCallback.onPreviewFrame's byte[] in YCbCr_420_SP (NV21)? For reference, VideoStreamsView.java has this code to render to OpenGL - but I just want it like camera
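Layout-wise, NV21 is the full Y plane followed by interleaved V/U byte pairs, so converting from three separate I420 planes is a copy plus an interleave. A minimal sketch, shown in C++ and assuming tightly packed planes (real I420Frame planes carry per-plane strides that must be honored when reading):

```cpp
#include <cstdint>
#include <cstring>

// Pack separate I420 planes (Y, U, V) into a single NV21 buffer:
// [ Y (width*height bytes) | V0 U0 V1 U1 ... ] -- note V comes first in NV21.
void i420ToNV21(const uint8_t* y, const uint8_t* u, const uint8_t* v,
                int width, int height, uint8_t* nv21) {
    const int ySize = width * height;
    const int cSize = (width / 2) * (height / 2);
    std::memcpy(nv21, y, ySize);
    uint8_t* vu = nv21 + ySize;
    for (int i = 0; i < cSize; ++i) {
        vu[2 * i]     = v[i];  // V first (NV12 would put U first)
        vu[2 * i + 1] = u[i];
    }
}
```

The output buffer must be width*height*3/2 bytes, matching what onPreviewFrame delivers for NV21.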

Why is Yp Cb Cr image buffer all shuffled in iOS 13?

不想你离开。 submitted on 2020-06-11 09:03:20

Question: I have a computer vision app that takes grayscale images from a sensor and processes them. The image acquisition for iOS is written in Obj-C, and the image processing is performed in C++ using OpenCV. As I only need the luminance data, I acquire the image in the YUV (or Yp Cb Cr) 420 bi-planar full-range format and just assign the buffer's data to an OpenCV Mat object (see acquisition code below). This worked great so far, until the brand-new iOS 13 came out... For some reason, on iOS 13 the image I
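A common cause of exactly this symptom is row padding: if CVPixelBufferGetBytesPerRowOfPlane returns more than the visible width, wrapping the raw base address in a width-tight Mat shears the image so it looks "shuffled". Either pass the stride to the cv::Mat constructor's step argument, or copy the luma plane row by row as in this sketch:

```cpp
#include <cstdint>
#include <cstring>

// Copy a luma plane into a tightly packed buffer, honoring the source
// stride. srcBytesPerRow may exceed `width` when rows are padded.
void copyLumaPlane(const uint8_t* src, size_t srcBytesPerRow,
                   uint8_t* dst, int width, int height) {
    for (int row = 0; row < height; ++row)
        std::memcpy(dst + row * width, src + row * srcBytesPerRow, width);
}
```

The zero-copy alternative is cv::Mat(height, width, CV_8UC1, baseAddress, bytesPerRow), which tells OpenCV the true row pitch and avoids the copy entirely.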