TextureView

SurfaceTexture in Android plugin doesn't work in Unity

Submitted by 為{幸葍}努か on 2019-12-06 02:46:56
Question: I can't get the texture tied to a SurfaceTexture to display in Unity. Update 4: Based on the pipeline in update 1 (surface -> external texture via SurfaceTexture -> FBO -> texture 2d), I know the SurfaceTexture isn't properly converting its surface to a texture. I can get correctly drawn pictures from its surface via PixelCopy, and I can confirm my FBO-to-texture2d drawing pipeline works with some test colors. So the question is, why can't the SurfaceTexture convert its surface to a texture? …

Options to efficiently draw a stream of byte arrays to display in Android

Submitted by 天大地大妈咪最大 on 2019-12-04 10:51:29
In simple words, all I need to do is display a live stream of video frames in Android (each frame is in YUV420 format). I have a callback function where I receive individual frames as a byte array. Something that looks like this: public void onFrameReceived(byte[] frame, int height, int width, int format) { // display this frame to surfaceview/textureview. } A feasible but slow option is to convert the byte array to a Bitmap and draw it to a canvas on a SurfaceView. In the future, I would ideally like to be able to alter brightness, contrast etc. of this frame, and hence am hoping I can use OpenGL-ES…
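The slow option the question mentions — converting each YUV frame before drawing — can be sketched in plain Java. This is a minimal sketch assuming the frame is planar I420 (full-resolution Y plane followed by quarter-resolution U and V planes); the class and method names are illustrative, and a real implementation would hand the resulting int array to Bitmap.setPixels:

```java
public class Yuv420ToArgb {

    // Convert one YUV sample to packed ARGB using BT.601 integer math
    // (359/256 ≈ 1.402, 88/256 ≈ 0.344, 183/256 ≈ 0.714, 454/256 ≈ 1.772).
    static int yuvToArgb(int y, int u, int v) {
        int d = u - 128, e = v - 128;
        int r = clamp(y + ((359 * e) >> 8));
        int g = clamp(y - ((88 * d + 183 * e) >> 8));
        int b = clamp(y + ((454 * d) >> 8));
        return 0xFF000000 | (r << 16) | (g << 8) | b;
    }

    static int clamp(int x) { return x < 0 ? 0 : (x > 255 ? 255 : x); }

    // Assumes planar I420 layout: Y plane, then U plane, then V plane.
    static int[] convertFrame(byte[] frame, int width, int height) {
        int[] argb = new int[width * height];
        int uOff = width * height, vOff = uOff + uOff / 4;
        for (int j = 0; j < height; j++) {
            for (int i = 0; i < width; i++) {
                int y = frame[j * width + i] & 0xFF;
                int c = (j / 2) * (width / 2) + (i / 2); // 2x2 blocks share one chroma sample
                int u = frame[uOff + c] & 0xFF;
                int v = frame[vOff + c] & 0xFF;
                argb[j * width + i] = yuvToArgb(y, u, v);
            }
        }
        return argb;
    }
}
```

Doing this per pixel on the CPU is exactly why the approach is slow, which motivates the OpenGL-ES route the question is asking about.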

Android SDK - camera2 - Draw rectangle over TextureView

Submitted by 核能气质少年 on 2019-12-04 08:25:26
Question: I'm new to Android development, and I'm finding it hard to find good examples of the camera2 API. I'm working my way slowly through most issues, but on this one I am stuck. In the default camera, when you touch the screen to focus, it shows a rectangle of the focus area for a moment. I want to do something similar (or in this case, the exact same thing to start off with, so I can figure it out). I read somewhere (I think the TextureView page in the SDK docs) that you can't draw on a TextureView…
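The excerpt is truncated, but one coordinate-math step any tap-to-focus implementation needs — mapping the touch point from view space into a clamped region in sensor space — is pure arithmetic and can be sketched in plain Java. All names here are illustrative; in real camera2 code the resulting bounds would feed a MeteringRectangle for CONTROL_AF_REGIONS, and a complete version would also account for display rotation and the crop region:

```java
public class FocusRegion {

    // Map a touch point in view coordinates to {left, top, right, bottom}
    // in sensor coordinates, clamped to the sensor's active array.
    static int[] touchToRegion(float touchX, float touchY,
                               int viewW, int viewH,
                               int sensorW, int sensorH, int halfSize) {
        // Scale the touch point from view space into sensor space.
        int cx = (int) (touchX / viewW * sensorW);
        int cy = (int) (touchY / viewH * sensorH);
        // Build a square around it and clamp to the active array bounds.
        int left = Math.max(0, cx - halfSize);
        int top = Math.max(0, cy - halfSize);
        int right = Math.min(sensorW, cx + halfSize);
        int bottom = Math.min(sensorH, cy + halfSize);
        return new int[]{left, top, right, bottom};
    }
}
```

For the visible rectangle itself, the usual approach is a separate transparent overlay View drawn on top of the TextureView, rather than drawing on the TextureView directly.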

Get Bitmap from TextureView efficiently

Submitted by 五迷三道 on 2019-12-04 01:22:28
I am trying to get each frame from a TextureView. Unfortunately, trying textureView.getBitmap(); results in slow performance. Is there a faster way to obtain a bitmap? Is it better to use the NDK instead? Looking for actual examples. A TextureView receives frames on a SurfaceTexture, which takes frames sent to its Surface and converts them to a GLES texture. To get the pixel data out, the texture must be rendered to a framebuffer, then read out with glReadPixels(). The pixel data can then be wrapped with a Bitmap object (which may or may not involve copying the pixel data). Using the NDK isn't…
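The CPU-side step of the pipeline described above — repacking the raw RGBA bytes that glReadPixels() returns into the packed ARGB ints a Bitmap expects — can be sketched in plain Java. This is a minimal sketch (the class and method names are illustrative): glReadPixels returns rows bottom-to-top with GL's bottom-left origin, so the rows are flipped while repacking:

```java
public class GlPixelRepack {

    // Repack RGBA bytes (as returned by glReadPixels with GL_RGBA /
    // GL_UNSIGNED_BYTE) into packed ARGB ints for Bitmap.setPixels,
    // flipping rows because GL's origin is bottom-left while Bitmap's
    // is top-left.
    static int[] rgbaToArgb(byte[] rgba, int width, int height) {
        int[] argb = new int[width * height];
        for (int row = 0; row < height; row++) {
            int src = (height - 1 - row) * width * 4; // read from the mirrored source row
            int dst = row * width;
            for (int col = 0; col < width; col++, src += 4) {
                int r = rgba[src] & 0xFF, g = rgba[src + 1] & 0xFF;
                int b = rgba[src + 2] & 0xFF, a = rgba[src + 3] & 0xFF;
                argb[dst + col] = (a << 24) | (r << 16) | (g << 8) | b;
            }
        }
        return argb;
    }
}
```

In practice this repacking cost is small compared to the glReadPixels() stall itself, which is the part getBitmap() cannot avoid either.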

Video 02: Using MediaPlayer with VideoView and TextureView

Submitted by 戏子无情 on 2019-12-03 18:32:08
Contents:
1. About this video playback library
  1.1 A brief introduction to MediaPlayer
2. Key methods explained
  2.1 Obtaining a MediaPlayer instance
  2.2 Setting the file to play
  2.3 Other methods
3. Lifecycle
  3.1 Lifecycle diagram [taken from the web]
  3.2 Notes on the lifecycle states
4. Playing media
  4.1 Playing an audio file from res/raw
  4.2 Playing a local Uri
  4.3 Playing a network file
5. Playing video with MediaPlayer + SurfaceView
  5.1 Why do it this way
  5.2 Example
  5.3 Limitations of SurfaceView
6. Playing video with VideoView
  6.1 Introduction to VideoView
  6.2 Usage code
7. MediaPlayer + TextureView
  7.1 Why use TextureView
  7.2 How to implement video playback
Good news: a master collection of my blog notes [from March 2016 to the present], covering Java fundamentals and advanced topics, Android technical posts, Python study notes, and more. It also includes a roundup of bugs encountered in everyday development, as well as a large set of interview questions collected in my spare time, maintained, corrected, and continually improved over the long term. The open-source files are in markdown format! I have also open-sourced a personal blog: 47 posts [nearly 200,000 characters] accumulated since 2012. Please credit the source when reposting, thanks! Link: https://github.com/yangchong211/YCBlogs If you find it useful, please star it, thanks! Suggestions are of course welcome too.

How to clear SurfaceTexture after using it with a MediaPlayer?

Submitted by 做~自己de王妃 on 2019-12-03 12:14:47
I'm using a ListView with TextureView children. Each TextureView uses a MediaPlayer to play videos. When a TextureView gets recycled, the last frame remains on the surface until the next MediaPlayer makes use of it. What is the easiest way to "clear" the TextureView's surface (e.g. to black) so the old frames do not appear? After struggling with this for a while, I've come up with a couple of possible solutions: a) clear the surface using GLES. You can see an example in Grafika (check out the clearSurface() method). I'm personally not a fan of this because I don't think that a jarring transition…

When Is a TextureView's “Consumer Side” Closed?

Submitted by 試著忘記壹切 on 2019-12-03 06:21:43
One of the official Google samples for the Camera2 API suffers from the same "BufferQueue has been abandoned" problem as is seen in: "What can I do when the BufferQueue has been abandoned? Android LogCat shows BufferQueueProcedure". Specifically, the sample app calls a closeCamera() method from onPause() of a fragment, where closeCamera() calls close() on the CameraCaptureSession, then close() on the CameraDevice, then close() on the ImageReader (used for actual picture-taking). After the close() on CameraDevice is when a few occurrences of the aforementioned "BufferQueue has been abandoned"…

How do I use Android’s “Surface” classes?

Submitted by 自闭症网瘾萝莉.ら on 2019-12-03 04:23:17
Question: Is there a detailed explanation of Surface, SurfaceHolder, EGLSurface, SurfaceView, GLSurfaceView, SurfaceTexture, and TextureView? In particular: What's the difference between SurfaceView and TextureView? Do I need to use GLSurfaceView to use OpenGL ES? How do Surface and EGLSurface interact? What does SurfaceTexture do? Why does the stuff I draw on a SurfaceView have to go above or below everything else? What is SurfaceFlinger? How does composition of the status and navigation bars work?