Is it possible to capture a High-Res image while using ArCore?

Submitted by 两盒软妹~ on 2021-02-11 13:36:38

Question


In my app I'm trying to use ArCore as sort of a "camera assistant" in a custom camera view.

To be clear - I want to display AR content in the user's camera preview, but let them capture photos that don't contain the AR models.

From what I understand, in order to capture an image with ARCore I'll have to use the Camera2 API, which is enabled by configuring the session for shared camera access.

However, I can't seem to configure the camera to use any high resolutions (I'm using a Pixel 3, so I should be able to go as high as 12MP).

In the "shared camera example", they toggle between Camera2 and ARCore (a shame there's no equivalent API for CameraX), and it has several problems:

  1. In the ArCore mode the image is blurry (I assume that's because the depth sensor is disabled as stated in their documentation)
  2. In the Camera2 mode I can't enhance the resolution at all.
  3. I can't use the Camera2 API to capture an image while displaying models from ArCore.

Is this requirement at all possible at the moment?


Answer 1:


I have not yet worked with the shared camera in ARCore, but I can say a few things regarding the main point of your question.

In ARCore you can configure both the CPU image size and the GPU texture size. You can do that by checking all available camera configurations (available through Session.getSupportedCameraConfigs(CameraConfigFilter cameraConfigFilter)) and selecting your preferred one by passing it back to the ARCore Session. On each CameraConfig you can check which CPU image size and GPU texture size you will get.
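As a minimal sketch, assuming an already-created com.google.ar.core.Session (the available sizes vary per device, and the session must be paused before its config is changed):

```java
import com.google.ar.core.CameraConfig;
import com.google.ar.core.CameraConfigFilter;
import com.google.ar.core.Session;
import android.util.Size;
import java.util.List;

// Enumerate the camera configs ARCore supports on this device and pick
// the one with the largest CPU image size.
CameraConfigFilter filter = new CameraConfigFilter(session);
List<CameraConfig> configs = session.getSupportedCameraConfigs(filter);

CameraConfig best = configs.get(0);
for (CameraConfig config : configs) {
    Size cpu = config.getImageSize();    // size of CPU-accessible images
    Size gpu = config.getTextureSize();  // size of the GPU preview texture
    if (cpu.getWidth() * cpu.getHeight()
            > best.getImageSize().getWidth() * best.getImageSize().getHeight()) {
        best = config;
    }
}

// The session must not be running while the config is changed.
session.pause();
session.setCameraConfig(best);
session.resume();
```

Note that you can also narrow the candidates up front via the CameraConfigFilter (e.g. by target FPS) instead of scanning the full list.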

Probably you are currently using (maybe by default?) a CameraConfig with the lowest CPU image size, 640x480 pixels if I remember correctly, so yes, it definitely looks blurry when rendered (but that has nothing to do with the depth sensor).

Sounds like you could just select a higher CPU image size and be done... but unfortunately that's not the case, because that configuration applies to every frame, and requesting higher-resolution CPU images results in much lower performance. When I tested this I got about 3-4 frames per second on my test device - definitely not ideal.

So now what? I think you have 2 options:

  1. Pause the ARCore session, switch to a higher CPU image for 1 frame, get the image and switch back to the "normal" configuration.
  2. Probably you are already getting a nice GPU image - maybe not the best, due to it being a camera preview stream, but hopefully good enough? Not sure how you are rendering it, but with some OpenGL skills you can copy that texture. Not directly, of course, because of the whole GL_TEXTURE_EXTERNAL_OES thing... but rendering it onto another framebuffer and then reading back the texture attached to it could work. Of course you might need to deal with texture coordinates yourself (full image vs. visible area), but that's another topic.
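Option 1 above could look roughly like this - a hypothetical sketch, where highResConfig and normalConfig are CameraConfig instances selected beforehand from session.getSupportedCameraConfigs(...), and saveImage is a hypothetical helper; error handling is mostly omitted:

```java
import com.google.ar.core.Frame;
import com.google.ar.core.Session;
import com.google.ar.core.exceptions.CameraNotAvailableException;
import com.google.ar.core.exceptions.NotYetAvailableException;
import android.media.Image;

// One-shot high-res capture: briefly switch to the high-resolution
// config, grab a single CPU image, then restore the normal config.
void captureHighRes(Session session) throws CameraNotAvailableException {
    session.pause();
    session.setCameraConfig(highResConfig);
    session.resume();

    Frame frame = session.update();        // advance to the next frame
    try (Image image = frame.acquireCameraImage()) {
        saveImage(image);                  // hypothetical save helper
    } catch (NotYetAvailableException e) {
        // The first frame after resume may not have a CPU image yet; retry.
    }

    // Switch back to the normal configuration for smooth tracking.
    session.pause();
    session.setCameraConfig(normalConfig);
    session.resume();
}
```

Expect a visible hiccup in tracking around the pause/resume, so this is best triggered only on an explicit user capture action.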

Regarding CameraX, note that it wraps the Camera2 API in order to provide common camera use cases, so that app developers don't have to worry about the camera lifecycle. As I understand it, CameraX would not be suitable for ARCore, as I imagine ARCore needs full control of the camera.

I hope that helps a bit!



Source: https://stackoverflow.com/questions/59190820/is-it-possible-to-capture-a-high-res-image-while-using-arcore
