ARCore: How to play video in the photo frame when an image detected

Submitted by 自作多情 on 2020-01-16 09:41:51

Question


I want to play a video in a photo frame when an image is detected. Has anybody done this using ARCore? Any help would be greatly appreciated.

Thanks


Answer 1:


I think you mean you want to add a video as a renderable in ARCore, in your case when an image is detected.

There is actually (at the time of writing) an example included with Sceneform showing how to add a video as a renderable - it is available here: https://github.com/google-ar/sceneform-android-sdk/tree/master/samples/chromakeyvideo

This particular example also applies a Chroma filter but you can simply ignore that part.

The approach is roughly:

  • create an ExternalTexture to play the video on
  • create a MediaPlayer and set its surface to the ExternalTexture's surface
  • build a new renderable with the ExternalTexture
  • create a node and add it to your scene
  • set the renderable for the node to the new ModelRenderable you built (a rough code sketch of these steps follows below)
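
Below is a minimal Kotlin sketch of those steps, not the exact code from the linked sample. It assumes you already have the detected AugmentedImage and the ArSceneView in scope, and that you have a flat plane model whose material exposes an external texture parameter; the resource names (R.raw.my_video, R.raw.video_plane) and the parameter name "videoTexture" are placeholder assumptions.

```kotlin
import android.content.Context
import android.media.MediaPlayer
import com.google.ar.core.AugmentedImage
import com.google.ar.sceneform.AnchorNode
import com.google.ar.sceneform.ArSceneView
import com.google.ar.sceneform.Node
import com.google.ar.sceneform.rendering.ExternalTexture
import com.google.ar.sceneform.rendering.ModelRenderable

fun playVideoOnImage(context: Context, sceneView: ArSceneView, image: AugmentedImage) {
    // 1. Create an ExternalTexture for the video frames to be drawn onto.
    val texture = ExternalTexture()

    // 2. Create a MediaPlayer and point it at the texture's surface.
    //    R.raw.my_video is a placeholder for your own video resource.
    val mediaPlayer = MediaPlayer.create(context, R.raw.my_video).apply {
        setSurface(texture.surface)
        isLooping = true
    }

    // 3. Build a renderable from a flat plane model whose material exposes an
    //    external texture parameter ("videoTexture" is an assumed name here).
    ModelRenderable.builder()
        .setSource(context, R.raw.video_plane)  // placeholder plane model
        .build()
        .thenAccept { renderable ->
            renderable.material.setExternalTexture("videoTexture", texture)
            renderable.isShadowCaster = false

            // 4. Create a node anchored at the centre of the detected image
            //    and add it to the scene.
            val anchorNode = AnchorNode(image.createAnchor(image.centerPose))
            anchorNode.setParent(sceneView.scene)

            // 5. Set the renderable on a child node and start playback.
            val videoNode = Node()
            videoNode.setParent(anchorNode)
            videoNode.renderable = renderable

            if (!mediaPlayer.isPlaying) mediaPlayer.start()
        }
}
```

You would typically call something like this from your frame-update listener once the image's tracking state becomes TRACKING, and release the MediaPlayer when the node or activity is destroyed.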

For augmented images, ARCore will automatically estimate the physical size of the image it detects, as long as the image's tracking state is TRACKING. From the documentation:

ARCore will attempt to estimate the physical image's width based on its understanding of the world. If the optional physical size is specified in the database, this estimation process will happen more quickly. However, the estimated size may be different from the specified size.

Your renderable will be sized to fit within this estimated extent by default, but you can also scale the renderable up or down as needed.
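
For example, if you want the video plane to fill the detected image exactly, you can read the estimated extents from the AugmentedImage and scale the node. A minimal sketch, assuming the placeholder plane model above is authored as 1 m x 1 m and lies flat in the X-Z plane:

```kotlin
import com.google.ar.sceneform.math.Vector3

// extentX / extentZ are ARCore's estimated width and height of the image, in metres.
val widthMeters = image.extentX
val heightMeters = image.extentZ

// Scale the (assumed 1 m x 1 m) plane so the video fills the detected image.
videoNode.localScale = Vector3(widthMeters, 1f, heightMeters)
```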

There is also a series of articles, with example code, that may cover your exact case depending on what you need: https://proandroiddev.com/arcore-sceneform-simple-video-playback-3fe2f909bfbc



Source: https://stackoverflow.com/questions/56930020/arcore-how-to-play-video-in-the-photo-frame-when-an-image-detected
