google-project-tango

Tango is Out of Date - Cannot update Tango Core - Incompatible Device

眉间皱痕 submitted on 2019-12-25 04:27:47

Question: I'm trying to run the Unity samples project (Tania Borealis), but it reports that Tango is out of date. There are no OTA updates available for me, nor are any updates available in the Play Store. After some searching, it seems a new Tango Core app has replaced the old (legacy) Tango Core, so I'm trying to update to the new Tango Core app, but the Play Store tells me "This app is incompatible with your device". So I don't know what to do. Just to be sure... What is the

project tango / arcore area mapping

北战南征 submitted on 2019-12-25 03:11:09

Question: As we know, ARCore has all but replaced Project Tango, but I have some research projects in mind that involve area mapping, so I have a few questions about Tango and ARCore. For area mapping, Tango produces more precise and denser point-cloud information than ARCore; therefore, if I want to "area-map", a Tango device would be better for me. Is this right? The SDKs for ARCore and Tango are the same thing, and therefore support for their methods, documentation etc. are still effectively

How do I detect that localisation has occurred?

与世无争的帅哥 submitted on 2019-12-25 03:07:40

Question: When utilising a Tango, what would I use, callbacks or otherwise, to detect when the device has localised to a previously loaded ADF? This is mainly for UI purposes in conjunction with Tango UX, telling a user to walk around an environment. Answer 1: Localization may be detected when your TangoPoseData with a frame of ADF comes back valid. Look at the Tango Java example AreaLearningActivity, which uses this simplified logic: // tell Tango to provide pose for the ADF ArrayList<TangoCoordinateFramePair>

Project Tango: Depthmap Transformation from XYZij data

杀马特。学长 韩版系。学妹 submitted on 2019-12-24 21:23:13

Question: I'm currently trying to filter the depth information using OpenCV. For that reason I need to transform Project Tango's XYZij depth information into an image-like depth map (like the output of the Microsoft Kinect). Unfortunately, the official API lacks the ij part of XYZij. That's why I'm trying to project the XYZ part using the camera intrinsics projection, which is explained in the official C API documentation. My current approach looks like this: float fx = static_cast<float>(ccIntrinsics.fx);

Point cloud rendered only partially

白昼怎懂夜的黑 submitted on 2019-12-24 13:34:10

Question: I only get a partial point cloud of the room; other parts of the room do not get rendered at all. It only sees a part to the left. I am using the Point Cloud prefab in Unity. When I use one of the apps, such as Room Scanner or Explorer, I get the rest of the room. I intend to modify the prefab for my application, but so far I get that limited view. I am using Unity 5.3.3 on Windows 10 on a 64-bit machine. Answer 1: Set the Unity camera aligned with the depth camera frame, so for the matrix dTuc: dTuc = imuTd

Using Tango3DR_AreaDescription_createFromDataset on Asus Zenfone AR

微笑、不失礼 submitted on 2019-12-24 10:13:39

Question: I am creating meshes from Tango datasets and trying to support the Asus Zenfone AR. My app already works fine on the Lenovo Phab 2 Pro. The first step is calling Tango3DR_AreaDescription_createFromDataset to create a Tango3DR_AreaDescription. I am calling the function without specifying a loop_closure_database_path in the second argument: Tango3DR_Status res = Tango3DR_AreaDescription_createFromDataset(dataset_path.c_str(), nullptr, //loop_closure_database_path &area_description_raw, progress

Camera texture in Unity with multithreaded rendering

丶灬走出姿态 submitted on 2019-12-24 01:53:22

Question: I'm trying to do pretty much what TangoARScreen does, but with multithreaded rendering enabled in Unity. I did some experiments and I'm stuck. I tried several things, such as letting Tango render into the OES texture that would then be blitted into a regular Texture2D in Unity, but OpenGL keeps complaining about an invalid display when I try to use it. Probably OnTangoCameraTextureAvailable is not even called in the correct GL context? Hard to say when you have no idea how Tango Core works internally.

Why are ARCore-supported devices limited?

…衆ロ難τιáo~ submitted on 2019-12-23 20:02:10

Question: What makes an ARCore-supported device support ARCore? Which features make a device support ARCore? What is the difference between an ARCore device and other, non-supported devices? Answer 1: It is not about how new the phone is, but whether the phone underwent certain tests and measurements when it was designed and built. What this means is that your phone needs hardware like: Accelerometer: measures acceleration, which is change in speed divided by time. Simply put, it's the measure of change in

Does the project tango tablet work outdoors?

感情迁移 submitted on 2019-12-23 18:25:49

Question: I'm looking to develop an outdoor application, but I'm not sure whether the Tango tablet will work outdoors. Other depth devices tend not to work well outside because they depend on IR light being projected from the device and then observed after it bounces off the objects in the scene. I've been looking for information on this, and all I've found is this video: https://www.youtube.com/watch?v=x5C_HNnW_3Q. Based on the video, it appears it can work outside by doing some IR compensation and/or

Adding ARToolkit Marker tracking into Tango

时光毁灭记忆、已成空白 submitted on 2019-12-23 04:18:07

Question: I have been trying to integrate ARToolKit marker/object tracking into a Tango application. So far I have created a build so that a Tango app can access and use the ARToolKit native library or the ARToolKit Unity wrappers. However, they both seem to require exclusive access to the camera in their default configurations. How could you feed the same Android video feed to both libraries? Could you create a dummy camera device which doubles out the feed? Could you take the Tango feed as normal,