google-project-tango

3D AR Markers with Project Tango

Submitted by 陌路散爱 on 2020-01-24 12:34:14

Question: I'm working on a project for an exhibition where an AR scene is supposed to be layered on top of a 3D printed object. Visitors will be given a device with the application pre-installed. Various objects should be seen around / on top of the exhibit, so the precision of tracking is quite important. We're using Unity to render the scene; this is not something that can be changed, as we're already well into development. However, we're somewhat flexible on the technology we use to recognize the 3D

Getting color data in Java Tango SDK (or C SDK)

Submitted by 夙愿已清 on 2020-01-15 12:29:08

Question: I just got my tablet, where I previously had a phone. As most of you probably know, the phone's SDK allowed capturing superframes via the Android camera callback. If properly parsed, the superframes contained all of the relevant sensor data. In Archimedes, I tried the following. I made an activity that implements CameraPreviewListener: public class MainActivity extends Activity implements CameraPreviewListener { // Inside of this class we manage another object that implements

Convert device pose to camera pose

Submitted by 浪子不回头ぞ on 2020-01-11 12:57:03

Question: I'm using the camera intrinsics (fx, fy, cx, cy, width, height) to store a depth image from the TangoXyzIjData.xyz buffer. For each point of xyz I therefore calculate the corresponding image point and store its z value: x' = (fx * x) / z + cx, y' = (fy * y) / z + cy, depthImage[x'][y'] = z. Now I would like to store the corresponding pose data as well. I'm using the timestamp of TangoXyzIjData.timestamp and the following function: getPoseAtTime(double timestamp, TangoCoordinateFramePair framePair) with
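The projection described in this question can be sanity-checked with a short Java sketch. This is not the poster's code: the buildDepthImage helper, the use of the color-camera intrinsics on points that are actually delivered in the depth-camera frame, and the start-of-service/device frame pair are assumptions made for illustration; only getPoseAtTime, TangoXyzIjData, and TangoCameraIntrinsics come from the question and the Java Tango API.

```java
import java.nio.FloatBuffer;
import com.google.atap.tangoservice.Tango;
import com.google.atap.tangoservice.TangoCameraIntrinsics;
import com.google.atap.tangoservice.TangoCoordinateFramePair;
import com.google.atap.tangoservice.TangoPoseData;
import com.google.atap.tangoservice.TangoXyzIjData;

/** Illustrative helper, assumed to live next to the code from the question. */
public class DepthImageHelper {

    /** Projects each xyz point with the pinhole model quoted in the question. */
    public static float[][] buildDepthImage(Tango tango, TangoXyzIjData xyzIj) {
        TangoCameraIntrinsics in =
                tango.getCameraIntrinsics(TangoCameraIntrinsics.TANGO_CAMERA_COLOR);
        float[][] depthImage = new float[(int) in.height][(int) in.width];

        FloatBuffer xyz = xyzIj.xyz; // packed x, y, z triples
        for (int i = 0; i < xyzIj.xyzCount; i++) {
            float x = xyz.get(3 * i);
            float y = xyz.get(3 * i + 1);
            float z = xyz.get(3 * i + 2);
            if (z <= 0f) continue;                    // skip invalid points
            int u = (int) (in.fx * x / z + in.cx);    // x' = (fx * x) / z + cx
            int v = (int) (in.fy * y / z + in.cy);    // y' = (fy * y) / z + cy
            if (u >= 0 && u < in.width && v >= 0 && v < in.height) {
                depthImage[v][u] = z;
            }
        }

        // Pose of the device at the same timestamp, relative to start-of-service,
        // so it can be stored alongside the depth image.
        TangoCoordinateFramePair framePair = new TangoCoordinateFramePair(
                TangoPoseData.COORDINATE_FRAME_START_OF_SERVICE,
                TangoPoseData.COORDINATE_FRAME_DEVICE);
        TangoPoseData pose = tango.getPoseAtTime(xyzIj.timestamp, framePair);
        // ... persist `pose` together with depthImage as needed ...
        return depthImage;
    }
}
```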

connectOnFrameAvailable() provides TangoImageBuffer with curious format info

Submitted by 允我心安 on 2020-01-07 06:48:51

Question: While also trying to get access to the color data bytes from the Tango's color camera, I was stuck on the Java API, only being able to connect the Tango camera to a surface for display (fine for display, but with no easy access to the raw data, nor a timestamp)... so I finally switched to the C API in native code (latest FERMAT lib and headers) and followed a recommendation I found on Stack Overflow by registering a derived sample callback with connectOnFrameAvailable() ... (I started from the PointCloudActivity sample for that test).

No Color Frames since Updating to Jacobi Release 1.8

Submitted by 痴心易碎 on 2020-01-06 18:07:10

Question: Since updating to the Jacobi C API (Release 1.8 and later) I no longer receive any color frames. More specifically, I successfully register a callback function __onFrameAvailable() using if( TANGO_SUCCESS != TangoService_connectOnFrameAvailable( TANGO_CAMERA_COLOR, NULL, __onFrameAvailable) ) { LOG_ERROR("TangoService_connectOnFrameAvailable() failed."); return FALSE; } else { LOG_INFO("TangoService_connectOnFrameAvailable() successful."); } // if but that function is never

Project Tango: How to tell if the plane created in the plane fitting example is a floor or a wall in the Java SDK?

Submitted by 烂漫一生 on 2020-01-06 03:38:06

Question: The plane fitting example fits a cube on a plane that it creates from the retrieved point cloud, based on the point selected by the user. I want to determine whether that point lies on a floor, a wall, or a roof. What I am trying to achieve is to change the example so that it only renders the cube on the floor and not on a wall or roof. Answer 1: The simplest solution is to check the plane normal. Usually a wall's normal is perpendicular to gravity, and a floor's normal is parallel to it. Answer 2: Something like
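A minimal sketch of the check suggested in Answer 1, assuming the fitted plane's normal has already been transformed into a gravity-aligned frame such as start-of-service (where +Z points up); the classifyPlane helper and its thresholds are illustrative, not part of the plane-fitting example:

```java
/** Illustrative classification of a plane normal expressed in a Z-up, gravity-aligned frame. */
public class PlaneClassifier {

    public static String classifyPlane(double nx, double ny, double nz) {
        // Normalize the normal, then take its dot product with the up vector (0, 0, 1).
        double len = Math.sqrt(nx * nx + ny * ny + nz * nz);
        double upDot = nz / len;

        if (upDot > 0.85) {
            return "floor";            // normal points up: parallel to gravity
        } else if (upDot < -0.85) {
            return "ceiling";          // normal points down
        } else if (Math.abs(upDot) < 0.15) {
            return "wall";             // normal roughly perpendicular to gravity
        }
        return "slanted surface";      // anything in between
    }
}
```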

Point look at Point

Submitted by 烂漫一生 on 2020-01-05 06:59:07

Question: I have one point in 3D space, and I have the location and rotation of the camera in 3D space. So basically there is a Vector3 for the object, and a Vector3 and Quaternion for the camera. I need to work out how to look at that point, so I can tell the user how to move towards it. Should the user turn the camera left, right, or around? Answer 1: One way to do this is to calculate the direction the camera is currently facing as a yaw angle (like a compass heading), and calculate the direction it needs to face in order to be
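The yaw comparison from Answer 1 can be sketched as follows. The question itself is Unity-based (Vector3/Quaternion), but the math is shown here in plain Java for clarity; turnDirection, its thresholds, and the left/right sign convention are all illustrative assumptions:

```java
/** Illustrative helper: which way should the user turn to face the target point? */
public class LookAtHelper {

    public static String turnDirection(double camX, double camY,   // camera position on the ground plane
                                       double camYawDeg,           // current camera heading in degrees
                                       double targetX, double targetY) {
        // Bearing from the camera to the target, like a compass heading.
        double bearingDeg = Math.toDegrees(Math.atan2(targetY - camY, targetX - camX));

        // Signed difference between where we need to face and where we are facing,
        // wrapped into (-180, 180].
        double diff = bearingDeg - camYawDeg;
        while (diff > 180) diff -= 360;
        while (diff <= -180) diff += 360;

        if (Math.abs(diff) < 15) return "ahead";
        if (Math.abs(diff) > 165) return "behind";
        // Whether positive means left or right depends on your axis convention.
        return diff > 0 ? "turn left" : "turn right";
    }
}
```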

Tango Camera Preview for RGBIR

Submitted by 倖福魔咒の on 2020-01-04 05:23:48

Question: I am using Tango's videoOverlaySample demo. Instead of color, I would like to see the IR data (alone or together with color). So I replaced TANGO_CAMERA_COLOR with TANGO_CAMERA_RGBIR in both places where it appears, but the screen is black. Here is the code: /* * Copyright 2014 Google Inc. All Rights Reserved. * * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www

Does setting exposure/ISO in Google Tango config work?

Submitted by 亡梦爱人 on 2020-01-03 20:17:06

Question: I am trying to use the Tango device to capture HDR images, but no matter how I set the Tango config ISO and exposure settings, there is no apparent change in the image. I am disabling auto-exposure and auto-white-balance and setting manual values for the ISO and exposure time. Regardless of my settings, the colour camera images returned from onFrameAvailable always seem to be in auto mode. The measured average RGB of a given scene is the same, regardless of setting the ISO to 100, 200,