augmented-reality

ARCore – Rendering objects 200m far from camera

空扰寡人 submitted on 2019-12-01 18:53:17
I'm working on an Android AR project using ARCore and Sceneform. I need to place objects from 30 to 200 meters away from the user's camera, and I ran into the frustum culling issue in ARCore described HERE. I'm trying to increase the far plane by setting the projection matrix with this method: public void getProjectionMatrix (float[] dest, int offset, float near, float far); but I can't find a way to set the rendering projection matrix. Here is my code: arFragment.arSceneView.apply { scene.addOnUpdateListener { // Some code to return from this callback if arFrame is not initialised yet
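For reference, ARCore's getProjectionMatrix fills a standard OpenGL-style column-major perspective matrix, so the effect of a larger far value can be sketched in plain Java. This is an illustration of the projection math only, not the ARCore API; the class name, field of view, and distances below are made up for the example:

```java
// Sketch: an OpenGL-style column-major perspective matrix with a custom far
// plane, mirroring what getProjectionMatrix(dest, 0, near, far) fills in.
public class FarPlaneSketch {
    /** Column-major perspective projection, like android.opengl.Matrix.perspectiveM. */
    public static float[] perspective(float fovyDeg, float aspect, float near, float far) {
        float f = (float) (1.0 / Math.tan(Math.toRadians(fovyDeg) / 2.0));
        float[] m = new float[16];
        m[0] = f / aspect;
        m[5] = f;
        m[10] = (far + near) / (near - far);
        m[11] = -1f;
        m[14] = (2f * far * near) / (near - far);
        return m;
    }

    /** NDC depth of a point 'dist' meters straight ahead of the camera. */
    public static float ndcDepth(float[] m, float dist) {
        float clipZ = m[10] * (-dist) + m[14];
        float clipW = dist; // clipW = -eyeZ because m[11] = -1
        return clipZ / clipW;
    }

    public static void main(String[] args) {
        float[] shortRange = perspective(60f, 16f / 9f, 0.1f, 100f);
        float[] longRange  = perspective(60f, 16f / 9f, 0.1f, 200f);
        // A point 150 m away falls outside NDC depth with far=100 but inside with far=200.
        System.out.println(ndcDepth(shortRange, 150f)); // > 1.0 -> clipped
        System.out.println(ndcDepth(longRange, 150f));  // < 1.0 -> rendered
    }
}
```

The point at 150 m lands beyond NDC depth 1.0 under the far=100 matrix and is culled, while the far=200 matrix keeps it inside the frustum, which is exactly the behavior the question is trying to achieve.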

Android AR orientation

心已入冬 submitted on 2019-12-01 17:17:39
I'm making a program that shows objects from a map on the camera, and it works almost everywhere except for a few degrees left and right of vertical orientation (around 80-110 and 260-280 degrees). Across the remaining roughly 320 degrees it works well. I've tried TYPE_ROTATION_VECTOR as well as the accelerometer combined with the magnetometer, and they give the same result. Does anybody know a solution? With TYPE_ROTATION_VECTOR: if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) { float[] rotationV = new float[16]; SensorManager.getRotationMatrixFromVector(rotationV, event.values); float[] orientationValuesV = new float[3];
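A likely explanation for the bad ranges near vertical is that pitch approaching ±90° is a degenerate configuration (gimbal lock) for the Euler-angle extraction, which is why a common mitigation is SensorManager.remapCoordinateSystem before extracting the angles. The formulas SensorManager.getOrientation applies to a 3x3 row-major rotation matrix can be sketched in plain Java to see where the instability comes from; this is an illustration of the math, not the Android API:

```java
// Pure-math sketch of SensorManager.getOrientation for a 3x3 row-major
// rotation matrix: azimuth and roll are atan2 ratios whose numerator and
// denominator both collapse toward zero as pitch approaches +/-90 degrees.
public class OrientationSketch {
    /** {azimuth, pitch, roll} in radians, same formulas as the 3x3 branch. */
    public static double[] getOrientation(double[] r) {
        double azimuth = Math.atan2(r[1], r[4]);
        double pitch = Math.asin(-r[7]);
        double roll = Math.atan2(-r[6], r[8]);
        return new double[] { azimuth, pitch, roll };
    }

    /** Row-major rotation about the z axis by 'deg' degrees, for testing. */
    public static double[] rotZ(double deg) {
        double c = Math.cos(Math.toRadians(deg)), s = Math.sin(Math.toRadians(deg));
        return new double[] { c, -s, 0, s, c, 0, 0, 0, 1 };
    }

    public static void main(String[] args) {
        // A pure rotation about z by -30 degrees reads back as a 30-degree azimuth.
        double azimuth = getOrientation(rotZ(-30))[0];
        System.out.println(Math.toDegrees(azimuth)); // ~30.0
    }
}
```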

Convert latitude and longitude to ECEF coordinates system

谁都会走 submitted on 2019-12-01 16:56:51
I am studying the pArk Apple sample code and how it works. Does anyone know how the functions that convert latitude and longitude to ECEF coordinates, and convert ECEF to ENU coordinates centered at a given lat, lon, work? I just want to understand what is going on in this function. Thanks. void latLonToEcef(double lat, double lon, double alt, double *x, double *y, double *z) { double clat = cos(lat * DEGREES_TO_RADIANS); double slat = sin(lat * DEGREES_TO_RADIANS); double clon = cos(lon * DEGREES_TO_RADIANS); double slon = sin(lon * DEGREES_TO_RADIANS); double N = WGS84_A / sqrt(1.0 - WGS84_E * WGS84
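The truncated line is computing N, the prime-vertical radius of curvature, N = a / sqrt(1 - e^2 * sin^2(lat)). A completed sketch of the conversion in Java, using the standard WGS84 constants (here the eccentricity is folded in as e^2 directly, where the sample writes WGS84_E * WGS84_E):

```java
// Sketch: geodetic (lat, lon, alt) -> ECEF, completing the truncated
// latLonToEcef from the pArk sample with standard WGS84 constants.
public class Ecef {
    static final double WGS84_A = 6378137.0;         // semi-major axis (m)
    static final double WGS84_E2 = 6.69437999014e-3; // first eccentricity squared

    /** lat/lon in degrees, alt in meters; returns {x, y, z} in meters. */
    public static double[] latLonToEcef(double lat, double lon, double alt) {
        double clat = Math.cos(Math.toRadians(lat));
        double slat = Math.sin(Math.toRadians(lat));
        double clon = Math.cos(Math.toRadians(lon));
        double slon = Math.sin(Math.toRadians(lon));
        // N: prime-vertical radius of curvature at this latitude
        double n = WGS84_A / Math.sqrt(1.0 - WGS84_E2 * slat * slat);
        double x = (n + alt) * clat * clon;
        double y = (n + alt) * clat * slon;
        double z = (n * (1.0 - WGS84_E2) + alt) * slat;
        return new double[] { x, y, z };
    }

    public static void main(String[] args) {
        // On the equator at lon 0, ECEF x is exactly the semi-major axis.
        double[] p = latLonToEcef(0, 0, 0);
        System.out.printf("%.1f %.1f %.1f%n", p[0], p[1], p[2]); // 6378137.0 0.0 0.0
    }
}
```

A useful sanity check: at the pole (lat = 90°) the z coordinate comes out as a*sqrt(1 - e^2), i.e. the WGS84 semi-minor axis of about 6356752.3 m.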

Adding a material to a ModelEntity programmatically

陌路散爱 submitted on 2019-12-01 13:02:12
The docs for RealityKit include the structs OcclusionMaterial, SimpleMaterial, and UnlitMaterial for adding materials to a ModelEntity. Alternatively, you can load a model with a material already attached to it. I want to add a custom material/texture to a ModelEntity programmatically. How can I achieve this on the fly, without attaching the material to a model in Reality Composer or some other 3D software? As you said, there are three types of materials in RealityKit at the moment: SimpleMaterial, UnlitMaterial and OcclusionMaterial. So you can try the following code using the SimpleMaterial class: var

Analysis of a shader in VR

别说谁变了你拦得住时间么 submitted on 2019-12-01 11:35:12
I would like to create a shader like that one, which takes world coordinates and creates waves. I would like to analyse the video and learn the steps that are required. I'm not looking for code; I'm just looking for ideas on how to implement it using GLSL, HLSL, or any other shading language. Here is a low-quality, low-fps GIF in case the link breaks. Here is the fragment shader: #version 330 core // Interpolated values from the vertex shaders in vec2 UV; in vec3 Position_worldspace; in vec3 Normal_cameraspace; in
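Although the question asks for ideas rather than code, the core of a world-space wave effect is a displacement function evaluated per vertex from its world position, typically a sum of directional sine waves; the same math would run in a GLSL vertex shader. A sketch of one sine component, with all parameter values made up for the example:

```java
// Sketch of the per-vertex math behind a world-space wave shader: the
// vertex's world (x, z) is projected onto a wave direction, scaled by a
// wavenumber, phase-shifted by time, and fed to sin() for a height offset.
public class WaveSketch {
    /** Height offset for a vertex at world (x, z) at time t, one sine component. */
    public static double wave(double x, double z, double t,
                              double dirX, double dirZ,
                              double amplitude, double wavelength, double speed) {
        double k = 2.0 * Math.PI / wavelength; // wavenumber
        double phase = k * (dirX * x + dirZ * z) - speed * k * t;
        return amplitude * Math.sin(phase);
    }

    public static void main(String[] args) {
        // A crest sits where the phase reaches PI/2: k*x = PI/2 -> x = wavelength/4.
        double h = wave(2.5, 0.0, 0.0, 1.0, 0.0, 0.3, 10.0, 1.0);
        System.out.println(h); // ~0.3 (full amplitude at the crest)
    }
}
```

Summing several such components with different directions, wavelengths, and speeds is the usual way to break up the regularity and get a convincing water surface.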

ARCore – Session, Frame, Camera and Pose

佐手、 submitted on 2019-12-01 09:21:26
I'm studying the ARCore References and Develop guides, took the Coursera course, and have read, understood, and learned from the Samples. But I'm still missing some definitions along with real use examples. What is a session? Do I need a session every time I use ARCore? Does a session always have a camera connected, so I can see and draw/render my 3D models on the screen? Can I do this without a session? Camera has a getPose and Frame has a getPose; what are the differences between them? I thought about splitting these questions up, but somehow I know that they are all connected. Sessions, CameraAr, Frame and
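As a conceptual anchor for the pose questions: in ARCore a Pose is a rigid transform, a unit quaternion plus a translation, that maps points from one coordinate frame (such as the camera's) into the world frame tracked by the session. A minimal illustrative version of that transform, not the real com.google.ar.core.Pose class:

```java
// Sketch: a pose as rotation (unit quaternion) + translation, mapping a
// point from a local frame into the world frame: world = R * p + t.
public class PoseSketch {
    final double qx, qy, qz, qw; // rotation quaternion
    final double tx, ty, tz;     // translation

    PoseSketch(double qx, double qy, double qz, double qw,
               double tx, double ty, double tz) {
        this.qx = qx; this.qy = qy; this.qz = qz; this.qw = qw;
        this.tx = tx; this.ty = ty; this.tz = tz;
    }

    /** Rotate the point by the quaternion, then translate. */
    public double[] transformPoint(double[] p) {
        // q * (p, 0) * conj(q), expanded via v + 2*(qw*(u x v) + u x (u x v))
        double x = p[0], y = p[1], z = p[2];
        double uvx = qy * z - qz * y, uvy = qz * x - qx * z, uvz = qx * y - qy * x;
        double uuvx = qy * uvz - qz * uvy, uuvy = qz * uvx - qx * uvz, uuvz = qx * uvy - qy * uvx;
        return new double[] {
            x + 2.0 * (qw * uvx + uuvx) + tx,
            y + 2.0 * (qw * uvy + uuvy) + ty,
            z + 2.0 * (qw * uvz + uuvz) + tz
        };
    }

    public static void main(String[] args) {
        // 90-degree rotation about y plus a translation of (1, 0, 0):
        double s = Math.sin(Math.PI / 4), c = Math.cos(Math.PI / 4);
        PoseSketch pose = new PoseSketch(0, s, 0, c, 1, 0, 0);
        double[] world = pose.transformPoint(new double[] { 0, 0, -1 });
        // (0, 0, -1) rotates to (-1, 0, 0), then translates to ~(0, 0, 0).
        System.out.println(world[0] + " " + world[1] + " " + world[2]);
    }
}
```

Read this way, the different getPose accessors are just different rigid transforms relative to the same world frame (the camera's pose versus a sensor-aligned pose), which is why the concepts only make sense together with the session that tracks that frame.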

How to replace UIAccelerometer with CMMotionManager?

岁酱吖の submitted on 2019-12-01 07:42:37
I'm new to iOS development. I followed a tutorial from Ray Wenderlich to create a little location-based AR app. However, the tutorial uses an AR Toolkit which has not been updated for a while. The UIAccelerometer it uses has been deprecated since iOS 5, so when I run it on my iPhone (iOS 7.0.4), Xcode shows 3 warnings, all caused by UIAccelerometer. The result is that all the markers stay at the center of the screen, one above another,

Does Phonegap support WebRTC?

醉酒当歌 submitted on 2019-12-01 07:35:53
I want to build an augmented reality app. I was thinking of using something like the Wikitude SDK here http://www.wikitude.com/developer or this JavaScript library https://github.com/mtschirs/js-objectdetect , which I would prefer. However, it relies on WebRTC support, which is of course fine in a modern browser, but I'm not quite sure whether PhoneGap also supports it. In addition, if anyone knows how I can superimpose my 3D models over an object, that would be great. I don't know what file format my 3D models need to be in to be used with these augmented reality solutions.