augmented-reality

Can I track more than 4 images at a time with ARKit?

Submitted by 馋奶兔 on 2020-02-14 13:03:07

Question: Out of the box it's pretty clear that ARKit doesn't allow tracking more than 4 images at once. (You can "track" more markers than that, but only 4 will function at a time.) See this question for more details on that. However, I'm wondering if there is a possible work-around, something like adding and removing anchors on a timer, or getting the position information and then displaying the corresponding models without ARKit, etc. My knowledge of Swift is fairly limited so I haven't had much…
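
One possible shape of the timer work-around the question mentions, as a minimal untested sketch: cycle which 4-image subset is active by re-running the session with a fresh configuration on a timer. The ImageCycler class and the 2-second interval are hypothetical, and re-running the configuration will likely make anchors briefly drop and re-detect.

    import ARKit

    // Hypothetical helper: rotates through more than 4 reference images by
    // re-running the session with a different 4-image window on a timer.
    // Assumes allImages.count > 4.
    final class ImageCycler {
        private let session: ARSession
        private let allImages: [ARReferenceImage]
        private var offset = 0
        private var timer: Timer?

        init(session: ARSession, images: [ARReferenceImage]) {
            self.session = session
            self.allImages = images
        }

        func start() {
            timer = Timer.scheduledTimer(withTimeInterval: 2.0, repeats: true) { [weak self] _ in
                self?.rotateActiveWindow()
            }
        }

        private func rotateActiveWindow() {
            // Pick the next window of 4 images and re-run the session with it.
            let window = (0..<4).map { allImages[(offset + $0) % allImages.count] }
            offset = (offset + 4) % allImages.count

            let config = ARImageTrackingConfiguration()
            config.trackingImages = Set(window)
            config.maximumNumberOfTrackedImages = 4
            session.run(config)   // anchors may briefly drop on each re-run
        }
    }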

RealityKit: Where are 'behaviours' created in Reality Composer stored in the .rcproject object?

Submitted by 拈花ヽ惹草 on 2020-02-05 04:05:49

Question: The situation: I am making an AR app in Xcode (11.3.1). I have added objects (e.g. a cube) into the scene using Reality Composer and added behaviours (i.e. tap and flip, look at camera) to those objects, also using Reality Composer. I saved that and switched to ViewController.swift. In ViewController, I load the Experience.rcproject and access the default Box scene by writing var box = try! Experience.loadBox(). All works as expected. I am then printing the various objects in the hierarchy to…
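
One way to inspect what loadBox() actually returns is to walk the entity tree and print every node; the behaviours themselves don't appear to be exposed as child entities, but this at least shows the full object hierarchy. A minimal sketch, assuming the default Box scene that Xcode generates from Experience.rcproject:

    import RealityKit

    // Recursively print the entity hierarchy of a loaded RealityKit scene.
    func printHierarchy(_ entity: Entity, indent: String = "") {
        print("\(indent)\(entity.name) (\(type(of: entity)))")
        for child in entity.children {
            printHierarchy(child, indent: indent + "  ")
        }
    }

    let box = try! Experience.loadBox()
    printHierarchy(box)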

Submission of Augmented Reality iOS Applications to App Store

Submitted by 折月煮酒 on 2020-02-02 14:55:26

Question: I'm developing an AR application for iOS for a range of toys. The concept is essentially that the physical toy carries a marker that the camera detects and overlays with a 3D scene. My question is regarding submission of AR applications to the App Store: what are the requirements around providing AR markers for App Store reviewers to test with? Regards, Daniel Answer 1: You can put the image marker on any web server, then provide a link for Apple in the Review Notes. Apple requires demo…

How can I reduce the opacity of the shadows in RealityKit?

Submitted by 风流意气都作罢 on 2020-01-29 09:53:53

Question: I composed a scene in Reality Composer and added 3 objects to it. The problem is that the shadows are too intense (dark). I tried using the Directional Light in RealityKit from this answer rather than a default light from Reality Composer (since there is no option to adjust the light in it). Update: I implemented the spotlight lighting as explained by @AndyFedo in the answer. The shadow is still very dark. Answer 1: In case you need soft and semi-transparent shadows in your scene, use SpotLight…
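
For reference, a minimal sketch of what the SpotLight approach can look like in RealityKit code. The name arView is assumed, the intensity/angle values are placeholders to tune, and the exact Shadow initializer available may depend on the RealityKit version:

    import RealityKit
    import UIKit

    // Hypothetical spotlight setup; tune intensity and angles for softer shadows.
    let spotlight = SpotLight()
    spotlight.light.color = .white
    spotlight.light.intensity = 50_000        // in lumens
    spotlight.light.innerAngleInDegrees = 40
    spotlight.light.outerAngleInDegrees = 60
    spotlight.light.attenuationRadius = 10
    spotlight.shadow = SpotLightComponent.Shadow()
    spotlight.look(at: [0, 0, 0], from: [0, 2, 1], relativeTo: nil)

    // Attach the light to a world anchor so it illuminates the whole scene.
    let lightAnchor = AnchorEntity(world: .zero)
    lightAnchor.addChild(spotlight)
    arView.scene.addAnchor(lightAnchor)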

How to do a web-based AR solution without using apps?

Submitted by ぐ巨炮叔叔 on 2020-01-25 09:35:07

Question: I was searching for different options to do web-based AR without downloading apps or web-supporting apps. I work with Unity, but it does not support a web-based AR experience. A requirement came in that the user cannot download any apps and can only use the browser to view AR. While searching, I learned that we need JavaScript to do web-based AR. Are there any alternate ways, any SDK with Unity, to do web-based AR? Or other SDKs for web-based AR? The things I tried for image tracking (web-based) are: 1)…

Handling 3D Interaction and UI Controls in Augmented Reality

Submitted by 瘦欲@ on 2020-01-25 07:30:08

Question: I'm playing with this code example, and I'm having trouble finding where I can replace the existing 3D objects with my own programmatically created 3D objects. I want to keep all the existing functionality in the code example, but with my own objects! Creating the 3D object:

    self.geometry = SCNBox(width: width!/110, height: height!/110, length: 57 / 700, chamferRadius: 0.008)
    self.geometry.firstMaterial?.diffuse.contents = UIColor.red
    self.geometry.firstMaterial?.specular.contents = UIColor…
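
If the goal is just to build a node programmatically, one way to package the snippet above is as a small factory function that returns an SCNNode, which can then be placed the same way the sample places its loaded objects (the sample itself loads its virtual objects from .scn files). A sketch only; the specular color is completed here as an assumption, since the excerpt above is cut off:

    import SceneKit
    import UIKit

    // Build a simple red box node; the dimensions follow the snippet above.
    func makeBoxNode(width: CGFloat, height: CGFloat) -> SCNNode {
        let geometry = SCNBox(width: width / 110,
                              height: height / 110,
                              length: 57 / 700,
                              chamferRadius: 0.008)
        geometry.firstMaterial?.diffuse.contents = UIColor.red
        geometry.firstMaterial?.specular.contents = UIColor.white  // assumed value
        return SCNNode(geometry: geometry)
    }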

Tracking face mesh vertices of Augmented Faces (ARCore) regardless of rotation

Submitted by ╄→гoц情女王★ on 2020-01-24 20:48:06

Question: I'm trying to track facial expressions such as eyebrow raises, smiles, winks, etc. In ARKit I could use blendShapes (https://developer.apple.com/documentation/arkit/arfaceanchor/2928251-blendshapes) to detect the movement of the different parts of the face, but this doesn't exist in ARCore yet. I've tried to access the mesh vertices, which are relative to the center transform of the face, but these change significantly with the rotation of the face. Is there a way to normalize the face landmark…
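
One idea for the normalization: re-express each vertex in the face's local frame by multiplying with the inverse of the center pose, which removes the head rotation before measuring expression deltas. Below is a minimal sketch of that math in Swift with simd; ARCore itself is Java/Kotlin, so treat this as pseudocode for the transform:

    import simd

    // Re-express a world-space vertex in the face's local coordinate frame.
    // `centerPose` is the face's 4x4 center transform; the result should be
    // stable under head rotation, so expression changes can be measured on it.
    func localizedVertex(_ vertex: SIMD3<Float>,
                         centerPose: simd_float4x4) -> SIMD3<Float> {
        let world = SIMD4<Float>(vertex.x, vertex.y, vertex.z, 1)  // homogeneous
        let local = centerPose.inverse * world
        return SIMD3<Float>(local.x, local.y, local.z)
    }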

3D AR Markers with Project Tango

Submitted by 陌路散爱 on 2020-01-24 12:34:14

Question: I'm working on a project for an exhibition where an AR scene is supposed to be layered on top of a 3D-printed object. Visitors will be given a device with the application pre-installed. Various objects should be seen around / on top of the exhibit, so the precision of tracking is quite important. We're using Unity to render the scene; this is not something that can be changed, as we're already well into development. However, we're somewhat flexible on the technology we use to recognize the 3D…
