arkit

ARKit – Viewport Size vs Real Screen Resolution

Submitted by ℡╲_俬逩灬. on 2020-05-28 07:45:09

Question: I am writing an ARKit app that uses the ARSCNView hitTest function. The app also sends captured images to a server for some analysis. I noticed that when I do:

let viewportSize = sceneView.snapshot().size
let viewSize = sceneView.bounds.size

the first one is twice as large as the second. The questions are: 1. Why is there a difference? 2. What "size" (e.g. coordinate space) is used in hitTest?

Answer 1: Why is there a difference? Let's explore some important display characteristics of your iPhone 7: a…
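
The answer above is truncated, but the observed factor of two is the iPhone 7's Retina scale: bounds is measured in points, while the snapshot (per the question's own observation) comes back at the native pixel resolution, i.e. points × UIScreen.main.scale, which is 2.0 on that device. hitTest expects a point in the view's point-based coordinate space. A minimal sketch, assuming a view controller that owns the sceneView from the question:

import UIKit
import ARKit

class ViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    func inspectSizes() {
        let viewSize = sceneView.bounds.size          // points, e.g. 375 × 667 on iPhone 7
        let snapshotSize = sceneView.snapshot().size  // per the question: points × scale
        let scale = UIScreen.main.scale               // 2.0 on iPhone 7 (@2x Retina)
        print(viewSize, snapshotSize, scale)

        // hitTest takes a point in the view's own (point-based) space,
        // so derive it from bounds, never from the snapshot's pixel size:
        let center = CGPoint(x: sceneView.bounds.midX, y: sceneView.bounds.midY)
        let results = sceneView.hitTest(center, types: .featurePoint)
        print(results.first?.distance ?? "no hit")
    }
}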

Where is the .camera AnchorEntity located?

Submitted by 佐手、 on 2020-05-28 04:27:45

Question: When I add a child to my AnchorEntity(.camera), the child appears to spawn behind my camera (meaning I can only see the child when I turn around). I have also tried adding a mesh to my anchor directly, but unfortunately ARKit / RealityKit does not render the mesh when you are inside of it (and because it is centered around the camera, that is theoretically always the case). However, it could also be that it is always located behind the screen [where the user is] and I'm never…
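
The question is cut off above, but the usual fix is to give the child a negative Z offset relative to the camera anchor, so it sits in front of the lens rather than at the camera's origin. A minimal sketch (the box and the half-meter offset are illustrative):

import RealityKit
import ARKit

func attachBoxInFrontOfCamera(in arView: ARView) {
    let cameraAnchor = AnchorEntity(.camera)

    let box = ModelEntity(
        mesh: .generateBox(size: 0.1),
        materials: [SimpleMaterial(color: .red, isMetallic: false)]
    )
    // RealityKit cameras look down -Z, so a negative Z offset places
    // the child half a meter in front of the camera.
    box.position = SIMD3<Float>(0, 0, -0.5)

    cameraAnchor.addChild(box)
    arView.scene.addAnchor(cameraAnchor)
}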

How do I make an entity a physics entity in RealityKit?

Submitted by ﹥>﹥吖頭↗ on 2020-05-26 09:24:11

Question: I am not able to figure out how to make the "ball" entity a physics entity / body and apply a force to it.

// I'm using UIKit for the user interface, and RealityKit +
// the models made in Reality Composer for the augmented reality and code
import RealityKit
import ARKit

class ViewController: UIViewController {
    var ball: (Entity & HasPhysics)? {
        try? Entity.load(named: "golfball") as? Entity & HasPhysics
    }

    @IBOutlet var arView: ARView!
    // referencing the play now button on the home screen…
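
One likely culprit in the snippet above: Entity.load(named:) returns a plain Entity, which does not conform to HasPhysics, so the conditional cast always yields nil. A sketch of one way to get a physics-capable ball instead, assuming the question's "golfball" asset name:

import RealityKit

// Entity.loadModel(named:) returns a ModelEntity, which becomes a physics
// body once it has collision shapes and a PhysicsBodyComponent.
func makePhysicsBall() -> ModelEntity? {
    guard let ball = try? Entity.loadModel(named: "golfball") else { return nil }

    // Build collision shapes from the visual mesh, then attach a dynamic
    // body so gravity and applied forces affect the entity.
    ball.generateCollisionShapes(recursive: true)
    ball.physicsBody = PhysicsBodyComponent(
        massProperties: .default,
        material: nil,
        mode: .dynamic
    )
    return ball
}

// Once the ball is anchored in the scene, apply a force in newtons:
// ball.addForce(SIMD3<Float>(0, 0, -1), relativeTo: nil)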

What is the easiest way to convert .obj or .stl to .usdz?

Submitted by 北城以北 on 2020-05-26 04:59:41

Question: Apple AR Quick Look apparently only supports .usdz files. So, what is the easiest way to convert an .obj or .stl to .usdz? I googled this first, but the most popular result was a free tool called Vectary, and when I actually tried to use it, it wasn't free. Thanks.

Answer 1: With iOS 13 and RealityKit, Apple has released a USDZ converter that does not require Xcode; the converter is a Python-based tool. I have been using it for the last few days to convert various glTF and OBJ files.
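
For reference, Apple's Python-based converter mentioned in the answer is usdzconvert, shipped with the usdpython tools. A minimal invocation, assuming the tool is installed and on your PATH (file names are illustrative):

usdzconvert toy.obj toy.usdz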

ARFaceTrackingConfiguration: How to distinguish pictures from real faces?

Submitted by 守給你的承諾、 on 2020-05-10 21:00:15

Question: We have several apps in the Store that use ARFaceTrackingConfiguration to detect the user's face on iOS devices with Face ID cameras. As you might have seen, ARKit will also track pictures of faces you put in front of your iPad Pro / iPhone X as if they were real faces. E.g., take a picture from one of our apps (to replicate, one can download and run Apple's example app for ARFaceTrackingConfiguration): Now I have noticed that internally ARKit treats real faces differently than it does pictures of faces.…
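
The question is truncated above; the following is one illustrative heuristic, not ARKit's internal check: a printed or on-screen photo is nearly planar, while a real face shows centimeters of relief in the TrueDepth data, so you can sample the depth map over the face region and measure its spread. A rough sketch, assuming a session running ARFaceTrackingConfiguration (the sampling grid and the 2 cm threshold are made-up values, not tuned):

import ARKit
import AVFoundation

func faceLooksFlat(in frame: ARFrame) -> Bool {
    // capturedDepthData is only delivered for face-tracking sessions.
    guard let depthData = frame.capturedDepthData?
        .converting(toDepthDataType: kCVPixelFormatType_DepthFloat32) else {
        return false
    }
    let map = depthData.depthDataMap

    CVPixelBufferLockBaseAddress(map, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(map, .readOnly) }

    let width = CVPixelBufferGetWidth(map)
    let height = CVPixelBufferGetHeight(map)
    let rowStride = CVPixelBufferGetBytesPerRow(map) / MemoryLayout<Float32>.stride
    let base = CVPixelBufferGetBaseAddress(map)!
        .assumingMemoryBound(to: Float32.self)

    // Sample a coarse grid over the central region, where the face usually sits.
    var minD = Float.greatestFiniteMagnitude
    var maxD = -Float.greatestFiniteMagnitude
    for y in stride(from: height / 4, to: 3 * height / 4, by: 8) {
        for x in stride(from: width / 4, to: 3 * width / 4, by: 8) {
            let d = base[y * rowStride + x]
            if d.isFinite && d > 0 {
                minD = min(minD, d)
                maxD = max(maxD, d)
            }
        }
    }
    // Less than ~2 cm of depth relief suggests a flat picture.
    return (maxD - minD) < 0.02
}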

How do you attach an object to your camera position with ARKit Swift?

Submitted by 北战南征 on 2020-05-09 18:55:05

Question: I have moving objects that I want to be able to collide with me, the player. I can launch objects from myself by reading my current position/direction at that moment, but I do not understand how to attach an object to me so that it follows my position at all times.

Answer 1: In SceneKit, everything that can have a position in the scene is (attached to) a node. That includes not just visible objects, but also light sources and cameras. When you use ARSCNView, there's still a…
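
The answer breaks off above, but the node it is leading up to is the view's pointOfView camera node: anything parented to it moves with the device. A minimal sketch (the sphere and the 30 cm offset are illustrative):

import ARKit
import SceneKit

func attachSphereToCamera(in sceneView: ARSCNView) {
    // ARSCNView drives an SCNCamera through its pointOfView node.
    guard let cameraNode = sceneView.pointOfView else { return }

    let sphere = SCNNode(geometry: SCNSphere(radius: 0.03))
    // Offset along -Z so the sphere floats in front of the camera
    // instead of sitting inside it.
    sphere.position = SCNVector3(0, 0, -0.3)
    cameraNode.addChildNode(sphere)
}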

didBeginContact delegate method not firing for ARKit collision detection

Submitted by 妖精的绣舞 on 2020-04-30 09:20:54

Question: I can't get the didBeginContact method to fire. I have been trying for a while and I can't spot the error; a fresh set of eyes would help:

- (void)viewDidLoad {
    [super viewDidLoad];
    self.lastRender = nil;
    self.accelX = 0.0;
    self.accelY = 0.0;
    self.accelZ = 0.0;
    self.isLooping = TRUE;
    self.tripWire = TRUE;
    self.lastPaddleNode = [[SCNNode alloc] init];
    self.paddleNode = [[SCNNode alloc] init];
    SCNPlane *paddlePlane = [SCNPlane planeWithWidth:0.067056 height:0.138176];
    self.paddleNode.geometry = …
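
The snippet is cut off before the physics setup, but the usual reasons a contact callback never fires are a missing contactDelegate, nodes without physics bodies, or category/contact bitmasks that don't overlap. A minimal checklist in Swift (the bitmask values and node names are illustrative, not from the question):

import ARKit
import SceneKit

class GameController: NSObject, SCNPhysicsContactDelegate {
    // Each body needs its own category bit.
    let paddleCategory = 1 << 0
    let ballCategory = 1 << 1

    func configure(sceneView: ARSCNView, paddleNode: SCNNode, ballNode: SCNNode) {
        // 1. The delegate must be set on the scene's physics world.
        sceneView.scene.physicsWorld.contactDelegate = self

        // 2. Both nodes need physics bodies (kinematic bodies still report contacts).
        paddleNode.physicsBody = SCNPhysicsBody(
            type: .kinematic,
            shape: SCNPhysicsShape(geometry: paddleNode.geometry!, options: nil)
        )
        ballNode.physicsBody = SCNPhysicsBody(type: .dynamic, shape: nil)

        // 3. contactTestBitMask must include the other body's category,
        //    otherwise no contact callbacks are generated.
        paddleNode.physicsBody?.categoryBitMask = paddleCategory
        ballNode.physicsBody?.categoryBitMask = ballCategory
        paddleNode.physicsBody?.contactTestBitMask = ballCategory
        ballNode.physicsBody?.contactTestBitMask = paddleCategory
    }

    func physicsWorld(_ world: SCNPhysicsWorld, didBegin contact: SCNPhysicsContact) {
        print("contact:", contact.nodeA.name ?? "?", contact.nodeB.name ?? "?")
    }
}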

How to determine that ARObjectAnchor was removed from the scene?

Submitted by 自作多情 on 2020-04-11 04:48:27

Question: I am trying to use ARKit to validate the position of a toy. I have an ARObject scan resource, and placing the toy in the camera view works pretty well. In other words, didAdd and didUpdate of SCNScene and ARSession are called as expected, within a reasonable time after the toy is placed in the camera view. But when I move the toy out of the camera view, didRemove does not get called, neither for SCNScene nor for ARSession. I did read explanations of this behaviour, saying "well, ARKit can't know if…
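
The explanation is cut off above, but since ARKit deliberately never removes an ARObjectAnchor when the object leaves the frame, one workaround is to time out anchors that stop receiving didUpdate calls. A sketch with an illustrative one-second timeout (assign an instance of this class as the session's delegate):

import ARKit

class ObjectPresenceTracker: NSObject, ARSessionDelegate {
    private var lastSeen: [UUID: Date] = [:]
    private let timeout: TimeInterval = 1.0

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        // Record the most recent update time for each object anchor.
        for anchor in anchors where anchor is ARObjectAnchor {
            lastSeen[anchor.identifier] = Date()
        }
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Any tracked object whose anchor hasn't updated recently is
        // presumed to have left the view.
        let now = Date()
        for (id, seen) in lastSeen where now.timeIntervalSince(seen) > timeout {
            print("object anchor \(id) appears to be gone")
            lastSeen.removeValue(forKey: id)
        }
    }
}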