
ARKit Body Tracking using Xamarin and C# Inaccurate

不羁岁月 submitted on 2020-12-31 05:36:48
Question: I am trying to translate some Swift examples (such as this one: https://github.com/iamfine/ARSkeleton) to C# that show how to use ARKit Body Tracking, but I can't seem to position the joint nodes correctly over the corresponding joints. They follow my body movements, but the positions of the nodes appear to be incorrect. Can anyone familiar with ARKit Body Tracking see what I am doing wrong? Thanks. using ARKit; using Foundation; using OpenTK; using SceneKit; using System; using System
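One likely culprit (a guess, since the C# listing is cut off): ARSkeleton3D's jointModelTransforms are expressed relative to the ARBodyAnchor, not in world space, so each joint node's world transform has to be the body anchor's transform multiplied by the joint's model transform. A minimal pure-Swift sketch of that composition, using hand-rolled row-major 4×4 matrices instead of simd so it runs anywhere:

```swift
// Row-major 4x4 matrix as nested arrays; translation lives in the last column.
typealias Mat4 = [[Float]]

func identity() -> Mat4 {
    (0..<4).map { r in (0..<4).map { c in r == c ? Float(1) : 0 } }
}

func translation(_ x: Float, _ y: Float, _ z: Float) -> Mat4 {
    var m = identity()
    m[0][3] = x; m[1][3] = y; m[2][3] = z
    return m
}

func multiply(_ a: Mat4, _ b: Mat4) -> Mat4 {
    var out = identity()
    for r in 0..<4 {
        for c in 0..<4 {
            out[r][c] = (0..<4).reduce(0) { $0 + a[r][$1] * b[$1][c] }
        }
    }
    return out
}

// World-space joint position = bodyAnchor.transform * jointModelTransform,
// then read off the translation column.
func jointWorldPosition(bodyAnchorTransform: Mat4,
                        jointModelTransform: Mat4) -> (Float, Float, Float) {
    let world = multiply(bodyAnchorTransform, jointModelTransform)
    return (world[0][3], world[1][3], world[2][3])
}

// Example: an anchor 10 m along x, a joint 1.5 m up in anchor space.
let p = jointWorldPosition(bodyAnchorTransform: translation(10, 0, 0),
                           jointModelTransform: translation(0, 1.5, 0))
// p == (10.0, 1.5, 0.0) — using the joint transform alone would give (0, 1.5, 0)
```

If the C# port sets each node's position from the joint model transform alone (or parents the joint nodes to the scene root instead of the body anchor's node), the skeleton will track movement but sit offset from the body, which matches the symptom described.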

How to calculate quadrangle for visible part of vertical plane?

三世轮回 submitted on 2020-12-31 01:33:54
Question: My goal is to calculate the "visible" part of a vertical plane that is anchored to some ARPlaneAnchor and represent it with a quadrangle, as shown in the picture below. My current approach is based on a few hit tests, which unfortunately do not seem to give me satisfying results. First, when I detect an ARPlaneAnchor, I add a big invisible SCNNode to its main node. func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) { guard let planeAnchor = anchor as?
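Before resorting to hit tests, it may be cheaper to derive the quadrangle analytically: ARPlaneAnchor exposes a center and an extent in the anchor's local x/z plane, so the four corners are center ± extent/2 transformed by the anchor's transform, and the "visible" part is that quadrangle clipped against the camera frustum. A small pure-Swift sketch of the corner step only (the frustum clipping is omitted), with a hypothetical stub standing in for the real anchor type so it runs anywhere:

```swift
// Stand-in for ARPlaneAnchor: the fitted plane's extent is measured in the
// anchor's local x/z plane around `center`.
struct PlaneAnchorStub {
    var center: (x: Float, z: Float)   // local-space centre of the fitted plane
    var extent: (x: Float, z: Float)   // local-space width/depth
}

// Four local-space corners of the plane rectangle (y == 0 in anchor space),
// in counter-clockwise order. Transform these by the anchor's world transform
// to get world-space corners.
func localCorners(of anchor: PlaneAnchorStub) -> [(Float, Float, Float)] {
    let hx = anchor.extent.x / 2, hz = anchor.extent.z / 2
    let (cx, cz) = (anchor.center.x, anchor.center.z)
    return [(cx - hx, 0, cz - hz),
            (cx + hx, 0, cz - hz),
            (cx + hx, 0, cz + hz),
            (cx - hx, 0, cz + hz)]
}

let corners = localCorners(of: PlaneAnchorStub(center: (x: 1, z: 2),
                                               extent: (x: 4, z: 2)))
// corners == [(-1, 0, 1), (3, 0, 1), (3, 0, 3), (-1, 0, 3)]
```

Unlike a handful of hit tests, this uses the plane estimate ARKit already maintains, so the quadrangle updates smoothly as the anchor's extent is refined in didUpdate.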

ARKit – How to put 3D Object on QRCode?

℡╲_俬逩灬. submitted on 2020-12-27 08:36:16
Question: I'm trying to put a 3D object on a QR code with ARKit. For that I use an AVCaptureDevice to detect the QR code and establish its area, which gives me a CGRect. Then I make a hitTest on every point of the CGRect to get the average 3D coordinates, like so: positionGiven = SCNVector3(0, 0, 0) for column in Int(qrZone.origin.x)...2*Int(qrZone.origin.x + qrZone.width) { for row in Int(qrZone.origin.y)...2*Int(qrZone.origin.y + qrZone.height) { for result in sceneView.hitTest(CGPoint
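Two things look suspect in the excerpt: the loop bounds (`origin.x...2*(origin.x + width)` sweeps well past the rectangle; presumably `origin.x...(origin.x + width)` was intended), and an average only works if the accumulated sum is divided by the number of hits, with a guard for zero hits. A self-contained Swift sketch of the averaging step, with a minimal stand-in for SCNVector3 so it runs anywhere:

```swift
// Minimal 3-vector stand-in for SCNVector3 so the sketch runs anywhere.
struct Vec3: Equatable {
    var x: Float, y: Float, z: Float
}

// Average a set of hit-test world positions. Returns nil when there were no
// hits, which avoids a silent (0, 0, 0) result placing the object at the origin.
func averagePosition(of hits: [Vec3]) -> Vec3? {
    guard !hits.isEmpty else { return nil }
    let sum = hits.reduce(Vec3(x: 0, y: 0, z: 0)) {
        Vec3(x: $0.x + $1.x, y: $0.y + $1.y, z: $0.z + $1.z)
    }
    let n = Float(hits.count)
    return Vec3(x: sum.x / n, y: sum.y / n, z: sum.z / n)
}

let avg = averagePosition(of: [Vec3(x: 0, y: 0, z: 0),
                               Vec3(x: 2, y: 4, z: 6)])
// avg == Vec3(x: 1, y: 2, z: 3)
```

In practice a single hit test at the rectangle's midpoint, or hits at the four corners, is usually enough; hit-testing every pixel of the CGRect is expensive and the extra samples mostly average out to the same point.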

Adding 3D object to ARGeoAnchor

拥有回忆 submitted on 2020-12-13 05:40:08
Question: Please forgive me if this question is not that great. I've hit a bit of a roadblock with Apple's documentation of ARGeoAnchor. Currently, ARGeoAnchor just shows a blue dot in the AR scene view. I'm trying to show any 3D rendering or object instead. My code: let coordinate = CLLocationCoordinate2D(latitude: lat, longitude: lng) let geoAnchor = ARGeoAnchor(name: "Point 1", coordinate: coordinate) let boxGeometry = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0) let cube = SCNNode
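The snippet is cut off before it shows how the cube is attached, but a common cause of the blue-dot-only symptom is adding the anchor to the session without parenting any geometry to the node ARKit creates for it. With ARSCNView, the usual pattern is to attach content in the delegate's renderer(_:didAdd:for:) callback, which fires once the geo anchor is localized. A hedged sketch, assuming an ARSCNView whose delegate is set to this object and whose session the geoAnchor was added to via session.add(anchor:):

```swift
import ARKit
import SceneKit

class GeoAnchorRenderer: NSObject, ARSCNViewDelegate {
    // ARKit calls this after creating a node for each new anchor,
    // including ARGeoAnchors once geo tracking has localized them.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARGeoAnchor else { return }
        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
        box.firstMaterial?.diffuse.contents = UIColor.red
        // Parenting the cube to the anchor's node keeps it glued to the
        // geo-tracked position as tracking refines.
        node.addChildNode(SCNNode(geometry: box))
    }
}
```

Adding the cube directly to the scene's root node instead would leave it at a fixed scene position rather than at the geo anchor, which is consistent with seeing only the default blue-dot visualization.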

What is the difference between Orientation and Rotation in SCNNode

天大地大妈咪最大 submitted on 2020-12-06 06:35:22
Question: I am quite confused by "rotation" and "orientation" on an SCNNode. In Apple's docs, they are defined quite similarly: orientation: The node's orientation, expressed as a quaternion. Animatable. rotation: The node's orientation, expressed as a rotation angle about an axis. Animatable. And the Apple docs say: The rotation, eulerAngles, and orientation properties all affect the rotational aspect of the node's transform property. Any change to one of these properties is reflected in the others. So
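The two properties encode the same rotation in different notations: rotation is an SCNVector4 holding a unit axis (x, y, z) plus an angle in radians, while orientation is the equivalent unit quaternion, whose vector part is axis·sin(θ/2) and whose w component is cos(θ/2). A pure-Swift sketch of that mapping, using plain tuples instead of SCNVector4/SCNQuaternion so it runs anywhere:

```swift
import Foundation  // for sin/cos

// Axis-angle (unit axis + radians) -> unit quaternion: the same relationship
// SceneKit maintains between a node's `rotation` and its `orientation`.
func quaternion(axis: (x: Double, y: Double, z: Double), angle: Double)
    -> (x: Double, y: Double, z: Double, w: Double) {
    let half = angle / 2
    let s = sin(half)
    return (axis.x * s, axis.y * s, axis.z * s, cos(half))
}

// 90 degrees about the y axis.
let q = quaternion(axis: (x: 0, y: 1, z: 0), angle: .pi / 2)
// q.y == sin(.pi / 4) ≈ 0.7071, q.w == cos(.pi / 4) ≈ 0.7071, q.x == q.z == 0
```

Setting either property updates the other (and eulerAngles) because all of them are just views onto the node's single underlying transform; quaternions are generally preferred for interpolation since they avoid gimbal lock.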