arkit

Convert ARFrame's captured image to UIImage orientation issue

Submitted by 安稳与你 on 2020-01-13 11:12:31
Question: I want to detect a ball and have an AR model interact with it. I used OpenCV for the ball detection and send the center of the ball, which I can use in a hitTest to get coordinates in the sceneView. I have been converting the CVPixelBuffer to a UIImage using the following function:

static func convertToUIImage(buffer: CVPixelBuffer) -> UIImage? {
    let ciImage = CIImage(cvPixelBuffer: buffer)
    let temporaryContext = CIContext(options: nil)
    if let temporaryImage = temporaryContext.createCGImage(ciImage, from: CGRect(x: 0, y: 0, …
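The question is cut off above; as a sketch of the conversion it describes, the buffer's pixel dimensions can be used for the crop rect and an explicit orientation supplied when building the UIImage. The `.right` default below is an assumption (a portrait app with the landscape camera buffer); the correct value depends on the interface orientation.

```swift
import UIKit
import CoreImage

/// Sketch: convert an ARFrame's captured CVPixelBuffer to a UIImage,
/// applying an explicit orientation. The camera buffer is landscape,
/// so a portrait app typically needs .right (assumption; adjust to
/// match the current interface orientation).
func uiImage(from buffer: CVPixelBuffer,
             orientation: UIImage.Orientation = .right) -> UIImage? {
    let ciImage = CIImage(cvPixelBuffer: buffer)
    let context = CIContext(options: nil)
    let rect = CGRect(x: 0, y: 0,
                      width: CVPixelBufferGetWidth(buffer),
                      height: CVPixelBufferGetHeight(buffer))
    guard let cgImage = context.createCGImage(ciImage, from: rect) else {
        return nil
    }
    return UIImage(cgImage: cgImage, scale: 1.0, orientation: orientation)
}
```

Note that the orientation only affects how UIKit draws the image; pixel coordinates coming out of OpenCV still refer to the unrotated buffer and must be mapped accordingly before the hitTest.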

Swift Add Button to SCNNode

Submitted by 蹲街弑〆低调 on 2020-01-13 05:45:07
Question: I'm playing around with ARKit and image detection. I now have an app that detects images and places planes on the screen where the detected objects are. How can I add a clickable element, like a button, on the planes? I want to have a click event on each detected object. This is what my renderer function looks like:

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let imageAnchor = anchor as? ARImageAnchor else { return }
    let referenceImage = …
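SCNNodes are not UIControls, so the usual approach is not a literal UIButton but a tap gesture on the ARSCNView combined with a SceneKit hit test. A minimal sketch, assuming the plane nodes were given the name "detectedPlane" when they were added (that name and the handler body are illustrative):

```swift
import UIKit
import ARKit

/// Sketch: handle taps on detected-image planes via a gesture
/// recognizer plus SceneKit hit testing.
class TapHandlingViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        let tap = UITapGestureRecognizer(target: self,
                                         action: #selector(handleTap(_:)))
        sceneView.addGestureRecognizer(tap)
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        // hitTest returns the SceneKit nodes under the touch point,
        // nearest first.
        if let hit = sceneView.hitTest(point, options: nil).first,
           hit.node.name == "detectedPlane" {  // assumed name set in renderer(_:didAdd:for:)
            print("Plane tapped: \(hit.node)")
        }
    }
}
```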

How are the ARKit People Occlusion samples being done?

Submitted by 好久不见. on 2020-01-12 10:46:19
Question: This may be an obscure question, but I see lots of very cool samples online of how people are using the new people-occlusion technology in ARKit 3 to effectively "separate" people from the background and apply some sort of filtering to the "people" (see here). Looking at Apple's provided source code and documentation, I see that I can retrieve the segmentationBuffer from an ARFrame, which I've done, like so:

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let …
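The question is truncated above; as a sketch of the setup it describes: with person segmentation enabled on the configuration, each ARFrame exposes a low-resolution segmentationBuffer whose nonzero pixels mark where people are. Reading it looks roughly like this (any filtering or compositing step, e.g. via Core Image or Metal, is then up to the app):

```swift
import ARKit

/// Sketch: read the per-frame person-segmentation mask.
class SegmentationReader: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let segmentation = frame.segmentationBuffer else { return }
        // A CVPixelBuffer of one-component 8-bit values; nonzero
        // pixels correspond to detected people.
        let width = CVPixelBufferGetWidth(segmentation)
        let height = CVPixelBufferGetHeight(segmentation)
        print("segmentation buffer: \(width)x\(height)")
    }
}

// Enabling the buffer (requires an A12+ device and iOS 13+):
// let config = ARWorldTrackingConfiguration()
// config.frameSemantics = .personSegmentation
// session.run(config)
```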

How to rotate or scale a 3D model (.scn file) in ARKit

Submitted by [亡魂溺海] on 2020-01-12 04:06:52
Question: This class renders my SCN file fine:

import UIKit
import ARKit

class SimpleViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.scene = SCNScene(named: "duck.scn", inDirectory: "models.scnassets/furniture")!
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.session.run(ARWorldTrackingConfiguration())
    }

    override func viewWillDisappear(_ animated: Bool) {
        super…
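Once the scene is loaded, its root node (or a named child node) can be scaled and rotated directly through SCNNode properties. A minimal sketch; the child-node name "duck" is an assumption for illustration and depends on how the .scn file is structured:

```swift
import ARKit

/// Sketch: scale and rotate a loaded model by mutating its node.
func adjustModel(in sceneView: ARSCNView) {
    guard let duck = sceneView.scene.rootNode
            .childNode(withName: "duck", recursively: true) else { return }
    // Uniform scale to half size.
    duck.scale = SCNVector3(0.5, 0.5, 0.5)
    // Rotate 90 degrees around the y-axis.
    duck.eulerAngles.y = Float.pi / 2
}
```

For interactive control, a UIPinchGestureRecognizer or UIRotationGestureRecognizer handler can feed the same `scale` and `eulerAngles` properties.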

ARKit Ball Pass through Torus hole collision detection

Submitted by 前提是你 on 2020-01-06 06:35:19
Question: I have one ball node and one SCNTorus, and the ball can pass through the torus node. I have added collision handling, and I can detect the collision when the ball passes through the torus using SCNPhysicsContactDelegate. But physicsWorld(_:didEnd:) is called multiple times:

public func physicsWorld(_ world: SCNPhysicsWorld, didEnd contact: SCNPhysicsContact) {
    // print("Ended collision ")
    print(contact.nodeA.name)
    print(contact.nodeB.name)
    print(world…
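SceneKit can report several contact begin/end pairs for a single physical pass-through, so a common workaround is to latch a flag on the first callback and re-arm it once the ball is clear. A sketch of that idea; the 0.5-second re-arm delay is an arbitrary illustrative choice (a trigger volume past the torus would be more robust):

```swift
import SceneKit

/// Sketch: count each pass through the torus only once by debouncing
/// the repeated didEnd callbacks.
class ContactHandler: NSObject, SCNPhysicsContactDelegate {
    private var hasScoredThisPass = false

    func physicsWorld(_ world: SCNPhysicsWorld, didEnd contact: SCNPhysicsContact) {
        guard !hasScoredThisPass else { return }
        hasScoredThisPass = true
        print("Ball passed through torus")
        // Re-arm after a short delay so the next pass counts again.
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
            self.hasScoredThisPass = false
        }
    }
}
```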

Find 3d coordinates of a point on a line projected from another point in 3d space

Submitted by 戏子无情 on 2020-01-06 05:45:07
Question: Working in Swift with ARKit / SceneKit, I have a line AB in 3D and the xyz coordinates of both points A and B. I also have a point C and know its xyz coordinates too. Now I want to find the xyz coordinates of the point D on line AB such that CD is perpendicular to AB. What would be a simple way to do this in Swift?

Answer 1: Parameterize the line AB with a scalar t:

P(t) = A + (B - A) * t

The point D = P(t) is such that CD is perpendicular to AB, i.e. their dot product is zero:

dot(C - D, B - A) = 0
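Substituting D = A + (B - A) * t into dot(C - D, B - A) = 0 and solving gives t = dot(C - A, B - A) / dot(B - A, B - A). A small self-contained sketch of this, using a plain struct instead of SceneKit's SCNVector3 so it runs anywhere:

```swift
struct Vec3 {
    var x, y, z: Double
    static func - (a: Vec3, b: Vec3) -> Vec3 { Vec3(x: a.x - b.x, y: a.y - b.y, z: a.z - b.z) }
    static func + (a: Vec3, b: Vec3) -> Vec3 { Vec3(x: a.x + b.x, y: a.y + b.y, z: a.z + b.z) }
    static func * (a: Vec3, s: Double) -> Vec3 { Vec3(x: a.x * s, y: a.y * s, z: a.z * s) }
    func dot(_ o: Vec3) -> Double { x * o.x + y * o.y + z * o.z }
}

/// Foot of the perpendicular from C onto line AB:
/// t = dot(C - A, B - A) / dot(B - A, B - A), then D = A + (B - A) * t.
func footOfPerpendicular(a: Vec3, b: Vec3, c: Vec3) -> Vec3 {
    let ab = b - a
    let t = (c - a).dot(ab) / ab.dot(ab)
    return a + ab * t
}

// Example: A = (0,0,0), B = (1,0,0), C = (0.5, 1, 0) gives D = (0.5, 0, 0)
```

Note that t outside [0, 1] means the foot of the perpendicular lies beyond the segment's endpoints; clamp t if D must stay between A and B.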

Prevent SCNNode from going out of the screen

Submitted by 强颜欢笑 on 2020-01-06 03:29:08
Question: I'm playing around with ARKit and SceneKit. I have a node A, the only node I've added to the rootNode. Node A is allowed to move freely as the phone moves around, except that it is not allowed to leave the screen. To prevent node A from leaving the screen, I can think of the following ways: add a transform constraint to node A, which I've tried in the following manner:

[SCNTransformConstraint transformConstraintInWorldSpace:false withBlock:^SCNMatrix4(SCNNode *…
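One way to flesh out the constraint idea, sketched here in Swift: each frame, project the node's position to screen space, clamp it to the view bounds, and unproject it back at the same depth. This is an illustrative approach under those assumptions, not an Apple-documented recipe, and the clamping ignores the node's own extent:

```swift
import ARKit

/// Sketch: a world-space transform constraint that clamps a node's
/// projected position to the visible screen rectangle.
func keepOnScreenConstraint(for sceneView: ARSCNView) -> SCNTransformConstraint {
    return SCNTransformConstraint(inWorldSpace: true) { node, transform in
        var projected = sceneView.projectPoint(node.worldPosition)
        let bounds = sceneView.bounds
        projected.x = min(max(projected.x, 0), Float(bounds.width))
        projected.y = min(max(projected.y, 0), Float(bounds.height))
        // Unproject with the same depth (z) so the node keeps its
        // distance from the camera.
        let clamped = sceneView.unprojectPoint(projected)
        var result = transform
        result.m41 = clamped.x
        result.m42 = clamped.y
        result.m43 = clamped.z
        return result
    }
}

// Usage: nodeA.constraints = [keepOnScreenConstraint(for: sceneView)]
```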

Is it possible to run two ARSCNView at the same time?

Submitted by 跟風遠走 on 2020-01-06 01:18:09
Question: I was thinking of making some modifications to my existing AR app: I wanted to split the view and add two ARSCNViews inside, so that users could use a VR cardboard box and have a different experience. But Xcode always returns:

Session (0x102617d10): did fail with error: Error Domain=com.apple.arkit.error Code=102 "Required sensor failed."

So I suppose that I can't run two ARSCNView sessions at the same time, or am I wrong?

Answer 1: The answer is: yes, it's possible. Use the following code to …
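The answer above is truncated; the usual trick it alludes to is that while only one ARSession can own the camera, two ARSCNViews can share that single session. A sketch of the split-screen wiring (the outlet setup is an assumption for illustration):

```swift
import ARKit

/// Sketch: drive two ARSCNViews from one shared ARSession instead of
/// running two independent sessions (which fails with error 102,
/// "Required sensor failed").
class StereoViewController: UIViewController {
    @IBOutlet var leftView: ARSCNView!
    @IBOutlet var rightView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // The right view renders the same session and scene as the left.
        rightView.session = leftView.session
        rightView.scene = leftView.scene
        leftView.session.run(ARWorldTrackingConfiguration())
    }
}
```

Both views then render the same tracked world; a true stereoscopic effect additionally requires offsetting the right view's point of view by the interpupillary distance.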

问题 I was thinking to do some modification to my existing AR app, and I wanted to split the view and add inside 2 ARSCNView in this way users can use the VR Card Box and have a different experience but Xcode is always returning me: Session (0x102617d10): did fail with error: Error Domain=com.apple.arkit.error Code=102 "Required sensor failed." So, I'm supposing that I can't run 2 ARSCNView sessions at the same time, or am I wrong? 回答1: The answer is: Yes, it's possible . Use the following code to