ARKit

What are the limitations of scanning and detecting 3D objects in ARKit 2.0 on iOS?

Submitted by 半世苍凉 on 2019-12-13 03:14:48
Question: I am done with 3D object scanning and detection in ARKit 2.0. I have scanned the 3D object from all sides. Once scanning reached 100%, I gave the object a name and saved the ARReferenceObject and an image to the Documents directory. Then, on a button tap, I detect the scanned object and display its name and image from the Documents directory. The object gets detected, but it takes too much time to detect. I have gone through Apple's documentation for best practices and
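For reference, a minimal sketch of the detection side, assuming the scanned object was archived to the Documents directory and that `sceneView` is the running ARSCNView (the file name scannedObject.arobject is hypothetical):

```swift
import ARKit

// Load a previously saved .arobject from the Documents directory and run detection.
let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
let objectURL = documentsURL.appendingPathComponent("scannedObject.arobject") // hypothetical file name

if let referenceObject = try? ARReferenceObject(archiveURL: objectURL) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.detectionObjects = [referenceObject]   // objects ARKit should look for
    sceneView.session.run(configuration)
}
```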

Apply ARCamera rotation transform to node (ARKit)

Submitted by ε祈祈猫儿з on 2019-12-13 03:06:25
Question: I want to apply the rotation of the ARCamera to a 3D node so that the node always faces the camera. How can I implement this in Objective-C? Answer 1: You can get an SCNNode to face the ARCamera by using an SCNBillboardConstraint: an SCNBillboardConstraint object automatically adjusts a node's orientation so that its local z-axis always points toward the pointOfView node currently being used to render the scene. For example, you can use a billboard constraint to efficiently render parts
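A minimal Swift sketch of the constraint the answer describes (the question asks for Objective-C, but the same API applies; `sceneView` is assumed to be the ARSCNView):

```swift
// Assumes `sceneView` is the ARSCNView rendering the scene.
let node = SCNNode(geometry: SCNPlane(width: 0.2, height: 0.2))   // 20 cm plane
node.position = SCNVector3(0, 0, -0.5)

let billboard = SCNBillboardConstraint()
billboard.freeAxes = .Y            // optional: rotate only around Y so the node stays upright
node.constraints = [billboard]     // the node's local z-axis now tracks the pointOfView (the ARCamera)

sceneView.scene.rootNode.addChildNode(node)
```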

Which unit of measurement is used for x, y and z in an SCNVector3 position in ARKit?

Submitted by 牧云@^-^@ on 2019-12-13 02:35:39
Question: let position = SCNVector3(x: 0, y: 0, z: -2) — what is the distance, in measuring units, from the origin to the above point? Answer 1: In ARKit the unit of measurement is meters. When you launch your ARKit app, the worldOrigin is set at SCNVector3Zero, which is (0, 0, 0) (or, in the diagram, where the X, Y and Z axes intersect): ARKit defines a world coordinate space for you to use to place virtual content and locate detected objects in an AR experience. By default, this space is based on the initial
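As a quick illustration of the meter-based coordinates (a sketch, assuming `sceneView` is an ARSCNView with a running session):

```swift
// A 5 cm sphere placed 2 meters in front of the world origin
// (negative Z is "forward" from the initial camera orientation).
let sphere = SCNNode(geometry: SCNSphere(radius: 0.05))
sphere.position = SCNVector3(x: 0, y: 0, z: -2)
sceneView.scene.rootNode.addChildNode(sphere)
```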

Reduce Application Lag while using shadows in SceneKit

Submitted by 不打扰是莪最后的温柔 on 2019-12-13 02:15:50
Question: I am working on a 3D map in SceneKit. When I enable the Casts Shadow property of a directional light in SceneKit, the shadows appear, but the application becomes very slow. How do I reduce the lag while still keeping shadows in the scene? Answer 1: Use fake shadows (shadows generated and baked as a texture in a 3D authoring tool) rather than a true shadow map. To apply fake shadows as a texture on a 3D plane, use the PNG file format with a premultiplied alpha channel (RGB * A). This helps you get rid of
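A rough sketch of the fake-shadow approach, assuming a pre-baked PNG with premultiplied alpha (the asset name bakedShadow.png and the node modelNode are hypothetical):

```swift
// A textured plane laid flat under the model to act as a baked shadow.
let shadowPlane = SCNPlane(width: 1.0, height: 1.0)
shadowPlane.firstMaterial?.diffuse.contents = UIImage(named: "bakedShadow.png") // hypothetical asset
shadowPlane.firstMaterial?.lightingModel = .constant   // ignore scene lighting entirely
shadowPlane.firstMaterial?.writesToDepthBuffer = false // avoid z-fighting with the floor

let shadowNode = SCNNode(geometry: shadowPlane)
shadowNode.eulerAngles.x = -.pi / 2                    // rotate the vertical plane to lie flat
modelNode.addChildNode(shadowNode)                     // `modelNode` is the node the shadow belongs to
```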

How to capture an image in ARKit and send binary data?

Submitted by 你说的曾经没有我的故事 on 2019-12-13 00:37:31
Question: I want to capture an image in ARKit and send a byte array to a TCP server. This is my code: @IBOutlet weak var sceneView: ARSCNView! @IBAction func sendButtonAction(_ sender: Any) { let captureImage: UIImage = self.sceneView.snapshot() } I can get an image with snapshot, but I don't know how to convert it to a byte array (including the pixel R, G, B data). I tried to convert the UIImage to binary data like this: let imageData: NSData = UIImagePNGRepresentation(captureImage)! as NSData but this is not correct
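One possible conversion, assuming the encoded PNG bytes are what the server expects (on iOS 11+ pngData() replaces UIImagePNGRepresentation):

```swift
let captureImage: UIImage = sceneView.snapshot()

if let pngData = captureImage.pngData() {        // encoded PNG file data
    let byteArray = [UInt8](pngData)             // raw bytes, ready to write to the TCP stream
    // send `byteArray` (or `pngData` directly) over the socket
}
```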

Resetting ARKit coordinates

Submitted by 白昼怎懂夜的黑 on 2019-12-12 10:49:45
Question: I have a simple question. I want to start a game and place the board right in front of me: gameBoard!.position = SCNVector3(0, 0, -0.6) This works until I leave the game and come back again. Can I show the game board in the exact same position in front of the camera, i.e. 0.6 m in front of me? I might have physically moved to another position. Answer 1: If you want to reset your ARSession, you have to pause it, remove all nodes, and rerun the session, resetting tracking and removing anchors. I made a reset
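A minimal sketch of that reset, assuming `sceneView` is the ARSCNView and the board is re-added afterwards:

```swift
func resetSession() {
    sceneView.session.pause()

    // Remove everything previously placed in the scene.
    sceneView.scene.rootNode.enumerateChildNodes { node, _ in
        node.removeFromParentNode()
    }

    // Rerun the session with a fresh origin at the current device position.
    let configuration = ARWorldTrackingConfiguration()
    sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```

After such a reset, placing the board at SCNVector3(0, 0, -0.6) again puts it 0.6 m in front of wherever the camera is now.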

Marker-based initial positioning with ARCore/ARKit?

Submitted by 最后都变了- on 2019-12-12 08:10:34
Question: Problem situation: creating AR visualizations always at the same place (on a table) in a comfortable way. We don't want the customer to place the objects themselves, as in countless ARCore/ARKit examples. I'm wondering if there is a way to implement these steps: detect a marker on the table, then use the position of the marker as the initial position of the AR visualization and continue with SLAM tracking. I know there is something like a marker-detection API included in the latest build of the
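On the ARKit side, image detection can supply that initial anchor; a sketch, assuming the printed marker lives in an asset catalog group named "AR Resources" (the group name is an assumption) and `sceneView` is an ARSCNView:

```swift
let configuration = ARWorldTrackingConfiguration()
if let markers = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) {
    configuration.detectionImages = markers        // the printed markers on the table
}
sceneView.session.run(configuration)

// ARSCNViewDelegate: anchor the visualization where the marker was detected;
// world tracking (SLAM) keeps the content in place afterwards.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard anchor is ARImageAnchor else { return }
    // add the AR visualization as a child of `node`
}
```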

Xcode: could not load ModelIO.framework, SceneKit.framework, etc

Submitted by 风格不统一 on 2019-12-12 07:12:39
Question: Before updating to macOS Mojave my app ran fine without errors; after the update, however, I get this error. I have been Googling for two days, but it seems no one has run into this error yet. Note: the app does run as expected. The 3D model file is in the .scn format. Build-time error: /scntool:-1: Could not load ModelIO.framework ((null)) /scntool:-1: Could not load SceneKit.framework ((null)) /scntool:-1: Could not load PhysicsKit.framework ((null)) /scntool:-1: Could not load Jet.framework (

ARKit hitTest(_:options:) to select placed 3D objects not working

Submitted by 狂风中的少年 on 2019-12-12 01:16:35
Question: I am trying to select an object that has been placed on a detected plane in order to perform various tasks on it, such as rotating it with gestures. To search for placed objects and avoid hit-test results of an irrelevant nature (e.g. selecting the plane or the ARWorldMap itself), I am trying to use hitTest(_:options:) with SCNHitTestOption.categoryBitMask. However, it seems the hit test returns results of all types, not just objects with the selected categoryBitMask = 5, even
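A sketch of one common workaround, assuming the placed objects use categoryBitMask = 5 as in the question: pass the option to the hit test, then filter the results manually in case the option is not honored on its own.

```swift
@objc func handleTap(_ gesture: UITapGestureRecognizer) {
    let location = gesture.location(in: sceneView)

    let options: [SCNHitTestOption: Any] = [
        .categoryBitMask: 5,
        .searchMode: SCNHitTestSearchMode.all.rawValue
    ]

    let results = sceneView.hitTest(location, options: options)

    // Filter by categoryBitMask explicitly in case the option is ignored.
    if let hit = results.first(where: { $0.node.categoryBitMask == 5 }) {
        // rotate / manipulate hit.node
    }
}
```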

How to clip ARSCNView bounds to circle?

Submitted by 纵饮孤独 on 2019-12-11 18:43:38
Question: It should be straightforward to clip an ARSCNView, since ARSCNView is a subclass of UIView. After the boilerplate (sceneView.session.run(configuration)) logic, adding the scene view programmatically works in the ViewController: sceneView.translatesAutoresizingMaskIntoConstraints = false view.addSubview(sceneView) NSLayoutConstraint.activate([ sceneView.widthAnchor.constraint(equalTo: view.widthAnchor, multiplier: 1.0), sceneView.heightAnchor.constraint(equalTo: sceneView.widthAnchor, multiplier: 1.0)
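One simple way to clip the square view into a circle once layout has settled (a sketch, assuming the constraints above keep the view square):

```swift
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // The view is constrained to be square, so half its width yields a circle.
    sceneView.layer.cornerRadius = sceneView.bounds.width / 2
    sceneView.layer.masksToBounds = true
}
```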