ARKit

ARKit 2.0 – Scanning a 3D Object and Generating a 3D Mesh from It

半城伤御伤魂 submitted on 2019-12-18 17:36:53
Question: iOS 12 now allows us to create an ARReferenceObject and use it to reliably recognize the position and orientation of a real-world object. We can also save the finished .arobject file. But ARReferenceObject contains only the spatial feature information ARKit needs to recognize the real-world object; it is not a displayable 3D reconstruction of that object.

```swift
sceneView.session.createReferenceObject(transform: simd_float4x4, center: simd_float3, extent: simd_float3) { …
```
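A minimal sketch of the capture-and-export flow the excerpt is describing, assuming the scan bounding box (`boxCenter`, `boxExtent`) has already been determined by the app:

```swift
import ARKit

// Hypothetical helper: capture an ARReferenceObject inside a known bounding
// box and export it as a .arobject file. `boxCenter`/`boxExtent` are assumed
// inputs, not part of the ARKit API.
func captureReferenceObject(in sceneView: ARSCNView,
                            boxCenter: simd_float3,
                            boxExtent: simd_float3) {
    sceneView.session.createReferenceObject(
        transform: matrix_identity_float4x4,   // scan box placed at world origin
        center: boxCenter,
        extent: boxExtent
    ) { referenceObject, error in
        guard let object = referenceObject else {
            print("Scan failed:", error ?? "unknown error")
            return
        }
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("scan.arobject")
        try? object.export(to: url, previewImage: nil)
        // The .arobject stores only spatial features, not a renderable mesh;
        // object.rawFeaturePoints exposes the detected point cloud.
    }
}
```

Because only the sparse feature points are available (`rawFeaturePoints`), producing a true displayable mesh would require a separate surface-reconstruction step outside ARKit 2.0.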

Creating a custom SCNGeometry polygon plane with SCNGeometryPrimitiveType.polygon causes a crash/error

ε祈祈猫儿з submitted on 2019-12-18 13:39:10
Question: I'm trying to create a custom SCNGeometry in the form of a plane with a custom shape, which can be placed in an ARKit session. I'm using the option SCNGeometryPrimitiveType.polygon in the following method, which seems to work fine:

```swift
extension SCNGeometry {
    static func polygonPlane(vertices: [SCNVector3]) -> SCNGeometry {
        var indices: [Int32] = [Int32(vertices.count)]
        var index: Int32 = 0
        for _ in vertices {
            indices.append(index)
            index += 1
        }
        let vertexSource = SCNGeometrySource(vertices: …
```
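A completed sketch of the same helper, under the assumption that the excerpt continues in the usual way: for `.polygon`, the index buffer must begin with the vertex count of each polygon, which is why the array starts with `Int32(vertices.count)`.

```swift
import SceneKit

// Sketch of a single-polygon plane. With .polygon, primitiveCount is the
// number of polygons, and the index data starts with each polygon's size.
extension SCNGeometry {
    static func polygonPlane(vertices: [SCNVector3]) -> SCNGeometry {
        var indices: [Int32] = [Int32(vertices.count)]   // polygon size header
        indices.append(contentsOf: (0..<Int32(vertices.count)))

        let vertexSource = SCNGeometrySource(vertices: vertices)
        let indexData = Data(bytes: indices,
                             count: indices.count * MemoryLayout<Int32>.size)
        let element = SCNGeometryElement(data: indexData,
                                         primitiveType: .polygon,
                                         primitiveCount: 1,
                                         bytesPerIndex: MemoryLayout<Int32>.size)
        return SCNGeometry(sources: [vertexSource], elements: [element])
    }
}
```

A common source of the crash in this setup is a `bytesPerIndex` that does not match the index type, or vertices that are not coplanar/convex in the order given, so those are the first things to check.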

Does ARKit 2.0 consider Lens Distortion in iPhone and iPad?

混江龙づ霸主 submitted on 2019-12-18 12:33:14
Question: ARKit 2.0 updates many intrinsic (and extrinsic) parameters of the ARCamera from frame to frame. I'd like to know whether it also takes radial lens distortion into consideration (as the AVCameraCalibrationData class, which ARKit doesn't use, does) and corrects the video frames' distortion appropriately (distort/undistort operations) for the back iPhone and iPad cameras.

```swift
var intrinsics: simd_float3x3 { get }
```

As we all know, radial lens distortion greatly affects 6-DOF pose estimation accuracy when we …
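For context, this is how the per-frame intrinsics are read; ARKit exposes only this pinhole model on ARFrame, and does not surface AVCameraCalibrationData's distortion lookup tables there:

```swift
import ARKit

// Reading the per-frame 3×3 pinhole intrinsics. simd matrices are
// column-major, so K[column][row]: fx = K[0][0], fy = K[1][1],
// and the principal point is (K[2][0], K[2][1]).
func logIntrinsics(_ frame: ARFrame) {
    let K = frame.camera.intrinsics
    print("fx:", K[0][0], "fy:", K[1][1],
          "cx:", K[2][0], "cy:", K[2][1])
}
```

Any explicit undistortion would therefore have to be done by the app itself using a calibration obtained separately through AVFoundation.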

Are there any limitations in Vuforia compared to ARCore and ARKit?

删除回忆录丶 submitted on 2019-12-18 10:29:21
Question: I am a beginner in the field of augmented reality, working on applications that create plans of buildings (floor plans, room plans, etc., with accurate measurements) using a smartphone, so I am researching the best AR SDK for this. There are not many articles pitting Vuforia against ARCore and ARKit. Please suggest the best SDK to use, along with the pros and cons of each.

Answer 1: Updated: October 30, 2019. TL;DR: Google ARCore allows you to build apps for Android and iOS, while Apple ARKit …

Dictionary with array values remains empty when appending one value at a time

…衆ロ難τιáo~ submitted on 2019-12-18 09:53:48
Question: In ARKit, what I am trying to do is gather a number of positions of a node placed in my scene and then average them, so that the node movements are not as jittery as they tend to be in ARKit. Hence, I have a variable declared and initialized as a dictionary whose values are arrays of vector_float3. (I am thinking this is more of a Swift problem than an ARKit problem.)

```swift
var extentOfnodesAddedInScene: [SCNNode: [vector_float3]] = [:]
```

This is related to SceneKit/ARKit. Within the …
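A minimal sketch of the append-and-average idea, using String keys for brevity (the question uses SCNNode keys). One common cause of a dictionary that "remains empty" is appending through optional chaining (`dict[key]?.append(p)`), which silently does nothing when the key is missing; the `default:` subscript avoids that:

```swift
import simd

// Collect position samples per key and average them to smooth jitter.
var samples: [String: [simd_float3]] = [:]

func addSample(_ p: simd_float3, for key: String) {
    // `default: []` creates the array on first use;
    // `samples[key]?.append(p)` would be a no-op for a missing key.
    samples[key, default: []].append(p)
}

func averagedPosition(for key: String) -> simd_float3? {
    guard let pts = samples[key], !pts.isEmpty else { return nil }
    return pts.reduce(simd_float3(), +) / Float(pts.count)
}
```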

ARKit – Dropping a shadow of a 3D object on a plane surface

霸气de小男生 submitted on 2019-12-18 09:30:12
Question: This is the function I use to display an object on a plane surface.

```swift
private func loadScene(path: String) -> SCNNode {
    let spotLight = SCNLight()
    spotLight.type = SCNLight.LightType.probe
    spotLight.spotInnerAngle = 30.0
    spotLight.spotOuterAngle = 80.0
    spotLight.castsShadow = true
    let result = SCNNode()
    result.light = spotLight
    result.position = SCNVector3(-10.0, 20.0, 10.5)
    result.addChildNode(result)
    let scene = SCNScene(named: path)!
    for node in scene.rootNode.childNodes {
        result…
```
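Two problems stand out in the quoted code: a node is added as its own child (`result.addChildNode(result)`), and the light type is `.probe`, which is a light probe and cannot cast shadows. A sketch of a shadow-casting spot light, assuming the rest of the scene setup stays as in the question:

```swift
import SceneKit

// Sketch: a spot light configured to cast shadows onto a plane below it.
func makeShadowLight() -> SCNNode {
    let spot = SCNLight()
    spot.type = .spot                 // .probe cannot cast shadows
    spot.spotInnerAngle = 30
    spot.spotOuterAngle = 80
    spot.castsShadow = true
    spot.shadowMode = .deferred       // lets shadows land on a plane whose
                                      // diffuse color is transparent
    let node = SCNNode()
    node.light = spot
    node.position = SCNVector3(-10, 20, 10.5)
    node.eulerAngles = SCNVector3(-Float.pi / 2, 0, 0)  // aim the cone downward
    return node                       // add this to the scene's root node,
                                      // not to itself
}
```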

Can I apply a CIFilter to ARkit camera feed?

此生再无相见时 submitted on 2019-12-18 06:51:03
Question: I'm trying to apply a blur effect to the live camera stream in an ARSCNView. I have checked the WWDC videos; they only mention custom rendering with Metal, and I couldn't find any complete example on the web. Any idea how to do that?

Update 1: I have tried applying a filter to the background, but it shows the wrong orientation. How can I fix this?

```swift
let bg = self.session.currentFrame?.capturedImage
if bg != nil {
    let context = CIContext()
    let filter: CIFilter = CIFilter(name: "CIColorInvert")!
    let image…
```
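A sketch of the orientation fix: `capturedImage` is delivered in the sensor's landscape orientation, so for a portrait UI the CIImage needs rotating (`.oriented(.right)` for the back camera) before being set as the scene background. The filter name here just mirrors the question's example:

```swift
import ARKit
import CoreImage

// Filter the current camera frame and use it as the SceneKit background.
func applyFilteredBackground(for frame: ARFrame, in sceneView: ARSCNView) {
    let ciImage = CIImage(cvPixelBuffer: frame.capturedImage)
        .oriented(.right)                    // rotate landscape buffer to portrait
    let filter = CIFilter(name: "CIColorInvert")!
    filter.setValue(ciImage, forKey: kCIInputImageKey)
    guard let output = filter.outputImage else { return }
    let context = CIContext()
    sceneView.scene.background.contents =
        context.createCGImage(output, from: output.extent)
}
```

Creating a CIContext per frame is expensive; in a real app it should be created once and reused, and a Metal-backed pipeline is the performant route the WWDC sessions describe.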

ARKit Unable to run the session, configuration is not supported on this device

青春壹個敷衍的年華 submitted on 2019-12-18 04:38:17
Question: Using ARWorldTrackingSessionConfiguration I get an error on an iPhone 6 Plus: "Unable to run the session, configuration is not supported on this device." What can it be?

Answer 1: Here is Apple's documentation for ARKit and device support: ARKit requires an iOS device with an A9 or later processor. To make your app available only on devices supporting ARKit, use the arkit key in the UIRequiredDeviceCapabilities section of your app's Info.plist. If augmented reality is a secondary feature of your app, use the …
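The iPhone 6 Plus has an A8 chip, so world tracking is unavailable there. When AR is optional, the usual pattern is to check support at runtime before starting the session; a sketch:

```swift
import ARKit

// Guard session start on devices without an A9 or later processor.
func startSessionIfPossible(on sceneView: ARSCNView) {
    guard ARWorldTrackingConfiguration.isSupported else {
        // Fall back to 3-DOF orientation tracking, or hide AR features.
        sceneView.session.run(AROrientationTrackingConfiguration())
        return
    }
    sceneView.session.run(ARWorldTrackingConfiguration())
}
```

(`ARWorldTrackingSessionConfiguration` from the question is the iOS 11.0 beta name; it was renamed `ARWorldTrackingConfiguration` in iOS 11.3.)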

Dragging SCNNode in ARKit Using SceneKit

六月ゝ 毕业季﹏ submitted on 2019-12-17 17:58:15
Question: I have a simple SCNNode in ARKit, and I am trying to drag it wherever I move my finger on the phone. Here is my code.

```swift
@objc func pan(recognizer: UIGestureRecognizer) {
    guard let currentFrame = self.sceneView.session.currentFrame else { return }
    var translation = matrix_identity_float4x4
    translation.columns.3.z = -1.5
    let sceneView = recognizer.view as! ARSCNView
    let touchLocation = recognizer.location(in: sceneView)
    let hitTestResult = sceneView.hitTest(touchLocation, options: [:])
    if…
```
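One common way to finish this idea, sketched under the assumption that the handler lives in a view controller and the node's depth should stay fixed while dragging: project the node to screen space, keep its projected z, and unproject the new touch location at that depth.

```swift
import ARKit

// Inside a UIViewController that owns the gesture recognizer.
@objc func pan(_ recognizer: UIPanGestureRecognizer) {
    guard let sceneView = recognizer.view as? ARSCNView else { return }
    let location = recognizer.location(in: sceneView)
    guard let hit = sceneView.hitTest(location, options: nil).first else { return }
    let node = hit.node

    // Preserve the node's current distance from the camera (projected z),
    // then move it under the finger at that same depth.
    let projected = sceneView.projectPoint(node.worldPosition)
    let world = sceneView.unprojectPoint(
        SCNVector3(Float(location.x), Float(location.y), projected.z))
    node.worldPosition = world
}
```

Dragging along detected planes instead would use an ARKit hit test against `.existingPlaneUsingExtent` rather than the SceneKit node hit test shown here.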

ARKit – Get current position of ARCamera in a scene

浪尽此生 submitted on 2019-12-17 15:25:24
Question: I'm in the process of learning both ARKit and SceneKit concurrently, and it's been a bit of a challenge. With an ARWorldTrackingSessionConfiguration session created, I was wondering if anyone knew of a way to get the position of the user's 'camera' in the scene session. The idea is that I want to animate an object towards the user's current position.

```swift
let reaperScene = SCNScene(named: "reaper.dae")!
let reaperNode = reaperScene.rootNode.childNode(withName: "reaper", recursively: true)!
reaperNode…
```
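A minimal sketch of reading the camera's world position from the session: the camera pose is a 4×4 transform whose fourth column holds the translation.

```swift
import ARKit

// World-space position of the device camera, or nil before the first frame.
func cameraPosition(in sceneView: ARSCNView) -> SCNVector3? {
    guard let transform = sceneView.session.currentFrame?.camera.transform
        else { return nil }
    let t = transform.columns.3      // translation column of the 4×4 pose
    return SCNVector3(t.x, t.y, t.z)
}
```

For continuous updates (e.g. animating a node toward the user), the same read can be done each frame in `renderer(_:updateAtTime:)` or in `session(_:didUpdate:)`.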