augmented-reality

Adding a material to a ModelEntity programmatically

冷眼眸甩不掉的悲伤 · submitted on 2019-12-19 04:14:14
Question: The docs for RealityKit include the structs OcclusionMaterial, SimpleMaterial, and UnlitMaterial for adding materials to a ModelEntity. Alternatively, you can load in a model with a material already attached to it. I want to add a custom material/texture to a ModelEntity programmatically. How can I achieve this on the fly, without attaching the material to a model in Reality Composer or some other 3D software? Answer 1: Updated: 21st October 2019. As you said, there are 3 types of materials in RealityKit …
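A minimal sketch of the programmatic approach the question asks about, assuming iOS 13-era RealityKit: generate a mesh, build a SimpleMaterial in code, and hand both to the ModelEntity initializer. The box size, color, and roughness values are illustrative, not from the question.

```swift
import RealityKit
import UIKit

// Build a ModelEntity entirely in code: a 10 cm box with a
// metallic blue SimpleMaterial, no Reality Composer involved.
func makeBoxEntity() -> ModelEntity {
    let mesh = MeshResource.generateBox(size: 0.1)
    let material = SimpleMaterial(color: .systemBlue,
                                  roughness: 0.25,
                                  isMetallic: true)
    return ModelEntity(mesh: mesh, materials: [material])
}
```

For an image texture instead of a flat color, the same struct exposes a baseColor parameter that can be set from a TextureResource loaded out of the app bundle.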

ARKit 2.0 – Scanning 3D Object and generating 3D Mesh from it

半城伤御伤魂 · submitted on 2019-12-18 17:36:53
Question: iOS 12 now allows us to create an ARReferenceObject and, using it, reliably recognize the position and orientation of a real-world object. We can also save the finished .arobject file. But: an ARReferenceObject contains only the spatial feature information needed for ARKit to recognize the real-world object, and is not a displayable 3D reconstruction of that object. sceneView.session.createReferenceObject(transform: simd_float4x4, center: simd_float3, extent: simd_float3) { …
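A hedged sketch of the API the excerpt names: capturing an ARReferenceObject from the current scan volume and exporting it as an .arobject file. The transform, center, and extent values here are illustrative placeholders; `sceneView` is assumed to be an ARSCNView whose session runs an ARObjectScanningConfiguration.

```swift
import ARKit

// Capture the scanned object's spatial features and write them to disk.
// Note: the result recognizes the object later; it is not a mesh.
func saveScannedObject(from sceneView: ARSCNView, to url: URL) {
    sceneView.session.createReferenceObject(
        transform: matrix_identity_float4x4,        // scan-volume origin (illustrative)
        center: SIMD3<Float>(0, 0, 0),
        extent: SIMD3<Float>(0.2, 0.2, 0.2)) { referenceObject, error in
        guard let object = referenceObject else {
            print("Scan failed: \(error?.localizedDescription ?? "unknown error")")
            return
        }
        do {
            try object.export(to: url, previewImage: nil)
        } catch {
            print("Export failed: \(error)")
        }
    }
}
```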

Creating a custom SCNGeometry polygon plane with SCNGeometryPrimitiveType polygon crash/error

ε祈祈猫儿з · submitted on 2019-12-18 13:39:10
Question: I'm trying to create a custom SCNGeometry in the form of a plane with a custom shape, which could be placed in an ARKit session. I'm using the option SCNGeometryPrimitiveType.polygon in the following method, which seems to work fine: extension SCNGeometry { static func polygonPlane(vertices: [SCNVector3]) -> SCNGeometry { var indices: [Int32] = [Int32(vertices.count)] var index: Int32 = 0 for _ in vertices { indices.append(index) index += 1 } let vertexSource = SCNGeometrySource(vertices: …
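The excerpt cuts off mid-method. A hedged completion of that extension, assuming a single convex polygon: for SCNGeometryPrimitiveType.polygon the index buffer begins with the polygon's vertex count, followed by the vertex indices, and primitiveCount is the number of polygons (one here).

```swift
import SceneKit

extension SCNGeometry {
    // Build a one-polygon plane from an ordered list of coplanar vertices.
    static func polygonPlane(vertices: [SCNVector3]) -> SCNGeometry {
        // First entry: how many vertices this polygon has; then the indices.
        var indices: [Int32] = [Int32(vertices.count)]
        indices.append(contentsOf: 0..<Int32(vertices.count))

        let vertexSource = SCNGeometrySource(vertices: vertices)
        let indexData = Data(bytes: indices,
                             count: indices.count * MemoryLayout<Int32>.size)
        let element = SCNGeometryElement(data: indexData,
                                         primitiveType: .polygon,
                                         primitiveCount: 1,
                                         bytesPerIndex: MemoryLayout<Int32>.size)
        return SCNGeometry(sources: [vertexSource], elements: [element])
    }
}
```

A common crash with this primitive type comes from an index buffer whose leading count does not match the number of indices that follow, which the construction above rules out.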

Does ARKit 2.0 consider Lens Distortion in iPhone and iPad?

混江龙づ霸主 · submitted on 2019-12-18 12:33:14
Question: ARKit 2.0 updates many intrinsic (and extrinsic) parameters of the ARCamera from frame to frame. I'd like to know whether it also takes radial lens distortion into consideration (as the AVCameraCalibrationData class, which ARKit doesn't use, does), and corrects the video frames' distortion appropriately (distort/undistort operations) for the rear iPhone and iPad cameras. var intrinsics: simd_float3x3 { get } As we all know, radial lens distortion greatly affects 6-DOF pose estimation accuracy when we …
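For reference, a short sketch of reading the per-frame intrinsics the excerpt quotes. ARKit's matrix describes an ideal pinhole model (focal lengths and principal point in pixels); it carries no distortion coefficients, unlike AVCameraCalibrationData's lensDistortionLookupTable.

```swift
import ARKit

// Pull the pinhole parameters out of a frame's 3x3 intrinsics matrix.
// simd_float3x3 is indexed [column][row]: column 0 = (fx, 0, 0),
// column 1 = (0, fy, 0), column 2 = (cx, cy, 1).
func logIntrinsics(of frame: ARFrame) {
    let K = frame.camera.intrinsics
    let fx = K[0][0], fy = K[1][1]    // focal lengths, pixels
    let cx = K[2][0], cy = K[2][1]    // principal point, pixels
    print("fx=\(fx) fy=\(fy) cx=\(cx) cy=\(cy)")
}
```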

Are there any limitations in Vuforia compared to ARCore and ARKit?

删除回忆录丶 · submitted on 2019-12-18 10:29:21
Question: I am a beginner in the field of augmented reality, working on applications that create plans of buildings (floor plan, room plan, etc., with accurate measurements) using a smartphone, so I am researching the best AR SDK for this. There are not many articles pitting Vuforia against ARCore and ARKit. Please suggest the best SDK to use, with the pros and cons of each. Answer 1: Updated: October 30, 2019. TL;DR: Google ARCore allows you to build apps for Android and iOS; with Apple ARKit …

How to begin with augmented reality? [closed]

巧了我就是萌 · submitted on 2019-12-18 09:58:44
Question: Closed. This question needs to be more focused and is not currently accepting answers. Closed 4 years ago. I'm currently a computer science undergrad, and I'll be entering my final year next year. Augmented reality is something I find to be a really interesting topic, but I have no idea where to start learning about it. Where do you start learning about this topic, and what libraries …

ARKit – Drop a shadow of a 3D object on the plane surface

霸气de小男生 · submitted on 2019-12-18 09:30:12
Question: This is the function that I use to display an object on the plane surface. private func loadScene(path: String) -> SCNNode { let spotLight = SCNLight() spotLight.type = SCNLight.LightType.probe spotLight.spotInnerAngle = 30.0 spotLight.spotOuterAngle = 80.0 spotLight.castsShadow = true let result = SCNNode() result.light = spotLight result.position = SCNVector3(-10.0, 20.0, 10.5) result.addChildNode(result) let scene = SCNScene(named: path)! for node in scene.rootNode.childNodes { result …
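A hedged sketch of a shadow-casting light node for a scene like the one above. Two likely problems in the question's excerpt: the light type is .probe (which ignores the spot angles and casts no shadow) where .spot is presumably intended, and `result.addChildNode(result)` adds the node to itself rather than adding the loaded model.

```swift
import SceneKit

// Build a spotlight that actually casts a shadow onto the plane.
// Position and angles mirror the question's values; look(at:) aims
// the cone at the origin, where the model is assumed to sit.
func makeSpotlightNode() -> SCNNode {
    let spotLight = SCNLight()
    spotLight.type = .spot                  // .probe casts no shadow
    spotLight.spotInnerAngle = 30.0
    spotLight.spotOuterAngle = 80.0
    spotLight.castsShadow = true

    let lightNode = SCNNode()
    lightNode.light = spotLight
    lightNode.position = SCNVector3(-10.0, 20.0, 10.5)
    lightNode.look(at: SCNVector3Zero)
    return lightNode
}
```

The loaded model's nodes should then be added to a separate container node (or the scene root), not to the light node itself.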

Load AR-model through button tap in Android

梦想的初衷 · submitted on 2019-12-18 05:11:19
Question: I have made a basic AR app using Unity and Vuforia and exported it to Android, so I can add some activities there. When I scan the image with the AR camera in the app, a model pops up; nothing new there. What I want to achieve is that when a user taps a button in some activity, say a button labeled "Elephant", the AR camera in the app opens, scans the image, and loads a model of an elephant. My question is: is this possible? Is it possible to load a model depending on the user …

ARCore + Unity + Augmented Images - Load different prefabs for different Images

僤鯓⒐⒋嵵緔 · submitted on 2019-12-18 04:29:07
Question: How do I assign different prefabs to different images? Right now, all my prefabs load in on top of each other. How do I get each prefab to load only once, on top of one image, so that each image has a different prefab? I've modified the sample code (the frame corners) to load in my own prefab and used a dictionary to pair the prefabs with images from the database, but when the program runs it instantiates all the prefabs in the same place rather than putting one prefab on each …

OpenCV IOS real-time template matching

自闭症网瘾萝莉.ら · submitted on 2019-12-18 03:48:34
Question: I'd like to create an iPhone app which does this: I have a template image (a logo or any object), and I'd like to find it in the camera view, put a layer on the place where it is found, and track it. It is markerless AR with OpenCV. I read some docs, books, and Q&As here, but sadly I'd actually like to create something like this or something like this. If anyone can send me some source code or a really useful step-by-step tutorial, I'd really be happy! Thank you! Answer 1: …