scenekit

High-Quality Rendering – RealityKit vs SceneKit vs Metal

拟墨画扇 submitted on 2020-08-21 06:48:26
Question: I'm new to iPhone app development, though I have experience in graphics programming in OpenGL. I'm creating an iPhone app in which I intend to display realistic, high-quality renders in AR. While experimenting with these three options, I'm still unsure which of them I should build my app's framework around: SceneKit, RealityKit, or Metal. I've read that SceneKit is built on top of Metal, but I'm not sure whether it's worth the time/effort of programming any custom shaders as opposed …
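
As a rough illustration of the API-surface difference the question is weighing, here is a minimal, hedged sketch (not from the question) of setting up each high-level framework for AR: RealityKit's ARView configures and runs its own session, while SceneKit's ARSCNView needs an explicit ARWorldTrackingConfiguration but exposes SCNMaterial and shader modifiers for custom rendering work. All names used are standard ARKit/RealityKit/SceneKit APIs.

import ARKit
import RealityKit
import SceneKit

// RealityKit: ARView creates and runs a world-tracking AR session automatically.
let realityView = ARView(frame: .zero)

// SceneKit: ARSCNView needs an explicit configuration, but its SCNScene/SCNMaterial
// pipeline is where custom shader modifiers would plug in.
let sceneView = ARSCNView(frame: .zero)
sceneView.scene = SCNScene()
sceneView.session.run(ARWorldTrackingConfiguration())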

Swift: Obtain and save the updated SCNNode over time using projectPoint in SceneKit

与世无争的帅哥 submitted on 2020-08-20 05:14:55
Question: I am trying to use projectPoint to get the 2D coordinates of the updated SCNNode in SceneKit and save them. Based on ignotusverum's suggestion, I am able to save the SCNNode to a path via a button. var lastPosition: CGPoint? func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) { guard anchor == currentFaceAnchor, let contentNode = selectedContentController.contentNode, contentNode.parent == node else { return } for (index, vertex) in vertices.enumerated …
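
A minimal sketch of the projectPoint approach, assuming the class is set as the ARSCNView's delegate and that only the node's own position (not every face vertex, as in the truncated code above) needs tracking; projectPoint maps a world-space point to view-space pixel coordinates, which can be accumulated and written out later.

import ARKit
import SceneKit

// Hedged sketch: project a tracked node's world position into 2D view coordinates
// on every update and accumulate the points so they can be saved later.
final class NodeProjector: NSObject, ARSCNViewDelegate {
    private(set) var savedPositions: [CGPoint] = []

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        // projectPoint takes world-space coordinates and returns view-space x/y plus depth in z.
        let projected = renderer.projectPoint(node.worldPosition)
        let screenPoint = CGPoint(x: CGFloat(projected.x), y: CGFloat(projected.y))
        DispatchQueue.main.async {
            // Persist this array (e.g. write it to disk) when the user taps the save button.
            self.savedPositions.append(screenPoint)
        }
    }
}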

Flip ARFaceAnchor from left-handed to right-handed coordinate system

人盡茶涼 submitted on 2020-08-19 10:47:10
Question: After some testing by printing faceAnchor.transform.columns.3, digging into SO: ARFaceAnchor have negative Z position? and Apple's documentation: ARFaceAnchor, I realized that the z axis is actually flipped and it is not the right-handed coordinate system described in the documentation. The ARFaceAnchor documentation claims the coordinate system is: the positive x direction points to the viewer’s right (that is, the face’s own left), the positive y direction points up (relative to the face itself, not to the world), and …
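
One hedged way to re-express the anchor's transform in a z-negated frame is to conjugate it with a mirror matrix; whether this is the right fix depends on which handedness convention the rest of the pipeline expects. faceAnchor is assumed to come from the usual ARSessionDelegate callbacks.

import ARKit
import simd

// Hedged sketch: express a transform in a coordinate frame whose z axis is negated
// by conjugating it with a mirror matrix (the mirror is its own inverse).
func flippedZ(_ transform: simd_float4x4) -> simd_float4x4 {
    var mirrorZ = matrix_identity_float4x4
    mirrorZ.columns.2.z = -1          // negate the z basis vector
    return mirrorZ * transform * mirrorZ
}

// Usage (assumed context): let converted = flippedZ(faceAnchor.transform)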

What is the real Focal Length of the camera used in RealityKit?

岁酱吖の submitted on 2020-08-19 07:20:24
Question: I am doing this Augmented Reality project starting from Xcode's default AR project. I need to know the focal length of the camera used by ARKit. This page defines focal length well: Focal length, usually represented in millimeters (mm), is the basic description of a photographic lens. It is not a measurement of the actual length of a lens, but a calculation of an optical distance from the point where light rays converge to form a sharp image of an object to the digital sensor or 35mm film at …
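
As a hedged sketch of what ARKit actually exposes: the camera intrinsics give the focal length in pixels rather than millimeters, and converting to mm would additionally require the physical sensor width, which ARKit does not report.

import ARKit

// Hedged sketch: read the pinhole focal length (in pixels) from the current frame's intrinsics.
// In the 3x3 intrinsic matrix, fx sits at [0][0], fy at [1][1], and the principal point in the last column.
func focalLengthInPixels(from frame: ARFrame) -> (fx: Float, fy: Float) {
    let intrinsics = frame.camera.intrinsics
    return (fx: intrinsics[0][0], fy: intrinsics[1][1])
}

// Hypothetical conversion, assuming the physical sensor width were known:
// focalLengthMM = fx * sensorWidthMM / Float(frame.camera.imageResolution.width)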

Sectioning or cutting my 3D Object Horizontally (3D cross section) with ARKit or else in Swift

人盡茶涼 submitted on 2020-08-10 17:44:07
Question: It's my first time working with Augmented Reality and ARKit. I want to know if I am missing something; I've been looking for a solution for a long time. What I would like to do is section my 3D house horizontally so I can see the house floor by floor, or cut it at a given Y height, as shown in these two images from the CAD Exchanger software. I've seen some applications using this tool, but I couldn't find a way to do it. (PS: I have a 3D house model produced in AutoCAD.) Is there …
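
This is not a stock ARKit/SceneKit feature, but a common workaround is a SceneKit fragment shader modifier that discards every fragment above a chosen world-space height, which visually "cuts" the model at that Y level. A minimal, hedged sketch, assuming the model is loaded into an SCNNode called houseNode:

import SceneKit

// Hedged sketch: discard fragments whose world-space y exceeds a tunable clip height.
// _surface.position is in view space, so it is converted back to world space first.
let clipShader = """
#pragma arguments
float clipHeight;

#pragma body
float4 worldPos = scn_frame.inverseViewTransform * float4(_surface.position, 1.0);
if (worldPos.y > clipHeight) {
    discard_fragment();
}
"""

func applyCut(to houseNode: SCNNode, atHeight height: Float) {
    houseNode.enumerateHierarchy { node, _ in
        for material in node.geometry?.materials ?? [] {
            material.shaderModifiers = [.fragment: clipShader]
            material.setValue(height, forKey: "clipHeight")   // raise or lower to reveal each floor
        }
    }
}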

How do I create a looping video material in SceneKit on iOS in Swift 3?

廉价感情. submitted on 2020-08-01 05:08:12
Question: How do I create a material in SceneKit that plays a looping video? Answer 1: It's possible to achieve this in SceneKit by using a SpriteKit scene as the geometry's material. The following example will create a SpriteKit scene, add a video node to it with a video player, make the video player loop, create a SceneKit scene, add a SceneKit plane, and finally set the SpriteKit scene as the plane's diffuse material. import UIKit import SceneKit import SpriteKit import AVFoundation class ViewController: …
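
Since the answer's example code is truncated in this excerpt, here is a hedged sketch of the approach it describes: a looping AVPlayer inside an SKVideoNode, with the SpriteKit scene used as the plane's diffuse contents. The video URL, scene size, and plane dimensions are placeholder assumptions.

import SceneKit
import SpriteKit
import AVFoundation

// Hedged sketch: build a SceneKit plane whose diffuse material is a SpriteKit scene
// containing a video node backed by a looping AVPlayer.
func makeLoopingVideoPlane(videoURL: URL) -> SCNNode {
    let player = AVPlayer(url: videoURL)
    player.actionAtItemEnd = .none
    // Restart playback whenever the item finishes, producing a loop.
    _ = NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime,
                                               object: player.currentItem,
                                               queue: .main) { _ in
        player.seek(to: .zero)
        player.play()
    }

    let videoScene = SKScene(size: CGSize(width: 1280, height: 720))   // assumed video resolution
    let videoNode = SKVideoNode(avPlayer: player)
    videoNode.position = CGPoint(x: videoScene.size.width / 2, y: videoScene.size.height / 2)
    videoNode.size = videoScene.size
    videoNode.yScale = -1    // SpriteKit content often appears flipped when used as a material
    videoNode.play()
    videoScene.addChild(videoNode)

    let plane = SCNPlane(width: 1.6, height: 0.9)                      // assumed plane dimensions
    plane.firstMaterial?.diffuse.contents = videoScene
    plane.firstMaterial?.isDoubleSided = true
    return SCNNode(geometry: plane)
}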

How to write a SceneKit shader modifier for a dissolve-in effect

血红的双手。 submitted on 2020-07-28 06:17:28
Question: I'd like to build a dissolve-in effect for a SceneKit game. I've been looking into shader modifiers since they seem to be the most lightweight option, but I haven't had any luck replicating this effect: Is it possible to use shader modifiers to create this effect? How would you go about implementing one? Answer 1: You can get pretty close to the intended effect with a fragment shader modifier. The basic approach is as follows: sample from a noise texture; if the noise sample is below a certain threshold …
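
A hedged sketch of that outline, assuming a grayscale noise image supplied by the app: the fragment shader modifier samples the noise texture and discards fragments whose noise value falls below a threshold, so animating the threshold from 1 down to 0 dissolves the object in.

import SceneKit
import UIKit

// Hedged sketch: Metal fragment shader modifier implementing the noise-threshold dissolve.
let dissolveShader = """
#pragma arguments
texture2d<float> noiseTexture;
float threshold;

#pragma body
constexpr sampler noiseSampler(filter::linear, address::repeat);
float noise = noiseTexture.sample(noiseSampler, _surface.diffuseTexcoord).r;
if (noise < threshold) {
    discard_fragment();
}
"""

func applyDissolve(to material: SCNMaterial, noiseImage: UIImage) {
    material.shaderModifiers = [.fragment: dissolveShader]
    material.setValue(SCNMaterialProperty(contents: noiseImage), forKey: "noiseTexture")
    material.setValue(1.0, forKey: "threshold")   // animate toward 0 to dissolve the geometry in
}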