High-Quality Rendering – RealityKit vs SceneKit vs Metal

Submitted by 拟墨画扇 on 2020-08-21 06:48:26

Question


I'm new to iPhone app developing, though have experience in graphics programming in OpenGL.

I'm creating an iPhone app in which I intend to display realistic, high-quality renders within AR. While experimenting with these three options, I'm still unsure which of them I should build my app's framework around: SceneKit, RealityKit or Metal.

I've read that SceneKit is built on top of Metal, but I'm not sure whether it's worth the time and effort to program custom shaders as opposed to using what SceneKit offers by default.

Regarding RealityKit, I don't need any of the animations or special effects that it offers, just the photorealistic rendering side.

I am programming in Swift at the moment, and had used Objective-C in the past, but haven't planned to use any in my app.

Which of the three is the best to develop for AR and High-Quality Model Rendering?


Answer 1:


Updated: August 20, 2020.

RealityKit

RealityKit is the youngest SDK in Apple's family of rendering technologies. This high-level framework was released in 2019. RealityKit is made for AR / VR projects, has simplified settings for multi-user experiences and can be used on iOS / macOS. There's no Objective-C legacy: RealityKit supports only Swift, with a rather declarative syntax (as in SwiftUI). The main advantage of RealityKit is that it can complement / change / customise scenes coming from the Reality Composer app and can be a powerful extension for ARKit, although it shines as a standalone AR SDK as well. In RealityKit the main units are entities (ModelEntity, AnchorEntity, TriggerVolume, BodyTrackedEntity, PointLight, SpotLight, DirectionalLight and PerspectiveCamera) that have components and can be created from resources, as ModelEntity can. At the moment RealityKit 2.0 has four materials: SimpleMaterial, UnlitMaterial, OcclusionMaterial and VideoMaterial.

Pay particular attention to shadows on iOS: devices up to and including the A11 chipset produce projective (a.k.a. depth-map) shadows, but on devices with an A12 chip and higher we can see raytraced shadows.

Sample code:

@IBOutlet weak var arView: ARView!

// A red metallic box, half a metre on each side
let box = MeshResource.generateBox(size: 0.5)
let material = SimpleMaterial(color: .red, isMetallic: true)
let model = ModelEntity(mesh: box, materials: [material])

// Anchor the model one metre in front of the world origin
let anchor = AnchorEntity(world: [0, 0, -1])
anchor.addChild(model)

arView.scene.anchors.append(anchor)

RealityKit reads in the .usdz, .rcproject and .reality file formats. It supports asset animation, dynamics, PBR materials, HDR Image-Based Lighting and environmental audio. All scene models must be tethered to anchors. RealityKit 2.0 works with the polygonal mesh generated by the Scene Reconstruction feature. AR Quick Look is built on RealityKit's engine.
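As a quick sketch, loading one of those formats looks like this (the filename "toy" is a placeholder, not a real bundled resource):

```swift
import RealityKit

// Load a bundled .usdz asset synchronously (throws if it's missing).
// "toy" is a hypothetical resource name for illustration only.
let entity = try ModelEntity.loadModel(named: "toy")

// Tether it to a horizontal-plane anchor, as RealityKit requires.
let planeAnchor = AnchorEntity(plane: .horizontal)
planeAnchor.addChild(entity)
arView.scene.anchors.append(planeAnchor)
```

For large assets, the asynchronous `ModelEntity.loadModelAsync(named:)` variant avoids blocking the main thread during loading.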

Conclusion: RealityKit gives you high-quality rendering technology and up-to-date AR capabilities out of the box. It supports the LiDAR Scanner. You can use it alone or with ARKit. RealityKit works with UIKit storyboards or with SwiftUI interfaces, and it has a minimum of boilerplate code. For example, RealityKit has a very simple setup for model collisions. And it favours composition over inheritance, so it's rather a Protocol-Oriented Programming (POP) framework.
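As an illustration of that simple collision setup, a minimal sketch building on the `model` entity from the sample above:

```swift
// Generate collision shapes for the model and all its children,
// so it can participate in physics and hit-testing.
model.generateCollisionShapes(recursive: true)

// Give it a dynamic physics body so gravity and contacts apply.
model.physicsBody = PhysicsBodyComponent(massProperties: .default,
                                         material: .default,
                                         mode: .dynamic)
```

Two lines are enough to make an entity physically interactive, which is noticeably less ceremony than the equivalent SceneKit setup.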

RealityKit's native view is ARView.

@available(OSX 10.15, iOS 13.0, *)
@objc open class ARView : ARViewBase


SceneKit

SceneKit is a high-level framework as well, and the oldest one in Apple's family of rendering technologies: it was released in 2012. SceneKit was conceived for VR and can also be used on iOS / macOS. For AR projects you can use it only in conjunction with ARKit. SceneKit supports both Objective-C and Swift. In SceneKit the main unit is a node (the SCNNode class) that has its own hierarchy and can hold a light (SCNLight), a camera (SCNCamera), a geometry (SCNGeometry), a particle system (SCNParticleSystem), etc. The main advantages of SceneKit: it's highly customisable, it can change geometry and materials at runtime, it renders a scene at 30 to 120 fps, and it has an advanced setup for particle systems. There are Blinn, Constant, Lambert, Phong and PBR shaders. An occlusion material is also available in SceneKit, but only in a custom form (there's no out-of-the-box occlusion material here like the one we find in RealityKit). If you need a video material in an SCNScene, you can use SpriteKit's SKVideoNode.

Sample code:

@IBOutlet weak var sceneView: SCNView!

sceneView.scene = SCNScene()
sceneView.autoenablesDefaultLighting = true

// A red metallic box rendered with the physically based shader
let boxNode = SCNNode()
boxNode.geometry = SCNBox(width: 0.5, height: 0.5, length: 0.5, chamferRadius: 0)
boxNode.geometry?.firstMaterial?.lightingModel = .physicallyBased
boxNode.geometry?.firstMaterial?.diffuse.contents = UIColor.red
boxNode.geometry?.firstMaterial?.metalness.contents = 1.0

sceneView.scene?.rootNode.addChildNode(boxNode)

SceneKit reads in the .usdz, .dae and .scn file formats. It supports nested asset animation, dynamics, particles, PBR materials, HDR IBL and environmental audio. For implicit and explicit transform animation of any node you can use the SCNAction, SCNTransaction and CAAnimation classes. Collision setup in SceneKit, however, is a little more complicated.
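A minimal sketch of both animation styles, building on the `boxNode` from the sample above:

```swift
import SceneKit

// Explicit animation: spin the node continuously,
// one full turn around Y every 4 seconds.
let spin = SCNAction.rotateBy(x: 0, y: .pi * 2, z: 0, duration: 4)
boxNode.runAction(SCNAction.repeatForever(spin))

// Implicit animation: wrap a property change in an SCNTransaction
// and SceneKit interpolates it over the given duration.
SCNTransaction.begin()
SCNTransaction.animationDuration = 1.0
boxNode.position = SCNVector3(0, 0.5, 0)
SCNTransaction.commit()
```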

Conclusion: SceneKit also gives you high-quality rendering technology (though at first you need to set up physically based shaders), but for AR projects you can use it only with ARKit. SceneKit is highly customisable, can be used with Swift and Objective-C, and gives you a set of useful renderer(...) instance methods, coming from the ARSCNViewDelegate protocol, that allow you to update AR models and tracked anchors at 60 fps.

SceneKit's native view is SCNView.

@available(iOS 8.0, tvOS 9.0, *)
open class SCNView : UIView, SCNSceneRenderer, SCNTechniqueSupport 
 
@available(OSX 10.8, *)
open class SCNView : NSView, SCNSceneRenderer, SCNTechniqueSupport 


Metal, MetalKit

To be precise, Metal is not a rendering technology but rather a GPU accelerator. Released in 2014, it's a low-level framework. Metal is used everywhere: in RealityKit, SceneKit, ARKit, CoreML, Vision, AVFoundation, etc. Metal combines functions similar to OpenGL and OpenCL under the hood of just one API.

According to Apple's documentation (strictly speaking, this passage describes the Metal Shading Language): "Metal is a C++ based programming language that developers can use to write code that is executed on the GPU for graphics and general-purpose data-parallel computations. Since Metal is based on C++, developers will find it familiar and easy to use. With Metal, both graphics and compute programs can be written with a single, unified language, which allows tighter integration between the two."

In addition to Metal, you can use the MetalKit module (released in 2015), which helps you build Metal apps quicker and easier using far less code. It renders graphics in a standard Metal view, loads textures from many sources, and works efficiently with models provided by the Model I/O framework.

Sample code:

import MetalKit

// RedCube and CubeScene build on custom Primitive and Scene base
// classes (and a Vertex struct) that wrap the Metal buffer, pipeline
// and draw-call setup – they are not part of MetalKit itself.
class RedCube: Primitive {
    
    override func buildVertices() {
        
        // Eight cube corners, all coloured red
        vertices = [ Vertex(position: float3(-1,1,1),   color: float4(1,0,0,1)),
                     Vertex(position: float3(-1,-1,1),  color: float4(1,0,0,1)),
                     Vertex(position: float3(1,1,1),    color: float4(1,0,0,1)),
                     Vertex(position: float3(1,-1,1),   color: float4(1,0,0,1)),
                     Vertex(position: float3(-1,1,-1),  color: float4(1,0,0,1)),
                     Vertex(position: float3(1,1,-1),   color: float4(1,0,0,1)),
                     Vertex(position: float3(-1,-1,-1), color: float4(1,0,0,1)),
                     Vertex(position: float3(1,-1,-1),  color: float4(1,0,0,1)) ]

        // Twelve triangles (two per cube face) as index triples
        indices = [ 0,1,2, 2,1,3, 5,2,3, 5,3,7, 0,2,4, 2,5,4,
                    0,1,4, 4,1,6, 5,4,6, 5,6,7, 3,1,6, 3,6,7 ]
    }
}

...

class CubeScene: Scene {

    override init(device: MTLDevice) {           
        super.init(device: device)
        
        // Place a red cube 10 units in front of the camera
        let redCube = RedCube(withDevice: device)
        objects.append(redCube)
        redCube.translate(direction: float3(0,0,-10))
        add(child: redCube)
    }
    
    // Rotate every object a little each frame, then let the base
    // class encode the actual draw calls
    override func render(commandEncoder: MTLRenderCommandEncoder, 
                              deltaTime: Float) {

        objects.forEach { $0.rotate(angle: deltaTime, 
                                     axis: float3(1,1,-1)) }

        super.render(commandEncoder: commandEncoder, 
                          deltaTime: deltaTime)
    }
}

Conclusion: developers usually use the Metal framework to generate high-quality GPU rendering for games with sophisticated 3D environments, for video-processing apps like Final Cut Pro and Nuke, for 3D apps like Maya, and for big-data scientific apps that must perform well in scientific research. Note that ray tracing in Metal is much quicker than in RealityKit.

MetalKit's native view is MTKView.

@available(iOS 9.0, tvOS 9.0, *)
open class MTKView : UIView, NSCoding, CALayerDelegate

@available(OSX 10.11, *)
open class MTKView : NSView, NSCoding, CALayerDelegate
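As a hedged sketch of the MetalKit plumbing that custom classes like the ones above sit on top of, here is a minimal delegate that only clears the screen (the class name is illustrative):

```swift
import MetalKit

final class ClearRenderer: NSObject, MTKViewDelegate {
    let commandQueue: MTLCommandQueue

    init(device: MTLDevice) {
        commandQueue = device.makeCommandQueue()!
    }

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) { }

    func draw(in view: MTKView) {
        // Encode an empty render pass that just clears the drawable
        // to the view's clearColor, then present it.
        guard let descriptor = view.currentRenderPassDescriptor,
              let drawable = view.currentDrawable,
              let buffer = commandQueue.makeCommandBuffer(),
              let encoder = buffer.makeRenderCommandEncoder(descriptor: descriptor)
        else { return }
        encoder.endEncoding()
        buffer.present(drawable)
        buffer.commit()
    }
}
```

A real renderer would set up a pipeline state and encode draw calls between `makeRenderCommandEncoder` and `endEncoding`.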


ARKit

ARKit 4.0 has no rendering engine of its own. This module is only responsible for high-quality world tracking and scene understanding (plane detection, ray-casting, scene reconstruction and light estimation).
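A minimal sketch of enabling those tracking features, assuming an `arView` outlet as in the RealityKit sample earlier (feature availability varies by device):

```swift
import ARKit
import RealityKit

let config = ARWorldTrackingConfiguration()
config.planeDetection = [.horizontal, .vertical]

// Scene reconstruction requires a LiDAR-equipped device,
// so check for support before opting in.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    config.sceneReconstruction = .mesh
}

arView.session.run(config)
```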

Here are three types of views ARKit is able to work with: ARSCNView, ARSKView and ARView.

@available(iOS 11.0, *)
open class ARSCNView : SCNView, ARSessionProviding

@available(iOS 11.0, *)
open class ARSKView : SKView, ARSessionProviding

@available(iOS 13.0, *)
@objc open class ARView : ARViewBase

If you need an additional information on ARKit and its capabilities, please read THIS POST.


SpriteKit

SpriteKit is Apple's framework for creating and rendering 2D games and 2D graphics. It was released in 2013. You can use SpriteKit as a standalone API or together with SceneKit and ARKit. Its main feature is the ability to draw sprites with physics, 2D text and shapes, images and video. In SpriteKit you can write code in Objective-C or in Swift.
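A minimal sketch of that sprite-plus-physics workflow (the scene subclass name is illustrative):

```swift
import SpriteKit

final class DemoScene: SKScene {
    override func didMove(to view: SKView) {
        // A red box sprite with a dynamic physics body,
        // so it falls under the scene's gravity.
        let box = SKSpriteNode(color: .red, size: CGSize(width: 40, height: 40))
        box.position = CGPoint(x: frame.midX, y: frame.midY)
        box.physicsBody = SKPhysicsBody(rectangleOf: box.size)
        addChild(box)

        // An edge-loop boundary keeps the sprite inside the scene.
        physicsBody = SKPhysicsBody(edgeLoopFrom: frame)
    }
}
```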

Official documentation: "SpriteKit is a general-purpose 2D framework that leverages Metal to achieve high-performance rendering, while offering a simple programming interface to make it easy to create games and other graphics-intensive apps. Using a rich set of animations and physics behaviours, you can quickly add life to your visual elements and gracefully transition between screens".

SpriteKit works with two native types of view that inherit from UIView and NSView:

@available(iOS 7.0, tvOS 9.0, *)
open class SKView : UIView

@available(OSX 10.9, *)
open class SKView : NSView


Source: https://stackoverflow.com/questions/60505755/high-quality-rendering-realitykit-vs-scenekit-vs-metal
