I tend to use CGFloat all over the place, but I wonder if I take a senseless "performance hit" with this. CGFloat seems to be something "heavier" than Float, right? At what point should I use CGFloat rather than Float or Double? Looking at its declaration:
public struct CGFloat {
    /// The native type used to store the CGFloat, which is Float on
    /// 32-bit architectures and Double on 64-bit architectures.
    public typealias NativeType = Double
    // … remainder of the declaration elided
}
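So on a 64-bit platform CGFloat apparently just wraps a Double. A minimal sketch (assuming a 64-bit Apple platform; the variable names are only illustrative) to check what that means in practice:

import CoreGraphics

let d: Double = 3.14159
let cg = CGFloat(d)     // wraps the Double value
let back = Double(cg)   // unwraps it again

// On a 64-bit architecture both types report the same size (8 bytes).
print(MemoryLayout<CGFloat>.size, MemoryLayout<Double>.size)

// The underlying NativeType value is exposed via the `native` property.
print(cg.native)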