Why does count return different types for Collection vs. Array?

遥遥无期 2021-02-19 09:34

When I'm extending Collection, the type of count is IndexDistance.

When I'm extending the concrete Array type, the type of count is Int. Why the difference?

2 Answers
  •  自闭症患者 2021-02-19 09:47

    From Associated Types in the Swift Programming Language (emphasis added):

    When defining a protocol, it’s sometimes useful to declare one or more associated types as part of the protocol’s definition. An associated type gives a placeholder name to a type that is used as part of the protocol. The actual type to use for that associated type isn’t specified until the protocol is adopted. Associated types are specified with the associatedtype keyword.
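
    As a minimal sketch of this mechanism (the Container protocol and all names here are invented for illustration), an associated type can carry both a constraint and a default:

    protocol Container {
        // A constraint (: Equatable) and a default (= Int),
        // analogous to IndexDistance below.
        associatedtype Item: Equatable = Int
        func contains(_ item: Item) -> Bool
    }

    extension Container {
        // Default implementation, so an adopter need not mention Item at all.
        func contains(_ item: Item) -> Bool { return false }
    }

    // EmptyBag never pins down Item, so it falls back to the default, Int:
    struct EmptyBag: Container {}

    let bag = EmptyBag()
    print(bag.contains(42)) // false; Item is Int via the default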

    In Swift 3/4.0, the Collection protocol defines five associated types (from What’s in a Collection?):

    protocol Collection: Indexable, Sequence {
        associatedtype Iterator: IteratorProtocol = IndexingIterator<Self>
        associatedtype SubSequence: IndexableBase, Sequence = Slice<Self>
        associatedtype Index: Comparable // declared in IndexableBase
        associatedtype IndexDistance: SignedInteger = Int
        associatedtype Indices: IndexableBase, Sequence = DefaultIndices<Self>
        ...
    }
    

    Here

        associatedtype IndexDistance: SignedInteger = Int
    

    is an associated type declaration with a type constraint (: SignedInteger) and a default value (= Int).

    If a type T adopts the protocol and does not otherwise define T.IndexDistance, then T.IndexDistance becomes a type alias for Int. This is the case for many of the standard collection types (such as Array or String), but not for all. For example,

    public struct AnyCollection<Element> : Collection
    

    from the Swift standard library defines

        public typealias IndexDistance = IntMax
    

    which you can verify with

    let ac = AnyCollection([1, 2, 3])
    let cnt = ac.count
    print(type(of: cnt)) // Int64
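
    The same check against a plain Array shows the Int default:

    let arr = [1, 2, 3]
    print(type(of: arr.count)) // Int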
    

    You can also define your own collection type with a non-Int index distance if you like:

    struct MyCollection : Collection {

        // Override the default index distance type.
        typealias IndexDistance = Int16

        var startIndex: Int { return 0 }
        var endIndex: Int { return 3 }

        subscript(position: Int) -> String {
            return "\(position)"
        }

        func index(after i: Int) -> Int {
            return i + 1
        }
    }
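
    A quick check of this type (a sketch along the lines of the AnyCollection test above) shows the custom distance:

    let mc = MyCollection()
    print(type(of: mc.count)) // Int16 (in Swift 3/4.0)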
    

    Therefore, if you extend the concrete type Array then count is an Int:

    extension Array {
        func whatever() {
            let cnt = count // type is `Int`
        }
    }
    

    But in a protocol extension method

    extension Collection {
        func whatever() {
            let cnt = count // some `SignedInteger`
        }
    }
    

    all you know is that cnt has some type conforming to the SignedInteger protocol, which need not be Int. One can still work with the count, of course. Incidentally, the compiler error in

        for index in 0...count { // binary operator '...' cannot be applied to operands of type 'Int' and 'Self.IndexDistance'
    

    is misleading. The integer literal 0 could be inferred as Self.IndexDistance from the context (because SignedInteger conforms to ExpressibleByIntegerLiteral). But a range of a generic SignedInteger is not a Sequence, and that's why it fails to compile.

    So this would work, for example:

    extension Collection {
        func whatever() {
            for i in stride(from: 0, to: count, by: 1) {
                // ...
            }
        }
    }
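
    Alternatively, the distance can be converted to a concrete integer type up front with numericCast, which converts between the standard library's integer types (a sketch; the method name is made up):

    extension Collection {
        func whateverElse() {
            // numericCast converts Self.IndexDistance to Int here.
            let cnt: Int = numericCast(count)
            for i in 0..<cnt {
                // ...
            }
        }
    }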
    

    As of Swift 4.1, IndexDistance is no longer used, and the distance between collection indices is now always expressed as an Int, see

    • SE-0191 Eliminate IndexDistance from Collection

    In particular, the return type of count is Int. There is a type alias

    typealias IndexDistance = Int
    

    to make older code compile, but it is marked as deprecated and will be removed in a future version of Swift.
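
    Accordingly, in Swift 4.1 and later the range-based loop compiles directly (a minimal sketch):

    extension Collection {
        func whatever() {
            // count is now an Int, so a plain range is a Sequence of Int:
            for i in 0..<count {
                // ...
            }
        }
    }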
