Question
In Swift, I can use the ARC mechanism to manage the lifetime of resources external to the process because instances of classes are de-initialized predictably. This is in contrast to environments like the Java Runtime where instances are de-initialized when the garbage collector collects the object, which is not guaranteed to happen in a defined time window.
But what exact guarantees do the Swift language and runtime make about the lifetime of instances that are referenced by local variables? For example, what is the earliest point at which an instance may be deallocated when a local variable holds the only reference to it?
In the following example, I create an instance of a class and store a reference to it in a local variable.
public final class Something {
    init() { print("something.init()") }
    deinit { print("something.deinit()") }
}

func useSomething() {
    let something = Something()
    print("useSomething()")
}
useSomething()
The variable is not used after the point where I print useSomething(), but deinit runs consistently after that call to print():
$ swift run -c release
something.init()
useSomething()
something.deinit()
It seems that references are always decremented at the point where the variable goes out of scope. Wrapping the variable declaration in a do block changes the order:
func useSomething() {
    do { let something = Something() }
    print("useSomething()")
}
$ swift run -c release
something.init()
something.deinit()
useSomething()
Is this order guaranteed or can it change with a different compiler or optimization level?
The reason I'm interested in this is that I want to wrap C APIs in object-oriented Swift APIs and automatically manage the lifetime of resources allocated through those C APIs using Swift classes and reference counting. This works great as long as every use of the C API requires a reference to the resource it operates on, because then I know the Swift instance will live at least until the last call that operates on the resource it represents.
But some APIs use global state to select a resource; subsequent calls do not take a reference to the resource and implicitly operate on the selected one instead. OpenGL's glDrawElements(), for example, implicitly uses maybe 5 or 10 such resources (vertex arrays, shaders, frame buffers, textures, …).
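To make the failure mode concrete, here is a minimal sketch of that pattern. The functions c_create_resource, c_select_resource, c_draw and c_destroy_resource are made-up Swift stubs standing in for a C API with bind-then-use semantics; they are not part of any real library.

// Hypothetical stand-ins for a C API that selects a resource via global state.
var selectedHandle: UInt32 = 0
var nextHandle: UInt32 = 1

func c_create_resource() -> UInt32 {
    defer { nextHandle += 1 }
    return nextHandle
}
func c_select_resource(_ handle: UInt32) { selectedHandle = handle }
func c_destroy_resource(_ handle: UInt32) { if selectedHandle == handle { selectedHandle = 0 } }
func c_draw() { precondition(selectedHandle != 0, "draw called with no resource selected") }

final class Resource {
    let handle = c_create_resource()             // allocate the underlying C resource
    func select() { c_select_resource(handle) }  // put the handle into the API's global state
    deinit { c_destroy_resource(handle) }        // free the C resource when ARC releases the wrapper
}

func render() {
    let resource = Resource()
    resource.select()
    // `resource` is not referenced below this line, so the optimizer is free to
    // release it here, run deinit, and destroy the C resource before c_draw()
    // uses the globally selected handle.
    c_draw()
}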
Answer 1:
Swift does not guarantee that an object stays alive until the end of its closest surrounding scope; see, for example, the following threads in the Swift Forums:
- Should Swift apply “statement scope” for ARC
- ARC // Precise Lifetime Semantics
where it is stated that withExtendedLifetime(_:_:) can be used for that purpose:
Evaluates a closure while ensuring that the given instance is not destroyed before the closure returns.
As for the rationale, Dave Abrahams (Apple) states:
The lack of such a guarantee, which is very seldom actually useful anyhow, is what allows us to turn costly copies (with associated refcount traffic and, often CoW allocation and copying fallout) into moves, which are practically free. Adopting it would basically kill our performance story for CoW.
And Joe Groff (Apple) in the same thread:
Yeah, if you want to vend resources managed by an object to consumers outside of that object like this, you need to use withExtendedLifetime to keep the object alive for as long as you're using the resources. A cleaner way to model this might be to put the class or protocol in control of handling the I/O to the file handle, instead of vending the file handle itself, so that the ownership semantics fall out more naturally.
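Applied to the hypothetical Resource wrapper sketched in the question above, a corrected version of its render() function could keep the Swift instance alive across the implicit C calls with withExtendedLifetime(_:_:) like this:

func render() {
    let resource = Resource()
    withExtendedLifetime(resource) {
        resource.select()
        // `resource` cannot be released before this closure returns, so its
        // deinit (and the destruction of the C resource) cannot run before
        // c_draw() has used the globally selected handle.
        c_draw()
    }
    // After the closure returns, `resource` may be released at any point.
}

Following Joe Groff's suggestion, an alternative is to let the wrapper itself scope the implicit uses, e.g. a hypothetical resource.whileSelected { c_draw() } method that calls withExtendedLifetime(self) internally, so callers never have to reason about the wrapper's lifetime at all.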
Source: https://stackoverflow.com/questions/48974241/guarantees-about-the-lifetime-of-a-reference-in-a-local-variable