Some languages (like D and Zig) distinguish between "error defer" and "success defer", which solves the issue above.
Not as automatic as RAII, but better than the single defer (which sucks).
EDIT: My personal preference is actually something like Python's "with" or C#'s "using", since they rely on "destructors" rather than explicit calls to close. But they have the exact same issue as simple defer. If you ever need to store or return the thing, you need to handle it manually instead.
In my own usage I almost never keep stuff open for long or store it in complex data structures where this is an issue in practice, simple "using" is almost always enough. Rust and C++ need RAII because they also manage memory like any other resource, otherwise it would get incredibly painful.
(Zig uses arena and pool allocators to manage memory, if you had to manage all memory with defers I think people would go nuts)
The issue I run into with Python's "with" is that you can't turn a class into a context manager without changing the calling scope. Suppose you have a class that initializes itself from a file, reads the entire file into memory, then accesses a few bytes here and there of the contents as needed. Somebody rightfully notices that the memory usage is way too high, so you instead open the file, and on demand seek to the location and read those few bytes.
The class now stores a file handle, which it should close. So the class should be changed to a context manager. So every call site needs to change from "x = MyFoo()" to "with MyFoo() as x:" in order to work correctly. With RAII, everything is already lifetimes, so the calling scope doesn't need to change if you add a destructor.
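A minimal sketch of that change (MyFoo and read_stuff are the hypothetical names from the comment, not a real API): the class now keeps a handle and seeks on demand, so it grows __enter__/__exit__ and every call site must switch from "x = MyFoo(path)" to "with MyFoo(path) as x:".

```python
import os
import tempfile

class MyFoo:
    def __init__(self, path):
        # handle stays open for the object's lifetime, so it must be closed
        self._f = open(path, "rb")

    def read_stuff(self, offset, n):
        # seek and read a few bytes on demand instead of caching the file
        self._f.seek(offset)
        return self._f.read(n)

    # the context-manager protocol the class now needs
    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        self._f.close()
        return False  # don't suppress exceptions

# demo with a throwaway file
fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "wb") as out:
    out.write(b"hello world")

with MyFoo(path) as x:        # call site had to change to "with ... as"
    chunk = x.read_stuff(6, 5)
os.remove(path)
```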
I commented something relating to this in another post, but since in Python/C# the resource and the container of the resource don't actually have tied lifetimes, you can do a different pattern where MyFoo() is a perfectly normal class, with a method within it that actually creates the context manager that loads the resource temporarily.
If you do this, you can move and store MyFoo's around, but in order to access the resource without getting an exception to the face you need to use with, e.g.
myfoo = MyFoo()
# the actual resource is only acquired for the duration of the with block
with myfoo.acquire():
    myfoo.read_stuff()
myfoos = [myfoo]
# later
with myfoos[0].acquire():
    myfoo.read_stuff()
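One way the acquire() pattern above could be sketched (MyFoo, acquire, and read_stuff are the hypothetical names from this thread): the object remembers only the path, and the OS handle exists solely inside the with-block.

```python
import contextlib
import os
import tempfile

class MyFoo:
    def __init__(self, path):
        self._path = path
        self._f = None  # no handle held between acquires

    @contextlib.contextmanager
    def acquire(self):
        # the resource lives only for the duration of the with-block
        self._f = open(self._path, "rb")
        try:
            yield self
        finally:
            self._f.close()
            self._f = None

    def read_stuff(self):
        if self._f is None:
            # the "exception to the face" when used outside a with-block
            raise RuntimeError("call inside 'with myfoo.acquire():'")
        return self._f.read()

# demo with a throwaway file
fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "wb") as out:
    out.write(b"data")

myfoo = MyFoo(path)           # perfectly normal object, safe to store/move
with myfoo.acquire():
    data = myfoo.read_stuff()
os.remove(path)
```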
Lifetimes are loose in GCed languages, so you shouldn't really think of an object's lifetime and the resource's lifetime as tied in any way, nor should APIs be designed with that assumption, even if the object represents the resource (confusing, I know, especially since the languages themselves get this wrong).
> My personal preference is actually something like Python's "with" or C#'s "using", since they rely on "destructors" rather than explicit calls to close. But they have the exact same issue as simple defer. If you ever need to store or return the thing, you need to handle it manually instead.
Python's "with" doesn't. That is to say, when the with-block closes, it doesn't call any kind of destructor against the variable. It calls the __exit__ method, but it still leaves the variable in-scope. And if the __exit__ method tries to call an actual destructor against the variable, you'll hit a double-free bug when the GC runs later.
You can safely store the variable and come back to it after the block, and do anything it supports after closing, including returning it:
def retIOWrapper(x):
    with open(x) as f:
        pass
    assert(f.closed == True)
    return f
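A runnable version of that point: after the with-block, f is still a live, in-scope object; only the underlying resource has been released, so actual I/O fails (CPython raises ValueError for I/O on a closed file).

```python
import os
import tempfile

# throwaway file to open and close
fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "w") as out:
    out.write("hi")

with open(path) as f:
    pass

assert f.closed is True  # the object survived; the resource did not
try:
    f.read()
    raised = False
except ValueError:       # "I/O operation on closed file"
    raised = True
os.remove(path)
```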
C#'s "using" also doesn't call the destructor; it calls .Dispose().
C# and Python have actual destructors that "nobody" uses because they are GCed languages, which makes this discussion confusing (well, in Python people do use them sometimes, because it is reference counted and therefore more predictable, making the discussion even more confusing).
I'm treating __exit__() and .Dispose() as "destructors" here, in the sense that they are implicitly called methods that free the resource (like drop in Rust), vs. deferring a call to close, which is explicit and "configurable" (you have to call the right method).
You get to keep the variable around, but the resource itself is gone after the with/using block, you have to reacquire it. With some changes to the way the APIs were written, it would make a bit more sense (IMO):
files = [File("a.txt"), File("b.txt"), File("c.txt")]
for f in files:
    with f.open('r'):
        print(f.read())
return files
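A sketch of what that imagined API could look like (File here is hypothetical, not Python's actual io module): the File object stores only a path, and its open() hands out a context manager that holds the real handle for the duration of the block.

```python
import contextlib
import os
import tempfile

class File:
    def __init__(self, path):
        self.path = path
        self._f = None  # the File itself holds no OS handle

    @contextlib.contextmanager
    def open(self, mode="r"):
        # the handle exists only inside the with-block
        self._f = open(self.path, mode)
        try:
            yield self
        finally:
            self._f.close()
            self._f = None

    def read(self):
        return self._f.read()

# demo: create three throwaway files containing their own labels
paths = []
for label in ("a.txt", "b.txt", "c.txt"):
    fd, path = tempfile.mkstemp()
    os.close(fd)
    with open(path, "w") as out:
        out.write(label)
    paths.append(path)

files = [File(p) for p in paths]  # safe to store: no handles are open
contents = []
for f in files:
    with f.open('r'):
        contents.append(f.read())
for p in paths:
    os.remove(p)
```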
I personally really dislike the fact that "as f" is not scoped to the with-block.
This can sometimes be useful, though I haven't yet decided if I think it's a dirty hack or not. At one point, I made a timer whose __enter__ method started the timer, and whose __exit__ method stopped the timer. After the "with", the elapsed time could be queried.
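A sketch of that timer (class and attribute names assumed): __enter__ starts the clock, __exit__ stops it, and the elapsed time remains queryable afterwards precisely because "with" leaves the variable in scope.

```python
import time

class Timer:
    def __enter__(self):
        self._start = time.perf_counter()
        return self

    def __exit__(self, exc_type, exc, tb):
        # record the measurement; runs even if the block raised
        self.elapsed = time.perf_counter() - self._start
        return False  # don't suppress exceptions

with Timer() as t:
    time.sleep(0.01)

# t survives the block; the elapsed time can be read after the "with"
assert t.elapsed > 0
```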