I like the philosophy of Rust (and some other languages) of "safe by default". Rust variables are immutable by default, but can be made mutable using "let mut". Rust is memory safe by default, but can be made unsafe using the (drumroll) "unsafe" keyword. As for null references, some other languages, such as Kotlin and TypeScript, still have them but enforce strict null checking: they force you to verify a reference is not null before using it, without the verbosity of an Optional or Maybe type.
Functional programming is great, but it's far from optimal in many situations. For example, try implementing a self-balancing binary search tree in a common imperative language with mutability, then try implementing it again using pure functional programming. It's certainly possible, but it requires a lot more work when you're not allowed mutability.
I'd argue that FP is actually great for working with tree-like data structures. For example, the common implementations for sets and maps are based on balanced trees. IME it's graph-like or array-based stuff where the paradigm struggles.
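To make that concrete: Haskell's Data.Map, for instance, is implemented as a balanced binary tree, and updates are non-destructive, so the old version stays valid. A minimal usage sketch:

    import qualified Data.Map as Map

    main :: IO ()
    main = do
      let m1 = Map.fromList [(1, "a"), (2, "b")]
          m2 = Map.insert 3 "c" m1   -- builds a new map by path copying; m1 is untouched
      print (Map.toList m1)          -- [(1,"a"),(2,"b")]
      print (Map.toList m2)          -- [(1,"a"),(2,"b"),(3,"c")]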
I don't disagree that manipulating deep arrays and graph data can be harder in an immutable functional language, at least out of the box. However, many FP languages have powerful and mature libraries that let you work with deep data structures very gracefully and correctly. Clojure has Specter, Haskell has lenses and the like.
For me, the FP dream (and we aren't quite there yet) is that your program is an airtight declaration of the desired functionality, and then we get out of the way and let the compilers and profiling runtimes optimize away.
How cool would it be if runtime stats could let the JIT either unroll and inline a function when it's called on small arrays from network buffers, or push it to the GPU when it's frequently called on huge in-RAM buffers? We're not there yet, but as the tower of complex tooling needed to work in a compute-parallel world keeps growing, we are going to rely more and more on the compiler/runtime to use that power optimally.
> For me, the FP dream (and we aren't quite there yet) is that your program is an airtight declaration of the desired functionality, and then we get out of the way and let the compilers and profiling runtimes optimize away.
I would guess we are at least 100 years away from this. Current imperative compilers miss obvious optimizations, and FP typically needs many layers of optimization before it becomes efficient. Any one layer failing to make progress results in a very slow program.
Currently, FP compilers generally don't even represent natural numbers as successors (Peano numerals), because it would be too slow; they use machine integers instead. Typical programs use other data structures, more complex than numbers, that would need to be optimized into mutable structures. If FP compilers can't do that transformation even for integers, they definitely can't do it for FP programs as a whole.
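To illustrate what "using successors" would mean (a textbook Peano encoding, not any particular compiler's representation):

    -- Natural numbers as successor chains (Peano numerals).
    data Nat = Z | S Nat

    -- Addition has to walk the whole first argument, so it is O(n)
    -- in its size, versus a single machine instruction for an Int add.
    add :: Nat -> Nat -> Nat
    add Z     m = m
    add (S n) m = S (add n m)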
Great for reading trees, but not for cases where nodes need to be added or removed. There's a "zipper tree" structure for that, but it's kind of a pain to implement:
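Roughly, a minimal binary-tree zipper looks like this in Haskell (a sketch, not taken from any particular library):

    -- A binary tree plus a zipper: the focused subtree and the path back up.
    data Tree a = Leaf | Node (Tree a) a (Tree a)

    data Crumb a = WentLeft  a (Tree a)  -- remember the value and the right sibling
                 | WentRight a (Tree a)  -- remember the value and the left sibling

    type Zipper a = (Tree a, [Crumb a])

    goLeft, goRight, goUp :: Zipper a -> Maybe (Zipper a)
    goLeft  (Node l x r, bs) = Just (l, WentLeft  x r : bs)
    goLeft  _                = Nothing
    goRight (Node l x r, bs) = Just (r, WentRight x l : bs)
    goRight _                = Nothing
    goUp (t, WentLeft  x r : bs) = Just (Node t x r, bs)
    goUp (t, WentRight x l : bs) = Just (Node l x t, bs)
    goUp (_, [])                 = Nothing

    -- Swap out the focused subtree, e.g. to add or remove a node locally.
    modify :: (Tree a -> Tree a) -> Zipper a -> Zipper a
    modify f (t, bs) = (f t, bs)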
That's what I meant: you can mostly use tree-like log(n) data structures rather nicely, but you'll have to isolate your mutating code into stuff like PrimMonad, see:
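As a rough sketch of what that isolation looks like (using ST here, which PrimMonad generalizes over): the mutation stays confined to runST, and the result is an ordinary pure value.

    import Control.Monad.ST (runST)
    import Data.STRef (newSTRef, modifySTRef', readSTRef)

    -- Internally mutable, externally pure.
    sumMutably :: [Int] -> Int
    sumMutably xs = runST $ do
      acc <- newSTRef 0
      mapM_ (\x -> modifySTRef' acc (+ x)) xs
      readSTRef acc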
Actually, implementing an AVL tree in a purely functional way is easy; I'd say it's easier than doing it the mutable, in-place way. It will allocate more, but it still has the optimal O(log n) complexity per operation.
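For what it's worth, insertion looks roughly like this in Haskell (a sketch of the textbook algorithm, not production code): each node caches its height, and insert only rebuilds the O(log n) nodes on the path down to the new leaf.

    data AVL a = Empty | Node Int (AVL a) a (AVL a)

    height :: AVL a -> Int
    height Empty          = 0
    height (Node h _ _ _) = h

    -- Smart constructor that recomputes the cached height.
    node :: AVL a -> a -> AVL a -> AVL a
    node l x r = Node (1 + max (height l) (height r)) l x r

    balance :: AVL a -> Int
    balance Empty          = 0
    balance (Node _ l _ r) = height l - height r

    rotateLeft, rotateRight :: AVL a -> AVL a
    rotateLeft  (Node _ l x (Node _ rl y rr)) = node (node l x rl) y rr
    rotateLeft  t                             = t
    rotateRight (Node _ (Node _ ll y lr) x r) = node ll y (node lr x r)
    rotateRight t                             = t

    -- Restore the AVL invariant at one node (the four classic cases).
    rebalance :: AVL a -> AVL a
    rebalance t@(Node _ l x r)
      | balance t > 1  = if balance l >= 0 then rotateRight t
                         else rotateRight (node (rotateLeft l) x r)
      | balance t < -1 = if balance r <= 0 then rotateLeft t
                         else rotateLeft (node l x (rotateRight r))
    rebalance t = t

    insert :: Ord a => a -> AVL a -> AVL a
    insert x Empty = node Empty x Empty
    insert x t@(Node _ l y r)
      | x < y     = rebalance (node (insert x l) y r)
      | x > y     = rebalance (node l y (insert x r))
      | otherwise = t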
Many other algorithms are much harder, though, especially those requiring fast array indexing (graph search, hash tables, ...)
A better example would be to implement an efficient HashMap in pure FP. This is simply impossible unless you fall back to some tree-based solution, which is less efficient.
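For context, the tree-based fallback is what Haskell's unordered-containers package actually does: Data.HashMap is a hash array mapped trie, so lookups and inserts are O(log n) with a wide branching factor ("effectively constant", but not true O(1)). A minimal usage sketch:

    import qualified Data.HashMap.Strict as HM

    main :: IO ()
    main = do
      let m1 = HM.fromList [("a", 1), ("b", 2)] :: HM.HashMap String Int
          m2 = HM.insert "c" 3 m1   -- copies an O(log n) path through the trie; m1 is unchanged
      print (HM.lookup "b" m2)      -- Just 2
      print (HM.lookup "c" m1)      -- Nothing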
Interested why this is downvoted... Hashmaps can have O(1) time complexity for lookups and inserts in the imperative world; in pure FP, either lookup or insert can be no better than O(log n).
No, this is not the case. It is actually for non-FP solutions that the amortized performance converges to constant time. For pure FP, there is no solution that performs in constant time, not even amortized.