Hacker News

> Huh. I find that when my C++ compiles, it almost always works. Correctly.
>
> But I used the language for nearly 20 years, so maybe it's experience and the patterns I've learned?

By contrast, I learnt OCaml and Haskell 2 weeks ago, and when my code in those languages compiles, it always works. I expect I'll have the same experience with Rust.




Funny. I managed to get Haskell to crash within about 20 minutes of first trying it out.

OCaml is supposed to be pretty fast, but so far Haskell hasn't impressed me with performance either.


What code did you write?


I wrote some code that, written iteratively, would just have taken a while to run, but because Haskell wasn't doing the tail-call elimination I expected, it ran out of stack space.
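For what it's worth, the classic way this bites newcomers (not necessarily the exact code above) is a lazy left fold: GHC does eliminate tail calls, but a lazy accumulator builds a chain of unevaluated thunks that overflows the stack when it's finally forced. A minimal sketch:

```haskell
import Data.List (foldl')

-- Lazy foldl builds ten million unevaluated (+) thunks before
-- forcing any of them, which can blow the stack (especially
-- without -O):
lazySum :: Integer
lazySum = foldl (+) 0 [1 .. 10000000]

-- foldl' forces the accumulator at each step, so the same fold
-- runs in constant stack space:
strictSum :: Integer
strictSum = foldl' (+) 0 [1 .. 10000000]

main :: IO ()
main = print strictSum
```

Same algorithm, same recursion; the only difference is when the accumulator gets evaluated.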

I also read a paper that, if I remember correctly, described extensive research into optimizing a convolutional network algorithm in Haskell. The algorithm even used mutable data structures for speed (because of course you need data to be mutable for speed). Roughly speaking, running on 16 CPUs it was still 4x slower than the C version running on one CPU. And Amdahl's Law [1] (as well as the graph in the article) hints that it may never actually be faster than the single-CPU C version, no matter how many CPUs you split it across, because of the overhead of sending the data around.

When people claim "immutability makes it easier to run on more threads, which is the future of optimization!" I just want to cringe. Immutability kills most performance enhancements you can make on nontrivial code, even on a single thread; having all your data be immutable does nothing by itself to improve threading; and forcing all data transferred to another thread to be immutable kills a whole category of optimizations.

When dealing with large amounts of data, you save tons of time by not copying it around.

/rant

Sorry. I've been listening to too many people talk about Haskell like it's an amazing silver bullet that will solve all of our problems. But There Is No Silver Bullet. Really.

[1] https://en.wikipedia.org/wiki/Amdahl's_law


Well, FWIW, that's not the primary reason I like those languages. It's the safety.

After some experience with immutable data structures, you learn how to make them efficient. For example, most of the time you don't need to copy that much data: if you need to change a small part of a large data structure, you arrange it so you can reuse (re-point to) the parts you didn't change. On average you should be able to get the cost down to O(log n), which is usually good enough.
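As a sketch of that sharing, Haskell's Data.Map (from the containers library) is a balanced tree where an insert copies only the O(log n) path from the root to the changed key; everything else is shared, and both the old and new versions stay usable:

```haskell
import qualified Data.Map.Strict as M

main :: IO ()
main = do
  -- A largish persistent map.
  let big  = M.fromList [(k, k * 2) | k <- [1 .. 100000 :: Int]]
      -- "Update" key 42: only the root-to-key path is rebuilt;
      -- the rest of the tree is shared with 'big'.
      big' = M.insert 42 0 big
  print (M.lookup 42 big)   -- old version untouched
  print (M.lookup 42 big')  -- new version sees the change
```

No full copy of 100,000 entries ever happens; that's the O(log n) trade mentioned above.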

That is the correct trade-off IMO: have a newbie start off writing safe code that gets quicker and less memory-intensive with experience. Not start off blazing quick but unsafe, then add the safety with experience.

Did you know Rust (which defaults to immutable) is currently head-to-head with C in Debian's "fastest languages" comparison?

https://benchmarksgame.alioth.debian.org/u64q/rust.html


Haskell is beautiful, including the syntax and type system, but the performance claims made for it are ridiculous.

I switched to OCaml to avoid the unevaluated-thunk and space-complexity overhead.

OCaml also doesn't compete with well-written C or C++, but it's a lot more performant than Haskell for general-purpose code.


Something like `main = head []` probably, lol.
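For what it's worth, that one-liner really does typecheck (`head []` unifies with `IO ()`), so it compiles cleanly and only dies at runtime:

```haskell
-- Accepted by the compiler; at runtime GHC aborts with an
-- exception along the lines of "Prelude.head: empty list".
main :: IO ()
main = head []
```

A nice reminder that "it compiles, so it works" has exceptions even in Haskell, partiality being the usual one.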



