
Because it requires a non-trivial amount of static analysis work on the compiler side to make the UX comfortable. Rust's borrow checker mechanism took years to become pleasant for the end user to use. There is still next to no adoption of Ada outside of aerospace, defence, and certain embedded programming sectors.



> Because it requires a non-trivial amount of static analysis work on the compiler side to make the UX comfortable.

How so?

Any modern language that supports operator overloading lets you build a half-arsed version yourself: create a struct/class for each subtype of integer you want (e.g. `milliseconds`), define binary operators that take your class on both sides, and add the bounds checking yourself. As a bonus, this also stops "primitive" types from being added and subtracted willy-nilly.

As I ranted about below, a simple "milliseconds" type would make life easier all over the place. Who hasn't mixed up an API call where part of it takes seconds and another part takes milliseconds?

That sort of crud should be easy for a simple subtyping system to catch.
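
To make that concrete, here's a rough sketch of that pattern in Rust (the Milliseconds wrapper, the one-day cap, and set_timeout are all made-up names for illustration, not any real API):

    use std::ops::Add;

    #[derive(Debug, Clone, Copy)]
    struct Milliseconds(u64);

    impl Milliseconds {
        const MAX_MS: u64 = 86_400_000; // illustrative cap: one day

        fn new(value: u64) -> Option<Milliseconds> {
            if value <= Self::MAX_MS { Some(Milliseconds(value)) } else { None }
        }
    }

    // Only Milliseconds + Milliseconds is defined; adding a bare u64
    // (or a hypothetical Seconds wrapper) is a compile error.
    impl Add for Milliseconds {
        type Output = Option<Milliseconds>;
        fn add(self, rhs: Milliseconds) -> Option<Milliseconds> {
            Milliseconds::new(self.0 + rhs.0) // re-check the bound on every operation
        }
    }

    // Hypothetical API: the signature alone documents and enforces the unit.
    fn set_timeout(delay: Milliseconds) {
        println!("waiting {} ms", delay.0);
    }

    fn main() {
        let a = Milliseconds::new(1_500).unwrap();
        let b = Milliseconds::new(250).unwrap();
        println!("{:?}", a + b);        // Some(Milliseconds(1750))

        set_timeout(a);                 // fine
        // set_timeout(1_500);          // compile error: expected Milliseconds, found integer
        // set_timeout(a + b);          // compile error: addition returns Option, forcing you to handle the bound
    }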


So, as with 99 out of 100 questions, the answer is that it requires a lot of extra time and effort.

Getting those guarantees at compile time is very, very, very hard. Even Ada's SPARK struggles to verify seemingly simple stuff [1, 2]. It is much easier if you trade compile-time proof for runtime verification, which you can do in C# (but not Mono) or via annotations in code comments in Java [3].
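
As a loose illustration of that trade-off (in Rust rather than C# or Java, with a made-up average function): the kind of precondition a static prover tries to discharge ahead of time becomes a check that only fires when the program runs:

    fn average(samples: &[f64]) -> f64 {
        // Precondition checked at runtime, not proved at compile time.
        assert!(!samples.is_empty(), "precondition violated: samples must be non-empty");
        samples.iter().sum::<f64>() / samples.len() as f64
    }

    fn main() {
        println!("{}", average(&[1.0, 2.0, 3.0])); // 2
        // average(&[]) compiles fine but panics at runtime;
        // a prover like SPARK aims to reject that call before the program ever runs.
    }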

WRT subtypes: language designers have plenty of motivation to try and push every non-core feature into a library. Programming in Ada kinda sucks because there are soooo many reserved keywords. You are also stuck with bad design decisions forever, like with Java's Date API [4]. Since it's relatively easy to use the newtype pattern to achieve what you want, it probably doesn't factor into the core language design.

Which is to say that I largely agree with you, but the people paying for new languages have a different set of priorities. Making a cat video database go faster is very lucrative, but it's only with the rise of cryptocurrencies that there has been significant funding for high-assurance programming languages outside of defense and aerospace.

[1]: https://www.imperialviolet.org/2014/09/07/provers.html

[2]: https://blog.adacore.com/using-spark-to-prove-255-bit-intege...

[3]: https://www.openjml.org/

[4]: https://stackoverflow.com/questions/1571265/why-is-the-java-...


Thanks for the awesome answer, with citations!

As someone who has to do day-to-day programming in languages with minimal concept of safety (e.g. JavaScript, C), my concern isn't provable correctness; it's "make my life easier by covering the 90% case."

Number of times I've needed to work with 255 bit integers: 0.

Number of times I've confused units of time passing into APIs: way too many.

Heck, on one system I was working on, one API took distance in thousandths of an inch. Everything else was in metric.

Fixing that sort of stuff could be done w/o great trouble, and it'd be a huge boon to everyday programmers.

Would it be a sexy language feature? No. But it'd make life better.
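
For what it's worth, a rough sketch of what that fix could look like with the same newtype trick (the Mils/Millimetres types and move_head are invented for illustration; 1 mil = 0.0254 mm):

    // Two distinct length types; conversion is explicit, never implicit.
    #[derive(Clone, Copy)]
    struct Mils(f64);          // thousandths of an inch
    #[derive(Clone, Copy)]
    struct Millimetres(f64);

    impl From<Mils> for Millimetres {
        fn from(m: Mils) -> Millimetres {
            Millimetres(m.0 * 0.0254)   // 1 mil = 0.0254 mm
        }
    }

    // Hypothetical API that only accepts metric.
    fn move_head(distance: Millimetres) {
        println!("moving {} mm", distance.0);
    }

    fn main() {
        let legacy_reading = Mils(500.0);
        move_head(legacy_reading.into());   // explicit conversion required
        // move_head(legacy_reading);       // compile error: expected Millimetres, found Mils
    }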



