It's like they are ashamed of the language they created and would rather hide it than display it.
What's worse is that most languages that don't show code samples on their front page also usually don't seem to have easy to find example code. Language tutorials/references are very bad at giving you a feel of what a language is like because they don't give a complete program that shows off how features interact (usually they show features in isolation) and I don't want to have to dig through your code repository to hopefully find some sample code...
Eiffel, for example, a language that nobody uses but that people like me love to discuss in OO-related debates, has much simpler and better support for multiple inheritance: if there's any clash of methods (name + signature) from different parents, you must choose which one the child class gets or it won't compile.
I'm not sure what Nit does, but it might not be as brain-dead as what C++ does.
Finally, I prefer composition over inheritance too, but on the other hand many of my React components have multiple mixins. I couldn't do that if I had used ES6 classes. Given that Nit seems targeted at UIs (Android, Obj-C support), I guess it makes a lot of sense.
How it's implemented makes the difference. I'd say the general consensus is that C++'s version of it is probably the worst (although to be fair, as much as I dislike C++, its implementation of MI has never really bothered me much in practice).
Modern languages support multiple inheritance of implementations with traits, which have a few additional restrictions compared to classes. These restrictions address some of the flaws that MI can have.
The bottom line is that you will rarely need MI and when you do, its flaws will probably not be much of a hindrance, so overall, it's a pretty overblown problem.
I recall reading OOSC, where Meyer showed that MI in Eiffel is fine, and it basically boiled down to: when there is an ambiguity, force the child class to rename the conflicting methods/redefine them, which is very close to what the original traits paper suggests (bar state in traits, IIRC).
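A loose Python analogue of that Eiffel-style renaming (class names here are invented for illustration): where Python would silently resolve a diamond conflict via its method resolution order, the child can instead make the choice explicit by aliasing each parent's method.

```python
class Printer:
    def run(self):
        return "printing"

class Scanner:
    def run(self):
        return "scanning"

class Copier(Printer, Scanner):
    # Python silently picks Printer.run via the MRO; an Eiffel-style
    # language would force the child to rename or choose explicitly.
    # We can emulate that renaming by aliasing each parent's method:
    print_doc = Printer.run
    scan_doc = Scanner.run

c = Copier()
assert c.print_doc() == "printing"
assert c.scan_doc() == "scanning"
```

Of course, nothing in Python forces you to do this, which is exactly the difference being discussed.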
This is supposedly used by Python, by recent versions of Dylan, and by Perl 6. I've worked in projects that used extensive multiple inheritance and superclass linearization, and it's not bad at all. Now, that doesn't mean that inheritance is always an optimal design—composition and traits are both extremely useful—but multiple inheritance can work fairly well.
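Python's C3 superclass linearization in a minimal diamond (class names invented): every lookup gets a single, deterministic answer.

```python
class A:
    def who(self):
        return "A"

class B(A):
    def who(self):
        return "B"

class C(A):
    def who(self):
        return "C"

class D(B, C):   # diamond: D inherits A twice, via B and via C
    pass

# C3 linearization flattens the hierarchy into one deterministic order,
# so a method lookup is never ambiguous:
assert [k.__name__ for k in D.__mro__] == ["D", "B", "C", "A", "object"]
assert D().who() == "B"   # first match along the MRO
```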
As opposed to all the other languages that aim at ugliness, complexity and the principle of most surprise?
There are languages whose designers don't consider elegance or simplicity a main priority, e.g., C++.
It doesn't make sense to talk about intuitiveness without reference to a specific person's intuitions. Intuition is the ability to arrive at “probably correct” conclusions without performing a rigorous logical inference. Reliable intuitions don't arise in a vacuum: they are the product of experience and hard work. Thus, advertising a language as “intuitive” without further qualification is meaningless at best, and a lie at worst.
However, trade-offs are inevitable, and it surely tells you something if a language puts these particular priorities over, say, performance, compactness, or any other qualities they could have picked.
(0) Are the covariant virtual types they advertise actually sound? In their Employee/Office example, what prevents me from doing this?
    var e: Employee = some_boss_object
    e.office = office_with_no_fridge
(2) If I monke... I mean, refine a class in a module Foo, will the changes be seen in a module Bar? Is there some way to limit the scope of a refinement?
I don't get this. The problem with copy-pasted code is that you can forget to push fixes to both copies. So it is perfectly consistent to be against manual text copies, but for templates, macros, compiler optimizations, inheritance, and all the other forms of abstraction that allow you to not repeat code.
Moreover, of the forms of abstraction available, inheritance usually does much less copy-pasting under the hood. Instead, it makes use of vtables to actually reuse the binary.
That's a problem with manually copy-pasted code. Copy-paste automation solves this particular problem, but it creates others, like a combinatorial explosion of definitions that are similar enough to tempt humans to conflate them, but different enough to make such conflation logically unsound.
> inheritance usually does much less copy-pasting under the hood. Instead, it makes use of vtables to actually reuse the binary.
I'm not talking about the implementation. Indeed, as you observe, using vtable pointers is a common technique to eliminate or reduce the need to make physical copies.
But, at the abstraction level, inheritance is fundamentally a copy-paste mechanism between two definitions. When a definition Foo inherits from another definition Bar, the parts of Bar that Foo retains (doesn't override) are copied into Foo's definition. In addition, sometimes an implicit coercion from Foo into Bar is provided. However, you can't conclude that Foo is really a specialization of Bar: that guarantee fails if Foo overrides any inherited behavior from Bar. In other words, the Liskov substitution principle actually matters!
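The classic rectangle/square illustration makes this concrete (a sketch in Python; the shapes are mine, not from the thread): Square inherits almost everything from Rectangle, yet a single override breaks substitutability.

```python
class Rectangle:
    def __init__(self, w, h):
        self.w, self.h = w, h

    def set_width(self, w):
        self.w = w

    def area(self):
        return self.w * self.h

class Square(Rectangle):
    # Inherits (copy-pastes) everything from Rectangle, but overrides
    # set_width, silently changing the contract callers rely on.
    def set_width(self, w):
        self.w = self.h = w

def widen(r: Rectangle):
    r.set_width(10)
    return r.area()

assert widen(Rectangle(2, 3)) == 30
# A caller reasoning by Rectangle's contract expects 10 * 2 = 20 here:
assert widen(Square(2, 2)) == 100   # Square is not substitutable
```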
> but for templates, macros, compiler optimizations, inheritance,
Of course, from this point of view, macros and templates are just more powerful forms of copy-paste automation. Compiler optimizations are an implementation detail, not a language construct, and thus need a separate discussion.
> and all the other forms of abstraction that allow you to not repeat code.
Abstraction is actually about controlling the visibility of implementation details. It's a powerful organization tool, even if you don't actually intend to reuse any code.
On the other hand, code reuse can be achieved by various means, ranging from principled (parametric polymorphism, hygienic macros) to ad-hoc (templates, unhygienic macros), with inheritance falling somewhere in between.
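As a sketch of the “principled” end of that spectrum, parametric polymorphism via Python's typing module (function name invented): one definition is reused for every element type, with nothing copied or stamped out per type.

```python
from typing import Sequence, TypeVar

T = TypeVar("T")

def head(xs: Sequence[T]) -> T:
    # A single definition serves every element type; unlike template or
    # macro expansion, no per-type copy of the code is ever generated.
    return xs[0]

assert head([1, 2, 3]) == 1
assert head("abc") == "a"
```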
Arrays are covariant, and object is the base class of everything. I'm surprised that nobody's noticed that already (it's unsafe: arrays should instead be invariant).
It doesn't seem like there are any first-class functions, and there's no tuple class either. I can't find any mention of mapping/selecting/reducing within the standard collections.
For all (heh) the boasting about the type system, covariant arrays cause other problems too. If you have a function that takes an `Array[nullable Int]` and pass it an `Array[Int]`, the type system doesn't say anything about it. That means you could get into problems down the line. It's also possible to take a nullable Int and do `.as(Int)`.
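This is exactly why Python's typed List is invariant. A sketch (function and variable names invented) of what goes wrong if a list of ints is passed where a list of optional ints is expected — mypy rejects the call, but a type system with covariant mutable arrays would accept it:

```python
from typing import List, Optional

def add_missing(xs: List[Optional[int]]) -> None:
    xs.append(None)        # perfectly legal for a List[Optional[int]]

ints: List[int] = [1, 2, 3]
# mypy rejects this call because List is invariant; a covariant type
# system would allow it...
add_missing(ints)
# ...leaving a None inside a value that is statically a List[int]:
assert ints == [1, 2, 3, None]
```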
`var v = 1 / 0` causes an FPE and a core dump, and prints an ugly compiler stack trace.
The documentation about abort also mentions that there's no stack trace for it.
I don't think there are standalone functions with generic typing either, just classes with generic types. I suspect that their virtual types are probably unsound, but haven't tried working with them.
The real problem is that most languages don't actually give you arrays, they just give you mutable cells containing arrays of mutable cells. Remove mutable cells from the picture, and you'll see how wonderfully covariant array types are.
Arrays are really second-class values in the vast majority of languages. The litmus test for first-class-value-ness is “Do I need to worry about the identity of the object that holds this value?” The answer should be “no” if the value is first-class.
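The litmus test in Python terms (a small sketch): tuples behave like first-class values, while lists drag object identity along with them.

```python
# Value semantics: only the contents matter, not which object holds them.
a = (1, 2, 3)
b = (1, 2, 3)
assert a == b                 # no need to care whether a is b

# Reference semantics: the identity of the mutable cell matters.
xs = [1, 2, 3]
ys = xs                       # ys aliases the same cell
ys.append(4)
assert xs == [1, 2, 3, 4]     # mutation is visible through every alias
```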
> `var v = 1 / 0` causes an FPE and a core dump, and prints an ugly compiler stack trace.
Exceptions don't make much sense for handling programming errors like division by 0. What is your program supposed to do? Fix itself?
> I suspect that their virtual types are probably unsound, but haven't tried working with them.
They are indeed unsound, for pretty much the same reason covariant method arguments are unsound in Eiffel. I have an example here: https://news.ycombinator.com/item?id=11752071
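A runtime sketch of the same failure in Python (the Employee/Office names follow the example upthread; the method name `set_office` is invented): narrowing a method's argument type in a subclass breaks calls made through the supertype.

```python
class Office:
    pass

class OfficeWithFridge(Office):
    pass

class Employee:
    def set_office(self, office: Office) -> None:
        self.office = office

class Boss(Employee):
    # Covariant override: the argument type is narrowed in the subclass.
    def set_office(self, office: OfficeWithFridge) -> None:
        if not isinstance(office, OfficeWithFridge):
            raise TypeError("a boss's office must have a fridge")
        self.office = office

e: Employee = Boss()           # allowed: a Boss is an Employee
try:
    e.set_office(Office())     # fine against Employee's signature...
    leaked = True
except TypeError:
    leaked = False             # ...but rejected by Boss's narrowed one
assert not leaked
```

A sound type system has to reject either the override or the call; accepting both, as Eiffel-style covariance does, pushes the error to run time.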
I initially thought (from hello world and the bullets) that this would be Python with static types. On closer inspection, it's Ruby with static types (see the do/end blocks).
While I'm not sure you can ever have too many programming languages, I am pretty sure I won't be using this one.
http://lbstanza.org/ (posted a few days ago)
Here is the 1 clear and important problem these languages have (which is similar to the marketplace-dilemma for ebay-type businesses):
They lack support from the community, because the community will only come if they have a strong set of libraries/packages to do "actual" work OR the financial muscle of a big company backing them (Go from Google being the prime example), AND they need the community to actually build those frameworks/libraries to make the language popular/usable to more than 40 people.
If any of these languages want to expand beyond "we're an amazing systems language that you can build games, web applications, OSes, desktop applications and mobile applications with", they need to focus on a problem domain and address it.
Here are some examples:
- Elixir - although it doesn't say so explicitly, with the rise of Phoenix it is clearly targeting web backends with good performance
- Kotlin - Java is shit, so they are addressing the Android issue with a language that some say is better
I'm not exactly sure what specific problem-domains the 3 languages above are addressing, but ask yourself this:
If you are going to use a systems language (similar to those above) to build some micro-service or web backend, why use anything other than Go?
It has community support (and is presumably the one systems language that keeps growing and being used more and more), which is 10x more useful when you have a StackOverflow question or hit a situation that someone else has probably already experienced and fixed.
I say this as someone who likes these fringe-languages, but nobody is going to be building their bank-IT-infrastructure in "Nit" any time soon.
They seem to be syntactically similar, at least to my layman eyes.
These are what I found taking a quick look through the standard library (http://nitlanguage.org/doc/stdlib/):
- OpenGL Wrapper A (http://nitlanguage.org/doc/stdlib/group_egl.html)
- OpenGL Wrapper B (http://nitlanguage.org/doc/stdlib/group_glesv2.html)
- emscripten framework (http://nitlanguage.org/doc/stdlib/group_emscripten.html)
- Github API wrapper (http://nitlanguage.org/doc/stdlib/group_github.html)
- Web Framework A (http://nitlanguage.org/doc/stdlib/group_nitcorn.html)
- Web Framework B (http://nitlanguage.org/doc/stdlib/group_popcorn.html)
- Game Framework A (http://nitlanguage.org/doc/stdlib/group_mnit.html)
- Game Framework B (http://nitlanguage.org/doc/stdlib/group_gamnit.html)
- Game Framework C (http://nitlanguage.org/doc/stdlib/group_bucketed_game.html)
- Game Framework D (http://nitlanguage.org/doc/stdlib/group_scene2d.html)
- A twitter like web app (http://nitlanguage.org/doc/stdlib/group_tnitter.html)
- Web app for planning meetups (http://nitlanguage.org/doc/stdlib/group_opportunity.html)
- An unofficial app for bargoers?? (http://nitlanguage.org/doc/stdlib/group_benitlux.html)
In fairness, dependency management is a difficult problem, and even some mature languages struggle to get it right. I don't blame the Nit team for not tackling the problem at this stage. It does make me wonder if there is room for a language-agnostic dependency manager, rather than each language reinventing the wheel.