The Nit Programming Language (nitlanguage.org)
60 points by olivia_S on May 23, 2016 | 54 comments

I could never understand why some people don't put code examples on the main page of their language's site, especially since their 'hello world' example doesn't really tell us anything.

It's like they are ashamed of the language they created and would rather hide it than display it.

I agree. The first thing I want to see when I hear about a new language is what it looks like. The second thing is what its unique features or selling points are. The third is what those look like in code.

What's worse is that most languages that don't show code samples on their front page also tend not to have easy-to-find example code elsewhere. Language tutorials/references are very bad at giving you a feel for a language, because they rarely show a complete program that demonstrates how features interact (usually they show features in isolation), and I don't want to have to dig through your code repository in the hope of finding some sample code...


Came here to say the exact same thing. It's plain inconvenient to have to go to the GitHub page to view an example when the site hints at it with a single line of code on the front page. Ideally there would be a whole page devoted to examples, with a sandbox so you can play around with them (although this is often difficult to implement). It's not just about looks: if I'm going to install the language, I want some short code to run as soon as I download it, instead of having to start with the docs.

Pardon my ignorance, but I thought by now everyone agreed that Multiple inheritance was the root of all evil. (That and the others.) Also I, like many, have moved towards greener pastures, such as composition over inheritance. Did I miss a train?

Well, lots of languages have mixins or traits, which are pretty similar. Also, multiple inheritance got a bad rep through C++'s particular implementation of it, but it doesn't need to get so messy.

Eiffel, for example, a language that nobody uses but that people like me love to bring up in OO debates, has much simpler and better support for multiple inheritance: if there's any clash of methods (name + signature) from different parents, you must choose which one the child class gets, or it won't compile.
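For contrast, here's a quick Python sketch (class names are invented for illustration) of how another MI language handles the same clash. Python never rejects it: the linearization just picks a winner, which is exactly the silent behaviour Eiffel's rule forbids.

```python
class Printer:
    def describe(self):
        return "printer"

class Scanner:
    def describe(self):
        return "scanner"

# Python accepts the name clash silently: the method resolution order
# just picks the leftmost parent's method. Eiffel would instead refuse
# to compile until the child renamed or selected one of the two.
class Copier(Printer, Scanner):
    pass

print(Copier().describe())  # "printer" wins by declaration order
```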

I'm not sure what Nit does, but it might not be as brain-dead as what C++ does.

Finally, I prefer composition over inheritance too, but on the other hand many of my React components have multiple mixins. I couldn't do that if I had used ES6 classes. Given that Nit seems targeted at UIs (Android, Objective-C support), I guess it makes a lot of sense.

Can't you have "mixins" with ES6 classes by decorating them (e.g. applying a function to the class which adds the desired behaviour)? Related ECMAScript proposal which adds some syntactic sugar: https://github.com/wycats/javascript-decorators

Yeah sure, but my point is not about how flexible JS is or isn't, but about how multiple inheritance is a good fit for certain areas, such as UI code.

Thanks skrebbel :)

No problem keyle :)

> Pardon my ignorance but I thought by now everyone agreed that Multiple inheritance was the root of all evil.


How it's implemented makes the difference. I'd say the general consensus is that C++'s version of it is probably the worst (although to be fair, as much as I dislike C++, its implementation of MI has never really bothered me much in practice).

Modern languages support multiple inheritance of implementations with traits, which have a few additional restrictions compared to classes. These restrictions address some of the flaws that MI can have.

The bottom line is that you will rarely need MI and when you do, its flaws will probably not be much of a hindrance, so overall, it's a pretty overblown problem.

But isn't that sort of the point? Traits aren't multiple inheritance.

I think there is a problem with multiple inheritance being somewhat ill-defined.

I recall reading OOSC[0], where Meyer showed that MI in Eiffel is fine, and it basically boiled down to: when there is an ambiguity, force the child class to rename the conflicting methods/redefine them, which is very close to what the original traits paper[1] suggests (bar state in traits, IIRC).

[0] https://en.wikipedia.org/wiki/Object-Oriented_Software_Const... [1] http://www.ptidej.net/courses/ift6251/fall06/presentations/0...

There are multiple meanings of "traits" flying around in different languages. Some traits are type classes (Rust), some traits are mixins (Scala). The former are not really related to class-based inheritance, while the latter most definitely are (linearized multiple inheritance as defined by Cook & Bracha, not the mixins of Common Lisp). Self- and Smalltalk-style traits are even more different.

Don't forget C++ traits, which are more like a design pattern.

CS papers about it might disagree.

Multiple inheritance can work quite nicely, assuming a reasonable "linearization" algorithm that decides what order `super` should use when walking up the class hierarchy, so that no method needs to worry about calling into multiple parent classes. The last time I looked at the research, the gold standard for linearization was the C3 algorithm:


This is supposedly used by Python, by recent versions of Dylan, and by Perl 6. I've worked in projects that used extensive multiple inheritance and superclass linearization, and it's not bad at all. Now, that doesn't mean that inheritance is always an optimal design—composition and traits are both extremely useful—but multiple inheritance can work fairly well.
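Python exposes its C3 linearization directly, so this is easy to see for yourself; a minimal diamond with invented class names:

```python
# A classic diamond: D inherits from B and C, both of which inherit from A.
class A: pass
class B(A): pass
class C(A): pass
class D(B, C): pass

# C3 produces a single global order for `super` to walk, in which every
# class precedes its own parents and the shared base A appears exactly once.
print([cls.__name__ for cls in D.__mro__])
# ['D', 'B', 'C', 'A', 'object']
```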

A lot of people never received that memo, sadly.

> aims at elegance, simplicity and intuitiveness.

As opposed to all the other languages that aim at ugliness, complexity and the principle of most surprise?

> elegance, simplicity

There are languages whose designers don't consider elegance or simplicity a main priority, e.g., C++.

> intuitiveness

It doesn't make sense to talk about intuitiveness without reference to a specific person's intuitions. Intuition is the ability to arrive at “probably correct” conclusions without performing a rigorous logical inference. Reliable intuitions don't arise in a vacuum: they are the product of experience and hard work. Thus, advertising a language as “intuitive” without further qualification is meaningless at best, and a lie at worst.

All languages obviously aim to have as many 'good' qualities as possible.

However, trade-offs are inevitable, and it surely tells you something when a language puts these particular priorities ahead of, say, performance, compactness, or any other adjectives it could have picked.

To be fair there are competing objectives, e.g. speed and robustness.

Nit seems a lot like Nim at first glance. http://nim-lang.org/docs/tut1.html

This is in desperate need of a natural language processing library called 'wit'

Data serialization library called Sit?

Some Nit picks (sorry, I couldn't resist the pun):

(0) Are the covariant virtual types they advertise actually sound? In their Employee/Office example, what prevents me from doing this?

    var e: Employee = some_boss_object
    e.office = office_with_no_fridge
(1) If you hate copy-pasting, inheritance is not for you, because inheritance is literally copy-paste automation. If you hate copy-pasting, you come up with designs that don't require copy-pasting, whether manual or automatic.

(2) If I monke... I mean, refine a class in a module Foo, will the changes be seen in a module Bar? Is there some way to limit the scope of a refinement?
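On point (0): the shape of the hole can be sketched in Python, which performs no static checking at all (class and attribute names below are my own, loosely modeled on the Employee/Office example; the narrowed setter parameter is my assumption about how Nit's covariant virtual types would behave):

```python
class Office:
    has_fridge = False

class OfficeWithFridge(Office):
    has_fridge = True

class Employee:
    def set_office(self, office: Office) -> None:
        self.office = office

class Boss(Employee):
    # Covariantly narrowed parameter: code inside Boss now assumes it
    # receives the narrower type, but the caller below never checks.
    def set_office(self, office: OfficeWithFridge) -> None:
        self.office = office

e: Employee = Boss()      # upcasting is always allowed
e.set_office(Office())    # fine for an Employee, wrong for a Boss
print(e.office.has_fridge)  # False: the boss ends up fridgeless
```

A sound type system has to reject one of the last two lines; one that accepts both is exactly the unsoundness being asked about.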

> inheritance is literally copy-paste automation.

I don't get this. The problem with copy-pasted code is that you can forget to push fixes to both copies. So it is perfectly consistent to be against manual text copies, but for templates, macros, compiler optimizations, inheritance, and all the other forms of abstraction that allow you to not repeat code.

Moreover, of the forms of abstraction available, inheritance usually does much less copy-pasting under the hood. Instead, it makes use of vtables to actually reuse the binary.

> The problem with copy pasted code is that you can forget to push fixes to both copies.

That's a problem with manually copy-pasted code. Copy-paste automation solves this particular problem, but it creates others, like a combinatorial explosion of definitions that are similar enough to tempt humans to conflate them, but different enough to make such conflation logically unsound.

> inheritence usually does much less copy pasting under the hood. Instead, it makes use of vtables to actually reuse the binary.

I'm not talking about the implementation. Indeed, as you observe, using vtable pointers is a common technique to eliminate or reduce the need to make physical copies.

But, at the abstraction level, inheritance is fundamentally a copy-paste mechanism between two definitions. When a definition Foo inherits from another definition Bar, the parts of Bar that Foo retains (doesn't override) are copied into Foo's definition. In addition, sometimes an implicit coercion from Bar into Foo is provided. However, you can't conclude that Foo is really a specialization of Bar. This isn't true if Foo overrides any inherited behavior from Bar. In other words, the Liskov substitution principle actually matters!
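A tiny Python illustration of why overriding breaks the specialization claim (the rectangle/square setup is the classic textbook case, not anything from Nit): code written against the parent's contract silently misbehaves on the child.

```python
class Rectangle:
    def __init__(self, w, h):
        self.w, self.h = w, h

    def scale_width(self, factor):
        self.w *= factor

class Square(Rectangle):
    def __init__(self, side):
        super().__init__(side, side)

    # Overrides inherited behaviour to preserve squareness, so Square
    # is no longer substitutable for Rectangle in the function below.
    def scale_width(self, factor):
        self.w *= factor
        self.h *= factor

def double_width_only(r: Rectangle) -> bool:
    old_h = r.h
    r.scale_width(2)
    return r.h == old_h  # the parent's contract: height is untouched

print(double_width_only(Rectangle(2, 3)))  # True
print(double_width_only(Square(3)))        # False: the override broke it
```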

> but for templates, macros, compiler optimizations, inheritance,

Of course, from this point of view, macros and templates are just more powerful forms of copy-paste automation. Compiler optimizations are an implementation detail, not a language construct, and thus need a separate discussion.

> and all the other forms of abstraction that allow you to not repeat code.

Abstraction is actually about controlling the visibility of implementation details. It's a powerful organization tool, even if you don't actually intend to reuse any code.

On the other hand, code reuse can be achieved by various means, ranging from principled (parametric polymorphism, hygienic macros) to ad-hoc (templates, unhygienic macros), with inheritance falling somewhere in between.

This looks similar to Crystal[1], another statically typed OO language with ruby-like syntax.

[1] http://crystal-lang.org/

KISS may also mean keeping the world simple by not adding too many languages. Maybe this language is the most beautiful in the world, but when it comes to languages, standard ugliness is at least as beautiful as non-standard beauty. No offence, of course.

Good job your argument didn't win the day a few decades back. ;-)

I'm not really understanding what's unique about this language from the "why" section on the main page. Every other language is also going to say it's KISS, without verbosity, efficient, and so on...

By reading the documentation, which is very incomplete, I cannot imagine myself doing anything practical with this language.

I'm biased in the matter, but I have to agree. There are a number of problems with it.

Arrays are covariant, and object is the base class of everything. I'm surprised that nobody's noticed that already (it's unsafe: arrays should instead be invariant).

It doesn't seem like there are any first class functions, and there's no tuple class either. I don't find any mention of mapping/selecting/reducing within standard collections.

For all (heh) the boasting about the type system, covariant arrays produce other problems too. If you have a function that takes an `Array[nullable Int]` and pass it an `Array[Int]`, the type system doesn't say anything about it. That means you could get into problems down the line. It's also possible to take a nullable Int and do `.as(Int)`.
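The hazard is easy to reproduce in Python, whose lists behave as if covariant at runtime because nothing is checked (a static checker like mypy, which treats `List` as invariant, would reject the call; helper names here are invented):

```python
from typing import List, Optional

def insert_gap(xs: List[Optional[int]]) -> None:
    # Perfectly legal for a list that may hold nulls...
    xs.append(None)

ints: List[int] = [1, 2, 3]
insert_gap(ints)       # a covariant type system accepts this call...
try:
    total = sum(ints)  # ...and the damage surfaces far away, later
except TypeError:
    print("latent type error: a None snuck into a List[int]")
```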

`var v = 1 / 0` causes an FPE and core dump, and prints an ugly compiler stack trace.

The documentation about abort also mentions that there's no stack trace for it.

I don't think there are standalone functions with generic typing either, just classes with generic types. I suspect that their virtual types are probably unsound, but haven't tried working with them.

> Arrays are covariant, and object is the base class of everything. I'm surprised that nobody's noticed that already (it's unsafe: arrays should instead be invariant).

The real problem is that most languages don't actually give you arrays, they just give you mutable cells containing arrays of mutable cells. Remove mutable cells from the picture, and you'll see how wonderfully covariant array types are.

Arrays are really second-class values in the vast majority of languages. The litmus test for first-class-value-ness is “Do I need to worry about the identity of the object that holds this value?” The answer should be “no” if the value is first-class.
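Python's typing module encodes exactly this distinction: the read-only `Sequence` is declared covariant, while mutable `list` is invariant. A small sketch (function name invented):

```python
from typing import Optional, Sequence

def first_or_none(xs: Sequence[Optional[int]]) -> Optional[int]:
    # Read-only access: there is no way to smuggle a None *into* xs,
    # which is exactly why covariance is safe once mutation is gone.
    return xs[0] if xs else None

ints = [1, 2, 3]
print(first_or_none(ints))  # 1 -- accepting a narrower element type is fine
```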

> `var v = 1 / 0` causes FPE and core dump and dumps an ugly compiler stack trace.

Exceptions don't make much sense for handling programming errors like division by 0. What is your program supposed to do? Fix itself?

> I suspect that their virtual types are probably unsound, but haven't tried working with them.

They are indeed unsound, for pretty much the same reason covariant method arguments are unsound in Eiffel. I have an example here: https://news.ycombinator.com/item?id=11752071

The point here is: do we need a new scripting language in 2016 that only supports OOP and lets devs use multiple inheritance, even though we've been told for 20 years not to use it? That pretty much summarizes my opinion.

I sort of agree.

I initially thought (from hello world and the bullets) that this would be Python with static types. On closer inspection, it's Ruby with static types (see the do/end blocks).

While I'm not sure you can ever have too many programming languages, I am pretty sure I won't be using this one.

What's even worse is that it's not even a good Ruby like Crystal. It seems to be missing any sort of callable object (can't find any mention of a proc, a block, or a lambda in the website documentation). So consequently there's no select, reject, each, or really any sort of mapping in the standard library (unless it's buried or I missed it).

So not really Ruby at all! I thought, after posting, that I should have clarified I was only talking about the syntax. The runtime (absolutely critical for a scripting language) I didn't investigate.

Here is a list of similar languages that operate on the fringes and work in a similar compiled-yet-readable (somewhat) fashion:



http://lbstanza.org/ (posted a few days ago)

Here is the one clear and important problem these languages have (which is similar to the marketplace dilemma for eBay-type businesses):

They lack support from the community, because the community will only come if the language has a strong set of libraries/packages to do "actual" work, or the financial muscle of a big company behind it (Go from Google being the prime example); and yet they need the community to actually build those frameworks/libraries to make the language popular and usable to more than 40 people.

If any of these languages want to expand beyond "we're an amazing systems-language that you can build: games, web applications, OSes, desktop applications, mobile applications with", they need to focus on a problem-domain and address it.

Here are some examples:

- Elixir - although not explicitly saying it, with the rise of Phoenix, they are clearly targeting web-backends with good performance

- Kotlin - Java is shit, so they are addressing the Android issue with a language that some say is better

I'm not exactly sure what specific problem-domains the 3 languages above are addressing, but ask yourself this:

If you are going to use a systems language (similar to those above) to build some micro-service or web backend, why use anything other than Go?

It has community support (and is presumably the one systems language that is growing and being used more and more), which is 10x more useful when you have a StackOverflow question or hit a situation that someone else has probably already experienced and fixed.

I say this as someone who likes these fringe-languages, but nobody is going to be building their bank-IT-infrastructure in "Nit" any time soon.

For me, it is communities that make or break a language. By starting from scratch with a new language, you are starting with 0 users. You have to win hearts and minds, rewrite the libraries, and gain market share. Even then there will be places where the new language will not be allowed. I think a new language is rarely a good solution; ideally we should have an assembly language, a procedural language, an OO language, and a functional language, all syntactically consistent.

For those of us that are not professional software developers, could someone explain the differences between Nit and Python?

They seem to be syntactically similar, at least to my layman eyes.

Nit is statically typed, Python isn't. Syntactically, Nit's more like Go, Swift, or Ruby -- newlines can mean something, but it doesn't seem like indentation matters because there's an "end" keyword.

Is it just me, or is Nit's answer to dependency management: "just throw it in the standard library"?

These are what I found taking a quick look through the standard library (http://nitlanguage.org/doc/stdlib/):

- OpenGL Wrapper A (http://nitlanguage.org/doc/stdlib/group_egl.html)

- OpenGL Wrapper B (http://nitlanguage.org/doc/stdlib/group_glesv2.html)

- emscripten framework (http://nitlanguage.org/doc/stdlib/group_emscripten.html)

- Github API wrapper (http://nitlanguage.org/doc/stdlib/group_github.html)

- Web Framework A (http://nitlanguage.org/doc/stdlib/group_nitcorn.html)

- Web Framework B (http://nitlanguage.org/doc/stdlib/group_popcorn.html)

- Game Framework A (http://nitlanguage.org/doc/stdlib/group_mnit.html)

- Game Framework B (http://nitlanguage.org/doc/stdlib/group_gamnit.html)

- Game Framework C (http://nitlanguage.org/doc/stdlib/group_bucketed_game.html)

- Game Framework D (http://nitlanguage.org/doc/stdlib/group_scene2d.html)

- A twitter like web app (http://nitlanguage.org/doc/stdlib/group_tnitter.html)

- Web app for planning meetups (http://nitlanguage.org/doc/stdlib/group_opportunity.html)

- An unofficial app for bargoers?? (http://nitlanguage.org/doc/stdlib/group_benitlux.html)

That made my day! Any programming language that has a Beer class (http://nitlanguage.org/doc/stdlib/class_benitlux__src__benit...) in the standard library must be doing something right.

In fairness, dependency management is a difficult problem, and even some mature languages struggle to get it right; I don't blame the Nit team for not tackling it at this stage. It does make me wonder if there is room for a language-agnostic dependency manager, rather than each language reinventing the wheel.

I kinda like it. It reads almost like a modern BASIC. It's not too far away from VB.Net even. No wonder the other people here don't like it :D

Looks like another Python to me, but I don't need another Python.

If you choose Nit as your programming language, are you nitpicky?

Hey I know the guys behind this language!

Oh, no! Yet another "fun" language, i.e. a language that uses "fun" instead of the more standard "fn" or "func", just because it's 3 letters long and aligns better than 2- or 4-letter keywords (if you use 4-space indentation)! In fact, 2-letter indents are better, especially when your language suggests keeping lines under 80 characters!

The Linux kernel has a coding style with 8-space indents and still has the 80-character maximum. The style guide says "if you need more than 3 levels of indentation, you're screwed anyway, and should fix your program." So not everybody agrees that you need deep indentation.


The kernel is in C. In OOP, there's at least one level coming from class declarations; 3 levels is perfectly normal in OOP!

That's true. And in the OOP case you usually have exception handling (i.e. try/catch blocks) as well, instead of "goto error" style error handling which is common in C.

