Hacker News
What is special about Nim? (hookrace.net)
269 points by def- on Jan 1, 2015 | 84 comments

Great write-up! I expected this to reference a few "normal" language features like static typing, operator overloading, or generics, but instead it was a list of some really neat off-the-beaten-track features:

  * Run regular code at compile time
  * Extend the language (AST templates and macros);
    this can be used to add a form of list comprehensions to the language!
  * Add your own optimizations to the compiler!
  * Bind (easily) to your favorite C functions and libraries
  * Control when and for how long the garbage collector runs
  * Type safe sets and arrays of enums; this was cool:
    "Internally the set works as an efficient bitvector."
  * Unified Call Syntax, so mystr.len() is equivalent to len(mystr)
  * Good performance -- not placing too much emphasis on this,
    but it's faster than C++ in his benchmark
  * Compile to JavaScript
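The unified call syntax item, for example, is easy to demonstrate:

```nim
let mystr = "hello"
echo len(mystr)   # 5
echo mystr.len    # the same call, method style
echo mystr.len()  # parentheses are optional too
```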

As a Pythonista, one thing that struck me in the code fragments is the zeroes and ones appearing everywhere. It looks very easy to introduce off-by-one errors. (Off-by-one errors are very rare in Python due to the way counting and ranges work.)

I guess you could use fewer magic numbers if you wanted; for example, instead of:

  proc createCRCTable(): array[256, CRC32] =
    for i in 0..255:
You can write:

  proc createCRCTable(): array[256, CRC32] =
    for i in result.low .. result.high:
Or:

  proc createCRCTable(): array[256, CRC32] =
    for i, v in result: # index, value
Or define your own indices iterator:

  iterator indices[T](x: T): auto =
    for i in x.low .. x.high:
      yield i

  proc createCRCTable(): array[256, CRC32] =
    for i in result.indices:
I think I should propose adding indices to the standard library.

It does strike me as odd that some Ada-isms like 'low and 'high are present (applicable to both types and objects), but the ever-useful 'range didn't get included. It is a testament to Nim's flexibility that it can be added, effectively as a one-liner. It really should be in the standard library.

It isn't magic numbers as such, rather the smattering of sometimes zeroes and sometimes ones. If it were always zero or always one, off-by-one errors would be a lot less likely.

Python solves this by always starting from zero and, crucially, not including the final number - i.e. range(0, 3) gives 0, 1, 2 (no 3). You can also do negative indexing to count from the end - e.g. range(0, 3)[-1] gives the last element (2). The right thing happens when you mix the length of things into the arithmetic too - e.g. range(0, len(item)).

The 0s and 1s are specific to the examples. A range in Nim, `x..y`, includes both x and y. Sequences are always 0-indexed, but you can start a range at 1 (or any other value) if you want.
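Concretely, both endpoints are included:

```nim
for i in 0..2:
  echo i  # prints 0, 1, 2

for i in 1..3:
  echo i  # prints 1, 2, 3
```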

I don't program in Python, but I thought one of the community's points of contention was the syntax of slices and such (starting at one instead of zero, or one side of the range being inclusive, I can't recall). If that's true, I would imagine it would encourage some off-by-one errors. Then again, maybe it's not nearly as contentious as it sounded in a few comments.

Slices start at 0, not 1. However they are inclusive of both boundary items. This matches how ranges are specified in Nim.

Slices in Python are not inclusive of the second index; lst[0:2] gives you a list containing the first two elements.

Everything old is new again.

Don't know why you've been downvoted, because this is a kind of "Pascal rebirth" mission.

Of course it borrows some things from Python, but I don't know why that has to be bad. When I was a kid, Pascal was a thing, and my first hello worlds were programmed in it (away from C). Then in my teens Delphi was a thing; I relearned to program in it, and it kept me away from Visual Basic (ohh-hay).

Pascal/Delphi was so fun to program with (thanks, Hejlsberg!), but somehow the tech world turned into this C-syntax-dominated world, because of the popularity of Unix.

And I must say, playing a little bit with Nim makes me feel that again: the joy of programming.

And I've always found Python kind of nasty, I confess; the code is not very easy to read, like messy code coming from a five-year-old. But Nim doesn't have that feel. It feels like Pascal in the old days: pretty serious, but also fun and pragmatic!

Care to flesh that out a bit?

When I saw that the top comment is a list of language features which are mostly extremely old, I thought that was interesting, so I pointed out how old those features are. And got hammered for it in spite of the fact that it's true.

Obviously, it doesn't matter whether those features are old or new. What matters is whether the language executes them in a compelling way. But almost every single one of those features has been a core aspect of programming languages which have been around since before most of us were born.

Also, the mods have strongly urged me to write shorter comments, and to write comments less frequently. And they've suggested that more than once. I try to oblige, and this is the result. So technically I can't flesh out my comments without going against their advice.

Of that list of nine features, I'm certain six are extremely old:

  * Run regular code at compile time
  * Extend the language (AST templates and macros);
    this can be used to add a form of list comprehensions to the language!
  * Add your own optimizations to the compiler!
  * Bind (easily) to your favorite C functions and libraries
  * Unified Call Syntax, so mystr.len() is equivalent to len(mystr)
  * Good performance -- not placing too much emphasis on this,
    but it's faster than C++ in his benchmark
I wouldn't be surprised if GC control and typesafe enums are also extremely old, bringing that up to "8 out of 9 features are old." But it's at least 6 out of 9.

Well, you have to combine just two of these features to leave Nim in a very small company of languages, mostly developed in the last decade. Good performance and meta-programming ("run regular code at compile time") used to be found at the opposite ends of the programming languages spectrum and Nim is unifying them in a single package - that's what people are excited about.
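A minimal sketch of that combination: ordinary Nim code evaluated either by the compiler or at run time (`fib` here is just an illustration, not from the article):

```nim
proc fib(n: int): int =
  # plain runtime-style code, no metaprogramming needed
  if n < 2: n else: fib(n - 1) + fib(n - 2)

const atCompileTime = fib(20)  # computed by the compiler
let atRunTime = fib(20)        # computed when the program runs

echo atCompileTime == atRunTime  # true
```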

> Good performance and meta-programming ("run regular code at compile time") used to be found at the opposite ends of the programming languages spectrum

Actually, Lisp has been executing regular code at compile time with good performance since the tail end of the 1980's. Maybe even before that. This is why I said "everything old is new again."

Depends on your definition of "good performance". Idiomatic Lisp was never able to compete with the Fortrans and C/C++s of its age, but Nim is.

"Good performance": Within a factor of ~2 to ~4 of native C.

No one said idiomatic Lisp was performant. They said Lisp was performant. Seriously, it's not even difficult. It's not like it's a situation where it's easy to accidentally kill your performance by writing elegant code while having no idea why it's slow.

Optimizing a Lisp codebase is a straightforward exercise, no more difficult than optimizing a Lua or Javascript codebase.

If someone hasn't shipped more than a toy program with Lisp, it's easy to believe the opposite. But if you try it, you'll see that it's true.

To be a bit more specific as to where else these ideas are found:

  * Run regular code at compile time
Lisp, of course. Since very long ago.

  * Extend the language (AST templates and macros);
    this can be used to add a form of list comprehensions to the language!
Lisp, but also Dylan, Elixir, JS with Sweet.js and more. See for example here: https://opendylan.org/articles/macro-system/index.html

  * Add your own optimizations to the compiler!
I'm not sure here.

  * Bind (easily) to your favorite C functions and libraries
Almost every serious language has an FFI for C. Nimrod offers the ability to use statically linked libraries, which is not very common, but Gambit Scheme, for example, has done this for a long time.

  * Unified Call Syntax, so mystr.len() is equivalent to len(mystr)
Lua, somewhat. Also Dylan again.

  * Good performance -- not placing too much emphasis on this,
    but it's faster than C++ in his benchmark
Too many to count.

  * Compile to JavaScript
OCaml. And more: almost every language has a compile-to-JS project for it (although only a couple are workable).


> typesafe enums are also extremely old

Aren't those just sum types? This would make them as old as ML at least.

This of course doesn't make Nimrod any less interesting a language, and some combinations of those features may be rather novel. Still, 'everything old is new again' is, I think, 100% true for all of these features. In reality you won't ever encounter truly unprecedented features in general-purpose languages outside of academia: one can go to the LtU site for those. All other languages are built on really old (proven) concepts, which they reimplement and package in (very) different ways. I have no idea why you would get downvoted for stating such a fact.

> * Add your own optimizations to the compiler!
>
> I'm not sure here.

Lisp, via compiler macros: http://www.lispworks.com/documentation/HyperSpec/Body/03_bba...

> The purpose of the compiler macro facility is to permit selective source code transformations as optimization advice to the compiler. When a compound form is being processed (as by the compiler), if the operator names a compiler macro then the compiler macro function may be invoked on the form, and the resulting expansion recursively processed in preference to performing the usual processing on the original form according to its normal interpretation as a function form or macro form.

Simple... the list of languages that provide Lisp-level dynamism AND C level performance is rather short.

Some Lisps, some Smalltalks, modern C++ and D, Haskell, Clean, SML, OCaml, Dylan, Rust, Factor, Felix, some (AOT compiled, commercial) Java implementations... and so on.

Only a very tiny fraction of ideas or implementations are really new in programming language design. There is "prior art" for nearly everything, sometimes dating back to the sixties. And being both dynamic and fast has been a research topic for decades now. The problem is hard, but we're steadily making progress - as a result we've created a lot of (both abandoned and still alive) languages along the way.

Those are generally 2x to 5x slower than C.

Have you ever actually used Lisp, or are you going off of speculation? I have. It can deliver just short of C level performance.

This is like someone going onto a forum and saying "Java is slow!" while ignoring the fact that trading institutions use Java as their primary language for stock market trading. And they wouldn't do that if Java was slow.

I'm trying to be as patient as possible, but this is really getting out of hand. Most people in this comment thread simply have no idea what they're talking about.

Specifically, look up "GOAL Lisp". Gamedevs wouldn't use Lisp if Lisp wasn't capable of providing performance comparable to C. The entire company would've died.

Yes, I have. Lisp code with all the tweaks to even think about approaching C in speed is ugly, ugly code.

GOAL code was many things, but "ugly" is a stretch. Jak and Daxter did more than any game of its time, and the only reason they were able to do that is the flexibility of the codebase.

Its performance wasn't merely on par with C, but actually surpassed C in many cases. For example, they were able to stream content from disc dynamically while the game was running, and no other engine had that capability. That sounds unrelated to "performance" since it's a design decision, but in fact "performance" is everything the end user cares about, such as load times. And the only reason the game engine performed well was Lisp. Many of Jak and Daxter's competitors didn't survive, performance problems being one of the reasons. (Game publishers used to cancel projects if development milestones weren't being met, such as a convincing tech demo. So if a tech demo was unconvincing, e.g. due to extreme performance problems, publishers basically killed the company. Maybe that's even still true as of 2015, but it was definitely true in 2001.)

EDIT to your reply: You're right, I edited my comment and added a second paragraph. Also, I apologize for my earlier tone. It was uncalled for and needlessly argumentative. Sorry.

You keep making this argument without really any justification besides "it just is". Everything I've read about Jak and Daxter says that it's pretty much amazing it was actually released, because ND spent so much time on the esoteric, non-standard, bus-factor-one compiler-cum-game engine that they had very little time to actually make the game.

> I have no idea why you would get downvoted for stating such a fact.

And not merely downvoted, but karma carpet bombed (-10). Such an honor used to be reserved for troll comments rather than true statements.

Isn't compile-to-JS mandatory nowadays?

Ah, yes, I remember back in 1959 when we all was rocking the Cobol and I used to auto compile straight to Javascript with a Future module so magnificent that it auto-compile-manifested a V8 JS engine with which to run it. Deal with that boring Nim compiler.

A little OT: Courtesy of the Reddit discussion of the OP, a reference to the discussion on Wikipedia about Nim, or rather, the discussion to delete its Wikipedia entry:


At what point does a language become noteworthy enough for a site that contains full entries for side characters in non-canonical Star Wars novels?

Honestly, I think a dozen Nim users should band together and register at Wikipedia to turn the vote the next time someone tries to delete the article. If the deletionists come with formalistic arguments, just synthesize a few articles on sites that fulfill their WP:RELIABLE criteria. There's got to be somebody working for a commercial (read: non-blog) website who doesn't like deletionism and would post a small article on Nim (or any other topic suffering from the same problem) just to stick it to them.

Wikipedia is, for some things, severely broken.

Your post will be seen by some wikipedians as a violation of some policy or other. (Off the top of my head there's sock puppeting http://en.wikipedia.org/wiki/Wikipedia:Sock_puppetry "Do not ask your friends to create accounts to support you.")

But there is a difference between straight up sockpuppetry, and telling people "If you like Wikipedia but are unsatisfied with the way it currently works, then register and start contributing. If policy discussions or delete votes come up, vote the way you believe you should (namely in this case inclusionist)".

It's a bit like encouraging people to go to elections to raise voter turnout, to reduce the influence of extremist parties. (Not that I'm comparing WP editors with extremists; it's just that I believe in both cases there is a silent majority that risks being misrepresented!)

I know that and you know that.

There are a considerable number of Wikipedia admins who don't see it that way. There are even more young Wikipedians without adminship who take a hard line on that kind of thing - although to be fair, WP has worked hard to reduce that particular malign influence (removal of "vandal patrol", stricter oversight of Twinkle and rollback use, etc.).

So here's a great way to fix it: next time Wikipedia does a donation run, post against it and advise people not to donate, explaining those reasons.

If somebody on HN has a website like that and would be willing to do this then we would really love to hear from that person!

Not to mention detailed articles on Pokemon products, something with decidedly less cultural impact than Star Wars.

Someone needs to fork Wikipedia with different site policies.

Thanks for the great write-up. In addition to straight-up language features, what draws me to play with Nim is the full environment that has already been built up including module packaging and debugger (particular pain points when moving from Python to Golang).

Additionally, having compiled executables solves a Python pain point: I love Python, but distributing packaged executables is an ongoing hassle.

I've had my eye on Nim(rod) for a long time. I think this language may be the right thing, at the right place at the right time.

I haven't seen a better boat for Python refugees to jump into. It's certainly worth evaluating if moving from a Python2 codebase, rather than porting to 3. It's far enough along that it can start poaching users looking for the best place to jump off from 2.x.

The only thing I'd like to see them somehow do, is add a CPython2.x library compatibility layer. If that could somehow be engineered into Nim before 1.0.. I think I'd declare Python3 to be in serious trouble.

The users WILL come if that's done. Maybe take a look at Nuitka for how to make this possible with the binaries Nim produces. Or even just include the CPython2 runtime in the compiled binary when a user includes Python 2 code.

Nim is great, and this article is great, except that so far no article about Nim has managed to sell it to me the way a simple Go example with channels and goroutines does.

Could any Nim advocate demonstrate a simple program that:

  * is less than 50 LOC
  * shows its greatest strength (one killer selling point is enough)

There's no such thing as a single "selling point" for Nim. Or rather there are many such points: depending on what programming you do you'll appreciate different language features. Making a list of many (not all of course) powerful language features is - I think - the right approach to the problem here.

It would probably be better if you explained what kind of problems you're dealing with in your work: we could then look at Nim features and tell you if there's something that would make your life easier and your work more enjoyable.

I understand you (and maybe other Nim folks) feel undervalued or offended when I ask for a sales pitch. Nim is an open-source project and probably nobody makes a cent from it, yet some arrogant guy like me treats the priceless work like something off a shelf in the supermarket.

Sadly, the reality is that not everybody has the time to appreciate the beauty of Nim unless they're convinced Nim identifies a problem largely ignored by others and solves it beautifully. My point is: could you convince me?

This was a helpful writeup. Nim has a lot of really nice features. Are there any problems or things you find lacking?

The idea that thisFunction and this_function are the same might be worse than the problem it's trying to solve. I unfortunately don't have experience with largish codebases in Nim to tell whether this would cause actual problems, but having to scan for two possible spellings of an identifier can be a source of stupid bugs, in the same vein as case-insensitive identifiers.

On the other hand I can definitely see the appeal of it (some python packages go against the recommendation and sort-of taint the style of other sources).

This so far was the only "weird" comment I could think of.

That seems like a really bad idea. It means that searching for an identifier requires you to allow for the possibility that there might be any number of underscores inserted anywhere. I suppose in practice you'll rely on the fact that no one will be inserting them other than at word boundaries, but it still seems horrible.

(I'd worry about ambiguities of the experts exchange / expert sex change type, too, but my guess is that they're extremely rare and usually either harmless or instantly caught by the compiler. Still, the mere possibility makes me twitchy.)

I would consider it bad style to abuse the case-insensitivity like that.

It can be used for good too, for example if you have a bunch of different naming conventions in the libraries you use:

  proc some_lib_function(x: int)  = echo "hi ", x
  proc anotherLibFunction(x: int) = echo "hi ", x
  proc crAzY_LIB_WRITERS(x: int)  = echo "hi ", x
Then you can still call them consistently, e.g. in camelCase (only the first letter is case-sensitive):

  someLibFunction(1)
  anotherLibFunction(2)
  crazyLibWriters(3)

The reason for this (though I agree with you) is that Nim comes with strong interop with C/C++, and in C and C++ the snake_case vs. camelCase debate is still open. They did it so your program can invoke C (or even Nim) libraries without caring about each library's case/underscore style, using your own instead.

While this seems problematic, they added some neat features to avoid problems with the `uniqueness` of names and so on.

One thing I wish for at this point is a way to force the compiler to stop if this pattern is not uniform within a codebase.

Maybe you're thinking in terms of a text editor and not an IDE? With an IDE, you look for symbol references, not text with regexps. You could imagine that the IDE's search would be smart enough to know that if you look for foo_bar, it should also report occurrences of fooBar.

Either way, I still find the idea a bit weird. If the language wants to be opinionated about naming, maybe it could simply make _ illegal in an identifier or, to push it one level further, refuse to compile any identifier that is not camelCase.

I was thinking of both. An IDE is worse unless it's actually targeted specifically at Nim, because its code for finding symbol references surely won't know about Nim's underscore-blindness.

(But I'm not much of an IDE-user myself. Perhaps modern IDEs have enough hooks that you really can control this stuff?)

I'm personally not too happy about this, but there are restrictions on the use of underscores:

- You cannot have two underscores in a row

- You cannot have a leading underscore

Certainly, I would have some sort of lint to enforce naming conventions. I don't like it at all. Searching for a function becomes a nightmare.

The Nim compiler offers a set of tools intended for use in IDEs and text editors that help with this. Besides the style-insensitive grep, there are also compiler-assisted "go to definition" and "list references" commands.

Is it not good enough to have specific grep tools? nimgrep for example, and there are procs for working with this IIRC.

I prefer to have a consistent code base, then I can use the internal tools of my editor, which are not aware of Nim's identifier rules.

Case-insensitivity is not that much of an issue in static languages. It eliminates the burden of having to remember the Studly-Camel-Lumpy-style-du-jour. Accidental name clashes will be caught at compile time. Nim's hybrid approach is more adventurous but I can only see it being a problem with reusing symbols from other case-sensitive languages.

I'm pretty sure that was removed, the case insensitivity.

The compiler isn't completely stable yet. Often I ran into problems combining multiple features, when I filled up Nim's Rosetta Code entries: https://github.com/def-/nim-unsorted http://rosettacode.org/wiki/Category:Nimrod

The libraries aren't really there yet. Their number is growing, but some important ones are still missing: http://nim-lang.org/lib.html

So Nim really needs more active users and contributors.

>So Nim really needs more active users and contributors.

I've fixed an issue you're experiencing: https://github.com/Araq/Nim/issues/1804 :3

Thanks, that's great!

Nice article.

I was going through trying to follow the instructions and hit a couple of issues:

- For "You should then add ~/.nimble/bin to your $PATH", it took me quite a while to figure out I had to enter "ln -s ~/.nimble/bin/nimble /usr/local/bin/" in my Mac's terminal. I guess experienced devs know that stuff, but I'm a beginner here.

-There's a bug in the client.nim and server.js code where the client refers to element "foo" but the server refers to element "item". Changing them both to foo got it to work.

Nim seems kinda cool.

> - For "You should then add ~/.nimble/bin to your $PATH", it took me quite a while to figure out I had to enter "ln -s ~/.nimble/bin/nimble /usr/local/bin/" in my Mac's terminal. I guess experienced devs know that stuff, but I'm a beginner here.

Actually you should add "export PATH=$PATH:$HOME/.nimble/bin/" in ~/.profile, ~/.bashrc, ~/.zshrc or whatever your shell requires.

> -There's a bug in the client.nim and server.js code where the client refers to element "foo" but the server refers to element "item". Changing them both to foo got it to work.

Thanks, fixed.

For Mac users, I recommend using Homebrew [1] (which hackers should already be using IMHO) to install Nim.

  brew install nim
Done and done.

[1]: http://brew.sh

I would recommend doing:

  export PATH=$PATH:~/.nimble/bin
instead of creating a symbolic link in /usr/local/bin. That way, if you put a new binary in ~/.nimble/bin, e.g. ~/.nimble/bin/foonim, it will work without you having to do

  ln -s ~/.nimble/bin/foonim /usr/local/bin/

I have done it in the same fashion as I've done it for Golang and others:

Add this to your ~/.profile:

  # add Nim-lang to PATH
  export PATH=/Users/user/nim/nim-0.10.2/bin:$PATH

In the same terminal window, run 'source ~/.profile' to run the file and update the path. Creating a symbolic link like you did is fine as well.

FWIW, I did a more direct translation[1] of the Nim code into C++. The current C++ code does things like memcpy a std::bitset on every recursive call. I also wanted the code to be doing the exact same thing as Nim is (vector with 1 byte per bool, globals for data, and I left everything as int when I would normally use other types).

The result is that C++ now just barely outperforms Nim: 1074ms for C++ vs 1165ms for Nim.

[1] http://ideone.com/q838cJ

>Yes, that's right: All we had to do was replace the var with a const. Beautiful, isn't it? We can write the exact same code and have it run at runtime and compile time. No template metaprogramming necessary.

Worth pointing out is that modern C++ does this too. Just add a constexpr, no metaprogramming necessary. Modern C++ really does feel like a new language.

Also special about Nim: Wikipedia repeatedly tried to delete the article about Nimrod (and eventually did), because someone considered it not noteworthy. What a joke those editors are. If you have no idea about programming languages, don't interfere.

I haven't noticed this, Nim is the new name for Nimrod.

And the first release with the new name was just a few days ago: https://news.ycombinator.com/item?id=8809215

I'm not sure I like Unified Call Syntax. I think myvar.len() should be different to len(myvar), even if they end up returning the same value. Then again, I'm probably missing some key reason why it's useful.

Adding my own optimizations to the compiler sounds useful enough. I might see if I can implement this in the language I'm making.

I copy/pasted the code in the article and compiled each one. I am getting very different ratios compared to the ones you posted:

  tmp$ cc -Wall -O2 test.c -o test && time ./test
  real    0m1.041s
  user    0m0.998s
  sys     0m0.029s

  nim-0.10.2$ ./bin/nim --opt:speed c test.nim && time ./test
  real    0m1.943s
  user    0m1.894s
  sys     0m0.036s

  $ gcc -O2 -std=c99 -lm -o c c.c
  $ time ./c
  ./c  2.49s user 0.07s system 99% cpu 2.567 total
  $ nim -d:release c nim
  $ time ./nim
  ./nim  2.47s user 0.07s system 99% cpu 2.538 total
Yes, --opt:speed is slower indeed:

  $ nim --opt:speed c nim
  $ time ./nim
  ./nim  3.02s user 0.07s system 99% cpu 3.095 total

Ah -d:release made the results almost identical. Thanks!

Try compiling with -d:release instead of --opt:speed.

What I don't like about Nim is that there is no REPL. In Lisp and Clojure you don't need to compile things.

Could someone give a brief comparison between Clojure and Nim? Suppose we implemented Clojure in Nim - is that a crazy idea? What about implementing Shen in Nim? Many people complain that SBCL is not easy to use with C++ libraries; could an implementation in Nim ameliorate those problems? Many more questions arise, but those mentioned are enough.

Well, there is a REPL, but it's not that great:

    $ nim i
    >>> for i in 0..10:
    ...   echo "Hello World"[0..i]
    Hello W
    Hello Wo
    Hello Wor
    Hello Worl
    Hello World
To use up/down keys, build the Nim compiler with "./koch boot -d:release -d:useGnuReadline" or run "rlwrap nim i" instead.

Edit: It's even buggier than I remembered, best avoid it and use "nim -r c file" instead.

What bugs or things-that-are-not-great have you run into with Nim's REPL?

It seems that you can't use any libraries. After using import it doesn't work anymore.

The closest thing to what you're asking about is Pixie [1], which is a Clojure-like language implemented in Pypy's RPython.

[1] https://github.com/pixie-lang/pixie

It'd be possible to implement a Clojure in Nim; you'd just have to port the data structures and enough primitives to bootstrap the language core (see core.clj, ClojureScript's core.cljs, Pixie's stdlib.pxi).

Nice article, although I couldn't really understand some of the examples that had templates in them.

Agreed. When the manual says you can pattern match with term rewriting macros, I think of Erlang pattern matching. Would something like this be possible?

    template mysum{a = @[]}(a: seq[int]): int = 0
    template mysum{a}(a: seq[int]): int =
      while a.low <= a.high:
        a[a.low] + mysum(a[a.low + 1 .. a.high])

    var x: seq[int] = @[1, 2, 3]

    echo mysum(x)

Term rewriting templates are only supposed to be optimizations, not change the semantics. They happen on the AST, so the pattern matches are on the AST as well. You won't get the actual values of any variables, they aren't even known at compile-time anyway.
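To make that concrete, here is a sketch adapted from the pattern syntax in the Nim manual (the `optMul` name and the example are illustrative):

```nim
# Term-rewriting template: asks the compiler to replace any
# expression matching `a * 2` with `a + a` - an optimization
# hint, not a semantic change.
template optMul{`*`(a, 2)}(a: int): int = a + a

proc double(x: int): int =
  x * 2  # candidate for rewriting at compile time

echo double(21)  # 42 either way
```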
