Hacker News new | comments | show | ask | jobs | submit login
Nim (formerly Nimrod) 0.10.2 released (nim-lang.org)
199 points by def- 1028 days ago | hide | past | web | 140 comments | favorite



All the hype on HN seems to go to Rust and Go (with Go having its share of haters), but Nim sounds particularly interesting as a side-step from Python that runs faster than both Go and Rust.

Can anyone give me some insight as to why it's not as hyped, if at all? I'm rather unfamiliar with Nim, and it seems to have a smaller community, but it feels like it should be getting all sorts of love considering its speed of execution.

Thanks in advance for the help!


Actually it's being hyped that's weird. There are literally dozens of very interesting languages being created and developed right now and I bet you don't know any of them. The only languages - and they are very few in comparison - that get hyped from very early in their life cycle are the ones backed by corporations. Both Rust and Go are such languages and that's basically it.

"Normal" languages live on a different time scale: it takes many years to gather enough users to even have a chance of generating a bit of hype. For most languages that moment never comes: they are good, solid languages with decades of development behind them and completely obscure and unknown. Others become well-known in certain areas while still being unfamiliar to the "general public".

Take a look at Haxe, Opa, Elixir, Cobra or Io for example. They are all very interesting (each for its own reasons), mature and productive - and completely unknown. Out of those only Elixir gets mentioned here sometimes.

Anyway, a choice of language should never be based on "hype" alone. Of course, language popularity is an important factor, but most often limiting yourself to popular languages is a case of premature optimization: it's quite likely that in your project the language popularity won't make a difference (especially in small/short-lived projects) and that "raw" language features are more important than hype. (Of course that's not always the case: bigger or long-lived projects benefit from language popularity very much, for example by having easier time finding replacements for developers who went away and so on)


I personally wouldn't label Haxe as "completely unknown" - as a general-purpose language, yes, but it's relatively popular in game development due to its "killer app" of OpenFL.


One of the things that kills some of these "completely unknown" languages is that it isn't obvious they are still being developed. The last known release is years ago, or they still only support one platform, etc.


I'll add http://pike.lysator.liu.se/ to that list of "unknown" languages - It's my go-to language for doing simple prototyping.


Right, those were just examples, there are many (many!) more interesting languages.

Pike is a bit of a special case due to its history though. I first learned it in the '90s, when I joined an effort to port some MUD from LPC - Pike's predecessor - to Pike. At the time both LPC and Pike looked very weird to me, coming from mostly static, compiled languages like C and Pascal. A "mixed" datatype (like void* in C) with type predicates was something new to me, built-in mappings and other high-level datatypes were too. I don't remember the details, but I think both Pike and LPC had an unusual object system, closer to prototype-oriented languages like Self, JS, Io or Lua than to "normal" class-based ones (but I may be wrong on this). Anyway, it's another good example of solid and mature but unpopular (to the point of obscurity) languages.


Last time I checked, Io was slow and it didn't solve any new problems. If I wanted a slow OO language, I would rather keep using Ruby/Python as I already know them.

Haxe has the small niche of Flash-style games, and I think it is doing fine in that area. But expecting it to be as popular as a general-purpose language is not warranted.


Haxe is a general-purpose language. It compiles - among other things - to the Neko VM, which is a really nice little VM, good for rapid prototyping, and to C++ code, which is good for everything. The latter is how Haxe supports iOS, IIRC.

Io is not "slow", although the level of dynamism it provides has obvious performance downsides. It's generally "fast enough" and - in my experience - a bit faster than CPython. Slower than JITed Lua though. It does, however, solve the C10k problem with its coroutines, actors and futures, which are implemented transparently for the user and with niceties like automatic deadlock detection (at runtime). It's not exactly anything new, but it was new when Io first appeared (I mean the implementation as a language feature, not the concept, which is probably 3x as old as even I am). It also has a very nice C FFI and a very flexible, easily modifiable syntax with optional lazy evaluation, which makes it great for developing DSLs.

That being said I don't "expect" these languages to be popular. For the most part, for most developers out there, it genuinely makes no sense to know them. I'd like them to be more popular, but I'm not stressing over this. I know them and I think I can use them to my advantage where it makes sense and that's enough for me personally.


But the thing is that Haxe doesn't really solve anything other than being good for Flash games. So even if it is Turing complete, there is no point in spending time learning it if you do not care about making Flash games.

And last time I checked, Io was slower than CPython and offered nothing more than a little more ideological OO purity. It didn't have any killer feature.


> Actually it's being hyped that's weird.

The economic realities of software development require disposable human resources.


And this is why most large software projects fail. Software development is not something 'disposable human resources' can do reliably.


I agree. Software development is (as of now) in its arts and crafts stage of development. Expertise matters. Experience matters. Professionalism matters.

The "economic realities" are /symptomatic/ of the impedance mismatch of an 'industrial scale' demand on the consumer side and an arts and crafts production side.

> most .. projects fail.

It is a minor miracle that they can even be attempted and many even succeed.


Another unknown language, in which I program, is Pliant www.fullpliant.org


The main reasons, I think, are that Nim is not "production ready" and that other developers think of Nim as Andreas Rumpf's baby.

In my opinion, this happens when a single author is responsible for most of the implementation of a language.

Another example of this is the Crack language, which is still essentially controlled by one person, despite people declaring it "ready for writing useful code".

Two things make non-commercially funded languages successful. Firstly, they have a team of core developers (even if that is just 3-4 people), who often disagree with one another. Secondly, ad hoc contributions, which don't quite fit with the original aims for the language, are encouraged so long as they don't revert important core design decisions.

Julia and D are prime examples of successful languages along these lines. They both have more than one core developer, and they both have the feel of an ecosystem with many use cases, rather than a very focused language with strict design goals.

Experts seem to agree that both Nim and Crack are very competent, interesting languages. So I believe they have a very bright future when the main developers are happy that the core language is just about right, and when the guiding principles are understood well enough that a core group of developers can continue that vision.

Note that what I am saying has nothing to do with the main authors encouraging external contributions. In the case of both of the languages I mention, they both do that actively. And Nim certainly has other developers actively contributing.

Followup: I want to explain why this is so important. As a developer looking to use Nim (or any other language for that matter), I want to know the language is production ready, so that my code won't break. And if I have issues that need resolving in Nim before my project works, I want to know that I am not waiting on one individual to fix them. I need to know there is a core group of identifiable people I can go to who can answer my queries, escalate my queries and ultimately resolve my issues. Of course these things are also important for the members of the Nim development community itself, not just users of the language.


While Nim has mostly been developed by Andreas, there seems to be about 3-4 people working on the compiler and standard library. I've contributed a bit myself, and I've found Andreas and the other developers very friendly to outside contributions.

I am very optimistic about the Nim community. It's still small, but it is open and friendly, so I think all the potential for growth is there, especially now that the compiler itself is reaching maturity.


I think you hit the nail on the head. I've been following the Nim forum and the IRC channel for a while and, while I fully agree with dom96 about Rumpf not being the only developer, still find the difference with communities like Julia (which I closely follow as well) sometimes staggering. The hypothesis that the lack of success of Nim is due to its not being supported by some organisation like Google or Mozilla does not explain why Julia has many more contributors and its development proceeds at a much quicker pace. AFAIK there is no such organisation behind Julia's development.

I've more than once seen discussions on the Nim forum die a sudden death after Araq joined and said something against the idea being discussed. I've never seen anything like that on the Julia mailing list. Like wbhart, I think this happens mainly because there isn't just one core developer.


From what I can see Julia has a lot of academic institutions behind it. Take a look at the contributors and you will see lots of mention of MIT and Harvard. I may be wrong but I am assuming that those institutions are sponsoring the development of Julia. Nim has no such support from anyone!


I doubt academic institutions "sponsor" programming languages, other than paying the salary of some of the people who happen to be writing them.

Where is Rumpf? MIT?

http://web.mit.edu/nimrod-lang/arch/i386_linux26/doc/tutoria...


I'd say that the academic institutions may be paying some of these people to develop Julia.

As far as I know, Andreas Rumpf does not work for MIT or any academic institution if that's what you're implying. It is rather curious that the MIT website has a copy of Nim's documentation.


I'm not really implying that. Somewhere or other it says Andreas is working for a top secret startup and is always trying to start his own.

What I'm saying is that it can be misleading how much academic institutions are "behind" academic projects. If you talk to most academics, you'll discover that they believe their academic institutions are doing everything they can to discourage their projects (academics can be somewhat sarcastic).

Some of the Julia core devs are academics and write papers on Julia, yes. Their salaries are paid by those institutions. They may have students who can do some coding for them (so long as they write a thesis too). But being an academic is a very demanding lifestyle. With few exceptions, you are paid to write papers, teach, do administration and apply for grants, not to write code.


I think it's more likely that a couple of the developers are being paid to write Julia than it is that they are all working on it in their spare time.

While it may be a demanding lifestyle, you are also paid to do research. And research could include developing new programming languages.


Design by committee is inferior (too many cooks).

But most good maintainers I know listen to critique of the people who use their stuff.

More devs are good for another reason than disagreeing: if one doesn't work on the code anymore, there are others who can.


I agree with you. One needs a group of like-minded individuals who understand and agree on the core design goals, but who have individual say in what they implement. Nim isn't being designed with an inferior model, it is being designed with an inefficient model. With three or four core devs, the work goes 3-4 times faster (slight exaggeration of course). That means a language can become popular in 5 years instead of 15. This is why companies love teams so much. A team that works well together is worth many developers who work well alone.


While Andreas is still doing most of the core development, it is not true that he is the sole developer of Nim. I am personally involved with a lot of work on the standard library and there are two others who also work on the compiler. We all have a say in the design of Nim. In addition to that, I have built a lot of software for Nim programmers, in Nim, such as Nimble and the Nim forum.


I was vaguely aware of this. But where are the major design decisions publicly discussed?

If you look at the Julia issue tracker, you'll see an enormous amount of discussion between and with core devs about language design decisions.

It's good to have someone primarily responsible for the compiler, someone responsible for libraries, another for GC, another for the package manager, etc. I'm not suggesting every core dev needs to be intimately involved with every line of code or design decision. It's more about how the development process feels to outsiders. After all, the question that was asked was about why Nim is not being hyped, rather than why is it not a fantastic language (it is fantastic).

It is great to hear about two other (albeit nameless) individuals working on the compiler though.


I will agree that discussing programming language design decisions in the open is something we have to work on. IRC unfortunately is not ideal but it is where most of these discussions take place. I am personally trying to encourage people to write RFCs on Github or on the Nim forum and there are some already there (https://github.com/Araq/Nim/labels/RFC).


I don't know much about language design, how tightly are syntactical/grammatical decisions coupled with other stuff like GC?

I understand that if the things the language uses are implemented in the language itself, then changing the language design breaks them. But if not?

A language like Java could probably work without GC, give the dev a "free" method and be done with it. The whole language would look the same syntactically.


Most discussions about major things occur in IRC in #nim on freenode.


> With three or four core devs, the work goes 3-4 times faster (slight exaggeration of course).

Fred Brooks would like a word...

More seriously, this is the important part:

> like-minded individuals who understand and agree on the core design goals,

A team who agrees on the big picture, but can work together to sort out the small details, is really important.

There's also balances between doing things quickly and doing things right. Sometimes, slowing down a bit gets you a significantly better end product. Sometimes it's just a slowdown.


For me the feature that makes Rust stand out is memory safety without garbage collection. It's got all the goodness you'd expect from a modern programming language while giving me basically all the flexibility I could get from writing C or C++. That's a very strong differentiating trait, at least for me.

I've never used Nim and I should probably give it a try but it seems that the "garbage collected, parallel oriented" niche is somewhat more cluttered with the likes of Go, the various languages built on top of .Net or the JVM, the lisps, the haskells... Not all target exactly the same use cases but there's a lot of overlap here and it's harder to stand out.


Nim is different. Advanced but pragmatic type system, macros, native compilation, tiny runtime, good C interop - no language I know of offers quite the same set of features. I was very pleasantly surprised by it and decided to keep it as a "tiny, expressive, natively-compiled" solution in my toolbox.


There is nothing quite like Nim in the other languages you mention. Nim's GC is different from most of the others and you can also control it so it only runs for a certain amount of time. Nim is also very expressive and clean. The meta-programming capabilities are fantastic. Its overhead is significantly lower than the JVM's and it is cross-platform (unlike .NET).


I think the Nim author needs to document/explain/illustrate a lot more about its GC. I was once interested in Nim, but I couldn't figure out whether its GC is suitable (predictable and deterministic rather than throughput-oriented) for real-time apps, and that dropped my interest in it. I believe this is the biggest obstacle to real-time app programmers taking an interest in Nim.

Nowadays the term "GC" almost implies "tracing GC", so a "non-tracing GC" needs to be well explained, or some new term needs to be adopted to explain the new concept. Currently, Nim's GC explanation just looks like an incremental real-time GC, which is not really attractive to existing real-time app programmers.


Have you taken a look at the GC documentation (http://nim-lang.org/gc.html)?


The reason, in my opinion at least, is that Nim does not have a commercial company behind it like Go and Rust, which have Google and Mozilla respectively.

This in turn means that Nim does not have as many resources (it is developed completely voluntarily in people's spare time) so development is slower.


Yep, that's basically true. If the development of a language like Python started at about this time, I bet it would be in the same position as Nim.

But how can you get a major company to get behind Nim when they all want to have their own programming language?


Maybe you can't. But you can get the small companies and individuals first. Slowly the contributions make the ecosystem stronger and bigger and risk-averse organizations then can hop on as well. This is pretty much how Python grew.


Rust and Go have big backers behind them. This generates a feedback loop of hype leading to toy projects and thought leadership, which subsequently turns into real adoption. If it weren't for that they would be maturing at about the same rate as Nim.


Adoption is greatly helped by hype, but maturity is something different. Rust and Go have a number of paid developers working full time on the implementation. This helps maturity and is unrelated to hype.


Maturity in the compiler != ecosystem maturity. Picking up the broader mainstream also means external tooling (IDE, debugger, etc.), library code, documentation, examples and tutorials, live events like conferences and lectures, and "friends who use it" (critical mass in localized communities).

When you miss some of these elements, the barriers to entry go way up and the community is limited to people who can tolerate the environment without them. That's true regardless of how good or proven the base technology is. Hype and institutional use accelerate this cycle by forcing it onto developers - you end up with more people early on who say "yes, I put this into production and I can tell you how it worked."


Corporations have become better at promoting technologies to developers, including worse technologies. Languages are seen as strategic advantages against the competition ("A language that only works on our devices!"). So I would bet corporate money finds its way to tech blogs and user groups, to bring the kool-aid to the masses. Also keep in mind that big corporations like Google are dominant entities and anything they do is closely watched by peons. For the success of BigCorpLang they need their toy projects anyway.


Google and Microsoft are better at doing stuff like that than others. I wonder why Apple has taken that approach with regards to Swift


If by "that approach" you mean not developing it in the public, that's always been Apple's strategy. They benefit from developers being locked in to building for their platform.


>why it's not as hyped

I don't know how relevant that is, but Rust has Mozilla (and Samsung?) and Go has Google plus some Unix / Plan 9 people behind it. But Nim? There are probably other factors as well.


By hype, do you mean fewer people using it? Can't really submit articles and projects for Nim that don't exist. If you like the language, why not write something in it then blog about it?


There is no technical reason why Rust should be slower than Nim, as far as I know.


Nim has the advantage of compiling to/through C, for which compilers have been optimized for a long time.


That isn't an advantage, IMHO. rustc generates LLVM IR, which allows it to take advantage of essentially all the same optimization passes that clang does. Generating LLVM IR instead of C has some advantages—debug information can be made more precise, precise aliasing metadata for optimization can be added to the output, accurate garbage collection is possible, there's no need to go through the C parser and semantic analyzer, etc.

Examples of features that rustc uses today for optimization beyond those available in C are the "nonnull" and "dereferenceable" attributes (along with some custom optimization passes to make non-null pointer optimizations stronger than what LLVM provides out of the box). An example of a feature beyond C that it could use, but does not today, is custom type-based alias analysis.


I'm currently working with the Xeon Phi and there you basically have to use Intel's C, C++ or Fortran compiler if you want good performance out of the box. Since Nim compiles to C I can write code that gets auto-vectorized and -optimized by the Intel compiler.


Perhaps it's not an advantage, but it certainly is not a disadvantage. An advantage that I can think of is portability as well as easier interfacing with C libraries.


Almost all systems that matter have an LLVM port these days, so portability isn't really an issue in practice (though sometimes C can help with microcontrollers, etc.) Game consoles in particular are all x86 or PPC these days. Easier interfacing with C libraries is a fair point, although you still have to figure out how the two languages' type systems interact and presenting good error messages can be a pain if you're relying on the C compiler to do semantic analysis.


I disagree on optimisation, but agree on portability. It's the difference between 100% and 99%, but sometimes the 1% can be significant. And of course, it also can be insignificant.


Rust uses LLVM as a backend so it gets similar optimization passes as clang, for instance, so it's not so different. Basically rustc generates LLVM IR where Nim generates C, apparently. In most of the benchmarks I've seen so far you could expect similar performance between C and Rust.

Not to start a language flamewar but I'd like to see hard numbers before I believe that in the general case Nim is faster than Rust.



I have not, I mostly glanced at http://benchmarksgame.alioth.debian.org/ where Rust generally matches C performance within a few %. But Nim is not part of this benchmark apparently.

I haven't looked at your link in detail, but the fact that one C++ implementation manages to outperform all the others (including other C and C++ implementations and of course Rust and Nim) by more than one order of magnitude leaves me perplexed. It means there must be a huge room for improvement in a lot of those tests. Also the timing code is implemented in the tests themselves using each language's primitives, which seems a bit risky to me when comparing language performance.

Also, the unsafe Rust version has exactly the same performance as the safe one, which is weird. After looking at the unsafe version it appears that all the unsafe blocks are just there to remove the bound checks on the vector indexing and that's it. That seems like a weird choice for optimization; I'd have tried to make the recursive function iterative first, for instance. At least it goes to show that that particular check is not too expensive and it's better to stick to safe code when possible.


>...one C++ implementation manages to outperform all the others ... by more than one order of magnitude

As does the cached Javascript version. All of the implementations in the path benchmark use, by fiat, a naive algorithm (recomputing the cost for each recursive traversal). The cached C++ and Javascript versions are meant to show that algorithm choice is much more important than language choice (they do some memoization of the cost calculation).

https://github.com/logicchains/LPATHBench/blob/master/jscach...


> The fact that one C++ implementation manages to outperform all the others (including other C and C++ implementations and of course Rust and Nim) by more than one order of magnitude leaves me perplexed.

This uses a "pruning" algorithm and is a different algorithm than all the other implementations use. It shouldn't be used to compare the different languages, as it's just not doing the same work.


Rust uses LLVM, I believe, and LLVM does as good a job optimizing as just about anything else out there.


Go and Rust are included in the Computer Language Benchmarks Game: http://benchmarksgame.alioth.debian.org/

Nim, and many of the other interesting languages mentioned in this thread are not.

I'm not sure if this is a cause or effect.


https://github.com/def-/LPATHBench/blob/master/nim.nim

Statistics (on an x86_64 Intel Core2Quad Q9300):

    Lang    Time [ms]  Memory [KB]  Compile Time [ms]  Compressed Code [B]
    Nim          1400         1460                893                  486
    C++          1478         2717                774                  728
    D            1518         2388               1614                  669
    Rust         1623         2632               6735                  934
    Java         1874        24428                812                  778
    OCaml        2384         4496                125                  782
    Go           3116         1664                596                  618
    Haskell      3329         5268               3002                 1091
    LuaJit       3857         2368                  -                  519
    Lisp         8219        15876               1043                 1007
    Racket       8503       130284              24793                  741


That looks amazing. One would expect it to be slower given all of the interesting language features.

I've been wanting to learn one of the up and coming languages (Rust, D, Go, etc.). Would Nim be a good choice?


If you write C/C++ for a living but want a nicer syntax and better language features, then yes, absolutely. If you're looking for a specific niche, then it depends. I'm personally more excited about Rust than Nim, but I also don't do much systems programming these days, so I don't have a specific purpose in mind.


Why are you more excited about Rust?


Probably depends on what software you would like to develop and what languages you already know and enjoy.


Well I've been writing Python for quite a while and I'm currently getting into both Scala and JS/Node.


Not to mention, Nim still doesn't have a Wikipedia article because it is "not significant". http://en.wikipedia.org/wiki/Nim_(disambiguation)


I can't see where it says Nim (programming language) is "not significant". Maybe the reason there isn't a Wikipedia article is simply that nobody with a Wikipedia account has heard of Nim or thought to create an article on it. The solution is quite simple: go create an article.


It used to have a page. It was deleted because the language was not considered significant. This is a relatively famous case, referred to frequently.


https://en.wikipedia.org/wiki/Wikipedia:Articles_for_deletio...

> Lacks reliable independent secondary sources to establish notability as required by WP:GNG. Every source is WP:PRIMARY. Every one of them. Googling turned up posts to online discussion forums but nothing useful. Additionally, I note that the decision to delete at the previous AfD was unanimous for the same reasons.

> Perhaps think of it this way: a language becomes notable when people who haven't been involved in its creation start writing about it. If/when this language gets to that point you'll have no problem creating an article. At the moment, though, there just hasn't been enough uptake to get the coverage we need for notability.

The problem is that most sources about Nim are by Araq or not reliable enough.


I really dislike this policy when applied to programming languages. Things from pop culture are going to be referred to pervasively in the media and blogosphere, but programming languages don't get there without massive marketing pushes, press releases, etc.

That doesn't mean a language is not itself notable.

Anyway, there's plenty about Nim that is not primary:

http://goran.krampe.se/2014/10/13/here-comes-nim/

https://www.btbytes.com/notebooks/nimrod.html

http://picheta.me/articles/2013/10/about-nimrods-features.ht... (by Dominic Picheta but external to Nim website)

http://steved-imaginaryreal.blogspot.co.uk/2013/09/nimrod-re...

http://vocalbit.com/posts/exploring-type-classes-in-nimrod.h...

http://blog.ldlework.com/a-cursory-look-at-meta-programming-...

http://joshfilstrup.com/posts/2014-10-27-2014-monads-in-nim....

http://ziotom78.blogspot.de/2014/01/experiments-with-nimrod....

http://progopedia.com/language/nimrod/

https://geetduggal.wordpress.com/2014/03/03/consider-nimrod/

http://www.drdobbs.com/open-source/nimrod-a-new-systems-prog... (by Andreas Rumpf - but definitely an indicator of notability)

http://lambda-the-ultimate.org/node/4749

http://ivoras.net/blog/tree/2013/Oct-2013-10-05.what-i-like-...

http://www.infoq.com/presentations/nimrod (by Andreas Rumpf, but a sign of notability)

http://rosettacode.org/wiki/Category:Nimrod

http://gradha.github.io/articles/2014/11/swift-string-interp...

http://togototo.wordpress.com/2013/08/23/benchmarks-round-tw...

http://felsin9.de/nnis/nimrod/nimrod-gpn14.pdf

http://learnxinyminutes.com/docs/nim/

http://stackoverflow.com/questions/tagged/nim?sort=votes&pag...

http://gradha.github.io/articles/2014/03/nimrod-for-cross-pl...

https://impythonist.wordpress.com/tag/nimrod/

https://github.com/trending?l=nimrod

http://maniagnosis.crsr.net/2013/12/letterpress-cheating-in-...

That doesn't mean that the best resources aren't on the Nimrod website itself. But penalising a language for having excellent primary resources would be a bit crazy in my opinion.


I just discovered that blogs and other self-published resources may not be used to establish notability on Wikipedia.

Basically Nim cannot ever become notable unless there are press releases about it or peer-reviewed papers written on it. And without a company like Google garnering/writing press releases, and with none of the authors of Nim at academic institutions... Well, this is awkward.


Too many programming languages, too little time; and no one seems willing to make and publish measurements for all those "interesting" languages.

http://benchmarksgame.alioth.debian.org/play.html#languagex


The release is happening today, so I posted a bit early. Some parts, like the binaries, are not online yet.

There are tutorials for newcomers: http://nim-lang.org/tut1.html https://nim-by-example.github.io/

Edit: Fixed tutorial link


First, congratulations on the new version. Been playing since 0.9.2 and I hope to try this latest version too.

But, please make a zipped portable version available for Windows. You don't have to include a binary version of Nim (although that would be nice) but do put the required MinGW pieces in a subdirectory under Nim as was done with earlier releases. This allows unzipping to a thumb drive, xcopy installs, and using Nim anywhere without having to permanently install on specific systems. Also, it makes things really easy for a quick demo when showing someone else Nim on their machine.

(FWIW: The sad state of Windows is that almost every installer out there, MinGW included, means crapping in your system to some extent and never knowing what went on nor being able to remove all traces of what was installed. With each installed application you have to make a conscious decision whether it's worth the unpredictable OS pollution to install the application on a particular machine ... because you know with each install that you're headed to that inevitable last one where you have to wipe the entire OS and start over.)


>(FWIW: The sad state of Windows is that almost every installer out there, MinGW included, means crapping in your system to some extent and never knowing what went on nor being able to remove all traces of what was installed. With each installed application you have to make a conscious decision whether it's worth the unpredictable OS pollution to install the application on a particular machine ... because you know with each install that you're headed to that inevitable last one where you have to wipe the entire OS and start over.)

This is a huge point, and a huge downside of Windows, even in recent versions. E.g. some time ago I installed some utility EXE and suddenly my Skype stopped working in a really weird way: the UI itself changed, and IIRC it no longer showed the login screen, though the app did start. I could not use it since I had no way to log in, and had to roll back to a previous system restore point to undo the damage. The Windows registry (being a single point of failure [1], plus not getting cleaned out or updated completely on installs/uninstalls due to buggy (un)installers) and DLL hell [2] are major causes of this.

[1] http://en.wikipedia.org/wiki/Single_point_of_failure

[2] http://en.wikipedia.org/wiki/DLL_Hell


It looks like mingw provides a non-installer version of the binaries, although it is (unintentionally) well hidden in the web. http://sourceforge.net/projects/mingw-w64/files/Toolchains%2...


I see that, thanks. It's a zipped dump of all pieces for all OSes. If you're a Nim dev, please don't let this dissuade you from eventually releasing a portable "ready to go" version of Nim for Windows.

I know myself and my compatriots bail at any language that forces Cygwin or MinGW and has its own installer on top of that. I'll never get to try OCaml, Haskell, Kitten [which looks really interesting but relies on Haskell], and a number of other new languages just because of the install burden.

IMO, if trial and adoption are your goals then as little friction as possible to get to "hello world" is a good route. (And an unzip to the directory of your choice then a double click on "Test_Hello_World.bat" are about as good as you could get.)


It's not the same thing at all, but could you use Virtualbox to run a Linux VM on Windows, and play around with languages in that?


I'd like to clarify that http://nimbuild.nim-lang.org/tut1.html is for the devel branch. You probably want to use http://nim-lang.org/tut1.html if you're going to use the newly released version.


Something I find quite interesting about Nim is write tracking: http://nim-lang.org/blog/writetracking.html. It uses the language's effect system, and allows not only specifying for instance that a function is referentially transparent, but also specifying how an impure function accesses or modifies global state.

An example from the above link:

    var gId = 0

    proc genId(): natural {.writes: [gId].} =
      gId += 1
      return gId
To quote the article: "Here the effect systems shows its strength: Imagine there was a genId2 that writes to some other global variable; then genId and genId2 can be executed in parallel even though they are not free of side effects!"

The effect system can also be used for exception tracking, specifying what kinds of exceptions a function may throw. It can moreover be extended to work with user-defined effects.
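For the exception-tracking side, a minimal sketch using the `{.raises.}` pragma from the Nim manual (`parseLine` and `safeDouble` are invented names):

```nim
import strutils

# The pragma declares that at most IOError and ValueError can escape;
# the compiler rejects the body if any other exception could be raised.
proc parseLine(f: File): int {.raises: [IOError, ValueError].} =
  let line = f.readLine()   # may raise IOError
  result = parseInt(line)   # may raise ValueError

# An empty list means the proc is statically proven exception-free:
proc safeDouble(x: int): int {.raises: [].} =
  x * 2
```

If `parseLine`'s body could raise anything outside the declared list, compilation fails, so the annotation is checked rather than merely documented.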


A lot of features in Nim show a similar kind of designed pragmatism. It's big, and has a lot of stuff that you just won't use in everyday situations, but the uncommon features tend to hit really specific engineering problems that do come up in the real world. I like it a lot better than what I see coming out of Rust, at least at this moment.


Out of curiosity: why is it hard for compilers to keep track of this automatically? At least, my gut feeling says that if it had been easy to do, this would be a solved design problem in newly developed languages by now - at least the statically compiled ones like Nim.


For simple cases (such as the example in the GP comment) a compiler can easily infer those annotations. Complications are introduced in languages which allow aliasing. Precise alias analysis has been shown to be NP-hard [1] (as are many other compiler problems), and thus in general compilers must make very pessimistic assumptions to avoid making incorrect optimizations, such as allowing two functions to execute in parallel.

Other issues crop up due to the open-world assumption. This is especially true in the presence of separate compilation when trying to reason about global behavior. Example: does function f() in file f.c modify global variable g in file g.c? The compiler may not be able to prove that one way or another at compile-time (consider if file f.c was already compiled to object code f.o, and only g.c was being recompiled), and so it must assume that f() may modify g. This particular example can be solved with link-time optimization, but you get the idea of how complicated the real world can get.
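To make the aliasing point concrete, a small illustrative Nim sketch (names are invented):

```nim
var g = 0

# bump writes through its parameter; which memory location that
# touches depends entirely on what the caller passes in.
proc bump(x: var int) =
  x += 1

proc caller() =
  # Here x aliases the global g: caller effectively writes g even
  # though its own body contains no assignment to g. A compiler
  # inferring a "writes" set for caller must track that aliasing.
  bump(g)
```

With deeper call chains and pointers, tracking what may alias what quickly becomes the hard part.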

[1] Horwitz, "Precise flow-insensitive may-alias analysis is NP-hard," TOPLAS 1997.


Nim is very exciting to me. For a long time I've wanted to develop a viable HDL as an internal DSL in a modern imperative compiled language. Until I found Nim I couldn't find anything beyond C++ that would work (Julia might actually, I haven't tried yet), but Nim is just absolutely perfect. The custom operators, static params (a weak form of dependent typing), and the macros in particular makes Nim a perfect match. It also helps that the compiler is itself written in Nim and pretty easy to work with. It's exciting when a language allows you to solve a whole new problem that was just not feasible before.


Hopefully one day the freepascal backend gets updated. There is a lot of Pascal/Delphi code out there that could benefit from Nim's improvements (either rewritten in Nim, or with Nim calling the legacy Pascal).


Perhaps that day will come. This project was recently started https://github.com/Araq/nim2pas


So far, Nim has passed the most basic test for me: it installs according to instructions and compiles its own example code. I'm always baffled by the number of language releases that don't reliably do that.

Nim looks like a worthwhile language, and I'm looking forward to learning it.


Good for you! When I tried the test last week, the installation instruction was outdated, so that became my first contribution to Nim. :) I suspect most of your baffling cases are simply the case of outdated documentation.


Right — and that's what baffles me. If you want people to adopt your language, you certainly want to have your documentation of the very first step be complete and up-to-date. Thank you for making sure that is the case with Nim.


As a long time Python programmer who has struggled to cross the gap from dynamic scripting languages to modern statically typed languages, Nim is by far the most frictionless language I have tried. Before I found Nim I longed for a language like C#, where the generics "just work" and the language feels like it was designed all at once rather than piecemeal over time. Everything just seems really "nice" in C# and I am able to transfer my Python experience over to it. But being locked into .NET/Mono, I never really used it outside of Unity3D.

I tried Golang, because it was sold to me as something I would love since I am a long time Python developer. I strongly dislike Golang. It doesn't have much in the way of letting me model my programs the way I am used to. I am told "that's wrong, do it the Go way". This is too much friction. Once I am done thinking about how to solve my problem algorithmically, I do not want to then figure out how to rethink my algorithm just for the sake of the maintainers of Golang.

I tried Rust. I think Rust is beautiful (mostly). However, Rust has a far too fundamentalist view on memory safety. And that's not to downplay the importance of memory safety. But there's just too much friction. I want to sit down and implement my algorithm. I don't want to stop and spend just as much time thinking about the particulars that Rust demands.

When I found Nim I almost couldn't believe it. The core language was simple, clean and immediately absorbable. I was able to start writing basic Nim programs after just perusing the language docs for a few minutes while the compiler compiled. I read that Nim had powerful meta-programming facilities and this started to turn me off. I had heard that macros were generally a negative force in the world but only knew them from Lisp. Then I learned that Nim's macros are just functions that take an AST, perform modifications to it, and return an AST. Wow, that's pretty simple. Oh hey, the generics "just work" like in C#. Woah, Nim even supports inheritance!
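That description can be made concrete with a tiny sketch (modern Nim spells the parameter type `untyped`; `debugDump` is an invented name):

```nim
import macros

# A macro is just a compile-time proc from AST to AST.
# This one rewrites debugDump(expr) into: echo "expr = ", expr
macro debugDump(e: untyped): untyped =
  result = newCall(ident"echo", newLit(e.repr & " = "), e)

debugDump(2 + 3)  # prints: 2 + 3 = 5
```

The macro body builds a new call node out of the textual form of the argument plus the argument itself, and the compiler splices that node in at the call site.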

Nim is definitely the next language for me. In thinking about it, I find that I agree with one of Jonathan Blow's sentiments: we have been seeing a number of new up-and-coming languages, but they are all Big Idea languages, whose ideas have yet to be vetted and proved out over the course of a decade or two. They all incur too much friction.

Nim seems like a competent, low-level systems language with a helpful repertoire of well-implemented modern refinements of features that are mostly established in the field. It doesn't try to revolutionize systems languages. It offers a modern refinement: a highly performant yet relentlessly productive take on what has already been shown to work.

Please don't be offended if you see me around evangelizing its existence.


As I've said before: for me Nim seems to be everything I had wished Go was.


Looks fascinating. Have you used it in any real projects yet?


LOL... net-filters in my job mark this site as porn...


I don't know why you got downvoted, same thing happened to me.


Perhaps if he didn't start his comment with "LOL" it wouldn't have been downvoted. That said, somebody on IRC mentioned that the website is blocked on Norwegian train's wifi so perhaps this problem is widespread. Any ideas where these blocking services get their black lists?


I'm curious, what's wrong with "LOL"? I thought that was the universally accepted way of expressing amusement on the Internet? Would "haha" have been more appropriate on HN or is expressing amusement generally frowned upon? I apologize for being ignorant of this matter. I'm not quite yet acquainted with proper self-censorship protocols necessary to communicate on this site. It seems rather strict but I honestly don't find that to be a problem as long as the cultural rules are well defined.

As for the topic at hand: I believe "nim" and "nimrod" are keywords that might be flagged for inappropriate content. They are similar to "nimph", "nimphomaniac", etc.


At my location, it looks like this is the service:

http://sitereview.bluecoat.com/sitereview.jsp?url=http://nim...


Thank you. I submitted a request to review it again. Might help if you do the same.

Edit: They reclassified it, should be unblocked now.


Is the JavaScript back-end receiving any updates? Last time I looked at it it looked like it was lagging behind (emphasis in "looked like", I did not spend too much time with it).


>The unpopular "T" and "P" prefixes on types have been deprecated

I really like that change. Those always looked so archaic to me.
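For anyone who hasn't seen the old convention, it looked roughly like this (illustrative names):

```nim
# Old convention (now deprecated): T for value types, P for ref types.
type
  TPerson = object
    name: string
  PPerson = ref TPerson

# New convention: plain names, with `ref Person` used where needed.
type
  Person = object
    name: string
```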


Anyone knows what kind of project would nim be a good choice for at the moment ? I mean both the language feature as well as available libs. I think i've read somewhere that web server coding isn't a target yet (at least regarding websockets use, which was my primary concern at the time i looked).


Well, I'm going to be starting on a project involving Arduino and pretty LEDs soon. Being able to do stuff like

  when version == A:
    const numLeds = 10
    type LedArray = array[numLeds, LedState]
And then have the number of LEDs and the size of the array typechecked as they get passed around, with all of this done at compile time, seems really interesting and useful. And it seems easy to just allocate everything statically or on the stack and then turn off the garbage collector.


Please announce it on the forum once you get it working. I am considering buying an Arduino and seeing this in Nim would certainly push me to buy it!


It's definitely a good time to get started with web server coding. While a websockets Nimble package is still not available, there is a good excuse to write one now that the new async/await support is implemented.

As far as web frameworks go I have written one called Jester (https://github.com/dom96/jester). I'll be releasing version 0.1.0 of it soon.


Why did they change the name?


Nimrod is a terrible name for a language. In North America it's synonymous with "idiot".


That's thanks to Bugs Bunny. Nimrod is traditionally the name of a mighty hunter so it made sense for Bugs to call Elmer that to show how far from a mighty hunter he was. But nowadays in the English speaking world we're more likely to think of the Bugs Bunny meaning than the biblical meaning.

It's always hard naming things in a language that isn't your first.


I love learning etymological anecdotes like this.


Git (albeit not a language) has a similar meaning in the UK.


Even in the US, "git" means "idiot." It was my understanding Linus named Git for the same reason Mercurial is called Mercurial: as an unflattering reference to Larry McVoy, who was involved in the Linux/Bitkeeper mess. While typing this though, I discovered that Linus apparently denies this and says it's a self-reference. ("I'm an egotistical bastard, and I name all my projects after myself. First 'Linux', now 'Git'".)

Nimrod is actually a little interesting. Until the 1980s, Nimrod was just a Biblical figure, a powerful king/great hunter figure. Possibly because it was used in an ironic sense for bad hunters (perhaps beginning with Elmer Fudd), it came to mean the opposite of its original sense and generalized to "idiot."

http://www.etymonline.com/index.php?term=Nimrod

EDIT: Sorry about that...I accidentally said Perforce instead of BitKeeper. Perforce on the brain. :) Thanks nocman.


I'm assuming you meant to say the "Linux/BitKeeper" mess. McVoy is the BitKeeper guy.


Which makes Github the Moron Central. ;)


Nimrod was also the name of a British warplane, IIRC.

And also the name of a biblical king and mighty hunter.

http://en.wikipedia.org/wiki/Nimrod


I wondered the same thing, but this is all I have been able to find on reasons for the name change: http://forum.nim-lang.org/t/561#3012


TLA FTW?

Although it honestly wouldn't surprise me if the difference between typing three or five characters to compile is a huge deal in programmer land. Just like how a few tenths of a second of extra loading makes an enormous difference in customer satisfaction with the Google:

http://www.nytimes.com/2012/03/01/technology/impatient-web-u...


When I first heard about Nimrod, I thought of the Green Day album "Nimrod" from 1997. Then I thought, why would this language be named like a nimrod? Like most, I did not link it with the other, literary meaning of "a mighty hunter".

Either way, Nim is short and sweet.


Can someone give a succinct description of the difference between Nim and Rust?


Nim has the feel of a compiled Python (with many additional features that Python is lacking). It is garbage collected, is designed to look as pseudocode'ish as possible.

Rust is designed for systems projects where GC is not a good idea, such as web browsers, or kernels, etc. One of its core design philosophies is to manage memory safely and efficiently.


To add to that:

While Nim is primarily garbage collected, it can also be used without a GC. However, many of the scenarios where going GC-less may be beneficial, such as games, web browsers, or kernels, where the application cannot afford any pauses at runtime, are addressed by the real-time aspect of Nim's GC, which allows you to specify exactly when and for how long you would like it to run (http://nim-lang.org/gc.html).

In this sense Nim is betting on you using the GC for those applications whereas Rust is betting on not using a GC and providing you with tools to manage your memory manually more safely.
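For a sense of what that control looks like, a sketch based on the `GC_setMaxPause`/`GC_step` procs described on the linked gc.html page (these apply to the default reference-counting GC; exact names and signatures may vary across versions):

```nim
# Ask the GC to keep individual pauses under ~2 ms (2000 microseconds):
GC_setMaxPause(2_000)

var frames = 0

proc frame() =
  inc frames            # render, update game state, etc.

# Simulated main loop; a real game would loop until quit:
for i in 0 ..< 60:
  frame()
  # Donate idle time at the end of each frame to the collector,
  # letting it run for at most 1000 microseconds:
  GC_step(1_000)
```

The idea is that collection work happens only in the slack time you explicitly hand over, instead of at arbitrary allocation points.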


Maybe it's just me but I find this language pretty ugly. And regardless of any technical advantages it may offer, that's enough reason not to use it. There are too many languages out there to come out with a new one where the code isn't clean and nice.


Interesting. I came here to say pretty much the opposite.

Can you elaborate?


Care to give an example of a language you find beautiful?


First, I have never and probably would never use the word "beautiful" to describe any code. I purposefully chose "clean and nice" in my last comment.

As far as the new crop of languages go, I think Rust and Go both offer better refinements of older languages than Nim does. If you're going to come out with a new language, the syntax of it should be cleaner and easier to read than what you are replacing (in this case, seemingly C/C++/Java) or there's not much point.

But ultimately this is personal preference, which is why I worded my comment that way. Instead of people simply disagreeing they have so far down voted me to -3 and counting. HN is overrun with a groupthink mob lately. I have tested this on a number of occasions posting comments that I knew were pandering to the mob, and today posting one I thought probably would go against the grain and the results are totally predictable. It's very sad.


I downvoted you because what you say is both wrong and harmful.

This:

> I find this language pretty ugly. And [...] that's enough reason to not use it.

is the worst possible way of choosing a language to use. It's simply stupid to dismiss languages which may suit your current problem on the grounds of aesthetics. I can't imagine a single situation where "disregarding technical advantages" of a language would be even remotely related to the right thing to do.

And this:

> syntax of it should be cleaner and easier to read than what you are replacing

is another pile of misconceptions: no, designing your syntax by comparison to other languages instead of designing it to fit your semantics is NOT a good idea; there of course IS a point in creating languages even if syntax is not "better" in any way (because semantics matter too) and no, you should NOT try to make your syntax "easy to read" but rather simple and consistent, which will make it easy to read without you even trying. At least "easy to read" to those programmers who are able to work with more than one syntax flavour; the rest will have a hard time reading anything anyway, so there's not much you can do about them.

I made learning all the languages I can a hobby of mine quite a few years ago and I since learned many very, very different syntaxes: look at Prolog, Forth, Scheme and J for the most basic examples. And what I learned is that they all work and work well, despite not being "clean and easy to read" for people who don't know them. It's normal; what's wrong however is trying to make your ignorance into some kind of generally applicable law and judging languages based on it instead of first learning a bit about them.

Now, I'm sorry for being rude, and a little ranty, but I found your suggestion that I'm a part of "groupthink mob" rather offensive.


I'm not saying the downvote was warranted but your original comment wasn't exactly in-depth and high-quality. It was a bit lazy and off-hand. The only reason I bothered to engage with you is because I find language aesthetics fascinating and wanted to find out what your beef was.


Brief is not the same as lazy. I personally find comments that say what they mean to say in a short amount of text much more useful than ones that go on for paragraphs without saying anything new. I strive for brevity.

I revised the one you responded to so you may want to check that now.

As far as more details of what I don't like about Nim, that would have been easy to ask directly; I find the syntax of:

echo("Hello world!")

To be outdated and ugly. I also find the case statement to be awful to read the way they've done it, I far prefer the set of cases to be indented from the case statement itself. And the "of" on each line is redundant cruft.

This is a brand new language and it holds on to what to my eyes is the worst cruft of old languages.


Wow, I just have to reply! Which one of these do you prefer:

  echo("Hello")
  echo "Hello"
  "Hello".echo
  "Hello".echo()
Or something different?


This:

    cout << "Hello";
beats all of the above by a wide margin. It's essentially an ASCII-art ideographic representation of what the computer is supposed to do.


    proc `<<`(f: File, s: string) =
      f.write(s)

    stdout << "Hello World"
There you go :)


The nice thing is that all 4 are valid Nim.


Yeah, it's truly is fascinating. Maybe

    echo "Hello" 
Is the way ? :)


I'd say so. Maybe even echo"Hello" if you're in a hurry; it's just that it's then treated as a raw string :p


> As far as more details of what I don't like about Nim, that would have been easy to ask directly; I find the syntax of:

> echo("Hello world!")

Would you prefer System.out.println("Hello world!"); instead?

> I far prefer the set of cases to be indented from the case statement itself.

You can in fact do that, it is however optional.

> And the "in" on each line is redundant cruft.

Not sure what you mean here. "in" is only used in for loops.


> Would you prefer System.out.println("Hello world!"); instead?

This is why I didn't get into details originally because I knew I would get this kind of response.

You know full well that's exactly the opposite of what I mean. You're being an asshole for no reason.

> Not sure what you mean here. "in" is only used in for loops.

I meant the "of"s, of course.


It's really not easy to guess what you meant, as you can see from the number of people who responded to your comment with their guesses. Maybe you don't like the function call, maybe you'd like semicolons, or maybe you don't like double quotes for a string...


> You know full well that's exactly the opposite of what I mean. You're being an asshole for no reason.

I apologise if I came out sounding like an asshole, but I honestly am not sure what syntax you would prefer. Any chance you could show me?


Downvoted because you've been asked for examples 3 times now and you only rant further in response. How is anybody else supposed to know what you mean? Guess correctly?

Would you rather puts instead of echo?


> echo("Hello world!")

I'm struggling to think how different it could be.

Remove the parens? print instead of echo? It's a fairly minor point. I'm fairly sensitive to issues of visual clutter in syntax and this doesn't bother me in the slightest.

> I far prefer the set of cases to be indented from the case statement itself.

As mentioned elsewhere this is optional but I do agree

> And the "of" on each line is redundant cruft.

I suspect there might be a good reason for needing a keyword here. I would have to dig deeper to be sure though.

Anyway - I can tolerate both of those. Any language with significant white-space and no curly-brackets is off to a good start aesthetically in my books. Skimming through the tutorial shows remarkably few horrors even when doing some moderately advanced stuff e.g. http://nim-lang.org/tut2.html#generics


The case thing appeared in some coding guidelines for different languages (ex: https://www.kernel.org/doc/Documentation/CodingStyle). I guess experienced people who wrote a lot of code didn't like to waste an indentation level. I do indent cases myself, but I think not indenting them is not something to dismiss easily.


Here's some perspective. If you had said this instead:

> Maybe it's just me but I find this language pretty clean and nice looking. And regardless of its technical disadvantages it may have, that's enough reason to use it. There are too few languages out there, and we really need more like Nim.

... I still would have down-voted you.

It's a content free comment that doesn't constructively contribute anything. It's the kind of comment I'd expect to be top-voted in an /r/programming thread.



