Hacker News
Nim 2.0 (nim-lang.org)
504 points by kindaAnIdiot on Aug 1, 2023 | 205 comments



Been happily crunching away at Nim in production. I'm working on what is mainly a data analysis and report generation tool, compiled as a CLI executable that gets called by server scripts.

Nim makes fast, small executables. It has an excellent heterogeneous JSON data structure and a good dataframe library. It prefers the stack so strongly that dynamic data structures (sequences and tables, basically its lists and dictionaries) are pointers on the stack to heap data, where the lifetime is managed by the stack frame. I don't think I have any dynamic references anywhere in my program, and I don't have to worry about GC at all. The type system is simple, sensible, and guides you to correctness with ease. Nim also defaults to referential transparency; everything is passed immutably by value unless you opt out. Generics are powerful and work exactly as you expect, no surprises. Universal function call syntax is ridiculously powerful: you can write the equivalents of methods and interfaces on types just by making procedures and functions that take a first parameter of that type; not needing those abstractions greatly simplifies and flattens code structure. It's just procedures and objects (functions and structs) all the way down.

It's been a real joy to work with and reminds me of when I discovered D back in the day, only it's even better. If you imagine native-compiled type-annotated Python where nearly 100% of your code is business logic with no cruft, you're getting close to the Nim experience.
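The UFCS point above can be sketched like this (the `Vec2` type and procs are illustrative, not from the parent's codebase; only `std/math` from the stdlib is assumed):

```nim
import std/math

type Vec2 = object
  x, y: float

proc length(v: Vec2): float =
  # a plain procedure taking Vec2 as its first parameter
  sqrt(v.x * v.x + v.y * v.y)

proc scaled(v: Vec2, k: float): Vec2 =
  Vec2(x: v.x * k, y: v.y * k)

let v = Vec2(x: 3.0, y: 4.0)
echo length(v)               # plain call: 5.0
echo v.length()              # same proc, method-call style (UFCS)
echo v.scaled(2.0).length()  # chains like methods, no class needed
```

There is no `method` keyword or interface involved here; `v.length()` is just sugar for `length(v)`, which is what lets free procedures stand in for methods.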


> “It prefers the stack so strongly that dynamic data structures (sequences and tables, basically its lists and dictionaries) are pointers on the stack to heap data, where the lifetime is managed by the stack frame.”

Isn’t that the same as a C++ vector or map on stack? They allocate internally as needed, and the whole container is destroyed when it goes out of scope.


It very much is, and that's the point: it _used_ to be more like Java. Araq basically pulled off a very daring switchover from a reference-based language to a value-based one.

So now the language can credibly claim the same as C++: no room left closer to the metal. But it's packaged in a much nicer syntax (imho), and has features like macros, which we can expect in C++ in maybe 10 years, if we're lucky.


Basically, but it requires no extra syntax. `var some_seq = @[1, 2, 3, 4]` is a stack-managed sequence. That's all there is to it. There's no unwrapping any pointers or boxes or what-not, the type is just `seq[int]`. Put another way, things that have become best practice in C++ are default in Nim with no syntactic noise.


> There's no unwrapping any pointers or boxes or what-not

That doesn't happen by default in C++ either.

std::vector<int> some_seq{1, 2, 3, 4, 5, 7};


or even just

     std::vector some_seq{1, 2, 3, 4, 5, 7}; 
nowadays (for a value of nowadays that is 5 years old for GCC and 6 years old for Clang)


Indeed there's no question that Nim is basically following C++'s lead on this. Nim iirc always had constructors and destructors. Final piece of the puzzle is move semantics, and I recall a blog post where Araq came up with something very similar.


Yes, Nim has move semantics, but it takes care of you more than C++ does. For example, if you use an object that was previously moved, you don't get garbage: the compiler turns the first move into a copy (and tells you).

the relevant docs are here: https://nim-lang.org/docs/destructors.html
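A minimal sketch of that behavior using a `sink` parameter (the proc name and strings here are illustrative, not from the docs):

```nim
proc takeOwnership(s: sink string): string =
  # `sink` asks for ownership of the argument, so the call
  # site would normally move rather than copy
  s

var msg = "hello"
let owned = takeOwnership(msg)  # msg is used again below, so the
echo msg                        # compiler copies here instead of moving
echo owned
```

Unlike C++, where using a moved-from object is legal but yields an unspecified state, Nim downgrades the move to a copy when it sees a later use.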


Agreed; only being able to put pointers on the stack, not data, would make me think it "prefers the heap".


Dynamic data structures by their nature have to be allocated to the heap. What I mean by "prefers the stack" is that you don't have to make a managed ref and dereference a managed pointer type. You just make a `seq[int]`, use it as a `seq[int]`, and pass it as a `seq[int]`, just like stack data. Behind the scenes, it has a unique scoped pointer with no mental overhead.


Sounds like a vector/array/list in any other language after C++, like Go slice, Java ArrayList, Javascript array, Python list, Rust vec. Is there something I'm missing?


I think they're discussing the lifetime of that heap data, not whether the data is heap allocated.


I don't see them drawing any distinctions from C++ or Rust there either. It really sounds to me like most of their low-level experience is in C, where the contrasts they appear to be drawing really do apply.


it depends on what you mean by 'dynamic' and 'stack'; certainly, outside of nim, you can allocate a list of, say, integers entirely on a stack in any of the following cases:

- the size of the list is known when you create a stack frame, as in c:

    int xs[n];
    memset(xs, 0, sizeof xs);  /* VLAs can't take an initializer */
- when the list grows, it grows in that subroutine and not some callee, for example using alloca();

- the list is built on a stack that isn't the one you have to pop your return address off of; examples include perl's data stack, ada's secondary stack, forth's operand stack, forth's dictionary, or an mlkit region. in these cases you can even return the dynamically built structure to a caller;

- each new callee adds some fixed number of items to a linked list, such as, in c

    void with_fill(color *c, 
        env *e, 
        void (*cb)(void*, env*), 
        void *userdata)
    {
      env ne = {
        .prop = PROP_FILL_COLOR,
        .val = c,
        .parent = e };
      cb(userdata, &ne);
    }
look ma, no heap


Types are stack allocated by default.

"var data: MyObject" is on the stack. "var arr: array[1000, MyObject]" is allocated on the stack sequentially.

Only dynamic seq or ref types use the heap by default.


So, same as C++.


Same as Rust. Vec and HashMap are stack pointers to the heap-allocated container storage space.


You have convinced me to look into Nim! Can you speak to the build system(s)? CMake is the bane of my existence.


Another Nim user here. Typically you just build the project with `nim c <myProject.nim>`; since Nim has such a strong macro system, a lot of typical build stuff is just done with macros. Of course there's also the default Nimble package manager, which allows you to list dependencies and tasks using Nim itself. This means that if you know how to write Nim, managing the build system is a breeze.
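For illustration, a hypothetical `.nimble` file might look like this (all names, versions, and paths are made up):

```nim
# example.nimble — package metadata and tasks, written in Nim itself
version     = "0.1.0"
author      = "Jane Dev"
description = "Example CLI tool"
license     = "MIT"
srcDir      = "src"
bin         = @["example"]

requires "nim >= 2.0.0"

task test, "Run the test suite":
  exec "nim c -r tests/all_tests.nim"
```

`nimble build` and `nimble test` then pick this up; because tasks are plain Nim code, there's no separate build DSL to learn.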


I've done a lot of small Nim projects. All of them use the Nimble build system: https://github.com/inv2004


Ugh, I just got done spending months fighting CMake before moving back to a position using Nim!

You can also compile C projects with Nim like bearssl [1]. Nim takes care to compile the C files and recompile them when config flags change. It's actually really nice.

1: https://github.com/status-im/nim-bearssl/blob/99fcb3405c55b2...


Gotta be honest here, I use a pretty simple Makefile. I don't have anyone else working on this program, so I can afford simplicity.


It does indeed look like Python!

I hope it gets more popular; it seems like a much, much easier-to-use Rust


This sounds great. How is the package management story, and how robust is the ecosystem currently?


You can check out the list of Nim packages at https://nimble.directory


The ecosystem is smaller than some, but wrapping libraries is mostly automatable, and there is tooling to assist interop with python, c, and c++


This provides a good overview of packages for various applications: https://github.com/ringabout/awesome-nim

I think most of them are available via nimble.


Looking forward to trying out this release!

After programming professionally for 25 years, IMO Nim really is the best of all worlds.

Easy to write like Python, strongly typed but with great inference, and defaults that make it fast and safe. Great for everything from embedded to HPC.

The language has an amazing way of making code simpler. E.g. UFCS, generics, and concepts give the best of OOP without endless scaffolding to tie you up in brittle data relationships just to organise things. Unlike Python, though, ambiguity is a compile-time error.

I find the same programs are much smaller and easier to read and understand than most other languages, yet there's not much behind the scenes magic to learn because the defaults just make sense.

Then the compile-time metaprogramming is just on another level. It's straightforward to use, and a core part of the language's design, without resorting to separate dialects or substitution games. E.g. generating bespoke parsing code from files is easy, removing the toil and copypasta of boilerplate. At the same time, it compiles fast.
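A tiny taste of what "compile time" means here; this lookup table (the names and values are illustrative) is built entirely during compilation, with no runtime parsing or initialisation cost:

```nim
import std/tables

# `const` forces evaluation at compile time; the finished Table
# is baked into the binary.
const sizes = {"small": 1, "medium": 8, "large": 64}.toTable

echo sizes["medium"]  # 8
```

The same mechanism scales up: a macro can `staticRead` a schema file and emit parsing procs for it, which is the "bespoke parsing code" mentioned above.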

IMHO it's easier to write well than Python thanks to an excellent type system, but matches C/C++ for performance, and the output is trivial to distribute with small, self contained executables.

It's got a native ABI to C, C++, ObjC, and JS, a fantastic FFI, and great Python interop to boot. That means you can use established ecosystems directly, without needing to rewrite them.

Imagine writing Python-style pseudocode for ESP32 and it being super efficient without trying, and with bare metal control when you want. Then writing a web app with backend and frontend in the same efficient language. Then writing a fast-paced bullet hell and not even worrying about GC because everything's stack allocated unless you say otherwise. That's been my Nim experience. Easy, productive, efficient, with high control.

For business, there's a huge amount of value in hacking up a prototype like you might in Python, and it's already fast and lean enough for production. It could be a company's secret weapon.

So, ahem. If anyone wants to hire a very experienced Nim dev, hit me up!


I've been using it as a scripting target for both games and other things I'm not allowed to elaborate on simply because it can transpile to C and C++. It's just really really nice to be able to manage the underlying run-time (the C environment) and on the top of that be able to use a high-level modern language with so many first-class citizen things (like JSON).

It really is a nicer, better Python. And I say that as someone who does like Python.


>Then writing a web app with backend and frontend in the same efficient language.

How does that work? What I mean specifically is: how convenient is it to use JS interop at dev time, rather than just compiling Nim to JS as a standalone lib?

Can we simply call something like browser API directly from Nim (Or with fairly simple wrapper)?


Since Nim compiles to JS and C, you just have to tell Nim what is available in the target language, and then you can call stuff just as if it were a Nim function. These definitions can be auto-generated, and they can live in a package you can simply import.


I think you can just import dom and get access to browser APIs.
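Roughly, yes. A hedged sketch of what that looks like (the element id is hypothetical, and this only runs in a browser after compiling with Nim's JS backend):

```nim
# Compile with: nim js -o:app.js app.nim, then load app.js from HTML
import std/dom

proc onLoad(ev: Event) =
  # "greeting" is an illustrative element id assumed to exist in the page
  document.getElementById("greeting").innerHTML = "Hello from Nim"

window.addEventListener("load", onLoad)
```

`std/dom` ships with the compiler and exposes `document`, `window`, events, and so on as typed Nim declarations over the browser APIs.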


> Imagine writing Python-style pseudocode for ESP32 and it being super efficient without trying, and with bare metal control when you want.

To be fair, I did have to spend like 2 hours tuning my ESP32 code for handling a 22 kSPS ADC where microseconds matter. ;) Mostly just to avoid extra allocations as I was pretty new to Nim at the time.

Ah, but no major regressions in performance or changes needed for ~4 years!


Ahem. Ahem.

Contact info?


Are you hiring Nim devs?


Shoot me an email at arctsint@proton.me Cheers!


Congratulations to everyone involved and the entire Nim community!

Nim has been my language of choice for the past decade and I'm really happy with the new features in Nim 2.0. Some of them are real gamechangers for my projects. For example, default values for objects theoretically allow me to make Norm [1] work with object types along with object instances. And without the new overloadable enums, Karkas [2] wouldn't be possible at all (it's still WIP though).

[1] https://norm.nim.town

[2] https://karkas.nim.town


Of all the recent changes, default values are my favorite. Aside from being generally useful and further reducing the need for initialisation boilerplate, it lets us guarantee valid state at compile time for things like enums - and, I assume, object variants?
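A minimal sketch of the Nim 2.0 default-field-values feature (the `Config` type and values are illustrative):

```nim
type Config = object
  retries: int = 3           # Nim 2.0: defaults in the type declaration
  host: string = "localhost"

let c = Config()   # fields start from their declared defaults
echo c.retries     # 3, not the zero-default 0
echo c.host        # "localhost"
```

Before 2.0, a freshly constructed object always started zero-initialised, so guaranteeing a valid starting state required a hand-written init proc.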


Nim is really a very nice language to write software in. Ship fast, enjoy the ride, produce very performant software. Unfortunately, in my experience it still has some sharp edges: juggling C/C++ compilers and options, very poor error messages, and very situational libraries that only work on some settings and systems. Given the small community, though, I can't really fault them for it. The VS Code integration works very well in my experience, rarely crashing.


The shitty error messages and poor tooling are the biggest problems of Nim imo. It's overall a great language otherwise.


At least error reporting has been improved recently: https://nim-lang.org/blog/2023/03/31/version-20-rc2.html


If someone at Manning Publications is reading this, it would be great to have a book on the newer Nim version, but please consider using a different typesetting with more readable fonts. I purchased the great book by Dominik Picheta, but am forced to use the .pdf because the dead tree version uses thin fonts that I find extremely hard to read even with the right pair of glasses. Font components (arms, lines, stems, etc) are just too thin. Not being a youngster anymore, I naturally thought it was my fault and took the original K&R 2nd ed as a comparison, but still can read it perfectly.


araq/Andreas Rumpf, the project lead, has also published a book: https://nim-lang.org/blog/2022/06/29/mastering-nim.html


Unfortunately there is no ebook version, and it seems Araq is against publishing one (probably because of piracy worries). For me, travelling a lot, a hard copy is a no-go.


Araq has mentioned he'll create a new edition of the book for Nim 2.0, and publish it as an e-book, see https://forum.nim-lang.org/t/10366


Not with Manning, but if you have a login you can submit this sort of request via their contact page:

https://www.manning.com/contact


I wrote a post on how Reddit uses Nim: https://www.reddit.com/r/RedditEng/comments/yvbt4h/why_i_enj...

More and more large companies and startups are adopting Nim.

Super excited for Nim 2.0 and huge thanks to all who contributed!


>More and more large companies and startups are adopting Nim.

Interesting.

Are there any stats / data on this, or is it anecdotal?

Even if anecdotal, can you name some names?


For startups, you can count us in[1]. Our backend, across all services, is using Nim. [1] https://cxplanner.com


What's the experience of writing a web backend in Nim? Did you use existing libraries/frameworks? How good is the concurrency compared to something like Go?


One of the reasons for choosing Nim was the ease of getting a production-ready web backend. For the core part of managing the backend we are using existing Nim libraries [1], and they are easy to expand and work with. I cannot give you a comparison with Go since I haven't managed Go projects that large - but for Nim we are all in on async and threading. I think channels within threading are the hardest part of Nim, but work is being done on it.

[1] https://nimble.directory

[1a] https://github.com/dom96/jester/

[1b] https://github.com/planety/Prologue

[1c] https://github.com/guzba/mummy


Thanks


Nim has been my favorite language for a while now, and I'm very excited to see version 2.0 finally released. A lot of these features have been items I've been looking forward to for some time.

The only downside is some of the included modules being moved to 3rd party repositories, as mentioned at the very bottom. It's not a big deal, but it was nice having SQLite support built into the library. I suppose once you support some databases, you'll be pressured to support more and more. I am a bit surprised to see MD5 and SHA1 support moved out though.


Libraries stagnate when they live in the batteries included. Python has carried some dead batteries since the '90s, but they have to stay there.

While it's nice to have path or logging support in the batteries, some other things are better as third parties, to allow them to evolve.


Yes, an experiment was run a while back, incorporating community-maintained code in a "fusion" repo shipped with the compiler by default. It didn't work very well. Discoverability and maintainability of stdlib-like things is hard.


In my experience in the batteries included stdlib approach, even if libraries evolve slowly, they tend to get a lot more attention wrt bug-fixing and performance improvements. Go's stdlib is the example here.


Congrats to all involved.

I find Nim to be an absolutely fascinating language. I've been trying to find a reason to use it on my job (my work is mobile-adjacent so the idea of compiling to JS and to ObjC is fascinating) but haven't gone beyond playing around with it so far. I've been comparing it to Rust and it's just so much simpler to get started with.


Somewhat related, you can call Nim code from Node.js/Bun using Denim: https://github.com/openpeeps/denim. It works by creating a Node add-on.

This is great for reusing Nim code in a web app, and possibly for performance critical code.


Had a look at Nim a few months ago. Feature-wise it has a lot of what I wish Python had (easy interop with C/C++, statically typed, compiled, can be transcompiled and executed on Android/iOS), but the ecosystem is small even though the language is not new. There are not many high-quality libraries on par with Python's numpy, scipy, pandas, or opencv. It lacks some big player adopting it - it's too bad Unreal Engine didn't try to adopt Nim instead of creating their own new scripting language, Verse.

One thing I'm also lacking is out-of-the-box interop with C/C++ libraries without writing your own adapters (so that you can just import a header and be done with it).

Another thing is I wish it had similarly easy interop with Rust - just to increase adoption, and also because in Rust it's easier to find high-quality cross-platform crates (including mobile) that work without hassle even on mobile devices.

I worry that in a few years either Python will catch up (because of faster Python, no-GIL, Nuitka, Briefcase for mobile, etc.) or Mojo will eat Nim's lunch.


To be fair to Nim, only Python has the huge ML ecosystem of numpy, scipy, pandas, opencv, pytorch, tensorflow, keras... Doing ML/AI style work in anything but Python is really hard!

That said, Nim does have the nimpy library that allows for pretty seamless interop with Python. That means you can just import PyTorch, or scipy, or opencv and use them from Nim.
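A rough sketch of what that interop looks like (assumes the third-party nimpy package is installed via `nimble install nimpy`, and that the host Python has numpy available):

```nim
import nimpy  # third-party: dynamically binds to the system Python

let np = pyImport("numpy")
# Python objects are called with ordinary Nim syntax; .to(float)
# converts the PyObject result back into a Nim value.
let arr = np.array(@[1.0, 2.0, 3.0])
echo arr.mean().to(float)
```

The calls go through Python's C API at runtime, so you pay the usual Python overhead per call, but you get the whole ecosystem without rewriting anything.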


For me (a mobile developer), interop with Python is not enough because of the really poor Python story on mobile devices (iOS/Android) when using native modules. I think if Nim had seamless interop with Rust or even Zig, it could piggyback on those communities to get some libraries for free.


For Rust at least one can use https://github.com/arnetheduck/nbindgen


That looks interesting. Unfortunately it looks like it hasn't been updated in a while? Is that because it's complete or a lack of interest?

For example, the approach mentioned at the bottom of the README of integrating via nlvm (https://github.com/arnetheduck/nlvm) sounded great but appears to be unpursued.


That looks interesting, thanks! Did you try it? Does it deliver on its promises? There haven't been any new commits since 2020, so I'm not sure if the project is stale by now.


I do not use Rust, so sadly I have not.


As a 2-step approach, you could also probably use https://github.com/mozilla/cbindgen and then Nim's native C FFI.


Dunno about Rust, but since both Nim and Zig compile to C as an intermediate step, it should be fairly easy to get them working together, no?


Things might have changed, but last time I checked you could easily call C functions, though you had to declare each single C function, struct, etc. You couldn't just import a single header file and be ready to call any function in the library. There is a pending project, futhark [0], but I'm not sure how mature it is, and it's still only for C libraries (rather than C++ or Rust) - but maybe it's easy to adapt for Zig, which would be great nonetheless.

[0] https://github.com/PMunch/futhark
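For reference, this is the kind of per-function declaration you currently write by hand, and that futhark aims to generate for you (`cStrlen` is an illustrative Nim-side name; the C function and header are real):

```nim
# Hand-written C binding: map libc's strlen into Nim
proc cStrlen(s: cstring): csize_t
  {.importc: "strlen", header: "<string.h>".}

echo cStrlen("hello, nim")  # 10
```

Each C function, struct, and constant needs one of these declarations, which is why header-level auto-importing is the feature people keep asking for.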


Most of those libraries are written in C++ or C, which Nim has excellent support for. I've used opencv c++ library with Nim. It's just that opencv is so massive it'd take a lot of work to wrap well, so I haven't yet. Some folks are working on a pure Nim pandas lib too.


I found the following article about Nim and Rust interop:

https://dev.to/dumblepytech1/the-way-integrate-rust-into-nim...

Haven't read it yet but it looks promising.


Anyone have working experience with both Nim and Zig? I'd love to hear how they compare and contrast. I'd also like to see some idiomatic web server benchmarks between the two (now with Nim v2).


I've used both to work on a hobby OS project (Nim[1], Zig[2]). I very much prefer Nim. Code is succinct, elegant, and lets you focus on your core logic rather than fighting the language.

Zig is nice and I like its optionals support and error handling approach. But I was put off by its noisy syntax, e.g. `!?[]u8` to represent an error union of an optional slice of u8. Also, having to prepare and weave allocators throughout most of the code that needs to dynamically allocate (which is most of the code) gets in the way of the main logic. Even little things like string concatenation or formatting become a chore. Zig also doesn't have dynamic dispatch, which makes polymorphic code hard to write; you have to work around it through some form of duck typing. In the end I realized that Zig is not for me.

[1] https://github.com/khaledh/axiom [2] https://github.com/khaledh/axiom-zig


Couldn't edit my post, but forgot to mention my main pain points with Nim have been:

- its module system, especially not being able to have mutually recursive imports (there's a 7-year-old proposal [1])

- order-sensitive declarations of procs (i.e. you can't use a proc defined further down in the file unless you add a forward declaration for it). There's an experimental pragma [2] for this, but it often stops working once you introduce mutually recursive calls

- object variants requiring declaration of a separate enum instead of allowing inline declaration of the variant cases, and a closely related issue [3]: not being able to use the same field names under different variant cases.

[1] https://github.com/nim-lang/rfcs/issues/6

[2] https://nim-lang.org/docs/manual_experimental.html#code-reor...

[3] https://github.com/nim-lang/RFCs/issues/19


You might like Patty. It makes case types more ergonomic: https://github.com/andreaferretti/patty


Neat! Thanks for sharing. This might come in handy, although the constraint of unique field names across variant cases is still there.


Yah, it's annoying, but I just rename one. beef331 made a lib that uses tuples and avoids the naming issue: https://github.com/beef331/fungus


If I recall correctly, lazy symbol resolution, which would allow both circular module imports and order-independent procs, was initially on the roadmap for 2.0. It has since been moved to a stretch goal for 2.2.

https://github.com/nim-lang/RFCs/issues/503


> Zig also doesn't have dynamic dispatch

It doesn't have it as a language feature, but it does have VTables just like C would. The `std.mem.Allocator` is an example of this.


I maintain auto-generated bindings for my C libraries for Zig and Nim (and Odin and Rust - although the Rust bindings definitely need some love to make them a lot more idiomatic).

I think looking at the examples (which is essentially the same code in different languages) gives you a high level idea, but they only scratch the surface when it comes to language features (for instance the Zig examples don't use any comptime features):

Zig: https://github.com/floooh/sokol-zig/tree/master/src/examples

Nim: https://github.com/floooh/sokol-nim/tree/master/examples

Odin: https://github.com/floooh/sokol-odin/tree/main/examples

Rust: https://github.com/floooh/sokol-rust/tree/main/examples


And which language did you enjoy coding in the most? Yeah, a subjective question :-). (Edit: Missed the auto-generated part, so maybe you don't have an opinion on experience regarding this?)


I think actually Odin, although I was surprised by that (being more of a Zig fan). Odin has some neat convenience features which are nice for higher level code, while Zig can be a lot more 'draconian' by enforcing correctness even at the cost of some "line noise" (but I guess both languages are still in flux, so that might change).

As for Nim, I enjoyed it initially (because of the Python vibes, I guess) but the automatic memory management gets confusing quickly. IIRC there are quite a few different reference types - but maybe that has been simplified in 2.0.

PS: Even though the bindings are auto-generated, I still try to make them 'language-idiomatic' by injecting some 'semi-manual' mappings for things like naming conventions or implicit type conversions, ideally getting the API close to what a 'native' API would look like - at least that's the goal.


This might sound strange, but in my opinion Odin is a Pascal variant with a C like syntax.


I’ve written programs in both, though it’s been a while since I used Nim now. I think I enjoyed writing Nim more. Zig is more boring, but for all the right reasons. I wouldn’t personally choose to write an OS in Nim, but I think Zig would be great for that when it’s mature. I personally started using it for embedded software.

I would probably use Nim for CLI tools, server applications, maybe GUI applications and games too.

The Zig team seems to be putting much more effort into the whole compiler infrastructure, which is really amazing in my experience. There are some great innovations there.


I suspect Nim would be much, much harder to wrangle for games than Zig (or easily the best of the bunch: Odin) since it doesn't make enough things clear at all in terms of allocation and only allows indirect control of allocation and deallocation.

I wouldn't necessarily prefer Nim for any of the things you listed but this doesn't have the same argument as for games with Odin (which has great tools and libraries for making games as well as gives a much better overview of important things you'll have to care about for making them in terms of performance, etc.).

Rather, it's because I've found that Nim belongs with the other languages that think that complexity can be managed by being hidden well enough, which I've found is simply not the case when something actually needs to be debugged or you need to understand the behavior of the program.

Hiding/ignoring allocation errors, not making allocation explicit, not making deallocation explicit, etc., makes for a much worse time actually understanding what's going to happen. Adding tons of GC options like alternative GC implementations isn't going to fix it and this new one is really just another example of trying even harder to hide complexity.

I think the ultimate irony of these languages with magical features like move semantics is that they do some of those things in the name of performance, but in practice many of them are so complicated to write well-performing code in, with their space-technology features and non-obvious behavior, that the end results are worse than much, much simpler languages. I've also found that the simpler languages' development cycles (for the end user) aren't that much longer than the space-tech ones, because there is ultimately much, much less to use in them, so people end up just writing the actual code instead of trying to wrangle all of the magic.


Many game developers want to focus on writing games instead of fighting a memory allocator. Unless you're making a 3D game with realistic graphics, you don't need every last bit of performance.


No one needs to be fighting allocators; they are far less inhibiting and pose less of an issue than GC or RAII will in the vast majority of cases. They're far easier and simpler to deal with on the whole as well. You're always interfacing with memory management somehow and the implicit way is usually much harder to work with overall. The idea that having an allocator and explicitly working with it is for "that last bit of performance" is a bit disingenuous, you're usually losing far more than that with implicit allocation and deallocation. On top of that you simply inherently have a harder time understanding the behavior of your program.


Manual memory management is one more thing to care about instead of the actual logic. With automatic memory management, you don't need to think about memory at all; what could be simpler?


Easier in the best case and much harder in the worst, when your lack of thinking is an issue (which it definitely will be unless you're prepared to use more of the machine for no reason). Simplicity is not about what's easier to use, it's about how you interface with something, how simple and straight forward that interface is to use, how many things are implicitly or explicitly affected by that thing, and so on. Automatic memory management usually implies an assumption that allocations can't fail, memory is infinite, etc., so the assumptions and complications are many. It also adds more code you didn't write and have no direct control over, which complicates your problem solving in many ways.

GC or other automatic memory management is only easier if you have absolutely zero care for resource usage. RAII will oftentimes lead to single allocations and deallocations, for example, unless you take care to not have it be so, which is an immense waste of resources.

It's fine if you don't care and you know that that's going to produce slow, bad software, but let's be honest about that instead of saying you can not care and everything will be fine.


Nim seems good for embedded too - just use a cross C compiler as the backend?



I've done green-field work with both Nim and Zig.

There were loads of specific differences, but if I could characterize both languages in a simple way:

- Nim seems to emphasize being a swiss army knife in the way that Python is, except as a compiled language.

- Zig is a much more focused language that tries to hit a certain specific niche - being a successor and replacement for C - and hits that mark spectacularly.

I think language preference comes down to what your personal needs and wants out of a new language that isn't being served by whatever you're using currently. I personally landed in the "Zig" camp because the way it approaches its ambition of being a C successor is intriguing, but I could see why other people might land on Nim.


Zig doesn’t seem to have an implementation for the TechEmpower Benchmarks but Nim does: https://www.techempower.com/benchmarks/#section=data-r21&l=y...


Nim's should probably be redone with mummy (multi-threaded) and chronos (async single-threaded) for a better showing:

https://github.com/guzba/mummy

https://github.com/status-im/nim-chronos


Why does the stdlib implementation do so badly in the first place?


It is maybe the most simple web server implementation, similar to what you get from "python3 -m http.server". What sense does it make to compare highly focused web server frameworks against a language's most simple stdlib implementation? Very much apples vs oranges. (Thus I also don't get why the proposal to compare with actual web frameworks for Nim is so heavily downvoted?)


Too much string copying iirc. It was written a while ago.

Hopefully it'll get updated/replaced some time, but there's plenty of faster 3rd party ones already.


And I'm surprised by how terribly it ranks – basically dead last against everything, even the Python frameworks, which is impressive.


Nim's default json library is terrible in performance, but there are much faster drop-in replacements like jsony [1]. I'm not sure that's the main issue for the low rank, but it's definitely one of them.

1. https://github.com/treeform/jsony


I would not call std/json "terrible in performance"; it's probably still way faster than what you get in many other languages (like Python). But yes, the JSON lib I wrote is faster due to avoiding branches and allocations.


Interesting. The Vercel benchmarks make it look pretty good. Only slightly behind rust. https://programming-language-benchmarks.vercel.app/zig-vs-ru... Benchmarks are as much about the skill of the programmer as they are about the language. I suspect those numbers could improve drastically.


You've linked to Zig vs. Rust, where Zig really should be outperforming. The grandparent was talking about how Nim seems to be doing terribly.


httpbeast[1] reached #6 at one point, but I think the author is busy with other things nowadays.

https://www.techempower.com/benchmarks/#section=data-r18&hw=...

[1] https://github.com/dom96/httpbeast


Sums up most Nim libraries, unfortunately.


Possibly it wasn't compiled with `-d:release`. I only looked briefly — is there a way to see the source code and cli flags used for the various implementations?


https://github.com/TechEmpower/FrameworkBenchmarks/blob/mast...

That appears to be the docker file they used, it's compiled with -d:release.


For reference, here's some other benchmarks which show happyx (a Nim framework) to come on top: https://web-frameworks-benchmark.netlify.app/result


The Nim community isn't really that interested in web server efficiency, and until recent years threading efficiently was kind of tricky with the GC scheme they used. If someone wanted Nim to rank high, they could do it, but I'm not sure it's worth the effort?


There are a lot of features in Nim that are basically the polar opposite of Zig's values. Macros/templates come to mind, as opposed to comptime, which has no real capability of just inserting random code; so do the very pervasive naked imports (functions/methods can come from anywhere) that are all over the place, as opposed to the explicit imports and qualified names you would have to use in Zig (or destructuring of imports to get the bare names, making it obvious where an identifier is coming from).

On top of that you have only indirect control over memory allocation and deallocation, which goes completely against Zig's values where custom allocators are used and everything that allocates should take an allocator as an argument (or member in the case of structures). In contrast to that there isn't even the concept of an allocator in the Nim standard library.

I would say that my experience with Nim has made me fairly certain that Nim has absolutely no desire to make things obvious but rather chooses convenience over almost everything. It's not so much a competitor (in performance or clarity) to Odin or Zig as it is a competitor to Go or something with a much higher-level baseline.

On top of all of this it doesn't really have tagged unions with proper support for casing on them and getting the correct payload type-wise out of them, which is an incredibly odd choice when all of its competitors have exactly that or an equivalent.

Overall I would say that coming from Odin or Zig (or Go) and actually liking those languages it's very hard to like Nim. I could imagine that if someone came from a much higher-level language where performance is nearly inscrutable anyway and nothing is really obvious in terms of what it's doing, Nim would feel like more of the same but probably with better performance.

Edit:

Often while reading the Nim manual, news and forum posts, etc., I get the sense that Nim is really just an ongoing research project that isn't necessarily trying to solve simpler problems it already has along the way. If you look at some of the features in this announcement, it's hard to see anyone ever asking for them, yet here they are. In many ways it's way worse than Haskell, which often gets derided as "just a research language". A lot of what Nim has makes for a much worse experience learning and using the language and I'm sure it doesn't get easier in the large.


> It's not so much a competitor (in performance or clarity) to Odin or Zig as it is a competitor to Go or something

That seems accurate. Dealing with raw pointers as one does in Odin or Zig is very much de-emphasized in favour of dealing with safe references, and a lot of effort is put into optimizing out all the overhead of those reference checks (hence ARC/ORC) and writing code to evade them. The manual memory management features of Nim are there for flexibility and fallbacks and are not really the main way to write code: even for embedded. The stuff that Zig (and Odin?) do surrounding allocators and alignment, and constructs for slightly-safer pointers, are really very interesting yet are most helpful if you are indeed working with pointers and worrying about offsets: which you usually aren't in Nim.

I am curious as to what you mean about comptime, though. I have gotten the impression that equivalent constructs in Nim are more powerful. You have `static` blocks and parameters, `const` expressions, `when` conditionals, and then also both templates and typed macros operating on the AST (before or after semantic checking)... `when` even provides for type-checking functions with varying return types (well, monomorphized to one type) via `: auto` or the `: int | bool | ...` syntax.

I will also defend "naked imports" as a feature that works very well with the rest of the language: functions are disambiguated by signature and not just name and so conflicts scarcely occur (and simply force qualification when they do). And, this allows for the use of uniform function call syntax - being able to call arbitrary functions as "methods" on their first parameter. This is incredibly useful and allows for chaining function calls via the dot operator, among other things. Besides, if you really want you can `from module import nil` and enforce full qualification.
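A minimal sketch of what UFCS chaining looks like (the proc names here are mine, not from any library):

```nim
proc double(x: int): int = x * 2
proc describe(x: int): string = "value: " & $x

# All three calls are equivalent under uniform function call syntax:
echo describe(double(21))
echo 21.double().describe()
echo 21.double.describe   # parens are optional for calls without extra args
```

Any free procedure whose first parameter matches can be called "method-style" like this, which is what makes chaining through the dot operator work.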

Interest in proper structural pattern matching sparked back up again recently and some complementary RFCs were proposed: https://github.com/nim-lang/RFCs/issues/525 and https://github.com/nim-lang/RFCs/issues/527.


> I will also defend "naked imports" as a feature that works very well with the rest of the language: functions are disambiguated by signature and not just name and so conflicts scarcely occur (and simply force qualification when they do). And, this allows for the use of uniform function call syntax - being able to call arbitrary functions as "methods" on their first parameter. This is incredibly useful and allows for chaining function calls via the dot operator, among other things. Besides, if you really want you can `from module import nil` and enforce full qualification.

This is spot on. You can also not really have productive and well-fitting errors-as-values in a language that emphasizes UFCS, which is why Nim (and D) has/have to have exceptions. In order to productively use errors as values in Nim you either have to chain some kind of `Result` type (which, if you `map` & `mapError` over it will have to be able to implicitly allocate in certain cases, etc.) so the list of potential victims of this (and other features) just seems to go on and on.

In general, if you go over the list of features in Nim there is a coherence to them only in that some of the (mis)features actually have to exist in order for other features to make sense. I would feel like it was "designed", except in the case of Nim it really feels mostly accidental and not very well thought out in general. The end result (for me) is that it feels very much like it ended up on the wrong side of readability, clarity and overall coherence.


> You can also not really have productive and well-fitting errors-as-values in a language that emphasizes UFCS

Eh, https://github.com/arnetheduck/nim-results and associated syntax from https://github.com/codex-storage/questionable would beg to disagree. Nim's stdlib does not have productive and well-fitting errors because it suffers from inertia and started far before the robust wonders of recoverable error handling via errors-as-types entered the mainstream with Rust and were refined with Swift (IMO). Option/Result types are fantastic and I do so wish the standard library used them: but it's nothing a (very large) wrapper couldn't provide, I suppose.

I do strongly think that other languages are greatly missing out on UFCS and I miss it dearly whenever I go to write Python or anything else. I'm not quite sure how you think UFCS would make it impossible to have good error handling? Rust also has (limited, unfortunately) UFCS and syntax around error handling does not suffer because of it. If by errors-as-values you mean Go-style error handling, I quite despise it - I think any benefits of the approach are far offset by the verbosity, quite similarly to Java's checked exceptions.

(in general concerns surrounding performance of errors surprise me - they're errors! they shouldn't be hit often! but if they are, you can certainly avoid such performance hits in nim.)


So, Nim doesn’t seem to be under an umbrella of a non-profit. Isn’t this destined to be a problem at some point regarding either acquisition of rights or succession?

Edit: Ouch. Just found this thread. Very disappointing, and actually makes a greater case for institutional ownership: https://forum.nim-lang.org/t/10312


Was about to bring up that thread. Having a dictator for life isn't a good combination with that dictator going on unprompted, unhinged rants.


I thought it was strange that comments were deleted and the thread was closed for going off the original topic when it was Araq who shifted the topic.


I find using main a little obnoxious, but like who cares that much?


I think "default" (Mercurial) is a way better name than anything else for the default branch, duh. But, I err on the side of the disadvantaged because it's impossible to fully empathize with their individual experience, so I try to use "main" whenever I can.

That said, I'm a Slav, which is the origin of the word "slave" because in Europe, slaves were once predominantly Slavs. I don't really mind it because it feels irrelevant today. The connotations of "master" don't feel that ancient yet, though, considering that black people weren't allowed to live in Palo Alto, CA (heart of Silicon Valley today) until the 1950s.


Despite the remark about confusing people over 50, a primary branch called "master" isn't exactly an unalterable ancient tradition in version control, either. "Trunk" was common in centralized VCSes. I had to get used to "master" and "main" is at worst a lateral move.


Right, I found such changes silly as well, but this kind of anger is a bad sign. Sounds like someone watches too much corporate TV.

If you can't handle a little silliness from humanity, might as well bow out now.


I also found that thread. It was very pleasing and it seems he has definitely thought about the issue. While his response seems to go from 0 to 100 in a second, he is still honest and gets to the point instead of waiting for all the "why not?"s to roll in.

It's not surprising he has his head screwed on straight. There is clear genius in Nim's design. I'm not a genius, and I don't know much about compilers, just scant knowledge of some data structures and algorithms, but what I do know is that being able to make something so powerful be used by mere mortals like me is very much genius (an idiot values complexity and all that jazz).


On the flip side the Rust Foundation hasn't been that great either...


I don't think that one project dying with its maintainer because he thinks woke is stupid is comparable to the community-hostile changes in trademark policy. No, not the flip side at all.


lmao


I loved Nim when I used it first.

But I left it because of the lack of recursive imports. I basically had to put all my types into one file and use them from various others. For a relatively medium-sized project (~10k LOC), it's a bit of a hassle. Refactoring is an issue.

That being said, the language is fantastic. Can anybody with experience suggest me what HTTP library/framework do they prefer for servers?


The lack of recursive imports can be annoying, but I found I don't mind it. It forces your module graph to stay a DAG.

Chronos is probably the most feature rich and uses async. Mummy is newer and uses a threading model. Both are used in production.
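The usual workaround looks something like this (a hypothetical layout, not from the parent's project): shared type definitions live in one leaf module and everything else imports it, so the import graph stays acyclic:

```nim
# types.nim: leaf module, imports nothing else from the project
type
  User* = object
    name*: string
    age*: int

# storage.nim would then do:  import types
#   proc save*(u: User) = ...
# report.nim would do:        import types, storage
#   proc render*(u: User): string = "User: " & u.name
```

It works, but as the parent says, it means refactoring often funnels everything through that one types module.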


Node and Python also don't really work with circular imports.


Python doesn't work with circular imports. NodeJS does. After using NodeJS for such a long time, I think it is a feature that is taken for granted.

That being said, the Nim team is working on it as per a few issues: [1] https://github.com/nim-lang/rfcs/issues/6 [2] https://forum.nim-lang.org/t/2114

I love the language, and this is probably the only bottleneck for me.


Node really doesn't work with circular imports. There are runtime gotchas with it, for example destructuring a cyclic import via require() will give undefined for the destructured values as they "don't exist yet".

Glad to see progress on the Nim side!


Just started learning Nim recently and really loving it.

Even though it's older than its peers like Rust and Go, it's still quite the underdog.

Hope more people start paying attention to it.


Go had full time engineers designing the language, tooling, docs, etc. Nim has never had huge industry sponsorship, so comparing the languages on age alone is hardly fair.


Seems like it kinda has Sum Types, so Nim passes the litmus test for respectable static type-system in this day and age.

https://nim-lang.org/docs/manual.html#types-object-variants
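For reference, the object variants from the linked manual section look roughly like this (a condensed sketch):

```nim
type
  NodeKind = enum nkInt, nkFloat, nkString
  Node = object
    case kind: NodeKind   # the discriminator field
    of nkInt: intVal: int
    of nkFloat: floatVal: float
    of nkString: strVal: string

let n = Node(kind: nkInt, intVal: 3)
case n.kind
of nkInt: echo n.intVal      # safe: matches the discriminator
of nkFloat: echo n.floatVal
of nkString: echo n.strVal
# Accessing a field from a non-matching branch (e.g. n.floatVal here)
# raises a FieldDefect at runtime rather than being a compile error.
```

The runtime (rather than compile-time) nature of that field check is what the subthread below is debating.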


When I looked at it a few years ago, the compiler didn't prevent you from accessing fields from the wrong variant, and didn't provide exhaustivity checks. So I think it still falls short of this (excellent) litmus test :/


They've improved the compiler analysis for them significantly, including exhaustivity checks and field checks.


Also Nim requires you to use unique field names across all variants.


This annoying restriction is lifted in Nim 2; see the linked announcement.


Where in the announcement does it say that?


They likely read "overloadable enums" and went "Oh Rust calls their tagged unions enums" so assumed all languages did.


Oh, no, I actually misunderstood cobby's complaint: the field names, yes, those still have to be unique. Which is also a bit annoying, though I've seen discussions about changing it.


I feel the same way as you! I've seen many language ideas come and go in my career and sum types are one I feel now should be a basic requirement. I miss them in any language without them.


"Nim is a programming language that is good for everything, but not for everybody."

now you got me really interested.

At some point dlang-betterc + zig + nim should have an interoperability article and share libraries.


Really, I'm quite hopeful for crabi: https://github.com/rust-lang/rust/pull/105586

An ABI for languages with a proper type system seems fantastic. Swift, Rust, Nim, D all share very similar type systems (and memory management systems) and it would be very cool to see what kinds of interop easy dynamic linking would allow.


I didn't know about it. Seems life has bright sides too.


> Now one can define constructors and virtual procs that map to C++ constructors and virtual methods, allowing one to further customize the interoperability.

I hope this will help with bindings for C++ libraries that have historically been tricky to wrap.

For example, I would like to use Qt from a compiled language that's a pleasure to use, and this project looks promising:

https://github.com/jerous86/nimqt


Could be a fun python alternative

Questions:

- value/object semantic: i peeked at some code, and i can't tell what is a value, and what is a reference type, is everything heap allocated?

- tooling: what's the state of their language server? does it work with all of their language features?

- debugging: does gdb/lldb understand nim's types and slices?

And finally: is a no-gc mode available?

I'll play with it later today, it's always been in my todo list of languages to try, now is the perfect time


Reference semantics are part of the type.

So "var i: int" is value, "var i: ref int" is a heap allocated reference that's deterministically managed like a borrow checked smart pointer, eliding reference counting if possible.

You can turn off GC or use a different GC, but some of the stdlib uses them, so you'd need to avoid those or write/use alternatives.

Let me say though, the GC is realtime capable and not stop the world. It's not like Java, it's not far off Rust without the hassle.
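A small sketch of the value/reference distinction (type names are made up):

```nim
type
  Point = object       # value type: copied on assignment
    x, y: int
  PNode = ref object   # reference type: heap-allocated, shared on assignment
    val: int

var a = Point(x: 1, y: 2)
var b = a            # full copy
b.x = 99
assert a.x == 1      # a is unaffected

let p = PNode(val: 1)
let q = p            # both names point at the same heap object
q.val = 99
assert p.val == 99
```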


1. Nim uses 'var' modifier to pass by reference, e.g. "proc (n: var int)...", default behaviour is pass by value. And there're also raw pointers and references (safe pointers).

>is a no-gc mode available?

You can disable gc, but most of standard library depends on it. But in Nim 2.0 there's finally support for ARC and ORC (ARC + cycle collector).
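A quick sketch of the `var` parameter behaviour:

```nim
proc bump(n: var int) = inc n   # `var` parameter: mutates the caller's variable

var x = 1
bump(x)
assert x == 2

proc peek(n: int) = echo n      # default: an immutable copy; `n = 5` would not compile
peek(x)
```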


> default behaviour is pass by value

That's not exactly true: anything smaller than 24 bytes is passed by value, and the compiler optimizes larger calls by implicitly passing them by reference.


I've been searching and can't find it: A in ARC is for automatic, what does the O stand for?


O is a visual pun because it adds a cycle collector to ARC.


I assumed the O was for "optimised", but apparently it stands for "cycle":

https://nim-lang.org/blog/2020/12/08/introducing-orc.html


> i can't tell what is a value

If a type is declared with 'ref', it's a reference type. Otherwise, it's a value type.


Nim is a better Python (syntax-wise) that compiles to C (or C++, JS, etc., but C is the default) with GC turned on by default. I have always been wanting to use it outside of what my job needs (C/C++/Python). I hope some big players adopt Nim to make it one of the mainstream languages.


Reddit was hiring for Nim positions. So demand is growing. New languages have easier time being adopted at startups which grow into big players eventually.


Nim looks awesome. Does anyone know why it doesn't have first-class support for wasm? That's the only thing that would keep me from diving into it more.


I think the short answer is it's built on top of C tooling so it doesn't really need another way to do it because you can use emscripten. Search their forum for "web assembly".

I did ask him about it eight years ago: https://forum.nim-lang.org/t/1392#8675

But that was a little early on and there have been other priorities for the language.


The reason I ask is that I was poking around and saw some projects to help with wasm compilation, and on this random list of wasm-capable languages [0], Nim is listed as "Work in Progress." Notably, Swift is ranked higher, and I view Swift as extremely experimental when it comes to wasm

[0]: https://github.com/appcypher/awesome-wasm-langs


With 2.0 out and ORC being the default, it compiles and works just fine. There aren't too many libraries specializing in wasm stuff though, so you gotta use emscripten or similar.


I suspect it's simply a size-of-community thing. If you want it, you should take a crack at implementing it! Or least start a thread about on the official developer forum.


The last time I used the language, it was still using a garbage-collector and there were talks about transitioning towards a new way of doing things - I assume that ARC/ORC ended up being that destination.

Now that ARC/ORC is considered "complete," are there any remnants of the old GC still in the language, or has the entire ecosystem hopped over?


For most of us the move from the old GC to ORC is pretty transparent. Most libraries just work and don't require any major restructuring.


    proc echoLine(): void = discard
Discard looks cool! In Rust I had to use the unimpl macro crate [0] to get sane error messages. It would be good if that was built in, though.

[0]: https://crates.io/crates/unimpl


As someone who doesn't know much about Nim:

    Improved type inference
    ...
    let foo: seq[(float, byte, cstring)] = @[(1, 2, "abc")]
This looks like a normal type declaration to me, why is there any inference involved?


It looks simple but in a typed language it's actually somewhat tricky. The compiler needs to infer that the 1 is a float type, 2 is a byte, and compile it appropriately.

Previously Nim didn't do any "reverse" type inference, so you'd need to say `@[(1'f64, 2'u8, "abc")]`. That was because it's a constraints problem that can become exponentially expensive to solve. Exploding compile times in Rust and Swift are good examples of this. But there are limited subsets which can still be quick and are helpful, like this case.
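A smaller illustration of the change (as I understand it; the exact 1.x diagnostics may vary):

```nim
# Nim 1.x: @[1, 2, 3] was typed bottom-up as seq[int] before the
# assignment was checked, so this failed to compile and you had to
# spell out the literal types:
let a: seq[float] = @[1.0, 2.0, 3.0]

# Nim 2.0: the declared type flows top-down into the literal,
# so the plain integer literals are accepted as floats:
let b: seq[float] = @[1, 2, 3]
```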


> It looks simple but in a typed language it's actually somewhat tricky.

But that example looks about as simple as it can be, so I clearly must miss something.

> The compiler needs to infer that the 1 is a float type, 2 is a byte, and compile it appropriately.

And I don't understand _why_ it has to infer anything, as the type is explicitly declared. I mean, there are 2 possibilities:

* 1 is both a valid integer and a float literal => Nim needs the type declaration on the left to unify the type (from "integer or float" or "numeric" or whatever the type checker inferred) to `float`.

* 1 is not a valid float literal (but an integer) => the type is not inferred, but implicitly converted to `float`.

In both cases the solution does not involve inference?


You're completely right, but believe it or not Nim 1.6 actually doesn't manage to connect the dots between `1` and it being a possible `float`, `int64`, etc.. Even if you wanted a different size integer literal you'd have to say, for example, `42'int64`. You would be forgiven for asking how the language has purity checks for functions (`func` vs. `proc`) but somehow does not have this fairly elementary implicit type conversion (where Odin manages to even say `1.0` is a valid int value, for example, but won't permit anything that is not safely representable as a conversion).


> And I don't understand _why_ it has to infer anything, as the type is explicitly declared.

The seq declaration doesn’t need to be inferred. However the right side does need to be inferred from the declaration.

> I mean, there are 2 possibilities: * 1 is both a valid integer and a float literal => Nim needs the type declaration on the left to unify the type

Yep, `1` is an ambiguous number literal. So the compiler needs to feed the info from the declared type back into the assignment expression. Not super hard to do for simple cases, but it can become expensive for complex types.


The problem apparently is that they didn't actually have full type inference and the expressions on the right were given their types (probably a tuple of int x int x cstring) before they attempted the assignment into foo. Now they're using unification (they call it "top-down inference", so I'm guessing it's regular unification like other type inference systems use) so that the expression on the right will have the correct type and be assignable into the variable on the left.


I see, I guess you're right.


What are some noteworthy projects or libraries written in Nim?


Ones that have not been mentioned so far:

- npeg lets you write PEGs inline in almost normal notation: https://github.com/zevv/npeg

- owlkettle is a declarative macro-oriented library for GTK: https://github.com/can-lehmann/owlkettle

- ratel is a framework for embedded programming: https://github.com/PMunch/ratel

- futhark provides for much more automatic C interop: https://github.com/PMunch/futhark

- nimpy allows calling Python code from Nim and vice versa: https://github.com/yglukhov/nimpy

- questionable provides a lot of syntax sugar surrounding Option/Result types: https://github.com/codex-storage/questionable

- nlvm is an unofficial LLVM backend: https://github.com/arnetheduck/nlvm

- chronos is an alternative async/await backend: https://github.com/status-im/nim-chronos

- cps allows arbitrary procedure rewriting in continuation passing style: https://github.com/nim-works/cps

A longer list can be found at https://github.com/ringabout/awesome-nim.


We have written pixie: https://github.com/treeform/pixie . Pixie is a 2D graphics library similar to Cairo and Skia written entirely in Nim. Which I think is a big accomplishment. It even has python bindings: https://pypi.org/project/pixie-python/


Is pixie capable of realtime usage like in games or generative art? Last time I looked it seemed CPU-only.


Not by itself! But together with other library boxy you can: https://github.com/treeform/boxy

You should use pixie to load textures, create text, rasterize vector graphic etc... and send them to boxy to be drawn every frame.

Yes, Pixie is CPU-only, and just like you can't use Cairo or Skia for real-time games, you can't use Pixie on its own; but with boxy you totally can.


I use pixie for my game framework. It's used to load textures, lay out fonts, and render font atlases. It can be used for generative art, but https://github.com/EriKWDev/nanim or using sdl2's renderer makes more sense as they are GPU accelerated.


Nitter (Twitter frontend) is written in Nim: https://github.com/zedeus/nitter/


Lots of high-quality Nim projects and libs are being worked on and used by the folks at Status:

https://github.com/status-im/nimbus-eth2

https://github.com/orgs/status-im/repositories?language=nim&...


https://findsight.ai is my project, written in Nim

I gave a talk about it here: https://www.youtube.com/watch?v=elNrRU12xRc including some more intense use of Nim (for inline PEG grammars and data-parallel processing with Weave)


I don't know if my particular version is noteworthy, but I recently started making updated Nim bindings for OpenCV and it was kinda fun. I don't consider myself an advanced C++ programmer, but Nim made the process easier than I had feared it would be. https://github.com/tapsterbot/mvb-opencv


Not familiar with Nim enough to figure it out - are the bindings auto generated in similar style like opencv bindings to any other supported language (python, julia, objc, rust, etc)?


They are currently not auto-generated. (I only implemented the absolute minimum to get started calling the most commonly used OpenCV methods from Nim.) Hopefully the bindings will be auto-generated in the future, though!


Not sure if it counts as noteworthy, but I'm submitting this comment via my TUI web browser that I've been writing in Nim.

https://git.sr.ht/~bptato/chawan

Also, there exists another Nim web browser project; from what I can tell, it's in somewhat earlier stages of development.

https://github.com/xTrayambak/ferus


Check out my project Torrentinim for a popular but simple enough project if you want to taste what Nim is like.

https://github.com/sergiotapia/torrentinim

It's easy to understand code.


Nice, congratulations on v2.0.

Shameless plug: I'm working on a programming language called Yaksha that is also inspired by Python like syntax, however, philosophy differs from nim. Please take a look and let me know what you think :) https://yakshalang.github.io/documentation.html


Just as a general comment, the website is extremely hard to read at the default zoom, mainly due to the font size but also due to the lack of contrast on the background color and text color. The code samples in particular are very hard to read due to this.


Thank you. Will have a think.

Question - does dark theme and light theme both have the contrast issue?


The light theme is a little bit better. I usually don't have a problem with text without much contrast though, so you may want to get opinions other than mine. The main problem for me is the small text size.


I have updated the CSS: a font with more weight, relative font sizes, and high-contrast themes for both light and dark modes (the rrt theme and the default theme in Pygments).


OK, I had someone with keratoconus test it. I think it is good now.


If your Python programs heavily use Pandas and Numpy, could there still be speed benefits to translating them to Nim?


Your programs could benefit from small dependency-free executables and compile time code generation and execution. Nim code can also be called directly from python or vice versa, check out nimpy[1].

1. https://github.com/yglukhov/nimpy
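For example, exporting a Nim proc to Python via nimpy looks roughly like this (a sketch based on nimpy's README; the module and proc names are made up):

```nim
# fib.nim; build as a Python extension module with something like:
#   nim c --threads:on --app:lib --out:fib.so fib.nim
import nimpy

proc fib(n: int): int {.exportpy.} =
  # plain recursive Fibonacci, callable from Python as fib.fib(n)
  if n < 2: n else: fib(n - 1) + fib(n - 2)

# Python side:
#   import fib
#   print(fib.fib(30))
```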


Following up on this: As someone who uses Python with NumPy/SciPy heavily, are there any Nim libraries that would make the transition smooth? Libraries that can help with e.g. sparse matrices, linear algebra, differential equations, etc.


I've not used them myself but there was an effort to group these kinds of libraries in a common 'SciNim' community:

https://github.com/SciNim https://scinim.github.io/getting-started/

Generally, projects created by Mamy Ratsimbazafy (mratsim) are a good start since he's very adept at optimising data science-related libraries.

You might want to ask in the #science channel of the Nim Discord server since although it's often quiet, that's where people working on these repositories hang out.


That might depend on how many raw Python loops and functions you use. Even if most of your code uses pandas and numpy, things like string processing could still benefit from a compiled language.


Here's a presentation from last year where a Python data scientist compares a Python and Nim implementation for a problem, with the Python version calling out to Numpy. There are performance comparisons at the end and his Nim version was faster so Nim should be usable for scientific programming:

https://archive.fosdem.org/2022/schedule/event/nim_hpcfrompy...

The big issue Nim faces isn't performance but rather the relative community sizes, and thus how many libraries are available (and also how much help you might find when you run into problems).


https://github.com/belamenso/v

This cleans up Nim's syntax a little; we use it in production without much maintenance.


Still more readable than the average JS front end.


Fantastic language, hoping this will lead to increased adoption!


It seems almost too good to be true. Well done!

Could someone share some bad experiences when adopting Nim so I can weight that in? I'm seriously considering it.


I have a shortlist of pain points:

- Tooling is not great. The language server has a tendency to silently crash on occasion, and it's no rust-analyzer to begin with. A tooling rewrite has been delayed behind proper incremental compilation, which has been delayed behind ARC/ORC...

- Interfaces ("concepts") are experimental and there are two differing implementations.

- It lacks proper sum types and structural pattern matching in the core language. There are a number of quite good macro-based libraries that provide for this, however: fusion/matching, andreaferretti/patty, beef331/fungus, alaviss/union...

- Optional types are not the standard: the stdlib will throw exceptions. This is more so a personal preference than anything.

But that's about it. I do like Nim quite a lot.


> It lacks proper sum types

We've talked about this before! You know it has sum types, just not the variation you want.


I know, I know! I do very much like the "type wrapper" approach more than the "object variant" approach.

Perhaps in the future we'll simply have both and have no reason to debate ;-)


Nice! Probably worth taking another look at Nim again.

How is interop with Rust these days?


I dearly wish there was something like LibGDX for Nim. I have big Java projects I'd probably move over... scene2d is great for simple cross platform UIs.


I like capability separation with tags, though typeclasses are weird, higher-kinded types are not supported and implicits are weird.


Is Nim 1 going to receive patches going forward? Or is it a hard upgrade?


1.6 should have a few years of support. Though 2.0 is largely compatible with older Nim code.



