Been happily crunching away at Nim in production. I'm working on what is mainly a data analysis and report generation tool, compiled as a CLI executable that gets called by server scripts.
Nim makes fast, small executables. It has an excellent heterogeneous JSON data structure and a good dataframe library. It prefers the stack so strongly that dynamic data structures (sequences and tables, basically its lists and dictionaries) are pointers on the stack to heap data, where the lifetime is managed by the stack frame. I don't think I have any dynamic references anywhere in my program, and don't have to worry about GC at all. The type system is simple, sensible, and guides you to correctness with ease. Nim also defaults to referential transparency; everything is passed immutably by-value unless you opt out. Generics are powerful and work exactly as you expect, no surprises. Universal function call syntax is ridiculously powerful: You can write the equivalents to methods and interfaces on types just by making procedures and functions that take a first parameter of that type; not needing those abstractions greatly simplifies and flattens code structure. It's just procedures and objects (functions and structs) all the way down.
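A minimal sketch of the UFCS point (the type and procs are my own toy example, not from my actual codebase):

    type Report = object
      title: string
      rows: seq[string]

    # An ordinary proc whose first parameter is a Report...
    proc summary(r: Report): string =
      r.title & " (" & $r.rows.len & " rows)"

    # ...is callable like a method, with no class or interface machinery:
    let rep = Report(title: "Q3", rows: @["a", "b"])
    echo rep.summary()   # identical to echo summary(rep)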
It's been a real joy to work with and reminds me of when I discovered D back in the day, only it's even better. If you imagine native-compiled type-annotated Python where nearly 100% of your code is business logic with no cruft, you're getting close to the Nim experience.
> “It prefers the stack so strongly that dynamic data structures (sequences and tables, basically its lists and dictionaries) are pointers on the stack to heap data, where the lifetime is managed by the stack frame.”
Isn’t that the same as a C++ vector or map on stack? They allocate internally as needed, and the whole container is destroyed when it goes out of scope.
It very much is, and the point is, it _used_ to be more like Java. Araq basically pulled off a very daring switchover from a reference-based language to a value-based one.
So now the language can credibly claim the same as C++ - no room left closer to the metal. But it's packaged in a much nicer syntax (imho), and has features like macros, which we can expect in C++ in maybe 10 years, if we're lucky.
Basically, but it requires no extra syntax. `var some_seq = @[1, 2, 3, 4]` is a stack-managed sequence. That's all there is to it. There's no unwrapping any pointers or boxes or what-not, the type is just `seq[int]`. Put another way, things that have become best practice in C++ are default in Nim with no syntactic noise.
Indeed there's no question that Nim is basically following C++'s lead on this. Nim iirc always had constructors and destructors. Final piece of the puzzle is move semantics, and I recall a blog post where Araq came up with something very similar.
yes, Nim has move semantics, but takes care of you more than C++ does.
for example, if you use an object that was previously moved, you don't get garbage: the compiler turns the earlier move into a copy (and tells you)
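Something like this minimal sketch (untested; the exact diagnostics you get depend on compiler version and flags):

    proc consume(s: sink seq[int]) =
      discard s.len

    var a = @[1, 2, 3]
    consume(a)   # would be a move if this were the last use of a...
    echo a       # ...but a is used again, so the compiler falls back to a copy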
Dynamic data structures by their nature have to be allocated to the heap. What I mean by "prefers the stack" is that you don't have to make a managed ref and dereference a managed pointer type. You just make a `seq[int]`, use it as a `seq[int]`, and pass it as a `seq[int]`, just like stack data. Behind the scenes, it has a unique scoped pointer with no mental overhead.
Sounds like a vector/array/list in any other language after C++, like Go slice, Java ArrayList, Javascript array, Python list, Rust vec. Is there something I'm missing?
I don't see them drawing any distinctions from C++ or Rust there either. It really sounds to me like most of their low-level experience is in C, where the contrasts they appear to be drawing really do apply.
it depends on what you mean by 'dynamic' and 'stack'; certainly, outside of nim, you can allocate a list of, say, integers entirely on a stack in any of the following cases:
- the size of the list is known when you create a stack frame, as in c:
int xs[n];  /* a vla can't take an initializer; memset it if you need zeros */
- when the list grows, it grows in that subroutine and not some callee, for example using alloca();
- the list is built on a stack that isn't the one you have to pop your return address off of; examples include perl's data stack, ada's secondary stack, forth's operand stack, forth's dictionary, or an mlkit region. in these cases you can even return the dynamically built structure to a caller;
- each new callee adds some fixed number of items to a linked list, such as, in c, a recursive function whose stack frame holds a node that points at the caller's node
Another Nim user here. Typically you just build the project with `nim c <myProject.nim>`; since Nim has such a strong macro system, a lot of typical build stuff is just done with macros. Of course there's also the default Nimble package manager, which lets you list dependencies and tasks using Nim itself. This means that if you know how to write Nim, managing the build system is a breeze.
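For instance, a .nimble file is just declarative Nim (everything below is a made-up example, not a real project):

    # myproject.nimble
    version     = "0.1.0"
    author      = "me"
    description = "Example project"
    license     = "MIT"

    requires "nim >= 2.0.0"

    task test, "Run the test suite":
      exec "nim c -r tests/all.nim"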
Ugh, I just got done spending months fighting CMake before moving back to a position using Nim!
You can also compile C projects with Nim like bearssl [1]. Nim takes care to compile the C files and recompile them when config flags change. It's actually really nice.
After programming professionally for 25 years, IMO Nim really is the best of all worlds.
Easy to write like Python, strongly typed but with great inference, and defaults that make it fast and safe. Great for everything from embedded to HPC.
The language has an amazing way of making code simpler. Eg UFCS, generics, and concepts give the best of OOP without endless scaffolding to tie you up in brittle data relationships just to organise things. Unlike Python, though, ambiguity is a compile time error.
I find the same programs are much smaller and easier to read and understand than most other languages, yet there's not much behind the scenes magic to learn because the defaults just make sense.
Then the compile time metaprogramming is just on another level. It's straightforward to use, and a core part of the language's design, without resorting to separate dialects or substitution games. Eg, generating bespoke parsing code from files is easy - removing the toil and copypasta of boilerplate. At the same time, it compiles fast.
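As a tiny illustration of the compile-time evaluation side (not the parser generation itself), any ordinary proc can run in the compiler's VM:

    proc fib(n: int): int =
      if n < 2: n else: fib(n - 1) + fib(n - 2)

    const f20 = fib(20)   # evaluated entirely at compile time
    static:
      echo "computed while compiling: ", f20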
IMHO it's easier to write well than Python thanks to an excellent type system, but matches C/C++ for performance, and the output is trivial to distribute with small, self contained executables.
It's got native ABI to C, C++, ObjC, and JS, a fantastic FFI, and great Python interop to boot. That means you can use established ecosystems directly, without needing to rewrite them.
Imagine writing Python-style pseudocode for ESP32 and it being super efficient without trying, and with bare metal control when you want. Then writing a web app with backend and frontend in the same efficient language. Then writing a fast-paced bullet hell and not even worrying about GC because everything's stack allocated unless you say otherwise. That's been my Nim experience. Easy, productive, efficient, with high control.
For business, there's a huge amount of value in hacking up a prototype like you might in Python, and it's already fast and lean enough for production. It could be a company's secret weapon.
So, ahem. If anyone wants to hire a very experienced Nim dev, hit me up!
I've been using it as a scripting target for both games and other things I'm not allowed to elaborate on simply because it can transpile to C and C++. It's just really really nice to be able to manage the underlying run-time (the C environment) and on the top of that be able to use a high-level modern language with so many first-class citizen things (like JSON).
It really is a nicer, better Python. And I say that as someone who does like Python.
>Then writing a web app with backend and frontend in the same efficient language.
How does that work?
What I mean specifically is how convenient is it to use JS interop at dev time, and not just compile Nim to JS as a standalone lib?
Can we simply call something like the browser API directly from Nim (or with a fairly simple wrapper)?
Since Nim compiles to JS and C, you just have to tell Nim what is available in the target language, and then you can call stuff just as if it were a Nim function. These definitions can be auto-generated, and they can live in a package you can simply import.
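A minimal sketch of what such a declaration looks like for the JS backend (compiled with `nim js`):

    # Tell Nim that the browser's alert() exists and how to emit a call to it:
    proc alert(msg: cstring) {.importjs: "alert(#)".}

    alert("hello from Nim")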
> Imagine writing Python style pseudoocode for ESP32 and it being super efficient without trying, and with bare metal control when you want.
To be fair, I did have to spend like 2 hours tuning my ESP32 code for handling a 22 kSPS ADC where microseconds matter. ;) Mostly just to avoid extra allocations as I was pretty new to Nim at the time.
Ah, but no major regressions in performance or changes needed for ~4 years!
Congratulations to everyone involved and the entire Nim community!
Nim has been my language of choice for the past decade and I'm really happy with the new features in Nim 2.0. Some of them are real gamechangers for my projects. For example, default values for objects theoretically allow me to make Norm [1] work with object types along with object instances. And without the new overloadable enums, Karkas [2] wouldn't be possible at all (it's still WIP though).
Of all the recent changes, default values are my favorite. Aside from being generally useful and further reducing the need for initialisation boilerplate, it lets us guarantee valid state at compile time for things like enums - and, I assume, object variants?
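E.g., a minimal sketch of field defaults (toy types of my own):

    type
      Mode = enum modeSafe, modeFast
      Config = object
        retries: int = 3
        mode: Mode = modeSafe

    let c = Config()   # fields start from the declared defaults
    assert c.retries == 3 and c.mode == modeSafe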
Nim is a really nice language to write software in. Ship fast, enjoy the ride, produce very performant software. Unfortunately, in my experience it still has some sharp edges: juggling C/C++ compilers and options, very poor error messages, very situational libraries that only work on some settings and systems. Given the small community, tho, I can't really fault them for it. The VS Code integration works very well in my experience, rarely crashing.
If someone at Manning Publications is reading this, it would be great to have a book on the newer Nim version, but please consider using a different typesetting with more readable fonts. I purchased the great book by Dominik Picheta, but am forced to use the .pdf because the dead tree version uses thin fonts that I find extremely hard to read even with the right pair of glasses. Font components (arms, lines, stems, etc) are just too thin. Not being a youngster anymore, I naturally thought it was my fault and took the original K&R 2nd ed as a comparison, but still can read it perfectly.
unfortunately there is no ebook version, and it seems Araq is against publishing one (probably because of piracy worries) - for me, travelling a lot, a hard copy is a no-go.
What's the experience of writing a web backend with Nim? Did you use existing libraries/frameworks? How good is the concurrency compared to something like Go?
One of the reasons for choosing Nim was the ease of getting a production-ready web backend. For the core part of managing the backend we are using existing Nim libraries [1], and they are easy to expand and work with. I cannot give you a comparison with Go since I haven't managed Go projects that large - but for Nim we are all in on async and threading. I think channels within threading are the hardest part of Nim, but work is being done on it.
Nim has been my favorite language for a while now, and I'm very excited to see version 2.0 finally released. A lot of these features have been items I've been looking forward to for some time.
The only downside is some of the included modules being moved to 3rd-party repositories, as mentioned at the very bottom. It's not a big deal, but it was nice having SQLite support built into the standard library. I suppose once you support some databases, you'll be pressured to support more and more. I am a bit surprised to see MD5 and SHA1 support moved out, though.
Yes, an experiment was run a while back, incorporating community-maintained code in a "fusion" repo shipped with the compiler by default. It didn't work very well. Discoverability and maintainability of stdlib-like things is hard.
In my experience with the batteries-included stdlib approach, even if libraries evolve slowly, they tend to get a lot more attention wrt bug-fixing and performance improvements. Go's stdlib is the example here.
I find Nim to be an absolutely fascinating language. I've been trying to find a reason to use it on my job (my work is mobile-adjacent so the idea of compiling to JS and to ObjC is fascinating) but haven't gone beyond playing around with it so far. I've been comparing it to Rust and it's just so much simpler to get started with.
Had a look at Nim a few months ago - feature-wise it's a lot of what I wish Python had (easy interop with C/C++, statically typed, compiled, can be transpiled and executed on Android/iOS), but the ecosystem is small even though the language is not new. There are not many high-quality libraries such as numpy, scipy, pandas, or opencv in Python. It lacks some big player adopting it - it's too bad Unreal Engine didn't try to adopt Nim instead of creating their own new scripting language, Verse.
One thing I'm also lacking is out-of-the-box interop with C/C++ libraries without writing your own adapters (so that you can just import a header and be done with it).
Another thing is I wish it had similarly easy interop with Rust - just to increase adoption, and also because in Rust it's easier to find high-quality cross-platform crates (including mobile) that work without hassle even on mobile devices.
I worry that in a few years either Python will catch up (because of faster Python, no-GIL, nuitka, briefcase for mobile, etc.) or Mojo will eat Nim's lunch.
To be fair to Nim, only Python has the huge ML ecosystem of numpy, scipy, pandas, opencv, pytorch, tensorflow, keras... Doing ML/AI-style work in anything but Python is really hard!
That said Nim does have the nimpy library that allows for pretty seamless interop with python. Which means you can just import PyTorch, or scipy, or opencv and use them in Nim.
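A minimal sketch of what that looks like (assumes nimpy is installed and a Python with numpy is on the path):

    import nimpy

    let np = pyImport("numpy")
    echo np.mean(np.array(@[1.0, 2.0, 3.0]))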
for me (a mobile developer) interop with Python is not enough because of the really poor Python story on mobile devices (iOS/Android) when using native modules. I think if Nim had seamless interop with Rust or even Zig it could piggyback on those communities to get some libraries for free.
That looks interesting. Unfortunately it looks like it hasn't been updated in a while? Is that because it's complete or a lack of interest?
For example, the approach mentioned at the bottom of the README of integrating via nlvm (https://github.com/arnetheduck/nlvm) sounded great but appears to be unpursued.
that looks interesting, thanks! Did you try it - does it deliver on its promises? There hasn't been a new commit since 2020, so I'm not sure if the project is stale by now.
Things might have changed, but last time I checked you could easily call C functions, yet you had to declare each single C function, struct, etc. You couldn't just import a single header file and be ready to call any function in the library. There is a pending project, Futhark [0], but I'm not sure how mature it is, and that's still only for C libraries (instead of C++ or Rust) - though maybe it's easy to adapt for Zig, which would be great nonetheless.
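For reference, the per-symbol declarations look like this (a minimal sketch using the stdlib's C FFI):

    # Bind one C function by hand - this is the per-function work described above:
    proc c_sqrt(x: cdouble): cdouble {.importc: "sqrt", header: "<math.h>".}

    echo c_sqrt(2.0)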
Most of those libraries are written in C++ or C, which Nim has excellent support for. I've used opencv c++ library with Nim. It's just that opencv is so massive it'd take a lot of work to wrap well, so I haven't yet. Some folks are working on a pure Nim pandas lib too.
Anyone have working experience with Nim and Zig? I'd love to hear how they compare and contrast. I'd also like to see some idiomatic web server benchmarks between the two (now with Nim v2).
I've used both to work on a hobby OS project (Nim[1], Zig[2]). I very much prefer Nim. Code is succinct, elegant, and lets you focus on your core logic rather than fighting the language.
Zig is nice and I like its optionals support and error handling approach. But I was put off by its noisy syntax, e.g. !?[*]u8 to represent an error union of an optional many-pointer to u8. Also having to prepare and weave allocators throughout most of the code that needs to dynamically allocate (which is most of the code) gets in the way of the main logic. Even little things like string concatenation or formatting become a chore. Zig also doesn't have dynamic dispatch, which makes polymorphic code hard to write; you have to work around it through some form of duck typing. In the end I realized that Zig is not for me.
Couldn't edit my post, but forgot to mention my main pain points with Nim have been:
- its module system, especially not being able to have mutually recursive imports (there has been a 7 year old proposal[1])
- order-sensitive declarations of procs (i.e. you can't use a proc defined further down in the file unless you add a forward declaration for it). There's an experimental pragma[2] for this, but it often stops working once you introduce mutually recursive calls (a minimal sketch of the forward-declaration dance follows this list)
- object variants requiring declaration of a separate enum instead of allowing inline declaration of the variant cases, and a closely related issue[3] of not being able to use the same field names under different variant cases.
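The forward-declaration dance mentioned above looks like this (toy procs of my own):

    proc pong(n: int)            # forward declaration required...

    proc ping(n: int) =
      if n > 0: pong(n - 1)      # ...because pong's body appears later

    proc pong(n: int) =
      if n > 0: ping(n - 1)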
If I recall correctly, lazy symbol resolution, which would allow both circular module imports and order-independent procs, was initially on the roadmap for 2.0. It has since been moved to a stretch goal for 2.2.
I maintain auto-generated bindings for my C libraries for Zig and Nim (and Odin and Rust - although the Rust bindings definitely need some love to make them a lot more idiomatic).
I think looking at the examples (which is essentially the same code in different languages) gives you a high level idea, but they only scratch the surface when it comes to language features (for instance the Zig examples don't use any comptime features):
And which language did you enjoy coding in the most? Yeah, a subjective question :-). (Edit: Missed the auto-generated part, so maybe you don't have an opinion on experience regarding this?)
I think actually Odin, although I was surprised by that (being more of a Zig fan). Odin has some neat convenience features which are nice for higher level code, while Zig can be a lot more 'draconian' by enforcing correctness even at the cost of some "line noise" (but I guess both languages are still in flux, so that might change).
As for Nim I enjoyed it initially (because of the Python vibes I guess) but the automatic memory management gets confusing quickly. IIRC there's quite a few different reference types - but maybe that has been simplified in 2.0
PS: Even though the bindings are auto-generated, I still try to make them 'language-idiomatic' by injecting some 'semi-manual' mappings for things like naming conventions or implicit type conversions, ideally getting the API close to what a 'native' API would look like - at least that's the goal.
I’ve written programs in both, though it’s been a while since I used Nim now.
I think I enjoyed writing Nim more. Zig is more boring, but for all the right reasons. I wouldn’t personally choose to write an OS in Nim, but I think Zig would be great for that when it’s mature. I personally started using it for embedded software.
I would probably use Nim for CLI tools, server applications, maybe GUI applications and games too.
The Zig team seems to be putting much more effort into the whole compiler infrastructure, which is really amazing in my experience. There are some great innovations there.
I suspect Nim would be much, much harder to wrangle for games than Zig (or easily the best of the bunch: Odin) since it doesn't make enough things clear at all in terms of allocation and only allows indirect control of allocation and deallocation.
I wouldn't necessarily prefer Nim for any of the things you listed either, but the argument there isn't the same as for games with Odin (which has great tools and libraries for making games, as well as giving a much better overview of the important things you'll have to care about when making them in terms of performance, etc.).
Rather, it's because I've found that Nim belongs with the other languages that think that complexity can be managed by being hidden well enough, which I've found is simply not the case when something actually needs to be debugged or you need to understand the behavior of the program.
Hiding/ignoring allocation errors, not making allocation explicit, not making deallocation explicit, etc., makes for a much worse time actually understanding what's going to happen. Adding tons of GC options like alternative GC implementations isn't going to fix it and this new one is really just another example of trying even harder to hide complexity.
I think the ultimate irony of these languages that have magical features like move semantics is that they do some of those things in the name of performance, but in practice many of them are so complicated to write well-performing code in, with their space-technology features and non-obvious behavior, that the end results are worse than in much, much simpler languages. I've also found that those simpler languages' development cycles (for the end user) aren't that much longer than the space-tech ones, because there is ultimately much, much less to use in them, so people end up just writing the actual code instead of trying to wrangle all of the magic.
Many game developers want to focus on writing games instead of fighting a memory allocator. Unless you're making a 3D game with realistic graphics, you don't need every last bit of performance.
No one needs to be fighting allocators; they are far less inhibiting and pose less of an issue than GC or RAII will in the vast majority of cases. They're far easier and simpler to deal with on the whole as well. You're always interfacing with memory management somehow and the implicit way is usually much harder to work with overall. The idea that having an allocator and explicitly working with it is for "that last bit of performance" is a bit disingenuous, you're usually losing far more than that with implicit allocation and deallocation. On top of that you simply inherently have a harder time understanding the behavior of your program.
Manual memory management is one more thing to care about instead of the actual logic. With automatic memory management, you don't need to think about memory at all; what could be simpler?
Easier in the best case and much harder in the worst, when your lack of thinking is an issue (which it definitely will be unless you're prepared to use more of the machine for no reason). Simplicity is not about what's easier to use, it's about how you interface with something, how simple and straightforward that interface is to use, how many things are implicitly or explicitly affected by that thing, and so on. Automatic memory management usually implies an assumption that allocations can't fail, memory is infinite, etc., so the assumptions and complications are many. It also adds more code you didn't write and have no direct control over, which complicates your problem solving in many ways.
GC or other automatic memory management is only easier if you have absolutely zero care for resource usage. RAII, for example, will oftentimes lead to one allocation and deallocation per object, unless you take care to not have it be so, which is an immense waste of resources.
It's fine if you don't care and you know that that's going to produce slow, bad software, but let's be honest about that instead of saying you can not care and everything will be fine.
There were loads of specific differences, but if I could characterize both languages in a simple way:
- Nim seems to emphasize being a swiss army knife in the way that Python is, except as a compiled language.
- Zig is a much more focused language that tries to hit a certain specific niche - being a successor and replacement for C - and hits that mark spectacularly.
I think language preference comes down to what your personal needs and wants out of a new language that isn't being served by whatever you're using currently. I personally landed in the "Zig" camp because the way it approaches its ambition of being a C successor is intriguing, but I could see why other people might land on Nim.
It is maybe the simplest web server implementation, similar to what you get from "python3 -m http.server"? What sense does it make to compare highly focused web server frameworks to a language's most basic stdlib implementation? It's very much apples vs oranges. (I thus also don't get why the proposal to compare against actual web frameworks for Nim is downvoted so much.)
Nim's default json library is terrible in performance, but there're much faster drop-in replacements like jsony[1]. I'm not sure that's the main issue for low rank, but it's definitely one of them.
I would not call std/json "terrible in performance" - it's probably still way faster than what you get in many other languages (like Python). But yes, the JSON lib I wrote is faster due to avoiding branches and allocations.
Interesting. The Vercel benchmarks make it look pretty good. Only slightly behind rust. https://programming-language-benchmarks.vercel.app/zig-vs-ru... Benchmarks are as much about the skill of the programmer as they are about the language. I suspect those numbers could improve drastically.
Possibly it wasn't compiled with `-d:release`. I only looked briefly — is there a way to see the source code and cli flags used for the various implementations?
It isn't really a language community that's so interested in web server efficiency, and until recent years threading efficiently was kind of tricky with the GC scheme they used. If someone wanted Nim to rank high they could do it, but I'm not sure it is worth the effort?
There are a lot of features in Nim that are basically the polar opposite of Zig's values: macros/templates, as opposed to comptime, which has no real capability of just inserting random code; and the very pervasive naked imports (functions/methods can come from anywhere) that are all over the place, as opposed to the explicit imports and qualified names you would have to use in Zig (or deconstruction of imports to get the bare names, making it obvious where an identifier is coming from).
On top of that you have only indirect control over memory allocation and deallocation, which goes completely against Zig's values where custom allocators are used and everything that allocates should take an allocator as an argument (or member in the case of structures). In contrast to that there isn't even the concept of an allocator in the Nim standard library.
I would say that my experience with Nim has made me fairly certain that Nim has absolutely no desire to make things obvious but rather chooses convenience over almost everything. It's not so much a competitor (in performance or clarity) to Odin or Zig as it is a competitor to Go or something with a much higher-level baseline.
On top of all of this it doesn't really have tagged unions with proper support for casing on them and getting the correct payload type-wise out of them, which is an incredibly odd choice when all of its competitors have exactly that or an equivalent.
Overall I would say that coming from Odin or Zig (or Go) and actually liking those languages it's very hard to like Nim. I could imagine that if someone came from a much higher-level language where performance is nearly inscrutable anyway and nothing is really obvious in terms of what it's doing, Nim would feel like more of the same but probably with better performance.
Edit:
Often while reading the Nim manual, news and forum posts, etc., I get the sense that Nim is really just an ongoing research project that isn't necessarily trying to solve simpler problems it already has along the way. If you look at some of the features in this announcement, it's hard to see anyone ever asking for them, yet here they are. In many ways it's way worse than Haskell, which often gets derided as "just a research language". A lot of what Nim has makes for a much worse experience learning and using the language and I'm sure it doesn't get easier in the large.
> It's not so much a competitor (in performance or clarity) to Odin or Zig as it is a competitor to Go or something
That seems accurate. Dealing with raw pointers as one does in Odin or Zig is very much de-emphasized in favour of dealing with safe references, and a lot of effort is put into optimizing out all the overhead of those reference checks (hence ARC/ORC) and writing code to evade them. The manual memory management features of Nim are there for flexibility and fallbacks and are not really the main way to write code: even for embedded. The stuff that Zig (and Odin?) do surrounding allocators and alignment, and constructs for slightly-safer pointers, are really very interesting yet are most helpful if you are indeed working with pointers and worrying about offsets: which you usually aren't in Nim.
I am curious as to what you mean about comptime, though. I have gotten the impression that equivalent constructs in Nim are more powerful. You have `static` blocks and parameters, `const` expressions, `when` conditionals, and then also both templates and typed macros operating on the AST (before or after semantic checking)... `when` even provides for type-checking functions with varying return types (well, monomorphized to one type) via `: auto` or the `: int | bool | ...` syntax.
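A minimal sketch of a couple of those compile-time constructs (my own toy example):

    # `when` branches are resolved at compile time; `static` runs in the VM:
    when defined(release):
      const buildKind = "release"
    else:
      const buildKind = "debug"

    static:
      echo "compiling a ", buildKind, " build"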
I will also defend "naked imports" as a feature that works very well with the rest of the language: functions are disambiguated by signature and not just name and so conflicts scarcely occur (and simply force qualification when they do). And, this allows for the use of uniform function call syntax - being able to call arbitrary functions as "methods" on their first parameter. This is incredibly useful and allows for chaining function calls via the dot operator, among other things. Besides, if you really want you can `from module import nil` and enforce full qualification.
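E.g., a one-liner chaining ordinary stdlib procs as if they were methods (just std/sequtils and std/strutils):

    import std/[sequtils, strutils]

    echo @[1, 2, 3, 4].mapIt(it * it).filterIt(it mod 2 == 0).mapIt($it).join(", ")  # -> 4, 16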
> I will also defend "naked imports" as a feature that works very well with the rest of the language: functions are disambiguated by signature and not just name and so conflicts scarcely occur (and simply force qualification when they do). And, this allows for the use of uniform function call syntax - being able to call arbitrary functions as "methods" on their first parameter. This is incredibly useful and allows for chaining function calls via the dot operator, among other things. Besides, if you really want you can `from module import nil` and enforce full qualification.
This is spot on. You can also not really have productive and well-fitting errors-as-values in a language that emphasizes UFCS, which is why Nim (and D) have to have exceptions. In order to productively use errors as values in Nim you have to chain some kind of `Result` type (which, if you `map` & `mapError` over it, will have to be able to implicitly allocate in certain cases, etc.), so the list of potential victims of this (and other features) just seems to go on and on.
In general, if you go over the list of features in Nim there is a coherence in them only in that some of the (mis)features actually have to exist in order for other features to make sense. I would feel like it was "designed" except in the case of Nim it really feels mostly accidental and not very well thought out in general. The end result is (for me) that it feels very much like it ended up on the wrong side of readability, clarity and overall coherence.
> You can also not really have productive and well-fitting errors-as-values in a language that emphasizes UFCS
Eh, https://github.com/arnetheduck/nim-results and associated syntax from https://github.com/codex-storage/questionable would beg to disagree. Nim's stdlib does not have productive and well-fitting errors because it suffers from inertia and started far before the robust wonders of recoverable error handling via errors-as-types entered the mainstream with Rust and were refined with Swift (IMO). Option/Result types are fantastic and I do so wish the standard library used them: but it's nothing a (very large) wrapper couldn't provide, I suppose.
I do strongly think that other languages are greatly missing out on UFCS and I miss it dearly whenever I go to write Python or anything else. I'm not quite sure how you think UFCS would make it impossible to have good error handling? Rust also has (limited, unfortunately) UFCS and syntax around error handling does not suffer because of it. If by errors-as-values you mean Go-style error handling, I quite despise it - I think any benefits of the approach are far offset by the verbosity, quite similarly to Java's checked exceptions.
(in general concerns surrounding performance of errors surprise me - they're errors! they shouldn't be hit often! but if they are, you can certainly avoid such performance hits in nim.)
So, Nim doesn’t seem to be under an umbrella of a non-profit. Isn’t this destined to be a problem at some point regarding either acquisition of rights or succession?
Edit: Ouch. Just found this thread. Very disappointing, and actually makes a greater case for institutional ownership: https://forum.nim-lang.org/t/10312
I think "default" (Mercurial) is a way better name than anything else for the default branch, duh. But, I err on the side of the disadvantaged because it's impossible to fully empathize with their individual experience, so I try to use "main" whenever I can.
That said, I'm a Slav, which is the origin of the word "slave", because in Europe slaves were predominantly Slavs once. I don't really mind it because it feels irrelevant today. The connotations of "master" don't feel that ancient yet, though, considering that black people weren't allowed to live in Palo Alto, CA (heart of Silicon Valley today) until the 1950s.
Despite the remark about confusing people over 50, a primary branch called "master" isn't exactly an unalterable ancient tradition in version control, either. "Trunk" was common in centralized VCSes. I had to get used to "master" and "main" is at worst a lateral move.
I also found that thread. It was very pleasing and it seems he has definitely thought about the issue. While his response seems to go from 0 to 100 in a second, he is still honest and gets to the point instead of waiting for all the "why not?"s to roll in.
It's not surprising he has his head screwed on straight. There is clear genius in Nim's design. I'm not a genius, and I don't know much about compilers, just scant knowledge of some data structures and algorithms, but what I do know is that being able to make something so powerful be used by mere mortals like me is very much genius (an idiot values complexity and all that jazz).
I don't think that one project dying with its maintainer because he thinks woke is stupid is comparable to the community-hostile changes in trademark policy. No, not the flip side at all.
But I left it because of recursive imports. I had to basically put all my types into one file and use them from various others. For a relatively medium-sized project (~10k LOC), it's a bit of a hassle. Refactoring is an issue.
That being said, the language is fantastic. Can anybody with experience suggest which HTTP library/framework they prefer for servers?
Node really doesn't work with circular imports. There are runtime gotchas with it, for example destructuring a cyclic import via require() will give undefined for the destructured values as they "don't exist yet".
Go had full time engineers designing the language, tooling, docs, etc. Nim has never had huge industry sponsorship, so comparing the languages on age alone is hardly fair.
When I looked at it a few years ago, the compiler didn't prevent you from accessing fields from the wrong variant, and didn't provide exhaustivity checks. So I think it still falls short of this (excellent) litmus test :/
Oh, no, I actually misunderstood cobby's complaint: the field names, yes, those still have to be unique. Which is also a bit annoying, though I've seen discussions about changing it.
I feel the same way as you! I've seen many language ideas come and go in my career and sum types are one I feel now should be a basic requirement. I miss them in any language without them.
An ABI for languages with a proper type system seems fantastic. Swift, Rust, Nim, D all share very similar type systems (and memory management systems) and it would be very cool to see what kinds of interop easy dynamic linking would allow.
> Now one can define constructors and virtual procs that map to C++ constructors and virtual methods, allowing one to further customize the interoperability.
I hope this will help with bindings for C++ libraries that have historically been tricky to wrap.
For example, I would like to use Qt from a compiled language that's a pleasure to use, and this project looks promising:
So "var i: int" is value, "var i: ref int" is a heap allocated reference that's deterministically managed like a borrow checked smart pointer, eliding reference counting if possible.
You can turn off GC or use a different GC, but some of the stdlib uses them, so you'd need to avoid those or write/use alternatives.
Let me say though, the GC is realtime capable and not stop the world. It's not like Java, it's not far off Rust without the hassle.
1. Nim uses the 'var' modifier to pass by reference, e.g. "proc (n: var int)..."; the default behaviour is pass by value. And there are also raw pointers and references (safe pointers).
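A minimal sketch of the `var` parameter behaviour:

    proc bump(n: var int) = inc n   # n is passed by reference

    var x = 1
    bump(x)
    echo x   # 2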
>is a no-gc mode available?
You can disable the GC, but most of the standard library depends on it. Nim 2.0, though, finally makes ORC (ARC + a cycle collector) the default.
Nim is a better Python (syntax-wise) that compiles to C (or C++, JS, etc., but C is the default) with GC turned on by default. I have always wanted to use it outside of what my job needs (C/C++/Python). I hope some big players adopt Nim to make it one of the mainstream languages.
Reddit was hiring for Nim positions, so demand is growing. New languages have an easier time being adopted at startups, which grow into big players eventually.
Nim looks awesome. Does anyone know why it doesn't have first-class support for wasm? That's the only thing that would keep me from diving into it more.
I think the short answer is it's built on top of C tooling so it doesn't really need another way to do it because you can use emscripten. Search their forum for "web assembly".
The reason I ask is that I was poking around and saw some projects to help with wasm compilation, and on this random list of wasm-capable languages [0], Nim is listed as "Work in Progress." Notably, Swift is ranked higher, and I view Swift as extremely experimental when it comes to wasm
With 2.0 out and ORC being the default, it compiles and works just fine. There are not too many libraries specializing in wasm stuff though, so you gotta use emscripten or similar.
I suspect it's simply a size-of-community thing. If you want it, you should take a crack at implementing it! Or at least start a thread about it on the official developer forum.
The last time I used the language, it was still using a garbage-collector and there were talks about transitioning towards a new way of doing things - I assume that ARC/ORC ended up being that destination.
Now that ARC/ORC is considered "complete," are there any remnants of the old GC still in the language, or has the entire ecosystem hopped over?
It looks simple but in a typed language it's actually somewhat tricky. The compiler needs to infer that the 1 is a float type, 2 is a byte, and compile it appropriately.
Previously Nim didn't do any "reverse" type inference, so you'd need to say `@[(1'f64, 2'u8, "abc")]`. That was because it's a constraints problem that can become exponentially expensive to solve. Exploding compile times in Rust and Swift are good examples of this. But there are limited subsets which can still be quick and are helpful, like this case.
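The 2.0 behaviour, as I understand it (a minimal sketch):

    # The declared type now flows "top-down" into the literals:
    let xs: seq[float] = @[1, 2, 3]   # the literals are typed as float
    let ys: seq[byte]  = @[1, 2, 3]   # and here as byte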
> It looks simple but in a typed language it's actually somewhat tricky.
But that example looks about as simple as it can be, so I clearly must miss something.
> The compiler needs to infer that the 1 is a float type, 2 is a byte, and compile it appropriately.
And I don't understand _why_ it has to infer anything, as the type is explicitly declared.
I mean, there are 2 possibilities:
* 1 is both a valid integer and a float literal => Nim needs the type declaration on the left to unify the type (from "integer or float" or "numeric" or whatever the type checker inferred) to `float`.
* 1 is not a valid float literal (but an integer) => the type is not inferred, but implicitly converted to `float`.
In both cases the solution does not involve inference?
You're completely right, but believe it or not Nim 1.6 actually doesn't manage to connect the dots between `1` and it being a possible `float`, `int64`, etc. Even if you wanted a different-size integer literal you'd have to say, for example, `42'i64`. You would be forgiven for asking how the language has purity checks for functions (`func` vs. `proc`) but somehow does not have this fairly elementary implicit type conversion (where Odin manages to even say `1.0` is a valid int value, for example, but won't permit anything that is not safely representable as a conversion).
> And I don't understand _why_ it has to infer anything, as the type is explicitly declared.
The seq declaration doesn’t need to be inferred. However the right side does need to be inferred from the declaration.
> I mean, there are 2 possibilities: * 1 is both a valid integer and a float literal => Nim needs the type declaration on the left to unify the type
Yep, `1` is an ambiguous number literal. So the compiler needs to propagate the info from the type back into the assignment expression. Not super hard to do for simple cases, but it can become expensive for complex types.
The problem apparently is that they didn't actually have full type inference and the expressions on the right were given their types (probably a tuple of int x int x cstring) before they attempted the assignment into foo. Now they're using unification (they call it "top-down inference", so I'm guessing it's regular unification like other type inference systems use) so that the expression on the right will have the correct type and be assignable into the variable on the left.
I use pixie for my game framework. It's used to load textures, lay out fonts, and render font atlases. It can be used for generative art, but https://github.com/EriKWDev/nanim or using sdl2's renderer makes more sense as they are GPU accelerated.
I gave a talk about it here: https://www.youtube.com/watch?v=elNrRU12xRc including some more intense use of Nim (for inline PEG grammars and data-parallel processing with Weave)
I don't know if my particular version is noteworthy, but I recently started making updated Nim bindings for OpenCV and it was kinda fun. I don't consider myself an advanced C++ programmer, but Nim made the process easier than I had feared it would be. https://github.com/tapsterbot/mvb-opencv
Not familiar enough with Nim to figure it out - are the bindings auto-generated in a similar style to the opencv bindings for its other supported languages (python, julia, objc, rust, etc)?
They are currently not auto-generated. (I only implemented the absolute minimum to get started calling the most commonly used OpenCV methods from Nim.) Hopefully the bindings will be auto-generated in the future, though!
Shameless plug:
I'm working on a programming language called Yaksha that is also inspired by Python-like syntax; however, its philosophy differs from Nim's. Please take a look and let me know what you think :) https://yakshalang.github.io/documentation.html
Just as a general comment, the website is extremely hard to read at the default zoom, mainly due to the font size but also due to the lack of contrast on the background color and text color. The code samples in particular are very hard to read due to this.
The light theme is a little bit better. I usually don't have a problem with text without much contrast though, so you may want to get opinions other than mine. The main problem for me is the small text size.
I have updated the CSS: a font with more weight, relative font sizes, and high-contrast themes for both light and dark (rrt theme, the default theme in pygments).
Your programs could benefit from small dependency-free executables and compile time code generation and execution.
Nim code can also be called directly from Python or vice versa; check out nimpy [1].
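The Nim-to-Python direction is a small sketch like this (the module name is made up):

    # mylib.nim - expose a proc to Python via nimpy's exportpy:
    import nimpy

    proc add(a, b: int): int {.exportpy.} =
      a + b

    # build: nim c --app:lib --out:mylib.so mylib.nim
    # then from Python: import mylib; mylib.add(1, 2)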
Following up on this: As someone who uses Python with NumPy/SciPy heavily, are there any Nim libraries that would make the transition smooth? Libraries that can help with e.g. sparse matrices, linear algebra, differential equations, etc.
Generally, projects created by Mamy Ratsimbazafy (mratsim) are a good start since he's very adept at optimising data science-related libraries.
You might want to ask in the #science channel of the Nim Discord server since although it's often quiet, that's where people working on these repositories hang out.
That might depend on how many raw Python loops and functions you use. Even if most of your code uses pandas and numpy, things like string processing could still benefit from a compiled language.
Here's a presentation from last year where a Python data scientist compares a Python and Nim implementation for a problem, with the Python version calling out to Numpy. There are performance comparisons at the end and his Nim version was faster so Nim should be usable for scientific programming:
The big issue Nim faces isn't performance but rather the relative community sizes, and thus how many libraries are available (and also how much help you might find when you run into problems).
- Tooling is not great. The language server has a tendency to silently crash on occasion, and it's no rust-analyzer to begin with. A tooling rewrite has been delayed behind proper incremental compilation, which has been delayed behind ARC/ORC...
- Interfaces ("concepts") are experimental and there are two differing implementations.
- It lacks proper sum types and structural pattern matching in the core language. There are a number of quite good macro-based libraries that provide for this, however: fusion/matching, andreaferretti/patty, beef331/fungus, alaviss/union...
- Optional types are not the standard: the stdlib will throw exceptions. This is more so a personal preference than anything.
I dearly wish there was something like LibGDX for Nim. I have big Java projects I'd probably move over... scene2d is great for simple cross platform UIs.