The Odin Programming Language (odin-lang.org)
194 points by gingerBill on Jan 31, 2020 | 141 comments



Odin seems rather similar to Zig: https://ziglang.org/

It would be interesting to hear from anyone who has spent time in both about the differences in experience.

It’s a trend of “Better C than C” languages that don’t go as far as Rust or C++ in terms of either complexity (or safety) but are more opinionated and modern than C. I think Nim can squeeze into this category, too, but maybe not? Nim is confusing to me.

Jon Blow is also working on one of these, but it isn’t released yet: https://github.com/BSVino/JaiPrimer/blob/master/JaiPrimer.md


You can add Beef to that list: https://www.beeflang.org/

I've actually enjoyed playing around with Beef, the dev basically took C++ and gave it the syntax and core library design of C#. Here's a page from the guide just to give an example of the look and feel: https://www.beeflang.org/docs/language-guide/memory/

Note that despite the appearance, there's no GC and no ref-counting; memory management is explicit and manual.


This is one of the languages I've liked the most, though I may be biased since I love C#; Beef is actually made by a legendary PopCap developer ;) Loved PopCap back in the day: the PopCap Framework, the HGE game engine, etc. Good times.


I hadn't seen Beef before. Is it 'as fast as C'? The docs look nice.


I can't speak to whether it's literally as fast as compiled C code. I imagine there will be common use cases where it's very slightly less efficient due to its support for things like dynamic dispatch, but I don't see any reason, in my brief experience with it, why it couldn't at least be as fast as equivalent C++ code. It uses LLVM for its backend.

One of my favorite things about it is the way it handles memory during debug runs. (My #1 favorite thing is that it was all developed by one guy working solo - Brian Fiete, a cofounder of PopCap Games.) First, it's able to trace allocated memory during debug execution and immediately report on memory leaks, along with the location in code where the leaked memory was allocated. Second, it can guard against use-after-frees in debug by marking the memory freed, but not actually reclaiming it. Any subsequent writes/reads to that location will immediately fail with the object's allocation stack trace.

Both of these are turned off during normal builds but seem to be excellent debugging tools.

The IDE is also cool and was developed in conjunction with the language. It's not the most feature-rich application and there are some bugs as you'd expect, but it's a perfect example of dogfooding as it was written in Beef. Hot code reloading is also supported.

Note that I have zero connection with this language, other than that I really enjoy it so far, and that I reported a minor issue that is probably very specific to me and me alone, and Brian replied to me personally and we had a pleasant brief exchange.


Odin can achieve the same kind of memory-leak checking with its custom allocator system. You can very easily have a custom allocator track every allocation, including the source location of each one. This is because Odin was heavily designed around custom allocators as a core feature.

And coupled with the implicit `context` system, it can very easily profile third party Odin code's allocations too!
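
Roughly, it looks like this (a minimal sketch; the tracking-allocator helpers in `core:mem` used here may differ between Odin versions):

    package main

    import "core:fmt"
    import "core:mem"

    main :: proc() {
        track: mem.Tracking_Allocator
        mem.tracking_allocator_init(&track, context.allocator)
        context.allocator = mem.tracking_allocator(&track)

        // Everything below (including third-party Odin code called from here)
        // now allocates through the tracking allocator via the implicit `context`.
        leaked := new(int)
        _ = leaked

        // Report anything never freed, with the source location of the allocation
        for _, entry in track.allocation_map {
            fmt.printf("%v leaked %v bytes\n", entry.location, entry.size)
        }
    }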


BeefLang will report leaks as they occur, not when the program shuts down. There's a tracing "GC" in debug mode for detecting unreachable memory in realtime. It's also used for reliably detecting use-after-free since memory is held when there are references and released when the last reference goes away -- again, a debug-only feature.


This is totally possible with custom allocators in Odin too. You'd be surprised by how much custom allocators can do, yet so many people know little about them.


Am I understanding correctly that in Odin this feature could even be provided by a third-party library implementing such a custom collector? If so, that's seriously cool.


Thanks for the mention, Gene. Thanks for being an early adopter, and keep those bugs and feedback coming in!

As far as performance vs C - there aren't a lot of features that incur dynamic dispatch: virtual method calls, dynamic casts (as/is keywords), and direct interface dispatch. When you use interfaces as generic constraints, those monomorphize into static dispatches unless the implementing method itself is virtual.

For C-style code, however, the performance should be the same as the C equivalent in Clang. File a bug if it isn't!


> One of my favorite things about it is the way it handles memory during debug runs. (My #1 favorite thing is that it was all developed by one guy working solo - Brian Fiete, a cofounder of PopCap Games.) First, it's able to trace allocated memory during debug execution and immediately report on memory leaks, along with the location in code where the leaked memory was allocated. Second, it can guard against use-after-frees in debug by marking the memory freed, but not actually reclaiming it. Any subsequent writes/reads to that location will immediately fail with the object's allocation stack trace.

Existing C++ tooling (sanitizers) achieves this too.


It’s also cross platform as hell. Amazing at this stage of development.


I had the pleasure of meeting gingerBill at Handmade Seattle:

https://mobile.twitter.com/odinlang/status/11957620628419379...

It was really fun to hang out and talk shop


I'm making it a point to generally have independent compiler authors at Handmade Seattle. You two were class acts -- thanks for attending!


Thank you for inviting us! It was a brilliant event, and thank you for all your hard work.


It was a pleasure meeting you too, Andy.


Jonathan Blow's language is the most interesting of these three for me because it will have nice metaprogramming features. The document you linked to seems slightly outdated. One of the beta testers has published some more up-to-date videos with his introduction to the language and initial experience: https://youtube.com/watch?v=i1vbvikDiI8&index=1&list=PLhEuCy...


Personally, I agree with Jon in that he is not too bothered about being the "winner" with regard to his language. He wants a tool that can help him actually solve problems he has. And he wants to make sure that his tool is good before releasing it to the public. Releasing a product before it is done is a can of worms in itself, not to mention the problems related to open source software itself. So it is entirely understandable why Jon has not opened up his language to the public yet.

I created Odin for the very same reason: I wanted a tool that made me more productive and helped me solve the problems that I actually have. Even if Odin only benefited myself, I would class that as a success, but it has been helping so many people create amazing things.


> Releasing a product before it is done is a can of worms in itself

Perhaps, but a programming language which is released when “done” is essentially dead on arrival. I’d guess the people who created amazing things with Odin helped the language move forward, right? Nothing wrong with keeping something to oneself, but promising for a long time to move something into an open-source model with no set date... I’ve rarely seen that end well. Hope I’m proved wrong though!


Is there a link to success stories?


Well, Jai is probably the least interesting language of the three for me, as the others actually exist. Only when Jai gets released can I see all its warts and glories.


I mean it obviously exists. He livestreams coding on the language and in the language. Sure very few people can actually use it right now, but existence is not in question.


But existing for someone else, and existing where I can use it are two different things. I'm sure there are lots of cool tools and things that exist at big companies like Google (or more interestingly say the NSA) but for all my intents and purposes they don't because I can't interact with them.

A reverse corollary to Russell's Teapot? So what if there is a teapot orbiting Jupiter, I can't get it so it allows me no ability to serve tea. Therefore it might as well not exist!


Respectfully speaking, I don't think we need to get anywhere near some weird "reality is relative" argument here.

Just say it's not useful to you because you can't use it. No need to say it doesn't exist, even if it just doesn't exist "for you."

(As an aside, I think a lot of societal problems in the world today have roots in relativism and such -- so it's a bit of a bugbear.)


I understand that. However, my comment was just saying that the existence the person above me was implying was useful existence. As in, we can all be excited about this new tool, or game, or whatever, but if we never get to use it, its usefulness is moot. So while the language certainly exists (we can observe him working on it, and he could even create a game using it that would have some societal impact), the current usefulness of the language to anyone except him is as if it did not exist at all.

Of course you could make the argument that its mere existence and him highlighting certain aspects could have influences on other language designers... and wow am I going down a tangent spiral now...

Anyways: TL;DR reality is objective but words can have complex semantics


But that's a big deterrent for a lot of people. Odin, Zig, and the like are languages that can be used right now. Yeah, they're not mature yet, but you don't have to wonder what it is like to program in them, and you can start building familiarity with them today.


It’s been a while since I was following Jai descriptions, but IIRC, Nim exists today and has all the metaprogramming features Jai is supposed to have - and to a large extent so does D.


Jai has no GC though, which is critical in game dev.


    nim c --gc:none
Nim has 5 GCs to choose from, as well as being able to have none at all.


The GC can also run incrementally to meet deadlines. It’s not like a Java GC. It has been used in microcontrollers. People should probably learn more about it before shooting it down based on the word GC.


Java GCs have also been used on microcontrollers, with soft real-time deadlines. There are plenty of JVM vendors out there, including a couple that are focused only on embedded development, like PTC, Aicas, Gemalto, microEJ, and Virtenio.


That’s great, but can you use the Nim stdlib with no GC?


Yes, with some relatively minor caveats, mostly having to release memory yourself.

Also, the arc/orc GC is shaping up and already allows you to use exclusively reference-counted memory management - so you get efficient, perfectly deterministic timing while still having automatic memory management. (As usual, if you introduce cycles, it becomes more complicated)

And the Nim compiler elides many ref/unref ops, as well as keeping objects thread-local, so most performance objections to ref counting don’t actually apply. (.... and you have a choice of other automatic GC modes, including “none”)


Good to know. Thanks for the explanation.


Yes, but you'll need to call dealloc [0] on the result of any call, unless you want to leak memory.

[0] https://nim-lang.org/docs/system.html#dealloc%2Cpointer


And D has the DasBetterC mode, but I'm not sure that's good enough: when the language has a GC by default, all the libraries use it and the APIs rely on it...

What good is a language if you can't (easily) use its libraries?


Nim’s relationship with GC is very different than any other language that I’ve used.

It has very different selectable GC systems - Boehm, Bacon/Dingle, reference counting, or real-time deadline mark-and-sweep, and “none”. Perhaps I forgot one. Some libraries rely on a specific GC behavior but most work with any (with the caveat that “none” requires you to manually deal with garbage).

Nim’s mark-and-sweep is suitable for embedded systems and games, unlike Java’s, and so is the ref counting one; but even if none of the GCs work for you, the fact that there’s many of them and they are mostly interchangeable means that the dependency on them is much, much weaker than you are used to (although it still exists)


Just like Java actually, each implementation has its own set of algorithms including none.


You can use Nim in games using deterministic, real-time GC. Also, the new ARC memory management will replace GC https://forum.nim-lang.org/t/5734


I like Jai's `use` keyword.

Fun fact: Odin has this exact same thing with `using`. https://odin-lang.org/docs/overview/#using-statement


Kotlin also has it as "apply":

    Windows().apply {
      width = 100
      height = 200
    }
Now, I can't help but see the following kind of code:

    val someStructure = ...
    someStructure.x = ...
    someStructure.y = ...
    someStructure.name = ...
as an immediate language code smell


Odin's `using` comes from Pascal's `with`, which was block-based. Odin's `using` can be applied to a load more than just scopes:

https://odin-lang.org/docs/overview/#using-statement


Yes, the general idea is not new but the devil is in the details.

As far as I can tell, Odin's `using` is applicable in the same areas as Kotlin's, but I still favor Kotlin's scoped version. When I see "using foo;" in Odin, it's not quite obvious to me where the scope of applicability is.

Any reason why you didn't do something like

    entity.use {
      // "this" is now the entity instance
    }
    // back to your regular "this"

?


You can easily do the same in Odin, if you would like. However, `using` works for a lot more than just this. `using` allows for many type system features beyond a basic scope import like Pascal's `with`: it can be applied to procedure parameters, to struct field declarations to allow for subtype polymorphism (even through pointers), to scope imports, and more.

Kotlin's approach is limited to purely what Pascal's `with` allowed.
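
For example, `using` on a struct field gives you subtype polymorphism (a quick sketch; the `Entity`/`Frog` names are made up here):

    package main

    Entity :: struct {
        x, y: f32,
        name: string,
    }

    Frog :: struct {
        using entity: Entity, // Frog now has x, y and name directly
        jump_height:  f32,
    }

    move_right :: proc(e: ^Entity) {
        e.x += 1;
    }

    main :: proc() {
        frog: Frog;
        frog.name = "Trevor";  // promoted field
        move_right(&frog);     // ^Frog can be passed as ^Entity because of `using`
    }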

But if you want to be clear:

    { using entity;
        // the fields of "entity" are now usable in this scope 
        x = 123;
    }
    entity.x = 123;


> Better C than C

The D programming language can be used as a better C:

https://dlang.org/spec/betterc.html

and only requires the C runtime library to link to.


Not to go too far afield, but I agree that Nim is in that category and I'm curious what parts of it are confusing to you.


Nim is a cool language, but it has a lot of things about it by default, such as garbage collection, which show it is not targeting itself as a C alternative.

Nim is also a very complex language with numerous features, making it a lot more complicated compared to C. So someone who likes C might not like Nim. However, I recommend everyone try out as many languages as possible to see what they like.

Being that I am the Odin creator, I do think you should give Odin a go, because it is much more focused on being a C alternative and on tackling the kinds of problems C is used to solve.


I really like Nim but GC makes it a nonstarter for embedded devices. So to me Nim is in the same category as Go.


The nim team is currently working on removing the garbage collector by means of a new reference counting based “garbage collector mode” called “arc” (for automatic reference counting). You can get more info in the following link, where it is described as “plain old reference counting with optimizations thanks to move semantics”:

https://forum.nim-lang.org/t/5734

The objective is to make Nim suitable for embedded programming and other use cases for which garbage collection is a non-starter. This new --gc:arc mode is already available in the nightly builds and the benchmarks are already impressive. I believe that the plan is to make arc the default “garbage collector mode” in Nim 1.2.


One of the issues with Nim is that ARC is still a form of automatic memory management, which many domains do not want. This is why Odin has been designed to take advantage of custom allocators, so that programmers have a huge amount of control over how memory is allocated at all levels. And coupled with the `context` system, you can also control and track how third-party code allocates things.

Custom allocators are a pleasure to use, and allow for so much control. Along with the temporary allocator, you can make Odin feel like a dynamic language whilst being extremely fast. Custom allocators are an under-utilized thing in programming in general, and I hope more people realise what is possible with them that is not possible with automatic memory management schemes.
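
As a small illustration of the temporary allocator (a rough sketch; `fmt.tprintf` allocates from `context.temp_allocator` in current Odin, exact helpers may differ by version):

    package main

    import "core:fmt"

    frame :: proc(i: int) {
        // tprintf allocates its result from context.temp_allocator
        s := fmt.tprintf("frame %d scratch data", i);
        fmt.println(s);
    }

    main :: proc() {
        for i in 0..<3 {
            frame(i);
            free_all(context.temp_allocator); // reclaim all temporary allocations at once
        }
    }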


Hey, I am going to show my stupidity for a moment but I have to ask:

Why is garbage collection considered a negative thing?

I have no experience programming low-level languages, but I do follow and try new/obscure languages for fun.

Zig, Nim and V were the few I found + tried first, but I learned about Odin and Scopes recently and found them both interesting.


I write C++ for video games, and garbage collection is often shunned in performance-critical systems since it's often nondeterministic and GC pauses can become too long. I think garbage collection could work in games, but it's commonly implemented for use cases that are not video games (eg: servers, desktop applications, etc.)

A programmer writing managed C# in an engine like Unity will often spend a lot of programming time ensuring that the code will not allocate every frame, as the additions to the heap will eventually trigger a pause.

That said, every game and its requirements are different, and some game development might not mind that as much. A C++ engine programmer on a Nintendo Switch is in a very different situation than a hobbyist in JavaScript or a server backend programmer on a mobile game.


Just like doing virtual calls or using the OS memory allocator used to be shunned by early C++ adopters in the game industry.

I still remember when doing games in Basic/Pascal/C was considered to be like Unity nowadays, real games had to be written in Assembly.

As you say, every game and its requirements are different, and many 8/16 bit games were perfectly doable in C, Pascal, Basic and eventually the community moved along, just like it happened with C vs C++ a couple of years later.

I see the use of GC enabled languages the same way, and C# belongs to those languages that also offer other means of memory allocation, not everything needs to live on the GC heap.


I think you’re absolutely right here. The reason people often dismiss garbage collection in game programming is the pauses, but if the pauses aren’t noticeable then the reason to dismiss goes away. Computer performance gains over time can totally help dismiss that, much akin to virtual calls or default allocators. The writing on the wall was there after games like Minecraft became a huge hit.


Garbage collection is not so much considered a negative thing, but a thing that's inappropriate for the embedded domain. The problem is that garbage collection entails some system code periodically scanning lists of memory allocations to identify stuff that's now garbage that can be recycled. Embedded Devs worry about the scheduling of that code, and how long it could take to run worst case, and whether it will spoil their real time guarantees. There are various mitigation strategies, but for good or evil many individuals and organisations apply a simple "no, we're not going to use GC ever" policy.


@danbolt @billforsternz

Thank you guys for the response, super appreciate it.

I guess, I can understand from an abstract perspective that you can manually tune performance and optimize to a higher degree if you can control memory allocation yourself.

And for a lot of purposes where performance is imperative, like games or embedded devices it can make or break the ability of software to function properly.

But my question then is, if languages like Crystal, Nim, or D (or any other GC lang with similar speed) can operate either at/near the performance of C, why exactly do you need manual memory management?

And if you do need it, I assume many languages that cater to this audience provide some sort of symbolic annotation that allow you to manually control GC where you feel you need it, aye?


I think you are correct in your basic assertion that no one wants manual memory management for its own sake. What they really want is sufficient performance for their use case. The benchmarks you usually see are throughput oriented, and on small heaps. If you have tight latency budgets and/or huge heaps, the performance is not close.

Optional manual memory management sounds great, but I'm skeptical it would work well in practice. The reason is that if the language default is GC, libraries won't be designed for manual memory management, meaning it will be hard for your manual code to interact with data structures created by non-manual parts.


"Near C" performance is often not good enough, and usually misleading. You can write poorly performing applications in C, and certain benchmarks may favor or disfavor certain elements of a language. Generally they're created to be "similarly written" in all benchmarked languages, which may seem like the fairest comparison at face value. But what that means is that they are often naively written in one or more of the languages. Expertly written, hand-tailored-to-the-problem-domain C code is almost always going to outperform other languages by a significant margin, especially languages without manual memory management. You can do things in C like use arena allocators to significantly reduce memory performance overhead - things which require low-level control and a non-naive understanding of the problem domain. Garbage collectors can be quite performant, but they aren't capable of this kind of insight. Code that is written in C similarly to a garbage collected language will be similarly naive (another malloc call for each and every allocated thing, versus allocating out of an arena, for instance).


As I said, mitigation strategies exist, including manual control of GC etc. It's not true that using GC is universally impossible in embedded / real-time situations. It is true that it can cause performance and non-determinism issues (which are potentially solvable), and it's also true that some developers avoid GC so they don't have to deal with those potential issues. They would prefer to deal with the issues associated with manual memory management.

Who's to say who's right and who's wrong? Ultimately life (and the subset of life that is software development) is a massively complex strategy and tactics game with a myriad of possible playing strategies and no agreed perfect solution.


> if languages like ... can operate either at/near the performance of C

That depends entirely on how you define and measure performance. If total throughput is your metric, then it's no problem - for example, Go is perfectly acceptable for web services.

Predictability of latency, however, is absolutely _not_ on par with C code. For example, 3D rendering with a GC can easily result in perceptible stuttering if care isn't taken to minimize allocations and manually trigger the GC at appropriate times.

> some sort of symbolic annotation that allow you to manually control GC

It's not that simple. D tried to sell this at one point, but it just doesn't work for large multithreaded programs and things aren't single threaded these days. Manually controlling a global GC means manually balancing, for example, one block of threads that perform lots of allocations and deallocations (and will starve if the GC doesn't run regularly) with soft real time networking code and hard real time VR rendering code. And (for example) you certainly don't want your rendering loop pausing to scan the _entire heap_ (likely multiple gigabytes) on each frame! Alternatively, in the case of Go (and depending on your particular workload) you might not appreciate the concurrent GC constantly trashing the caches.

Custom allocators and non-atomic reference counting are fantastic though.


Religion against GC and cargo cult.

Several companies have been selling Java, Oberon and now Go runtimes targeted to bare metal deployment on embedded scenarios.

Some of them are more than 20 years old, so apparently they might have one or two customers keeping them alive.

The hate against GC feels like the hate against high level languages on 8 and 16 bit platforms back in the day, because anyone doing "serious" stuff naturally could only consider Assembly as a viable option.


Being able to use a GC in some embedded cases (where the constraints on memory use or latency aren't too hard) doesn't mean that you're able to use a GC in every embedded case. I work on telecoms just above the FPGA/DSP level, where even a 1 ms pause would be a big issue.


Agreed, however there is a big difference between stating that it doesn't work at all, and accepting that there are plenty of use cases where having a soft real time GC tailored for embedded development is perfectly fine, and actually does improve productivity.

Since you mention telecommunications, I would consider network switches running Erlang a use case of embedded development.

Other examples would be the Gemalto M2M routers for messaging processing, or some of the NSN base station reporting platform.

So while it doesn't fit your scenario, it does fit other ones; this is what some in the anti-GC camp need to realise.

This isn't an all or nothing equation.


Because garbage collection, and in particular tracing garbage collection, adds significant overhead both in CPU cycles and memory. This overhead is also very unpredictable and depends heavily on memory allocation and object lifecycle patterns. Simple GCs can pause the program for a very long time, proportional to the size of the used memory, and this may be several tens of seconds for large heaps, so quite unacceptable. There are ways to mitigate these long pauses with incremental or concurrent GC, but they increase complexity of the runtime system and have even more average overhead, and although in the average case they may perform acceptably, they tend to have very complex failure modes. In addition to that, a tracing GC typically needs some additional memory "room" to operate, so programs using GC tend to use much more memory than really needed.

There is also a common misbelief that compacting GC helps make heap allocations faster than malloc. While technically true - the allocation itself is simple and fast, because it is only a pointer bump, a problem occurs immediately afterwards - this new heap memory hasn't been touched since the last GC, and it is very likely not cached. Therefore you get a cache miss immediately after the allocation (managed runtimes initialize memory on allocation for safety). Because of that, even allocating plenty of short-lived objects, which is the best case for GC, is not actually faster than a pair of malloc+free.

There are also other overheads:

* Managed runtimes typically use heap for most allocations and make stack allocation harder or not possible in all cases - e.g. it is much harder to write Java code with no heap allocations than C.

* To facilitate GC, objects need additional word or two words of memory - e.g. for mark flags or reference counts. This makes cache locality worse and increases memory consumption.

* During heap scanning, a lot of memory bandwidth is utilized. Even if GC does that concurrently and doesn't pause the app, this process has significant impact on performance.

* Tracing GC prevents rarely used parts of the heap from being swapped out.


At least for my use scenario in embedded systems, performance is not necessarily worse with GC and nondeterminism is not a showstopper either. The problem is avoidable by proactively minimizing allocations in the hot paths or arranging 'critical sections' that disable GC temporarily. The deal-breaker is the memory footprint.


Nim memory management is tied to the type you use.

You either use:

- an object. Which is on the stack and is either trivial or uses destructors and is suitable for embedded due to deterministic memory management

- a pointer object. Which is a raw pointer like C *. Managed directly via raw malloc/free or Nim malloc/free. Suitable for embedded

- a reference type. Which is managed by one of Nim GC or is an error if you use gc:none


That's cool but arc is a form of GC.


Nim could always work without a GC, and it’s getting better at it all the time. Most of the standard library does depend on GC availability, but this is being worked on.


Yeah having 2 ecosystems where some stuff depends on GC and some stuff doesn't is always going to be a turnoff to me. Glad that Odin and Zig made the decision to not.


Gemalto, PTC, Aicas, microEJ, Astrobe seem pretty fine selling Java and Oberon based tooling and runtimes for embedded devices deployments across industrial automation, robotics and military.

F-Secure's Foundry USB security key runs bare-metal Go, TamaGo.


This is incorrect. Nim has 5 GC methods to choose from and the upcoming automatic reference counting.

It runs even on the smallest microcontrollers.


> It’s a trend of “Better C than C” languages that don’t go as far as Rust or C++ in terms of either complexity (or safety) but are more opinionated and modern than C.

I don't see C going away anytime soon, but it's exciting to see people attempting to replace it over the long term. Fun to see where it goes!


Zig has a very small "core" idea: explicit allocators, basic C semantics, clear UB, errors, and comptime. All other features are implemented in the stdlib, in Zig itself.

Odin feels like Go, with a "special" runtime. For example, dynamic arrays are implemented as a compiler component in C++.


Odin's '"special" runtime' as you call it is extremely minimal and is still easy to swap out with your own, and also only gets used if you use those features.

Zig has a runtime of its own now, since it has added coroutines (async/await) as a core feature, but that is still extremely minimal as well.


I wonder the same. Why is everyone trying to replace C, but hardly anyone besides Rust is trying to replace C++? Does everyone think C is an easier target?


Being "opinionated" is a step back IMHO, one of the big selling points of C is that it doesn't get in the programmer's way.


C doesn't get in your way in the same sense that PDP-7 Assembly doesn't get in your way.


Nim has an advantage in being so mature. Unfortunately with it maturing, my feeling of it sharing Python’s simplicity slowly faded away.


I don't think anything has beaten the designs of Clay or V, but they aren't really usable. Then again, Zig intentionally breaks on Windows text files, so I couldn't say it is widely usable either.


Even notepad can do \n-only newlines these days. Zig can't do UTF-16 files either, but no one complains about that.


Cyclone was definitely there. Sadly no longer maintained, and not at all practical to get working on modern systems (I did some work on it, then gave up).


> Then again zig intentionally breaks on windows text files

Too true. I ran into this after about 10 minutes of trying the language. Copy pasting from a GitHub issue? Better hope those invisible newlines are what you expect.


I view this as a good idea!


You think alienating 90% of your potential users is a good idea?


Whenever I see a language like this (niche and new yet more than a toy, minimal ecosystem/tooling support, and evolutionary - not revolutionary - improvements over a much more widely-used language), I have to wonder, who's actually using these languages, and in what contexts? Where does the build complexity/hiring difficulty/lack of tooling tradeoff make sense for the advantages that one of these languages offers in return?

I'm genuinely asking; any anecdotes out there?


One of the big users of Odin at the moment is JangaFX's EmberGen, which does real-time volumetric fluid simulations for games and film. https://jangafx.com/software/embergen/

Odin has aided them with a huge amount of productivity and quality of life which other languages such as C or C++ cannot offer, such as a strong and comprehensive type system, parametric polymorphism which is a pleasure to use, the implicit context system, extensive support for custom allocators, the `using` statement, the `defer` statement, and much much more.
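
To give a flavour of a couple of those (a toy sketch, not EmberGen code):

    package main

    import "core:fmt"

    // Parametric polymorphism: one procedure for any element type T
    sum :: proc(values: []$T) -> T {
        total: T;
        for v in values {
            total += v;
        }
        return total;
    }

    main :: proc() {
        buf := make([]int, 4);
        defer delete(buf); // `defer` runs when the enclosing scope exits

        for i in 0..<len(buf) {
            buf[i] = i + 1;
        }
        fmt.println(sum(buf));              // 10
        fmt.println(sum([]f32{1.5, 2.5}));  // works for f32 too
    }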


Given the extent to which real-time fluid simulation consists of GPU computation, is there anything that makes Odin particularly suited to that domain apart from the basic requirement of control over memory allocation and alignment?

I'm more on the scientific side of CFD (so CUDA more so than shaders), but always interested in improving upon the kind of orthodox C++ I usually write.


Odin doesn't have CUDA-like GPU interoperability, so the benefits are going to be purely client-side. If you were working in say OpenCL + C the benefits are more obvious, with CUDA you're not going to get an analogous experience.

I would say in my experience that Odin is definitely preferable as a client language for the simple reason that it's a nicer systems programming environment than other languages, but it's not (at least right now, future developments could always change this) equipped to run native GPU code.

Things like SPIR-V and "shader ASM" open up some interesting possibilities once the custom backend is implemented, but the complexity of integrating GPU execution means making it a first-class feature is probably unlikely.


As most of the heavy lifting in EmberGen is done on the GPU via shading languages, the main benefit of using Odin isn't necessarily anything to do with memory control so much as the usefulness of its remaining feature set. The simplicity of the language and its ergonomic set of features does a good job at reducing friction, and it is a joy to program in. Simply not being bogged down in all the complexities of C++ in all their forms, nor being held back by the arcane annoyances of C, is quite refreshing.


is 'orthodox c++' a thing? i feel like there is a joke somewhere there...


It's a reference to [1] and a mindset arguably most common in the gamedev world that emphasizes simplicity, pragmatism and performance (including compile time performance) over many aspects of "modern C++". I've found it useful and productive, YMMV.

[1] https://gist.github.com/bkaradzic/2e39896bc7d8c34e042b


That's an interesting (and I think worthwhile) read! Just how much it upset the denizens of r/cpp[1] is a sight to behold:

> This starts ridiculously, then it gets worse

...

> looks like a solid way to produce abysmal quality code

...

> just learn the fucking language like everybody else

...

> The trolls from r/javascript are brigading us again It seems

1: https://www.reddit.com/r/cpp/comments/41b3o3/orthodox_c/


> which does real-time volumetric fluid simulations for games and film

With all of the advantages you listed, what do you think is the possibility of Odin becoming more widely accepted in other fields that could benefit from using it?


I think you are operating from the bias of big programming departments where hiring for a specific language is a thing. Some companies don't assume that you need to know their specific stack before hiring. And some programmers program outside of a work setting and hirability isn't part of their decision making.


I do lots of programming outside of a work setting and I still appreciate good library and tooling support. And even if the programmers at a company are expected to be able to jump around, combining multiple languages makes building and interop more complicated.


I can't speak for Odin but Zig (also a niche language) is compatible with all C libraries so the ecosystem side is very good.


That makes sense. I guess I assumed the interop story would necessarily be arduous and crufty, but if both languages are native I can see how you might get more of it "for free".


Odin requires writing bindings (for the moment, this will likely be automated with third-party tooling), Zig actually has full first-class C importation.

In both cases the interoperability is excellent imo.

I can't speak to Zig as much, but Odin's type system makes working with C APIs much more of a pleasure than doing so in C itself.


throwaway because I don't want to DOX myself

I'm working with language development where we're deploying a custom language built with another custom language.

>who's actually using these languages, and in what contexts?

Compare it to using C in 1970 or Ruby in 1995. It's people who have a real problem they're trying to solve for which existing tooling and widely used languages are a poor fit - as well as people who genuinely enjoy developing their own PLs that have their own ways of doing things.

>Where does the build complexity/hiring difficulty/lack of tooling tradeoff make sense

With VC money, anything is possible...

>build complexity

Fairly nil. Contemporary build systems are language agnostic, and the steps to make one that functions well enough are well known and can be thrown together in a week or so. Not to mention it's become increasingly common for "widely used" PLs to fall back on things like Python scripts to handle builds.

>hiring difficulty

It's actually a fantastic way to weed people out and find good candidates. In my (limited) experience, people willing and able to learn a new language (or have a few under their belt) are generally the exact kinds of people you want to hire anyway. Code switching between languages on a daily basis is a fantastic skill, and leads to good developers and mindshare, helps create a diverse technical background.

>lack of tooling

This is a legit problem, and one that a lot of folks stumble on. But luckily building tooling is a great way to dogfood your language - and it's never been easier to integrate with various tool frameworks via standard protocols.

But that said - every company just getting off the ground lacks tooling. For language devs it's a language server, debugger, disassembler, and what have you. For your bootstrapped startup it's build systems/procedures and code review, ERPs and CRMs, etc. Building your own tooling is a part of the growth of a business, tech or otherwise, it's just that when you work on a custom language you have to build some different tools than a SaaS might.

I'll throw one more in though that's the biggest hurdle - wheel reinvention. Stuff you take for granted in any language or have great packages for are the biggest blockers to being productive. But you do the ROI math and decide if it makes sense, so hopefully your lang has good FFI.

Also worth mentioning - if you're working in an esoteric language with heavy contribution by your own team, you're probably doing it because you have a lot of work that looks nothing like putting together a CRUD app or developing a backend for your latest API for your SaaS idea.


Odin's `foreign` system (FFI) is very easy to use and allows for a much simpler build system in the process too.
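
Binding to a C function looks roughly like this (a sketch for a *nix target; the library name, calling-convention attribute, and return type may need adjusting per platform):

    package main

    import "core:fmt"

    foreign import libc "system:c"

    @(default_calling_convention="c")
    foreign libc {
        // i32 matches C's int on the common targets
        atoi :: proc(s: cstring) -> i32 ---
    }

    main :: proc() {
        fmt.println(atoi("41") + 1); // 42
    }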

I highly recommend you read the overview (https://odin-lang.org/docs/overview/) and many of the Odin wrappers and bindings for more information on this topic https://github.com/odin-lang/odin-libs


I wasn't commenting on Odin, but replying to OP's question about language development.

It's good to see you took FFI seriously and made it low friction, it's absolutely critical for younger languages.


The most critical point you brought up is in finding good developers. Reminds me of this quote by Eleanor Roosevelt: "Great minds discuss ideas; average minds discuss events; small minds discuss people."

In this case, small minds discuss libraries/frameworks, average minds discuss language details, great minds still discuss ideas.

All of the best, most interesting, and most insightful programmers I have talked to express ideas in language and tool-agnostic terms, with written language, and only use code as an implementation or demonstration detail. This isn't to say that tooling and languages are not important of course, but the best programmers can pick up new tools and languages quickly and express their ideas in any given medium. If a programmer is limited by thinking in terms of the language(s) they know, they're basically a bricklayer. Still respectable, but they're doing tedium rather than manipulating the idea space.


Even among "great minds", narrow domain-specific knowledge can matter. The nuances of how to squeeze performance out of the JVM, the exact assembly generated by some C code, etc. There's a difference between being able to use a language and being able to take full advantage of it.

There's also something to be said for language idioms as a means of communication and shared documentation within a team. Someone who speaks French and Italian may have no trouble picking up Spanish, but might say things that lose meaning when translated literally until she's gotten more immersed in the culture surrounding the language. One of your "great minds" may pick up C# and try to use it like Clojure, or vice-versa, and create enormous friction within the codebase.

Not that this is necessarily the dominant factor in these decisions, but it is a factor, no matter how intelligent and experienced your hires are.


> Even among "great minds", narrow domain-specific knowledge can matter.

Yes, but it never matters at first. The first thing you do, always, is hack a prototype together. Computers are fast. Basic algorithms and data structures are usually good enough. You don't optimize until you've got what you want in principle. And sometimes you don't optimize at all. The internet is full of prototypes that escaped into the wild and then attracted enough users that they became irreplaceable.

Javascript is a good example. The version we have is several versions too early, and it's warty. JS would be a much better language if someone had paid more attention to the low-level details. But I would argue that JS is a huge success in spite of this. An awful lot of work has been done in JS, including work that couldn't have been done in most other languages.

The grandparent of this post is right that ideas are the important thing. JavaScript succeeds because it borrows good ideas from Scheme and Self.


Yeah, the "Great minds" are still going to have a bigger lead when it comes to understanding language details, which are trivial semantic implementation concerns compared to understanding concepts. Programming itself is the boring part; it's merely transcription of concepts into machine-legible code. The concepts themselves, many of which are not even limited to the domain of computer science, are where all the interesting content is to be found.

When I say boring, I mean relatively; there is some "puzzle solving" merit to programming per se. But this is mere mechanics and minor compared to the experience of manipulating ideas into practicable forms before writing them out in code.


> Compare it to using C in 1970 or Ruby in 1995

That's not the category I'm talking about, though. C (and I believe Ruby) was a major shift from what came before. Those two are better compared to Clojure or Go: dramatically different value-propositions that can make it worth the time and effort to take a chance on a new, green technology. What I'm talking about is languages that are "like X but with improvements to Y and Z".


I'm not sure about this. Algol 60 was a strong influence on most of the programming languages from around this time.

Algol 60 had a number of obvious deficiencies, which needed to be addressed. Both Algol 68 and CPL were designed to fix these issues, but had a level of complexity that made them impractical to implement.

Pascal, which predates C slightly, grew out of Wirth's much simpler (compared to Algol 68) set of changes to Algol 60.

The CPL people were having trouble with implementation, so Richards created a stripped-down version (BCPL). Thompson apparently didn't have access to the full specification, but samples of it greatly informed the design of B, which eventually begat C. Apparently the "killer feature" of C over B at the beginning was byte addressing, which is arguably no more or less niche than what any of these new languages claim.


C was a major shift in ignoring security, that is all.

Almost memory-safe systems programming languages (not fully safe due to use-after-free) had already been a thing since 1961.

In fact ESPOL was probably the very first systems programming language with unsafe code blocks.


These days programming languages are going the way of JavaScript frameworks.


Nim, Zig, Jai, Odin, Beef

I wonder if a large company will throw its weight behind this battle royale of small and productive systems languages. It seems like Nim, Zig/Odin, and presumably Jai all fit into their corners well. Someone with funds could build tooling and combine the static structure of Zig with the productivity of coding in Nim. I think these languages have all shown the potential for a great new language in this space.


I don't think Nim is in the same category. It has too many features and defaults to garbage collection. Nim is more in the category of D and Go.

For me a C-replacement has to be a very simple and disciplined language, with 100% focus on low-level programming.

Nim is great though, I've used it to make some small tools/programs

Zig is also wonderful, hope to use it more once it's more stable


Odin has a 100% focus on (low-level) systems programming, allowing the user a very high degree of control over memory layout and memory allocations, which many other languages do not express as easily.
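
A tiny example of that layout control (a sketch; the sizes shown assume a typical 64-bit target):

    package main

    import "core:fmt"

    // Laid out exactly as written, with no padding between fields
    Header :: struct #packed {
        magic:   u32,
        version: u8,
        flags:   u8,
    }

    main :: proc() {
        fmt.println(size_of(Header)); // 6 with #packed, rather than a padded 8
    }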

If you have tried Zig and/or C or C++ programming, you might really enjoy programming in Odin. Odin feels wonderful to use and has solved all of the issues that C had, and more.

I recommend reading the Overview (https://odin-lang.org/docs/overview/) and the Demo (https://github.com/odin-lang/Odin/blob/master/examples/demo/...)

P.S. Odin is now fully recognized by GitHub and supports syntax highlighting.


The list should probably be: Zig, Jai, Odin, Beef, V in the category 'C replacement'.

Nim should be with D in the category 'C++ replacement with a GC by default'.


Arguably, Go was once in this spot, except they had the weight of the Googleplex behind it to swiftly boost it out of niche status.


Go (and probably Nim from the GP list) is much too high-level to be compared with Zig, Odin, or C.

Go was designed to fill the need that C was originally developed for (writing Unix applications), not for what C is more commonly used for today.


V should probably be in that list too: https://vlang.io/


In case you don't understand the downvotes: previously on Hacker News, people (including, but not at all limited to, Ginger Bill and Andrew Kelley) got upset with V's creator for writing a lot of pro-V propaganda in advance of releasing code implementing the claimed features and indeed, in advance of releasing any code. Those threads are a fun, if tense, read, just search HN for "V programming language".


From a brief excursion into these smaller languages recently, it seems to me that since those arguments things have settled down, and the claims made about the language are either now true (e.g. open source, fast compile times) or marked as unfinished.

IMO it does deserve a place in that list today and is definitely worth a gander.


Thank you so much, I missed all of that. :)


I wonder why this type of language is so attractive to bike shed. Just the right balance of complexity and usefulness perhaps?


There is huge potential for a modern light weight C replacement. Golang went in this direction, but it has a runtime with a GC. Rust is low level, but is quite a complex language. So, there is a gap in the market.

One cool aspect of all of these new programming languages entering the arena is that most of them only have a minimal (or no) runtime. That should make it much easier to interoperate between all of these languages, something which has traditionally been difficult with most programming languages.

Disclosure: I'm the author of Muon [1], a low-level language that embodies similar design principles.

[1] https://github.com/nickmqb/muon


Maybe because it's the layer of the stack most ancient, unsafe, and in need of replacing.


It's not! The vast majority of languages are garbage-collected high-level languages. How many non-GC'd C replacement languages have been created? Seems like there are only a handful. Hardly bikeshedding.


I think it's a mistake to think that in 10,000 years C will remain the golden standard for a language that still is reasonably close to the hardware.


It's no longer close to the hardware


Not necessarily contrary to my point.


To hide any less of the machine, you start giving up portability. C-- might be a counterexample.


There is one more programming language that "loves C's efficiency for running code." Not tried it myself, but anyone interested can have a look at: https://github.com/crystal-lang/crystal


Not to be confused with Oden (http://oden-lang.github.io/), a functional language which compiled to Go. May it rest in Valhalla!


Such a great landing page, clear examples and documentation. Good job!

By the way, there seems to be a css bug in the FAQ/quote section (The text shows dark grey on a dark grey background)


Thank you.

I have fixed the css bug now.


With that name I expected an esolang built around the idea of everything being somehow globally accessible; why else name it after the all-seeing god?


Odin was originally the code name for this project, as names are not something to worry about when developing. However, people liked the name of the language so it has stuck.

Most of my internal projects are usually named after a mythological figure before I give it a finalized name. It just makes me stop worrying about it and get on with solving the problems at hand.


Positively surprised by such an early language supporting multiple platforms already, having a clear Getting Started guide and FAQ, a language spec, and examples. It's even aware of the Visual Studio Command Prompt on Windows despite supporting *nix — what is this!


I commonly develop on Windows first, and thus support Windows as the main platform. My general philosophy is if it works on Windows, porting to *nix is usually pretty simple.

One thing that did take some time was getting System V ABI support for the Odin compiler so that *nix platforms could support C-ABI 100%.

As for VCVARSALL.BAT, there is work currently trying to remove the requirement for it once the compiler is built on Windows. However, Microsoft being what it is, it's quite annoying to get it working on everyone's machine correctly.


We need fewer languages, more IDEs.


Looks promising and I haven't found an obvious pitfall (designs that I don't like) except the coupling with LLVM.

I wonder if there is any feature that makes it unable to be translated to efficient C and therefore preventing it from bootstrapping with a C backend like Nim and V. I'll look forward to the self-hosting compiler.


Any comment on how Odin would work as a target for a compiler? I have a toy compiler project that I’m exploring different languages to compile to. So far I’ve tested compiling to C, Go, Zig, and Python.


My problem with most of these new languages (Zig, Odin, V, Nim, etc.) is that they lack an ecosystem.

Lots of niceties don't exist yet.


And that's one of the reasons they are posted about here: to help find developers who will work on the ecosystem.



