OpenD, a D language fork that is open to your contributions (dpldocs.info)
235 points by mepian on Jan 14, 2024 | 311 comments



At least Walter, and presumably others in the D leadership, are active here. There's a good chance they will see your comments. This is just a reminder that they are human, too, they care a lot about D, and in my experience they are basically decent people who are trying their best.


More than 20 years have passed and nothing has really changed. What makes you think it will be different now? It's "Tango" part 2 all over again. Except now, the ship has long sailed.


They didn't say it would be different now; they're just reminding people that Walter is here on HN, so try to be respectful and realize that he's a human being too. Unlike most here, he's attached his real identity to this project, so drive-by anonymous pot shots would be inappropriate.


While true, and I agree that everyone is human, Walter isn't the only one here who cares about D. If the project would thrive under different leadership then that's something that needs to be considered. I for one really hope for a D comeback, it's by far my favorite language. It's not an attack against Walter, or anyone else. Just people trying to save something that they love.


Unfortunately this was bound to happen, after so many remarks about forking D throughout the years.

Almost everything I liked in D when Andrei Alexandrescu's book came out in 2010 has made its way into C#, Java, and C++.

Yeah, maybe the implementation isn't as nice as in D, but that hardly matters when the implementation is available in some form, with much better tooling and library ecosystem.

Too many years were lost chasing the golden feature that would bring people in, without stabilizing those features.

Even Andrei is nowadays apparently busier with C++ and CUDA than with D.

And then there is the whole compile-to-native programming language renaissance of the last decade, adding even more competition.

Which is a pity, as the community itself is full of great folks to talk with.


If that means the language will revive and all the work will not be lost, then it is fortunate, I would say.

I am using Python every day. And there are many things from D which I miss. Not to mention the performance.


I witnessed a similar case while touring D as an outsider. When Rust was new and the concept of lifetimes was brought to the D community, it was deemed unnecessary by Walter. A few years later, he brought his own lifetime proposal that is sufficiently different from Rust's, and thus even less verified compared to the previous suggestion from the community. Now that I've lost my interest in D, I'm not sure about the maturity of the new lifetime feature, but I would be surprised if it is as useful as Rust's.


I'm not too familiar with either Rust's or D's approach to lifetimes, but from some quick forum searching, it appears that adding Rust-like lifetimes to D would have required significant changes to the design of the language. Every potential feature has tradeoffs in terms of how it interacts with other language features, adds cognitive overhead, decreases compilation speed, complicates the design of the standard library, breaks backward compatibility, etc. I don't think it's reasonable to expect a large-scale overhaul just to support one feature you like from another language.


D's ownership/borrowing system does not require any language changes. It is opt-in at the function level to preserve compatibility with existing code. It does not break backward compatibility. It works as a prototype now, you can try it out.

It does decrease compilation speed, because O/B requires data flow analysis. However, this slowdown only happens for the functions marked as O/B functions.


Given that Sean Baxter already has lifetimes in Circle that work very similarly to Rust's, I'm skeptical that the very similar D language wouldn't be able to express the same thing.


Generously, you've confused somebody saying they're interested in working on a problem with them having a working solution to the problem.

Circle does not, in fact, have working Rust style lifetimes. Sean splits the Circle documentation into features which work and "Research" features Sean is working on and lots of interesting ideas, including lifetimes, are in the second category. Maybe Circle will implement them some day, or maybe it will not.


My understanding is that he has been implementing them? I remember seeing screenshots like this go around back when I was on twitter:

https://twitter.com/seanbax/status/1744403554155041047

https://twitter.com/seanbax/status/1685671828767879168


They've been the primary focus of Circle for the better part of a year now. He's demo'd them working many times already.


So now you can have a project with someone else's opinion instead of Walter's.


Not someone else, but many others. I value projects governed by multiple people, because one person cannot be an expert in every field, even if that person is smarter than most people.


There are two sides to that. Once you start running a project by committee you will find all sorts of odd features winding their way through the system. Politicking has as much influence as technical merit when it comes to new proposals.


My research group left D for Rust several years ago due to nonresponsiveness and poor language development trajectory.

While I wish Adam and the others success with OpenD, I hope they can take the opportunity to pick a more unique/memorable name.


I nominate "dope", short for "D, OPEned".


"Dopen" then?


DopeN


upgrayeDD with two Ds for a double dose of pimpin'


D'oh!


They could go with TheD, to fill the niche left by Coq now that Coq's getting renamed.


> ... now Coq's getting renamed

Oh, didn't know that they are working on this.

https://github.com/coq/ceps/blob/coq-roadmap/text/069-coq-ro... and, of course, https://news.ycombinator.com/item?id=38779480

069-coq-roadmap.md? Oh, you!

So now we've got Rocq and Roc https://www.roc-lang.org/


The TenaciousD?


GoblinD


kobolD was right there, dammit.


If they want to focus it more on systems programming, maybe SystemsD...


Given one of their stated goals is "Embracing the GC and improving upon it, disregarding betterC and nogc in the process", it seems like their goal is explicitly to move it away from being a system programming language.

And no, I didn't miss the joke :)


LiberateD

FreeD


I read TFA, how about screeD?

Too bad it's not focused on a windowing system, they could go with Dfenestration :)


Community Edition D, or Com-Ed-D.


FOKD - FOrmerly Known as D.


New governance models start with discussions about governance by a quorum of interested parties, not by dictating a new set of features or the decision to leave some out. That's just a change of regime, not a change of model.

I'm sure that there are legitimate gripes, and D's reliance on Walter as gatekeeper may be too strict for some, but the way this fork has started out doesn't bode well for the long term. Forks succeed only if they are carried broadly: you'd need to pull a majority of the folks in the D community along, rather than just a handful of prolific but ultimately few individuals, because there is a fair chance both that your fork will fail and that you further fragment the mindshare of the original to the point that it too loses any viability it still had.

You also need to be able and willing to support it for decades.


Looks like there’s an update https://dpldocs.info/this-week-in-d/Blog.Posted_2024_01_08.h... and they’ve gotten some interest.

Personally, I think it’d have been more fun if they named it ‘Died’ — you get the benefits of sweet taglines “It never Died,” a statement about it being a fork, and the permission to take the language in new directions if that makes sense in the future. OpenD implies full compatibility, etc..

(I have no skin in this game, though. Good luck!)


There's a big thread about this on the D forums: https://forum.dlang.org/thread/beykokfitddfdsjyqjjy@forum.dl...


To clarify: the big thread referenced here was started by the people doing the forking (2 people so far), and there's a fair amount of good, detailed response by D people which is lacking in TFA above. The original article is just beefing about personalities, and this link has more technical discussion.

sounds from this like the people forking basically want a different language, for which they're willing to make a number of breaking changes.


No, the D forum thread has several GitHub links which are pretty much self-explanatory about what has been going on.


Personally, it's confusing to me what D's niche would be if they go all in on the GC. I get that it didn't really have a solid niche before either, because it was basically just a 10% better C++ without any really distinguishing killer feature. That made it a hard sell: most people are either going to stick with C++ for the existing support, code bases, and industry, use Rust for greenfield projects, or use Zig if they want simplicity, especially since with C++20 a 10% better C++ already exists. But I don't think focusing on having a garbage collector really helps much either. All it means is that D will be competing with C# and Java, which are already the garbage-collected, safe, streamlined C++ successors and have been intended to be such since their inception, and I don't see how you can really compete with either of those languages at this point. Especially C#, with its actually pretty excellent low-level and unsafe capabilities (raw pointers, control over whether things are reference or value types, control over memory layout, and so on), combined with its pretty okay (for an OOP-first language of its age) type system, reflection, and all of the ML-family features it's been getting.

Honestly, I've always felt really confused about what D's vision is. I've tried reading some of the docs about certain language features like betterC, @safe mode, and lifetimes, and they've been grammatically just kind of hard to read and generally poorly explained, despite going into great detail, and I've never really been able to get a sense of a coherent design vision or purpose for the language. The author mentions that this has been a problem, but I don't feel their vision is really any clearer. Yes, they have a few concrete midterm technical goals, which is great, but it isn't clear to me what those goals are in service of. Honestly, I'm a Rust programmer, so you know what I use for greenfield personal projects, but I'd rather just use OCaml, C++, or even C# if I had to pick something else.


> Personally, it's confusing to me what D's niche would be if they go all in on the GC. I get that it didn't really have a solid niche before either, because it was basically just a 10% better C++

Yeah, for a long time the language was marketed as a better C++. That was wrong for two reasons. First, it's a C++ replacement, but it's not C++, as many C++ developers learned in frustration. Second, it's a lot more than a C++ replacement. It works for scripts, as a general programming language, and especially for C interop (ImportC, BetterC, etc.)

One of the goals of the fork is to stop apologizing for the GC. You still have reference counting, unique pointers, and whatever in the standard library. I'm sure if someone wants to contribute functions that avoid the GC, they'll be accepted. What you're not likely to see with the fork is Adam writing new functions for the standard library that go to great lengths to avoid the GC. He believes fear of the GC is overblown.


Right, but even if this was always the most consistent/accurate direction for D, so it isn't actually a change in direction technically speaking, that still leaves the main point of what I wrote unanswered — how can its new direction find it a niche when it has to compete in the "GC'd but very fast C++ successor language" space with C# and Java?

Also, this idea that fear of the GC is overblown — I buy that for servers and applications and the like for the most part (although I think our obsession with using heavier and heavier runtimes and frameworks to ease development at the cost of efficiency is unsustainable in the long term and has contributed significantly to the bloat of modern systems), but for real-time work, or embedded systems, or low level programming that needs to implement the sort of things a GC or runtime relies on, which are the only things I think most people claim GCs aren't good for, I really don't buy the argument that GCs don't matter. Yes you can do such things with languages that have them, especially if you have cutting edge GCs like Java's ZGC or turn your runtime into bare metal hardware drivers like MirageOS did with OCaml, but oftentimes that's just harder to reason about and control, more work, and has costs (like ZGC needing 2x the memory and roping all pointer accesses into doing GC work to amortize processing costs), so it's like that article about using C# to do a <2KB game — why not just use things better suited for that. This is a common sentiment I've seen in the D community though, so let me ask directly: is this argument a response to people who are claiming not to want a GC for low level / systems programming, in which case I don't think it's very reasonable, or is this a response to people who balk at GC for server and application stuff, in which case I'm not sure anyone disagrees? Or is there a third option I'm missing?


In the last decade, we've had many learning experiences around governance of programming languages.

Most of the changing languages that come to mind, I realize, have also had a big upset or oops related to governance.

I now think one of the key things to look for in a programming language is governance -- what they say about how they do it, what they actually do, how that's working out so far, and how you think that will work out in the future.


While I agree this is an important problem, I don't think there is a satisfactory answer. Programming language evolution is more or less complexity management, while you also have to balance requirements from various stakeholders. Almost all governance drama comes from one group of stakeholders complaining about requirements essential for another group of stakeholders, and you can't always satisfy both. (Rust `async`, for example, is known to be such a case, explaining a functioning but generally unsatisfactory design.) Go is a rare exception where the core team had no reason to honor all stakeholders and had enough resources and willpower to do so, and yet some drama did happen.


We might be able say that a given assessment of governance will work out well for our needs. But it's a lot easier to identify showstoppers early on.

For example, if a language designer and implementor says upfront that they want to run it as Benevolent Dictator For Life, and also that they don't want to see feature PRs from people, because they prefer to work through all the details themself... That's their right, and great information to have upfront. That governance might tentatively work for some adopters, and others will decide it's a showstopper without needing to explore further.


You are right about early adopters, but once those have been filtered, there will still be multiple stakeholders within the resulting community. Even when we don't consider new users, I can't see how they can be adequately catered for in general.


> We might be able say

Correction: "We might not be able to say".


Things have not gotten better. The 1960s-80s model of writing standards was so much better and restricted the politicking to a brief period.


cries in ALGOL-68


I fail to see any negative aspects of having multiple compiler implementations around.

IMHO that's the main reason why C became so popular: compilers are free to explore in different directions, language extensions that have proven their worth will eventually be picked up by other implementations, and sometimes they even make it into the standard without being butchered too much by the committee agreement process.


I'm confused, I thought that there already were 3 around? Or at least 2? gcc plus the one from dlang?


The 3rd, LDC, uses LLVM as its backend. All three share the same frontend. https://dlang.org/download.html

OpenD is not a 4th implementation; it's a fork.


There's also SDC, which is an experimental D compiler not using DMD code. I don't know how complete it is, but it is actively updated.


As a contrary perspective, a language which enthusiastically welcomes patches from anyone is likely to degenerate into an incoherent mess very quickly.

Language design is difficult. Features interact in complicated ways. Fixing mistakes is tricky - you break existing code.

Forking D sounds fine. It's easier to start with an implementation than to go from scratch. Ideas that work out well get to have an existence proof when being proposed to the original. Ideas that crash and fail don't add to the debt of the original language.

I hope the fork goes well and this proves a net gain for the original ecosystem.


D is a real anomaly to me because it should have had the same trajectory as Rust, it vastly improved upon other systems languages at conception, the authors evangelized it, including at FAANG, and yet, Rust seems to have gained traction everywhere D failed to do so. Even in places where C++ has historically been shunned, Rust has some traction (Linux Kernel). I now believe that language adoption is just a product of the right news cycles and timing, and perhaps hype over the creators or a certain feature. I am sad we got Rust and not D. D is so much easier to grok as a C++ person, and I think Rust is incredibly verbose looking.


I don't find it particularly surprising. D uses a garbage collector while C, C++ and Rust do not. D's GC can be disabled, but that isn't very useful when most D code, including the standard library until just a few years ago, was not written with that in mind.

D is much more closely a competitor of C# than of C++. D has a few nice features like advanced compile-time programming, but the actual nuts and bolts that staff engineering management looks at aren't really solid. D's GC is a design straight out of the '80s. DMD has good compiler throughput but the code quality isn't very good. LDC is much better but compile times are much longer.

Adopting languages at FAANG beyond a single team just yolo deploying them to production requires integrating dozens of engineering systems for everything from post mortem debugging to live profiling to authentication systems. The cost to do this is in the order of tens of millions of dollars.

D just isn't suitable as a C or C++ replacement in the places that actually require it and the cost to enable it in large companies isn't worth it for the incremental improvement it does offer in some areas.


Rust has memory safety without GC, and a from-scratch language design. D is an evolutionary development of C++ (which is also gaining new features of its own) with little to recommend it besides. A comparison with Carbon and cppfront is also instructive, note that both of those have not added GC to the language.


Culture matters. "Culture eats strategy for breakfast". Rust has a safety culture. Yes it has a bunch of safety technology but the technology doesn't decide how things are used. It would be legal using the Rust compiler to implement the IndexMut trait on slices such that it just YOLOs like a C++ index. Rust doesn't do that, not because somehow the technology forbids it - it does not - but because culturally it's anathema to them.


When I heard about D it was often in combination with issues that seemed rather basic. Like multiple mutually exclusive runtime libraries that made D libraries incompatible with each other from the start, or hard version breaks that were trying to solve fundamental issues but also caused projects to lag behind for years. Have you seen how long the Python 2 to 3 migration took? The news cycles didn't do anything to fix that mess either.


What year did you come to that conclusion?

D could have been something but most people avoided it because of the commercial nature, i.e. not being Free software.


Timing is important. D was like year 2000, Rust 2015 or something? A lot had changed in the meantime.


Community is everything, and D leadership smothered theirs like a bag of kittens in the river.


I know that calling it "FreeD" is probably a bad idea but I would have thought it was real clever. I've toyed with the language before and it was always enjoyable, just lacked the library/compiler support that I need for the sorts of things I'd use D for so I've stuck with C, looking forward to trying again in a couple years if this sticks!


Who exactly is behind this, and how much support is he getting from the D community?

D has a very small community, so this seems to be a big bet.


The person that wrote the post is Adam Ruppe. He's a very prolific D programmer, best known for these libraries https://github.com/adamdruppe/arsd and for publishing a book on the language.

It's too early to judge how much support there will be. I don't expect current users to split into camps though. My prediction is that the relationship will end up being similar to Ubuntu vs Debian. An example is string interpolation. Walter wants to stick to his own proposal, which nobody else likes, while Adam's already implemented his proposal in OpenD.


The name is unfortunate, there is nothing "closed" about D and OpenD ends up from the start with no rules as to what gets merged.


What D would IMO benefit most from is a cohesive standard library that comes with batteries included and makes it easy to ship real-world apps and services: basically what Go did, having many standard protocol implementations within the stdlib.



Funny stuff.

> Again, remember, I have other work and responsibilities, so I can't do a great deal of work myself, even if I wanted to.


What's Walter's take on the many gripes with the D leadership?


He pretty much derailed the forking thread on the D forum into a remotely related tech discussion. This was weird to observe tbh; in the end my take is that he doesn't seem to be bothered much and would rather continue living in his version of the story.


It's worth pointing out that Walter is one of two co-maintainers of the language. The other has not said anything at all. On the other hand, I'm not sure what there is to say if they're not going to make any changes in the process.

I'm far more concerned about community development of libraries, IDE support, and the beginner experience (especially on Windows) than I am about changes to the language, which is already pretty good. As a Linux user working on top of C libraries, the experience is incredible. That's not the case for everyone.


Aren't these connected? It's about the contribution culture. If there is so much pain that it led to a language fork, what IDE and quality-of-life improvements from a shrinking community can we talk about? When I started with D several years ago, there were a few meetups and active contributors. Those people have now left D for Rust, and now we have this fork. Does it look like there is success ahead? I would say it looks more like a last desperate effort to change things.


I view them as two separate issues. On the one side, there are the complaints about language changes. It's hard to convince Walter to change the language, and it's hard to contribute to the compiler and/or standard library (those are the complaints, true or not).

Then there are problems for users of the language not having enough libraries and a sub-par IDE experience (again, true or not). Go and Rust built communities of programmers that aggressively used the language and made their work available to others. I don't see any reason we couldn't have more of that with the language as it currently stands. Adam certainly had no trouble knocking out hundreds of thousands of lines of nice libraries. Edit: And Ilya doing all his good work with Mir.

The first is a long-term problem. We'll see the effects ten years from now. The second makes it hard to have a reason to use the language right now (largely for anyone that doesn't want to write scripts or interact with C libraries). That's most of what concerns me.


I just posted on the forum about it, I didn't do it before because of surgery (I'm still nowhere near 100%).

What changes would you like to see happen?


I recognize that pattern of behavior because I've done something similar at times myself. Or at least, in my pattern it was this: what's done is done, confronting it will just create a lot more drama for me, and I don't feel confident in wading into a fraught social situation anyway. I'm much more in my element working on technical problems. So I let the social thing go and go work on the technical thing. This is not always a healthy thing to do, but to me it's not in the least weird.

That being said, never assume that no response from someone means someone doesn't care. I felt awful when dealing with situations like this. If this were my project, I would probably be extremely frustrated and more than a little bummed, whether or not I thought the fork was understandable.

Those are not things that are constructive to air in public, though, and I think it is a mark of leadership that Walter's dialogue remains as calm and polite through all of that thread as he usually is.


> That being said, never assume that no response from someone means someone doesn't care. I felt awful when dealing with situations like this. If this were my project, I would probably be extremely frustrated and more than a little bummed, whether or not I thought the fork was understandable.

Contrariwise, I read through that forum thread and some of the linked github issues, and it wouldn't be shocking if he were happy to see some of those people walk away. It would not be politic for him to say so.


I’m not sure I’ve ever seen him comment here on a D related post. Which I totally understand.


I've definitely seen him discuss D here when it comes up.


Only when D itself isn't the matter being discussed.



Are there any examples of forks going in a different direction than the original and being (more) successful?


X.Org. (if you know what X is, you are familiar with X.Org, which is a fork of XFree86)

Chromium is a fork of WebKit, which is a fork of KHTML. If you use a web browser, that browser is either Firefox or a fork (of a fork) of KHTML. I don't know what percentage of Chromium/WebKit users have heard of KHTML, but I'd reckon it's on the order of 1%.

mplayer used to be a very popular, very good media player, mostly for linux but it was cross platform. Development sort of died. These days its fork mpv is much more common.

There used to be many different forks of GCC. In 1997, all of these developers merged their forks together into the EGCS project. This project proved to be very active and very...good. It was so good that the FSF halted development on mainline GCC, forked EGCS into the new mainline GCC, and restructured the community built around the ideas of the EGCS community. If you use GCC, this is the version you use; forked from GCC into EGCS, forked from EGCS back into GCC.

MacOS' kernel is a fork of FreeBSD. Much of its userspace is a fork from ... somewhere. If you use MacOS it's forks all the way down.

Kindle is forked from Android and/or linux and/or... well it's complicated. Amazon forked a lot of stuff.

Much of the Android ecosystem has been forked. OpenSSL was forked into BoringSSL, for instance. Again, complicated, lots of forks.

yt-dlp was forked from youtube-dl when ... the lawyers came.

Ubuntu is a fork of Debian.


I'm not sure the distinction is that clean, but isn't Ubuntu still based on Debian? As in, not a fork that went its own way, but a (big) collection of modifications on top of current Debian?


Sounds like a fork with extra steps.


Derivative, it's less distinct than "fork" would have one believe


> that browser is either Firefox or a fork (of a fork) of KHTML

Firefox is also a fork of Mozilla (later renamed the Mozilla Suite, itself forked into SeaMonkey[0] when Firefox became the main browser and the Mozilla Foundation decided to stop developing the Mozilla Suite).

[0] https://www.seamonkey-project.org/


LibreOffice, MariaDB, Nextcloud, and OpenZFS might be a few prominent examples.


Jenkins was a fork of Hudson. No one talks about Hudson anymore



Inkscape?


Damn it, so now I gotta choose which variant of D I want to comply with?



>> Until we get everything written

Now we have 3 D compilers: DMD (the de facto standard), LDC, and GDC.

I guess the OpenD project will have their own compiler, too. Well... let's see.


I wish this project all the success in the world. D has a lot of great ideas, and if getting patches approved is a bottleneck, this is the right choice.


D is such a sad case of how a good technical product can fail due to poor leadership and decision making. A language is more than just a compiler; it's also an ecosystem and a community. D's community is actually very welcoming, but man, watch some of the DConf videos and Q&As on YouTube and it's cringeworthy seeing key leaders talking down to people, dismissing people's concerns, or just having this ego about themselves like they know what's best and everyone should shut up and go along with them.

Anyhow, D went from a language that I remember back in 2015 being often promoted on places like reddit and here as a fresh alternative to C++ that was constantly evolving, to nowadays, when you rarely hear anything about it at all. I think even Andrei Alexandrescu has given up on it.

Good luck to these guys on trying to bring life back to it, but frankly I think at this point most people have given up and moved on to alternatives like Rust, Nim, Zig, etc...


I think the fundamental problem with D is much simpler than that, it's the technical context. D is a better C++, but it's not better enough to escape the gravitational pull of C++. Rust is sufficiently compelling, with its focus on memory safety.


Rust is a really frustrating language for me: I understand the safety it provides but find the pain of actually using it makes it uncompelling. Plus, the Rewrite it in Rust movement is very off-putting


Rust is a frustrating language for me too.

I never saw the need for it, really. I just stuck with the language constructs in C++ that makes it basically impossible for memory leaks/use after free/use outside of bounds to occur or if they do occur, explode loudly in debug environments. I haven't used new/delete in a personal project in like a decade; longer than Rust has been around.

Now I'm working on a project where the main developer is...a cowboy. Everything is insane. Not only is new/delete everywhere, but so is malloc/free. In C++ code. You have to navigate to where the thing is allocated to figure out whether to use free or delete. Everything leaks, everything crashes, everything races, everything deadlocks.

Oh and the guy is my boss.

So on the one hand, rust might fix these problems, on the other hand, rust is a non-starter.

So now what. I still don't need rust because I don't code like a crazy person. And the people who do code like crazy people will never use it anyway because it doesn't let them...express themselves.


I am yet to see someone who can't make memory safety mistakes in C++. You might want to start a tutorial series on how to do that. If this skill can be taught that is, and isn't genetic.


Memory safety issues in C++ have two main causes. One is object lifetime. The other is pointers. If at all possible you want to design your program such that object lifetime is completely obvious and predictable. Use sentinels and tombstones instead of null pointers. Use arenas wherever possible. During debug use an allocator that doesn't reuse memory (just mark the pages as no access so you will segfault when trying to read/write in previously freed memory). When object lifetime is complex you can give objects a "color" during allocation and then make rules that you can verify that objects of one color can never have pointers to objects of another color, or that an object can never contain a pointer to a younger object. You can eliminate entire categories of memory problems this way. Instead of pointers use indices and a getter function. In debug mode the getter can check if the right locks are held, scan the heap for incorrect pointers, check for ownership flags, etc. Actually take advantage of the virtual memory tools provided to you by the operating system. Threadlocal memory. Fork tasks into different processes. Actually use mprotect and the like.

Zero memory safety mistakes is a tall order. But the overwhelming majority of memory errors we see in the wild can be easily prevented by good practices. And for the memory errors that do happen stack protection flags make a big difference (https://developers.redhat.com/articles/2022/06/02/use-compil...).


Nobody is perfect, but some people create many fewer leaks than others. Modern C++ is the set of rules that those who create few leaks follow.


I don't claim to never make memory safety mistakes in C++. I just claim that it's very rare. The overwhelming majority of my bugs are bugs in business logic.

At my day job, I work on a project that's 75% C++ and 25% C#. In the code that we've shipped, when there's a crash in code that I've written, (as opposed to business logic bugs) it's usually a memory safety mistake in C#. There was an interesting architectural choice written a decade before I joined the company where most classes have a synchronous constructor, then an asynchronous initializer, then an asynchronous uninitializer, then a destructor that gets run by the GC. There's no end to bugs relating to crashes because the initializer hasn't been run yet or the uninitializer has already been run.

When I get C++ bugs put across my desk in code that we've shipped to customers, it's usually a bug somewhere else. For instance, the last crash dump that we've gotten from customers in C++ is because we called a Windows UTF-8/16 conversion function. The Windows function itself is all raw pointer nonsense, so we put a pretty wrapper around it so you put a std::string_view in and get a std::wstring out, or you put a std::wstring_view in and get a std::string out. Well, it turns out Windows is a fucking dogshit operating system. If you initialize a thread "wrong", i.e., by calling the C11 function thrd_create, and your locale is set to a CJK language, it will ignore you when you tell it the code page of the multibyte string is UTF-8 and will assume it's the system locale's code page, and will call abort() when it hits a UTF-8 sequence (instead of perhaps returning an error or null pointer). Those are the sorts of C++ crash bugs that I deal with.

My 'secret' to dealing with memory in C++ is to not deal with it. Make the STL do everything. Make RAII do everything. Can this object be POD with constexpr accessors? Do that. Can these methods be const? Do that. Can these objects live in a STL/boost container? Do that. Do they need to live in a unique_ptr/shared_ptr instead? Fine I guess but it would really be better off as a value instead. Does this class need to have a special destructor/copy constructor/move constructor? Find some other way to do it. If you must, try to find some other way to do it anyway. If you must, spend like 10x as much time scrutinizing it, the way a Rust programmer would do with unsafe. If thing must have special destructor/copy/move constructors, factor out all of the things that need special consideration into a class with the barest minimum. If it must have a special destructor, explicitly delete the copy/move constructors/operators if you can. Avoid indexing into arrays; use range based for loops, or <algorithm> stuffs like transform, reduce, or transform_reduce. The solution isn't to use vector::at() (which does bounds checks) instead of vector::operator[] (which doesn't) the solution is to not use indexes at all.


i don't make memory safety errors in C or C++.

i'd recommend embracing hungarian naming (if you call it apps-hungarian you've not embraced it fully enough, read older documentation) and be as rigid as rust about using it. i-, c-, p-, and Max are your best friends, don't declare one of those without declaring all the others that you will need.

using hungarian, adopt naming conventions about Open/Close, Init/Finish, Alloc/Free, Start/End, Alpha/Omega, or whatever, and apply them rigidly to any class/struct (with indentation that's obvious). When you put in an Open, you go and put the Close in at that moment, just as you would close any paren you opened. don't return from all over the place in a function, goto endblock and free what needs freeing, you need to work on autopilot not thinking through the twists and turns to play monte carlo with getting it right. If you are going to return an allocation, your name needs to be Open, Init, etc. Use whatever words you want, but it's got to be something that makes you as the caller think "this is an open paren, i need to close it"

using ifdef debug type mechanisms, hook malloc and free and put guard word asserts at the beginning and end of every allocation, and magic number ids and reference counts in all structs, and hook main() and exit() so you check those for leaks. all. the. time.

work in a systematic way that avoids problems and make that your priority, everything else like functionality is slaved to that.


Interesting suggestions.

How do you "hook" a function, like you said about malloc, free, main and exit?

I guess it means intercept calls to it and do something additional to what it already does, something like Python decorators, but how do you do it in C or C++? I used C a lot (but not C++), but much earlier, and don't remember any method of hooking. atexit(), or something like it?


Given that they mention ifdef, I guess they just mean something like `#define malloc(x) (tracing_malloc((x))) ` in every place except the one where the tracing version is defined.


hungarian standardizes procedure/function names anyway (typed and capitalized), so while you're changing the name, it gives you "a place to stand" to do the other things you want to do.

for normal/average operating environments, I prefer to just indirect through an extra procedure call, gives you a place to put a breakpoint, and then if you need streamlined performant code you can #define it all away. But, if you plan to #define it all away, make sure that is a regular part of your work flow beforehand. you can have a procedure version, a heavyweight define and a lightweight define (and use your defines inside your procedure to test them there). On a regular basis you need to make sure your infrastructure is all doing what you think it is.

Same is true for main(), use it for setting up infrastructure, have it call Main() which is "your main".

if you are on a large enough project you can budget time for tools, the actual "__main.asm" code that calls C main() is also accessible and you can hook away in there too. Have to do it for all compilers, and track compilers, but on the scale of a large project, that's not the end of the world.


Why do that with the preprocessor rather than the `--wrap` flag? That's what most malloc libraries do.


I should take this to the security team although I have a hunch that they'll think I am full of shit. What with the CVE count never going down with asan, ubsan, tsan and all kinds of guidelines, analyzers, tests and what not.


Hungarian notation has never been proven to reduce errors compared to strong types, and the CppCoreGuidelines tell you not to use it. clang-tidy can largely automate Hungarian notation, at least.


the goal of hungarian is to ease/automate the cognitive burden on the programmer while in the act of reading or writing code, it's the semantic notion, not the syntactic


> I still don't need rust because I don't code like a crazy person.

This is not a foolproof argument because safety problems can easily result from unforeseen interactions among parts of the code that all seem to be OK and not "crazy" locally. A significant benefit of something like the borrow checker is that it can suss out these problematic interactions in a comprehensive way, even at the cost of forbidding some code patterns that would be perceived as safe in most cases. (Idiomatic use of Rust then requires you to either rewrite such code or stick it in an unsafe block and figure out what the preconditions are for it to be used safely.)
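
To make that concrete, here's a minimal sketch (my own illustration, not anything from the codebase being discussed): each function looks fine in isolation, but one particular composition of them would be a dangling reference in C++ terms, and the borrow checker rejects exactly that composition.

  // Each function is reasonable on its own.
  fn first(v: &Vec<i32>) -> &i32 {
      &v[0]
  }

  fn grow(v: &mut Vec<i32>) {
      v.push(0); // may reallocate and invalidate outstanding references
  }

  fn main() {
      let mut v = vec![1, 2, 3];
      let r = first(&v);
      println!("{r}"); // using the reference before the mutation is fine
      grow(&mut v);
      // println!("{r}"); // uncommenting this is a compile error:
      //                  // cannot borrow `v` as mutable because it is
      //                  // also borrowed as immutable
  }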


I'm new to Rust, but wouldn't it be possible for the "C++ boss" to start writing Rust code like below? Here I think we have a multi-threading race condition that can lead to a crash (?)

  use std::sync::Arc;
  use std::thread;
  use std::time::Duration;
  
  fn main() {
      let shared_data = Arc::new(42); // Create an Arc
      let weak_ref = Arc::downgrade(&shared_data); // Create a Weak reference
  
      let thread_handle = thread::spawn(move || {
          // Simulate some work
          println!("Thread Data: {}", shared_data);
          thread::sleep(Duration::from_millis(10));
          // The Arc is dropped at the end of this thread
      });
  
      // Give the other thread a little time to start up (this is part of the race condition)
      thread::sleep(Duration::from_millis(10));
  
      // Try to upgrade the Weak reference and unwrap directly in the print statement
      println!("Weak Data: {}", weak_ref.upgrade().unwrap());
  
      // Wait for the other thread to finish
      thread_handle.join().unwrap();
  }


Sure, that does seem to be a race condition that can crash. I'm not sure how valuable this example is though, because it's not very intricate at all: a Weak reference can be invalidated (duh!)

The fix is trivial: just use an Arc instead of a Weak. In addition, `upgrade().unwrap()` should be a sizable red flag (like any unwrap, really) since fallibility is kind of the entire thing of Weak.


Thanks!

But Weak can be needed in case you have self-referential structures, right?

I was wondering about this in the context of the C++ programmer in the post above who likes to use both new/delete and malloc/free in his code.

Sure, Rust will give him far fewer ways to screw up. But if he can't be asked to at least not use malloc/free in C++, he probably won't be too careful about unwrap() either?

My point here is mainly that a good programming language won't make good programmers out of bad ones.


> But Weak can be needed in case you have self-referential structures, right?

Well, technically, no, you can just use Arc and leak memory all over the place :^) but that's not very useful either. Most self-referential structures need some kind of "owning" thing too though. For example, a graph could use Weak for the edges, but in order to keep vertices alive you need e.g. a Vec<Arc<Node>>. Then, if upgrading fails, you know that your edge leads to a deleted vertex (and as such should be considered nothing, hence Option).
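
A rough sketch of that shape, with made-up names just for illustration: the graph owns its vertices through strong Arcs, edges hold Weak references, and a failed upgrade means the target vertex has been removed.

  use std::sync::{Arc, Mutex, Weak};

  struct Node {
      id: u32,
      edges: Mutex<Vec<Weak<Node>>>, // non-owning links to neighbours
  }

  struct Graph {
      vertices: Vec<Arc<Node>>, // the only strong (owning) references
  }

  impl Graph {
      fn live_neighbours(&self, node: &Node) -> Vec<Arc<Node>> {
          node.edges
              .lock()
              .unwrap()
              .iter()
              .filter_map(|w| w.upgrade()) // None => edge to a deleted vertex
              .collect()
      }
  }

  fn main() {
      let a = Arc::new(Node { id: 1, edges: Mutex::new(Vec::new()) });
      let b = Arc::new(Node { id: 2, edges: Mutex::new(Vec::new()) });
      a.edges.lock().unwrap().push(Arc::downgrade(&b));

      let mut g = Graph { vertices: vec![a.clone(), b] };
      println!("live edges from {}: {}", a.id, g.live_neighbours(&a).len()); // 1

      g.vertices.truncate(1); // drop the graph's ownership of the second vertex
      println!("live edges from {}: {}", a.id, g.live_neighbours(&a).len()); // 0
  }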

> Sure, Rust will give him far fewer ways to screw up. But if he can't be asked to at least not use malloc/free in C++, he probably won't be too careful about unwrap() either?

There's one big advantage of messing up with unwrap vs malloc/free: unwrap 'only' crashes your program (and can even be caught in some scenarios), whereas use-after-free can do literally anything. Unwrap is also much easier to debug because you usually get a clear stack-trace, and the program semantics have been well-defined at all points.

> My point here is mainly that a good programming language won't make good programmers out of bad ones.

Very true! But I think the point of Rust advocates tends to be more along the lines of "a bad programming language makes a bad programmer out of a decent one". It is very easy to misuse C++, so mistakes are made with it way more often. 'Dangerous' constructs do exist in Rust, but are generally far less pervasive, making it easier for the programmer to understand what they are doing.

Saying Rust has no benefits over C++ because you can mess up in both is like saying that anti-lock brakes are useless because you can still understeer by pressing the accelerator too hard (sorry for the car analogy, I had to). Preventing entire classes of mistakes is valuable, even if other classes are still possible. (Whether it's worth the cost is of course another question. My personal opinion is yes, although that's especially for memory safety and a bit less for race safety.)


> Saying Rust has no benefits over C++

Sorry, that wasn't my intent. It was more to point out that bad programmers are surprisingly inventive when it comes to writing bad code. New languages are an improvement, but not a panacea.


Rust does not protect against race conditions, since these are mere logic errors that don't impact memory safety. This program can panic when unwrap() fails, but that's also a safe operation.


Most race conditions are prevented by Rust thanks to its "alias XOR mutable" paradigm that prevents having multiple mutable references and enforces the presence of locking or atomics. Not all, but the words "fearless concurrency" aren't meaningless either.
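
A small illustration of what that buys you (my own toy example, not from the thread): shared mutation across threads has to go through something like a Mutex or an atomic, because handing several threads an unsynchronized mutable reference simply won't compile.

  use std::sync::{Arc, Mutex};
  use std::thread;

  fn main() {
      let counter = Arc::new(Mutex::new(0u32));

      let handles: Vec<_> = (0..4)
          .map(|_| {
              let counter = Arc::clone(&counter);
              thread::spawn(move || {
                  *counter.lock().unwrap() += 1; // mutation only via the lock
              })
          })
          .collect();

      for h in handles {
          h.join().unwrap();
      }

      // Without the Mutex (e.g. sharing a plain `&mut u32` across the closures,
      // or doing an unsynchronized `+= 1` through an Arc) this would not compile.
      println!("count = {}", *counter.lock().unwrap()); // 4
  }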


What kind of solution would you propose to folks like your boss?

One positive aspect I see about "rewrite it in Rust" is that you can to some degree expect a random Rust project to not leak and crash and expose vulnerabilities quite as much as a random C++ project. It's silly, but "written in Rust" acts somewhat as a badge of safety and performance, whereas "written in Java" and "written in C++" each only carry one of the two.

Of course, developer skill is also huge. You can write slow Rust code or fast Java code, yadda yadda.


That example is really difficult, because between the lines it sounds like it has been working sufficiently well despite the cowboy style, substituting methodical error-avoidance with that special kind of brilliance of super deep code knowledge and remembering all the pitfalls. Very difficult situation.

Regarding "rewrite in Rust", I believe that it's a smaller part of the appeal of Rust than it appears based on the amount of code actually written: the main excitement is from people who never really ventured from heap+gc languages into manual memory management but who love the idea of being able to write gc-less native code. And many of them would rather write it slowly, one borrow-check at a time, than with all the memory bugs they created dipping their feet in naive C/++. Those people rarely write much Rust, but their excitement infects some in malloc/free land and for them a rewrite is super attractive because it skips the entire explorative part of software development that is really not the strong point of Rust.

(yes, this is pure projection, not only was I describing what draws me to Rust, I also failed to pick up on the codebase of my previous boss, too much "how can we make this entire thing less slow and error prone" and too little "cram in requirement X, even if it might turn out to become requirement Z because it pushed the codebase over the edge to terminal unmaintainability")


> the entire explorative part of software development that is really not the strong point of Rust

This is what keeps me orbiting but never quite landing on Rust. Things I'm pretty damn sure are safe come back red-stamped, and once I've painted things the way Rust deigns, I find I've lost my appetite. When desire finally returns, I find the fixes I've made to enable compilation also disallow the changes I wanted in the first place.

For all its faults and runtime bloat, I find myself going for Go on new things I would've attempted in Rust before. Though, I've never tried Zig; I wonder how that is...


You can do exploratory programming in Rust, it's just heavy on boilerplate code such as .clone() and Rc<RefCell<>> which would be avoided when writing more idiomatically. Rust even includes support for dynamically typed references and downcasting via the Any trait. It's not as dynamic as Python/Ruby/JS etc. but it's not that far off either.
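
A tiny, made-up illustration of that exploratory style: lean on Rc<RefCell<...>> and clone() to get something working first, and tidy up the ownership story later.

  use std::cell::RefCell;
  use std::rc::Rc;

  #[derive(Debug, Clone)]
  struct Doc {
      lines: Vec<String>,
  }

  fn main() {
      let doc = Rc::new(RefCell::new(Doc { lines: Vec::new() }));

      // Hand out as many cheap handles as you like while prototyping.
      let editor = Rc::clone(&doc);
      let viewer = Rc::clone(&doc);

      editor.borrow_mut().lines.push("hello".to_string());

      // clone() a snapshot instead of fighting the borrow checker for now.
      let snapshot: Doc = viewer.borrow().clone();
      println!("{} line(s): {:?}", snapshot.lines.len(), snapshot.lines);
  }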


"Java is slow" is a really old myth. It was true once upon a time. Of course there are caveats, but looking at your average developer ... you will be fine.

On the other hand, I would forbid an average developer the use of C/C++ and similar. It is just not worth it.


How does software written in Java have fewer assurances in terms of leaks/crashes etc. compared to Rust? All Rust gives you is a borrow checker that isn't very smart, but it means you don't need a GC/VM, so you gain a bit of performance compared to Java.

But if it's not software that needs to go fast, then Rust is not adding anything for you. And if you considered Java to begin with you probably don't need top tier performance.


The parent didn’t suggest Java is less safe; their point is “Java is safe; C++ is fast; Rust is both” and they just phrased it unclearly


The most notable advantage of Rust compared to GC languages is low and stable memory use and predictable performance, that will not generally drop due to some random GC pause. Both of these can be of interest even wrt. software that doesn't "need to go fast" in any absolute sense.


This is a persistent myth that's only true if your performance model requires consistent latency and, even then, only if you're careful to pay attention to how much is freed or allocated as you change scopes. The lifetime-tracking features of Rust's type system have non-negligible impacts on the sorts of abstractions you can make.


Sure, there is one abstraction that effectively requires GC, namely a general graph of spaghetti references where you can't possibly predict in advance or "pay attention to" how much is freed or allocated as you change scopes. Many GOFAI problems look like that, which is why GC was initially developed in the context of LISP, the most prominent language for GOFAI. But for most real-world programs you can do vastly better than that.


>> What kind of solution would you propose to folks like your boss?

How do you want to play this, boss? I can see you're focussing more on the bigger picture and the code is just a means to an end. Given that, how about you delegate technical leadership on the codebase to me to let you focus on the product & commercial aspects?

If no - maybe words come back to the effect of “it’s my pet/child, how very dare you” yadda yadda - then I can’t see a happy path beyond “thanks for the opportunity, all the best for the future” but there’s certainly room for many mediocre outcomes short of parting ways if that’s preferred.

If yes - “hey, we don’t have infinite money and we prob can’t afford to hire the help to do this to a gold standard so how about we agree these commercial milestones (once change delivery falls below X days on average, or once feature Y that you always wanted lands in prod, or once the defect rate drops below Z per week etc etc) - I get bonus $$$”


> What kind of solution would you propose to folks like your boss?

Rewrite it in C#. It's safer than Rust, because VM. Both standard and third-party libraries are often way better. With modern versions of the language, GC allocations are avoidable if that's what's needed for performance reasons. C interop is equally simple.


C# is not more safe than Rust and fails to prevent null pointer exceptions and modified-collection exceptions.


> C# is not more safe than Rust

By design, Rust requires unsafe code to implement any non-trivial data structures (except trivial POD types). This applies to both Rust standard library, and third-party crates.

The issue is not a theory, security bugs actually happened in reality. Here’s an example about the Rust standard library: https://shnatsel.medium.com/how-rusts-standard-library-was-v...

By contrast, thanks to the VM and the GC, C# allows to implement very complicated data structures without any unsafe code or unmanaged interop. The standard library is also implemented in idiomatic memory-safe subset of the language. For example, here’s the hash map: https://source.dot.net/#System.Private.CoreLib/src/libraries...

> falls to prevent null pointer exceptions and modified collection exceptions

Yes indeed, but these exceptions are very unlikely to cause security bugs in the software.


> Rust requires unsafe code to implement any non-trivial data structures

That seems like a gross overstatement.

https://github.com/rust-lang/rust/blob/master/library/std/sr...

CTRL-F: unsafe

Only one result, an optional utility function: "pub unsafe fn get_many_unchecked_mut"


That's a wrapper around the actual implementation (which lives in an external package). Notice "use hashbrown::hash_map as base;" at the top.

There's far more unsafe there: https://github.com/rust-lang/hashbrown/blob/f2e62124cd947b5e...


The entire JIT, garbage collector and most of the C#'s VM are all implemented in C++. This has caused various issues in the past which are exploitable from managed code. The amount of unsafe code used to implement C# vastly outweighs the amount in Rust's standard library.


If you are going that way, Rust's reference compiler depends on LLVM, fully written in C++, and the C++ semantics of bitcode have broken Rust's code generation multiple times, forcing regressions and newer compiler releases with deactivated optimization features.

Also plenty of crates are bindings to C and C++ libraries with nice unsafe blocks.

Then there was that Actix drama.


Hmm? Dotnet on Linux uses LLVM for codegen so that seems to be a wash. Lots of nuget packages are wrappers around native libraries as well.


Yeah, doesn't make Rust's dependency on C++ go away for its safety.

The point is the "look at what I say, not what I do" when talking about safe languages and their dependencies on C and C++ libraries and compiler toolchains.


Which doesn't really have anything to do with GP's incorrect assertion that C# is somehow safer than Rust.


It has to do with your incorrect assertion that using C++ in the runtime is a disadvantage for C# relative to Rust, which equally depends on C++ in both of its compiler toolchains, rustc and gcc-rs.

When Rust gets fully bootstrapped in self hosted toolchain you'll have a point.


I think you've missed my point entirely but that's fine.


> The amount of unsafe code used to implement C# vastly outweighs the amount in Rust's standard library.

According to bing.com chat, https://github.com/dotnet/runtime has 3.5M LOC, and https://github.com/rust-lang/rust has 6M LOC. The right panel of https://github.com/dotnet/runtime says 80% of the .NET runtime is written in C#.

This makes me wonder, do you happen to have a link for your “vastly outweighs” statement?


The "link" is just the repos rather than asking AI to hallucinate an answer. Rust's repo contains 2.2M LOC. The dotnet runtime contains 1.5M lines of C++.

Now if we remove in tree tests from the totals, we arrive at 1.5M lines of C++ (most tests are written in C# as you would expect) and 1.7M lines of Rust.

However, this does not exclude safe Rust code. I don't have a tool off hand that can provide a precise count of lines of unsafe code but we can get some general estimates. There are 1958 instances of "unsafe fn" out of 103,205 instances of "fn ". Further there are 11,545 instances of "unsafe " in the Rust repo while there are 10,768 instances of "unsafe " in the runtime repo.

Given that unsafe functions comprise less than 2% of all functions in the Rust repo, I think my claims are reasonable.


Not necessarily VM, depending on how it gets deployed.


Could you expand on how to avoid using GC in C#?


Don't create new objects; instead use the stack and/or the unmanaged heap.

This reduces the expressive power of the language; for example, LINQ from the standard library is probably out because it is based on delegates, which require memory allocations.

Still, the language is very usable even without GC allocations. For example, that library re-implements a subset of ffmpeg for Raspberry Pi4 running 32-bit Linux, with no memory allocations in runtime: https://github.com/Const-me/Vrmac/tree/master/VrmacVideo


Value types (structs), stack allocated arrays, spans, native memory allocation, arenas.

You can also enable the warnings/errors that complain about missing using declarations for class and structs with the Dispose pattern.


Hell is other people's code. Of course our code is perfect, and we would never write a buffer overflow or an issue that would be solved by memory safety. But I consistently find that other people seem to, and then I'm forced to contribute to the same code base.


GP isn't wrong though. There genuinely are total hell codebases. Not everyone is as dedicated to their craft as Mozart was. There are levels. And some teams and individuals do indeed produce better outcomes, because they not only know how but also show how.


It might not help you today with your immediate problem but think about it this way:

You become the boss or responsible for some product. If you decide at the beginning to start off using Rust, or to rewrite some ancient, unmaintained library in Rust, then you prevent people like your cowboy developer from messing stuff up with this category of faults.

Either you would not hire him in the first place because he never came to grips with Rust, so you only end up with developers on your team who understand and actively chose the tradeoffs that Rust offers.


If the cowboy developer is the boss they can still just wrap everything in unsafe blocks. But I think I'd take that over the C++ new/delete/malloc/free salad.


At least `unsafe` blocks tend to raise other people's eyebrows rather quickly. The word itself is telling you to do a double take, after all.

Some rather "questionable" C/C++ code can be, and is, missed during code review sessions. One could put the blame on the reviewers, but let's be honest, code reviewing is a mind-numbing task for many of us.
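
For illustration only (the function here is made up), this is what that looks like in practice: the risky part is a syntactically fenced-off block, conventionally paired with a SAFETY comment, so it is hard to miss in review:

    // Illustrative only: the caller-visible API stays safe, and the one unsafe
    // operation is confined to a block that stands out during review.
    fn first_byte(bytes: &[u8]) -> Option<u8> {
        if bytes.is_empty() {
            return None;
        }
        // SAFETY: we just checked that the slice is non-empty.
        Some(unsafe { *bytes.get_unchecked(0) })
    }

    fn main() {
        assert_eq!(first_byte(b"hi"), Some(b'h'));
        assert_eq!(first_byte(b""), None);
    }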


Please write a book about this. I'd want to read it!


I'm not a C++ programmer, but aren't there plenty of "Modern C++" books out there already?


I meant about the experience


It's honestly unfortunate that Rust has been sold so hard on memory safety, so that when C++ folks don't spend tons of time on memory issues they think Rust is pointless.

I don't want rust for memory safety. I want it for things like proc macros, a sane module system, a good and accepted error handling system, destructive move, constrained generics, unified static and dynamic polymorphism, language level customization points, and many more things.


Rust was a language that I firmly embraced for my own projects for about 2 years, then I eventually came to the conclusion that the benefits it gave me weren't really that useful for the things I was working on, and I switched away to other languages that I personally enjoy using more.


Which languages do you enjoy more?


Rust pissed me off until I stopped fighting it. When I finally gave up and started doing things its way, it became a lot more fun. And `cargo clippy` is flat-out helpful for learning Rust’s idioms.

While I’m not yet out to rewrite all the things in Rust, I understand the appeal of rebooting archaeological projects with a modern approach.


The thing is, I just don’t want to write programs the way Rust wants me to. Among statically typed languages, Java is the closest to what I want to use but my true love is Common Lisp. With the arrival of Coalton, I can get the best of both worlds: a type system with nice properties for the rare program that would benefit from them and CL for the majority of programs.


Then I guess you don't have to write Rust? It seems like it doesn't solve any new problems for you so don't feel forced or pressured to use it (for any technology, really), and I say this as someone who really likes and uses Rust.


Yeah, what’s frustrating is (a) I wanted to like it; (b) it’s taking up a lot of the job market relative to other alternative languages I’d rather use; and (c) I like the language Graydon describes as “the language I wanted” https://graydon2.dreamwidth.org/307291.html


Seems like you want something like OCaml then.


B is exactly how I've felt about, in order: C, C++, Java, C#, JavaScript. I spent almost 20 years writing in languages I did not like because I couldn't find jobs in anything else.

Rust saved some of us from a lifetime of exactly the problems you have with Rust, maybe let us have this one :)


So, the reason Rust took the direction it did is that a lot of people want a language specifically like this, while there were already plenty of languages roughly in the ballpark Graydon wanted. I don't have a clear use for one of those languages; I might use one in some cases anyway, but I was compelled by Rust, which is more or less exactly what I needed.

I can have a proper type system like an ML, and touch the metal like C? Yes please.


Alright, so when I’m appointed Lord Emperor, we’ll all be writing CL. Until then, I’m digging Rust. It’s like garbage collection, but where you can always tell exactly when a thing is going to be collected.


Unless you happen to be using something like Rust/WinRT, which, to avoid the usual stop-the-world cascade of deletions when the smart pointers in a complex data structure all reach 0, has a background thread collecting those smart pointers.


There’s no plausible scenario in which I’d ever write a single line of Rust/WinRT.


Doesn't change the fact it isn't always obvious.


Yeah, there are enough ways that Rust makes really tedious choices in generics, type system stuff, and its particular flavor of affine memory-as-a-resource logic that I'm considering building my own wee language.


As far as specific complaints go, async rust seems to be a trap: I’ve heard of several companies adopting it and then regretting the decision six months later because of the failure of that abstraction to adequately abstract.


For more anecdata, I have been working professionally with async rust for three years and don’t understand why people complain so much about it. It solves the problem well, it provides predictable performance, and most of the time it’s straightforward to write and reason about.


It's difficult to avoid, right? If you are writing a network server or client that manages a lot of connections, you can either use the popular async executor or you can stray far from the beaten path.


Rust is for people who stick around, can get over hurdles, can take in new concepts and see the overall benefits. It has a steep learning curve that pays off.

For me there are three reasons why Rust is suitable for many usecases:

- performance

- safety

- static binary compilation with targeting different cpu architecture

After spending the majority of my time with Python and Java in the last 10 years, these are things I really learned to appreciate.


> Rust is for people who stick around, can get over hurdles, can take in new concepts and see the overall benefits

Speaking as someone who is learning Rust and really liking it, I just want to note that this comment is sort of emblematic of what is wrong with the Rust community. It comes off as pretty condescending—"Rust is for the people who are smart and don't give up easily, if you're less smart or give up easily you should really go find a language for wimps".


Sorry, I do not want to sound condescending. I was just explaining my own experience. The first time I wanted to learn Rust, I failed because I gave up.


Performance and static binary compilation are available in 40-year-old languages, not forgetting Dlang, which is the central topic of this post.

The only reason left to really use Rust is safety.


And the safety reason is also fulfilled with Ada/SPARK2014 as an alternative to Rust with a longer legacy in high-integrity applications.


Performance isn’t a problem for Java for just about any business application most programmers will work on and, if it is, you can usually figure out how to optimize Java to hit your performance target. Safety is also not a problem in Java. And fat jars remain the single best way to deploy an application I’ve seen in just about any ecosystem (and nix is working out pretty well to bring their benefits to the rest of the programming world).


Even better than fat jars are jlink'ed images, maybe even wrapped in a nice shell using jpackage. Bundling the JDK and stripping it of unneeded modules eliminates so many types of issues. All courtesy of your JDK.


I have the exact opposite experience. In fact for a while I lived on optimizing Java projects to meet with performance numbers even my junior Rust code can beat easily. Most Java devs have no idea how to write performant software.


Hopefully with Valhalla landing, memory footprint can also go down enough.


I've had the same experience - I spent 6 months last year really digging into Rust and came to the conclusion that for the software I'm writing it's trying to save me from problems that I just don't run into enough to make it worth it.

I ended up jumping over to Zig and have been really enjoying it. I ported the same hobby 2D game engine project from C++ to Rust, and then over to Zig. A simple tile map loader and renderer took me about a week to implement in Rust and 3 hours in Zig. The difference was a single memory bug that took 15 minutes to figure out.


The frustrating thing for me has been that trying to "port over" C++-isms or Python-isms fails from one direction or another.

I've found that data structures with _lots of helper methods_ (thinking about things like `Result` in particular) tend to be nice. You do have to learn about Rust-specific things to figure out how to nicely structure your code for everything to work. But the payoff is less pain.

There are still a lot of futzy ownership questions, and even when you write out your supposedly performant and cool system, easy outs like cloning end up hiding in your system leading to some awkward performance questions. Fortunately the systems are merely slow, and it tends to show up in profiling.



> I think the fundamental problem with D is much simpler than that, it's the technical context. D is a better C++, but it's not better enough to escape the gravitational pull of C++.

Not quite. The problem with D is that it was a better C++ than C++98.

In the meantime the C++ world woke up from its freeze and the standardization process started addressing the requests from the C++ community. Consequently, once C++11 was out and work picked up on subsequent standard versions, D ceased to have a selling point.


I keep saying this and having the exact same argument with everyone every single time I mention it (and I'm sure here we'll go again), but D's fundamental problem was its GC. Its mere presence colors the code that uses it, and once your dependencies use the GC, you have no recourse. It's impossible to abstract away. I fail to see how such a language could ever substitute for C++.


According to the followup a week later[0] it seems OpenD is doubling down on GC:

> One of the guiding principles of this fork is to embrace the GC as a successful design rather than to shun and avoid it. [...] I have harshly criticized @nogc in the past as putting a disproportionate burden on library authors while being the wrong answer to what can be a perfectly fair question.

[0] https://dpldocs.info/this-week-in-d/Blog.Posted_2024_01_08.h...


The GC works fine. Why waste energy to duplicate the functionality that requires it? Programmers who don't want the GC are already using C++ or Rust instead. The GC is one of those tools that makes programming in D productive. The lack of libraries and modern tooling is the part that ruins it. That's why you get software written in Go and Java instead of D.

I'd love to use it for embedded work instead of C and C++, even in its current state. Embedded meaning 32-bit MCUs running some kind of RTOS, not ARM SBCs powerful enough to run Linux. If not D, then maybe Zig. For this it needs to be able to link against and build upon existing C/C++ libraries.


Unfortunately, the D garbage collector is a simple design that doesn't offer the performance and scalability of a modern collector. Someone switching over to D from Java/C#/Go might be surprised to see GC pauses reminiscent of 1998. Hopefully the maintainers of the D fork will address this.


> Programmers who don't want the GC are already using C++

Sure, but this was in a response about D substituting C++.


GC makes CTFE (Compile Time Function Execution) simple and very easy. It's quite normal for projects to use GC for CTFE and other management methods for runtime.


I don’t disagree. Add in the GC and you’re competing with C#, which is an unwinnable battle considering the resources of MS. Dump the GC and you’re competing in a much more winnable battle with C++, Rust, and that’s it (when the decision was made anyway).


And Java, with GC implementations able to manage TB-sized heaps in pauseless server implementations, or tiny heaps with real-time GC in embedded bare-metal deployments.

D's GC problem is that it takes a very old approach to designing one.


I think the worst thing was making GC optional. I’ve never seen any language end up with good optional GC.


I think this is the case of a few noisy people putting too much importance in a single feature, and ruining the whole for everyone else.

If that argument were solid, Golang would not be more successful than Dlang, but the reality is, Golang's numbers are just so much bigger.


> If that argument were solid, Golang would not be more successful than Dlang

I never said anything about being "successful". I said something about substituting for C++. Go isn't any closer to that than D.


A garbage collector may be right for Golang but wrong for D.


Easy, see the use of Oberon, Java, .NET in bare metal IoT deployments.

PTC, Aicas, microEJ, Astrobe, Meadow,...

The problem isn't having a GC, it's its poor implementation.


> The problem isn't having a GC, it's its poor implementation.

OTOH Java and .NET exist and have large investment (D is younger than C#), so for people who are fine with GC, there are already plenty of "C++ alternatives". What's left is people who can't or won't use a GC, and for them a GC'd language cannot be an alternative.


There is a substantial… feel… difference in using a well-GCed language that isn't tied to a monolithic runtime. A big part of that is having zero perceptible startup overhead, since it isn't parsing a class path or trying to load a bunch of DLLs or anything beyond what your code explicitly does, plus a much more minimal GC runtime than you might expect. Often only 20-30 KB of machine code.

While it hasn't really taken off yet, I'll point to Nim, specifically with its recently stabilized ARC GC, as a very compelling local (at least) maximum in this space.


I felt the opposite back when I was using D (specifically, D1). It was obvious that everything you can do in C++ could be done in D in a much easier way, such that it could have replaced C++ in the near future if done right. This is even true when we compare D1 with the current C++20, so I believe D did have a good chance that was wasted somehow.


To overcome something as entrenched as C++ it has to be 10x better. Even Rust isn't there. Most of its community isn't ex-C and C++ experts; it's people getting into low-level programming for the first time, either from scripting languages or pure functional ones.


Don't we consider a C/C++ replacement "successful" when it was able to capture enough share of prior C/C++ uses, not the entire share? I never thought C or C++ could be completely gone out of sight, even COBOL is technically alive today (on life support). To be clear, I meant that D could have been in the position of Rust today if done right.


I wouldn't consider a C++ (or C) replacement PLₓ successful until the people working on big compiler toolchains—many of whom are committed alternative programming language advocates themselves (consider the origins of e.g. Swift and LLVM)—decide that, moving forward, those will be written in PLₓ rather than C++.

(Note that the bar I'm establishing here is not merely to have a self-hosted compiler (see many toy compilers). I'm talking about a hypothetical future milestone where the software world's core infrastructure comprising LLVM/Clang and* GCC is written in the "replacement language", i.e. the compilers for a bunch of other (non-PLₓ) languages—including C and C++—are written in PLₓ as well. The backends, at least.)

* "or"?


While I do hope for that future, it is an unreasonably high bar, because the main value of LLVM is that you don't have to build everything again to make your own PL implementation. For the same reason, contemporary web browsers are unlikely to be fully rewritten in any other language unless they get completely displaced by newer browsers. (At least LLVM has a better chance of being replaced...)


A corollary of this is that a language probably needs to be able to interop with C++ to have any chance of being able to replace C++, so that the LLVMs and web browsers of the world can migrate incrementally. GCC was able to migrate from C to C++ because of this, and a migration from C++ to some other PL would need to work similarly.


> contemporary web browsers are unlikely to be fully rewritten in any other language

Right.

> it is an unreasonably high bar

It's not an unreasonable standard. We're talking about replacement. If it isn't replacing C++ for this use case, then it's not a replacement by definition.


It's unreasonable in the sense that LLVM and a few others may survive even after pretty much everything else got replaced. We need to exempt some outliers with a very high sunk cost.


Yes, by those criteria C++ was successful over C. And C was successful over Fortran. There are small communities that prefer the older stuff, but they were better enough to move the entire industry.

D isn't, and neither is Rust.

> I meant that D could have been in the position of Rust today if done right.

My point was Rust is just a completely different audience. I suppose D could have sold itself as a better JavaScript.

(Sorry for several rewrites.)


I think you're correct for D, but for Rust, we'll see. It's a very slow moving industry. I do C++ work all day (embedded) and while there's no current plans to move to Rust (which means it won't happen within the next 3-5 years), I could see it occurring someday.

We're finally (usually) allowed to use C++17. I had to use C++03 at times circa 2019 - we're finally done with that. It's _really_ slow moving.


> I could see it occurring someday

"Someday" is on the order of decades here. By that time Rust will be a hoary legacy language with a list of warts longer than an ISO standard.


Nonsense, WG21 is committed to ensuring C++ has more warts and sharp edges than any other language on the planet.


I don't agree. Rust very much wants to be a C++ alternative in all (I mean all) respects.


Not really. Fields in traits and a classical inheritance system are pretty strongly missed by some people coming from C++.


> To be clear, I meant that D could have been in the position of Rust today if done right.

Exactly. One language died arguing about a feature (Dlang), while two other languages, one without it (Rust) and another with it (Golang), proved that it was not as significant.


There are a lot of high profile Rust adoption stories where the developers involved are C and C++ experts. E.g. Rust in the Linux kernel, Rust in the Windows kernel, Rust in Firefox, Rust in Android.


I think the Rust in Firefox is probably the strongest in that list. Rust in Linux is just advocates trying to get adoption (not regular kernel contributors electing to use it).


That's not really accurate. Ojeda is a long time kernel contributor and so are many of the folks writing drivers. Maintainers of various subsystems are also particularly interested.

Not everyone is of course, but hardly "just (Rust) advocates" like you suggest.


> It was obvious that everything you can do in C++ can be done in D in so much easier way that it could have replaced C++ in a near future if done right.

Do you have an opinion on how the GC schism within D affected it competitively with regards to C/C++?


While I don't know the exact arguments for and against GC in that incident, I believe it's worthwhile to consider two cases when you want to avoid GC. One is the case where you shouldn't run GC because you have a very strict time limit and any GC pause is undesirable. The other is the case where GC pauses are discouraged but not the end of the world. D's `@nogc` only caters to the former, which is much rarer than the latter (i.e. hard realtime vs. soft realtime). So I think `@nogc` should have been more like an `@explicitgc`, where any allocation and thus GC pause has to be marked in the syntax but is not forbidden otherwise. But I do agree that something like `@nogc` was required.


What about the situations where you don't want the memory overhead of the GC?


Again, the answer depends on whether you absolutely don't want that or just want to minimize it. I believe something like `@explicitgc` is enough annoyance to make sure that you are conscious about GC and memory allocation. If you need an even stronger guarantee, you have to make sure that everything you transitively call is i) marked as `@explicitgc` and ii) has no GC call inside. Given enough demand you may have both `@nogc` and `@explicitgc` as well, but I think the hard guarantee is not something people usually want.

Maybe you wanted me to answer with some magic sauce that can conveniently avoid such issues, but I want to make the point that there is no such thing. In fact, Rust is praised for its strictness about memory safety, but that owes much to the existence of `panic`, and you know what? What I've described earlier equally applies to the `panic` behavior in Rust. You cannot avoid panicking in Rust, though you can avoid panicking as much as possible and make `panic` fail immediately to avoid any further overhead, so you are liable for any panicking in your code. In the same way, if you really don't want to touch the GC, you are in the minority and should be served well with a placeholder GC that always fails. Anything beyond that is not something you can expect from mainstream programming in the near future.


It's not strictly impossible to avoid panicking in Rust, but it's pretty painful. The approaches typically involve forcing a linker error if a panic is linked (https://docs.rs/no-panic).

Yes, you'll be (re)writing a lot more code than if panics are acceptable.
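
As a rough sketch of that approach, using the no-panic crate linked above (its #[no_panic] attribute forces a link error if the compiler cannot prove the function never panics; it generally needs an optimized build to link cleanly, and the function name here is made up):

    // Cargo.toml would need: no-panic = "0.1"
    use no_panic::no_panic;

    // The attribute makes the build fail at link time if this function could panic.
    #[no_panic]
    fn safe_div(a: u32, b: u32) -> Option<u32> {
        // checked_div returns None instead of panicking on division by zero.
        a.checked_div(b)
    }

    fn main() {
        println!("{:?}", safe_div(10, 2));
        println!("{:?}", safe_div(10, 0));
    }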


Rust is sort of a skin for C++11 in terms of semantics, though, while new languages have GC to increase productivity.


Meanwhile Rust is noncompelling due to its syntax. It made a bunch of weird design choices (like putting class methods in separate `impl` blocks) that have ruined my chances of trying to use it for any personal projects.


You make it sound like Rust made these choices for the heck of it, whereas there are very good reasons why impl blocks are separate from the struct:

1. Make class methods open to extension. This allows adding methods from other contexts, including other privacy contexts. Sure, you could have dedicated syntax like C#'s extension methods, but those were added after the fact. If you're starting your language from scratch, why have two distinct syntaxes when one will do?

2. It allows adding methods that only exist for some combinations of generic parameters. For example, you can have a generic `Foo<T>` class and then define a method `from_bar` that only exists for `Foo<Bar>` (sketched below, after this list). Again, impl blocks may not be the only solution to this, but it is a highly cohesive one.

3. Lastly this is more philosophical, but it decouples the data definition from the method definition.

Not wanting to try Rust because its syntax is unfamiliar has to be the weakest reason, especially when the syntax has excellent reasons to be different.
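
To make point 2 concrete, a minimal sketch using the Foo/Bar names from the comment above:

    struct Bar;

    struct Foo<T> {
        value: T,
    }

    // Methods available for every Foo<T>.
    impl<T> Foo<T> {
        fn get(&self) -> &T {
            &self.value
        }
    }

    // Methods that only exist when T = Bar.
    impl Foo<Bar> {
        fn from_bar(bar: Bar) -> Self {
            Foo { value: bar }
        }
    }

    fn main() {
        let foo = Foo::<Bar>::from_bar(Bar);
        let _bar: &Bar = foo.get();
    }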


I actually quite like that, because it makes it clear that the impl blocks aren't class methods, they're trait implementations. Maybe they could have chosen a better keyword, but having traits (aka typeclasses) instead of classes helps you mitigate the expression problem[1] and some other nasty things quite a bit.

[1] https://en.m.wikipedia.org/wiki/Expression_problem


Putting class methods in separate impl blocks is actually objectively better; it allows you to define new methods on types that you didn't write, like typeclasses in Haskell.


Notice Rust actually forbids you from doing this to types you don't own.

Rust's own standard library gets a pass. For example the [T] (a slice of T) generic type is a built-in, and the core library defines methods on that type, but in terms of sorting it only provides sort_unstable - an unstable sort, and the associated variants of that sort.

Then Rust's alloc crate, which is optional, has a new impl block for [T] and it defines sort, a stable sort on this same type which is much faster than it might be otherwise because it uses a temporary buffer, hence it needs an allocator and can't live in core.

You are not allowed to do this for other people's types. If I make a Goose in crate A, and then you try to write an impl block for A::Goose so that you can add a fly method to it, that won't work. Rust obviously could allow this, but it would invite chaos, so they don't.

What you can do is invent a trait Flying and impl Flying for A::Goose
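
Spelled out as a tiny, self-contained sketch (with a local module standing in for the external birds crate):

    // Stand-in for the external crate A; we don't own this type.
    mod birds {
        pub struct Goose;
    }

    // Our own trait, which we are free to implement for the foreign type.
    trait Flying {
        fn fly(&self);
    }

    impl Flying for birds::Goose {
        fn fly(&self) {
            println!("honk (airborne)");
        }
    }

    fn main() {
        // Works because the Flying trait is in scope here.
        birds::Goose.fly();
    }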


Yeah, I found it somewhat surprising too. Coming from Kotlin I had expected this to be similar to extension methods, but apparently not. I never fully understood the motivation behind this restriction.


Suppose I use Jim's birds crate, Sarah's noises crate and Hannah's shape crate.

If Hannah is allowed to write

impl birds::Goose { fn fudge(&mut self) { self.counter += 1; } }

and Sarah is allowed to write

impl birds::Goose { fn fudge(&mut self) { self.counter -= 1; } }

... Now what happens when I call fudge on a Goose? Does it increment the counter? Or decrement the counter? If instead the program is rejected because of the ambiguity, whose fault is the ambiguity? Sarah's fault? Hannah's fault? Jim's fault?


I'd get what I import (explicitly). If I import conflicting extensions then it's my fault for doing so, and (only) my program fails to compile.


You cannot import specific inherent methods. What you can do though is use the extension trait concept and import those traits.

If you import both traits you get an ambiguous function error and are asked to resolve which trait you want `<Goose as noises::Trait>::fudge(...)`.
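
A toy, self-contained version of that situation (the module names stand in for the noises and shapes crates):

    struct Goose {
        counter: i32,
    }

    // Stand-ins for Sarah's and Hannah's crates, each defining a same-named trait.
    mod noises {
        pub trait Trait {
            fn fudge(&mut self);
        }
    }

    mod shapes {
        pub trait Trait {
            fn fudge(&mut self);
        }
    }

    impl noises::Trait for Goose {
        fn fudge(&mut self) {
            self.counter -= 1;
        }
    }

    impl shapes::Trait for Goose {
        fn fudge(&mut self) {
            self.counter += 1;
        }
    }

    fn main() {
        let mut g = Goose { counter: 0 };
        // With both traits imported, g.fudge() is rejected as ambiguous;
        // fully qualified syntax picks an implementation explicitly.
        <Goose as noises::Trait>::fudge(&mut g);
        <Goose as shapes::Trait>::fudge(&mut g);
        println!("{}", g.counter);
    }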


Ah, I was not aware of extension traits. Thanks for pointing out.


That’s one of the most trivial imaginable reasons for choosing one language over another.


impl blocks are actually more powerful than class methods because if you have a generic struct or enum MyType<T>, you can have a method for all MyType<T> or for just MyType<Something>


There are a plethora of languages that allow implementing methods outside of a class block, including Java, Kotlin, Haskell, C#. And many of them are mainstream. Your argument seems weak.


That’s a subjective matter of taste.


Yes, but I think you'll find that many people tend to resist change, including drastic syntactic changes (as compared to how languages like D iterate on previous languages like C++).


Rust specifically changed its syntax to resemble C++. For example, the turbofish syntax `::<T>` is actually a rather bad choice on its own, but it was deemed more familiar to C++ users who are used to `<T>` (which was too ambiguous, so it had to be tweaked). Even `return` was spelled `ret` back then!


Rust syntax isn’t really that novel. It’s not exactly like C, but most of its syntax is pulled from some popular language: some from C, some from JS, some from Python, some from ruby, etc.


A lot more from ocaml than js/Python/ruby.


Lots of concepts come from Ocaml, but syntax-wise, Rust is closer to JS than to anything else.


If you're put off by key leaders talking down to people, I think you'll want to spend some time in Nim's forum and issue tracker before considering it as an alternative. (YMMV, of course, but I was put off by what I found.)


Agreed. Nim could be perfect, were it not for its BDFL.


FYI, some members of the Nim community are working on a fork, for apparently similar reasons as OpenD (community-led development): https://github.com/nim-works/nimskull. It's under active development and not ready for general consumption though, from my understanding.


Thanks for calling my attention to that. I'll have to keep an eye on it.


Take it from me: I disagree with the BDFL of Nim about a lot, but him banning some of the people who are behind this fork was one of the best decisions he’s made. Stay far away.


Some context: dom96 here was the BDFL-in-2nd-command of Nim for quite some time. There is personal beef between dom96 and some of the primary nimskull contributors, which mostly involved a lot of name-calling, personal insults, and critiques of leadership and leadership effectiveness, and eventually led to dom96 banning one of the aforementioned contributors, stalling quite a lot of interesting compiler work and partially sparking nimskull. It's all quite boring stuff.

dom96 eventually had a falling out with the Nim BDFL too (which I'm surprised did not happen earlier: the BDFL is... brusque, charitably) and so has been inactive in either Nim community because, well, obviously. But in all my years involved with Nim, I have found the nimskull developers to be pretty consistently great to work with. I think they handle things professionally, are not rude, and treat others' work with care (you will maybe notice they have a code of conduct expressing an explicit intent to do so). Which makes what happened between them and dom96 all the stranger in my eyes, but I was less active back then, and there is certainly context I am missing.

tl;dr meh


It's no big reveal that I was involved and optimistic about Nim for a long time, though "BDFL-in-2nd-command" is a stretch for someone who couldn't even get the community to instate a Code of Conduct. That's one of the many many reasons I left.

My many years of involvement with Nim is what I believe qualifies me to speak on the individuals in the forked project. That being said, I am no longer involved with Nim. I don't want to get reinvolved. I just want to warn people about individuals who were abusive to me (and others) for many years.

I banned very few people during my time in the Nim community, and never without good reason. I don't know who you could be referring to here. I'm sure you left this comment with the best of intentions, but you say it yourself, you were less active back then and have had minimal interactions with me on the subject.


Ah, I was referring to disruptek. I don't know why you two hated each other's guts, mostly due to being less active back then, indeed. For what it's worth, I didn't think the banning was undeserved: but the interactions you two had always struck me as strangely full of animus, unlike anything I've ever seen between him and anybody else, and certainly not between any other nimskull developers and anyone, ever.

It is perhaps notable that the nimskull project has a Code of Conduct (and a rather good one at that). I think it is indicative of serious intention to make a welcoming community, and I don't think that contributing members having had in the past what really struck me as personal grievances turning into open-air malice undermines that: particularly, such behavior would fall under the Code of Conduct itself. But again, I have not seen such behavior - quite the opposite! - outside of what happened between you two.


I don't hate anybody's guts and never have. There was a time when I could not post in the Nim community without an aggressive response from the person you mention - and would often enter the space to find many unprompted aggressive messages about me from him in what amounted to a harassment campaign. Contrary to your experiences, I have seen him frequently interact with others in the same way (and some have left the community because of it).

I'm glad that their community has a code of conduct. It is unfortunate that members there have never applied those rules to their interactions with and about me, nor with many others they targeted in the Nim community.

For your clarity going forward: I did not ban him, though I was one of the people who advised and agreed with that course of action. All decisions on whether to ban someone or not were taken by Araq (at the time at least).

It surprises me to hear you say you have never seen him engage that way with anybody else, given that you acknowledge that his banning was deserved. If I recall he was banned for aggressive behaviour towards _multiple_ members of the community.


Why?


Because of experience I’ve had with the abusive behaviour of that community towards me and others.


Your mileage is sadly spot on.


D has some great developers working on it, past and present.

The thing that moved me away from it was a lack of purpose, which isn't exactly how I would have put it at the time, but in hindsight seems obvious.

The way in which I'm certain that this is true is that if you had the leads try to create a Venn diagram of what the project does and what is central to it vs periphery, they would fail to agree. And if they fail to do that, they will fail to coordinate on the features. And that creates the stagnation and difficulty with contributions.

As well, the reason why that failure leads to a retreat into technical digression - which I know from personal experience happens - is that it deflects questions by adding scope. It promises an escape from the confrontation: "if I just accumulate more work, it will all come together someday". The problem is that adding scope doesn't solve the contradictions that made you retreat in the first place; it can actually deepen them by creating sunk costs. It's a bad adaptation to the challenge.

If we compare it with, for example, the Niklaus Wirth approach, he would have the Venn diagram completely clear in his mind before committing to any real implementation. And therefore the language would do exactly what he wanted it to do, and it wouldn't need to be scoped into a 20-year project. Thus, he made many languages during his life, mostly of a similar flavor, but with a clear intent of adding something specific that he hadn't covered before.


> Good luck to these guys on trying to bring life back to it, but frankly I think at this point most people have given up and moved on to alternatives like Rust, Nim, Zig, etc...

And back in 2015, folks were saying the same thing, but with different alternative languages. I remember twenty years ago when C++ was dead. Oh, and remember the good old days when Java was dead?

I'm not sure why programming language discussions always turn into quantitative statements that have no data, but fully support the commenter's position.


You just refreshed this memory for me:

I remember 25 years ago when StarCraft was dead and Age of Empires was it. It happened at the same moment my university switched from C++ to Java. I hated that change.

And I remember 15 years ago when the biggest e-sports events in the world were done using StarCraft.

Today I am back in university, and we are using C++. =)


Because it's not very difficult to lurk on forums like this and get a pretty accurate idea of which programming languages are on the way in or out. Or look at GitHub/Stackoverflow surveys and data if you want something quantitative.

I don't think many people thought C++ was dead 20 years ago; that seems like revisionism. Same with Java.

People looking for a C++ replacement have definitely moved on to Rust, Zig and maybe Go & Nim. Not D. I don't see how you could seriously argue otherwise.

Some languages are pretty easy to predict, e.g. Ruby is going to decline quite quickly. C++ is going to stick around for a long time because of its current enormous usage. PHP will probably stick around for a while but slowly decline like Visual Basic and Perl. Rust is going to gain in popularity for a long time and probably stick around for a very long time.

Some are more difficult to predict. I'm not sure what will happen to Go or Nim for example.


> I don't think many people thought C++ was dead 20 years ago

Sure they did. Bjarne was in that camp. That's why C++11 came out, and the C++ of 20 years ago is dead.

> Same with Java.

It was very common to hear comments like "The JVM is great, but Java is terrible. Use Scala, Clojure, [language of the month]."

> People looking for a C++ replacement have definitely moved on to Rust, Zig and maybe Go & Nim. Not D. I don't see how you could seriously argue otherwise.

Do you have some numbers on how many moved to Zig and Nim? I'm not looking for "I've seen comments on the internet" but actual numbers. I'd expect it to be rounding error relative to the population of C++ users. And I seriously doubt the numbers would be very high for Go.

But the point still stands that people have been saying this about D for a gazillion years. It might be more credible if they'd at least change the story up a bit when they say it.


> it's cringe seeing key leaders talking down to people, or dismissing people's concerns, or just have this ego about themselves like they know what's best and everyone should shutup and just go along with them.

That's not the case. The job of language designers is to say "no" all the time, and Walter is certainly a model for how to speak with users.


I don't think Walter has ever intentionally talked down to anyone in my time, but his tone can be subtly belittling over some details.

I don't think it's intentional, but the argument always begins with explaining some detail as if you didn't know it existed, even though you'd have to know it to be able to bring it up in the first place.

This is not just me; I've had this discussion with a few people. One of a short list of complaints of this kind, it must be said.


Time will tell how the maintainers of the fork will react to similar criticism. It's very easy to stand by the sidelines as a community member and to piss on the leadership, it's an entirely different thing to be in that position yourself. Ask GvR what it feels like to herd a band of cats over a period of decades. You need to be in it for the long run and you need to be utterly dedicated to make this really work.


> You need to be in it for the long run and you need to be utterly dedicated to make this really work.

There also has to be an escape hatch to make this work.

E.g., Linus can say that the __is_constexpr preprocessor hack is the product of a demented mind, and he'll merge that absolute monstrosity thanking the contributor.

As a leader you get the contributor community that you get. You can either make the development process workable for them, or you can risk the project languishing/forking.

Put another way: if a leader's concept of what their project ought to be/become veers too far from what the community is actively coding up to be merged, it's not going to go well.

No idea if this applies to D. But I'm absolutely certain Python's dev process has had plenty of escape hatches for resolving development issues. (And I'd guess there are also projects that have too many escape hatches, but I'm guessing that probably doesn't apply to D.)


I haven't seen any of that, tbh. I have seen a lot of disrespect come in Walter's direction and him responding in a gracious way, actually. The existence of criticism means it is allowed, and thus the project is "open".


I would never fault Walter's graciousness, rather that I and others just find it difficult to establish common ground quite often. You can see this very clearly in the discussion of the SQL applications of the string interpolation proposals.


That reminds me of Elm. It has (had?) so much potential, if only it were more open to accepting contributions.


> D went from a language that I remember back in 2015 was often promoted on places like reddit and here as a fresh alternative to C++ that was constantly evolving

I wrote a blog post in 2012 about D [1]. In my personal experience, the high point of D was around 2008-2009. It was already declining in popularity in 2012, IMHO, due to intense clashes within the D community and the rise of Go, Rust and C++11 outside the community. D remains one of my favorite languages but I agree with you that Rust, Nim or Zig is the better choice for most programmers today.

[1] https://attractivechaos.wordpress.com/2012/02/28/timeline-of...


Or Go... Or among the new kids on the block - Mojo.

Heck, even .NET is pushing NativeAOT (their brand name for native code compilation) hard these days. Still with sizable gaps, but closing with every new release, and you can now develop full-fledged desktop apps or a web service and have it be self-contained native code.

I think .NET in particular is a harbinger for D. When even a "managed language" platform goes native code at its core, and with Microsoft's industry backing giving an entirely different angle of attack than D's minimal defense, you just know it's over. D will live on, but in the way the Amiga, SNES, and ZX Spectrum live on among enthusiasts.


.NET has always had native support, although in various forms.

NGEN, only usable for fast startup, using dynamic linking.

Singularity's and Midori's respective C# dialects.

MDIL in Windows 8.x based on Singularity Bartok compiler toolchain.

.NET Native on UWP, taken from Project N, inspired by Midori's System C#.

Mono AOT used by Xamarin for iOS and Android.

Homebrew OSes like CosmOS.

Unity's IL2CPP toolchain.


> D's community is actually very welcoming but man, watch some of the DConf videos...

Not welcoming for everyone, apparently. I don't know much about D beyond seeing a post from some dev rage-quitting D every few months, due to some people in the community being terrible and the leadership not stepping in to limit that.


Maybe they shouldn’t bring D back to life… drop the name, make a clean break and come up with a new one.


Digital Mars was an awesome name.

OpenD is not that awesome.


Do


Dew


Let's call it E.


E, F, G, and H are already taken.

It's hard to say whether I is available, or un-googleable.

J through W are also taken.

X++ is taken, but X seems to be available.

Y and Z are taken.

A would be an excellent choice were it not completely un-googleable.

Finally, B is a famous precursor to C.


>A would be an excellent choice were it not completely un-googleable.

APL is named after the book A Programming Language


The ultimate language will be TPL (the programming language)


til chatGPT renames itself "I, programming language"

which is extra clever because the "boot" button on early IBM mainframes was labelled IPL "initial program load".


TUPL, no?

Even better if it only worked with tuples.


D++


--C



[flagged]


D has been Open Source since the GNU and LLVM compilers came out, maybe 15 years ago. The dmd backend was converted to full Open Source several years ago. So now all 3 D compilers are fully open source.

Full open source is what allows others to fork it without any licensing issues.


D is completely opensource already (https://github.com/dlang/dmd). The "open" of OpenD is just ADR saying that OpenD will be more open to new language features than D has historically been.


which just means a different person is guiding the project


It’s about the policies and the culture that the leadership evokes, just like the leader of a country. It does matter who is in charge.


Sure. But the existing option is the inventors who started the project from nothing and have committed decades of their life.

Just like politics like you said, outsiders often think it's easy to come in and fix everything.

Dividing the community makes sense if things are going off the deep end or no longer being supported. It doesn't make sense in hopes of getting a slightly more efficient approval process.


It is open source. Open source doesn't mean that arbitrary pull requests are welcome, see SQLite or Lua for other examples.


That's fair and thanks for the elucidation on the topic


[flagged]


This is a nasty comment, and lacks any references to those github PR discussions. It comes across very much as somebody who has no experience in managing even halfway large projects.

I've been writing a compiler and I put in a lot of work to keep things simple, manageable and comprehensible, and therefore most likely correct. If I hadn't, I'd be stuck in a swamp of bad code and spending more time fixing old problems than extending the language. I'd say I spent about 30% of the time refactoring just for this. If I open source my work, I will reject code that is not up to my standards, for the same reason that Walter does.


It is, and it was written mostly out of disappointment after reading the PR discussions. I still believe that Walter should have at least addressed it instead of going in circles and acting as if nothing is happening. It is clear that what has been brewing behind the scenes for years just poured out now, and unsurprisingly it doesn't look nice. We can talk volumes about how hard it is to manage large projects and keep everyone happy, when in the end nobody cares if there is no progress. The Weka people pretty much said it upfront.


OK, so please show me those discussions and change my mind.


My suggestions: completely get rid of the GC (use refcounts or borrowed pointers), add macros (see FreeBasic's extensions of C macros), and migrate everything complex/optional out of the stdlib into specific packages (the Rust ecosystem is centered on packages, not the stdlib), and you will have a solid competitor to C++ and perhaps Rust. Rust of course has better macros and a better type system, but D looks simpler and more approachable for the rapid prototyping/tinkering that will drive adoption.


Your suggestions are moving backward; GC by default and no macros are what made D intuitive and Pythonic, which are big pluses in any modern programming language. Going non-GC is not really needed unless you're working on OS control primitives, but again, D gives you that alternative, unlike Go. Every modern language should avoid macros like the plague, otherwise you will sooner or later create a ghetto inside your community, not unlike Ruby on Rails. C++ and Rust are complex languages, as is evident from their long compilation times, while Go and D are much simpler and hence compile faster. D gets many things right, and C++ and later languages like Nim have been copying D's unique features left and right since its introduction.


If you don't have a state-of-the-art GC, it's a burden to have one in your language, since it will cripple performance: the GC pausing threads, collection becoming slower as your heap grows, and memory fragmentation, for example.

D doesn't have a state-of-the-art GC, so it should perhaps focus on its strengths, being a better C/C++, and embrace the concept of allocators:

https://dlang.org/phobos/std_experimental_allocator.html

To me, D shines with its -betterC mode: it completely strips the runtime, giving you a great low-level language to work with, a genuinely better C/C++. I would have quit D a long time ago if it didn't have this compiler flag.


One unexpected benefit of GC is it makes compile time function execution easy to write code for. The use of the GC there does not transfer to the runtime.

Removing the GC makes CTFE code much clunkier.

One can see this in betterC mode - the GC is allowed for CTFE code.


Then D will compete with GC languages that outclass it in ecosystem diversity and metaprogramming (i.e. it will remain the same niche language it has been for decades). Having a huge runtime with GC doesn't seem appealing or efficient.


Is the field of competition really so crowded here? I'm seeing Go, but what else?

It seems a lot of the other languages:

- don't have an easy compilation story (interpreted, VM, compilers as secondary implementations)

- lean more on the functional side (Ocaml, Haskell)

- otherwise clash with the common BCPL-family mindset (Oberon, arguably Go)

There definitely seems room for an "easier C++". Heck, given how popular Rust is due to backing and support from the functional crowd, leaning into ease of use and imperative programming might be a sufficiently large niche.


Fret not: Python was just a fringe programming language for more than two decades, and only then did it become popular. If you have the right fundamentals, success will eventually come your way.


Python is an interpreted language, and is strongly but dynamically typed.

D is a native strongly typed language

You don't make game engines in Python, and certainly not drivers; Python devs fall back to C when they need performance, which should give you a hint.


Sorry, I don't get your point; the latest D compiler does support native C in addition to the much safer D-as-BetterC. D is even better: you can do everything seamlessly in the same D ecosystem, supported by the GCC compiler suite, without even leaving for another language ecosystem. Heck, D was even faster than Fortran and C++ for a native HPC library seven years ago, whereas Rust and Julia still fall back on them for HPC routines to this day [1].

[1]Numeric age for D: Mir GLAS is faster than OpenBLAS and Eigen:

http://blog.mir.dlang.io/glas/benchmark/openblas/2016/09/23/...


Exactly, and you achieve this by not embracing the GC, which is not what this fork wants to do.

And Python's selling point isn't the fact that it has a GC, it's its interpreted/dynamic nature

D is pragmatic about memory allocation strategy, this fork won't be


> D is pragmatic about memory allocation strategy, this fork won't be

Better than a pragmatic approach to memory allocation is to not have to think about it at all. For many applications, you can use the GC and not even know anything about memory allocation.

If you embrace the GC, you make a lot more progress, since you can add things to the standard library and the language quickly and easily. Users not wanting a GC should use something else. Catering to users that don't want a GC imposes a big tax on everything and everyone.

Embracing the GC doesn't mean they're going to strip out the reference counting and unique pointers already in the standard library.


D is a general-purpose programming language similar to Python, but most would agree that in a computing sense it is more general than Python, since the latter is rather limited in the hardcore HPC domain (bit twiddling, etc.).

Most of today's and tomorrow's programmers will be just fine, and better off, with GC-based languages, compiled or interpreted, just as modern drivers are better off with an automatic transmission than a manual. Heck, even F1 drivers now compete with automatic transmissions (DCT).

The fact that D embraces the GC as a default is not a disadvantage but an advantage for the majority of programmers [1]. Imagine a near future where D is as popular as Python thanks to its intuitive syntax, so more libraries are written for D and the library ecosystem flourishes regardless of whether you enable the GC or not [2]. Imagine a world where you don't need to write in Python and go through a wrapper to a library in a foreign language that only a select few can understand, because most of the original library authors have passed away, the core being a language that originated in the '50s and a library developed in the '70s [3]. Imagine a world where only programming languages with GC can be JITted to Wasm [4].

[1]Go: What we got right, what we got wrong:

https://news.ycombinator.com/item?id=38874952

[2]Stop Designing Languages. Write Libraries Instead:

https://lbstanza.org/purpose_of_programming_languages.html

[3]Programming in Modern Fortran:

https://cyber.dabamos.de/programming/modernfortran/blas.html

[4] WebAssembly Garbage Collection (WasmGC) now enabled by default in Chrome:

https://developer.chrome.com/blog/wasmgc


Julia depends on OpenBLAS by default, but faster pure-Julia packages are available, too.


> intuitive and Pythonic

"Pythonic" hasn't been "intuitive" for 15 years now. Even old-school Perl is more cohesive and coherent than modern Python.


Yeah, that is why Python's popularity is on a downward trend and Perl's is the opposite /s


Or perhaps "intuitive" and "pythonic" was never Python's biggest selling point.



