I dimly remember a fairly deep analysis/comparison of C++ and D that went into how opaque or transparent abstractions were to the compiler, and what that means for performance (models) and compile times.
I thought it was by you, but can no longer find it. Might you be able to point me at it?
See here: http://www.drdobbs.com/parallel/the-case-for-d/217801225?pgn...
Are there any reasons, in your opinion, why one should start tinkering with D before spending time with Rust or Go?
The idea of BetterC is to not link to the D runtime library at all.
So, the parts of Phobos that are usable from it are the parts that:
1. are "header only", meaning all the functions are templates
2. don't use exceptions or the GC
3. don't use OOP classes (which rely on the GC and the runtime)
Not much effort has been expended cataloging this, though.
1. Would it make sense to work on (for the community in general, not necessarily you personally) a header only standard library, a sane STL if you will, for betterC?
2. Is there any interest in language support for GC and runtime independent classes/interfaces? I understand this is partly doable already by abusing templates, but I imagine the results aren't necessarily pretty or concise.
Regardless, thanks a lot for the responses. I haven't managed to write much D beyond small experiments due to ecosystem and related issues, but the ability to incrementally integrate small parts of D in a C++ codebase as freely as betterC lets me is very tempting.
It's a very good question. Most of Phobos is already templates or avoids the GC, so that wouldn't be much of a change. A more complex problem is working without exceptions. I don't have a ready answer for that; it's a good area for investigation.
D already has support for "COM" classes/interfaces which work in BetterC.
Here is Shachar Shemesh's introduction:
I was not aware of this at all. Do you ever see a time where you could compile D, with classes and all, with no garbage collection? I'm not bothered by the GC, coming from C# and Python, but it is interesting to know where D is headed, since I know there were many efforts at making D compile without any GC.
Basically, do you think it will ever make sense for D to just compile without GC by passing a compiler flag, regardless of which D features are used? Maybe it could tell you which features cannot be used without GC (sorry if this is already a feature; I use D but I'm not a guru yet).
Suppose I was writing systems software. Considering I know neither, why would I invest time learning D instead of C++? Does D offer anything in that regard?
Plus: none of the sanitizers offer quite-the-same runtime protections. They're intended for use in testing to flush out bugs, not production for avoiding bugs. As is, MSan and ASan are mutually exclusive.
And on the C++ compilers that support such language extension, how does it work with binary dependencies?
Why would it matter? AFAIK D doesn't even have an ISO standard.
> And on the C++ compilers that support such language extension, how does it work with binary dependencies?
you at least get whole-program safety for libc calls since it works by replacing such calls through dynamic loading
Sure it does.
D is defined by the DMD compiler as the reference implementation.
Whatever DMD allows for minus implementation bugs, is what the D programming language means.
C++ is defined by an ISO standard, a document that specifies what every implementation is required to implement.
So any D developer can be sure their code is compliant as long as DMD accepts it, while a C++ developer needs to pay attention to how close their favorite compiler is to ISO C++.
> you at least get whole-program safety for libc calls since it works by replacing such calls through dynamic loading
Libc is a very tiny portion of any industrial grade C++ application, and only relevant in open source systems, as it is shipped in binary form in other platforms.
That's cute, but if that's "the letter of the law", then Walter can replace DMD with emacs in the next release and it'd still be a conforming implementation, and you'd just have to rewrite your codebase in Lisp to keep it functioning. All the options available to the D programmer in that case (like "keep using the previous version") are also open to C++ programmers when their compiler makes a change that breaks their code.
In practice everyone cares about stability. The C++ standard provides a signal that "this behaviour is (not) likely to change in the next release". All other features not mentioned in the standard like compiler flags and implementation details like the stack and the heap are on par with features in D when it comes to compatibility from version to version.
The C++ programmer who writes "to the implementation," not "to the standard" is in at least as stable and predictable a situation as the comparable D programmer.
Which is a contrived argument
I do think it's a bit crazy to say that it can increase uncertainty about future compatibility, though -- on that front I think standardisation has a strictly stabilising effect.
That says everything. Thanks for this ;)
In your expert opinion, would it make more sense for a WASM backend to be part of the DUB build process or embedded into DMD itself?
I think Rust has some approaches that let you build as an add-on to Cargo (the Rust equivalent of DUB), so maybe a similar path might make sense? I do know there are efforts to bake WebAssembly into Rust's compiler down the road, though.
In an ideal world, good programming languages would win or lose on their own merits, but that doesn't seem to be the case in the world we actually live in.
D is really nice and very effective, even for a soft real-time system like ours. Most of our code doesn't do any GC, but where we don't care too much we can make our lives a little easier.
Well I suppose that depends on what you mean by "merits". D has struggled with things like IDE support and breaking changes. A lot of that is now cleared up or being worked on as the language has seen increased activity over the last few years. I expect remaining issues to get cleaned up in the near future.
Another thing that kept folks away was the garbage collector. Quite a lot of work is being done to make it possible to use as much of the language as possible without the GC. Nonetheless, for many potential users, fear of the GC was a reason to not use D.
from Walter's post: https://news.ycombinator.com/item?id=17192702
D might have been an important evolutionary step, possibly showing what does not work well for the niche it was trying to fill.
 I realize that this is not a universally held opinion, but it certainly seems to be pretty widely held.
Kotlin has reasonable compile times, GC of course, and also a reasonable concurrency story.
Go apparently has other special features, like producing a single binary and having very low-delay GC. Also, it's very simple; it's basically a pared-down Modula-2 with Oberon's methods thrown in. It also got non-fitting magic features added, like returning two values (not a tuple), or built-in generic functions (like `make(chan)`) without generics anywhere else, etc. These do not fit well together, which makes many people sad.
They even have branding best practices and so on.
Of course, Google, being an advertisement company, has a lot of know how in that area.
Too bad Go is being held back by Rob Pike's ego, in my opinion.
Plus, Rust has a thriving community whereas thanks to Adacore, Ada has a dying one. Community is critical for programming languages.
But otherwise, Ada is a brilliant and ultra safe language whose safety features go far beyond the memory and type safety that Rust features.
It's a shame.
It is the only Ada compiler available for free that is fully updated to Ada 2012, with all the remaining ones still at their '90s-style prices.
https://www.ddci.com/products_score/ (frozen in Ada 95)
Thanks to them, Ada has become a regular presence at FOSDEM and is being taught at quite a few European universities.
Also, Ada allows for compiler-assisted runtime allocation. For example, you can declare a data structure with the size you want; if it fits on the stack it will be allocated there, otherwise you get an exception.
Deallocation C-style requires the Unchecked_Deallocation package.
Ada 95 removed GC from the standard, as no compiler ever bothered to implement it.
Here is an overview presented at FOSDEM 2016.
With SPARK and Ada 2012, Ada allows for better security constraints than Rust.
"Real-Time Critical Systems: HRM Prototype & Ada Integration"
I'm no expert in either Ada or Rust, but I _suspect_ that the type system is more expressive in Rust, allowing for more precise static constraints.
Ada, of course, has decades of prod experience, though.
If you combine Ada with SPARK you get some (still) amazing compile-time proof checking of projects.
Rust _could_ develop similar abilities (I hope!), but it'd need more generalized linear types or dependent types. Though one could write macros with a targeted higher-level type-checking DSL for, say, embedded development targets.
Including compilation to WebAssembly as target.
Support from a big corp is one way to get things like IDE support ironed out, as they can throw extra resources at it.
The RLS is used in VSCode, I think Atom, I’ve seen a port for Emacs, and I think Sublime.
The community effort is behind the RLS, the Jetbrains plugin for IntelliJ doesn’t use any of the Rust compiler, as far as I’m aware, but it does have a lot of fans, so they’re doing something right.
As a Java fan of IntelliJ, I still prefer the RLS+VSCode, but it’s great seeing all the IDEs being developed for Rust!
MSFT is also contributing to Haskell by supporting Haskell-related research, and employing Simon Peyton Jones.
Jane Street Capital is also heavily investing in OCaml.
While these languages occupy a different niche than D, they are arguably more innovative and integrate far more research/PLT theory.
Suppose I am a C++ guy who writes a lot of time-sensitive code. Who else has made the leap, and what were your impressions? How often do you turn up nothing on Stack Overflow? How often do you find there's no lib where you'd expect to have a few in C++?
"This is a touchy topic that already has filled entire blog posts. Virtually everyone in real-time audio is using C++ and it's probably still the sanest choice to make. [...]
I worked with both languages for years and felt qualified enough for the inevitable bullet point comparison. The most enabling thing is the D ecosystem and package management through DUB, which makes messing with dependencies basically a solved problem. Development seems to "flow" way more, and I tend to like the end result better in a way that is undoubtedly personal."
Huh. Now I'm interested.
One constant in my professional (and hobby) careers has been "DLL Hell". I'd pretty much do anything to be hassle free.
Thanks for the tip.
You add the "derelict-sdl2" package to your DUB dependency list. DUB will download and build that "derelict-sdl2" library. This is a small library that will load function pointers from the SDL2 dynlib (aka dynamic loading).
You'll still have to distribute your cross-platform app with SDL2, however there is no linker option, the linker doesn't need to know about an import library. Voila, same build for all platforms.
Conversely in C and C++ you would probably only have static bindings as a choice, and it can be a pain for cross-platform builds. On Linux, it has to be packaged. On Windows, you must find the right .lib.
There can be a considerable list of libraries in the linker settings, essentially because there is no package manager that makes dependencies _composable_: dependencies then leak into linker settings all across the chain. C++ forums are full of people failing to build; that's not the case with D.
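The runtime-loading idea isn't D-specific; Python's ctypes does the same dance, which makes for a compact illustration. This sketch assumes a Unix-ish system where the C math library can be located -- the point is that no library name ever appears in linker settings:

```python
import ctypes
import ctypes.util

# Dynamic loading in miniature: resolve the C math library at runtime
# instead of naming it at link time. This is the same idea derelict-style
# D bindings use for SDL2, just shown with Python's ctypes.
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Function pointers are looked up by name after the library is loaded.
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

print(libm.sqrt(16.0))  # → 4.0
```

The trade-off is the usual one for dynamic loading: missing symbols surface at runtime rather than at build time.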
No, but nor can you with regular ol' linking against shared libraries. Or against static libraries (except for some mundane stuff like function reordering). LTO requires toolchain support and occurs before code generation.
Yes it's just that.
> Is it possible to make link time optimization work with dynamic loading?
Not that I know of (not sure).
However - it's another big language with a lot of accumulated history and a lot of stuff that isn't immediately needed in the day-to-day. That makes it intimidating to learn and more geared towards "megacorp" codebases, when I really want something small and suitable for hobby/education projects that go near the metal. So I've also searched for a dedicated "better C", which led me to Zig. It's still in its early stages, so it doesn't face the tightrope act of D development. And the language is uncomplicated, polishing up pretty much every wart and major source of error that's in C (even the manual memory stuff - not by eliminating the footguns, but by making safe usage idioms easy), while playing really well with existing C code.
D's forum/mailing list was established well before SO was a thing, so the answer is "pretty often", but with the addition "you don't need SO because you can talk directly to the D developers". I've not seen another major language where the core developers, and even creators of the language, were so available.
Doesn't even require registration.
Calling C++ directly is very much an option, although interlingual exception handling is a WIP.
In fact, the only two production cases of primarily background GCs that I can think of at the moment are some Lisp Machines (IIRC LMI) with HW-assisted incremental copying GC, and Azul's GC, which uses quite a similar approach.
That said, since Rust hit 1.0 and the language got cleaned up, I’ve stopped writing D at all. Not plugging Rust, just the facts. Better language for my space.
Having to recreate everything from scratch is a massive effort; gradually refactoring an application while having access to huge amounts of third-party libs is an enabler (though all those libs will use non-idiomatic patterns, so they have to be contained).
I wonder why that is? Maybe it just needs a cooler name (:
It's certainly convenient that an `import` inside a D function body can tackle both issues at once.
I don't know D, and I had a quick Google, but couldn't find anything by exactly that name - apologies for the lazy question.
If so, I believe imports in Python and Rust work similarly, and, say, JS's 'require' seems to too.
    from module import name, name, name
    import cElementTree as ElementTree
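If it helps to see the analogue concretely: in Python an import inside a function body is scoped to that function and only executed when the function first runs, which is roughly the effect of an `import` inside a D function body. A small sketch (the function name is made up for illustration):

```python
def shout(text):
    # This import is local to shout() and is only executed when the
    # function is called -- the module name does not leak into the
    # enclosing scope.
    import string
    return text.upper() + string.punctuation[0]

print(shout("hello"))  # → HELLO!
```

Deferring the import this way also means the module-load cost is only paid if the function is actually used.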
Try a simple language, maybe devise an emulator to go along with it.
So far I have
The emulator -> https://github.com/Lerc/kwak-8
An assembler -> https://github.com/Lerc/AvrAsm
A testbed playground -> http://fingswotidun.com/avr/AvrAsm/Testbed/
In hindsight, attempting an assembler before a full compiler is a really good idea. It lets you encounter a lot of the pitfalls before you hit them in a much more difficult environment.
Of course, a compiler backend is a much deeper concept, getting into things like register colouring and such. If you were to write a simple stack-based VM, it'd be much simpler. I've thought of doing exactly this for embedded systems projects where all I have is Assembly, or a very bad C compiler.
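As a sketch of how small the core of such a stack-based VM can be -- this is a hypothetical four-opcode machine, not tied to any of the projects above:

```python
def run(program):
    """Interpret a tiny stack machine: a list of (op, arg) tuples
    operating on a single operand stack; returns the top of stack."""
    stack = []
    for op, arg in program:
        if op == "push":
            stack.append(arg)
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "neg":
            stack.append(-stack.pop())
        else:
            raise ValueError("unknown opcode: " + op)
    return stack[-1]

# (2 + 3) * 4 compiles to postfix: 2 3 add 4 mul
print(run([("push", 2), ("push", 3), ("add", None),
           ("push", 4), ("mul", None)]))  # → 20
```

A compiler targeting this needs no register allocation at all: expression trees emit themselves in postfix order.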
Interestingly, the toy language you're writing a compiler for doesn't have operator precedence except for parentheses, so it sidesteps that problem.
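For contrast, handling precedence in a real expression grammar usually comes down to precedence climbing. A minimal sketch -- the binding powers and token set here are invented for the example:

```python
import re

# Higher number = binds tighter; this table drives the whole parser.
PRECEDENCE = {"+": 1, "-": 1, "*": 2, "/": 2}

def tokenize(src):
    return re.findall(r"\d+|[()+\-*/]", src)

def parse(tokens, min_prec=0):
    """Precedence climbing: keep absorbing operators at or above
    min_prec; recurse with a higher floor for the right-hand side."""
    left = parse_atom(tokens)
    while tokens and tokens[0] in PRECEDENCE and PRECEDENCE[tokens[0]] >= min_prec:
        op = tokens.pop(0)
        right = parse(tokens, PRECEDENCE[op] + 1)
        left = (op, left, right)
    return left

def parse_atom(tokens):
    tok = tokens.pop(0)
    if tok == "(":
        node = parse(tokens)
        tokens.pop(0)  # consume ")"
        return node
    return int(tok)

print(parse(tokenize("1+2*3")))  # → ('+', 1, ('*', 2, 3))
```

The `+ 1` in the recursive call makes these operators left-associative; dropping it would make them right-associative instead.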
Also Ply has been surpassed for many years by better parsing tools in Python. Sly, which was demonstrated, doesn't seem to be production grade yet.
So really, you're surprised that not many people have taken an intro to compilers course?
I think my reaction was a knee-jerk reaction against how easily people become impressed by things these days. It's ok to be encouraging towards others but in an atmosphere of overly-positive attitudes (especially in SV), there is a certain lack of discernment with respect to ideas and technologies which leads to bandwagonning and susceptibility to marketing (case in point, the early adopters of MongoDB). I'm convinced this has a cost to the tech community.
But that is no excuse for me to have a holier-than-thou attitude. It's condescending and discouraging toward people who are just trying to learn. I retract my words.
Is this true though? I mean, Google famously didn't give a shit about hiring Max Howell (creator of Homebrew) because of hazing-style pointless whiteboard trivia.
I’d guess the number of circles where a thoughtful side project that digs into deep computer science fundamentals as well as difficult implementation specifics would even get you the slightest attention for an interview is close enough to zero as to render it meaningless.
If you think about a lot of possible different self-study options, ranging from cramming and shallow memorization of leetcode garbage or Cracking the Coding Interview, all the way to patiently applying craftsmanship and self-learning to a side project that exercises and demonstrates core fundamentals in a pragmatic way, the shallow memorization garbage will earn you money, while the side project will earn you an email with a hyperlink to HackerRank.
It is one of the worst aspects of our industry, and a major reason why I would advise younger people to treat the idea that there are direct, stable, good-paying jobs with job security in the tech industry with a lot of skepticism. It’s an industry constantly trying to invent new ways to shift the demographics more and more into the least experienced and least expensive quantiles of candidates, and to arrange businesses where actual software labor productivity doesn’t have a strong tie to the company’s bottom line, except maybe for a tiny population of experts.
: < https://www.quora.com/Whats-the-logic-behind-Google-rejectin... >
And I would do it again in a heartbeat.
As Howell wrote: "I am often a dick, I am often difficult, I often don’t know computer science, but. BUT. I make really good things, maybe they aren't perfect, but people really like them".
My belief, which it seems you do not share, is that a project like homebrew is perfectly analogous to a project like a self-made compiler. Both illustrate broader-scale systems thinking; although the specific computer science fundamentals will be different for each of those two problems, a system like homebrew likely also has to focus more on deployment, user interface, and project management tasks (which, I would argue, makes it more relevant for most hiring situations).
I can't comment on whether Howell's self-deprecation was meant to be tongue-in-cheek criticism of Google or if he sincerely meant it, and of course those interpersonal reasons could have been at fault for a failed interview (although it doesn't seem that anyone disputes that it was actually just a straightforward result of some binary tree trivia).
Writing a C compiler is not like other side projects. It explicitly exercises a whole bunch of basic computer science skills, so if you can write a compiler, you've obviously got a good grasp of those.
So, no, a project like homebrew is not perfectly analogous to a compiler. The hardest CS problem in there in my opinion is the dependency resolution, which can be a bitch, but it just doesn't have the breadth of problems that a compiler has.
Creating a package manager, which is mostly "gluing things together without understanding how they work", is a very different skillset than being able to work from first principles. I suspect Google was looking more for the latter.
Speaking as someone who has written pieces of compilers, emulators, assemblers, and related software in my spare time, I can say that even mentioning those things casually is enough to get some "you did what?" looks.
Most HR departments will only care about the latest set of projects and education, barely giving any value to side projects.
Yes, there are those that care about side projects, passion and such, but most of them tend to be startups and sometimes not with the desired set of benefits or requiring relocation.
Edit: wrote "talk to me" instead of "hire me"
If you refer to this:
I also don't know why anybody would ever need to implement "inverting" a binary tree except for the Google exam.
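For what it's worth, the infamous exercise itself is tiny -- the interview pain is producing it cold on a whiteboard, not the algorithm. A sketch:

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def invert(node):
    # Swap the children recursively; "inverting" just mirrors the tree.
    if node is not None:
        node.left, node.right = invert(node.right), invert(node.left)
    return node

tree = Node(1, Node(2), Node(3))
print(invert(tree).left.value)  # → 3
```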
> Most likely he implemented more binary trees in his live
"Most likely"? Actually he "just" invented (not a complete list):
- Unix (with Ritchie) (in assembly first)
- the language B, based on which Ritchie developed C.
- the notation for the regular expressions everybody uses today
- the algorithm for fast regular expression handling
- the chess computer that won the 3rd World Computer Chess Championship 1980
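That regular expression algorithm is worth a closer look: it avoids backtracking by carrying every live match state forward at once. A toy reconstruction of the idea, supporting only literals, '.', and '*' -- this is the state-set simulation, not Thompson's actual NFA construction:

```python
def compile_pat(pattern):
    """Split 'a*b' into [('a', True), ('b', False)]: (atom, starred) pairs."""
    atoms, i = [], 0
    while i < len(pattern):
        starred = i + 1 < len(pattern) and pattern[i + 1] == "*"
        atoms.append((pattern[i], starred))
        i += 2 if starred else 1
    return atoms

def match(pattern, text):
    """Anchored match by tracking the set of live pattern positions --
    no backtracking, so time is O(len(text) * len(pattern))."""
    atoms = compile_pat(pattern)
    n = len(atoms)

    def closure(states):
        # A starred atom may match zero times, so its position also
        # implies the position right after it.
        out, frontier = set(states), list(states)
        while frontier:
            s = frontier.pop()
            if s < n and atoms[s][1] and s + 1 not in out:
                out.add(s + 1)
                frontier.append(s + 1)
        return out

    states = closure({0})
    for ch in text:
        step = set()
        for s in states:
            if s < n and atoms[s][0] in (".", ch):
                step.add(s + 1)
                if atoms[s][1]:
                    step.add(s)  # a starred atom can match again
        states = closure(step)
    return n in states

print(match("a*b", "aaab"))  # → True
```

Because the live-state set can never exceed the pattern length, the pathological exponential blowups of backtracking engines simply can't happen here.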
He implemented a BCPL dialect, which he called B, in fact if you compare B and BCPL programming manuals they are quite similar.
> "Most likely"? Actually he "just" invented (not a complete list):
Please mind my full quote: "... than I thought of". I thought about (and implemented) trees quite a bit myself, and grant him a few factors on top of that.
I know I thought it would be a lot harder than it was. I spent some hours checking example interview questions, and it was definitely enough. If anything, I might have been too focused on thinking in terms of established algorithms, and too little focused on reasoning on my own.
But it's irrelevant to what I wanted to point to: that somebody already accepted Ken Thompson to Google. Then some absurd bureaucracy decided that Ken Thompson has to pass the exams given to the young beginners. Probably reasoning something like somebody here "explained" "so that developers can be moved around."
Whoever is not aware of these "details", please see my other post for what Ken Thompson invented before coming to Google. Such "minor things" as the Unix OS, the language that preceded and inspired C, and the currently used kind of regular expressions.
Would you agree with me that if you brought Ken Thompson for that to your company (or using that excuse to require from him to "pass the C exam") something is fundamentally wrong?
He actually co-invented Unix and worked on the "first C" before C was even called C.
It's like hiring J. K. Rowling and then requiring from her to pass the Harry Potter trivia quiz in order to allow her to write text messages.
2) C can be written in many styles, whatever he writes should be an easy uptake for later developers, thus adhere to the common style
I have no doubt that he can express anything in C, and I assume it would be of good quality, but he might take a different approach, which makes it harder for a junior who is trained heavily in Google style and has to fix a bug or add a feature a bit later.
That screens out a few employers, but not nearly as many as you think. Most are desperately looking for a way to determine that a candidate is good. Sitting back while they explain in increasing detail how their pet project works is close to ideal here.
People usually omit the part that Homebrew was at that point a quite shitty reiteration of ideas well implemented in many other places (RPM/Yum, DEB/APT, and half a dozen other package managers). It was a common complaint that installing one package could break several others, because Homebrew updated some common dependency and the new one was missing symbols or had an otherwise incompatible ABI. There was also this hilarious bug report that Homebrew served everything over plain HTTP (and Homebrew had no digital signatures whatsoever), and the ticket got closed by a Homebrew team member saying it's not a problem, which spoke tons about their competency in package management.

I don't know how the situation looks now, but when Howell was interviewed by Google, Homebrew was not something you could be proud of writing.
Anyone should be proud of writing a widely used tool that provides value to so many individuals and companies, regardless of nitpicks from the peanut gallery.
I realize we might just disagree in our assessments, but I think this is an unfair characterization. At that point in time, homebrew was basically the only way I could get cross-platform support to work for several image processing projects that my lab was working on. To us, despite its obvious rough-around-the-edges needs for incremental improvement over time, it was a complete lifesaver.
I think this speaks to a larger point too about the way that engineers cut down and compete with other engineers in petty ways, instead of acknowledging that getting a complex working system up and running that adds value for people is a huge accomplishment that utterly dwarfs any sort of algorithm trivia in terms of whether a candidate is worth hiring, especially in a company like Google.
The perspective that,
> "when Howell was interviewed by Google, Homebrew was not something you could be proud of writing."
is just so incompatible with my world view about what counts in software engineering that it likely means we are coming from perspectives so far apart that there's little hope we could agree.
> [...] likely [...] we are coming from perspectives so far apart that there's little hope we could agree.
You see, I come from the Linux administration field. We have had package managers deployed in the field for two decades now. These package managers usually don't silently break your library dependencies on update (though Fedora and desktop Ubuntu break them with a lot of noise) and have integrity ensured cryptographically. I take these features almost for granted, as I have worked with them every day for a long time. Any package manager that provides anything less than that is, in my view, badly designed, especially because there are many examples of how it should be done; it just takes some effort to learn them.

On the other hand, you, coming from the macOS angle, didn't have prior experience of working with many package managers -- or at least that's my guess. If I am right, it's quite clear why a poorly designed but popular package manager (i.e. one that has many packages available already) was a huge improvement for you. It was the same for Debian's APT twenty years ago, but APT had pretty much no prior art and the internet was not the viruses/worms/spyware-driven machine it is nowadays.
No, as a user of every previous package manager for macOS, I can tell you that Homebrew was successful because it actually worked, and worked really well, and it kept working even for prerelease versions of macOS.
This doesn’t mean that it’s well designed or secure. But it’s not a marketing cream puff.
There were package managers before Homebrew, such as MacPorts and Fink, which went as far as to use APT.
Just to clarify this: I personally have always used Debian and Ubuntu, and spent a lot of time with APT (including helping to build a PPA wrapping a custom wrapper for NetworkManager for my company, because their third-party VPN solution prevented all of us who use solely Linux from being able to work remotely for a while).
Despite my personal views that Linux provides me with better tooling, many of my colleagues don't share that view, and they like to use Mac. At times, homebrew has given us a super easy way to solve some problems that arise from that situation.
This doesn't mean it's perfect. Just that it solves specific problems that a lot of users have in a way where they are not adversely affected by its downsides. Every tool out there will have downsides, and a subset of the population who hates the tool because of those downsides.
It just strikes me as incredibly myopic to privilege your own experience in system administration as oh so much more enlightened than that of a person managing the overall problem of a package management tool that users are actually using with success.
It's just tone deaf to me to say, "but for these engineering reason that I care about, I discredit the entire achievement of the project because the author didn't design it the way I want."
And even if it's not merely "the way you want" but includes broader and more established criteria from history of package management solutions, it's still nonsense to use that to dismiss it.
It's a lot like the shortsighted dismissals of early MongoDB, which went after very specific customer use cases at the expense of big design considerations that the history of database engineering had come to accept as standard.
MongoDB used this to build a customer base who was happy with what they engineered. And then began investing in going back to fix the things that were glaringly suboptimal from historical database perspective.
Lots of people wallowed in their constant criticism of MongoDB, turning their noses up at it. Yet now it's a public company, releasing lots of features that bring it into alignment with those earlier best practices that it was forced to omit for the sake of addressing shorter term workflow needs of its customers.
To me, this is effective engineering. Sticking my nose up at MongoDB because of its earlier choices to omit accepted database designs would be a silly way to look at it.
I'm not saying that example maps perfectly to homebrew, but it is similar in spirit.
It's one thing to say, "I don't like that homebrew prioritized XYZ for its users short term experience instead of addressing big, underlying, sysadmin design considerations ABC."
It's totally different to say that this choice entirely discredits the project.
Users successfully use plenty of ill-suited tools for various purposes. It doesn't make the tools any better, merely more widely used. There are many examples of better products dying while worse products take over the market.

My experience with system administration gives me better standing ground for assessing the technical aspects of package managers. These technical aspects have little to do with the success of any particular package manager, though, which I find unfortunate.
> To me, this [MongoDB; Homebrew too?] is effective engineering.
Effective marketing. Engineering, not quite, especially given MongoDB's track record of losing real data because of operations tuned for benchmarks.
> I'm not saying that example maps perfectly to homebrew, but it is similar in spirit.
It is a similar case indeed, especially in how marketing was much more important for adoption than technical grounds.
Still is afaik.
It’s hard to take a package manager seriously when it refuses to run as root and installs system-wide tools as user-owned.
A few of the posts in that sub-feed are about D videos and confs, so you can skip past those if you want. Some of the videos are interesting too, though.
Is D really a top 20 language? I don't remember seeing it anywhere close to the top 20.
https://www.tiobe.com/tiobe-index/ has them at #31.
They just decided to keep the website running.
Well, I'm just about to restart an OpenGL project, so I think I'll go wholeheartedly into D this time.
It’s a 10” iPad Pro (maybe you can check the log and see which advertiser is being naughty).