Hacker News
How I Came to Write D (2014) (drdobbs.com)
317 points by jxub on June 3, 2018 | 188 comments



Logged in today and saw this! Walter here, AMA!


Hi!

I dimly remember a fairly deep analysis/comparison of C++ and D that went into how opaque or transparent abstractions were to the compiler, and what that means for performance (models) and compile times.

I thought it was by you, but can no longer find it. Might you be able to point me at it?


I don't recall writing such an article. It sounds intriguing; if you do find it, I'd like to see it.


Sorry! I’ll keep looking.


Might possibly be Andrei Alexandrescu's "The Case for D", also in Dr. Dobb's.

See here: http://www.drdobbs.com/parallel/the-case-for-d/217801225?pgn...


What is your opinion of Rust? It seems D and Rust can both be seen as saner successors to C++, so I imagine they attract a similar crowd, and maybe compete for mindshare. Any ideas from Rust you wish you'd incorporated in D? Any ideas you think Rust ought to borrow from D?


I have been intrigued to learn D for a long time now but I never had the time to tinker with it before learning other languages that are more widely used.

Are there any reasons, in your opinion, why one should start tinkering with D before spending time with Rust or Go?


What's the plan for Phobos (and more generally the ecosystem) to be usable from the betterC subset of the language? In your opinion, how much of the development focus is aimed in that direction? Are there any parts of the language itself that will be made available to betterC that currently aren't?


Most of the effort for BetterC has come from myself. Based on the buzz about it, its usage is steadily expanding in the D community.

The idea of BetterC is to not link to the D runtime library at all.

So, the parts of Phobos that are usable from it are the parts that:

1. are "header only", meaning all the functions are templates

2. don't use exceptions or the GC

3. don't use OOP classes (which rely on the GC and the runtime)

Not much effort has been expended cataloging this, though.
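
To make the constraints concrete, here is a minimal sketch of a BetterC program (my illustration, not from the original comment; the file name is hypothetical). No D runtime is linked, so the entry point is an extern(C) main and I/O goes through the C library rather than Phobos:

    import core.stdc.stdio : printf;

    // Compile with: dmd -betterC hello.d
    extern (C) int main()
    {
        printf("hello from -betterC\n");
        return 0;
    }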


This raises two questions, it seems to me:

1. Would it make sense to work on (for the community in general, not necessarily you personally) a header only standard library, a sane STL if you will, for betterC?

2. Is there any interest in language support for GC and runtime independent classes/interfaces? I understand this is partly doable already by abusing templates, but I imagine the results aren't necessarily pretty or concise.

Regardless, thanks a lot for the responses. I haven't managed to write much D beyond small experiments due to ecosystem and related issues, but the ability to incrementally integrate small parts of D in a C++ codebase as freely as betterC lets me is very tempting.


> Would it make sense to work on (for the community in general, not necessarily you personally) a header only standard library, a sane STL if you will, for betterC?

It's a very good question. Most of Phobos is already templates, so apart from the GC that wouldn't be much of a change. A more complex problem is working without exceptions. I don't have a ready answer for that; it's a good area for investigation.

D already has support for "COM" classes/interfaces which work in BetterC.


Weka has recently open-sourced their no-GC library Mecca: https://github.com/weka-io/mecca

Here is Shachar Shemesh's introduction:

https://www.youtube.com/watch?v=xNWRgEHxOhc&feature=youtu.be...


> 3. don't use OOP classes (which rely on the GC and the runtime)

I was not aware of this at all. Do you ever see a time when you could compile D with classes and all, with no garbage collection? I am not bothered by the GC, coming from C# and Python, but it is interesting to know where D is headed, since I know there were many efforts at making D compile without any GC.

Basically, do you think it will ever make sense for D to just compile without GC by passing a compiler flag, irrespective of which D features are used? Maybe it could tell you which features cannot be used without GC (sorry if this already is a feature; I use D but I'm not a guru yet).


The compiler flag would be -betterC. You can use classes with no GC if you're willing to allocate and initialize them manually.
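
For illustration (my sketch, not Walter's code), the usual pattern for manual class allocation uses malloc plus std.conv.emplace, so the GC heap is never touched, though the full D runtime is still linked:

    import core.stdc.stdlib : malloc, free;
    import std.conv : emplace;

    class Widget
    {
        int id;
        this(int id) { this.id = id; }
    }

    void useWidget()
    {
        enum size = __traits(classInstanceSize, Widget);
        void[] mem = malloc(size)[0 .. size];   // raw C-heap memory
        Widget w = emplace!Widget(mem, 42);     // construct in place, no GC allocation
        scope (exit) { destroy(w); free(mem.ptr); }
        // ... use w ...
    }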


Hey, thanks for the opportunity :)

Suppose I was writing systems software. Considering I know neither, why would I invest time learning D instead of C++? Does D offer anything in that regard?


D has checkable memory safety. In C++, memory safe code relies on following best practices.
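
By way of illustration (my example, not from the comment): functions marked @safe are mechanically checked by the compiler, so the operations below are compile errors rather than conventions:

    @safe int sum(const(int)[] xs)
    {
        int total;
        foreach (x; xs)
            total += x;
        // Uncommenting either line is rejected by the compiler in @safe code:
        // auto p = cast(int*) 0xDEAD;  // error: casting an integer to a pointer
        // p += 1; *p = 0;              // error: pointer arithmetic
        return total;
    }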


-fsanitize=address in C++ is pretty much checked memory safety.


This is not the case. Not sure what guarantees D's checked memory safety provides, but ASan doesn't offer protection from uninitialized reads. MSan does, though.

Plus: none of the sanitizers offer quite the same runtime protections. They're intended for use in testing to flush out bugs, not in production for avoiding bugs. As is, MSan and ASan are mutually exclusive.


_If_ your test coverage is good. I've seen it fail to catch a few bugs as well.


I think you two may be talking past each other, one speaking of "safety of checked memory access" and the other of "compile-time checking of memory safety".


On which page of ISO C++ is it described?

And on the C++ compilers that support such a language extension, how does it work with binary dependencies?


> On which page of ISO C++ is it described?

Why would it matter? AFAIK D doesn't even have an ISO standard.

> And on the C++ compilers that support such a language extension, how does it work with binary dependencies?

you at least get whole-program safety for libc calls since it works by replacing such calls through dynamic loading


> Why would it matter? AFAIK D doesn't even have an ISO standard.

Sure it does.

D is defined by the DMD compiler as the reference implementation.

Whatever DMD allows for minus implementation bugs, is what the D programming language means.

C++ is defined by an ISO standard, a document that specifies what every implementation is required to implement.

So any D developer can be sure their code is compliant as long as DMD accepts it, while a C++ developer needs to pay attention to how close their favorite compiler is to ISO C++.

> you at least get whole-program safety for libc calls since it works by replacing such calls through dynamic loading

Libc is a very tiny portion of any industrial grade C++ application, and only relevant in open source systems, as it is shipped in binary form in other platforms.


> any D developer can be sure their code is compliant as long as DMD accepts it, while a C++ developer needs to pay attention to how close their favorite compiler is to ISO C++.

That's cute, but if that's "the letter of the law" then Walter can replace DMD with emacs in the next release and it'd still be a conforming implementation, and you'd just have to rewrite your codebase in lisp to keep it functioning. All the options available to the D programmer in that case (like "keep using the previous version") are also open to C++ programmers when their compiler makes a change that breaks their code.

In practice everyone cares about stability. The C++ standard provides a signal that "this behaviour is (not) likely to change in the next release". All other features not mentioned in the standard like compiler flags and implementation details like the stack and the heap are on par with features in D when it comes to compatibility from version to version.

The C++ programmer who writes "to the implementation," not "to the standard" is in at least as stable and predictable a situation as the comparable D programmer.


>That's cute, but if that's "the letter of the law" then Walter can replace DMD with emacs in the next release and it'd still be a conforming implementation, and you'd just have to rewrite your codebase in lisp to keep it functioning.

Which is a contrived argument


It is, and I said so too -- I was responding to some nonsensical lawyerism by showing that it was ridiculous on its own terms, and then I spent the rest of the post (the bulk of the post) talking about actual effective, pragmatic ramifications of having a language standard.


Well, you don't need a standard that much if there's a single compiler, isn't that true?


Sure, I agree with that. I think it's a coordination mechanism, and it tends to make language progress more conservative. I don't know if/when it's a net win, and I certainly wouldn't expect (or want) D or Python or Ruby to do it -- I think they do a better job of managing their own growth and development.

I do think it's a bit crazy to say that it can increase uncertainty about future compatibility, though -- on that front I think standardisation has a strictly stabilising effect.


“I've never been comfortable programming in a language I hadn't written a compiler for.”

That says everything. Thanks for this ;)


Welcome back Walter! Which is your favourite D compiler? Do you ever play around with other compilers too and compare the executables?


Are there plans to add a WASM backend?


No, but anyone is free to contribute to one. It's Boost licensed, after all!


I have not yet looked, but has anyone in the community tried?

In your expert opinion, would it make more sense for a WASM backend to be part of the DUB build process or embedded into DMD itself?

I think Rust has some approaches that let you build as an addon to Cargo (the Rust equivalent of DUB), so maybe a similar path might make sense? I do know there are efforts to bake WebAssembly into Rust's compiler down the road, though.



Thanks for the links. I only ever used DMD and GDC (when I had some odd issue with DMD on some distros) but never LDC; I'm definitely checking it out. I'm glad there's a Snap package for it; that seems to be the cleaner approach to installing dev tools for me.


Back when Andrei Alexandrescu was working at Facebook, I had hoped that Facebook would adopt D as a standard language. That didn't happen, and it seems unlikely that D will get a powerful corporate sponsor. I think that would have greatly helped D not just in terms of resources, but also in terms of popularity.

In an ideal world, good programming languages would win or lose on their own merits, but that doesn't seem to be the case in the world we actually live in.


We are not yet up to Facebook power at weka.io but we fully use D and support some development of it. We have also released recently a library we use internally for user space threads and other support stuff, called mecca.

D is really nice and very effective, even for a soft real-time system like ours. Most of our code doesn't do any GC, but where we don't care too much we can make our lives a little easier.


> good programming languages would win or lose on their own merits

Well I suppose that depends on what you mean by "merits". D has struggled with things like IDE support and breaking changes. A lot of that is now cleared up or being worked on as the language has seen increased activity over the last few years. I expect remaining issues to get cleaned up in the near future.

Another thing that kept folks away was the garbage collector. Quite a lot of work is being done to make it possible to use as much of the language as possible without the GC. Nonetheless, for many potential users, fear of the GC was a reason to not use D.


You can use DasBetterC now with no GC whatsoever.


What is DasBetterC?


D as a Better C

from Walter's post: https://news.ycombinator.com/item?id=17192702


So, not the German "Das"...I was aware of -betterC, but was curious about Walter referring to it everywhere of late as DasBetterC.


I was in Muenchen when I noticed the "Das", so yes, it works as the German Das.


Subset of the D language that acts as a “better C”. No GC, manual allocation, etc., and no runtime, I believe.


Corporate support or not, Rust seems to be a better contender for the crown of the sane successor to C++.

D might have been an important evolutionary step, possibly showing what does not work well for the niche it was trying to fill.


I think Rust is the clear winner for OS-level programming. I'm not at all sure that it's the right solution for application-level programming. I think Go got two things right for the statically compiled application-level space: Fast compile times and garbage collection. Of course I feel like they got everything else wrong [1], but that's what we've got in that space right now. I think that D might be a much better fit for this space, but it doesn't matter because Go has sucked up all the oxygen.

[1] I realize that this is not a universally held opinion, but it certainly seems to be pretty widely held.


JS, Python and Ruby have zero compile times and GC; at least JS and Python 3 have a reasonable and widely understood approach to concurrency (if not true parallelism).

Kotlin has reasonable compile times, GC of course, and also a reasonable concurrency story.

Go apparently has other special features, like making a single binary and having very low-delay GC. Also, it's very simple; it's basically a pared-down Modula-2 with Oberon's methods thrown in. It also got non-fitting magic features added, like returning two values (not a tuple), or built-in generic functions (like `make(chan)`) without generics anywhere else, etc. These do not fit well, which makes many people sad.


JS, Python, and Ruby have zero ahead-of-time compile & link time. They need some time to generate byte code, to compile the code "just in time", to load source or byte code from disk, etc.


Go certainly has a very focused marketing effort.

They even have branding best practices and so on.

Of course, Google, being an advertisement company, has a lot of know how in that area.

Too bad Go is being held back by Rob Pike's ego, in my opinion.


What's so bad about Rob's ego? If you want a language by people who say yes! to everything, we already have C++ and C#.


You can have lots of expressivity without having that many features. But Go has neither, and things like no operator overloading just plain suck when dealing with vectors, for example.


How does Rust have a clear edge over Ada?


Rust's cutting edge compiler is free, for one thing. I love Ada, but Adacore may well have killed any chance it had of being a commercially viable language (its use declines precipitously each year, even in the defense and avionics industries where it used to be strong).

Plus, Rust has a thriving community whereas thanks to Adacore, Ada has a dying one. Community is critical for programming languages.

But otherwise, Ada is a brilliant and ultra safe language whose safety features go far beyond the memory and type safety that Rust features.

It's a shame.


If anything Adacore has helped Ada a lot.

It is the only Ada compiler available for free, fully updated to Ada 2012, with all the remaining ones still at their '90s-style prices.

https://www.ptc.com/en/products/developer-tools/objectada

https://www.ghs.com/products/ada_optimizing_compilers.html

https://www.ddci.com/products_score/ (frozen in Ada 95)

Thanks to them, Ada has become a regular presence at FOSDEM and is being taught at quite a few European universities.


According to Wikipedia, their 'free' compiler is GPL without a linking exception. The comparison chart on their page also says it is 'for open source GPL software'.


Apparently people keep forgetting that without the GPL, Linux would never have happened.


GPL for an OS is one thing. GPL for a compiler and runtime is entirely another, since only the latter forces any project you create with it to carry the GPL license.


Ah, you mean gcc, the compiler that everyone ignored until Sun started the trend of UNIX vendors charging for their compilers.


GPL's runtime isn't GPL, it's got the runtime exception:

https://www.gnu.org/licenses/gcc-exception-3.1.en.html


Sorry, I meant GCC's runtime isn't GPL.


No one is arguing that there is a place for the GPL.


How did Adacore kill Ada?


One technical reason is Rust has safe dynamic (de)allocation (and without a GC), whereas I believe Ada misses that large segment of the "market" (it theoretically allows a GC, but IIRC the implementations do not support that meaningfully). I could be wrong, but I don't think there's anything quite to the same degree in Ada but missing from Rust.


Ada has RAII via controlled types and allows for memory pools.

Also, Ada allows for compiler-assisted runtime allocation. For example, you can declare a data structure with the size you want, and if it fits the stack it will be allocated; otherwise you get an exception.

Deallocation C style requires the Unchecked_Deallocation package.

Ada 95 removed GC from the standard, as no compiler ever bothered to implement it.

Here is an overview presented at FOSDEM 2016.

https://archive.fosdem.org/2016/schedule/event/ada_memory/

With SPARK and Ada 2012, Ada allows for better security constraints than Rust.

"Real-Time Critical Systems: HRM Prototype & Ada Integration"

https://www.amazon.com/Real-Time-Critical-Systems-Prototype-...


Community. That’s the big answer. Even if Ada is better, Rust has mindshare.


Mmm.. lifetimes analysis?

I'm no expert in either Ada or Rust, but I _suspect_ that the type system is more expressive in Rust, allowing for more precise static constraints.

Ada, of course, has decades of prod experience, though.


Not familiar with the details of Ada memory management. It has a form of checked RAII, but no idea if it’s done via linear types. However, Ada still has a more sophisticated type system for refined types, aka sub-integer or float types. See [0] for a discussion of Rust issues.

If you combine Ada with SPARK you get some (still) amazing compile-time proof checking of projects. [1]

Rust _could_ develop to have similar abilities (I hope!). But it’d need more generalized linear types or dependent types. Though one could write macros with a targeted higher-level type-checking DSL for, say, embedded development targets.

[0]: https://github.com/rust-lang/rfcs/issues/671

[1]: https://docs.adacore.com/spark2014-docs/html/ug/en/source/ty...


Rust isn't as expressive as SPARK, which got merged back into Ada 2012.


Do you mean winner in terms of popularity or in terms of quality? In terms of popularity, it's pretty clear we are in for at least a decade of C++ & Javascript.


Thankfully Java and .NET are finally making AOT also part of the standard toolchain, instead of depending on third parties.

Including compilation to WebAssembly as target.


That's been a common thread about Java for 20 years.


I'm curious, what are your thoughts on Nim? I've used it a little, and compiling has always seemed fast. It has garbage collection on by default, and other language niceties. I feel like it fares pretty well as an application language.


Rust has Mozilla.

Support from a big corp is one way to get things like IDE support ironed out, as they can throw extra resources at it.


The IDE support has come from various places including Jetbrains.


Lots of people use RLS, and AFAIK that's created by folks working at mozilla, though no doubt now it's mostly a community effort.


IDE support came from the community as an intellij addon, which jetbrains then officially sponsored. But they didn't decide to throw their own resources at making such a thing from scratch.


JetBrains has sponsored an alternative to the RLS (Rust Language Server).

The RLS is used in VSCode, I think Atom, I’ve seen a port for Emacs, and I think Sublime.

The community effort is behind the RLS, the Jetbrains plugin for IntelliJ doesn’t use any of the Rust compiler, as far as I’m aware, but it does have a lot of fans, so they’re doing something right.

As a Java fan of IntelliJ, I still prefer the RLS+VSCode, but it’s great seeing all the IDEs being developed for Rust!


At least Facebook is currently heavily investing in Haskell (using it for some core infrastructure, and having hired some core GHC contributors).

MSFT is also contributing to Haskell by supporting Haskell-related research, and employing Simon Peyton Jones.

Jane Street Capital is also heavily investing in OCaml.

While these languages occupy a different niche than D, they are arguably more innovative and integrate way more research/PLT theory.


Netflix opensourced a neural network library that runs some critical piece of code but they are mostly a polyglot shop.

https://github.com/Netflix/vectorflow


I hear so many great things about D, but I haven't started coding in it yet. A friend of mine has gotten in there, coding up his new trading platform in it. Also virtually all his social media posts are about D, so he's keen.

Suppose I am a c++ guy who does a lot of time sensitive code. Who else has made the leap, and what were your impressions? How often do you turn up nothing on stackoverflow? How often do you find there's no lib where you'd expect to have a few in c++?


There is this guy who does VST plugins in D and sells them: https://www.auburnsounds.com/blog/2016-02-08_Making-a-Window...

"This is a touchy topic that already has filled entire blog posts. Virtually everyone in real-time audio is using C++ and it's probably still the sanest choice to make. [...]

I worked with both languages for years and felt qualified enough for the inevitable bullet point comparison. The most enabling thing is the D ecosystem and package management through DUB, which makes messing with dependencies basically a solved problem. Development seems to "flow" way more, and I tend to like the end result better in a way that is undoubtedly personal."


(author of the above quote) I'll add that shorter compile times are the gift that keeps on giving. It's easy to underrate the importance of shorter build-test cycles.


"...makes messing with dependencies basically a solved problem."

Huh. Now I'm interested.

One constant in my professional (and hobby) careers has been "DLL Hell". I'd pretty much do anything to be hassle free.

Thanks for the tip.


On that point: in the D world there is a habit of using "derelict" bindings, which are dynamic loaders for dynlibs. This has the advantage of supporting multiple dynlib versions, and also does not complicate the build with a linking option... If you are making anything cross-platform, it is a boon.


Can you please expand on that? Are there examples to look at of derelict bindings?


For example you want to depend on SDL.

You add the "derelict-sdl2" package to your DUB dependency list. DUB will download and build that "derelict-sdl2" library. This is a small library that will load function pointers from the SDL2 dynlib (aka dynamic loading).

You'll still have to distribute your cross-platform app with SDL2; however, there is no linker option, and the linker doesn't need to know about an import library. Voila, same build for all platforms.

Conversely in C and C++ you would probably only have static bindings as a choice, and it can be a pain for cross-platform builds. On Linux, it has to be packaged. On Windows, you must find the right .lib.

There can be a considerable list of libraries in linker settings, essentially because there is no package manager that makes dependencies something _composable_: dependencies then leak into linker settings across the chain. C++ forums are full of people failing to build; that's not the case with D.

Documentation: http://derelictorg.github.io/loading/loader/
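
A hedged sketch of what this looks like in code (my illustration; the loader call is the Derelict API as I recall it from the docs linked above, and it assumes "derelict-sdl2" is listed as a DUB dependency):

    import derelict.sdl2.sdl;

    void main()
    {
        // Loads the SDL2 shared library at runtime (dlopen/LoadLibrary) and
        // binds its functions to pointers; nothing SDL-related at link time.
        DerelictSDL2.load();

        SDL_Init(SDL_INIT_VIDEO);
        scope (exit) SDL_Quit();
        // ... use SDL2 as usual ...
    }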


How is this dynamic loading implemented? Is it effectively a cross-platform wrapper over LoadLibrary/GetProcAddress/etc. (and their POSIX equivalents), or is there more to it? Is it possible to make link-time optimization work with dynamic loading?


> Is it possible to make link time optimization work with dynamic loading?

No, but nor can you with regular ol' linking against shared libraries. Or against static libraries (except for some mundane stuff like function reordering). LTO requires toolchain support and occurs before code generation.


> Is it effectively a cross-platform wrapper over LoadLibrary/GetProcAddress/etc.

Yes it's just that.

> Is it possible to make link time optimization work with dynamic loading?

Not that I know of (not sure).


I've evaluated D for games. It's good in terms of giving you a palette of data modelling options from "I don't care yet" to "I need to track every byte", and having the support of multiple compiler implementations gives it a trustworthy "not going to disappear" feeling. You really don't have to lean on the GC given all the other options for memory management, but it is the canonical default and is likely to stay that way unless everyone keeps piling on the -betterC train.

However - it's another big language with a lot of accumulated history to it and a lot of stuff that isn't immediately needed in the day-to-day. That makes it intimidating to learn and more geared towards "megacorp" codebases when I really want something small and suitable for hobby/education projects that go near the metal. So I've also searched for a dedicated "better C", which led me to Zig. It's still in its early stages, so it doesn't face the tightrope act of D development. And the language is uncomplicated, polishing up pretty much every wart and major source of error that's in C (even manual memory stuff - not by eliminating the footguns, but by making safe usage idioms easy), while playing really well with existing C code.


> How often do you turn up nothing on stackoverflow?

D's forum/mailing list was established well before SO was a thing, so the answer is "pretty often", but with the addition "you don't need SO because you can talk directly to the D developers". I've not seen another major language where the core developers, and even creators of the language, were so available.

https://forum.dlang.org/

Doesn't even require registration.


Time-sensitive code can be written fully in D. Avoiding the GC is trivial if you have sensible expectations: the standard library (Phobos) largely requires it, although it should be said that the D garbage collector currently only runs when explicitly called to allocate memory, so it's not as pervasive as, say, Java's.

Calling C++ directly is very much an option, although interlingual exception handling is a WIP.
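
To illustrate (my example, not from the comment): D also makes "no GC allocation" a checkable property via the @nogc attribute, rather than a convention:

    @nogc nothrow int sumOfSquares(const(int)[] xs)
    {
        int total;
        foreach (x; xs)
            total += x * x;
        // auto copy = xs.dup;  // error: .dup allocates with the GC, rejected under @nogc
        return total;
    }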


GCs that somehow run in the background and are not primarily triggered by allocation are quite rare; IIRC all GCs shipped with the JVM are also only triggered by allocation.

In fact, the only two production cases of primarily background GCs that I can think of at the moment are some Lisp Machines (IIRC LMI) with HW-assisted incremental copying GC, and Azul's GC, which uses quite a similar approach.


Java also doesn't run collection unless there's interaction with the GC. This is why people put a lot of effort into avoiding allocation in inner loops via primitives and object pools, because doing so ensures that the GC won't cause problems.


FWIW, I only played around with D briefly (I was not unhappy with D, but I prefer Go for its simplicity), but the community on the D Forums[0] is awesome - very welcoming to newbies and very helpful, too.

[0] https://forum.dlang.org/


This is sort of a sideways answer... I wrote some D code for a bit and I enjoyed it. The FFI story is solid and that made interop nice. That said, GC may be a concern for timing sensitive code.

That said, since Rust hit 1.0 and the language got cleaned up, I’ve stopped writing D at all. Not plugging Rust, just the facts. Better language for my space.


You can link to C++ code in D (you do need to declare function signatures on the D side, etc.), but overall C++ compatibility is good, so it's less of a problem than you'd expect.
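
A hedged sketch of what "declaring the signature on the D side" means (my illustration; the function and file names are hypothetical):

    // The matching definition lives in a C++ file compiled and linked alongside:
    //
    //     // shapes.cpp
    //     double areaOfCircle(double radius) { return 3.14159265 * radius * radius; }
    //
    extern (C++) double areaOfCircle(double radius);

    void main()
    {
        import std.stdio : writeln;
        writeln(areaOfCircle(2.0));  // mangled and called with the C++ ABI
    }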


Yeah but then I need a compelling reason to glue those things together in a new language?


The reason is that you are more comfortable or efficient in the new language, but want to access legacy.

Having to recreate everything from scratch is a massive effort; however, gradually refactoring an application and having access to huge amounts of third-party libs is an enabler (while all those libs will use non-idiomatic patterns, so it has to be contained).


If you want checkable memory safety, for example.


Be aware this may ruin you as a C++ programmer, since you'll hardly want to use C++ after. That is pretty common.


D is a great language. It simply lacks the hype of other languages in the same space. Otherwise I find little to complain about as a casual user.


That's been my experience as well. I haven't done a great deal of work with it, but the work I did felt very nice. It's a refreshing improvement upon older, flawed languages...but doesn't seem to have the evangelical community that, say, Rust has behind it.

I wonder why that is? Maybe it just needs a cooler name (:


The Dust programming language? Joke aside, I fear "D" came before Search Engine product names were considered. Sadly sometimes I google "dlang" and still get some odd results.


D is one of the few languages that allows named scoped imports.


It's tricky to do an apples-to-apples comparison. Imports in D conflate two distinct concepts (in a helpful way): declaring a module dependency, and introducing names into the local namespace. Not all languages work the same, though. Ocaml, for example, has great support for namespace updates (`let module M = TheModule in ...` and `let open TheModule in ...` and `module MyModule = struct open TheModule ...`). But they are slightly less significant in Ocaml, because any module that has been linked into the program is available everywhere in the program: you can just write `let result = TheModule.call_something () in ...`, anywhere you like, with no "imports". Linkages (dependencies) are handled externally at the tooling level.

It's certainly convenient that an `import` inside a D function body can tackle both issues at once.


What are those?

I don't know D, and I had a quick Google, but couldn't find anything by exactly that name - apologies for the lazy question.


    void foo() { import std.stdio; writeln("sup"); }
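
A hedged expansion of the snippet above (my example; the names are illustrative): the import can also be renamed or restricted to specific symbols, all local to the function:

    void bar()
    {
        import io = std.stdio;            // renamed import, visible only inside bar
        import std.algorithm : map, sum;  // selective import, also function-local
        io.writeln([1, 2, 3].map!(x => x * 2).sum);
    }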


I think that was a thing that Alex highlighted as an unexpectedly great thing in the language in one of his D talks (on mobile, can't find it right now) because it enabled all kinds of neat organisational tricks


You can do that in Ocaml too, let open Foo in expr.


Are you referring to https://dlang.org/spec/module.html#scoped_imports ?

If so, I believe imports in Python and Rust work similarly, and, say, JS's 'require' seems to too.


Python can also import only some names from a module:

    from module import name, name, name
and can also run a module import under control of a try/except block, so you can trap the ImportError and do something about it. A typical use is to try to import the C version (for performance) of a module, and if that fails, fall back to the pure Python version of the same module; e.g. cElementTree and ElementTree:

    try:
        import cElementTree
    except ImportError:
        import ElementTree


Sorry, mistake in the last snippet above, it should really be:

    try:
        import cElementTree as ElementTree
    except ImportError:
        import ElementTree
because otherwise, you have to know which import succeeded, because that changes the name by which you reference the module and the items in it.


Scala is the only other one I'm aware of.


Take note: "Who the hell do you think you are thinking you can write a C compiler?" If you take that challenge it will seriously elevate your computer programming skills. In some circles, this will get you immediately hired.

Try a simple language, maybe devise an emulator to go along with it.


I'm taking this path.

So far I have

The emulator -> https://github.com/Lerc/kwak-8

An assembler -> https://github.com/Lerc/AvrAsm

A testbed playground -> http://fingswotidun.com/avr/AvrAsm/Testbed/

In hindsight, attempting an assembler before a full compiler is a really good idea. It lets you encounter a lot of the pitfalls before you hit them in a much more difficult environment.


It'll also help you when your language needs a built in assembler!

https://dlang.org/spec/iasm.html


One day I decided to write a simple lexer-parser. It's really not the black magic it's commonly made out to be. You read in one character at a time to build "tokens" (e.g. one-to-n character sequences meaning something like "assign", "equals", "add"), and then you use recursive methods to return data structures built out of the tokens. Honestly, the hardest part is remembering mathematical precedence, which doesn't even have much to do with parsing per se.

Of course, a compiler backend is a much deeper concept, getting into things like register colouring and such. If you were to write a simple stack-based VM, it'd be much simpler. I've thought of doing exactly this for embedded systems projects where all I have is Assembly, or a very bad C compiler.
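
For the curious, here is a hedged sketch (my code, not the commenter's) of the recursive-method idea: a tiny recursive-descent parser/evaluator for integer expressions, where precedence falls out of the call structure expr -> term -> factor:

    import std.ascii : isDigit, isWhite;

    struct Parser
    {
        string src;
        size_t pos;

        char peek() { return pos < src.length ? src[pos] : '\0'; }
        void skipWhite() { while (isWhite(peek())) pos++; }

        // factor := number | '(' expr ')'
        long factor()
        {
            skipWhite();
            if (peek() == '(')
            {
                pos++;                    // consume '('
                immutable v = expr();
                skipWhite();
                pos++;                    // consume ')'
                return v;
            }
            long v = 0;
            while (isDigit(peek()))
            {
                v = v * 10 + (peek() - '0');
                pos++;
            }
            return v;
        }

        // term := factor (('*' | '/') factor)*
        long term()
        {
            auto v = factor();
            for (;;)
            {
                skipWhite();
                if (peek() == '*') { pos++; v *= factor(); }
                else if (peek() == '/') { pos++; v /= factor(); }
                else return v;
            }
        }

        // expr := term (('+' | '-') term)*
        long expr()
        {
            auto v = term();
            for (;;)
            {
                skipWhite();
                if (peek() == '+') { pos++; v += term(); }
                else if (peek() == '-') { pos++; v -= term(); }
                else return v;
            }
        }
    }

    unittest
    {
        assert(Parser("1 + 2 * (3 + 4)").expr() == 15);  // * binds tighter than +
    }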


Lexing/parsing is the easiest part of writing a compiler, by far. Even writing the C preprocessor is much, much harder.


Yes, that was my experience on the nand2tetris course, which teaches that as project 10 and 11:

http://nand2tetris.org/course.php

Interestingly, the toy language you’re writing a compiler for doesn’t have operator precedence except for parentheses, so it sidesteps that problem.


A recommended article for those who want to know more about how to parse expressions with precedence very simply and easily: https://www.engr.mun.ca/~theo/Misc/exp_parsing.htm


Bisqwit on Youtube has a nice series called "How to Create a Compiler": https://youtu.be/eF9qWbuQLuw


Annoying thing I learned from doing that: source code does not follow a recursive structure. Preprocessor directives allow for some really strange structures. Of course, it only caused me problems in a single file at work. I think a custom stack I can read through when the parsing gets weird makes slightly more sense in retrospect.


Never wrote a C parser, but I don't see why the preprocessor would change it from being recursive as long as C is; after preprocessing finishes, you should just have normal C?


David Beazley gave a talk on the exact topic a couple of weeks ago in PyCon. https://www.youtube.com/watch?v=zJ9z6Ge-vXs


I was at that talk. Beazley is an entertaining speaker (always has been), but frankly I was surprised how many people were oohing-and-aahing over the material as if it was new. A lot of the stuff covered can be found in any intro to compilers course.

Also Ply has been surpassed for many years by better parsing tools in Python. Sly, which was demonstrated, doesn't seem to be production grade yet.


>A lot of the stuff covered can be found in any intro to compilers course.

So really, you're surprised that not many people have taken an intro to compilers course?


Upon reflection, I think my attitude was wrong. I really oughtn't to have been surprised. There were many developers in the crowd who may not have had a computer science education (developers today, especially web developers, often come from a multitude of disciplines). Sure, there were a ton of CS people, but were also a ton of non-CS people (Math, EE, psych, design, etc.). If the latter group managed to learn something new and interesting from the talk, I ought to be happy for them.

I think my reaction was a knee-jerk reaction against how easily people become impressed by things these days. It's ok to be encouraging towards others but in an atmosphere of overly-positive attitudes (especially in SV), there is a certain lack of discernment with respect to ideas and technologies which leads to bandwagonning and susceptibility to marketing (case in point, the early adopters of MongoDB). I'm convinced this has a cost to the tech community.

But that is no excuse for me to have a holier-than-thou attitude. It's condescending and discouraging toward people who are just trying to learn [1]. I retract my words.

[1] https://news.ycombinator.com/item?id=14225033


Yes. This was at PyCon. A substantial number of people there have computer science degrees. A compilers course is part of most standard CS curricula.


It was a required course in my BSCS program, and I didn't even go to a big name tech school. So, yes, I'd expect someone with a CS degree to have written a simple compiler.


Do you expect most programmers to have CS degrees?


> “In some circles, this will get you immediately hired.”

Is this true though? I mean Google famously didn’t give a shit about hiring Max Howell (creator of homebrew) because of hazing-style pointless whiteboard trivia [0].

I’d guess the number of circles where a thoughtful side project that digs into deep computer science fundamentals as well as difficult implementation specifics would even get you the slightest attention for an interview is close enough to zero as to render it meaningless.

If you think about a lot of possible different self-study options, ranging from cramming and shallow memorization of leetcode garbage or Cracking the Coding Interview, all the way to patiently applying craftsmanship and self-learning to a side project that exercises and demonstrates core fundamentals in a pragmatic way, the shallow memorization garbage will earn you money, while the side project will earn you an email with a hyperlink to HackerRank.

It is one of the worst aspects of our industry, and a major reason why I would advise younger people to treat the idea that there are direct, stable, good-paying jobs with job security in the tech industry with a lot of skepticism. It’s an industry constantly trying to invent new ways to shift the demographics more and more into the least experienced and least expensive quantiles of candidates, and to arrange businesses where actual software labor productivity doesn’t have a strong tie to the company’s bottom line, except maybe for a tiny population of experts.

[0]: < https://www.quora.com/Whats-the-logic-behind-Google-rejectin... >


To your first point, the answer is yes, because I hired the kid. And it wasn't for a compiler job.

And I would do it again in a heartbeat.


Max Howell created a distro, not a compiler, so that's not relevant to parent's comment. Howell said that he couldn't do core computer science things (like compilers, and NOT whiteboard trivia), that he was better at cobbling together scripts that solve user problems well. Maybe Google was wrong to pass him over, but if they were it wasn't because "they didn't care that he could write a compiler".

As Howell wrote: "I am often a dick, I am often difficult, I often don’t know computer science, but. BUT. I make really good things, maybe they aren't perfect, but people really like them".


My feeling is that your reply is excessively focused on the specific example of a compiler from the parent comment. Replies don't need to be specifically focused on compiler projects to be relevant to the general comment about how a side project like a compiler project could get someone hired.

My belief, which it seems you do not share, is that a project like homebrew is perfectly analogous to a project like a self-made compiler. They illustrate broader-scale systems thinking; although the specific computer science fundamentals will be different for each of those two problems, a system like homebrew likely also has to have more focus on deployment, user interface, and project management tasks (which, I would argue, makes it more relevant for most hiring situations).

I can't comment on whether Howell's self-deprecation was meant to be tongue-in-cheek criticism of Google or if he sincerely meant it, and of course those interpersonal reasons could have been at fault for a failed interview (although it doesn't seem that anyone disputes that it was actually just a straightforward result of some binary tree trivia).


No, the point is that compilers are a different sort of beast.

Writing a C compiler is not like other side-projects. It explicitly exercises a whole bunch of basic Computer Science skills, so if you can write a compiler, you obviously got a good grasp on those.

So, no, a project like homebrew is not perfectly analogous to a compiler. The hardest CS problem in there in my opinion is the dependency resolution, which can be a bitch, but it just doesn't have the breadth of problems that a compiler has.


> Is this true though? I mean Google famously didn’t give a shit about hiring Max Howell (creator of homebrew) because of hazing-style pointless whiteboard trivia [0].

Creating a package manager, which is mostly "gluing things together without understanding how they work", is a very different skillset than being able to work from first principles. I suspect Google was looking more for the latter.

Speaking as someone who has written pieces of compilers, emulators, assemblers, and related software in my spare time, I can say that even mentioning those things casually is enough to get some "you did what?" looks.


The one topic from “hard” computer science that applies to package management is graph theory. Inverting a binary tree is one of the simplest operations you can do on a graph. Google may have been interested in Howell because of Homebrew, but not understanding this simple algorithm probably showed them that he didn’t have the skills they originally thought his project exemplified.


In my experience, you are quite right.

Most HR departments will only care about the latest set of projects and education, barely giving any value to side projects.

Yes, there are those that care about side projects, passion and such, but most of them tend to be startups and sometimes not with the desired set of benefits or requiring relocation.


HR people often know little about programming, and will just compare bullet lists of requirements with the resume bullet points. People with unusual skills don't get hired through the HR front door, they come in the side door.


This is so true. I got my first (and current) programming job cause someone I knew at college recommended me. I didn't talk to HR till after they decided to hire me. So networking is important. Keep in touch with developers you meet at different jobs. You never know when they might land you your next job. It can be easier to skip HR and go straight to the person who will actually be your boss (directly or indirectly) for the hiring process imho.

Edit: wrote "talk to me" instead of "hire me"


To whatever extent you can get hiring decisions moved out of HR, I would recommend doing it.


True, though the problem extends to developers and developer managers as well. I've seen so many colleagues say stuff when discussing a candidate like "Well, they just couldn't write <x> algorithm on the board correctly. I found an edge case" or whatever. I've actually not been offered interviews in part because of failed whiteboard coding questions where the developers just weren't impressed. Yet once I land a job I usually tend to be in the top 25%, because most of the requisite skills are not algorithms, which I'm not particularly practiced at ...


Google bureaucracy required Ken Thompson to pass a C language exam.

https://www.theregister.co.uk/2010/04/21/ken_thompson_take_o...


C exam aside, I don't think he would pass a regular Big G interview unless he'd allocate some 6 months for prep, which I doubt he did.


Considering that he is not one of the young hackers starting with JavaScript, but started in a time when looking at raw memory dumps was normal operation, memory was scarce, and CPUs were slow, I assume he knows algorithms and data structures by heart. Most likely he implemented more binary trees in his life than I thought about, and I assume he would consider a question about inverting a binary tree more of a joke than a serious question.


> I assume he would consider a question about inverting a binary tree more of a joke

If you refer to this:

https://www.quora.com/When-do-you-need-to-invert-a-binary-tr...

I also don't know why anybody would ever need to implement "inverting" a binary tree except for the Google exam.

> Most likely he implemented more binary trees in his life

"Most likely"? Actually he "just" invented (not a complete list):

- Unix (with Ritchie) (in assembly first)

- the language B, based on which Ritchie developed C.

- the notation for the regular expressions everybody uses today

- the algorithm for fast regular expression handling

- the chess computer that won the 3rd World Computer Chess Championship 1980


Agree with everything except for B.

He implemented a BCPL dialect, which he called B; in fact, if you compare the B and BCPL programming manuals, they are quite similar.

https://www.bell-labs.com/usr/dmr/www/bcpl.html

https://www.talisman.org/b-manual.html


I don’t understand which detail of my statements you disagree with.


His B "invention" in reality was closer to copying and dropping a few features from BCPL than actually inventing something new.


Well, it can be argued that most of the “structured” languages were “dropping something” from Algol. Or later from Ada... yes, nothing fundamentally original happens often. But people give names to the languages they make... even “dropping” is something “creative”, often worth naming. Apparently Unix kept only a minimum of Multics. Sometimes exactly the “bigger” variants aren’t fit to survive.


> > Most likely he implemented more binary trees in his life

> "Most likely"? Actually he "just" invented (not a complete list):

Please mind the full quote from me: "... than I thought about" -- I thought about (and implemented) trees quite a bit myself, and grant him a few factors on top of that.


Someone who knows how to program and is at least superficially aware of different data structures and algorithms would not need six months of preparation to pass the interview. That's very exaggerated and it might discourage people who would enjoy working at Google from applying.

I know I thought it would be a lot harder than it was. I spent some hours checking example interview questions, and it was definitely enough. If anything, I might have been too focused on thinking in terms of established algorithms, and too little focused on reasoning on my own.


> That's very exaggerated and it might discourage people who would enjoy working at Google from applying.

But it's irrelevant to what I wanted to point to: that somebody already accepted Ken Thompson to Google. Then some absurd bureaucracy decided that Ken Thompson has to pass the exams given to the young beginners. Probably reasoning something like somebody here "explained" "so that developers can be moved around."

Whoever is not aware of these "details", please see my other post for what Ken Thompson invented before coming to Google. Such "minor things" as the Unix OS, the language that preceded and inspired C, and the currently used kind of regular expressions.


I came mostly unprepared and passed just fine. In fact they extended it with 2 hours more of "followup". And I’m no superhuman. It may depend on the part of Google you apply to; plus it was the compiler team and I love compilers ;)


How would one apply to such a specific part of Google, e.g. the compilers team?


It's worth noting that this is not a simple C language exam, but a Google C style exam. The purpose is that all Googlers write code that is as similar as possible, so that developers can be moved around and find a common approach everywhere. Also this simplifies code reviews for the same reason.


> so that developers can be moved around and find a common approach everywhere. Also this simplifies code reviews

Would you agree with me that if you brought Ken Thompson to your company for that (or used that excuse to require him to "pass the C exam"), something is fundamentally wrong?

He actually co-invented Unix and worked on the "first C" before C was even called C.

https://en.wikipedia.org/wiki/B_(programming_language)

It's like hiring J. K. Rowling and then requiring her to pass a Harry Potter trivia quiz in order to allow her to write text messages.


1) Since (by the time of the article) he didn't take the test, he wasn't hired for writing C code.

2) C can be written in many styles; whatever he writes should be an easy uptake for later developers, thus adhering to the common style.

I have no doubts that he can express anything in C, and I assume this would be of good quality, but he might take a different approach, thus making it harder for the junior who is trained heavily in Google style and has to fix a bug or add a feature a bit later.


Attitude is important. If you have a solid codebase that you created, insist on being interviewed about that codebase.

That screens out a few employers, but not nearly as many as you think. Most are desperately looking for a way to determine that a candidate is good. Sitting back while they explain in increasing detail how their pet project works is close to ideal here.


Google apparently is not an element in the set 'some circles'.


saved the best reply for last. full circle. zing!


> I mean Google famously didn’t give a shit about hiring Max Howell (creator of homebrew) because of hazing-style pointless whiteboard trivia [0].

People usually omit the part that Homebrew was at that point a quite shitty reiteration of ideas well-implemented in many other places (RPM/Yum, DEB/APT, and half a dozen other package managers). It was a common complaint that installing one package could break several others, because Homebrew updated some common dependency and the new one was missing symbols or had otherwise incompatible ABI. There also was this hilarious bug report that Homebrew served everything over plain HTTP (and Homebrew had no digital signatures whatsoever), and the ticket got closed by a Homebrew team member saying it's not a problem, which spoke tons about their competency in package management.

I don't know how the situation looks now, but when Howell was interviewed by Google, Homebrew was not something you could be proud of writing.


>Homebrew was not something you could be proud of writing.

Anyone should be proud of writing a widely used tool that provides value to so many individuals and companies, regardless of nitpicks from the peanut gallery.


Nitpicks? Glaring security issues that are design “features” and thus will effectively never be fixed... sure, those are the same thing.


> "People usually omit the part that Homebrew was at that point a quite shitty reiteration of ideas"

I realize we might just disagree in our assessments, but I think this is an unfair characterization. At that point in time, homebrew was basically the only way I could get cross-platform support to work for several image processing projects that my lab was working on. To us, despite its obvious rough-around-the-edges needs for incremental improvement over time, it was a complete lifesaver.

I think this speaks to a larger point too about the way that engineers cut down and compete with other engineers in petty ways, instead of acknowledging that getting a complex working system up and running that adds value for people is a huge accomplishment that utterly dwarfs any sort of algorithm trivia in terms of whether a candidate is worth hiring, especially in a company like Google.

The perspective that,

> "when Howell was interviewed by Google, Homebrew was not something you could be proud of writing."

is just so incompatible with my world view about what counts in software engineering that it likely means we are coming from perspectives so far apart that there's little hope we could agree.


Success of Homebrew was based on marketing, not on the tool being well designed on technical level (because it was not designed well) or its authors' technical competence (because they haven't displayed it much). Google was hiring for a technical position based on technical prowess, not on marketing one (maybe the skills tested were unrelated to the job, but they were still technical), but people pulling out the Google vs. Homebrew case mistake one for the other, which certainly needs correction.

> [...] likely [...] we are coming from perspectives so far apart that there's little hope we could agree.

You see, I come from Linux administration field. We have package managers deployed in the field for two decades now. These package managers usually don't silently break your library dependencies on update (though Fedora and desktop Ubuntu break them with a lot of noise) and have integrity ensured cryptographically. I take these features almost for granted, as I work with them every day for a long time. Any package manager that provides anything less than that is, in my view, badly designed, especially because there are many examples of how it should be done, it just takes some effort to learn them.

On the other hand, you, coming from the macOS angle, didn't have prior experience of working with many package managers -- or at least that's my guess. If I am right, it's quite clear why a poorly designed but popular package manager (i.e. one that has many packages available already) was a huge improvement for you. It was the same for Debian's APT twenty years ago, but APT had pretty much no prior art and the internet was not the virus/worm/spyware-driven machine it is nowadays.


>Success of Homebrew was based on marketing

No, as a user of every previous package manager for macOS, I can tell you that Homebrew was successful because it actually worked, and worked really well, and it kept working even for prerelease versions of macOS.

This doesn’t mean that it’s well designed or secure. But it’s not a marketing cream puff.


Seconded. I like and use homebrew because it works better and causes much less hassle than other Mac package managers. (Before homebrew I used fink.)


> On the other hand, you, coming from macOS angle, didn't have a prior experience of working with many package managers -- or at least that's my guess.

There were package managers before Homebrew, such as MacPorts and Fink, which went as far as to use APT.


> "On the other hand, you, coming from macOS angle, didn't have a prior experience of working with many package managers"

Just to clarify this: I personally have always used Debian and Ubuntu, and spent a lot of time with APT (including helping to build a PPA for wrapping a custom wrapper for NetworkManager for my company because their third-party VPN solution prevented all of us who use solely Linux from being able to work remotely for a while).

Despite my personal views that Linux provides me with better tooling, many of my colleagues don't share that view, and they like to use Mac. At times, homebrew has given us a super easy way to solve some problems that arise from that situation.

This doesn't mean it's perfect. Just that it solves specific problems that a lot of users have in a way where they are not adversely affected by its downsides. Every tool out there will have downsides, and a subset of the population who hates the tool because of those downsides.

It just strikes me as incredibly myopic to privilege your own experience in system administration as oh so much more enlightened than that of a person managing the overall problem of a package management tool that users are actually using with success.

It's just tone deaf to me to say, "but for these engineering reasons that I care about, I discredit the entire achievement of the project because the author didn't design it the way I want."

And even if it's not merely "the way you want" but includes broader and more established criteria from history of package management solutions, it's still nonsense to use that to dismiss it.

It's a lot like the shortsighted dismissals of early MongoDB, which went after very specific customer use cases at the expense of omitting big design considerations that the history of database engineering had come to accept as standard.

MongoDB used this to build a customer base who was happy with what they engineered. And then began investing in going back to fix the things that were glaringly suboptimal from historical database perspective.

Lots of people wallowed in their constant criticism of MongoDB, turning their noses up at it. Yet now it's a public company, releasing lots of features that bring it into alignment with those earlier best practices that it was forced to omit for the sake of addressing shorter term workflow needs of its customers.

To me, this is effective engineering. Sticking my nose up at MongoDB because of its earlier choices to omit accepted database designs would be a silly way to look at it.

I'm not saying that example maps perfectly to homebrew, but it is similar in spirit.

It's one thing to say, "I don't like that homebrew prioritized XYZ for its users short term experience instead of addressing big, underlying, sysadmin design considerations ABC."

It's totally different to say that this choice entirely discredits the project.


> It just strikes me as incredibly myopic to privilege your own experience in system administration as oh so much more enlightened than a person manage the overall problem of a package management tool that users are actually using with success.

Users use with success plenty of ill-suited tools for various purposes. It doesn't make the tools any better, merely more widely used. There are many examples of better products dying, while worse products take over the market.

My experience with system administration gives me better standing ground for assessing technical aspects of package managers. These technical aspects have little to do with success of any particular package manager, though, which I find unfortunate.

> To me, this [MongoDB; Homebrew too?] is effective engineering.

Effective marketing. Engineering, not quite, especially with MongoDB's track record of losing real data because of operations tuned for benchmarks.

> I'm not saying that example maps perfectly to homebrew, but it is similar in spirit.

It is a similar case indeed. Especially in how marketing was much more important for adoption than technical grounds.


> was at that point

Still is afaik.

It’s hard to take a package manager seriously when it refuses to run as root and installs system-wide tools as user-owned.


And as an example of someone who took this particular bull by the horns: https://news.ycombinator.com/item?id=16900938



For anyone new to and interested in D, here are a handful of command-line utilities and other small programs written in D (on my blog), which may help give an idea about (just some of) the kinds of things that D can do, and whet your appetite for more:

https://jugad2.blogspot.com/search/label/DLang

A few of the posts in that sub-feed are about D videos and confs, so you can skip past those if you want. Some of the videos are interesting too, though.


> The path that led Walter Bright to write a language, now among the top 20 most used, began with curiosity — and an insult.

Is D really a top 20 language? I don't remember seeing it anywhere close to the top 20.

https://www.tiobe.com/tiobe-index/ has it at 31.


It was at #12 in March 2009: https://www.tiobe.com/tiobe-index/d/


So it's getting less popular over time (assuming the ranking is accurate)? That's not very felicitous.


Isn't this based on Google searches? How can they distinguish real requests for "D"?


And Visual Basic is above JavaScript! You can forget Tiobe.


Exactly, just read the first line of the article and closed it.


Should have read the publish date just above the first line ;P


Dr. Dobb's looks to be pretty much dead anyway?


It is dead; their last article was quite a long time ago.

They just decided to keep the website running.


That's nice for those who haven't stumbled upon it yet!


D is the language I really like and always intend to use for my next project, and I have been saying this for the last 5 years. Every time, there is something that makes me decide to use C or C++ just one more time. I don't even think it's mostly for valid reasons.

Well, I'm just about to restart an OpenGL project, so I think I'll go wholeheartedly into D this time.


Just so you know, when I visited the article from my iPad I was prompted to download one of those scam iPad cleaner apps.

It’s a 10” iPad Pro (maybe you can check the log and see which advertiser is being naughty).



