Go’s hidden pragmas (cheney.net)
221 points by spacey on Jan 8, 2018 | 115 comments


For once, I'm gonna be the one sticking up for Go here. :) Pragmas or annotations are kind of unavoidable, and I don't think that it was a mistake to include them. I wouldn't have used comment syntax, but whatever; that's a minor quibble.

Actually, I wish Rust had done one thing that Go did: namespacing the pragmas. That would have been a more future-proof thing to do, because macros effectively give us user-definable pragmas, and once you have user-definable pragmas you have name collision problems. Kudos to the Go team for being forward-thinking there. I suspect we'll have to figure out some kind of solution here, but it won't be as pleasant as if we had just used namespaces in the first place.


> Actually, I wish Rust had done one thing that Go did: namespacing the pragmas.

Yeah, of all the papercuts that the Rust 1.0 macros system had, the idiosyncratic modularization/namespacing rules were the most unfortunate. Happily there's already an accepted RFC for the design of macro modularization (https://github.com/rust-lang/rfcs/blob/master/text/1561-macr...) that simply makes all macros operate under the same namespacing rules as all other items, and it looks to be mostly implemented as well (though it won't hit stable Rust until the larger macros 2.0 initiative is finished). And as for future-proofing, I'm not too concerned: all the "standard" macros can be exported from the stdlib prelude without a problem, and any libs that update to macros 2.0 can easily export the updated macros just like any other public item and consumers can update their code with only a single `use` declaration (it's not like the old macros system doesn't require explicit importing anyway, it's just unique and janky). Very much looking forward to the simplicity and consistency of the new system.


cool, I didn't realize this applied to attribute-style macro application too.


This does sound like a really good idea. Have you made an RFC or proposal to the team to see if it is possible?


As I mention in a sibling comment, the RFC to make macros work with the module system like any other item has not only been accepted, but mostly implemented. :)


You've missed the chance to include namespaces on the standard set, but is it too late to reserve un-namespaced annotations for official use?


It's not too late. The Rust standard library has what's called the "prelude", which is a set of items (functions, types, traits) that are imported by default into every Rust program. So for example the complete Rust program `fn main() { let x = String::new(); }` works despite the fact that at no point did we import any type named `String`; this is because Rust programs implicitly link the stdlib by default, and the stdlib publicly exports the items in the prelude (https://github.com/rust-lang/rust/blob/master/src/libstd/pre...).

So in the future when macros are namespaced just like every other item (which is actually already designed and largely implemented, see my other comments here for links), all that needs to happen is to export the newly-transitioned macros from the prelude and all will continue to work as usual. As for potential collisions with third-party macros, the old system already requires anyone who wants to import macros to stick a hacky "macro_use" pragma on their import statement, and old-style macros are specified to shadow rather than collide, so there will be no need to be cautious with updating the stdlib. Third-party libs will be free to update to "macros 2.0" at their leisure (though the need to have users explicitly import macros will require those libraries to issue a breaking change when they do so), and old-style macros will be supported for quite a while, though eventually they will be deprecated and discouraged (and presumably removed in some future epoch).


Poor Rob Pike. He always tries to make things simple, and over time entropy always does its thing. You can hear his frustration in his cited comment.


Yeah, but Rob Pike's idea of simplicity is him personally not having to implement things. If everyone else in the world has to implement the same thing a thousand times a day, he still thinks his thing is simple.


I think this is confusing simple (as in simplex) with facile.

Simplicity implies a small set of features; it's a property of design.

Facility implies ease of use; it's a property of operation.

Go unambiguously favors simplicity, at the expense of facility. This is why we're given raw CSP primitives (sharp edges included, e.g.: goroutine leaks) instead of a full-fledged actor model.
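
To make those sharp edges concrete, here's a minimal sketch of a goroutine leak; `compute` is a hypothetical slow producer, and `time`/`errors` are assumed imported:

    // A classic leak: if the timeout fires, nothing ever receives from ch,
    // so the spawned goroutine blocks on its send forever.
    func firstResult(timeout time.Duration) (int, error) {
        ch := make(chan int) // unbuffered
        go func() {
            ch <- compute() // compute() stands in for any slow producer
        }()
        select {
        case v := <-ch:
            return v, nil
        case <-time.After(timeout):
            return 0, errors.New("timed out") // the goroutine above is now leaked
        }
    }

Making the channel buffered with make(chan int, 1) would let the sender finish and exit; the sharp edge is that nothing in the language forces you to remember that.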


What are the differences between raw CSP primitives and a full-fledged actor model?


In an Actor model, you can explicitly send messages to actors (send), monitor their state (executing, failed, completed), and take actions against them (kill, spawn).

A fallout from this is that you can have one actor supervise another, to protect your program from errors: https://hexdocs.pm/elixir/Supervisor.html

The Go runtime provides only one of those actions: spawn (the `go` statement). It’s up to the author to create communication channels, and it’s impossible to query the status of a goroutine. This means you have no explicit control over goroutines, so you can’t kill misbehaving ones like you could in another language.


I feel raw channels provide more possibilities than the Actor model. In fact, you can build such an Actor model by wrapping CSP channels. It is not too hard.
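
For illustration, a minimal sketch of wrapping channels into something actor-like (the `Actor`/`Spawn`/`Send` names are invented for this example):

    // An "actor" here is just a goroutine that owns a mailbox channel
    // and processes one message at a time.
    type Actor struct {
        mailbox chan func()
        done    chan struct{}
    }

    func Spawn() *Actor {
        a := &Actor{mailbox: make(chan func(), 64), done: make(chan struct{})}
        go func() {
            defer close(a.done)
            for msg := range a.mailbox {
                msg() // messages are processed sequentially
            }
        }()
        return a
    }

    func (a *Actor) Send(msg func()) { a.mailbox <- msg }

    // Stop closes the mailbox and waits for pending messages to drain.
    func (a *Actor) Stop() { close(a.mailbox); <-a.done }

Note the mailbox here is bounded, which is what the sibling objection about CSP being bounded/synchronous is getting at.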


Yes, I think this is precisely the idea.

Anecdotally, 99% of the complaints I read WRT Go can be solved by using libraries that already exist. If one wants a batteries-included-in-the-stdlib form of facility, one should look at languages like Python.


Yes, one principle of Go's design is to use as few syntactic constructs/concepts as possible to support as many use cases/features as possible.


The Actor model cannot be implemented by wrapping CSP, because CSP is bounded and synchronous, while the Actor model is unbounded and asynchronous. But you can implement CSP on top of the Actor model.


In Go I can write to a channel that nothing will ever read, and I can attempt to read from a channel to which nothing will ever write.

In a pure actor model raw channels wouldn't be directly available and I would be required to push to and read from a concrete goroutine.


I wish I could upvote this comment a dozen times.

Much of go’s “simplicity” is a Faustian bargain that comes at the cost of unnecessary complexity in each and every project that winds up being written with it.


This is called a trade-off. The real world is never perfect.


Okay, but a trade-off that forces complexity into hundreds of thousands of programs to avoid complexity in one is poorly-conceived.


Care to illustrate your point with an example? I’m wondering what kind of “unnecessary complexity” you’re talking about.


I mean, generics are kind of go's whipping boy. Lacking generics means you end up with copy/pasted code for utility functions that should be part of the go stdlib in virtually every project. It also means copy/pasted code for any data structure you might want to use that's fancier than an array.
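
Concretely, without generics even a trivial helper has to be duplicated per type; these copies are hypothetical but representative of what you end up with:

    // One copy per type; pre-generics Go has no way to write this once
    // over all ordered types.
    func MinInt(a, b int) int {
        if a < b {
            return a
        }
        return b
    }

    func MinFloat64(a, b float64) float64 {
        if a < b {
            return a
        }
        return b
    }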

Go's "simplicity" of error handling (read: lack of any actual error handling abstractions) means you don't get useful things like stack traces and have to manually grep through code for nested error messages. It also makes go code difficult to read at a glance, since virtually every statement winds up wrapped in repetitive error-handling code that doubles or even triples the amount of code in the happy path.

The error-handling pattern of using tuples, but no syntactical ability to operate on data within a tuple means you almost never have the ability to chain function calls like `a.b().c().d()`. Instead you have to manually unwrap the value and error, return if there's an error, call the next function, manually unwrap the value and error, ad nauseam. The "idiom" of gift-wrapping error messages is absurd — you are replacing machine-based exception handlers with expensive, slow, error-prone, and less-capable meat-based exception handlers.
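
A sketch of the pattern, with hypothetical steps a/b/c that each return a value and an error (assumes fmt is imported):

    // What would be a().b().c() elsewhere becomes:
    v1, err := a()
    if err != nil {
        return nil, fmt.Errorf("a failed: %v", err) // gift-wrapping the message by hand
    }
    v2, err := v1.b()
    if err != nil {
        return nil, fmt.Errorf("b failed: %v", err)
    }
    return v2.c()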

Having a half-baked type system means you end up having to frequently write type-switches which are checked at runtime to do any sort of generic code. There's no functionality in the language to ensure that all possible options for that type switch are exhausted, so you are virtually guaranteed to get runtime bugs when a new type gets written and later is passed in.
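
For example, nothing checks that a switch like this covers every case; the types here are hypothetical, but the shape is what you write (assumes import "math"):

    type Circle struct{ R float64 }
    type Square struct{ Side float64 }

    func area(s interface{}) float64 {
        switch v := s.(type) {
        case Circle:
            return math.Pi * v.R * v.R
        case Square:
            return v.Side * v.Side
        default:
            // Adding a Triangle type later still compiles fine;
            // you only find out here, at runtime.
            panic("unhandled shape")
        }
    }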

Speaking of type switches, they interact poorly with go's indefensible decision to have interfaces implemented implicitly rather than explicitly. I have seen types get matched to the wrong type switch in production code because a new method implemented on one type caused it to accidentally "implement" an interface used elsewhere in a type switch. Good luck ever catching this before it hits you in production.

Go's concurrency primitives are useful, but the lack of ability to abstract over them means that you have "advanced go concurrency patterns" dozens of lines long and involving multiple synchronization primitives for what amounts to `a | b | c` (https://gist.github.com/kachayev/21e7fe149bc5ae0bd878). God help you if you want to implement something like parallel map. God help you if you want to implement something like parallel map for n > 1 types.

Go requires you to manually remember to release resources you've acquired with `defer`, instead of tying cleanup to scope. There is no capacity in the language to enforce that you've done so, and it is virtually impossible to find, e.g., a missing `defer fd.Close()` in a large code base. God help you if you leak file descriptors and need to track down the source.
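
The failure mode is a single missing line that nothing flags:

    f, err := os.Open(path)
    if err != nil {
        return err
    }
    defer f.Close() // forget this one line and you leak a descriptor; no tool complains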

Go's inability to perform any meaningful abstractions also means that you have to know all the details of code you import. It's difficult to make code a black box. Case in point: to do something as painfully simple as reading a file, you need to import bufio, io, io/ioutil, and os.

During the course of writing this post, I forgot more examples than I listed — I literally could not remember them all in my head as I was writing them down. This isn't simplicity, this is utter madness.


> you end up having to frequently write type-switches which are checked at runtime to do any sort of generic code.

This baffles me. I think I basically never use any type-switches, with the exception of interfaces being used as a sum-type - in which case the problems you mention with type-switches just don't come up.

> Case in point: to do something as painfully simple as reading a file, you need to import bufio, io, io/ioutil, and os.

I don't know what you mean here. `ioutil.ReadFile` reads the whole file, done. Even if you prefer linewise-scanning, you still only need `bufio` and `os`.

But even if you'd need all those packages to read a file: So what? Like, I honestly don't understand what's the problem with that.


The problem with that is I have to care how `file` works.

Here's how to read a whole file then loop over the lines:

    file, err := ioutil.ReadFile("data.txt")
    // some error handling
    for _, line := range strings.Split(string(file), "\n") {
        fmt.Println(line)
    }
Here's how to stream a file one line at a time:

    file, err := os.Open("data.txt")
    // some error handling
    defer file.Close()
    scanner := bufio.NewScanner(file)
    for scanner.Scan() {
        fmt.Println(scanner.Text())
    }
    if err := scanner.Err(); err != nil {
        // more error handling
    }

Here's how I, at least, would like it to work:

    fileContents := file.Read("data.txt")
    for _, line := range fileContents  {
        fmt.Println(line)
    }
    if fileContents.Err() {
      // some error handling
    }

    fileContents := file.ReadStreaming("data.txt")
    for _, line := range fileContents  {
        fmt.Println(line)
    }
    if fileContents.Err() {
      // some error handling
    }
The critical point is that I don't want to care whether `file` is a byte slice or a byte buffer, and Go doesn't let me not care. I want to be able to write code that deals with "enumerable data of some sort", once, and then works no matter how the caller decides to provide that data.

Go (intentionally) makes it exceedingly difficult to obscure how a piece of code works from the rest of the codebase, and I personally think that's a fatally poor design decision. In my experience, it makes it difficult to decouple modules since code often has to be at least somewhat aware of quite a few implementation details of a library in order to use it correctly. It makes it really difficult to build higher level abstractions that don't leak. I have a much harder time in Go getting away from thinking in `int`s and `floats` and staying in terms of the domain objects that I actually do care about.

"How" a piece of code functions is at best the third, and probably only the fourth most important question (behind "why", "what", and probably "when" if you use any concurrency at all), but Go forces it to be front and center at all times.


> Here's how I, at least, would like it to work:

I still don't get what your problem is. It seems what you want is to just take an io.Reader and pass that to bufio.NewScanner, solving your problem and letting your caller figure out what Reader to pass you? I mean, to me, this seems to be a solved problem and exactly one of Go's major strengths.
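
A sketch of what I mean, with a hypothetical printLines (imports: bufio, fmt, io):

    // printLines doesn't care where the bytes come from.
    func printLines(r io.Reader) error {
        sc := bufio.NewScanner(r)
        for sc.Scan() {
            fmt.Println(sc.Text())
        }
        return sc.Err()
    }

    // printLines(f), printLines(bytes.NewReader(data)), printLines(conn)
    // all work unchanged.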

> it makes it difficult to decouple modules since code often has to be at least somewhat aware of quite a few implementation details of a library in order to use it correctly.

You still haven't described a single piece of your code that requires, in any way, knowing any implementation details of any of the libraries you are using. Like, you don't have to care how os.File is implemented, it just gives you a Read method that you can use to read from it, just like a thousand other Readers. And then you can use that in a bufio.Scanner to read lines (words, whatever tokens), without that having to care in any way about how the Read method is implemented. You want to scan lines from a byte-slice, use bytes.Reader, that's its sole purpose, and your scanning code does not have to care what Reader it gets passed.

Like, I seriously don't understand your problem here. It would seem to me, what you are describing is exactly how Go works.

> "How" a piece of code functions is at best the third, and probably only the fourth most important question (behind "why", "what", and probably "when" if you use any concurrency at all), but Go forces it to be front and center at all times.

Sure, I agree that Go does not encourage you to build deep abstractions. But I fundamentally disagree that you have to know any implementation details - anymore than any other language. Yeah, the type system doesn't lend itself to build extra abstractions, but "having to care about implementation details" just is not one of the symptoms of that o.O


> You still haven't described a single piece of your code that requires, in any way, knowing any implementation details of any of the libraries you are using. Like, you don't have to care how os.File is implemented, it just gives you a Read method that you can use to read from it, just like a thousand other Readers. And then you can use that in a bufio.Scanner to read lines (words, whatever tokens), without that having to care in any way about how the Read method is implemented. You want to scan lines from a byte-slice, use bytes.Reader, that's its sole purpose, and your scanning code does not have to care what Reader it gets passed.

These are literally implementation details. Except you're having to implement them yourself. Reading a file delimited by tokens is a solved problem. There is zero reason why I should have to string together code from four different modules to accomplish this myself. This is the entire reason we have come up with the concept of abstraction.


> Except you're having to implement them yourself.

This is plainly wrong. All components I mentioned exist in the stdlib.

You have to glue them together yourself, sure, but that's the point of having components with separated concerns, which is usually considered a good thing in software engineering.

> There is zero reason why I should be having string together code from four different modules to accomplish this myself.

You don't. You have to use at most 2. And also, to repeat the question: who cares? Like, what is the actual downside of having to import 2 packages? I also took the liberty of looking for solutions to how to do this in other languages. Here is a Java solution in line with what's requested that has 4 imports, one of them third-party: https://stackoverflow.com/a/1096859. Here's Rust code with 4 imports: https://users.rust-lang.org/t/read-a-file-line-by-line/1585. Python and Haskell get away without imports, because they just make reading files a language builtin/part of the prelude, which TBQH is pretty cheaty.

Like, even if I'd buy into the notion that modularity and composability are bad things, it's not even as if Go would be in any way an outlier here. And even if it were, then at best this is the mild complaint that no one has yet wrapped this in a ~10-line library; the language certainly does allow it, contrary to what's claimed.

I'm sorry, but this complaint is just forcibly trying to make up a problem where none exist to fit your narrative of Go being a bad language. It's not productive.


(In the Rust code, the Path import is unnecessary, and rustc will warn you that you should remove it, so it ends up having three.)


In this case, I suppose my specific complaint is that I'm unable to make `range` work transparently with an arbitrary type.

I guess you could chalk it up to a difference of aesthetic opinion. I don't want to think about buffers or readables. I want to think about loops and strings. Looping over a collection means `for range`, so I'm gonna assume that Just Works. Maybe `for range` is just syntactic sugar for `bufio.Scanner` under the hood, but I don't want to care while I'm using it.

I want to think of a file as a black box full of strings. What's actually in the box? Don't care. How do I get lines out of the box? `for _, str := range blackBox`, same way I loop over every collection. How does that actually work? Don't care. Whoever implemented the box has to care, of course, but I sure shouldn't. I've got more important things to worry about, like whatever it is I actually want the code to do. Every character that isn't about whatever it is I actually want the code to do is a problem.

Having primitives and builtins that only work sometimes (specifically, with a short list of builtin types and aliases for same) means I can't just use the builtins without thinking about what I'm using them on. Having to crush down to a lowest common denominator means that what's in my brain while I'm reading and writing code isn't strings and what I actually aim to do with them, it's how the Reader API works, and whether I read a full or a partial line, and whether I need to handle errors before or in or after the loop this time. I want to think about my problem domain, but Go keeps dragging me down into the weeds.


> In this case, I suppose my specific complaint is that I'm unable to make `range` work transparently with an arbitrary type.

Sure, fair enough. But note that you've now shifted the criticism from "I have to import 4 packages" (which was wrong) over "I need to know implementation details of packages" (which was wrong) to "I don't like that Go doesn't have operator overloading".

> I want to think of a file as a black box full of strings.

And what exactly is preventing you from doing that? Like, how exactly is the language preventing anyone from providing this much higher level API? You could even make it work with range, if you so desire (it would be considered very unidiomatic, but presumably you don't care).
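
A sketch of that range version, with a hypothetical Lines helper; it silently drops scanner errors, which is part of why it would be considered unidiomatic:

    // Lines streams the lines of r over a channel so callers can range over it.
    func Lines(r io.Reader) <-chan string {
        ch := make(chan string)
        go func() {
            defer close(ch)
            sc := bufio.NewScanner(r)
            for sc.Scan() {
                ch <- sc.Text()
            }
            // sc.Err() is discarded here, which is the usual objection.
        }()
        return ch
    }

    // for line := range Lines(f) { fmt.Println(line) }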

The stdlib provides you with composable pieces to achieve the job you want. I still find this complaint incredibly weird, unless you assume that everyone wants to view files as just a bunch of lines (I'd argue, these days, the overwhelming majority of files probably aren't). Like, you will still need the lower-level API; where's the problem with having an stdlib which focuses on providing composable pieces and then having some library do the composition for higher-level concerns?

Last time I checked, having small composable units of code with clearly separated concerns was pretty much universally considered a good thing.

> it's how the Reader API works, and whether I read a full or a partial line

This is just a random aside, but: You never have to care about that, unless you specifically want to. But I would argue that code that calls io.Reader.Read is likely wrong - unless it does so to wrap it. Use io.ReadFull.


Go can offer simplicity and flexibility only once you use function literals. Like, reading a file line by line could be just a single function call that invokes your function literal each time a line is read.
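
A sketch of that single call, with a hypothetical EachLine helper built on os and bufio:

    // EachLine opens path and calls fn once per line.
    func EachLine(path string, fn func(line string)) error {
        f, err := os.Open(path)
        if err != nil {
            return err
        }
        defer f.Close()
        sc := bufio.NewScanner(f)
        for sc.Scan() {
            fn(sc.Text())
        }
        return sc.Err()
    }

    // err := EachLine("data.txt", func(l string) { fmt.Println(l) })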


Yes, thank you! This is the point I was trying to get across. In a sense, Go makes all abstractions shallow. Perhaps worse, Go makes all abstractions several times leakier than they need to be. While this is apparently something the "all code must be explicit" crowd seems to love, to me it means I can't just focus on the high-level business problem I'm trying to solve. Instead I have to get bogged down in the nitty gritty details of everything.


You forgot the part where, for the sake of simplicity, their stdlib directly invokes syscalls (rather than going through libc). Which breaks on platforms where syscalls are not considered a stable API, like the BSDs and macOS:

https://github.com/golang/go/issues/16606


I don't think it's for simplicity; Cgo has real overhead.


Doesn't forcing yourself into a corner where you have to either depend on unstable system interfaces or suffer severe performance penalties indicate poor design?

Obviously any set of abstractions is going to wind up making tradeoffs, and choices that make one particular set of problems easy can cause negative consequences in other areas. It's just that the more I used Go, the more I grew to question the logic behind these design decisions. Perhaps they make sense at Google, but to me they don't seem to make sense outside of its internally-controlled confines.


It really sounds like you might like Rust. Have you looked into it?

/snark

But seriously, for a long time before Rust was fully baked, I kept wishing it would be done, so that people hating on Go could go use Rust instead. Now it's fully baked, which is awesome.


Rust is an example of a phenomenally well-designed language.

When I first started go, I was incredibly excited to start learning it. From everything I'd heard, it was everything I was hoping to find in a language. The more I started using it, the more and more its poor design decisions and the hollow defenses of these decisions by its community started to grate on me.

Rust, on the other hand, I was loathing learning. I'd already just learned go and was extremely disappointed with it. I really didn't want to learn something else in this language space that I assumed overlapped so much, and I really entered into it hesitantly. But I am exceedingly glad I did: unlike with go, every day I used Rust I came to appreciate its design more and more. Features of the language seem to have been designed to coordinate and work with one another, instead of all being bolted on separately without regard to how they'd interact.

On top of that, Rust has one of the most friendly, dedicated, and talented developer communities I've ever seen.


Aside from the occasional partially misquoted and much-publicized opinionated statement from Go Team members, Go also has one of the most friendly, dedicated, and talented developer communities I've ever seen.

I'm more than fine with people preferring Rust. I'm fine with Go's choices rubbing people the wrong way. But the frequent caricature of the Go Team being condescending or clueless simply rings false to anyone who has enjoyed being part of the Go community, where kindness and ability abound.

I'm always curious about the frequent ravings about how friendly the Rust community is. I'm super glad, and I love seeing (hearing) the reports from the Increasing Rust's Reach program on the New Rustacean podcast, for example. I'm glad Rust is so welcoming and friendly.

But I guess for this (increasingly) greybeard, it's funny, because the whole thing seems so cyclical: the Ruby community back in the PragProg-induced-popularity days, and the Perl community back in the pre-Perl6 days had a similarly lovely feel (minus the more recent but oh-so-welcome proactive diversity bent).

Anyways, enough ramblings. I'm glad Rust fits so well for you. I keep meaning to learn it properly by porting an Apple II emulator or something, but… time :-)


Thanks for writing this up; it was something of a wake-up call from a recent bout of fanboyism, and I found myself agreeing with most of your examples. I'm still learning and I haven't found any of these things to be dealbreakers yet, but they're absolutely questionable design decisions when you consider that other languages already solved many of these issues (with good reason). I think there's a lot to like about Go but I'm worried that the language will die young because of religious inertia.


That's the trap of Go. None of these are deal breakers. Yet.

It's notable that the vast majority of the above are complaints by experienced developers working on large projects that need to be maintained and expanded for years. They're not the sort of thing you notice over a weekend of tinkering on something fun, and they're not the sort of issues that beginners will run into under any normal circumstances.

All of that adds up to a honeypot for newcomers. The language has some deep issues, but they're not the kind of things you're likely to notice until you already have 200k lines written and it's too late to switch to something with a steeper learning curve. Saving two weeks of confusion when you're learning ends up causing a lifetime of headaches down the road once you're an expert.

In other words, Go is PHP for systems programmers.


I hate this kind of "if only you'd write large programs and not just toys, you'd understand how bad Go is".

I write Go at work. I haven't counted in a while, but I guess the code of my team comprises significantly more than 200k lines (and that doesn't even count all the code I'm working on that isn't from my immediate team).

IMO, the more LOC you have, the more the design of Go becomes a godsend. Because it becomes much easier to dive into code you haven't written yourself.

But anyway, that's just my opinion, I just wish that people would stop this "anyone who disagrees with me just doesn't have the correct set of experiences" FUD.


I wouldn't call it madness; it just limits Go usage somewhat, especially in domains involving a lot of concurrency.

I once had hopes for Go. But the team working on it decided not to fix any of the flaws that became obvious over time, and even outright denied their existence. So, not much Go for me anymore, but I'm still looking forward to seeing what such an approach can bring in Go 2.


In the same spirit of my above comment, I think you're conflating the terms simple and easy.


Yes, it is a tradeoff. Why do you think it's a good tradeoff?


I think the point he's trying to make is that it's a poorly-justified trade-off.


The cost of complexity in a programming language is paid not only by the developers of the language but also by the programmers that use it.


The cost of poorly designed complexity maybe.

This is the same mentality as people who throw up their hands and say government is broken, so we should deprive it of resources to make it as small as possible. Doing this just winds up making the problem worse, when there’s plenty of evidence that well-funded governments can work well.

It’s also the same broken mentality behind schemaless databases. Schemas are hard, so let’s get rid of them. This backfires because you haven’t actually rid yourself of schemas, they’re just implicit and now you lack any tools to operate on them meaningfully.

“Hard problems are hard, so let’s just avoid dealing with them” is not a sustainable solution in the long term. Sometimes they’re really hard and ignoring them makes it worse. Sometimes they’re only hard because we haven’t thought about them in the right context. And sometimes hard problems can be sidestepped entirely with a bit of cleverness. But outright ignoring them and hoping they go away just punts the hard problems to others.


If everyone else in the world has to implement the same thing a thousand times a day he still thinks his thing is simple.

Smalltalk all over again!


Actually even Smalltalk is more feature rich than Go.


The community had a real "Not Invented Here" problem in our earlier years, which we never really overcame.


I see, as I only used Smalltalk during university for project assignments (Digitalk Smalltalk/V), I never got to experience that.


All languages that tried to fight complexity either grew up to adopt complexity and stay relevant, or faded away.

Programming languages don't get complex just for fun, their designers are tackling actual relevant issues.

Go community doesn't seem to have learned much from the past.


> All languages that tried to fight complexity either grew up to adopt complexity and stay relevant, or faded away.

> Programming languages don't get complex just for fun, their designers are tackling actual relevant issues.

Yes and yes.

> Go community doesn't seem to have learned much from the past.

Well... if you start with something simple, and complexity comes, you can still try to keep it as simple as possible. But if you started with something that was already complex (but complex in ways that your theory said it needed to be, not in the ways that the real world said it needed to be), and you try to fix that, you wind up with something really complicated. Ditto if you start off with complexity to handle all the use cases of the past.

Go started off simple, and is letting real-world use push them into becoming more complicated. That's a defensible approach, even today.


> That's a defensible approach, even today

Not if you are dishonest about it. Not if you refuse to learn from the past couple decades.


In what way is Go dishonest about it? In what way have they refused to learn from the past couple decades?


I think insisting that they need more use cases for generics is dishonest when you consider they used generics to implement library data types themselves.

I don't mean they are outright lying, or are bad people or anything like that.


I suspect the Go developers just have a very different idea of what the past is. Given the state of software engineering, I'm not quite as likely to put a positive spin on the things we've built since then.


This is true for languages that try to be all things to all people (a la Java). All languages are DSLs, and if you target just a few specific domains and beat back the masses who want the language to expand beyond its intended purpose, then simplicity remains possible.


DSLs shouldn't be Turing-complete, and Turing-complete languages shouldn't try to be DSLs.

Ant was a DSL that managed to become Turing-complete, and the results were pretty horrible.


Turing completeness is a symptom, not a cause. No one would argue that SQL is bad because some implementations are Turing-complete [0].

[0]: https://stackoverflow.com/a/7580013


>No one would argue that SQL is bad because some implementations are Turing-complete

They do actually. Though when people do say that it tends to be phrased "keeping business logic in stored procedures is a bad idea".

People argue that all the damn time.

Accidental Turing completeness usually signals a design flaw somewhere (would you also consider it too controversial to argue that the C++ templates mentioned in your link are nasty and that people complain about them a lot?).


>All languages that tried to fight complexity either grew up to adopt complexity and stay relevant, or faded away.

And yet, to this day, C is just as popular as, if not more popular than, C++. Why is that? I can do so much more in C++, but I, and my colleagues, pick plain-old C every time.


Thanks Linux.

C was already on the way out when Linus created Linux.

Apple was migrating from Object Pascal to C++.

IBM had CSet++ for OS/2.

Borland, Microsoft, Zortech, and Symantec were selling C++ frameworks.

UNIX vendors were playing with Taligent and CORBA.

BeOS and Symbian were developed in C++.

Then came Linus, made Linux with GNU on top.

The GNU project for a long time maintained that the go-to language for GNU projects should be C.

All major C compilers are written in C++ nowadays; there is hardly any reason to stay with C outside the UNIX world.


C was already on the way out when Linus created Linux.

That seems a little... fanciful. There was a lot of C++ and it was a great way to show how modern and forward-looking you were (and to sell compilers, tools, frameworks) but standardization hadn't got far, interoperability was poor, problems great and small abounded. A number of the things you mention above were spectacularly unsuccessful.


Yes, C++ was a pain to write portably before 2000, but so was C: in spite of having been standardized in 1990, most compilers were a mix of K&R C and ANSI C.

Nevertheless, all major desktop OSes were going C++ for their application frameworks, before the widespread adoption of GNU software.

> A number of the things you mention above were spectacularly unsuccessful.

Mostly due to politics between corporations and very little to do with C++ itself.


> but standardization hadn't got far

That's true, but not really fair: even the ANSI C standard was only 2 years old at that time, despite C being way older than C++. Standardization takes a lot of time …


You're in the minority. For new projects, C is much less popular than C++.


It might not be that easy to tell. The C++ I write (for myself) is essentially plain C. No templates, dynamic dispatch, constructors, or exceptions. Most of the standard libs I use begin with the letter 'c'.

It uses some C++ features, but it's philosophically much closer to C code.


If it only compiles with a C++ compiler, it is C++, regardless of the amount of language features being used.


That's fine, but I don't think that's what people mean when they say C++. I certainly wouldn't call myself a C++ coder and, if I applied for a C++ job, I'm pretty sure that, after I had explained that I don't do exceptions, virtuals, or the STL, I'd be politely shown the door.


> Programming languages don't get complex just for fun, their designers are tackling actual relevant issues.

Have you ever used C++ templates? I mean, every popular language has issues related to design complexity.


Some complexity is avoidable, some isn't.

Besides, C++ templates are a result of creating a conceptually simple, one-size-fits-all solution for generics, metaprogramming, library tuning, and some dozens of other problems that other languages have specialized tools to solve. Turns out that the complex set of features works better.


Is it a complex set of features, or rather a set of focused tools?


It is a large set of simple tools. It is conceptually complex because each tool is different and you must learn them all.


I did my first C++ steps with Turbo C++ 1.0 for MS-DOS.

My first use of C++ templates was in Turbo C++ 3.5 for Windows 3.1.


Go's community doesn't learn from the past, but good languages fight complexity even as they add expressive power. (They at least try to get the most expressive power per complexity cost.)

Pragmas, incidentally, aren't really a source of bad complexity. Per the abstract definition of the language, they really do have no effect at all and are just comments. Yay!

Implementations have properties too; they aren't just rude practicalities. Compilers, in particular, connect one language (the input) to another (the output). Pragmas mediate how those additional properties apply to the language at hand. It's an interesting mental challenge to formalize them in the absence of a compiler having a concrete stable ABI.

So in conclusion, go people once again don't understand good design. Pragmas are not an ugly wart, but actually a great example of layering—a rare example of an abstraction that doesn't leak!


I wouldn't go so far as to say that having undocumented magic comments doesn't add complexity. From a very surface level, sure, the parser is the same, but now every tool that works with the Go language needs to be aware of these. Linters, for example, need to not complain about the missing space in front of the comment, but only if the comment starts with "go:".

Ultimately anything that changes how the program is executed is going to add complexity, so they might as well "make it official" and add a keyword for it.


In Perl and Javascript pragmas are language level and are used to help people avoid some mistakes at certain stages of software development. This is fine, no leaky abstractions. In Go they are lower level and therefore are side effects of leaky abstractions in compiler and language design. So they should be fixed, not kept or turned into pragmas in the spec. The choices I can think of: either make Go lower level itself or move low level stuff into another intermediate lower level language.


There's another choice, which is to keep doing things the way they are being done, simply not add another 20 pragmas, and get on with life because there isn't actually a problem here. None of the problems with pragmas I've seen in other languages are present in Go, since the pragmas are simple and mostly used only by the implementation and/or compiler itself, and there's no interactions, or massive code complexity from ifdefs, or string-concatenation-based macro disasters, or any of the other real problems caused by pragmas, with the possible faint exception of pragmas not being cleanly delineated from the comment syntax, which is still not causing any huge problems I can see, nor is that likely to change in the future.

The problems that C has with pragmas, and that C++ inherited from C, cannot be naively imputed to other languages without demonstrating there's actually a problem here. This wouldn't even make my top 10 issues with Go; I'm not sure it's even an issue at all.


> In Perl [...] pragmas are language level

No, actually. The syntax that people use to invoke the pragma ("use strict [arg]...") is not a pragma at the language level, it's just the syntax for importing symbols from modules. For example,

  use strict ('vars', 'refs');
expands to

  BEGIN { require 'strict'; strict->import('vars', 'refs'); }
because that's how the "use" statement is defined. `BEGIN{...}` causes the statements in the block to be executed as soon as the BEGIN block has been fully parsed [1]. `require 'strict'` loads the module `strict.pm` from the library path (the source code is on CPAN at [2], if you're interested), then its `import()` method is called with two string arguments. The implementation of that method is:

  sub import {
    shift;
    $^H |= @_ ? &bits : all_bits | all_explicit_bits;
  }
There's a lot of weird Perl syntax in there, but the gist is that it modifies the $^H variable. And THAT is the actual pragma which is defined by the language. [3] The module strict.pm is just a wrapper around $^H to make things a bit more user-friendly.

I know that's sorta kinda off-topic, but since we're talking about language design, I figured I'd contribute this small anecdote that illustrates really well how the more recent parts of Perl are designed: a ton of metaprogramming on top of relatively small changes to the core language. If you want another example, have a look at how object-oriented programming was tacked on to Perl as a tiny afterthought, yet the way it interacts with all the other parts of the language makes hugely powerful OOP frameworks like Moose possible. (OTOH, that approach also makes the language pretty messy, but it always gets the job done for me, at many scales.)

[1] Usually, execution only begins when the entire file has been parsed, but this code needs to run earlier because it changes the parser's behavior.

[2] https://metacpan.org/source/SHAY/perl-5.26.1/lib/strict.pm

[3] Notably, $^H behaves differently from other variables: Every assignment to it is scoped only to the current block, whereas regular variables need to be shadowed explicitly. This is particularly useful to temporarily lift a strictness requirement for a single statement, similar to how `unsafe` is used in Rust:

  use strict;
  ...
  my $function_name = 'implementation_' . ($x + 2 * $y);
  &$function_name(); # error: can't use string as a subroutine ref while "strict refs" in use
  {
    no strict 'refs'; # "no" is like "use", but in reverse (calls the module's unimport() instead of import())
    &$function_name(); # works: calls the function whose name is stored in the variable
  }


Sure, this is all correct from an implementation perspective, as levels are not strictly defined and are open to interpretation. However, we were talking about the user's perspective, which is kind of the whole point of leaky abstractions. In this context, language-level features are those that users can understand about the language without any assumptions about other levels, like how compilers represent things internally.


The word pragma in perl has always been used to refer to that specific syntax sugar, just read the start of "perldoc perlpragma".

Furthermore the $^H (there's also %^H) facility you mention is just one way pragmas are implemented, e.g. "use overload" is a core pragma that doesn't use that method at all, instead it defines special functions in the importing package which the compiler is aware of.

Then there are other "pragmas" that are really just utility wrapper functions, e.g. "use autouse". The "Pragmatic modules" section of "perldoc perlmodlib" has the full list.


Guy Steele, "Growing a Language", https://www.youtube.com/watch?v=_ahvzDzKdB0


Lisp?


His concern was directed at the magic comments though. I don't understand why they didn't just create new syntax for pragmas since they're already parsing something. e.g. #noescape

# could be treated as syntactic sugar for //go: until v2 if they want


The page is missing the best one of them all: //go:linkname, which allows for linking in private functions from other packages. Including the Go runtime. For example: https://github.com/tidwall/algo/blob/master/algo.go
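
The shape of it is roughly this (a sketch: the blank unsafe import is required, and a body-less Go function generally also needs an empty .s file in the package to satisfy the compiler):

    package fast

    import _ "unsafe" // required for //go:linkname

    // Bind our local declaration to the runtime's private monotonic clock.
    //go:linkname nanotime runtime.nanotime
    func nanotime() int64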


linkname is brilliant thanks! I've done that sort of thing with an assembler shim in the past.


Pragmas are always a bad idea. The Ada community has learned that the hard way. Whatever the pragma does, it should be part of the language standard and never be implementation-dependent.

It's time that language designers include language pragmatics in the core language. That includes for example big O information about data structures, packing of structures, alignment properties, memory access information, etc. Currently, in most if not all languages this information is spread all over levels, from nonstandardized compiler flags over pragmas up to the core language. It's a huge mess.


> Pragmas are always a bad idea. The Ada community has learned that the hard way. Whatever the pragma does, it should be part of the language standard and never be implementation-dependent.

> It's time that language designers include language pragmatics in the core language. That includes for example big O information about data structures, packing of structures, alignment properties, memory access information, etc.

So pragmas are "always a bad idea" but you should have them "in the core language"… Don't you feel your comment is pretty contradictory?

A pragma is a directive for the system (mostly compiler), that's orthogonal to it being implementation-specific.


You misread my comment. The functionality offered by pragmas must be mandatory and in the core language, whether you call them pragmas or not. Everything else leads to problems.

It's true that if pragmas were all fully specified in the core language and not optional, then they wouldn't pose any problems. In reality, however, some pragmas are regulated by the core language and others are implementation specific additions. The result is a huge mess, it's the #1 source of incompatibility of standardized languages like Ada. Even just having optional pragmas in the core language is problematic, because at one point or another developers will start relying on the optional functionality to do something that one implementation does and another doesn't. Optional optimization and packing directives are typical examples. In theory they shouldn't be able to break programs, in reality they do.


> You misread my comment. The functionality offered by pragmas must be mandatory and in the core language, whether you call them pragmas or not. Everything else leads to problems.

Your previous comment literally states that pragmas are always bad, then goes on to state that they should be included in the core language. This one does not retract the original statement that pragmas are "always a bad idea" but further asserts pragmas "must be mandatory and in the core language".

I don't think I misread your comment, no. You may have miswritten your comment when you meant that, say pragmas should not be optional and/or implementation dependent[0], but that's on you.

[0] I've no idea whether that's what you mean, given that again you state that pragmas should both not exist and be part of the core language.


When I said "pragmas are always a bad idea" I was talking about pragmas as they are implemented in current languages like e.g. Ada. In most languages pragmas are added to the language without being integrated into it. For example, in Ada you write "pragma (Something);" next to other statements to influence their interpretation by the compiler. Other languages have similar mechanisms that are somehow outside the core language or added on top of it. These types of pragmas are always a bad idea.

I also argued that the functionality of pragmas must be represented as non-optional choices in the core languages. This is one of the lessons that the Ada community has learned over time - but it's too late to change this now in Ada.

So yes, pragmas are always a bad idea if we talk about the way they are implemented in most languages. The functionality of pragmas should be part of the core language, with proper syntax and semantics.

I hope that clarifies what I've stated and what you haven't understood, and that, your nitpicking aside, my position now makes sense to you. If not, there is not much I can do; I just wanted to point out a lesson learned in the Ada community, which is very stringent about language definitions. If you're interested in these kinds of topics, I can recommend Michael Scott's Programming Language Pragmatics.


I don't know if I agree. The pragmas listed in the article, by and large, are directives at specific parts of the reference go implementation, allowing for specific optimizations/annotations that implementation needs. Any other implementation of go seems like it could safely ignore those directives.


I hated the idea of using comments as directives when Go 1.4 introduced //go:generate. But, holy, they were there from the beginning?

They bring back my painful memories of the old days when I had to use conditional comments to support IE6...


Comments are also interpreted by go build, to choose which files to build on each platform. Memories of IE6 are painful because of the deviation from the standard, but with Go you don’t have that problem.


Every higher-level language has directives nowadays, and those are almost every time encoded in comments.

Honestly, between documentation, compiler pragmas, linter directives, packaging and linking instructions, etc., we are getting to a point where languages will need to specify something like "comments starting with this string must be ignored".


> and those are almost every time encoded in comments

Which languages do that?


Possibly off topic, but Verilog has something similar. //read_comments_as_hdl_on is a thing and it makes it a pain.


IE's conditional comments were actually a pretty elegant solution given that you had to be compatible with every other HTML parser out there. The painful memories that I recall are about IE 6/7 itself, not about conditional comments.


The pragmas in Go are not intended to be used in general user code. They should only be used in the Go runtime implementation and standard packages. The pragmas in Go are just hints for compilers. Compilers will ignore many of the pragmas used in custom user code.


Except for the build conditional flags: https://golang.org/pkg/go/build/#hdr-Build_Constraints

Which are pretty useful when you want to target tests at different versions of Go whose std lib exhibits different behaviour (behaviour changed as the std lib matured).
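
For example, a test file can opt into a minimum Go version with a release tag (a sketch; the constraint line must be followed by a blank line, and the package name is hypothetical):

    // +build go1.9

    package foo_test

    // This file only compiles on Go 1.9 and newer; a sibling file tagged
    // "// +build !go1.9" can hold the variant for older releases.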


Also when you want to put package-main files in a library package directory, you need to put

  // +build ignore
near the top of it.


This seems like one argument to be made in favor of giving the language first-class access to the compiler, à la Lisp.


Not at all! This a common impulse among programmers: upon seeing a specific case, you want to make it as abstract and generic as possible. This maximizes the power and flexibility available to the programmer.

However, the design philosophy of Go pulls in the exact opposite direction! Go emphasizes simplicity and large-scale engineering -- i.e. standardization. (gofmt is the canonical example of this.) Giving programmers the power to manipulate the compiler in arbitrary ways would be a nightmare for Go's designers. It opens the door to "clever" code that you tear your hair out debugging 6 months later. The strength of Go is precisely that it makes it difficult to write "clever" code. Go looks pretty much the same everywhere. Which is a little boring, sure; but for me (and many others), that's an acceptable trade-off.


I rather prefer the Ada, Eiffel, Delphi, C++, Java and C# decisions regarding large scale engineering.


Yeah, keep telling yourself that. Do you actually write code to solve problems yourself, or are you more into paying others to do the same? I honestly don't know which alternative would be worse here. What I do know is that this attitude isn't serving us. We're in the business of solving really tricky problems; forcing ourselves to do that using dumbed-down tools to protect us from our own intelligence and creativity is insane.


This is what Nim[1] does. Its pragmas are extensible via macros[2].

1 - https://nim-lang.org/ 2 - https://nim-lang.org/docs/manual.html#macros-macros-as-pragm...


> Given the race detector has no known false positives, there should be very little reason to exclude a function from its scope.

Performance, maybe?


> Performance, maybe?

Performance for running race detector (debug) binaries? Are you worried slower code hides a race?

I can't think of a reason to ship or use race detector compiled binaries in production. Or do you have something in mind?


Yeah, that’s what I was getting at. Race condition detection would probably slow down your code, so you probably wouldn’t use it in production.


Race detection is totally disabled for production binaries, because it does slow down the code.

The question is: For debug builds that explicitly have the "-race" flag passed to the compiler, why would you want to disable race detection for a specific function?


I actually do have a real example of this.

We use -race during our automated testing. The setup for some of our tests involves CPU mining (rapid blake2b hashing). This code definitely doesn't have any races, and it runs waaaaay slower when race detection is enabled. So we could speed up our tests significantly by disabling race detection just for the setup phase.


Have you considered enabling parallel tests for that package? It lets test functions run in parallel with each other. Might address some of the performance issue.
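
A sketch, in a _test.go file (the test names are made up):

    func TestMiningSetupA(t *testing.T) {
        t.Parallel() // may run concurrently with other tests that call t.Parallel
        // ... expensive hashing setup ...
    }

    func TestMiningSetupB(t *testing.T) {
        t.Parallel()
        // ...
    }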


Unless you’re IO bound?


Go's "pragmas" are already exposed to the developer via go generate [0].

Edit: Granted, this is not a directive for the compiler though.

[0] https://blog.golang.org/generate



