It is a bit astonishing what lengths the Go community is willing to, er, go to in order to avoid compile-time code execution (the basis of Lisp macros) or generics. I understand that neither of these ideas was invented at Bell Labs during the 1970s, but at some point, can't we just put aside the not-invented-here syndrome?
Could we all just agree to pretend that Bell Labs folks invented those ideas? Would that work?
There have been many lengthy discussions about generics on the Go mailing list. I haven't checked them in a while, but the essence used to be that generics are difficult. C++ has really powerful templates, but unfortunately C++ templates are in many respects a nightmare. Java generics, on the other hand, are not really powerful. The idea was to wait until someone came up with a powerful generics concept that is nice for the developer.
Not all languages need to have all the features of Haskell or Rust. Just as not all languages need to be dynamically typed. Just because a language is missing your favorite feature(s) doesn't mean it's a bad language. And maybe you have tried writing programs in Go and didn't like it. That's fine. But there's no need to insult everyone who uses the language. Every language has its niche, even perpetual punching bags like VB and PHP.
> But there's no need to insult everyone who uses the language.
Is that from Go's social media handbook? I have never ever seen users of a language complain about perceived insults as consistently as Go. It seems like it's the only argument they are able to make if they disagree with other people.
The only thing cute is the way you are desperately trying to come up with ways to turn "I don't like what you say" into "it's wrong what you say" in your head. :-)
So how many programming languages besides LISP are powerful enough to support turning Protocol Buffers into code during compilation? And is this the ultimate must have?
It could be done in OCaml or Haskell, considering they both have compile-time AST generation/manipulation, but I'm not sure if anyone's written an actual implementation of this. Nimrod too, as that has AST macros. As does Elixir.
To be clear for the Haskell newbies, he's talking about an implementation of the protobuf library, not the Haskell language itself. You can fairly easily lift Template Haskell types into IO in GHC (not sure about TH in other compilers).
The Go community are out building solutions productively and fruitfully, often finding the lack of generics an edge nuisance, but something that is by no means a showstopper.
The people who have no knowledge or skill in Go, on the other hand, fill the message boards, railing rabidly about how absolutely critical generics are, and don't these people (who almost universally have extensive experience across a number of languages) know what they're missing?
The point you are missing is that people who don't want to live in the 1970s avoid Go in the first place, so the Go community by definition consists only of people who think Java-1.0 style programming is perfectly fine in 2014.
Yes, there are reasons why the intended audience of Go just didn't care, and Go is mostly used by PHP, JavaScript, Python, Ruby, etc. developers today.
> Go is mostly used by PHP, JavaScript, Python, Ruby, etc. developers today
Could be dynamic typing has had its day, just like visual programming from the 1990s. Or maybe users of those languages wanted to use one with a large corp backing it, like Java and C# have. The most common glib response I get when I say I'm trying out Go to write something is "You're using Go? So you wanna work for Google, do you?" Perhaps that shows the real motivation for why a programmer learns Ruby or Python or Go.
There are people who want their language to be as complicated and feature-rich as possible. They have lots of choices available, including C++1*, Java, Ruby, etc.
Those languages probably don't have anything that has only one idiomatic solution. Even writing a for loop will have multiple different but viable alternatives.
Then there are people who want their language to be as simple as possible, yet still powerful enough to allow you to create all the things you can with Go.
There are very few such languages. In fact, there are a few things/special rules I'd like to see simplified in Go because they have less benefit than cost.
Having a simple language allows for some cool benefits, and if Go starts to compete with C++ for number of features, it will lose its main distinctive property.
The desire for a simple language doesn't mean you have to accept all the mistakes Go made. Go is just not a very good language. It would be completely irrelevant without the Google name behind it.
It certainly could be better (this is true of everything that's not perfect), but it is being improved (and it's open source, so you and I can help make it better). I already prefer it over many other languages.
> It would be completely irrelevant without the Google name behind it.
You mean... It would be irrelevant if Google and other people who work on it (being open source, many contributors aren't Google employees) did not make it what it is?
That's like saying... <any product> would be irrelevant if <those who made it> didn't make it as good as it is. It is a true statement, but how is it useful?
The things I care about can't be fixed, because they are fundamentally wrong in Go, they are not just some little oversight.
> I already prefer it over many other languages.
If I started aiming low enough, I could also certainly find languages which are even worse than Go.
Honestly, I don't care. I prefer languages which do things better than X, not languages which are less worse than X.
> You mean... [...]
Eh no? I meant what I said. If the people who created Go hadn't been able to leverage the Google name (either by working at the company or by Google saying "don't use our brand for your toy projects"), nobody would have cared.
But people said "OMG, Google invented online search!!! Then, by definition, they have to be language design experts, too!!!" and the tragedy unfolded.
Fair enough, it sounds like your needs are very different from what Go satisfies.
Just out of curiosity, what language(s) do you prefer to use over Go?
> Eh no? I meant what I said. If the people who created Go hadn't been able to leverage the Google name (either by working at the company or by Google saying "don't use our brand for your toy projects"), nobody would have cared.
I have a very good counter-example for you. Google also made Dart. I have no interest in Dart and I don't think it's anywhere near as good as Go from what I can tell about it.
> so the Go community by definition consists only of people who think Java-1.0 style programming is perfectly fine in 2014.
This is a false analogy because there are significant differences between "Java-1.0 style programming" and Go.
> Yes, there are reasons why the intended audience of Go just didn't care, and Go is mostly used by PHP, JavaScript, Python, Ruby, etc. developers today.
Why do you feel the need to bin people? I'm a Haskell programmer and yet I love Go. In your world view, this is an impossibility. But maybe I've given you too much credit.
> Go is mostly used by PHP, JavaScript, Python, Ruby, etc. developers today.
Go is absolutely preferred by people building real solutions to real problems. Pure, idealized languages like Haskell seem to be preferred by people who don't actually do anything with it, but instead talk about it. It is the "big brother" language.
I don't mean this dismissively, though it invariably sounds that way, but there is a huge divide between things that actually get things done, and things that you can debate the intricacies of for weeks on end while you foment that grand plan that you'll never actually pursue.
Though I will add that your nonsensical, ignorant simplification and mischaracterization of Go merely reveals that you have virtually no real knowledge of it at all, beyond probably some rubbed-off "knowledge" from advocacy boards.
It sounds like this is just a clean way to run yacc and protobufs. And: using yacc from Golang programs is in fact ugly right now; it's actually a little worse than it is in C. I think you should just take this document --- whatever its provenance --- at face value.
> using yacc from Golang programs is in fact ugly right now
Why do you say that, and how does this mechanism affect the use case you have in mind? You run go generate instead of make. The advantage is that you don't have to have make; e.g. Windows and Plan 9 don't have make. Arguably, the advantage is pretty minor, but I don't see how it changes anything in your workflow.
Remember that go generate does not run at build time. You still have to run it manually when you want it (just like before, with make) and commit the generated files into your repository.
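For concreteness, a directive under this proposal is just a comment in an ordinary .go file; here is a minimal sketch (the yacc invocation mirrors the proposal's example and may not match the final tooling exactly):

```go
// Sketch of a source file carrying a generate directive. "go build"
// treats the //go:generate line as an ordinary comment; only an explicit
// "go generate" run scans for the prefix and executes the command after it.

package parser

//go:generate go tool yacc -o gram.go -p parser gram.y

// The author runs "go generate" by hand, then commits the resulting
// gram.go, so users of "go get" never need yacc installed.
```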
Grunt support for Go is patchy, the one package I found is no longer supported, but as I'm writing web stuff the real thing I needed was to do with js compression and obfuscation - I want to deal with nice easy-to-read js files in dev but I don't want to waste my customers' bandwidth delivering comments to their browser.
I understand the argument for go generate as a resolution for some of the problems with no generics (and yes, Go's type system is a bitch. I love the language but rewriting everything for each individual structure gets old fast). I can see how generating boilerplate ORM-a-like code would be great.
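To illustrate the per-type rewriting being lamented here, a hand-written (or generated) int-specialized container might look like the following; the type and names are invented for illustration, and a tool run from go generate could stamp out StringStack, Float64Stack, etc. from one template:

```go
package main

import "fmt"

// IntStack is the kind of type-specific boilerplate that, without
// generics, must be rewritten (or generated) once per element type.
type IntStack struct{ items []int }

// Push appends a value to the top of the stack.
func (s *IntStack) Push(v int) { s.items = append(s.items, v) }

// Pop removes and returns the top value; ok is false if the stack is empty.
func (s *IntStack) Pop() (v int, ok bool) {
	if len(s.items) == 0 {
		return 0, false
	}
	v = s.items[len(s.items)-1]
	s.items = s.items[:len(s.items)-1]
	return v, true
}

func main() {
	var s IntStack
	s.Push(1)
	s.Push(2)
	v, _ := s.Pop()
	fmt.Println(v) // 2
}
```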
So the argument that go generate is somehow a replacement for make leaves me a bit puzzled. I mean, yes, you could do that, but why?
Am I missing how to make this process work with the idiomatic Golang build system? Can I use "make" or "grunt" to create a package that someone else can "go get" and build without using a third tool?
Go generate is for package authors, at which point you check the results into your repo. Then anyone else can go get your package as a dependency and not need to regenerate that code. So, yes.
It's not uncommon for Go projects to have makefiles (or shell scripts) that perform some transformation on the code. This imposes no burden on the user, since the generated files are checked into the repository, but it does put some burden on the developer (who has to run make), or on the prospective contributor, who not only has to have make, but potentially has to have all the other tools invoked by make, or learn the varying non-standard mechanisms by which code is generated (e.g. for the Go project itself, a new developer has to learn how to use the various shell and Perl scripts that generate the code in the syscall package).
Go generate does not eliminate this burden for the developer. Someone still has to write the yacc invocation and remember to run it; it won't run itself. But it has the advantage that there is now a standard, canonical way to generate code. Anyone can just type go generate instead of learning about makefiles or shell scripts or whatever the project happens to use. Code generation is normalized under the go generate umbrella.
>If this is seriously their proposed solution to the lack of generics then my estimation of Go just dropped even further.
On a related note, in one of the videos with Rob Pike (audience q&a I think but can't remember if it was GopherCon2014 or other event), he specifically mentions external "code generators" as a workaround for lack of generics.
If anyone (not saying you specifically) is saying that the Go team is proposing code generators as the solution instead of a workaround to lack of generics, that would be an inaccurate reading of their position.
It's not about generics. It's just a helpful part of the tooling to do some stuff before you compile. Like many projects already do, this just standardizes it.
> the author commits the generated files to the source repository, so that they are available to clients that use go get
Treating generated files as primary source is a disaster waiting to happen. Someone will end up just doing a quick edit of the derived file because yacc isn't set up right, or running a sed command to do a refactoring, or just not understand the difference. One of the things that I like about go is that it does a good job of keeping the codebase clean; recommending a solution that sets up this sort of trap is a big departure.
I wonder what other approaches are available for packages that intend to distribute in source form, however. I could imagine adding a parallel directory structure for the generated files, to make it clear that they are out of bounds, akin to the mvn source / target layout.
I completely agree. If you're going to add generated files to a source code repository, they need to be clearly marked in one way or another.
If the author truly wants a "go generate", they could sell it much better. I understand the yacc idea; it can be a pretty good solution for something like configuration files. OpenBSD uses yacc for the grammar of many of its tools, and that has worked out great.
The big sell, though, would be WSDLs. I understand the aversion towards SOAP and WSDL, but they're here, and having Go support would be nice. Generating Go code from a WSDL wouldn't be too hard a sell, I think, and it fits nicely into this idea.
Actually it's not really a big deal. You just put a header at the top that tells people not to hand modify it. Every C# winforms project already did that.
The nice thing is that you can diff the newly generated code versus the old code and make sure changes make sense.
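For reference, a sketch of such a header (the wording here is illustrative; generators vary, though the Go project later converged on a machine-readable "Code generated ... DO NOT EDIT." first line, and any prominent marker serves the diffing workflow described above):

```go
// Code generated by go tool yacc from gram.y; DO NOT EDIT.
// (Put the marker at the very top of the file, so both human readers
// and tools see it before anything else.)

package parser
```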
Given the wait-and-see approach taken by the Go maintainers on generics/templates, I think there may be an opportunity for a template-based pre-processor scheme (not using go generate, since that works on .go files). This would be the C++ approach, i.e., templates rather than generics.
The template could be in, say for Vectors, a Vector.got file.
In your go code (also a .got file - say my.got for example) you reference Vector<SomeType>. Some build system+pre-processor parses .got files and creates a Vector_SomeType.go file. A corresponding my.go file is also generated from my.got with all template references flattened to real types.
All *.go files are compiled using the regular go build chain to create the final application. Any missing types would be identified by the go compiler.
make could be used to drive the process, but the got2go translator application is the missing piece. Is it better than generics inside the language? No. It is very loose in terms of type safety, as we are just performing text substitutions, but it could save a lot of typing. We could also come to rely on a semi-official/community-approved repository of core collection .got files. The C++ STL implementations may serve as an inspiration for the API of the core template code.
This is what Borland C++ had in the mid-90s, before they added support for templates in their MS-DOS compiler.
That was way back when the STL had just been announced as part of the C++ standardisation effort, and C++ compiler vendors were using preprocessor tricks to generate type-specific generic data structures and algorithms.
This is neither a preprocessor in the sense of cpp (the C preprocessor) nor a generics implementation. If you need a tool like cpp for Go, use m4. It was made for things like that.
Now, what this is, is a tool to make it more convenient to convert data into .go source files. I really like the idea that you can just include your source-data (string-tables, grammars, etc.) in source format, and have one tool to convert it. Run "go generate" and everything is ready to build.
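As a sketch of that "convert source-data into .go source" idea, a toy generator (function and names invented here; real tools like go-bindata are more elaborate) might emit compilable Go from raw bytes:

```go
package main

import "fmt"

// embedAsGo renders raw data as a .go source file declaring a string
// constant, so the data can be compiled straight into the binary.
// The %q verb produces a properly escaped Go string literal.
func embedAsGo(pkg, name string, data []byte) string {
	return fmt.Sprintf("package %s\n\nconst %s = %q\n", pkg, name, data)
}

func main() {
	// Print the generated source for a small HTML snippet.
	fmt.Print(embedAsGo("assets", "indexHTML", []byte("<h1>hi</h1>")))
}
```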
Please correct the title. The original title is correct: "Go generate: A proposal"; the rest is wrong. This tool is not a preprocessor. That's as wrong as calling 'make' a preprocessor. It can be used to call preprocessors, or more likely generators.
How do you use cpp or templates to generate .c files from yacc grammars or protobuf definitions, or to embed HTML or binary data from files? I think you misunderstood the main purpose of this tool. You probably only read the last and least important use case.
I can do this with external tools and build systems (make, cmake, waf, scons + protoc and any other binary that I want) that at the end of the day work nicely with the provided tools. Go on the other hand seems to want to be monolithic. It is the center of your compile and link universe. To me it is quite un-Unix. Not unlike how limiting the import system is. Many of the features of Go make it feel like a walled garden.
I personally like this solution. I don't know why there are so many negative opinions on this. Yes, it doesn't solve the problem of generics as a language construct, and it doesn't provide macros as a language construct. But it gives a lot of power for many other real-life applications, like embedding/processing some content or generating type-safe bindings/wrappers/parsers/collections/...
It's an extension of a build system, not a language. It helps to specify build process details right in the source files that either depend on it, or provide necessary details for it. This approach worked for CGo, it worked for +build flags, so I don't see why it will not fit in this particular case.
One thing I worry about is how to get the processor app during the build process. E.g. if bindata is used to embed a file, how will "github.com/jteeuwen/go-bindata" be installed?
No, this won't fix generics, but it's not about that. The dependence on make is an awkward legacy leftover, and something that other languages (e.g. Rust) are also still sitting on; the sooner it goes away, the better.