
Go: Best Practices for Production Environments - ConceitedCode
http://peter.bourgon.org/go-in-production/
======
grey-area
What an interesting article, thanks for the writeup.

Are you using Docker for containers?

As a language neophyte I found this far more persuasive than arguments over
emergent behaviour[1], because it talks about actual practice rather than
abstractions (which rarely lead to fruitful discussion). The attraction of Go,
to me, is its commitment to simplicity: at the expense of some power or
terseness, it becomes easy to read, easy to extend, and hard to use to create
monstrous complexity.

It feels like an improved C, rather than building on the traditions of, say, C++.

The package system, import system, composition and no inheritance, and
extensive use of interfaces all encourage simplicity above all.

Composition - it's harder to create huge taxonomies which all depend on a
common root.

Packaging - encourages very simple small components rather than huge
frameworks which are hard or even impossible to understand all at once or
separate into components. Godep[2] looks interesting here, thanks for the
link.

Interfaces - similar to many other languages, but without the boilerplate of
declarations (this sort of decision is where controversies start - what some
see as boilerplate others see as important contracts, same goes for header
files).

It is of course missing a few things which would be nice, but nothing should
stop people trying it out today - if you value simplicity above all, Golang
will probably appeal.

1\.
[https://news.ycombinator.com/item?id=7653164](https://news.ycombinator.com/item?id=7653164)

2\. [https://github.com/tools/godep](https://github.com/tools/godep)

~~~
peferron
Go's simplicity didn't stop me from ending up with a messy, bloated and
monolithic project on my first try, but static typing and good test coverage
make refactoring relatively painless.

~~~
jweir
I reckon there is no language that can prevent a bloated and messy code base.
Avoiding that comes from both understanding the solution and how to write the
solution with the language.

~~~
krakensden
Indeed: languages without bloated and messy codebases are languages without
serious users.

------
dvirsky
Nice article. As for keeping dependencies stable while remaining `go get`
compliant, I opted for a slightly simpler solution. We fork dependencies to
our internal git server (GitLab in this case; sometimes we just keep our
stable fork on GitHub), and go get them from there in the build process.

It's not much different, in that it still requires rewriting the internal
imports of the libraries, but it doesn't require copying of files or working
with git submodules, and provides the same stability.

I think godep certainly looks promising and I have it in my TODO to try it
out, but working with forks for the relatively small codebase we have is
enough for now.

------
heavenlyhash
"Dependency management! Whee!" indeed.

I totally share the author's feelings that fetching master to do deployments
is pretty rough for repeatability. Vendoring fixes the core repeatability
problem, but leaves a lot of other questions. What if I want to use this
library in several projects without essentially forking it for each one? What
if I want to reuse two of _my_ libraries in another project, and they both
vendored the same things? And so on.

I use git submodules for golang source dependencies with reasonable success. I
also asked myself "What if I need this to be backwards compatible with `go
get` semantics?" and found it's hackable: create a dir in your project which
you'll add to your $GOPATH, add git submodules there with fully-qualified dir
names matching their public urls (i.e.
`myproject/.gopath/github.com/whoever/whatever/`), and to tie it off make a
symlink from your own project's fully-qualified dir name back up to your repo
root.

With that symlink, your own project has the same package names regardless of
whether the code is being referenced internally to that project, via the
public go-get'able urls, or via being submoduled in another project. Nobody's
left out.

Example here
[https://github.com/polydawn/pogo/commit/c6fac440c99c00c2db65...](https://github.com/polydawn/pogo/commit/c6fac440c99c00c2db6501140308cd10d0abbbe4)
(though I've since switched to putting the symlinks in a .gopath dir, as that
seems more popular).

Submodules can introduce complexity to a project if inserted into the middle
of a project where abstraction layers aren't finished yet (all the complaints
about merging being totally undefined are true). But when a relatively stable
library API is either already existing or a firm intention, submodules stay
pretty well out of the way in normal development, and can even cross over
into being helpful by drawing attention to moments when you should think twice
about API changes. (Also, it might be worth mentioning that git's CLI for
submodules has gotten radically better in the last few releases.)

It's certainly not the only way to do things, and if you're happy with other
solutions like vendoring, by all means carry on (godep is great too, as the
article mentions; it doesn't vendor, and still gets you a hash for precision
and integrity of your deps)... I just thought it was interesting that it's
possible without any special tools.

~~~
e12e
While I have in general sworn off Java, and am rather enamoured of Clojure's
lein build tool -- I enjoyed the recent post (that was on HN as well):

[http://blog.paralleluniverse.co/2014/05/01/modern-java/](http://blog.paralleluniverse.co/2014/05/01/modern-java/)

It does appear Gradle manages to expose many of the good parts of Ivy/Maven
while keeping build files reasonably sane -- and crucially it appears to offer
a pretty solid story in terms of "predictable future deployment" (can I
restore a service in two years and deploy it on a contemporary system?).

------
girvo
I really like Go, and have been getting into it quite a bit. It's powerful,
quick, and fits in nicely as a "native code" utility language, which I dig.
However, the first thing the OP talks about, $GOPATH, really fricken annoys
me. I know that's sort of silly, but hey.

Are there any good fixes for this at all? What other ways of tackling the dev
env are there for Go?

~~~
rakoo
Use the symlinks, Luke.

My $GOPATH is in a standard place: $HOME/.local/share/go (aka
$XDG_DATA_HOME/go) and all my projects (whatever language it is) live in
$HOME/dev.

I use github for all my projects, so the public path for my go projects would
be $GOPATH/src/github.com/rakoo/petproject. So all I have to do is

    
    
        $ cd $GOPATH/src/github.com/rakoo
        $ ln -s ~/dev/petproject petproject
    

and everything works flawlessly. I get to keep using my ~/dev dir for all my
projects, and the Go environment is well defined and works as expected.

You do have to do this once for each project you start, but that should scale
quite well.

TL;DR: Your projects don't need to _be_ in $GOPATH; they only need to be
_accessible_ from $GOPATH.

~~~
girvo
Ah hah! I wish I could give you more than just an upvote, as that's awesome,
and I really wish I'd thought of it first. I'll give it a try, thanks very
much!

------
voidlogic
Vendoring is the way to go, but it's done in a very awkward way here. I find
it better to just fork the repos you depend on and reference your fork. This
makes it very easy to update them from upstream and eliminates any need for
third-party code in your repo.

Using command-line args is great for programs with small configs, but we
find good old INI and JSON files work better for large, complex configs.

I think the deploy step outlined should be running the unit tests with the
race detector enabled.

~~~
lobster_johnson
Since an import cannot reference a specific tag or commit, even with forking
you're exposed to the situation where a team member may update the fork and
break the project. Keeping dependency information external to the project is a
bad idea.

So with forking, you still have to reference a "vendor" path, as described in
the linked article. You could use Git submodules or Git subtrees, or something
homegrown, all of which add complexity and are not ideal.

------
timtadh
This is a great article. I have also been vendoring my dependencies. I have
recently been bitten by projects that I rely on (config parsers, things of
that nature [and for a couple of my projects, I have too much config to use
command-line flags]) moving or doing silly things. The old code works fine,
but now it is a pain to set up a new dev environment, because I have to clone
from a different repo than the one its import path points to and check out the
appropriate commits.

To solve these problems I have started an experimental project, and I would
love some feedback on it. I am calling it `gopkgr`, and the idea is to have a
tool that lets you make nice tarball packages and install them.[1] Eventually,
we could have something like godoc hosting versioned packages. I want to be
cross-compatible with `go get`, but I also want to support more general
situations.

For instance, right now it is really difficult to write polyglot Go systems.
In my research, I have a system that uses Go, Java and Python to tie things
together. It's super annoying to have such a polyglot system, but I have
different needs, and each ecosystem offers the path of least resistance for
one of them. With the current Go situation, it is not possible to create a
reusable package for the Go code which is usable by other projects without
breaking it out of the repo (which doesn't really make sense in this case).

In any case, I don't want to solve dependency management with gopkgr. I want
to solve packaging dependencies for repeatable builds, without having to worry
about code moving from one source code host to another (e.g. Google Code to
GitHub, or vice versa). So if you have good ideas in this regard, let me know
about them.

[1] [https://github.com/timtadh/gopkgr](https://github.com/timtadh/gopkgr)

~~~
mjibson
How is this different from godep?
[https://github.com/tools/godep](https://github.com/tools/godep)

~~~
timtadh
I aim to do _packaging_ (putting things in tarballs). Godep is trying to solve
the harder problem of creating metadata and systems to say what versions of
libraries to (transitively) depend on. I actually want to use the Godep
stuff to solve the dependency-management problem for the packaging portion.

You can think of what I am trying to make as a language-specific version of
*.deb or *.rpm packages. Combined with proper dependency management, hosting,
code signing and verification, we could build a language-specific apt-get.
This is what Python's pip is like. There are a lot of tricky problems between
where Go is today and apt-get, especially if we want to be resistant to
things like evilgrade.[1]

[1] [http://www.infobyte.com.ar/down/isr-evilgrade-Readme.txt](http://www.infobyte.com.ar/down/isr-evilgrade-Readme.txt)

------
why-el
Nice article. For those of you working on a SOA project: can I get some
feedback on how people manage data sources? For instance, I am in the process
of breaking up a monolithic application into small services, but the problem
is that most of these services rely on the same data (for instance, a search
service and an autocomplete service will work with the same data). What are
your strategies for dealing with this? I mean, is it simply a matter of
relying on the same database? How about models built on top of this database,
and their tests?

~~~
iamartnez
Nothing wrong with two services using the same data source. If data locality
is a concern, deploy both services alongside one another.

Or you could place the data behind an API. It would speak HTTP/Thrift/whatever
and would be solely responsible for handling the data. Front the API with a
cache if latency becomes an issue.

For your specific example, consider pushing data to autocomplete and search
instead of having them pull.

~~~
why-el
Interesting. Are you aware of any high profile examples of this architecture?

For my own case, search and autocomplete use the same modeled data, but
autocomplete, for instance, runs lots of precalculations on this data, so
simply putting the data behind an API is not really an option.

------
allochthon
_That means tabs for indentation, and spaces for alignment._

Not a go user (yet), but this one caught my attention as being a little ugly.
I'm a fan of only using spaces.

~~~
NateDad
Tabs are just customizable whitespace. If you like 6 space indents and I like
2 space, we can both be happy if it's tabs, and just configure our editors the
way we like. With gofmt, there's no danger of mixing spaces and tabs (not that
it's actually dangerous, since whitespace isn't important in go).

------
sergiotapia
Excellent article, thanks for sharing! I wish I could attend Gophercon but I'm
way down here in South America.

What book would you recommend for someone who wants to learn about Go
practices and professional tips? I'm not interested in a reference book that
teaches me how to loop, but a book that teaches me applicable knowledge on
somewhat more advanced subjects.

I'm aiming to make Go my primary workhorse language.

------
sudhirj
Why the advice against `make` and `new`? Is it only because make sometimes
requires you to know sizes in advance?

~~~
tptacek
It's less noisy just to take the address of an automatic variable, like
`foo := &Bar{}`.

