
How to use Google Cloud’s free logging service with Go - edibleEnergy
http://blog.bugreplay.com/post/150086459149/how-to-use-google-clouds-free-structured-logging
======
kozikow
I had a similar setup at one point, although in Python. At some point I wrote
a Python logger that talks to the Stackdriver HTTP API, although I don't use it now:
[https://github.com/understandwork/stackdriver_python_logger](https://github.com/understandwork/stackdriver_python_logger).

This setup is not optimal. If you use Kubernetes/GKE, the best choice is to
rely on GKE logging. You just log to stdout, and GKE takes care of attaching
relevant metadata (e.g. which instance a log line came from) and transporting
logs reliably and efficiently (it relies on the fluentd agent that GKE provides).
You don't need any setup beyond logging to stdout in JSON format. In my
Python code I just use python-json-logger.
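Purely as a sketch of that approach (this hand-rolled formatter is a stand-in for python-json-logger's `JsonFormatter`, not its actual implementation), one-JSON-object-per-line logging to stdout looks roughly like:

```python
import json
import logging
import sys

class JsonFormatter(logging.Formatter):
    """Minimal stand-in for python-json-logger's JsonFormatter:
    render each record as a single JSON object on one line."""
    def format(self, record):
        payload = {
            # "severity" is the field name Stackdriver maps to log level
            "severity": record.levelname,
            "message": record.getMessage(),
            "logger": record.name,
        }
        return json.dumps(payload)

handler = logging.StreamHandler(sys.stdout)  # GKE's fluentd agent tails stdout
handler.setFormatter(JsonFormatter())

logger = logging.getLogger("app")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("user signed in")
# -> {"severity": "INFO", "message": "user signed in", "logger": "app"}
```

From there the GKE agent picks the line up and attaches the cluster/pod metadata itself; the application never talks to any logging API directly.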

If you don't want to rely on GKE, fluentd is a better choice than sending
logs via HTTP calls:

  - It's an open-source log collector that you control; it lives on your machines.
  - You can export fluentd logs to Google Cloud Stackdriver.
  - Well-maintained client logging libraries that talk to fluentd already exist.
  - It's more efficient and reliable than HTTP calls.
  - There's no lock-in to any logging platform; e.g. you can move from Stackdriver to a competitor with a fluentd config change.
  - It handles more than just application logs, e.g. your nginx/Apache logs.

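For illustration, a minimal fluentd configuration along those lines might look like this (the tag and port are arbitrary choices; the `google_cloud` output type comes from the fluent-plugin-google-cloud gem):

```
<source>
  @type forward        # accept logs from in-process fluent-logger clients
  port 24224
</source>

<match app.**>
  @type google_cloud   # ship matched logs to Stackdriver
</match>
```

Swapping Stackdriver for a competing backend would mean changing only the `<match>` block, which is the no-lock-in point above.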
Regarding Stackdriver: the article does not mention that you can export logs
from Stackdriver to BigQuery. I find it very useful. BigQuery's internal
counterpart, Dremel, was created for that very purpose: log analysis.
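For reference, that export is configured as a log sink; a sketch with placeholder project, dataset, and sink names (using current gcloud syntax, which postdates parts of this thread) would be roughly:

```
gcloud logging sinks create my-bq-sink \
  bigquery.googleapis.com/projects/PROJECT_ID/datasets/my_logs_dataset \
  --log-filter='resource.type="gke_container"'
```

Once the sink exists, matching log entries stream into BigQuery tables in that dataset without any application-side changes.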

Disclaimer: I'm not affiliated with fluentd or GCP, although I use both, and I
used to work at Google (primarily on search; no ties to GCP).

~~~
ediblenergy
Oh nice, yeah, that's one of the things I read about in the docs but haven't
tried yet. I haven't really used BigQuery much yet.

~~~
skybrian
You have to do a bit more yourself in BigQuery (define a schema and run a
schema update whenever you log something new). I implemented logging via App
Engine requests that save the data to BigQuery. Not sure if that's better or
worse than using the Stackdriver service, but it's working.
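To make the "define a schema" step concrete: here's a hypothetical log-table schema in the JSON form that `bq mk --schema` accepts (field names are illustrative); each newly logged field would need a corresponding schema update before rows carrying it can be inserted:

```json
[
  {"name": "timestamp", "type": "TIMESTAMP", "mode": "REQUIRED"},
  {"name": "severity",  "type": "STRING",    "mode": "NULLABLE"},
  {"name": "message",   "type": "STRING",    "mode": "NULLABLE"}
]
```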

~~~
kozikow
You can now log to Stackdriver and export to BigQuery. No need to handle
schema updates.

------
breakingcups
As mentioned in the comments on the article, the auto-generated Go package the
author used is not the only Google package. There's also
[https://godoc.org/cloud.google.com/go/preview/logging](https://godoc.org/cloud.google.com/go/preview/logging),
although it's still experimental.

