Hacker News
How to use Google Cloud’s free logging service with Go (bugreplay.com)
29 points by edibleEnergy on Sept 8, 2016 | 5 comments



I had a similar setup at one point, though in Python. I wrote a Python logger that talks to the Stackdriver HTTP API, although I don't use it anymore: https://github.com/understandwork/stackdriver_python_logger .

This setup is not optimal. If you use Kubernetes/GKE, the best choice is relying on GKE logging. You just log to stdout, and GKE takes care of attaching relevant metadata (e.g. which instance the log came from) and transporting logs reliably and efficiently (it relies on a fluentd agent provided by GKE). You don't need any setup beyond logging to stdout in JSON format. In my Python code I just use python-json-logger.
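The stdout-JSON approach described above can be sketched with nothing but the standard library (python-json-logger does roughly this; the exact field names below are assumptions for illustration, not documented GKE requirements):

```python
import json
import logging
import sys


class JSONFormatter(logging.Formatter):
    """Render each log record as one JSON object per line, which a
    log collector reading stdout can parse field-by-field."""

    def format(self, record):
        payload = {
            "severity": record.levelname,
            "message": record.getMessage(),
            "logger": record.name,
            "timestamp": self.formatTime(record),
        }
        return json.dumps(payload)


# Wire the formatter to a stdout handler -- the only "setup" needed.
handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(JSONFormatter())
logger = logging.getLogger("app")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("request handled")
```

Each call then emits a single JSON line on stdout, and the agent attaches the instance metadata on its side.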

If you don't want to rely on GKE, fluentd is a better choice than sending HTTP calls with your logs:

  - An open-source log collector you control; it lives on your machines.
  - You can export fluentd logs to Google Cloud Stackdriver.
  - There are already well-maintained client logging libraries that talk to fluentd.
  - Better efficiency and reliability than HTTP calls.
  - No lock-in to any logging platform, e.g. move from Stackdriver to a competitor with a fluentd config change.
  - Handles more than just application logs, e.g. your nginx/Apache logs.
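A minimal fluentd config illustrating the points above might look like this (assuming the fluent-plugin-google-cloud output plugin is installed; ports and match patterns are illustrative):

```
# Accept logs from local apps over fluentd's forward protocol.
<source>
  @type forward
  port 24224
</source>

# Ship everything to Stackdriver. This <match> block is the
# "no lock-in" point: switching @type moves you to another backend.
<match **>
  @type google_cloud
</match>
```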

Regarding Stackdriver: the article does not mention that you can export logs from Stackdriver to BigQuery. I find it very useful. BigQuery's internal version, Dremel, was created for that very purpose: log analysis.
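The Stackdriver-to-BigQuery export is configured as a log sink; a sketch with gcloud (the sink name, project, dataset, and filter are placeholders):

```
# Create a sink that streams matching log entries into a BigQuery dataset.
# The writer identity the sink is given must then be granted edit access
# to the my_logs dataset.
gcloud logging sinks create my-sink \
  bigquery.googleapis.com/projects/my-project/datasets/my_logs \
  --log-filter='resource.type="container"'
```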

Disclaimer: I'm not affiliated with fluentd or GCP, although I use both and I used to work at Google (primarily search, no ties to GCP).


Oh nice, yeah, that's one of the things I read about in the docs but haven't tried yet. I haven't really used BigQuery much yet.


You have to do a bit more yourself in BigQuery (define a schema, and do a schema update whenever you log something new). I implemented logging via App Engine requests that save the data to BigQuery. Not sure if that's better or worse than using the Stackdriver service, but it's working.
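The "schema update when you log something new" step amounts to diffing a new record against the known columns; a minimal sketch (the function names and the string-fallback type mapping are my own, not an App Engine or BigQuery API):

```python
def bq_type(value):
    """Map a Python value to a BigQuery column type.
    Anything unrecognized falls back to STRING (an assumption)."""
    if isinstance(value, bool):
        return "BOOLEAN"
    if isinstance(value, int):
        return "INTEGER"
    if isinstance(value, float):
        return "FLOAT"
    return "STRING"


def schema_additions(existing_columns, record):
    """Columns that would have to be appended to the table schema
    before inserting `record`, as name/type dicts."""
    return [
        {"name": key, "type": bq_type(value)}
        for key, value in sorted(record.items())
        if key not in existing_columns
    ]
```

Before each insert you would run `schema_additions` against the table's current schema and issue a schema patch for any new columns (BigQuery allows appending columns to an existing table).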


You can now log to Stackdriver and export to BigQuery, with no need to handle schema updates yourself.


As mentioned in the comments on the article, the auto-generated Go package the author used is not the only Google package. There's also https://godoc.org/cloud.google.com/go/preview/logging, though it's still experimental.



