This setup is not optimal. If you use Kubernetes/GKE, the best choice is to rely on GKE logging. You just log to stdout, and GKE takes care of attaching relevant metadata (e.g. which instance the log came from) and transporting logs reliably and efficiently (it relies on the fluentd agent that GKE provides). You don't need to do any setup beyond logging to stdout in JSON format. In my Python code I just use python-json-logger.
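A minimal sketch of that stdout-JSON setup with python-json-logger (the handler wiring and the extra fields are my own illustration, not anything GKE requires):

```python
import logging
import sys

from pythonjsonlogger import jsonlogger

# Emit JSON-formatted records to stdout so the GKE-provided fluentd agent
# can pick them up and attach cluster/instance metadata.
handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(jsonlogger.JsonFormatter())

logger = logging.getLogger()
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# Values passed via `extra` become top-level keys in the JSON record.
logger.info("request handled", extra={"path": "/healthz", "status": 200})
```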
If you don't want to rely on GKE, fluentd is a better choice than sending logs via HTTP calls:
- An open-source log collector that you control and that lives on your machines.
- You can export fluentd logs to Google Cloud Stackdriver.
- There are already well-maintained client logging libraries that talk to fluentd (see the sketch after this list).
- Better efficiency and reliability than HTTP calls.
- No lock-in to any logging platform, e.g. you can move from Stackdriver to a competitor with a fluentd config change.
- It handles more than just application logs, e.g. your nginx/Apache logs.
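To illustrate the client-library point, here is a minimal sketch using the fluent-logger Python package; the tag, host, and record fields are placeholders, and it assumes a local fluentd agent listening on the default forward port 24224:

```python
from fluent import sender

# Assumes a fluentd agent is reachable on localhost:24224 (fluentd's default
# forward port); the "app" tag is a placeholder.
logger = sender.FluentSender("app", host="localhost", port=24224)

# Events are sent as structured records over fluentd's forward protocol
# rather than as one-off HTTP calls.
if not logger.emit("request", {"path": "/healthz", "status": 200}):
    print(logger.last_error)

logger.close()
```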
Regarding Stackdriver: the article does not mention that you can export logs from Stackdriver to BigQuery. I find it very useful. BigQuery's internal version, Dremel, was created for that very purpose: log analysis.
Disclaimer: I'm not affiliated with fluentd or GCP, although I use both and I used to work at Google (primarily search, no ties to GCP).
You have to do a bit more yourself in BigQuery (define a schema, and do a schema update when you log something new). I implemented logging via App Engine requests that save the data to BigQuery. Not sure if that's better or worse than using the Stackdriver service, but it's working.
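A rough sketch of what that kind of direct-to-BigQuery logging can look like with the google-cloud-bigquery client (not the commenter's actual implementation; the table name and fields are placeholders, and the table's schema must already define these columns):

```python
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my-project.logs.app_events"  # hypothetical dataset/table

rows = [{
    "timestamp": "2016-10-18T12:00:00Z",
    "severity": "INFO",
    "message": "request handled",
    "path": "/healthz",
}]

# Streaming insert; logging a new field means updating the table schema first,
# which is the extra work mentioned above.
errors = client.insert_rows_json(table_id, rows)
if errors:
    print("BigQuery insert failed:", errors)
```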
As mentioned in the comments of the article, the auto-generated Go package the author used is not the only Google package. There's also https://godoc.org/cloud.google.com/go/preview/logging, though it's still experimental.