"Also, our entire logging and analysis infrastructure is being migrated to Go." This intrigues me....

Go is great for this kind of stuff. Spark is great when you have a lot of custom queries, but if you only have a few fixed queries, writing a simple solution in Go can yield huge gains.

I worked on implementing a web service + simple map reduce (with network transparency) + a time series DB (including the storage engine, NOT built on LevelDB like InfluxDB), all in Go. The result was many times faster than a Hadoop cluster an order of magnitude larger, BigQuery, etc., for the limited set of operations it supports.
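Not the code from that system, but for anyone curious what the smallest version of "rolling your own" map/reduce in Go looks like, here is a rough in-process sketch. It omits the network transparency mentioned above, and the names (mapFn, reduceFn, kv) are made up for illustration:

    // Minimal in-process map/reduce sketch in Go. Illustrative only.
    package main

    import (
        "fmt"
        "sync"
    )

    // kv is one key/value pair emitted by the map phase.
    type kv struct {
        key string
        val int
    }

    // mapFn turns one input record into zero or more key/value pairs.
    // Here it just counts occurrences of each record.
    func mapFn(record string) []kv {
        return []kv{{key: record, val: 1}}
    }

    // reduceFn folds all values for a key into one result.
    func reduceFn(vals []int) int {
        sum := 0
        for _, v := range vals {
            sum += v
        }
        return sum
    }

    func main() {
        records := []string{"GET /a", "GET /b", "GET /a"}

        // Map phase: run mapFn on each record in its own goroutine.
        pairs := make(chan kv)
        var wg sync.WaitGroup
        for _, r := range records {
            wg.Add(1)
            go func(r string) {
                defer wg.Done()
                for _, p := range mapFn(r) {
                    pairs <- p
                }
            }(r)
        }
        go func() { wg.Wait(); close(pairs) }()

        // Shuffle: group values by key.
        grouped := map[string][]int{}
        for p := range pairs {
            grouped[p.key] = append(grouped[p.key], p.val)
        }

        // Reduce phase: one reduceFn call per key.
        for k, vals := range grouped {
            fmt.Println(k, reduceFn(vals))
        }
    }

The real thing would shard records across machines and move the shuffle over the network, but the shape (map, group by key, reduce) is the same.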

Did you use a framework to run MapReduce with Go, or did you roll your own?


We wrote our own; Go 1.0 had just come out.
