
I have some biased doubts (I come from the JVM world) about the need for really fast compiling, which is often cited as the reason Go does things the way it does.

Is binary dependency management just never an option?

I have a friend who works for Google, and supposedly they have proprietary build infrastructure that offloads the building of C++ code onto a cluster. I sort of wish Google open sourced that, as I believe it basically does some form of binary dependency management.

Yes, I know Go, like C++, can target lots of platforms (and thus needs many binaries built), but an organization only needs the major ones. And yes, I know the language statically compiles to a single binary, but that doesn't mean things can't be precompiled.

Go these days seems to be used mainly for microservices or small utilities, and you don't need (and shouldn't have) a gigantic code base for such things. I can understand monolithic UI apps having large code bases, but that is clearly not what Go is being used for these days.

There are many other languages that compile to native code and seem to compile fairly fast (OCaml and Rust, in my experience, though I don't have huge code bases in them).

Is compilation speed really an issue given the use cases for Go?




>Is compilation speed really an issue given the use cases for Go?

Yes, I find compilation speed to be one of the most important things. But it is not a selling point for Go. The Go compiler is not very fast, and speed is not an acceptable excuse for the lack of parametric polymorphism. OCaml has not just parametric polymorphism but many other basic type-system features, and yet ocamlopt is both 5-10 times faster than the Go compiler and still produces faster binaries.
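
For a concrete sense of what parametric polymorphism buys you, here is a minimal OCaml sketch (illustrative only, using nothing beyond the standard library): one type-checked definition of map that works for any element type, with no casts and no per-type copies.

    (* One polymorphic definition: ('a -> 'b) -> 'a list -> 'b list *)
    let rec map f = function
      | [] -> []
      | x :: rest -> f x :: map f rest

    (* The same map is reused at different types, all checked at compile time. *)
    let doubled = map (fun x -> x * 2) [1; 2; 3]              (* int list *)
    let shouted = map String.uppercase_ascii ["go"; "ocaml"]  (* string list *)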


That is what I'm saying (I think we agree). It seems like a focus of Go's simplicity is to improve compilation speed, and yet there are languages like OCaml that do have generics (and a whole lot more) and seem to compile faster.


The Google build system was open sourced: http://bazel.io/


Not the distributed secret sauce, though.

> Does Bazel require a build cluster? Google's in-house flavor of Bazel does use build clusters, so Bazel does have hooks in the code base to plug in a remote build cache or a remote execution system. The open source Bazel code runs build operations locally. We believe that this is fast enough for most of our users.


Unless you have very advanced infrastructure sitting around, their version of the distributed pieces is going to be worthless. Same reason TensorFlow was initially released without it.



