
Yes, this is a great leap forward in my opinion. I had to do a project at a previous job where I wrote an agent that ran on x86, MIPS and ARM, and doing it in Go was a no-brainer. The other teams who had a bunch of C code that was a nightmare to cross-compile were so jealous they eventually moved a lot of things to Go.

I've been doing this for 35 years, and cross-compiling anything nontrivial was always a toolchain nightmare. Discovering a world where all I had to do was set GOARCH=mips64 (and possibly GOOS=darwin if I wanted Mac binaries) before invoking the compiler was so magical that I was extremely skeptical when I first read about it.
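
For anyone who hasn't tried it, the whole "cross toolchain" really is just a couple of environment variables in front of the normal build command. For example (the output names here are only illustrative):

    # build a MIPS64 Linux binary from any host
    GOOS=linux GOARCH=mips64 go build -o agent-mips64 .

    # or an Apple Silicon macOS binary
    GOOS=darwin GOARCH=arm64 go build -o agent-darwin-arm64 .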




As long as you don't have C libraries to cross compile / link against of course ;)


sqlite is the only thing that makes me sad I have CGO_ENABLED=0.


This non-cgo port might help; I started using it recently and it's fine, but I'm not exactly a demanding user: https://pkg.go.dev/modernc.org/sqlite
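
For anyone curious what the swap looks like: it's a regular database/sql driver, so it's mostly just the import line. A minimal sketch (the file name is only an example):

    package main

    import (
        "database/sql"
        "fmt"
        "log"

        _ "modernc.org/sqlite" // pure-Go driver, registers itself as "sqlite"
    )

    func main() {
        db, err := sql.Open("sqlite", "example.db") // file name is illustrative
        if err != nil {
            log.Fatal(err)
        }
        defer db.Close()

        var version string
        if err := db.QueryRow(`SELECT sqlite_version()`).Scan(&version); err != nil {
            log.Fatal(err)
        }
        fmt.Println("sqlite version:", version)
    }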


It's still pretty slow, but overall correct. There are tricks, like using a pool of reader connections and a single writer connection to reduce contention. There was a blog post on here detailing some speedups in general.
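
The single-writer trick is easy to set up with two database/sql pools over the same file. A rough sketch of what I mean (the WAL pragma and pool sizes are just common choices, not gospel):

    package store

    import (
        "database/sql"
        "runtime"

        _ "modernc.org/sqlite" // pure-Go driver, registers itself as "sqlite"
    )

    // Open returns two pools over the same database file: a single-connection
    // pool that serializes all writes, and a wider pool for reads.
    func Open(path string) (writer, reader *sql.DB, err error) {
        writer, err = sql.Open("sqlite", path)
        if err != nil {
            return nil, nil, err
        }
        writer.SetMaxOpenConns(1) // one writer connection avoids lock contention

        // WAL lets readers keep going while a write is in progress.
        if _, err = writer.Exec(`PRAGMA journal_mode=WAL`); err != nil {
            writer.Close()
            return nil, nil, err
        }

        reader, err = sql.Open("sqlite", path)
        if err != nil {
            writer.Close()
            return nil, nil, err
        }
        reader.SetMaxOpenConns(runtime.NumCPU()) // reads can fan out
        return writer, reader, nil
    }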


Use the new sqlite wasm wrapper that works on any platform


A fair enough assessment, it be that way; however, I will note that a large reason C exists in the first place was to have a machine-independent language to write programs in.


> however I will note that a large reason that C exists in the first place was to have a machine independent language to write programs in.

That's fair, but what we call a monstrosity by modern standards is still much simpler than porting the assembly.

There were cross-platform languages before C, but they never really took off for systems development the way C did (OSes, for example, were commonly written in pure assembly).


A side effect of C not having a price tag attached: anyone with UNIX source tapes got a C compiler for free. That lasted until commercial UNIX became a thing and split into user/developer SKUs, at which point GCC, largely ignored until then, became worth supporting.


For the UNIX authors, that was already possible a decade before.


Cross-platform languages have existed for decades, no?


mips64!? That's a blast from the past. It must be some kind of legacy hw that's getting current software updates for a really niche use case. Or academia. :)

Like previous you, I have to admit I'm skeptical but would be happy to be wrong.


> mips64 .. must be some kind of legacy hw that's getting current software updates

Hundreds of thousands of Linux-based SmartNIC cards, actually. Fun stuff. Those particular ones were EOL'd and have been replaced with ARM, but the MIPS-based ones will live on in the datacenters until they die, I'm sure.

> Like previous you, I have to admit I'm skeptical but would be happy to be wrong

Seriously, you are going to be delighted to be wrong. On your Linux machine, write a Go program and run "GOOS=darwin GOARCH=arm64 go build ..." and you will have yourself an ARM Mac binary. Or, going the other way, use GOOS=linux GOARCH=amd64. It really is that simple.


It gets even more amazing than that. Look at this bit from my GitHub Actions: https://github.com/ncruces/go-sqlite3/blob/fefee692dbfad39f6...

I install QEMU (I have the same setup locally), then it's one line each to run unit tests for: Linux 386, arm64, riscv64, ppc64le and s390x.

With QEMU installed, all you have to do is:

    GOARCH=bla go test ./...


Wait, is go test automatically running it under QEMU, or what's going on here?


Ah, I found this: https://ctrl-c.us/posts/test-goarch. I guess it's qemu-user-binfmt registering the alternate binary formats so they automatically run under QEMU, which is pretty neat.


Yep.

The Go build system runs under your current architecture, cross-compiling tests to your target architecture.

Then the Go test runner also runs under your current architecture, orchestrating the execution of your cross-compiled test binaries.

Since you registered to run cross-compiled binaries under QEMU, those test binaries magically run through QEMU.

The Go test runner collects test results, and reports back to you.

The first run might be slowish, as the Go compiler needs to cross-compile the standard library and all your dependencies for your target platform. But once that's done and cached, and if your tests are fast, the edit-test cycle becomes pretty quick.
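
To spell out the one-time setup (the qemu-user-binfmt package name mentioned above is the Debian/Ubuntu one; other distros will differ):

    # one-time: install user-mode QEMU and register its binfmt handlers
    sudo apt install qemu-user-binfmt

    # after that, foreign-architecture tests are just an env var away
    GOARCH=riscv64 go test ./...
    GOARCH=s390x go test ./...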


MIPS64 is still very much alive: https://store.ui.com/us/en/products/er-4

"EdgeOS" is based on Linux, and people run vanilla Linux distributions on those boxes, as well as OpenBSD and NetBSD.

I wonder how long Marvell will continue selling those Octeon MIPS64 chips, though. Marvell (then Cavium) switched to ARM nearly a decade ago (2016) for newer chips in the Octeon series. I think Loongson sells more modern MIPS64 (or at least MIPS64-like) chips, but they don't seem to be commercially available outside China.


They're not widely available in the West, but you can get a whole LoongArch workstation on AliExpress:

https://www.aliexpress.us/w/wholesale-loongson-3a6000.html



