> Performance isn't terrible, but it isn't fantastic either (70-100 MIPS native, 10-30 MIPS browser)
Those screenshots must've taken some extreme patience. Especially that Windows 10 one. 70-100 MIPS is roughly a fast 486. It sounds like this isn't even a JIT, unlike QEMU's emulation mode, and the latter is already quite noticeably slower when running modern OSs on modern hardware.
If I recall correctly, QEMU did portable JIT by implementing the instructions in C and then copying their compiled versions out of the emulator binary instead of calling them as functions.
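The core mechanism is simpler than it sounds: compiled code fragments get copied into an executable buffer and jumped to. Here's a minimal sketch of just that mechanism, assuming x86-64 Linux, with the template bytes hand-written for brevity (dyngen instead extracted them from the compiled "op" functions in the emulator's object files):

    /* Minimal sketch of the "copy compiled code and run it" idea behind
     * dyngen-style JITs. Assumes x86-64 Linux. The template bytes are
     * hand-written here; dyngen lifted them out of compiled C op
     * functions in the emulator binary's object files. */
    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>

    int main(void) {
        /* Stand-in for a compiled op: mov eax, 42 ; ret */
        static const unsigned char op_template[] =
            { 0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3 };

        /* Map writable, copy, then flip to executable (W^X friendly). */
        void *buf = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                         MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (buf == MAP_FAILED) return 1;
        memcpy(buf, op_template, sizeof op_template);
        if (mprotect(buf, 4096, PROT_READ | PROT_EXEC) != 0) return 1;

        /* Object-to-function pointer casts work on POSIX platforms. */
        int (*op)(void) = (int (*)(void))buf;
        printf("%d\n", op()); /* prints 42 */
        return 0;
    }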
Right, but it kinda depends on what you mean by "portable". Only an interpreter will work without modification on a new host arch, but a JIT organized in such a way that adding support for a new host arch is easy might also be called "portable".
The idea that an x86 emulator, written in C99, would require node to build is a smell if I’ve ever smelled one.
If a dev can’t be bothered to use standard tooling, can he be bothered to write safe and performant C? Is this kind of “cleverness” running throughout his code? Maybe the build process is “too complicated” for standard tools, which is a problem for sure.
This is more of a question for the next developer who tries to pull something like this; I don't really want the OP to be discouraged from continuing their work. Just, maybe next time, don't force the square peg of JavaScript into a round hole.
Makefile.js here is actually Makefile plus configure, which makes more sense. The very need for configure is why a pure C project can be annoying to build, especially when you have a complex or auto-generated dependency or have to support multiple platforms.
I've read the code and I know that it reimplements a substantial portion of Make. The point, however, is that this project wouldn't work with a single Makefile either. Traditional projects typically use configure plus a (generated) Makefile, where configure is a shell script usually built from autoconf, which is an abomination. To avoid autoconf, some projects replace configure with other languages like JS. This project went further by merging that JS configure with the Makefile. All of this hassle would be pointless unless you consider one important platform: Windows, where you don't natively have Make anyway.
You may still argue that this project requires GCC and thus there should be Make even on Windows. That's reasonable, but right now I have GCC and Clang but no Make in my Windows console. If you can remove a hard dependency on Make at a reasonable cost (and I don't claim that's the case with this project; I would have used Ninja instead [1]), it's pretty worthwhile in my opinion.
> All of this hassle would be pointless unless you consider one important platform: Windows, where you don't natively have Make anyway. [...] You may still argue that this project requires GCC and thus there should be Make even on Windows.
I think that is a very good argument though: Windows doesn't have MSVC or any other compiler out of the box either; it is up to you to install the requirements, so having Make as a requirement - especially considering how standard it is - is perfectly fine. Using something like MSYS2 gives you a Linux-like shell with all the tools you'd expect there.
There is no real reason to require Node aside from the author wanting to play around with it; the functionality that tool provides can be provided by other existing tools (Autotools, premake, CMake, meson/muon, etc.) or even GNU Make itself (using its extensions, which won't be compatible with other "makes" though).
(also while Autotools can feel a bit arcane, they have one feature that other similar tools do not have: you do not need to have Autotools installed to build the project, only some shell and some make, both being very standard in pretty much any unix-like OS and for Windows, well, see the first paragraph :-P)
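(For what it's worth, the configure-ish work the Node script does can often be approximated with GNU Make extensions alone. A sketch, with all variable and file names invented rather than taken from Halfix:)

    # Configure-style detection in pure GNU Make (GNU-only features:
    # $(shell), ifeq, +=, wildcard). Names are illustrative.
    CC      ?= gcc
    UNAME_S := $(shell uname -s 2>/dev/null || echo Windows)

    ifeq ($(UNAME_S),Windows)
        EXE := emu.exe
    else
        EXE := emu
    endif

    # Feature probe: add -flto only if the compiler accepts it.
    ifeq ($(shell $(CC) -flto -E -x c /dev/null >/dev/null 2>&1 && echo y),y)
        CFLAGS += -flto
    endif

    $(EXE): $(wildcard src/*.c)
    	$(CC) $(CFLAGS) -o $@ $^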
Your notion of "standard" build environment with Make and POSIXish shell doesn't apply to MSVC. Well you do have nmake, but it is substantially different from Make anyway.
This is why I mentioned "for Windows, well, see the first paragraph", where I wrote "Using something like MSYS2 gives you a Linux-like shell with all the tools you'd expect there".
CMake is confusing as all get out. Make follows the Unix philosophy of doing just one thing - calling other programs. It comes with almost all toolchains you'd care about. You can pick up the basics of a makefile in an afternoon… I still don't know where CMake functions come from, how you would find out about them, or how to write your own. Let alone which version is the right one.
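Those afternoon-learnable basics really are small; a complete toy Makefile (file names invented) is just targets, prerequisites, tab-indented recipes, and variables:

    # target: prerequisites, then a tab-indented recipe below.
    CC     ?= cc
    CFLAGS ?= -O2 -Wall

    emu: cpu.o main.o
    	$(CC) $(CFLAGS) -o $@ $^    # $@ = target, $^ = all prerequisites

    %.o: %.c                        # pattern rule: any .o from its .c
    	$(CC) $(CFLAGS) -c -o $@ $<  # $< = first prerequisite

    clean:
    	rm -f emu *.o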
It’s like jsonnet to me - if you need config files that make config files, something has gone horribly wrong.
Then again, I stopped bothering trying to learn CMake. If I see CMake I don’t bother with the project, so there’s that - and I'm also not a fan of C++, so I am probably the wrong audience anyway.
What do you mean it's comparable? Does it support multiple compilers? CFLAGS? Out-of-source builds? "compile_commands.json" generation? Automatic header dependency tracking?
Also I've actually checked that Makefile, and it errored out (macOS):
It seems that in order to use the default rules, the target needs to have the same name as the source file, e.g.
    file: file.cpp
so specifying the "exe" filename as the output seems to be already a custom rule, so it doesn't seem to work without manually calling the selected compiler.
Another issue is that, indeed, CC+CFLAGS are supported by the default (implicit) rules, but as soon as the user writes a custom rule (which is 99% of the time), that support is lost unless you explicitly add it back.
The third issue is that implicit rules can differ between implementations of make, i.e. BSD make can have different rules than GNU make. Even if we consider only one implementation, the default rules can differ per platform. The docs state that the user should consult the output of the `make -p` command to see the rules, which gives me the impression that different implicit rules can exist on different Linux distros, depending on how `make` was compiled (not sure what the de facto state is).
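To make that concrete, here is roughly how it plays out with GNU Make's built-in rules (the exact command echoed may differ per implementation and platform, as noted above):

    # With no Makefile at all, GNU Make's implicit rules can already
    # build "file" from "file.cpp", threading CXX/CXXFLAGS through:
    #
    #   $ make file CXXFLAGS=-O2
    #   g++ -O2    file.cpp   -o file
    #
    # Name the output anything else ("exe") and you're writing a custom
    # rule, where the variables must be spelled out by hand again:
    exe: file.cpp
    	$(CXX) $(CXXFLAGS) -o $@ $<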
Trivial examples look nice enough in either system, big ones look bad in make and utterly fucked in cmake. Some of this is that people keep trying to write things in cmake scripts that should be external programs in a workable language.
To be honest, I'm not sure what your point is. Does the number of lines increase when the project complexity increases? Yes. Is CMake harder to read as the build system grows in complexity? Yes. The point of a minimal example when reasoning about CMake is to show that you get much more for free than in Make, not that it always stays small.
Putting an equal sign between CMake and Make, because the complexity increases in both, is pretty ignorant IMO. It's like saying "A car is no better than a bicycle, because both have problems with tires when driving long enough".
You have pasted a link to a complicated build system, which would be impossible to write and maintain in hand-written Make. It would require Autotools. But then Windows support would be dead. And many of the files in the directory you pasted exist to interface with hard-to-interface Autotools-based build systems.
You can check out libarchive, which supports both Autotools and CMake: https://github.com/libarchive . I'm not saying it uses both build systems in the best possible way, but I'd pick the CMake build over Autotools any day. What do I get for free?
- I can open the project in my favorite editor and I have clangd working automatically,
- I can open it in Visual Studio, CLion, Xcode, Eclipse, or whatever,
- I can create multiple configurations (release, debug, asan, clang, gcc, vs2019, vs2022), compile them all and run tests for all configs at the same time after each change,
- When changing the build system itself, I don't need to think in layers: first the M4 layer, then the bash layer (or was it sh?), then the /bin layer -- "is my option passed to "cp" accepted on OpenBSD or is it a GNU extension? I better quickly install an OpenBSD VM and check it."
By the way, implementing the logic in CMake instead of an external scripting language may be done to reduce the dependencies of the build system itself. I mean, why should I have to install Tcl in order to compile my Gtk Bitcoin-monitoring app? Also, the argument that it "should be implemented in a scripting language" is an opinion, one that is heavily debated when discussing whether a build system's language should be Turing-complete or not. There is no consensus in that discussion.
Last point: I'm not advocating the use of CMake the tool, I'm advocating the use of CMake the model. Meson uses a similar model, and if the world switched to Meson instead of CMake, I would still be happy.
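(The minimal CMakeLists.txt referred to below isn't quoted in the thread, but it is presumably something along these lines - project and file names invented:)

    # Minimal CMakeLists.txt. Even this gets you out-of-source builds,
    # generator/compiler choice, IDE projects, and compile_commands.json
    # (with -DCMAKE_EXPORT_COMPILE_COMMANDS=ON).
    cmake_minimum_required(VERSION 3.16)
    project(emu C)
    add_executable(emu src/main.c src/cpu.c)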
There are very few projects for which the above CMakeLists.txt would be enough. Once your project reaches a certain size you will inevitably end up in a part of CMake that is confusing.
Well, the first argument is that it actually IS enough for lots of small tools, and the second argument is that if it's not enough and must be extended, it still requires much less complication than an equivalent Makefile.
If you rarely have issues, either someone else dealt with them before you knew it, or your project is very simple, or your project doesn't ever change - or all of the above.
I haven't tried CMake recently but I remember using it in 2012 and it was awful - the documentation was unhelpful and as soon as I tried to do anything not in the examples I was on my own.
I'd love to hear why you think that's a bad thing. IMHO it's a huge quality of life improvement over 2.x. Everything is part of a target (flags, defines, dependencies, etc.) and far, far fewer magic variables that pollute random projects whenever you try to compose them.
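A small illustration of that target-centric style, with made-up target and file names: properties attach to targets and propagate through target_link_libraries instead of leaking through global variables:

    # Modern (3.x) CMake: usage requirements live on the target.
    add_library(emucore src/cpu.c src/mmu.c)
    target_include_directories(emucore PUBLIC include)    # propagates to consumers
    target_compile_definitions(emucore PRIVATE CORE_INTERNAL=1)

    add_executable(emu src/main.c)
    target_link_libraries(emu PRIVATE emucore)  # inherits emucore's PUBLIC includes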
I was going to play with this until I saw that node.js is required for the build, which means I am not going to bother. Use the right tool for the job; if there exists a job for node.js, it certainly isn’t building C projects.
// Special build system used for Halfix builds. Supports the following features:
// - All built object files go in build/objs (no cluttering src/)
// - Build multiple targets at once (i.e. switch between Emscripten and native
// builds without having to clean)
// - Parallel build (same as make -j)
// - Certain files can be conditionally built based on the target
// - Dependency resolution and conditional compilation (note: this is a bit
// buggy)
// - Automatic dependency regeneration (via redep)
// - Simple to use (no configuration, multi-platform, etc.)
// Usage:
// node makefile.js [target] [options]
// Omit "target" for default native build.
// Run with --help for options and target list.
I wonder why HN likes projects written in C so much. C has its uses, but most modern code should be written in Rust, C++, or higher-level languages.
> It's for fun; "should" doesn't apply to projects written for fun.
It shouldn't apply to any project. We shouldn't accept the Rust astroturfers injecting themselves into every thread here. Rust has problems, too, and there are many better choices for many applications.
Sure is a good thing that you found out that this random pet project is written in C before you shipped it to production and filled it with sensitive medical data! I'm sure you're running Redox anyway, so you would have been safe regardless. Close shave!