This is brilliant work, and I think their soft-body model is simple enough that it's easy to understand why it "works":
> The basic principles in short: TPE uses soft body physics, bodies are modelled as spheres connected by springs but the springs can be made stiff so that the bodies behave almost like rigid bodies, so you can simulate (fake) both soft and rigid physics. Environment in which bodies are placed is modelled by distance functions, i.e. you can in theory create any environment as long as you can create a function that for any point in space returns the closest point to the environment (functions for basic and some more complex shapes are included in TPE).
You’ll love this, then: someone using a similar mass-spring approach to model vehicle dynamics in a PS1 game: https://youtu.be/pwbwFdWBkU0
There’s a bunch of similar stuff with varying degrees of physical accuracy under Position Based Dynamics and the more physically correct eXtended Position Based Dynamics (XPBD). It’s a fun approach.
Some years ago I experimented with using springs for game physics[0], but while it looked nice with a single dynamic object against a static environment, collisions between dynamic objects quickly fell apart: objects intersected each other in weird ways, and something as simple as stacking two boxes with imperfect placement (like a player would do, not the perfect stacking shown in the GIFs on the linked page) wasn't possible.
I did consider using convex volumes for the collision mesh and checking the spring ends against them (point-in-convex is trivial to check), but then I realized that in 3D space convex objects can collide even when both have all their vertices outside of each other (consider two cubes, not aligned to each other, touching at two of their edges, with one edge almost perpendicular to the other). The code was already getting too hairy, and at that point I might as well have dropped the whole spring idea and done full convex shape collisions with rotation (a big can of worms).
Though truth be told, the simplicity of using just springs and spheres is alluring, and the issues might be solvable with a large number of spheres (IIRC Nvidia years ago had a physics engine running on the GPU where they modelled everything using small spheres). It also helps avoid a bunch of stupid bugs that you can easily get when trying to do angular physics [1] :-P
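For anyone who wants to see what the spheres-and-springs idea boils down to, here is a minimal 1D toy (my own sketch, not TPE's or the parent's code): two unit masses joined by a stiff damped spring, integrated with semi-implicit Euler. Crank the stiffness and damping up and the pair behaves almost like a rigid rod.

```c
#include <math.h>

typedef struct { float pos, vel; } Mass1D;

/* One time step for two unit masses connected by a damped spring (1D).
   Semi-implicit Euler: velocities are updated first, then positions,
   which is noticeably more stable than explicit Euler for stiff springs. */
void springStep(Mass1D *a, Mass1D *b, float rest, float k, float damp, float dt)
{
    float dx  = b->pos - a->pos;                  /* separation               */
    float dv  = b->vel - a->vel;                  /* separation rate          */
    float dir = (dx >= 0.0f) ? 1.0f : -1.0f;
    float f   = k * (fabsf(dx) - rest) * dir + damp * dv; /* Hooke + damping  */

    a->vel += f * dt;   /* equal and opposite forces on the two masses */
    b->vel -= f * dt;
    a->pos += a->vel * dt;
    b->pos += b->vel * dt;
}
```

A soft body is then just many such masses (spheres) and springs; stiffer springs trade softness for near-rigidity, at the cost of needing smaller time steps.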
Someone pointed out to me that my library made it to HN. Thanks to everyone for taking interest; it brightened my day to see that some of you starred the library, as this was completely unexpected. I greatly appreciate it (trust me, I don't have much other joy in life than seeing someone like my projects).

To be honest, I am actually not as satisfied with TPE as I've been with some of my other projects, mainly because this was really my first 3D physics engine and I lack deep knowledge of this field; here and there I used a few not-so-elegant hacks. But I see TPE as a successful proof of concept telling me that a real expert on physics simulation could make a truly amazing library in this style, and it wouldn't even take as much effort as it took me. I certainly hope this might inspire someone to try to prove that "this can be done better" :) I would definitely love to see more people jump on the KISS train and make a library like this in whatever their field of expertise is (chess engines, machine learning, image processing, ...) -- there is an abundance of mainstream (big/bloated) libraries for basically everything nowadays, but almost no "KISS" alternatives to them, so there are tons of opportunities for making something really nice here.

And yes, I also have controversial opinions that I know look very scary. I wouldn't really like to discuss them here because that could kill the thread and it's hugely offtopic; let me just assure you I firmly believe in nonviolence and peace, and I want to help all humans equally, be it with my programming or otherwise. For any questions I can be reached via email, and I'll be glad to talk about anything. Once again, thank you for your nice feedback.
I absolutely love the approach, the code, and the motivation. Unfortunately, some things I read on the 'about me' page make me worry about the author's future.
The page is somewhat sad. He seems like an individual with a lot of forward-looking and upbeat views on the world, but completely lacking a whole lot of perspective and some basic education regarding economics.
And, well, the pedophilia thing.
He self-describes as non-competitive, so he probably doesn't care, but he could simply be so much more as a person, and all the ingredients are there. Yet he will likely waste away most of his life as (in the best case) a relatively benign weirdo.
But he is likely in a social situation where nobody will give him that perspective and education, and society certainly isn't set up to detect and help these people on its own.
I wonder why all the comments pointing this out are getting flagged. I don’t think it’s anything malicious, maybe an edge case in HN’s moderation system.
Anyway, I agree, excellent code, but I worry about this guy. His website(s) have a lot of worrying content - pro-pedophilia, misogynistic, racist, transphobic.
Kind of: you could render to a buffer and use it as a texture for drawing, like in the old blitter days, and that approach is used by some rendering solutions like OTOY.
What an interesting person. I don't agree with about half of it, but I respect the courage to act on those beliefs. Looks like the software is in the public domain, for example.
I agree with you. I also disagree with much of what they wrote (although I did not count how much), but I believe that anyone should have freedom of speech, whether I agree with it or not.
But most of that is not important for the software, since computer programs can be judged on their own merits rather than on everything else the author does (especially if they are public domain).
Exactly what I thought. Some of what he believes in is maybe a bit excessive, but then again, maybe it's just outside of what we currently believe to be normal.
I respect the guy for speaking up (doxxing himself in the process too!)
Please stop making "header only libraries": they're nontrivial to bind to from non-C languages. You have to basically either split them up yourself or embed a C compiler in your compiler.
See the top part of the header file, where all the function and type declarations are? That's the header. Just move everything below it to a .c file.
> they're nontrivial to bind to from non-C languages
This particular library doesn't seem to follow the STB convention where the implementation is inside an #ifdef IMPLEMENTATION block, but that would be trivial to fix.
Once you have that, it's just as trivial to use from other languages as a regular .h/.c library: include the implementation in one C source file and compile that into a static library or DLL, then either include the declarations (without the IMPLEMENTATION define) directly, if your language can consume C headers, or use the header to auto-generate the bindings.
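For reference, the STB convention being described looks roughly like this (a hypothetical toy library, with the header contents inlined into one file so the example is self-contained; in practice the guarded section lives in its own .h file):

```c
/* In exactly one .c file of your project you would write:
       #define MINILIB_IMPLEMENTATION
       #include "minilib.h"
   Every other file just does #include "minilib.h" and only sees
   the declarations. Defined here to keep the demo in one file. */
#define MINILIB_IMPLEMENTATION

/* ---- contents of the hypothetical "minilib.h" ---- */
#ifndef MINILIB_H
#define MINILIB_H
int minilib_add(int a, int b);   /* declaration: visible to every includer */
#endif /* MINILIB_H */

#ifdef MINILIB_IMPLEMENTATION    /* definition: compiled in exactly one TU */
int minilib_add(int a, int b) { return a + b; }
#endif
/* ---- end of "minilib.h" ---- */
```

Compile that one translation unit into a static library or DLL and point a binding generator at the declaration half, and the library is no harder to bind than a conventional .h/.c pair.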
It turns out that regular libraries are nontrivial to use from C though, due to no standardized build system. Should a C library optimize for usage from C or non-C languages?
Hm in Haskell I prefer single-header. I can just copy it into my repo and it Just Works. I guess Haskell does have really good C interop. Both the FFI and hsc2hs make for seamless interop. It even understands pkg-config - it doesn't embed anything in the compiler though. Just understands the ecosystem. It's to the point where I consider C a DSL you can use.
An important difference from typical C++ header-only libs: STB-style libs don't use the inline keyword, but instead place the implementation into an ifdef/endif block.
It's really just for easier distribution and integration, and the difference from a single .h/.c pair isn't all that big.
Definitely not big enough to get all riled up about. The actual problem is libraries that come in dozens of source and header files with their own build system files, or C++ libraries that put the implementation into inline code, like most C++ stdlib headers (because this increases compilation time for every file that includes such a header).
I should have said: the lack of a standardized build system. CMake is a pain; Bazel, vcpkg, autotools, plain old Makefiles, etc. None of them match the developer experience of npm, cargo, etc.
Yeah, we managed; I never said otherwise. I only talked about why I like header-only libs: I can just copy/paste the source into my source tree, regardless of the build system I use.
If I'm using CMake, and a library is using autotools, I'll have to rewrite the lib's build system, and maintain it. I'm too lazy for that.
It was a massive pain in the butt at the time. Just try to integrate a "standard library" like libjpeg or libpng into your cross-platform project. That's exactly why everybody switched over to stb_image.h to quickly load image data without hassle.
Easy: have your "load/save_image" calls use the OS-provided APIs for image handling, while on UNIX assume libjpeg-dev or libpng-dev are already installed via the OS package system, and have the makefile targets adjusted as needed per platform.
Nowadays you can also use cmake + conan/vcpkg as an alternative.
...and with all this you just made everything a thousand times harder (and more brittle) than it needs to be. Integrating stb_image.h is literally just a single #include statement, no matter the platform, build system or dependency manager (if any).
Other than needing someone to keep track of it across every project it gets included in. And even if it gets vendored somewhere, there is still the issue of symbol duplication across binary libraries shared between teams.
Also, it is a toy library in size; then there are header-only libraries big enough to stress test compilers.
Even in C++, templates aside, most code doesn't need to be header-only; it's mostly laziness about properly learning how to work with compilers and build systems.
Hello. Forgive my ignorance and possible misunderstanding of your 'most code' qualifier.
If I have a template header .hpp and I do not wish to specialize the template function bodies in a .cpp file, the function bodies must be defined in the .hpp, right?
Yes, they need to be in the header. However, depending on the goal, many operations can be delegated to helper functions or PIMPL classes that aren't fully exposed in the header.
And for the lucky users of VC++ 2022, there is also the option to use modules instead, where templates are exposed via the module metadata.
Helper functions, PIMPL: that's all fine. The declaration and function body of the template are still defined in the .hpp. Whether all of the template's body is also defined there was not my concern in asking.