Given that both of these articles are from 2015, and Nguyen Duc Thang is still contributing (more than 3200 videos now; the most recent was uploaded yesterday), maybe his YouTube channel is a better link: https://www.youtube.com/user/thang010146
His work has been shared here multiple times, but it's worth sharing another artist's hard work again: Bartosz Ciechanowski's incredible interactive animations.
The internet is an absolutely amazing place that demonstrates that sometimes people just want to share their work and have others enjoy it as well. We see this rarely in the real world, where the focus and reward are on monetization.
> The internet is an absolutely amazing place that demonstrates that sometimes people just want to share their work and have others enjoy it as well. We see this rarely in the real world, where the focus and reward are on monetization.
I want to engage with this in a way that may sound nit-picky, but isn't meant to be. The internet is part of the real world; the people sharing on the internet are people in the real world who want to share.
I think a major difference is that reach is so globally amplified when you use the internet. When someone shares something freely through the internet, it's available to the whole world, so I can see the sharing done by anyone, anywhere. When someone shares something freely in a way that, by necessity or by choice, is in person, then only those sufficiently near to them will know about it (or at least be able to partake of it). So when I look around me in the real world and see fewer people sharing than I do on the internet, it's not because they aren't there; it's because I'm comparing populations of vastly different sizes.
I also highly recommend checking out https://www.reddit.com/r/ShittyLinkagePorn/. It's beautifully animated renditions of mechanical linkages that have less than stellar performance in the real world.
Is there a general theory or framework for how these kinds of mechanical things get designed and work?
I am always really impressed by the ways mechanical motion can be transformed but I have no idea how people figure out these mechanisms. Is it just 3000 years of trial and error and intuition?
I studied Mechanical Engineering and did learn about mechanisms and how to calculate a bunch of these things, but we saw only a, let's say, "very limited subset" of possible mechanisms, though we studied those in depth. IIRC the most complex gearing we covered was a planetary gear.
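For anyone curious what "calculating these things" looks like, here's a minimal sketch (my own, not from any coursework or from Thang's models) of the textbook Willis equation for a simple planetary gear set. The tooth counts and speeds below are made-up example numbers.

```python
# Minimal sketch of planetary (epicyclic) gear arithmetic via the Willis equation:
#   (w_sun - w_carrier) / (w_ring - w_carrier) = -N_ring / N_sun
# Tooth counts and speeds are illustrative, not taken from the thread.

def planetary_speeds(n_sun_teeth, n_ring_teeth, w_sun=None, w_ring=None, w_carrier=None):
    """Solve the Willis equation for whichever angular speed is left as None."""
    k = -n_ring_teeth / n_sun_teeth
    if w_carrier is None:
        # w_sun - w_c = k*(w_ring - w_c)  =>  w_c = (w_sun - k*w_ring) / (1 - k)
        w_carrier = (w_sun - k * w_ring) / (1 - k)
    elif w_sun is None:
        w_sun = w_carrier + k * (w_ring - w_carrier)
    elif w_ring is None:
        w_ring = w_carrier + (w_sun - w_carrier) / k
    return w_sun, w_ring, w_carrier

if __name__ == "__main__":
    # Ring gear held fixed, sun driven at 1000 rpm: what does the carrier do?
    sun, ring, carrier = planetary_speeds(n_sun_teeth=24, n_ring_teeth=72,
                                          w_sun=1000.0, w_ring=0.0)
    print(f"carrier speed: {carrier:.1f} rpm")  # 1000 / (1 + 72/24) = 250 rpm
```

With the ring held fixed and the sun driven, the reduction works out to 1 + N_ring/N_sun, i.e. 4:1 for the example tooth counts above.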
I truly marvel when I see the more complex mechanisms and imagine how or where they could be used. There are a lot of repeated ones of course; converting linear alternating motion to circular motion (or the reverse) is a favourite, but even then it's beautiful to see the many options, because each might be more useful for a specific purpose (e.g. high torque, or high speed, or ...).
I remember constructing a digital prototype of a Geneva wheel (a really cool mechanism, btw) and came to respect the complexity of wrapping your mind around some of these CAD designs in 3D space, along with the necessary constraints. Perhaps it's even more complicated than producing them in reality.
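Since the Geneva wheel came up, here is a rough sketch (my own simplification, not taken from any of the animations or CAD files) of its indexing kinematics. It assumes the textbook geometry where the pin radius is d·sin(π/n) for an n-slot wheel, which gives tan β = m·sin α / (1 - m·cos α) with m = sin(π/n); the 4-slot sweep at the bottom is purely illustrative.

```python
# Geneva wheel kinematics over a single index cycle: the driven wheel's angle
# as a function of the drive crank's angle, for an n-slot Geneva mechanism.
# Outside the engagement window the wheel is treated as locked by the disc.

import math

def geneva_wheel_angle(crank_angle, n_slots=4):
    """Driven-wheel angle (radians) for a given drive-crank angle.

    crank_angle is measured from the mid-engagement position, where the
    drive pin lies on the line joining the two centers.
    """
    m = math.sin(math.pi / n_slots)                 # pin radius / center distance
    half_window = math.pi / 2 - math.pi / n_slots   # engagement half-span
    # Wrap the crank angle into (-pi, pi] so the lock test works for any input.
    a = math.atan2(math.sin(crank_angle), math.cos(crank_angle))
    if abs(a) > half_window:
        # Pin is out of the slot: the wheel sits at +/- one half index.
        return math.copysign(math.pi / n_slots, a)
    return math.atan2(m * math.sin(a), 1 - m * math.cos(a))

if __name__ == "__main__":
    # Sweep the crank through one revolution and print the indexing motion.
    for deg in range(-180, 181, 30):
        beta = geneva_wheel_angle(math.radians(deg), n_slots=4)
        print(f"crank {deg:+4d} deg -> wheel {math.degrees(beta):+6.1f} deg")
```

The nice property that falls out of that tangency constraint is that the wheel's velocity is zero exactly when the pin enters and leaves the slot, which is why the motion looks so smooth despite being intermittent.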
I really hope at some point he releases CAD files for all the mechanisms he has animated. He likely has the largest database of mechanisms in a computer-understandable format.
There was a YouTube channel, "King Mechanical", that would make similar long-duration videos that were very mesmerizing, with a kind of "on hold psych rock" music background track. But that channel appears to have pulled all their videos.
I am pretty sure that the channel you linked is ripping off Thang’s videos. The color scheme looks way too close to his to think that it was made independently.
King Mechanical also used basically the same color scheme; not being an Autodesk Inventor user, I don't know if it's just a built-in theme. I can't say, so I won't comment on who was first.
Can these be fed as training data to ChatGPT so it can generate imagery, or CAD-formatted representations, of described 3D models, scenes, and physics simulations?
Clearly, generalized ingestion and indexing of diverse content is the next step on the road to improvement.
You could definitely do this by training it on PLM data, so the idea of ChatGPT learning this, even though it's a language model, is not too far-fetched.
Ha! Now I see it too. But it's an illusion caused by the frame rate and the animation speed. If you try to imagine the pink gear turning correctly, eventually your perception will snap and it will appear correct again.
For me, I can't see it any other way. When I watch the teeth meshing, they look correct, but when I zoom out, the other side of the gear always appears to be going backwards.
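If anyone wants the arithmetic behind that illusion, here's a tiny sketch with made-up numbers (the gear size, speed, and frame rate are not taken from the video). Because every tooth looks identical, your eye matches each frame to the nearest tooth position, so the apparent per-frame rotation is the true per-frame rotation wrapped into plus or minus half a tooth pitch, which can come out negative.

```python
# Temporal aliasing ("wagon-wheel" effect): a gear with T identical teeth,
# sampled at a finite frame rate, can appear to turn backwards.

def apparent_step_deg(rpm, teeth, fps):
    """Apparent per-frame rotation (degrees) of a gear with `teeth` identical
    teeth spinning at `rpm`, when viewed at `fps` frames per second."""
    true_step = rpm * 360.0 / 60.0 / fps   # true rotation between frames
    pitch = 360.0 / teeth                  # one tooth looks like any other
    # Wrap the step into [-pitch/2, +pitch/2): the eye picks the nearest match.
    return (true_step + pitch / 2) % pitch - pitch / 2

if __name__ == "__main__":
    # A 20-tooth gear at 85 rpm viewed at 30 fps: the true step is 17 deg/frame,
    # but with an 18-deg tooth pitch the nearest match is -1 deg/frame,
    # so the gear appears to creep slowly backwards.
    print(apparent_step_deg(rpm=85, teeth=20, fps=30))
```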
Now that you've linked it, I feel like it's easier to ingest his information directly from YouTube. I wonder whether he, as the author, would prefer that or his own website.