This is the weirdest and most nonsensical description of the GIL I've ever seen, and almost certainly one written by someone who doesn't program in Python or JS. In both languages, serializing execution of the runtime is a deliberate feature, designed to simplify both the programming model and its implementation.
There is only one case where the serialized runtime presents a problem: CPU-bound, 90s-style shared-memory parallel computation, which the industry as a whole has been trying to escape for the past 20 years, because reasoning about individual threads and the lifetimes of shared memory allocations turned out to be an incredibly shitty abstraction.
Even if you want to shoot yourself in the foot, both Python and Node.js provide facilities for e.g. concurrent array access (in Python via the multiprocessing package, in Node via worker_threads and SharedArrayBuffer). The reason those approaches aren't more popular in those languages is precisely that the model itself is defective. Anyone worth their salt working in a computation-heavy domain stopped writing explicit threading code a long time ago.
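For the curious, here's a minimal sketch of what that escape hatch looks like on the Python side: a multiprocessing.Array lives in OS-level shared memory outside any single interpreter, so worker processes can mutate it in parallel, taking the array's lock around writes. The worker function name below is just for illustration:

    import multiprocessing as mp

    def bump(shared, start, count):
        # Illustrative helper: each worker increments its own
        # slice of the shared array. The Array's built-in lock
        # guards the underlying buffer.
        with shared.get_lock():
            for i in range(start, start + count):
                shared[i] += 1

    if __name__ == "__main__":
        # "i" = C int. The buffer lives in shared memory, visible
        # to all processes, so no single interpreter's GIL applies.
        arr = mp.Array("i", [0] * 8)
        workers = [mp.Process(target=bump, args=(arr, n * 4, 4))
                   for n in range(2)]
        for w in workers:
            w.start()
        for w in workers:
            w.join()
        print(list(arr))  # [1, 1, 1, 1, 1, 1, 1, 1]

Even this tiny example shows why nobody loves the model: you're back to hand-managing locks, slices of a shared buffer, and process lifetimes.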