Except... Most companies offer products built in multiple languages. Google notoriously has a multilingual monorepo and a uniform build system. Every company of non-trivial size uses multiple languages. Even Asahi Linux does!
Polyglot projects are harder, but definitely not as hard as OS development in C.
The majority of companies doing web are doing "something plus JavaScript".
And then if there's some AI in the mix, there's also almost always Python. Often maintained by a super small team.
What about apps? Java and Swift. The migration to Swift included a lot of Obj-C and Swift living side by side for a while. Same with some apps migrating to Kotlin.
In general... it is often a skill issue with labor pools: firms accepted that Java/JavaScript are almost universally known by juniors.
Apps are in user space, and in most cases are disposable code after 18 months. Projects like https://quasar.dev/ offer single-codebase app publishing for Android/iOS/macOS/Win11, but most people never hear of it, given that a macOS build host is required for full coverage.
The talk pokes fun at the ivory-tower phenomenon, and how small-firm logical assumptions halt-and-catch-fire at scale. =3
How can it possibly be an issue with the labor pool when JS is the only truly supported web language and Python is the main language for ML tooling?
I think that you are handwaving away differences in runtime environments. Almost all languages are Turing-complete. Many of them, if used in the wrong context, will get you stuck in a Turing tarpit.
Languages succeed or fail on the strength of their runtimes. If you seriously think that mixed-code codebases are trash after 18 months, then I think I'm wasting my time - that statement is so fundamentally detached from reality that I don't even know how to start.
"that statement is so fundamentally detached from reality that I don't even know how to start."
It is the statistical average for Android, but again it depends on the use-case situation. Apps are a saturated long-tail business that fragmented decades ago.
Python sits on top of layers of C++ wrappers for most ML libraries, and doesn't do the majority of the heavy computation in Python itself (that would be silly). Anyone dealing with CUDA SDK versioning for PyTorch installs can tell you how painful setup is. That is why most projects simply give up and use a Docker image to keep the poorly designed dependency stacks operational on a per-project basis.
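To make the "Python sits on top of C++ wrappers" point concrete, here is a toy sketch of the same pattern using only the standard library's `ctypes` to call the C math library. It is not how PyTorch is actually built (PyTorch uses generated bindings over its C++/CUDA core), just a minimal illustration of Python delegating the actual computation to compiled native code:

```python
import ctypes
import ctypes.util

# Locate and load the system C math library. The same FFI idea, at a much
# larger scale, is how Python ML libraries expose native kernels.
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature: double sqrt(double)
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

# The call crosses into compiled C; Python only marshals arguments.
print(libm.sqrt(2.0))
```

The Python layer here is pure glue, which is exactly why its own interpreter speed rarely matters for the heavy math, while version mismatches between the glue and the native layer (as with CUDA) can break everything.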
"then I think I'm wasting my time"
Then just let the community handle the hard part, and avoid having to figure it out yourself:
Sorry, but apps being disposable and "JS-only frameworks" existing doesn't change reality. Nor does juniors knowing only one language.
Single-language codebases are the exception rather than the rule in a lot of industries, including apps and web. Ditto for operating systems: the Linux kernel might be mostly C, but both Windows and macOS mix C and C++ in the kernel, while much of their userland is C# or a mix of Swift and Objective-C, respectively.
"Explanations exist; they have existed for all time; there is always a well-known solution to every human problem -- neat, plausible, and wrong." ( H. L. Mencken )
Probably conflating popularity with good design. The talk covers the financial incentives for naive code choices, and perceived job security in ivory towers.
It might work for you, but in general is bad for the firm long-term. Best of luck =)
"Single-language codebases are the exception rather than the rule in a lot of industries"
And indeed >52% of those polyglot projects end up failing in more than one way, and are refactored or abandoned. I didn't like that the talk rips on Elixir/Phoenix either, but factually his insights were not inaccurate.
On average it takes around 2 to 3 years to understand why... as it takes time to see that tacit knowledge is often ephemeral. Tantalizing bad ideas reverberate on the web, and the naive recycle them as facts. YMMV, and it usually doesn't matter if you have under 38k concurrent users... and spend $3m/month in transfer fees to AWS. Perhaps your part of the world works differently than mine. =3
Google does not cancel projects because they're technically unwieldy, and if you ask pretty much any ex-Google engineer what they think about internal tooling they will generally praise it because it's very good.
Complexity is a consequence of dealing with the real world. An OS talks to hardware, which might have errata or just straight up random faults.
Google's initial technological success was due to the design assumption that individual systems are never reliable, and should fail gracefully. It was and still is considered a safe assumption...
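That "assume any node can fail" design translates into code that tries redundant replicas and degrades gracefully instead of trusting any single machine. A minimal sketch (all names here are hypothetical, not Google's actual infrastructure):

```python
def fetch_with_fallback(replicas):
    """Try each replica in turn, assuming any individual node may fail."""
    errors = []
    for fetch in replicas:
        try:
            return fetch()  # first healthy replica wins
        except ConnectionError as e:
            errors.append(e)  # node down -- degrade to the next copy
    raise RuntimeError(f"all {len(replicas)} replicas unavailable: {errors}")

def flaky_node():
    raise ConnectionError("node offline")

def healthy_node():
    return "payload"

# The dead node is skipped transparently; the caller never notices.
print(fetch_with_fallback([flaky_node, healthy_node]))
```

The interesting property is that reliability comes from the redundancy in software, so the underlying hardware can stay cheap and individually unreliable.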
Indeed, on average 52% of all IT projects at any firm end up failing, or were never used as designed. What exactly does a glorified marketing company have to do with successful OS design? The internal Android OS they planned to replace Linux with was canceled too. "All software is terrible, but some of it is useful" =3
I would suggest actually reading what engineers who have worked at the company say about it. Many of these statements you've just made are irrelevant or factually incorrect (e.g. Fuchsia might be cancelled, but Google remains one of the major contributors to Linux and is certainly not a "marketing company.")
The reason I don't understand fully is because you don't explain yourself. I'm not convinced it has anything to do with my work experience.