Hacker News

That was you? I wouldn't have made the observation had I not read that comment!

Maybe we should make a list of topics where people frequently miss the point in this way? Like https://en.wikipedia.org/wiki/List_of_common_misconceptions though this is probably not in scope for Wikipedia.

I'm up for it, though off the top of my head, I'm not sure how to make it the right size - short enough to be useful, but longer than "1) learn about feedback loops, 2) think in dynamic systems, not static ones, 3) meditate on “Meditations on Moloch”".

A different list I believe I could use is one listing working mitigations for tough problems. Like: checklists are effective at this-and-this in medicine [link set 1], and this-and-that in aviation [link set 2]; "two-in-cockpit" policies help with this-and-that in aviation, to such-and-such degree [link set 3]; etc. The motivation here is that some industries have figured out ways to reduce the severity of various tough problems, and there may be an opportunity for cross-pollination, or for combining those techniques into an even better mitigation.

RE the Moloch link, I'm not in the right mood to respond to the reply under that comment, nor do I have the energy right now. I mostly disagree, but I'll note one point - it's true, at least for me, that realizing a problem is one of coordination at scale (and therefore likely systemic) makes me feel despair. The realization itself is a mood killer - I find myself recoiling from it, thinking along the lines of "please let it not be a Moloch thing, please let it be something solvable, something that can be approached directly". I would love to know a way to feel encouraged, instead of instantly demotivated, by this kind of problem.

EDIT:

Two more for your list of common point-misses: 4) what's the base rate? and 5) what's the effect size?

Prompted by 'userbinator remembering that one, and asking the important question. Paraphrasing, "if those chemicals are so very bad for us, and have been everywhere for decades, then where are the ill effects?".

It's like with diet and cancer scares. Consuming/not consuming red meat/coffee/artificial sweetener/whatever can give you cancer! How much of an increase? 10%? Over what base rate? 0.00001? In that case, the absolute risk rose by only 0.000001, and it's affecting your lifespan less than worrying about it is.
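A quick sanity check of that arithmetic (all numbers here are made up for illustration, not taken from any actual study):

```python
# A scare headline reports a 10% *relative* increase in risk.
relative_increase = 0.10

# Assumed base rate of the outcome in the unexposed population.
base_rate = 0.00001

# The absolute change in risk is what actually matters to an individual.
absolute_increase = base_rate * relative_increase
print(absolute_increase)  # 0.000001 -> one extra case per million people
```

The point being: a relative-risk figure is meaningless without the base rate it multiplies.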


Moloch things are solvable. They're a relatively recent issue. The despairing tone of Scott Alexander's essay only applies because it assumes we've got an unlimited number of people – but we've got an unknown, finite, and small number of people. That puts the game closer in nature to the Iterated Prisoner's Dilemma, where you don't even need superrationality for the best decision to be "cooperate / cooperate".
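To make the Iterated Prisoner's Dilemma point concrete, here's a minimal sketch using the standard payoff values (T=5, R=3, P=1, S=0); the strategies and round count are illustrative, not drawn from the essay:

```python
# (my_move, their_move) -> my score; 'C' = cooperate, 'D' = defect
PAYOFF = {
    ('C', 'C'): 3, ('C', 'D'): 0,
    ('D', 'C'): 5, ('D', 'D'): 1,
}

def play(strat_a, strat_b, rounds=100):
    """Return total scores for two strategies over repeated rounds."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strat_a(hist_b), strat_b(hist_a)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

def tit_for_tat(opponent_history):
    # Cooperate first, then mirror the opponent's last move.
    return opponent_history[-1] if opponent_history else 'C'

def always_defect(opponent_history):
    return 'D'

# Two conditional cooperators do far better together than two defectors:
print(play(tit_for_tat, tit_for_tat))      # (300, 300)
print(play(always_defect, always_defect))  # (100, 100)
```

In the repeated game with a known, small pool of players, plain conditional cooperation already beats mutual defection, with no superrationality required.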

The problem isn't some kind of game theory issue. The problem is the cultural phenomenon where people believe cruelty is a virtue. The solution is kindness, like people have been saying for thousands of years. (Also reorganising society somewhat.)

In Scott Alexander's terminology: we can build a garden, and keep Moloch out of it, because we are the only ones in our vicinity capable of being powerful agents of Moloch, and we can just… decide not to do that. So many societies have managed that over the years: it's just historical coincidence they didn't end up with industrial metallurgy, lots of boats, and a penchant for proselytism. But now, we don't have any other societies that could come from across the sea and influence us: we're globalised.

Our fate is in our hands, if we can step back from isolationist ethics and be kind (and stop trying to take over the world).
