
That's what the person you're responding to meant: attempts to make human systems "rational" often involve simplifying dependent probabilities and presenting them as independent.

The rationalist community both frequently makes this reasoning error and is aware they frequently make this error, and coined this term to refer to the category of reasoning mistake.






> coined this term to refer to the category of reasoning mistake.

That's not at all what the Wikipedia article for it says. It presents it as an interesting paradox with several potential (and incorrect!) "remedies" rather than a category of basic logical errors.


The “quadrillion days of happiness” offered to a rational person gives away that such allegories are anthropomorphized just for the sake of presentation. To get at what the philosophers actually mean, you should probably imagine this as an algorithm running on a machine (no AGI).

It’s a mental tease, not a manual on how to react when faced with a mugger who forgot his weapon at home but has an interesting proposition to make.

Similarly the trolley problem isn’t really about real humans in that situation, or else the correct answer would always be “do nothing”.

It’s what the comment here [0] says. If you try to analyze everything purely rationally it will lead to dark corners of misunderstanding and madness.

[0] https://news.ycombinator.com/item?id=42902931


> Similarly the trolley problem isn’t really about real humans in that situation, or else the correct answer would always be “do nothing”.

The correct answer, if it were about real people, is of course to switch immediately after the front bogie makes it through. That way the trolley will derail, make a sharp turn before it runs over anyone, and stop.

The passengers will get shaken, but I don’t remember fatalities being reported when such things happened for real with real trams.


The scenario is set up by an evil philosopher though, so they can tie up the people arbitrarily close to the split in the rails, such that your solution doesn’t work, right?

In this case, it won’t matter, I’m afraid, which way the trolley goes, as it will at the very least mangle both groups of people, and the only winning move is to try to move as many people as possible away from the track.

An Eastern European solution is to get a few buddies to cut the electrical wire that powers the trolleys and sell it for scrap metal, which works on all electrical trolleys. (After the trolley stops, it can be scavenged for parts you can sell as scrap metal, too.)


> An Eastern European solution

Made me chuckle. Funny 'cause it's true. As for the trolley problem, if taken literally (people on tracks, etc.), pulling the lever exposes you to liability: you operated a mechanism you weren't authorized to use and for which you had no prior training, and you decided to kill one innocent person who was previously safe from harm.

Giving CPR is a very tame/safe version of the trolley problem, and in some countries you're still liable for what happens afterwards if you do it. Same when donating food to someone who might starve. Giving help has become a very thorny issue. But consciously harming someone while giving help in real life is a real minefield.

P.S. These philosophical problems are meant to force a decision from the options given. So assume the problem is just a multiple-choice one, with 2 answers. You don't get to write a third.


> P.S. These philosophical problems are meant to force a decision from the options given. So assume the problem is just a multiple-choice one, with 2 answers. You don't get to write a third.

I know about it. And yet I refuse to play the game. The problem is that even philosophers should be able to acknowledge that, in the real universe, no box is too big to think outside of.

Otherwise we get people who conflate the map with the territory, which is what this whole comment thread is about.


> The “quadrillion days of happiness” offered to a rational person gives away that such allegories are anthropomorphized just for the sake of presentation.

So what? It's still presented as if it's an interesting problem that needs to be "remedied", when in fact it's just a basic maths mistake.

If I said "ooo look at this paradox: 1 + 1 = 2, but if I add another one then we get 1 + 1 + 1 = 2, which is clearly false! I call this IshKebab's mugging.", you would rightly say "that is dumb; go away" rather than write a Wikipedia article about the "paradox" and "remedies".
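
To make the "basic maths mistake" concrete: the mugging only works if the probability you assign to the claim is treated as independent of how extravagant the promised payoff is. A rough sketch, with completely made-up numbers, just to illustrate:

    # Made-up numbers, purely illustrative.
    def ev_independent(payoff, p=1e-9):
        # Probability held fixed regardless of the size of the claim:
        # expected value grows without bound as the promise grows.
        return p * payoff

    def ev_dependent(payoff):
        # Probability discounted by the extravagance of the claim:
        # expected value stays negligible no matter the promise.
        p = min(1e-9, 1.0 / payoff**2)
        return p * payoff

    print(ev_independent(10**15))  # 1e6  -> "pay the mugger"
    print(ev_dependent(10**15))    # 1e-15 -> ignore him

Once the probability is allowed to depend on the size of the claim, the "paradox" evaporates.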

> Similarly the trolley problem isn’t really about real humans in that situation, or else the correct answer would always be “do nothing”.

It absolutely wouldn't. I don't know how anyone with any morals could claim that.


Interestingly, the trolley problem is decided every day, and humanity does not change tracks.

There are people who die waiting for organ donors, and a single donor could match multiple people. We do not find an appropriate donor and harvest them. This is the trolley problem, applied.


I would pull the lever in the trolley problem, but I don't support murdering people for organs.

The reason is that murdering people for organs has massive second-order effects: public fear, the desire to avoid medical care if harvesting is done in those contexts, disproportionate targeting of the organ harvesting onto the least fortunate, etc.


The fact that forcibly harvesting someone’s organs against their will did not make your list is rather worrying. Most people would have moral hangups around that aspect.

Yeah, it doesn’t seem quite right to say that the trolley problem isn’t about real people. I mean, the physical mechanical system isn’t there, but it is a direct abstraction of decisions we make every day.

> the trolley problem isn’t about real people

My actual words quoted below give one extra detail that makes all the difference, one that I see people silently drop in their rush to reply. The words were aimed at someone taking these problems in too literal a sense, as extra evidence that they are not to be taken as such but as food for thought that has real-life applicability.

> the trolley problem isn’t really about real humans in that situation


> We do not find an appropriate donor and harvest them. This is the trolley problem, applied.

I don't think that matches the trolley problem particularly well for all sorts of reasons. But anyway your point is irrelevant - his claim was that the trolley problem isn't about real humans, not that people would pull the lever.

Edit: never mind, I reread your comment and I think you were also agreeing with that.


> his claim was that the trolley problem isn't about real humans

Is it though? Let's look at the comment [0] written 8h before your reply:

> the trolley problem isn’t really about real humans in that situation

As in "don't take things absolutely literally like you were doing, because you'll absolutely be wrong". You found a way to compound the mistake by dropping the critical information then taking absolutely literally what was left.

[0] https://news.ycombinator.com/item?id=42907977



