
A person's mental illness can have drastically different outcomes depending on who they are surrounded by. Even before Facebook there were plenty of cases where people exploited the emotionally and socially naive or unstable, for their own gain or amusement. I worry social media can enable something similar.



But it's really not social media in particular. It seems pretty clear that when delusions are coming on, a person will seek out those who humor those delusions. If you're talking pre-social media Internet, you'd have the person on BBSes and email lists for whatever they find appealing.

And it's worth considering that the situation isn't just some mentally ill people and some opportunistic/predatory people. You have a mixture of delusion and predation across a substantial population, and you have the ability of people in these groups to seek each other out. Maybe if you shut down usenet and everything past that, you could stop this phenomenon, at the expense of more or less shutting down the Internet. But otherwise, the discussions blaming social media in particular for this stuff seem misguided. Social media is about connecting with people you know, whereas "socially resonant insanity" or whatever one wants to call it involves finding new friends and new facts.


But there are healthy connections and unhealthy connections. Is it possible that there might be a way to encourage the healthy ones and discourage the unhealthy ones?

Just because social media works the way it does now doesn't mean it has to always work like that. These things are constructed by humans and we can change them if we want. There is nothing intrinsic in Facebook or YouTube that says they must promote "engagement" above all else.

Maybe they'd make a few fewer dollars; wouldn't that be sad.


But there are healthy connections and unhealthy connections. Is it possible that there might be a way to encourage the healthy ones and discourage the unhealthy ones?

There has been talk of Facebook actually using algorithms to make mental health interventions. The problem is that I don't think many users would find an automatically generated warning credible.

Just because social media works the way it does now doesn't mean it has to always work like that.

Either you get something like a single, state-approved connection system, or each social media network will serve up anything that users will buy. That's both because of the profit motive and because someone, somewhere thinks X sort of content is totally cool; both factors are in play. What's the plan to counter that?


They could make the interventions much more subtle and palatable than a dry warning. Subtly changing the feed to nudge the person onto another path could do this. It is manipulation, but maybe some people need it.



