
Okay but ethics isn't universal either. There is no objective moral framework.

"Not everybody shares your views" == "Not everybody shares your ethics".

Let me try to steel-man this for you: if you believe in classical liberal principles of free speech, then you might well be "proud" to work at FB, and of Zuckerberg's stance.

Here's a better articulation of that: https://stratechery.com/2020/zuckerbergs-choice-zuckerbergs-...




Ethics and morals are not the same thing. Morals are not universal.

Ethical frameworks were created specifically to be an objective standard.


> Ethical frameworks were created specifically to be an objective standard.

Yes, and there are many ethical frameworks. You and I may subscribe to different ethical frameworks, and that's entirely a function of our morals.


Facebook and the like have directly driven a huge increase in worldwide division and polarization. And faced with that fact, they've continued their course full steam ahead. I don't know of any moral framework in which that isn't a negative for society. Some on the economic right might argue it isn't Facebook's problem to worry about because something something The Market. But even they wouldn't say the full picture is a good one, and they wouldn't deny the logic that working for the company, in practice, ends up contributing to that outcome.


I really encourage you to try to construct the strongest counter-argument to your own position, so that you might at least be able to empathize with why someone could be sympathetic to Facebook's current stances without jumping to the conclusion that they are cartoonishly evil. Remember that everyone is "the good guy" in their own story.

The moral framework where Facebook "isn't a negative for society" is the one that says a free society should be able to openly express itself, and insofar as there is a high degree of polarization/division, it's because society itself is divided, and free expression simply exposes that. In other words, how do you know that Facebook caused the polarization? How can you definitively conclude that Facebook isn't simply a mirror on society, and that society was already divided to begin with?


> the conclusion that they are cartoonishly evil

> The moral framework where Facebook "isn't a negative for society" is the one that says that a free society should be able to openly express itself

I wasn't making any such grand claims. I intentionally scoped my argument to "Facebook has increased division in the world" and "increased division is bad", in an effort to keep it as non-subjective as possible.

> In other words, how do you know that Facebook caused the polarization?

I didn't have to look very far; Facebook did the study itself, internally, and executives directly chose to ignore it: https://www.theverge.com/2020/5/26/21270659/facebook-divisio...


> I didn't have to look very far; Facebook did the study itself, internally, and executives directly chose to ignore it: https://www.theverge.com/2020/5/26/21270659/facebook-divisio....

While this is indeed a compelling data point, I want to point out 2 things:

1. We have no way to verify the validity of the methodology, since the study is opaque to us.

2. We have no way to verify the veracity of the results, since the study is not peer reviewed.

And while you didn't have to look very far, perhaps if you look even a little farther, you'll find that there are conflicting studies (in published, peer-reviewed research):

"We combine nine previously proposed measures to construct an index of political polarization among US adults. We find that the growth in polarization in recent years is largest for the demographic groups least likely to use the internet and social media. For example, our overall index and eight of the nine individual measures show greater increases for those older than 75 than for those aged 18–39. These facts argue against the hypothesis that the internet is a primary driver of rising political polarization." [1]

"Contrary to conventional wisdom, my analysis provides evidence that social media usage reduces mass political polarization."[2]

"After conducting a detailed analysis of recommendations received by each channel type, we refute the popular radicalization claims. To the contrary, these data suggest that YouTube's recommendation algorithm actively discourages viewers from visiting radicalizing or extremist content."[3]

So this is by no means conclusive, and very much debatable. To repeat: the most charitable argument for "being proud" to work at FB is that you believe in classical liberal principles of free speech, and that social media doesn't itself create polarization/division but merely reveals it.

[1] https://www.nber.org/papers/w23258

[2] http://rubenson.org/wp-content/uploads/2015/10/barbera-tpbw....

[3] https://arxiv.org/abs/1912.11211



