Could Removing Social Media Metrics Reduce Polarization? (onezero.medium.com)
89 points by respinal on Aug 20, 2019 | 129 comments



I don't trust Twitter and Facebook when they justify these proposals by claiming they improve social welfare.

Hiding the number of likes makes their algorithm less transparent. Removing transparency makes it harder to distinguish paid content from organic content. At the end of the day it's all about dollars.


No, the polarization (and attendant culture war) has been going on longer than social media as we know it has existed. It doesn't help, but it's just not the big factor it's assumed to be either - an assumption driven by social media companies' desire to trade engagement for ad dollars. The people who are doing most to drive social polarization are the least likely to be motivated by social proof. Consider that 8ch and its ilk have no social scoring mechanism but produce some of the most radical extremists.


Far longer in fact. Before the internet was popular. "We find that despite short-term fluctuations, partisanship or non-cooperation in the U.S. Congress has been increasing exponentially for over 60 years with no sign of abating or reversing."

https://www.ncbi.nlm.nih.gov/pubmed/25897956


My current hypothesis is that our current culture war is being driven by a lack of accountability. From graft in politics to anonymous bullying online, more and more people are able to get away with worse and worse bad faith behavior for longer and longer.


The USA's problem is the voting system. Currently, it favors extremism: e.g., try getting the nomination in a primary while saying the other side is actually OK. Change the system to something like ranked-choice voting and you get nominees literally singing kumbaya together: https://www.youtube.com/watch?v=IyJKh_CwPI4


> Currently, it favors extremism: e.g., try getting the nomination in a primary while saying the other side is actually OK.

This is just flat-out wrong, and I think you might be conflating what could be interpreted as a "purity spiral" on the left right now with "normal". It is most certainly not normal. The 2016 Dems nominated the most centrist candidate in a generation (at least), and Trump managed to win the general by essentially playing the game better in the Electoral College. Even if I accepted your premise for a moment (I don't, in reality) that ranked-choice voting actually pushes candidates into aligning with each other ideologically, to the extent it makes them less detested by their ostensible political opponents, it is not at all clear that this is an optimal outcome. Especially in times of political turmoil - and I don't think it's controversial to suggest that we are in one right now, a la the 1970s - a "regression to the mean" in the choice of political candidates seems to be the exact opposite of a good idea. That just spells "status quo" for electoral outcomes, and would cause many (or most) to lose faith in the political system altogether.


Who would hold whom accountable?

A critical part of the factionalisation is that people will defend those "on their own side" no matter what they've done, provided they maintain loyalty. In order to maintain their veneer of legitimacy they will go to any lengths to discredit the accusers. The peak of this may have been Alex Jones slandering the Sandy Hook victims, but I could cite an awful lot of examples.

The only way society can become less polarised is intra-faction accountability. Since that works against the desire to win over the other team, it's not going to happen by itself.


Or maybe there are substantive sociopolitical issues at play - primarily people at some kind of disadvantage trying to wrest power, wealth and control from those who already have it (as if history has never heard that story before). Except now it is this weird kind of interplay between race, gender, migrants, and no one is sure which side they are on.


> The people who are doing most to drive social polarization are the least likely to be motivated by social proof.

Young females dedicating their lives to being Instagram fitness/fashion/bathing suit influencers would like to have a talk with you.


That's not what we mean by "polarisation": those women (not "females") are usually completely apolitical because it would hurt their product marketing.

Doesn't seem to work the same way for male "influencers", who are more likely to have a connection to the right in some way.


Is the Instagram influencer motivated by polarization or by social proof? I think the GP is arguing that when your goal is polarization you don't care about social proof.


Most Instagram influencers want you to know how awesome their body is. They want you to sign up for their fitness program or purchase (and envy) the latest fashion they are wearing.

If you don't follow their fitness program or purchase their fashion, you're polarized against them, right? Because their lifestyle is the one with millions of followers and likes, not yours.

Does that make sense?


No, that doesn’t make sense.

Polarization is a fundamental disagreement on some issue or set of issues. People pick sides for whatever reason and they are unlikely to ever change their beliefs.

An influencer is just a marketer selling a personal brand for profit. It’s entirely different.


> Polarization is a fundamental disagreement on some issue or set of issues

I feel like "I have 1m followers and you don't" polarizes people into two separate groups?


No, it doesn't. That's just jealousy. The 1m followers do not themselves have 1m followers and yet they still follow the influencer. You could make some argument that this is analogous to a religion and if the influencer pushes an agenda that could lead to polarization but is also probably bad business. I don't see social media influencers pushing anything controversial, they're marketers, they want to appeal to everyone.

Polarization is when one politician says we should tax rich people but another says we shouldn't. It's a fundamental disagreement on a personal belief. Not a statement of fact. The belief may be informed by facts but polarization comes from a disagreement on a course of action.


Could it be they're already that way and they're concentrated at 8chan due to them being driven out of places where social scoring is used?


If that was the case, would you expect anecdotes such as https://i.supload.com/HJ_QEC-rg.png ?

The internal narrative of those sites is that without the ability to appeal to popularity (no scoring) nor the ability to appeal to authority (anonymity), all ideas must stand on their own merits. Rationality means you don't get to believe what you want to believe (hence the hurt-box). Is it really surprising that a radical search for the truth would produce radicals?


But the ideas don't really stand on their own merit. The reason the poster claims to have gone onto the site at first was for trolling purposes and they used a specific political angle to do so. If you have enough people like this then it becomes easier to convince others that these bad ideas have merit as there will seemingly be many people supporting the ideas.

Basically, you're not dealing with tabula rasa because the trolls seem to favor one side and it tilts the balance.


Ideas don’t have merit based on the number of parrots.


I disagree. Haven't we all been told off by our parents with "if everyone jumped off a cliff, would you?"? That in my opinion means that humans are prone to do and think what their peers do or think. The key issue is what we define as "peers".


"epic Nazi trolling" can be interpreted two ways: epic trolling of Nazis, or epic trolling as a Nazi. Given the context, I suspect it's the former.

There is no tabula rasa because everyone comes in with pre-conceived notions. For an idea to thrive in that kind of environment, it needs to be able to displace incompatible ideas. The poster specifically states it was the evidence that piled up that changed them, not the mere number of posts. It's the "that's interesting..."s that weigh on you and stick with you, not the noise.


They certainly meant the former, whether they're lying to us or to themselves is the more pertinent question.

They appear to be making the claim that unlike the soft left, they are heroically and manfully exposing themself to the raw truth.

However he doesn't really seem that bothered to have discovered that he's some kind of pure specimen that can deal with the "reality" that the Nazis were right and that, hey, who would have guessed, it turns out he is better than the minority groups and women that they target. What a bitter pill that must have been for him to swallow.

There are layers of irony here, so he could be genuinely warning people that they will turn into an unlikable asshole like he feels he has; it's hard to decipher all the layers.


I've always thought that Reddit would be better without displaying karma. What value does it provide except to create a perverse goal for some people?


The very first version of Reddit used sparklines instead of points. People hated it. We experimented multiple times with removing karma from the equation. It always reduced engagement.

People just really need that dopamine hit of seeing their points go up.

Honestly I prefer the way HN does it, where only the submitter can see their own points.


I can see that; there are literally thousands of people "farming karma" on Reddit every day. It's definitely engagement but I don't necessarily think it's healthy engagement. Unhealthy engagement is what this article is all about.

Optimizing for measured engagement seems like a case of Goodhart's law.


Unless you can show that healthy engagement clicks more ads than unhealthy engagement, the people who pay the bills won't care what kind of engagement you have.


In hindsight, maybe optimizing for engagement also optimized for polarization?


You're probably correct, but what product owner is going to take the chance on purposefully chasing lower engagement?


Hacker News. :)


Ad-supported business can’t solve this problem.


Considering that this is also the biggest problem regarding youtube, you may be right.

However, if it's true, the consequences are terrifying, if you take into account that engagement is directly tied to the financial success of every company whose money comes from advertising.

(And here it is: yet another reason to despise advertising and surveillance capitalism...).

EDIT: typo.


> It always reduced engagement.

Exactly. All these companies optimize for "engagement," regardless of how vapid, destructive or banal it is.


When all the advertisers and investors care about is clicks...


This is also why cancel culture is a thing—pressuring advertisers to remove funding is the only input we really have to what goods and services our society produces.


I miss being able to see up vs downvotes per post. With only the total displayed, it feels like only the hivemind opinion wins out.


Well, engagement for the sake of karma is likely not a worthy engagement, I would consider reducing that sort of engagement a good thing. Surely it's not a black and white matter though.

Without having seen what kind of content really gets moderated, I'd say HN has struck a good balance.


I've often wondered about a reddit like site that just egregiously and unapologetically showed fake metrics to make it look like people loved your comments and read them, regardless of the truth. You might have to do it a bit randomly to keep people guessing or enjoying the tension, but just make a large chance of hitting the karma jackpot from a comment slot machine - it might make people feel happier.


What about a variant of shadowbanning?

You can't see your real scores, but everyone else can.


This makes sense to me, perfect sense. Removing metrics would reduce engagement, and that's all that matters.


> People just really need that dopamine hit of seeing their points go up.

In other words, people are naturally vain.


It’s hard to notice those who don’t care.


> Honestly I prefer the way HN does it, where only the submitter can see their own points.

FWIW lobste.rs has another interesting take: it doesn't show points on comments until after a while and then they become public.

I figure the reason is to avoid vote piling while keeping transparency.
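A minimal sketch of that delayed-reveal mechanic in Python, assuming a fixed 24-hour window; the Comment shape, field names and timing are illustrative guesses, not how lobste.rs actually implements it:

    from dataclasses import dataclass
    from datetime import datetime, timedelta

    REVEAL_AFTER = timedelta(hours=24)  # assumed window; the real delay may differ

    @dataclass
    class Comment:
        posted_at: datetime
        score: int

    def visible_score(comment: Comment, now: datetime) -> int | None:
        """Hide the score while the comment is young (to avoid vote piling),
        then make it public (to keep transparency)."""
        if now - comment.posted_at < REVEAL_AFTER:
            return None  # render the comment with no score shown
        return comment.score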


Or better yet, make getting karma data a paid, rate-limited feature. 99% of people would not be willing to pay for it, but the data is there (for a price) if you want to crunch numbers or look for trends. A "karma tax" if you will.


Getting not giving? I'm pretty confused here - wouldn't most people refuse to pay and be stuck at whatever the limit was before the first payment was required? That might cause an even weirder effect where having that level of karma was considered the "maximum genuine karma" and just lead to a much lower requirement when it comes to farming likes to boost views.


I'm saying hide everybody's comment scores kind of like HN does.

However, if you pay $1/mo for access to the karma API you can then have access to the comment scores. That's all. People can still upvote/downvote freely, they just can't see the scores unless they pay.
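As a rough sketch of what that might look like, here is a hypothetical Flask endpoint; the route, the subscriber-key store and the rate limit are all invented for illustration, not an existing Reddit or HN API:

    import time

    from flask import Flask, abort, jsonify, request

    app = Flask(__name__)

    # Hypothetical stand-ins for real storage.
    PAYING_KEYS = {"example-subscriber-key"}   # API keys of $1/mo subscribers
    COMMENT_SCORES = {42: 17}                  # comment_id -> karma
    RATE_LIMIT = 60                            # requests per minute per key
    _request_log = {}                          # key -> list of request timestamps

    @app.route("/karma/<int:comment_id>")
    def karma(comment_id):
        key = request.headers.get("X-Api-Key", "")
        if key not in PAYING_KEYS:
            abort(402)  # scores stay hidden unless you subscribe
        recent = [t for t in _request_log.get(key, []) if time.time() - t < 60]
        if len(recent) >= RATE_LIMIT:
            abort(429)  # rate-limited even for subscribers
        _request_log[key] = recent + [time.time()]
        if comment_id not in COMMENT_SCORES:
            abort(404)
        # Voting itself stays free and unchanged; only reading scores is gated.
        return jsonify({"comment_id": comment_id, "karma": COMMENT_SCORES[comment_id]})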


Hrm, interesting... so it'd be less a product for common folks to judge others and move it almost exclusively into the realm of a thing to support advertisers who are hunting for "influencers" - interesting.


It provides sentiment without clutter. Instead of 1000 identical comments or 1 comment with 1000 "same" or "this" comments underneath.


You could still do that if the karma wasn't displayed.


How do you envision this working?


Sort by karma, but don't display it.


How does that solve the problem?


I don't mean comment karma, I mean user karma.


Hmm, are people really looking at their total user karma?

I admit that seeing my comments get upvoted gives me a hit of dopamine, knowing that that number of people were interested in what I said.

However, I never look at my TOTAL karma, just each individual comment. I like knowing that THIS thing I said people agreed with.


I never look at my own karma either but then I'm also not actively trying to increase it. But reddit users do tell us themselves that they post things specifically for the karma. They also do things like post screenshots of threads instead of the threads themselves to get the karma from those posts.

There is a whole host of behaviors on reddit around acquiring user karma.

For individual post karma, I'm the same as you.


Even the ability to up or downvote I think takes away from the incentive to provide good discourse. Some people are content just down voting a perfectly reasonable argument rather than contributing to the conversation and having to reason out their opinions. You even see it here, where posts are greyed out but no one has given any reason why.

I don't have a better way mind you. Unmoderated discourse tends to end up bad, and voting is a form of self moderation. The alternative of fully staffed moderation also has its issues. Community building online is just a hard problem to solve.

Just removing downvotes I think is a good step. Because if someone just agrees you aren't losing much by them upvoting instead of posting "me too!". But if someone feels like they disagree and want to downvote, that could be an interesting conversation that never happened.


Removing downvotes to have only positive reinforcement is, IMO, a bad idea - there are things that are incorrect, misleading, offtopic or offensive that we deem do not contribute to the discussion. The HN karma system (when the topic isn't political) helps to push more interesting insights on the article or question to the top of the board.

Currently there are three actions that can be taken: upvote, downvote & flag. Since these actions are never actually defined (AFAIK), I can only report my interpretation. Flagging is something I have maybe done once, for an openly hateful comment, but I mostly never do it... I generally downvote when a statement is thin (without being amusing), redundant (common knowledge), FUD (probably a regurgitated false talking point) or off topic. This doesn't mirror my upvoting; my upvotes tend to focus on insight, interesting questions and well-founded arguments.

I am a bit sad your comment has been downvoted as, while I don't agree with your idea, it's an interesting thing to consider and it seems to be suggested entirely in good faith.


Maybe how slashdot used to do it (still does it? I haven't been on there for a while, but not because of moderation)...

People who had good standing would get N 'moderation points' that they could use to mark N comments as being good or bad (with categories for which type of good or bad, e.g. insightful, funny, interesting, etc.)

Most of the time you didn't have the option to moderate, so you can't just downvote or upvote, you have to reply.

However, I am not sure if this is actually needed... most of the comments that get downvoted SHOULD be... they don't contribute to the conversation, and replying to them just feeds the trolls.


This happens because people want a disagree button, but have only a milder version of a flag button instead. Sometimes one just wants to support a viewpoint without bringing additional arguments.

I think it would be better to display separate agree, disagree and flag counters on each post, and not reduce karma if there are more disagree votes than agree.


I think simply adding "Agree" and "Disagree" buttons and separate "off topic" or "downvote" button would make a big difference. What "up" and "down" means is a bit vague and everyone has a different opinion on the matter. I will readily upvote comments I agree with but I usually only downvote comments that add no value to the discussion rather than being something I disagree with. So even my own voting pattern on HN is asymmetric.


Maybe adjust the cost of upvoting and downvoting. So a downvote to a reputable commenter might cost 10 or 100 karma. Upvoting might also incur a cost. Maybe it should depend on the reputation of the post (post karma) or the poster (reputation) and also it should cost people with more karma more to vote (but not linearly).

As a side-effect it would allow real-time insight into a sliver of online virtual economics.
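As a hedged reading of that proposal, here is one possible pricing function; the log-scaling and the specific constants are my own guess at "more, but not linearly", not something the parent specified:

    import math

    def vote_cost(voter_karma, target_karma, is_downvote):
        """Hypothetical vote pricing: downvotes cost more than upvotes,
        votes against reputable posters cost more, and high-karma voters
        pay more, but sublinearly."""
        base = 10 if is_downvote else 1
        target_factor = 1 + math.log10(1 + max(target_karma, 0))  # pricier against reputable posters
        voter_factor = 1 + math.log10(1 + max(voter_karma, 0))    # richer voters pay more, sublinearly
        return round(base * target_factor * voter_factor)

    # e.g. a 10,000-karma user downvoting a 1,000-karma poster:
    # vote_cost(10_000, 1_000, True) -> about 200 karma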


Stack Overflow "charges" you one point to downvote. Upvotes are free to cast and give the recipient 10 points.


It's a problem that only gets worse at scale, too. At least with decentralisation, the problem is distributed among multiple instances and moderation teams.


> I don't have a better way mind you

I do

Get rid of votes entirely. All of it.


People might start reposting stuff they think should be more visible. You'd have to do something like 4chan's /r9k/.


I proposed it on HN for HN, to mixed reviews. A common sentiment was “if you can hide the counts from yourself, why should you be permitted to hide them from me without my consent?”. Still, it looks like the world is catching up to my idea. Hooray :) I continue to hope HN braves the ire and tries it out some week.

https://news.ycombinator.com/item?id=19745267


I've never looked at mine and I never look at others.

Arguably a-socially, I treat each post and response as a discrete pair and gain pleasure from the depth of thread which follows but only ephemerally.

Maybe karma is best self assessed?


The value is there, but it goes to the company and not the users.


> a perverse goal for some people?

It's not a perverse goal.


Racking up "Internet points" is not a perverse goal? I don't see how it provides value to either the person attempting to acquire these points or the visitors providing them.


> I don't see how it provides value to either the person attempting to acquire these points

It obviously does, correct? Why else would they be doing it if it provided no value to them?


That's begging the question. People do a lot of things that are actively harmful to themselves and others -- my argument is that purposely acquiring karma, for the sole reason of increasing one's score, could very well be one of those things.


> People do a lot of things that are actively harmful to themselves and others -- my argument is that purposely acquiring karma, for the sole reason of increasing one's score, could very well be one of those things.

You are making a random and unsubstantiated claim that acquiring karma is somehow "harming" the person acquiring it. The onus is on you to back up your claim.

Why would one assume that it's obviously true?

> That's begging the question

I would say your point is this.

"Begging the question is an informal fallacy that occurs when an argument's premises assume the truth of the conclusion"


My original comment was actually asking the question: what value to the user does acquiring karma provide?

You begged the question by saying that acquiring karma must provide value because people are acquiring karma. That's assuming the truth of the conclusion.

I'm not strongly making the claim that acquiring karma is harmful, but I'll reiterate my original question: what is the value of acquiring karma for either the person doing it or the audience? My unsubstantiated feeling is that it provides no value, but I'd be interested in hearing a counter-argument.


The problem is that we have clear proof that people are motivated to acquire karma since... people try to do that thing. You could possibly shift the question over to the topic of "Is being motivated to acquire karma a positive or negative utility - perhaps, does the user acquire karma as a goal, or acquire karma to avoid not acquiring karma for their comments?"


> What value does it provide

Many people cannot form an original opinion. A vote count makes that decision for them.


I'm not talking about vote count, I'm talking about individual user's karma counts.


Doesn't everyone benefit from the fact that there are people obsessed with finding content that appeals to them? It's like reposts: what do I care? Though I suppose you could argue this is good for their mental health.

And I'm not convinced it would even have the tiniest impact on a polarized circlejerk like r/politics.


What?! Polarization has been around since long before social media.

The two systemic factors driving it have been 1) open democratic primaries, which push party nominees away from the center and towards greater polarization, and 2) increasingly polarized media, originally driven by cable TV news that could profitably deliver to a specific niche, e.g. Fox News, and websites generally.

Social media is just the icing on the cake.


The first paragraph in the introduction of this study lists more factors: https://www.ncbi.nlm.nih.gov/pubmed/25897956


It’s a giant political dodge that has nothing to do with the underlying issues. You want to know what’s the real cancer, the real drug these dealers don’t want you regulating? The NOTIFICATIONS. That’s something everyone on the inside knows. The average person treats validating push notifications like a mouse getting cheese and will act out and continue to generate content to get it.

Look at the emails and push notifications these companies send out to regular people on a constant, daily basis, and you will find the cancer you’re looking for. Eliminating like counts and follower numbers is a nice idea that excites teenage activists but is pointless now that these companies have already built accounts with zillions of followers.


> has nothing to do with the underlying issues

You make a fair point, but I wouldn't say it's got nothing to do with the real problem. It's a step in the right direction, even if it isn't a panacea.


You hit the nail on the head.

Don't remember where I saw it, but a long time ago, just as smartphones were coming out, I saw a bold take on what the next major platform would be. It wasn't going to be the web, it wasn't going to be mobile, it was going to be NOTIFICATIONS. The app companies that understood this thrived, and the rest withered and died.


I did notice that notifications were habit-inducing early on, and disabled them entirely for Facebook/Twitter/email years ago. Still, finding myself regularly going back to all of them, I'm not sure notifications are the only problem.


This XKCD makes a similar argument highlighting opening habits rather than notifications: https://xkcd.com/2183/


Polarization is orthogonal to habit formation.


Right before social media started to take off, academics were arguing about whether we were already seeing polarization [e.g. 0]. It might be worth considering whether the polarization trend changed with social media before attributing causes and potential solutions.

[0] Abramowitz, A. I., & Saunders, K. L. (2008). Is polarization a myth?. The Journal of Politics, 70(2), 542-555. http://www.acsu.buffalo.edu/~jcampbel/documents/AbramowitzJO...


Making twitter a little less like Jerry Springer could definitely help. Don't promote tweets from opposing groups to opposing groups. How hard can this really be?


I've never really seen tweets from opposing groups promoted to me. As a non-public-figure (AKA, nobody is directly @-ing me or attacking me), Twitter makes it really easy to stay in your own bubble. You follow who you want, and that's it. You might get some suggested tweets, but they're tweets fav'd or followed by people you're following.


There's also the promoted/sponsored tweets, and I think that's what they were hinting at (eg a Chinese news agency promoting their pro-China news stories to Hong Kong Twitter users [1]).

But that's very easy to avoid. Get one of the original Twitter clients like Tweetbot (iOS / MacOS) [2] or Fenix (Android) [3] and you'll get a chronological timeline, no promoted ads, mute/keyword filters, and you'll never see other people's likes & replies suggested to you as notifications. It puts you in control and removes the worst parts of Twitter.

[1] https://news.ycombinator.com/item?id=20734808

[2] https://tapbots.com/tweetbot/

[3] https://play.google.com/store/apps/details?id=it.mvilla.andr...


Nothing really puts you in control of Twitter when half the Twittersphere is calling you a moron because you said one thing that runs counter to what half the population believes. It doesn't matter how much you ignore them once they put you on the news. You don't even have to be as stupid as the people below to have a negative outcome from Twitter. Just consistently use it to state your honest moral opinion and I guarantee someone will hunt you down as if you were the worst serial killer.

https://pjmedia.com/lifestyle/7-people-whose-lives-were-ruin... https://www.theclever.com/15-tweets-that-have-ruined-peoples...


Speaking as someone who's been chased off of Twitter: perhaps your opinions are simply mainstream? Or maybe you're just not popular enough to be attacked.


They need to de-emphasize the blue checkmark. The eye is drawn towards it, and it lends tweets a false sense of authority. I'm not sure what a good solution is, as it is necessary to differentiate authentic accounts from fake ones, even for people who are not super-famous.


But that'd just create more echo chambers and destroy their engagement metrics, right? I feel like the contemporary draw of Twitter is the discourse.


I think there are good parts of Twitter. But it's also very easy to be a part of outrage-driven discussion.

I liked this suggestion that one way to reduce outrage is to reveal the complexity of what's going on. https://thewholestory.solutionsjournalism.org/complicating-t...


The physical world is hugely polarized. I don't see how tweaking the UI of websites is going to reduce real-world tendencies, it's a losing battle.


It seems to me the online world is more polarized (and abusive) than the real world.


Online culture may have driven this, but in a survey in the USA people were asked which type of person they would least like their children to marry. The "types" were things like same-gender, transgender, Christian, Muslim, no-college, Republican.

Those who identified as Democrats had Republican as “least liked”.

Those who identified as Republican had the first two I listed.

However much more polarised online is, the above is real-world polarisation, and it's pretty serious.

Source: https://open.spotify.com/episode/3ymwgbNCHG9f34dmxPNNuj?si=R...


Completely agree here. For most people, having that fired-up political debate makes them feel smart and empowered.


> “The problem, of course, that Dorsey and Zuckerberg and everybody else are running into is they’ve built an entire business model in which metrics are a central component or central mechanism by which engagement with the platforms is driven.”

This is the heart of the matter. These platforms exist not only to enable polarization, but their very business model thrives on polarization. There are a lot of other second-order effects, too, such as "The Perception Gap," which I saw posted to HN recently: https://perceptiongap.us/


I've always wondered whether social media, instead of polarizing, could actively counteract the bubble effect and lead people to a more centrist view - essentially by partially "inverting" the interest function.


Years ago I daydreamed about making a social news aggregation website like Reddit (though I was more into Digg at the time) that focused the "karma" on getting multiple, mixed viewpoints and perspectives on a story/topic. So if there is an event, someone posts the first source they find on it, and others get karma by linking other sources. I think I was brainstorming ways to record where sources disagree and other ways of keeping things open and balanced.


Who gets to decide what is extreme?


The targets of extremism.


The local majority.


Studies show that YouTube does largely do this. The recommendations are usually proximal to the current video, but they do generally trend toward the center. I wrote my own YouTube crawler to test this empirically (or at least as empirically as I can; recommendations aren't exposed through an API and I had to use Selenium). It is possible to go towards either extreme if one selectively chooses more and more left/right-leaning videos, but a random traversal usually doesn't result in extreme content. It's generally difficult to gauge, because recommendations usually fall into what I call "topic pits", where once you view a few videos in a topic, all subsequent recommendations will be in that topic. So my crawler would often fall into viewing a bunch of videos about gardening or math or something that is totally not political. But the point is, I never saw any trend of my crawler going down the "rabbit hole" that allegedly exists.

The problem I see is that people's preexisting polarization means that they see a trend of centrist content as radicalizing. So you end up with liberals seeing centrist recommendations as pushing people to the right wing, and conservatives seeing a push to the center as restricting conservative content.
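A minimal sketch of that kind of random-walk crawler with Selenium; the seed video and the CSS selector for the recommendation sidebar are placeholders (YouTube's markup changes often), so treat it as illustrative rather than the parent's actual code:

    import random
    import time

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    SEED = "https://www.youtube.com/watch?v=dQw4w9WgXcQ"      # arbitrary starting video
    SIDEBAR_LINKS = "ytd-compact-video-renderer a#thumbnail"  # assumed selector, likely to change

    def random_walk(steps=20):
        """Follow one uniformly random recommendation per step and log the URLs visited."""
        driver = webdriver.Chrome()
        visited = [SEED]
        try:
            url = SEED
            for _ in range(steps):
                driver.get(url)
                time.sleep(3)  # let the recommendation sidebar render
                links = driver.find_elements(By.CSS_SELECTOR, SIDEBAR_LINKS)
                candidates = [l.get_attribute("href") for l in links if l.get_attribute("href")]
                if not candidates:
                    break
                url = random.choice(candidates)
                visited.append(url)
        finally:
            driver.quit()
        return visited

    if __name__ == "__main__":
        for u in random_walk():
            print(u)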


I think making reshares and retweets much harder than the 1-2 clicks it takes now would have far more impact.


Politics is driven by the most extreme, passionately held views because those people are the most motivated to do something about it. Responding to extremes is a game of chasing goalposts.

Removing social media points isn't going to change any of that. People care more about belonging than being factually correct. As long as there is a place for two people who happen to share the same beliefs to find each other, polarization will continue to be a thing.


Question for the HN admins: What was the impact of removing vote counts from comments? What lessons did you draw from it?


I don't know. It was a long time ago. I could make an answer up, but you could do that too.


I've experimented with this theory by opting myself out of these vanity metrics using a browser plugin I wrote to hide them.

I can say that hiding them has reduced my engagement, which was my goal (the plugin is called Disengaged), but hasn't done anything for polarization.

To establish a link between engagement with these sites and polarization, you would basically be taking the stance that these sites are inherently polarizing, so using them less (by removing the engagement generators) decreases polarization.

Not a bad theory, but not one we can expect the businesses and investors behind these sites to implement on society's behalf.


I don't know about polarization, but I'd bet it would make everything a little less addicting in a very healthy way. When I left facebook, part of the reason was that whenever I posted, it came from a weird emotional place. Instead of casual "i'm gonna share the thing" it was "think really hard about how to make it super cool." Here too, but to a lesser degree since it's more anonymous and I don't know you all.


I don't see anything great about being casual. If I post some underwater photos on Facebook then I absolutely think hard about making them super cool! I don't waste my friends' time by posting sloppy garbage.


The problem isn't the like button, that's a symptom. The problem is human nature; specifically confirmation bias. The internet has enabled people to find what they want to find and see what they want to see.

We are a species that survived because we are social. That said, perhaps we are getting too close, too connected? Maybe we're wired for less "engagement"? Perhaps occasionally being quiet and/or lonely has benefits?


No.

Remove the text.

Make it a long form conversation. Yes, voice. So that you can hear inflection, empathy, sarcasm, hmmmms and uncertainty.

Every tweet reads like a proclamation. No nuance.


I like how WeChat does it for their "Moments" posts, where you can see only the number of Likes from your friends.

https://www.quora.com/Who-can-see-what-you-publish-and-comme...


What is wrong with "polarization" beyond it being different to the centre? All this talk of it seems like nothing more than a reactionary impulse.

Surely, views from the extreme shouldn't just be dismissed, and we should understand why people hold those views instead of doubling down and trying to eliminate them.


We already understand that pretty well. Trying to hug it out with people who are already committed to killing you is a fool's errand. There are several groups devoted to deradicalization if you want to research that in greater depth.


Wow.

That's a huge chasm you've shoveled to define polarization - "people committed to killing you". There are definitely extreme whackos out there, but do you really think they're that representative of the whole, or is there maybe a more nuanced way of defining the polarization effects of social media?


If we step it back a bit to "committed to policies which cause avoidable death", or "protection of those who have caused death by negligence", or "mistreatment of others in a context which is known to trigger suicide", then ... yes, that's all mainstream policy.

One example of this was https://www.independent.co.uk/voices/katie-hopkins-when-is-e... ; now, is Katie Hopkins "an extreme whacko" for overtly demanding the murder of refugees, or is she "just saying what everyone is thinking" among Sun and LBC audiences?


The GP was specifically talking about extremists.


But what if people didn't have these views before, and platform polarization is in fact creating these extreme views? That could very well be the why.


It certainly facilitates transmission but the underlying ideology is usually independent of the technology, imho - although it's arguable that extremist ideologies may be used to navigate the social uncertainty accompanying disruptive change.


Instagram is already testing hiding Like counts in several countries: https://www.nytimes.com/2019/07/18/world/instagram-hidden-li...


Polarizing opinions drive views.

It might help to temper the anxiety from watching your own posts, but let's be real - social media is a polarizing place because there's money to be made from drama, even (especially?) if it's fictional.

Looking to reduce polarization? Blame our ad model.


CGP Grey's "This Video Will Make You Angry" [1] makes this point well. Opposing ideas generate a lot of discussion/activity, but mostly within the opposed groups instead of between them. That further drives polarization, as the opposed groups mostly believe their own caricatures of the other group (which further disincentivizes cross-group discussion).

[1] https://youtu.be/rE3j_RHkqJc


Absolutely. Anything you measure and display as a sign of progress will skew behavior toward whatever increases that metric.

To quote Worf: "If victory doesn't matter, why keep score?".


Someone should make a social network where bots autonomously interact with and like your posts in a positive way, as part of the core design.

Call it Turing Club


Could reducing polarization reduce profits?


Paywall. But how does the author deal with replies, which are a very strong metric?



