Didn't Tumblr do a whole "alienate your user base" thing like Reddit did recently? They got rid of all the racy hot content and basically made it super boring, no?
They did, but that was a while ago. Things are still pretty bad -- if your blog gets flagged for racy content, you lose the ability to have a custom avatar, for example. (Get flagged more and you get banned.) They've also had quite transphobic moderation policies (really strange considering their userbase!) -- normal, non-sexual photos of trans people are "nsfw" now, which'll get your account flagged or banned. They've been making the app actively worse, redesigned the website to look like Twitter, and attempted to cram "Tumblr Live" down everyone's throats (offering only a "hide for one week" button). Generally they keep doing things nobody asked for and nobody wants, and keep breaking existing functionality while not listening to their users. This is an own-goal.
> normal, non-sexual photos of trans people are "nsfw" now
FWIW, the algorithms that big tech companies use are often trained to recognize women’s underwear. So non-sexual photos of trans men wearing women’s underwear are often flagged, even when similar photos of women wearing boxers are not.
I’m not saying it’s right, just giving the actual reason this happens. (Though I don’t have Tumblr-specific knowledge.)
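To make that concrete, here's a minimal sketch of how this kind of automated flagging typically works: a classifier plus a fixed score threshold. The label index and threshold below are placeholder assumptions, not anything Tumblr-specific -- real platforms' models and label sets aren't public.

```python
import torch
import torch.nn.functional as F

# Hypothetical label index for the "racy/nsfw" class -- a placeholder,
# not any specific platform's label set.
NSFW_CLASS = 1

def is_flagged(model, image_tensor, threshold=0.7):
    """Run one image through a classifier and auto-flag it when the
    'nsfw' probability clears a fixed threshold. Bias in the training
    data surfaces here as systematic false positives for whole groups."""
    model.eval()
    with torch.no_grad():
        logits = model(image_tensor.unsqueeze(0))   # shape: (1, num_classes)
        probs = F.softmax(logits, dim=-1)[0]
    return probs[NSFW_CLASS].item() >= threshold
```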
Cool. Now why does it consistently happen for images where no underwear is visible, and why doesn't it happen for cis women? Why are appeals not granted? And why didn't staff even acknowledge there's a problem?
No idea. I’ve never worked at any of these companies; I just learned this at an academic conference presentation on ethnographic research into OnlyFans users. OnlyFans doesn’t have its own recommendation system, so users need to find their customers on Instagram, Twitter, Tumblr, etc.
> OnlyFans doesn’t have its own recommendation system
This is by far the most mind-blowing thing about OnlyFans to me. I’m a very open-minded person, and I would have been so sure that a site like OnlyFans would absolutely require discovery. Like, so sure that if I were making it, I wouldn’t even question including it. The fact that they just offloaded this to every other site and then took all the monetization is amazing to me. Twitter, Reddit, and Instagram can barely get money from these users, but they’re super popular, and they just link to their OnlyFans on their profile.
I seriously wonder what other opportunities like that are out in the wild.
They also centralize all the risk. Operating a “pay me $X/month” service is really complicated legally, ethically, etc.
In a way, OF is valuable to platforms like Twitter and Instagram, because it has monopolized a lot of the riskiest cash flow that those companies would prefer to avoid / explicitly disallow in their ToS for a “subscribe to this influencer” service.
If OF had their own discovery platform, they’d have to deal with the thorny editorial problem of ranking. In the current setup, they can totally sidestep it. It would probably cost them more to operate discovery than it’s worth.
How sensitive/stupid are these? Can the behavior be exploited to force innocent images to get flagged as explicit? Asking for a friend… it would also make an excellent shirt design if properly tweaked.
Yes -- there are some examples I’ve seen in the past year of images specifically generated to make image detection think they’re X when they’re really just a mess of shapes and pixels.
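The standard trick is a small targeted perturbation (FGSM-style): nudge the pixels in whatever direction most increases the classifier's score for the class you want it to "see". A minimal sketch, assuming a generic PyTorch image classifier -- not a reflection of any specific platform's model:

```python
import torch
import torch.nn.functional as F

def nudge_towards_class(model, image, target_class, epsilon=0.02):
    """Targeted FGSM: add a small pixel perturbation that pushes the
    classifier's prediction toward `target_class` while keeping the
    image visually almost unchanged."""
    model.eval()
    image = image.clone().detach().requires_grad_(True)
    logits = model(image.unsqueeze(0))                      # (1, num_classes)
    loss = F.cross_entropy(logits, torch.tensor([target_class]))
    loss.backward()
    # Step *down* the loss gradient to raise the target-class probability,
    # then clamp back to the valid pixel range.
    adversarial = image - epsilon * image.grad.sign()
    return adversarial.clamp(0, 1).detach()
```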
Tumblr users are rabidly against all the user-hostile stuff other large free sites do. I think they probably could have put the site behind a $1/mo paywall two years ago and people would have gone for it, if they changed nothing else. There's a general "we're fucked" atmosphere on the site because we all know they're trying to wring money out of their users and the enshittification has begun. It doesn't seem like anyone in charge understands their users.
that was a long time ago, under a previous owner. Automattic has seemingly tried to be a good steward of the tumblr brand, but it seems like at this point, the only people left using tumblr are unmonetizable.
the bad idea was buying tumblr. but i think automattic knew that when they bought it, and just thought it would be fun to set some money on fire for a while.
> Didn't Tumblr do a whole "alienate your user base" thing like Reddit did recently?
No, they did that years ago (getting on for literally 10 years ago), and that's what ultimately resulted in selling it to its current owners for like 1/1000th of what Yahoo paid for it.
Those are too big to fail. Nobody would buy an iPhone that doesn’t have Instagram.
500px is another example of a site full of nudity that has to arbitrarily censor for iOS, all because Apple is protecting their 10x-market-rate credit card processing business (30% instead of 3%) by preventing sideloading. It’s bullshit that Apple decides for me what I am allowed to view. How convenient that it also protects their revenue stream.
I think you might be forgetting that Tumblr had a completely fine status quo, including showing me very much adult content on my iPhone, until Verizon bought them.
Blaming the App Store was always an excuse.
Tumblr just didn’t want to put the effort in to keep CSAM off their site.
What you've said above is completely incorrect, and as a former Tumblr employee I find it deeply offensive.
Why confidently state these things when you seriously have no idea whatsoever what actually happened here? GP's statements were much closer to the situation in reality.
> Last month, Tumblr said its iOS app was pulled from the Apple Store over child pornography found on the platform. According to a Twitter user who claimed to have found the illegal content, Tumblr had initially failed to take down the images, despite repeated complaints about the problem. (As of Monday, the Tumblr app is still unavailable on Apple's App Store.)
I think you can also look at the various discussions of this issue online. The user perception was that it was inordinately difficult to report abusive content.
I also recall an instance of finding problematic content and not knowing how to report it. I think it didn’t occur to me to think “this person might be underage, better hit the share button”. TikTok, for example, makes it incredibly easy to report offensive, unsafe, or illegal content.
Whether or not Tumblr management _intended_ to have very poor controls against CSAM doesn’t change the fact that they did in practice. And when push came to shove, instead of protecting user expression on their platform, they banned a whole category of content rather than building the tools and policies needed to make it work.
Worst case, they could have, instead of deleting all adult content, removed the toggle for iOS users and made them go to the browser for filtered content.
Even if you hate Apple for censoring their platform, it was Tumblr’s choice to apply that policy to Android and web users.
This response and earlier comments also fail to address the fact that the iOS app had a toggle to block NSFW content. It was allowed in 2017, and it was allowed in 2022. So I’m not sure why they had to purge non-CSAM content and remove the toggle in 2018. Source: https://techcrunch.com/2022/01/11/tumblr-sensitive-content-t...
Is there insider info that challenges any of that?
You're making a ton of incorrect assumptions about what approaches Tumblr tried, what solutions Apple rejected as insufficient, what resourcing decisions were made by Tumblr vs parent company, and so many other critical details here.
It's pretty messed up for you to say "Tumblr just didn’t want to put the effort" when you have no idea who or what was actually responsible for any of the things you're describing. I don't feel you are addressing this topic in good faith, so I'm not going to continue discussing this with you.
You’re saying that from the perspective of the Tumblr people, having been purchased by Yahoo/Verizon, there were limited options. And maybe negotiations with Apple came into play once the CSAM problem was already so bad the app was delisted.
What I’m saying is that from the consumer perspective, it doesn’t really matter who in the company made the choice (Tumblr people or Verizon people), a business choice was made to remove a significant part of the users and use case for the platform.
Similarly by the time Apple banned the app for CSAM, maybe it truly was too late to keep NSFW content on the platform for Apple users, and maybe it was seen by ownership as too expensive to implement a partial solution. Regardless of ownership, failing to invest in effective moderation and flagging tools at every point in Tumblr’s journey was a business choice.
Maybe it was too expensive to keep CSAM off the platform. I don’t see evidence that Tumblr made those investments. An easy reporting option for users is the bare minimum. Maybe Tumblr would have bankrupted themselves policing that content - but choosing to not go down that road is still a business decision and still not something that can realistically be pinned on Apple.
In other words there are two versions of the story:
- Excuse: we had to take down NSFW content platform wide because Apple hates adult content
- Explanation: we didn’t have the resources to police an NSFW platform effectively, so we made a business decision to drop that segment of our business
This always seems like a really bad idea.