Why did Usenet fail? (shkspr.mobi)
172 points by edent on June 5, 2023 | 281 comments



I used Usenet quite extensively in its heyday. At one point I even ran a Usenet site with dozens of peers and thousands of users. All on a very beefy (for its time) Sun E450.

In my opinion Usenet's greatest strength was also its downfall. It was a distributed system without central authority over who could connect to the network or firm control over groups and their contents. To connect to it as a peer, you just had to find at least one Usenet site willing to exchange messages (peer) with you. Usually that wasn't a problem. ISPs, universities, organizations of all kinds were running their own servers, offering Usenet client access to their customers and members for no additional charge.

In a distributed system without central authority, innovation is exceedingly hard though. The protocols and features were basically set in stone. Taking the concept of offering discussions among users to a centralized or closed system made it possible to innovate very fast and offer superior features to the users.

As the article pointed out, that wasn't the only protocol which got picked up that way and innovated upon until it displaced its origins.

Since Usenet is still frozen, feature-wise, where it was 30 or more years ago, I don't see a revival coming anytime soon.


This is the key problem with decentralised systems. They simply can't innovate as quickly as centralised systems.

We saw this with Usenet and we also saw it with IRC. IRC really was what Slack is, 30 years ahead of it. However, IRC has so many essential missing features it never caught on - push notifications, saving your state when offline, admin controls, etc. I know some of these were solved with extensions and workarounds (having a shell connect to IRC and then you connect to the shell for persistence, for example) but it's a huge hack. This compares to Slack or Teams, where they can push a back-end and front-end update to millions/billions(?) of users in a very short space of time to add additional functionality.

This is another reason why all the decentralised 'blockchain' things would have struggled (on top of many others).


I think this argument is seductive but wrong because it ignores an invisible elephant in the room: funding.

Decentralization is a business problem, not a technical problem. Engineers tend not to see this because we're engineers and so we see technical problems first.

Usenet had no economic model. All the problems you list are solvable if there were funding available to solve them.

Free volunteer developer work tends to stop at the level of polish with which developers are comfortable, which is usually command line interaction and fairly manual processes. Developers generally have to be paid to develop new features and polish those features for the general audience, which is why there are precious few open source systems used by anyone other than developers.

Those that do exist tend to be subsidized by huge companies for the purpose of "commoditizing your complements" or as tools to herd users into an ecosystem that has upsell opportunities built into it. Examples: Chrome, any open source client for a SaaS service, etc.

Non-profits can fund to some extent, but the truth is that polished, feature-rich, easy-to-use software is extraordinarily expensive to produce. A system that a developer can create in their spare time might cost millions to render usable to non-developers. Computers are actually very hard to use. We just don't see this because we're accustomed to it. Making them easy to use is a gigantic undertaking and is often far more difficult and complex than making something work at the algorithmic level.

Centralized systems with built-in economic models like SaaS or commercial software tend to triumph because they can fund the polish necessary to reach a wider audience. A wider audience means exponentially larger network effects. See Metcalfe's Law.

Cryptocurrency could have offered an alternative model but failed for entirely different reasons: perverse incentives that attract scammers. In crypto by far the most profitable thing to do is build a fake project that can appear just credible enough to attract retail buyers onto whom you can dump your tokens. There is no structural incentive to stick with a project and really develop it because all the money is made up front through the initial offering. This also ruins the ecosystem because "the bad chases away the good." Scammers make legitimate people not want to go anywhere near crypto, transforming the whole ecosystem into a "bad neighborhood."


> Decentralization is a business problem, not a technical problem. Engineers tend not to see this because we're engineers and so we see technical problems first.

> Usenet had no economic model. All the problems you list are solvable if there were funding available to solve them.

You hit the nail on the head. Usenet was very much a "back channel" in its early history, with school and corporate IT people setting up feeds on the quiet. Indeed, it began outside the Internet, using dialup UUCP links to exchange news and mail.

Email addresses were "bang paths" where you had to route your mail to its destination yourself. If your outside mail gateway was foovax, an outgoing address might look like "foovax!decwrl!ihnp4!uiucuxa!example" instead of "example@uiucuxa.uiuc.edu". In posts, it was a convention to give your email address as a path from one of the famous well-connected sites.
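For anyone who never saw one in the wild, here's a tiny sketch (plain Python, using the hop names from the example above) of what a bang path actually encodes - each host in the list forwards the message to the next until it reaches the final mailbox:

    # Illustrative only: a bang path spells out the relay hops explicitly;
    # UUCP mail walked them left to right until it reached the final mailbox.
    bang_path = "foovax!decwrl!ihnp4!uiucuxa!example"
    *hops, mailbox = bang_path.split("!")
    print(hops)     # ['foovax', 'decwrl', 'ihnp4', 'uiucuxa'] - relay hosts, in order
    print(mailbox)  # 'example' - the user on the final host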

One of the biggest early players was (surprise) Bell Labs. If the name "ihnp4" rings a bell to you, you might be as old or older than me. :) That was a central hub for the UUCP network, both news and mail. I wonder if any AT&T bean counters knew about it back in the day, or, getting back to the original point, how many bean counters found out about Usenet in their respective companies and forced it out?


I'd add, polish often means different things to different people. For example, failure planning and risk management often only happen when there is funding.

There also was no content moderation.


> However, IRC has so many essential missing features it never caught on

IRC never caught on? IRC was extremely popular throughout the 90's and into the 2000's


You're right; but I read that as IRC didn't implement ("catch on to") those essential features as technical feasibility marched forward.


> extremely popular

Email and the web were extremely popular. Facebook is extremely popular. Very few people have ever known what IRC is.


Decentralised systems solve this problem with a BDFL, which many have used successfully. Look at Linux, for example.


> without central authority over who could connect to the network or firm control over groups and their contents

Moderation. In public forums, everything else is trivial and moderation is the invisible stinking hairy elephant in the room. To the question "why isn't there a decentralized open protocol free software X" the answer is moderation. Think of it as the discreet but efficient bouncers without which a sufficiently popular public place cannot maintain a welcoming atmosphere. Newsgroups became sufficiently popular, death by inadequate moderation ensued.


"Moderation" is a terrible excuse used by the corporate types as a reason to stick in their safe walled gardens.

We had moderation on Usenet. It was simple, but damn, did it work. We had plonk lists (client-selected ignored users), keyword scoring, server-level scoring, some anti-spam. There was EARLY work on Bayesian filters built into clients, but it was too late for the "big event". (When ISPs everywhere killed their Usenet servers, for what is pretty widely assumed to be anti-piracy reasons.)

But let's look at corporate moderation. It's done in secret, with vague rules that may or may not be stated, and by content moderation farms that use and abuse people. https://www.wired.co.uk/article/facebook-content-moderators-... . Or worse yet, you fool people into faux-ownership (Reddit) and get them to do your moderation for you. Well, that is unless you do a bad job and get your subreddit banned for "low/no moderation".

I'd much rather see everything, and craft my own blocklists as I see fit. You know, just like Mastodon. Sure, there are some fediblocks at the admin level over pervasive content from a single server. But aside from admin tasks, it's pretty damn sweet. And it's loads better than the shit that happens over on Twitter, Reddit, or Facebook.


We, the technologically literate, are not the target audience. Client-side filtering requires awareness of client-side filtering. How many Reddit users want that? Facebook users? Oh, maybe they want to fine-tune their scoring rules library or tweak their Bayesian filters - so much fun! And lacking social intelligence, client-side filtering fails against organized harassment. By the way, who is liable for infractions of local law?

Enter the party, meet regulars, meet new faces, enjoy a drink and don't even think about safety. Sure, you can manage it yourself for a birthday party at your home but at larger scales even the wild rave party thrown in an abandoned hangar out there will have some form of institutionalized enforcement, even if they are not corporate security.

Yes, that means that successful large public places tend towards bland - the same reason why successful edgy underground dives are small affairs and their character can't survive growth.


There's no reason a private/paid service addon to a client can't work for filtering. Been thinking similarly for Mastodon, in that you can pretty easily have a gated experience, and get top ranking for Android/iOS client use with a monthly service that includes filtering with relatively sane defaults, but toggle-able filtering/reporting.

You can use fdroid, webapps or side-load the "full" unfiltered client but the experience for the normies would be that paid app experience. And I say paid app to avoid the pitfall that is advertising fed results.


> There's no reason a private/paid service addon to a client can't work for filtering.

There should be no reason why a service can't provide that feature natively.


Cost.


> Client-side filtering requires awareness of client-side filtering. How many Reddit users want that? Facebook users?

Facebook[1] and Reddit[2] both have a block user feature. It's a very rudimentary version of the killfile that usenet clients have. If users didn't want features like that, then they wouldn't be available on either platform.

> client-side filtering fails against organized harassment.

Usenet killfiles were advanced enough that you could easily filter out messages even from organized campaigns. For example, filtering it based on a number of conditions such as group header, from header, a unique header, certain keywords in the body, etc.

[1] https://www.facebook.com/help/168009843260943

[2] https://support.reddithelp.com/hc/en-us/articles/214548323-H...
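For readers who never used a newsreader: a killfile is really just a list of match-and-hide rules applied client-side. A minimal sketch in Python (the rules, header names, and message structure are made up for illustration, not any particular newsreader's syntax):

    # Minimal killfile-style filter: each rule matches on a header or the body.
    # Rules, header names, and the message dict here are illustrative only.
    import re

    KILL_RULES = [
        {"header": "From", "pattern": r"spammer@example\.invalid"},
        {"header": "Subject", "pattern": r"(?i)make money fast"},
        {"header": "X-Campaign-Id", "pattern": r".+"},  # drop anything carrying this header at all
        {"body": r"(?i)buy cheap pills"},
    ]

    def killed(message):
        """Return True if the article matches any kill rule and should be hidden."""
        for rule in KILL_RULES:
            if "header" in rule:
                value = message["headers"].get(rule["header"], "")
                if re.search(rule["pattern"], value):
                    return True
            elif re.search(rule["body"], message["body"]):
                return True
        return False

    article = {"headers": {"From": "spammer@example.invalid", "Subject": "hi"}, "body": "hello"}
    print(killed(article))  # True - matched the From rule

The real killfiles just expressed the same idea in the client's own scoring syntax rather than code.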


> We, the technologically literate, are not the target audience.

Highly disagree. It's the early adopters of ANY platform, hardware, or whatever that drive real organic demand and adoption. And those early adopters ARE exactly that: technologically literate.

Alienating that group is how you get relegated to the "Digg v4" area of history.

> How many Reddit users want that? Facebook users? Oh, maybe they want to fine-tune their scoring rules library or tweak their Bayesian filters - so much fun!

How many Facebook or Reddit users even HAVE the capability of client-side anything? The big threat is that Reddit's killing most API access unless you pay a pound of flesh. And Facebook does enough horrible things in their coding that it's practically impossible to even screen-scrape. The text "Sponsored" shows up as shittery like "essprnodo".

For those gated islands, you'd need their permission to really do any sort of client side anything.

Places like Mastodon make blocking trivial. I've shared that link, and no, you don't need to know about Bayesian weights or whatever new mod scheme you want to make. It's there, the initial functions are easy-peasy, and the API hooks are there to add whatever you want.

> And lacking social intelligence, client-side filtering fails against organized harassment.

And that already is a major problem on the corporate side. Reddit calls it "brigading". HN has chill-down scripts to defuse flamewars. Twitter, Facebook, and Reddit all have mass-reporting as harassment. There are even current 4chan threads to organize just that. Sure seems like an unsolved problem, not just "unsolved for federated systems".

> By the way, who is liable for infractions of local law?

Irrelevant. Someone in another area could read content that is illegal to them. That's *their* responsibility, not mine. As long as I follow the laws in my jurisdiction and where my server is located, I'm fine. And 17 USC 512 covers me as long as I do good-faith DMCA process.

> <example of comparing physical real world with raves/alcohol/drugs with online "dangers">

And in your latter half, your other problem is that you're equating online "safety" with in-person bodily safety around alcohol and drugs. Those two aren't even remotely the same, and it's laughable that you'd try to compare them.

If someone is coming on to you in a sexual manner, and your "No" isn't effectively heard or acknowledged, you have a big problem. If someone uploads a bad image (say, animal torture, for something abhorrent), you can block the user, block the server, or report the user to their/your server admins. You can also choose not to show images. It's terrible, but physical assault is no comparison to a bad image or terrible text.


Someone in another area could read content that is illegal to them. That's their responsibility, not mine

Unless you serve people from the EU or sell things to US customers like bitcoin.


Moderation isn't only about removing bad content, it's about removing content that's not necessarily bad but is unwanted. /r/gaming and /r/games are entirely different subreddits, with different content focuses and different communities because of their moderation. If I want junk-food content consumption I can go to /r/gaming. If I want to see news about video games I can go to /r/games. How do you accomplish that with client-side filtering? /r/games has Indie Sunday, where developers can post their own games on Sunday without competition from news about the massive AAA games everyone's already heard of. How do you accomplish that with client-side filtering?


Good point. The majority of my Usenet memories revolve around meta discussions. What is and what is not supposed to be posted. Mid-2000s and people still had flamewars about whether a four-line signature was too heavy.


Most meta discussions (beyond the basic etiquette and moderation, which were mostly worked out in the 1980s already) don't have objective answers, just subjective preferences. The solution is to keep aggressively iterating on splitting subreddits/communities/categories of users, until quality interaction is maximized. Obvious example: one category of user wants to discuss politics via memes, another category wants to discuss in long-form essays, another in threaded discussions, Canadians want to discuss Canadian politics without being flooded with US stuff, and plenty of others not at all i.e. they want any political posts off-topic and banned. The obvious way to keep all these groups maximally happy is to split into different groups.


Corporate/algorithmic moderation is almost entirely about removing "bad" content. You're talking about community moderation, which works exactly the same in decentralized systems.


> the "big event". (When ISP's everywhere killed Usenet servers, for what is pretty surely assumed as anti-piracy.)

I thought this was due to the deal[1] that Andrew Cuomo (New York's attorney general at the time) made with major ISPs to restrict access to child pornography. Many major ISPs discontinued their usenet service in that time period.

[1] https://www.cnet.com/tech/tech-industry/n-y-attorney-general...


There were plenty of well moderated newsgroups actually.

A moderated newsgroup had all messages going through the moderator's system, with their programs deciding whether or not the message should be forwarded to the rest of the newsgroup. This doesn't necessarily mean that a human read all messages either.

Some of the oldest flamewars I've participated in are about moderated vs unmoderated USENET newsgroups. It amuses me that we're still debating the issue today.


What kind of moderation programs were people using then? Word filters? I can't imagine anything that isn't both easy to circumvent and likely to catch innocent posts merely quoting a "bad" word.


It wasn't like programmers were less creative or productive than they are now. The languages were fully capable of doing anything on a server you could do now. It was mostly C. Programmers were creative and smart, just like now, and could implement anything, albeit ground-up since there were fewer existing building blocks to reuse.

A lot of sophisticated anti-spam software depended on some sophisticated anti-anti-spam showing up first, and there is a lot more of that now for sure.


Even now forums rely on manual after-the-fact moderation and not just algorithmic pre-moderation.


Several programs are available to do it: https://www.big-8.org/wiki/Moderated_Newsgroups#Moderation_S...

The type and quality of moderation is up to the moderation team. Might be automatic, might involve a review by a human for each post.

As someone else already mentioned, there are groups which were configured to be moderated but have no moderators at all to approve posts. Either they disappeared or there never were any. To participate in these groups you have to set the "Approved:" header in your post yourself. Obviously that would also be the way to circumvent the moderation process for any other moderated group. The countermeasure is that the moderators send out a cancel control message, which instructs all the news servers receiving it, or at least those that honor cancel messages, to delete that specific message.
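To make that concrete, here is roughly what the two articles involved look like: a post carrying a self-set Approved header, and the moderator's cancel control message that retracts it. The Approved and Control: cancel headers are the standard Netnews conventions; the addresses, group name, and message IDs are invented. Sketched with Python's stdlib email classes just to show the headers:

    # Illustrative only: the Approved header that marks a post as moderated, and
    # the cancel control message a moderator can issue to retract a forged one.
    from email.message import Message

    forged = Message()
    forged["From"] = "poster@example.org"
    forged["Newsgroups"] = "comp.misc.moderated"
    forged["Subject"] = "Sneaking past the moderator"
    forged["Message-ID"] = "<abc123@example.org>"
    forged["Approved"] = "poster@example.org"   # self-set: servers treat the post as moderated
    forged.set_payload("Hello, world.")

    cancel = Message()
    cancel["From"] = "moderator@example.net"
    cancel["Newsgroups"] = "comp.misc.moderated"
    cancel["Subject"] = "cmsg cancel <abc123@example.org>"
    cancel["Control"] = "cancel <abc123@example.org>"   # servers honoring cancels delete abc123
    cancel["Message-ID"] = "<cancel-abc123@example.net>"
    cancel.set_payload("Forged approval cancelled by moderator.")

    print(forged)   # prints the RFC 822-style headers plus the body
    print(cancel)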


I wasn't "tech awakened" at this time, so I simply didn't know which programs were used.

What I can say is that plenty of spammers would have one or two posts on Newsgroups, before the mods would catch them and ban their email address.

Depending on the newsgroup, I'm sure sometimes "new users" would have all of their posts reviewed before their posts were forwarded to everyone else. But after being added to some kind of whitelist, they could say whatever they wanted (well, until a flamewar caused them to go over the line and get banned).

So similar tools as moderators use today on Reddit / IRC / whatever. Or at least, it felt like that to me. Maybe it differed in implementation, but the overall effect was the same.


People used Killfiles [1]

[1] https://en.wikipedia.org/wiki/Kill_file


I don't think a "kill file" was the same as "moderation".

A kill file is "User X is tired of seeing User Y, so X stops reading posts from Y". Everyone else can still read Y.

A moderator is "Moderator is tired of seeing User Y, so Y cannot post to Newsgroup anymore". A much stronger action.

---------

It's been a few decades since I was part of any Usenet though. So maybe I'm mis-remembering the lingo. Or are you saying that the moderators used kill-files that somehow applied to the whole newsgroup?

Because... the latter would make sense. But I really don't know how things worked back then.


There's a third layer: the servers. Newsgroups relied on servers propagating messages to other servers as well as delivering them to end users, and they could have various types of filters.

The same sort of model exists in modern distributed systems like Mastodon/ActivityPub. I can block a user from another server or an entire server individually, or my server admin can block them from communicating with anyone on my server.

What's missing that I think will be required for anything that gets popular enough is a means of sharing blocklists automatically, preferably with some machine-readable details so that they're useful even when servers have different rules (e.g. I want to subscribe to bans from Foo if the ban is for hate speech, but not for porn).
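Something in that direction could be quite small; a sketch of a shared, reason-tagged blocklist that a subscribing server applies against its own local policy (the format and field names are invented for illustration, not part of ActivityPub or any existing tool):

    # Hypothetical shared blocklist with machine-readable reasons, so a subscriber
    # can apply only the categories they care about. Format and names are made up.
    SHARED_BLOCKLIST = [
        {"domain": "badactors.example", "reasons": ["hate_speech", "harassment"]},
        {"domain": "nsfw.example", "reasons": ["porn"]},
    ]

    MY_POLICY = {"hate_speech", "harassment"}  # subscribe only to bans for these reasons

    def domains_to_block(blocklist, policy):
        """Return the domains whose ban reasons overlap with my local policy."""
        return [entry["domain"] for entry in blocklist if policy & set(entry["reasons"])]

    print(domains_to_block(SHARED_BLOCKLIST, MY_POLICY))  # ['badactors.example']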


The machinery for a moderated newsgroup works like this: the news server you post an article to has a list of moderator email addresses for moderated groups. Instead of posting your article, the server emails it to the moderator. The mod will then either just discard it, or else post it to their own news server with an Approved header to say it's been moderated (and these days some crypto signing stuff). Then the post gets propagated as usual.

The "approve/deny/edit" stuff is handled by moderation software, which in principle can do anything you like.


No, killfiles were at an individual level. But they were more than "I'm tired of seeing User Y", as you could also plonk text patterns and the like.

IIRC group-wide moderation was handled per-post in that every post was approved by hand.


> IIRC group-wide moderation was handled per-post in that every post was approved by hand.

Definitely not. The kinds of "mistakes" I've seen in moderation back then suggest that the process was highly automated.

Moderated Usenet is way more similar to modern Reddit than people probably realize.


There was moderation. There was even a newsgroup that probably has some overlap with here that required you to circumvent it in order to post.


[flagged]


Your summary is incorrect, and insulting. That is not how it happened. The old academic usenet was more than content to discuss sensitive topics. The problem with post-September usenet was spam.

Small groups were OK. As soon as a group had any reasonable traffic, the spammers moved in. Since the spammers didn't care about conversation, they didn't need to maintain a stable identity, and they evaded killfiles. They could quickly drown out the topic with 10x ads. Not wanting this is not about being uncomfortable - it was about swimming in 90% garbage ads.


> Your summary is incorrect, and insulting. That is not how it happened. The old academic usenet was more than content to discuss sensitive topics.

I see no insult towards you, or anyone else in particular. In fact, to the contrary, others here are saying Usenet died due to lack of content moderation - aka other people enforcing their will on others.

> The problem with post-September usenet was spam.

Now I know you're just guessing. The 'September that never ended' was when AOL peered with internet gateways to bring everyone on the AOL network to the internet, most of whom knew little about computers or anything else. "Me too" was around then, as a derogatory shorthand for what an AOLer would say. Even Weird Al in "All About the Pentiums" had a line to put them down like Old Yeller.

Usenet really died in winter 2007-spring 2008, when all major ISPs killed their subscriber Usenet servers. They never officially stated why, but grumbles that made it out from the systems engineering folk were that alt.binaries was eating up loads of bandwidth, and much of it was 100% piracy.

Spam was a thing, but most of us used good NNTP clients that removed a good 90+% of spam. And having lived through when Usenet was ubiquitous, I fully reject your hypothesis.

And you might want to consider taking conversations here more impersonally. Feeling "insulted" over a comment that was directed to nobody in particular isn't healthy.


> Usenet really died in winter 2007-spring 2008, when all major ISPs killed their subscriber Usenet servers. They never officially stated why

It was because the attorney general of New York (Andrew Cuomo) made a deal with major ISPs to restrict access to child pornography[1]. ISPs decided to drop their usenet service in response.

[1] https://www.cnet.com/tech/tech-industry/n-y-attorney-general...


Towards the late 2000s, alt.binaries was irrelevant to consumer ISPs etc, as it was not available. It was trivial for any systems engineer to exclude alt.binaries.* from their news server, and most did.


This is the main thing IMO.

Usenet went from students posting at a modest number of elite schools, .mil sites, and a relative handful of tech companies--often under their real name--to the Eternal September and beyond. Like many problems with the Internet over the years, it occurred because the system grew up on the assumption of mostly trustworthy actors.


This is level 5 of the Content Moderation Speedrun.

https://www.techdirt.com/2022/11/02/hey-elon-let-me-help-you...


More that it was overwhelmed with bad actors spamming everything, with no way to stop that.

Free speech is not about letting you post your conspiracy theory in the wrong group


I can see you are an "I support free speech, but [except]" type who then proceeds to explain that you do not actually support free speech.


What if someone followed you around and just kept saying "You're a cunt. You're a cunt. You're a cunt." over and over?

Any time you're not on private property. "You're a cunt".

Is that free speech?

I mean, I'm allowed to be on public property. I'm allowed to say whatever I want. Exercising those two rights means I'm allowed to follow you around and call you a cunt.

You'd probably try to say "No, that's harassment". Aye, but now you've agreed there should be limits on speech.


When people say that there should be no restrictions on speech, it's usually about content, not manner. No free speech absolutist would think it's acceptable to tell someone "you're a wonderful person" through a 600 dB speaker (if it was possible to construct such a device). If I scream at you so loudly that it liquefies your organs, I don't think that the fact the sound happened to form words makes the act any less harmful. Why should it be any different with harassment?


> If I scream at you so loudly that it liquefies your organs

At that point, the "speech" is objectively harmful.

> Why should it be any different with harassment?

Free speech absolutists act as if words on their own are not harmful, as if there's no such thing as verbal harassment.

Look at the sibling comment [0]. phpisthebest actually claims that being followed and told "You're a cunt" repeatedly would be amusing.

[0] https://news.ycombinator.com/item?id=36202392


If they have that kind of time, and no life, I would find that rather amusing...

I am sure they would tire of it long before I would get offended by it. And I can assure you I could, and have, responded with far more decisive and penetrating insults...


If there's one thing we've learned from the last two-decades-odd of the Internet being ubiquitous, it's that one cannot, in fact, outlast the harassers. They organize and they take shifts. They can keep the harassment going as long as it takes. They can keep it up until you are dead. And they can do it at scale.

That's why cutting off their ability to do it through various channels is so important.


The internet reduces the cost of being an ass, both in terms of reach and being punched in the face for harassment


>What if someone followed you around and just kept saying "You're a cunt. You're a cunt. You're a cunt." over and over?

Free Speech is just that. It's not freedom from consequences too.


Not sure how the "Freedom from consequences" bit enters the conversation here.

"Free Speech" that has consequences from the government isn't free speech. The "Freedom from consequences" has to do with private entities reacting to your speech. For example, asking you to leave the premises or getting banned.


Is it possible to separate free speech from consequences? I'm not sure it can be. From your house, to government, to nature, doesn't every person/place have their own set of compromises/laws/rules for every set of conditions? Are you saying there should be no laws or no moderation, anywhere, at all? I don't think you mean to. Also, should 'online' be distinct/exempt from all other human endeavours, where rules don't apply? Wherever there is a line between individuals and governments regarding free speech, and indeed law, then neither free speech nor law truly exists. But it is difficult to give an example of a country where this is not so.

I am reminded of the other hot potato 'privacy'. The 'Solid Project' has an interesting way of dealing with that, which may also have a positive effect on how free speech evolves.


> Are you saying there should be no laws or no moderation, anywhere, at all?

I wanted to address this first. Nowhere did I mean to imply that I think there shouldn't be limits on speech or no moderation. There should certainly be some speech that is illegal (Calls to violence, harassment, etc), and web platforms absolutely have a right to moderate.

But, I think everyone can agree that the right to free speech granted by the First Amendment does mean that you can criticize the government, for example. If someone tries to claim "Yes, you have the right to criticize the government, but the government can arrest you for it! Freedom of speech does not mean freedom from consequences!", then I would have to ask that person what they think "Freedom of speech" means, and what a "right" means, because to me, a "right" means either "You are allowed to do the thing and the government must not interfere or retaliate" (ie, free speech) or "You must be allowed access to the thing, and the government must provide it for you if you can't provide it yourself" (ie, a lawyer when you are on trial).

The notion of "You have the right to something, but the government will punish you for it" is just completely non-sensical.

But...all this only applies to government. Private entities are another matter entirely. They have the right to remove content and users however they want. Your freedom of speech does not trump their right to decide what gets displayed on their platform.

Private entities can also react however they want. If you go on a racist tirade on Twitter, your employer has a right to fire you, because having an extreme racist in the workplace is just asking for trouble.


Billy Joel's rule: "You are free to speak your mind, but not on my time."

That is, sure, you can speak to whoever wants to listen. You can't make anyone want to listen, though - including, you can't make a forum want to listen. It wants to talk about gardening, and you want to talk about the latest Covid conspiracy theories? Great, knock yourself out, talk about your conspiracy theories - but not on the gardening forum.

See, I support free speech. I support their freedom to talk about gardening without having to wade through a bunch of conspiracy theory posts. "Post anything anywhere" reduces actual communication - willing speakers finding willing listeners - because the listeners give up rather than wade through a bunch of garbage that they don't care about. I support their freedom to talk about gardening without getting drowned out.


But here the question is who is doing the controlling

Usenet is an open protocol, not a forum that someone owns. So if you support moderation on an open protocol, then most likely you support systemic or, worse, government censorship. Not someone controlling their own property.

With Usenet there were plenty of tools (just like with email today) to block spam, content, users, etc. at the client level if an individual did not want to see certain speech.

With time those tools would have improved just like they have with email


> So if you support moderation on an open protocol, then most likely you support systemic or, worse, government censorship.

Feel free to stop telling me what I support. You're lousy at it.

If a protocol is open, that doesn't mean that my servers are. I'm moderating my servers, not the protocol or the traffic carried using the protocol. (In the same way, you don't control who drives by your house, but you control who's allowed in the door.)

I don't want systemic censorship of speech (including by ISPs). I don't want government censorship of speech. But I support the right of servers to say "I'm not carrying that". And I support the right of a community to have a space that is public to that community, and yet to keep stuff that they don't want out of it. And I support the right of those who don't agree to go form their own community, and to decide for themselves what the rules are for their community. And all of this can happen on open protocols without systemic or government censorship.

Specifically on usenet, you could have, say, alt.hobby.gardening, and have it get overrun by spam, and have alt.hobby.gardening_moderated start up with moderation to keep the spam out. And if you don't like it, and want to post your conspiracy rants, go post them on alt.hobby.gardening with all the spam that nobody reads. Knock yourself out. Or go start alt.hobby.gardening_conspiracy, or whatever. But let the gardening nuts have their space without your rants.


I, for one, own that label.

I support free speech except in some circumstances where curtailing it makes almost everyone's lives better.

Where I support it to the most extreme: criticizing the government.

Where I want it most throttled: my own living room, so I can have some peace and quiet.

I think most functioning adults have a similar spectrum of allow / deny and aren't absolutists on the topic.

And, I mean... I don't think we need to look deep into the history of the Internet to see that becoming the dominant pattern. USENET failed. Email without spamblockers failed. Every forum either eventually adopts moderation policies or goes underground where barely any casual user hears of its existence.

The user interprets lack of censorship as damage and routes away from it.


Here we have a fundamental problem

It is often NOT the users of the platforms that want censorship, it is users of another platform, targeting something they do not like on the internet, and then going after underlying structures, be it ISPs, DDoS-protection services, domain registrars, hosting providers, credit card processors, etc etc etc

We have plenty of examples of this on the internet, where it is not the users of the sites that want something censored, but 3rd party actors using systemic pressure on critical infrastructure to effect the censorship

You have this utopia where someone puts up a gardening forum and the users or even the owner of the forum says "We do not allow talk about Politics here"...

Where in reality someone sets up a political forum, someone posts something, then someone on Twitter, Facebook, Insta, etc gets offended and starts a campaign to have CloudFlare, or AWS ban them.


Isn't campaigning to have something banned also free speech? If they start blowing up data centres, that's a different question.


Yep. It's a pretty clever approach, and I for one am in hindsight surprised that it took this long for people to realize that the system is a network and is therefore vulnerable to network dependency attacks.

This is not necessarily a bad thing. Nazis, for example, deserve no platform and if companies decide to revoke service when they discover they're serving Nazis that's their prerogative. It can be abused, especially given that the largest players in the space are going to trend sensitive to controversy... But the internet has become embedded deeply enough in our lives that people can no longer enjoy the pretty fantasy of the neutral service provider; IBM got away with it in World War 2 and that's generally pointed to as a moral failing these days.

If people are calling for enforced neutrality, they're really calling for government regulation (who else would enforce the neutrality?) which is its own can of worms.


>Nazis, for example,

Then you have the problem of labeling everyone that disagrees with you a "nazi" for example just because they favor strong border laws, and immigration controls, or oppose social justice and ESG "capitalism" or other such things

Or the true crime of having a difference of opinion with a trans person...

>>This is not necessarily a bad thing.

yes it is a bad thing, it will always be a bad thing, even if done to actual Nazis, not the new-age definition of "Nazis", which seemingly is anyone that happened to vote for Trump or holds conservative political positions.

>>If people are calling for enforced neutrality, they're really calling for government regulation

yes and no, depends on what we are talking about

Are we talking about an already heavily regulated industry like banking? Then yes, 100%, I am calling for enforced neutrality. Neither JP Morgan Chase nor MasterCard should be able to refuse business to someone for protected political speech.

Services like AWS or CloudFlare.... No, I would not call for or support regulation, but then we come full circle back to the loss of cultural support for free expression.

There was a time when the ACLU would defend the KKK's right to speak. We should return to those days, when the axiom was "I disagree with you but I will defend your right to say it".

If we are at the point where the population is pushing for AWS to de-platform competitors of Twitter because of disagreements over politics, we are not far removed from government censorship

The reality is you would likely support both, as you clearly abhor actual free speech

This is also why Elon's purchase of Twitter was so critical, and I 100% support his efforts.


> The reality is you would likely support both, as you clearly abhor actual free speech

Broadly speaking, I'm a fan of the marketplace of ideas. That marketplace includes boycotts, callouts, and choices by private individuals to provide or revoke a platform for someone else to speak in their space. Freedom of the press has never implied that everyone gets a free press.

> which seemingly is anyone that happened to vote for Trump or holds conservative political positions.

Hacker News isn't really a great venue for that kind of discussion. But there's a reason the pushback on him and his supporters has been so strong relative to previous politicians. He and his supporters actually do represent something fundamentally caustic to the American body politic. https://www.bu.edu/articles/2022/are-trump-republicans-fasci...


>>That marketplace includes boycotts, callouts, and choices by private individuals to provide

Which means I am sure you support the ongoing boycotts happening to BudLight and Target? Or are boycotts only for one side?

> Freedom of the press has never implied that everyone gets a free press.

The problem here is that when the government subsidizes the people that make the press, regulates the people that make the press, and limits who can produce presses, it ceases to be a free market and becomes a regulated market

One can make the case that since the internet was started by the government (DoD) and regulated for decades by the government (Dept of Commerce via ICANN), and given the tons of subsidies ISPs have gotten and continue to get, the locations where the internet is a government service, and tons of other factors, the idea that the internet is a purely free market is clearly a false narrative

One often used for convenience when it meets people's political goals, and then dropped the second that freedom is not aligned with their goals anymore.

Further, when it comes to consumer boycotts, you have to actually be a consumer; 99% of the people that have effected policy changes at CloudFlare and AWS have never spent a single dime at either of those services.

Today cancel culture is something more than just a boycott, it is something new not seen before in human civilization, and it is a huge threat to not only freedom of expression but freedom in general

I am unclear why people do not get that.


> Which means I am sure you support the ongoing boycotts happening to BudLight and Target? Or are boycotts only for one side?

Of course I do. Nobody should feel compelled to buy a beer they don't want to buy.

> 99% of the people that have effected policy changes at CloudFlare and AWS have never spent a single dime at either of those services.

I think I need a citation for that. I used to work for a cloud company and there is a lot of back room negotiation that people don't realize happens. My default suspicion is that product advocates and product managers received real clear signal from people with money to spend that they were going to spend it elsewhere.

> Today cancel culture is something more than just a boycott, it is something new not seen before in human civilization

No, shunning is actually very old. It is the way communities small and large dealt with unacceptable behavior in their midst. Nobody is compelled to associate with somebody they don't want to.

What did change is that for a brief period of time, the existence of the internet, its relative obscurity, and the pseudonymity it provided convinced a generation that they had subverted the old cultural norms. What we are witnessing is a reification of those cultural norms onto the new technology. The USENET day is dead. The techno utopia was tried and found very wanting. Old patterns are reestablishing themselves, though with more voices at the table because ultimately, you can't actually stop the signal.

The former president was kicked off of somebody's microblogging service and responded by creating his own. That's how it's supposed to work.


> Where I support it to the most extreme: criticizing the government.

> Where I want it most throttled: my own living room, so I can have some peace and quiet.

This is a category error. What if someone wants to criticize the government in your living room?


They are free to buy a living room and criticize away. My living room, I am allowed to determine whom to allow and whom not to allow in. And I would like to find online spaces where the loser spam (both commercial and ideological) is kept out, so I can find the speech I am interested in. Why is this desire felt as oppressive by so many?


> Why is this desire felt as oppressive by so many?

Because they're the spam being kept out.


Before you wrap yourself too tightly in the 1st Amendment to the Constitution of the U. S., perhaps it would be helpful if you defined what “free speech” means to you. Because repeatedly typing “no true Scotsman…” doesn’t do much to move the conversation along, and definitions obviously differ.

For example, I feel that you can say what you like, just not in my living room. Free speech does not guarantee an audience.


That is the often cited copout that has no meaning.

"I feel that you can say what you like, just not in my living room" is a pointless axiom that provides nothing of value to the conversation.

The debate becomes infinitely more complex when you step one inch outside of the "living room" context and start talking about public and private companies operating platforms on a global scale, all under an infinitely complex set of regulations, subsidies, and other factors.

" I feel that you can say what you like, just not in my living room" seems to me then you want to translate that into "Reddit, twitter, etc can choose what people say in their living room", the problem is they do not have a living room, and attempting to make that analog is ridiculous


Why should a company be forced to carry anything anyone wants to say?

If you want to publish your own speech, buy your own server


What you are calling free speech is boring to the audience. Usenet died because the endless unfiltered spam and shitposting was boring compared to Slashdot and then Reddit and then here.


>Lots of people like to claim they support the idea of free speech, only then then list 100's of things they want to ban...

Sounds just like Elon Musk!


Obviously, this is why 4chan died back in 2012. If only they had moderation, they might still be around today.


They're a special niche, and I believe they've been forked multiple times over moderation disasters resulting in 8chan etc.

If 4chan "breaks cover" into the mainstream in the way that, say, Facebook has, it will make for some very entertaining Congressional hearings.


4chan has a million jannies, it's heavily moderated...


4chan has moderation.


The old chestnut "The Net interprets censorship as damage and routes around it" referred specifically to USENET if I recall correctly; it is often mis-applied to the Internet itself or to other services. But it referred to the notion that since USENET servers just routed their stash of stories to any node that requested them, censorship was functionally a non-starter; if you (the node owner) wanted content that some other node owner was squelching, you could request it from another node.

But it turns out that model had some major flaws. Nowadays, one could say the 'net interprets lack of censorship as offensive and migrates away from it. If there's one thing the experiments after USENET (from Slashdot to Mastodon) suggest, it's that people on average want some control (and don't on average want the burden of being their own moderator for every damn bitstream that comes down the pipe).


The net interprets hate as noise and filters it out.

At least, that's the goal.


Interested to see if we find these negatives paralleled in the fediverse.


Relevant: The ecosystem is moving (Challenges for distributed and decentralized technology from the perspective of Signal development)

https://media.ccc.de/v/36c3-11086-the_ecosystem_is_moving


Usenet was - by the standards of the early 00's, which is when it began to decline - slow, and asynchronous. Posts could take hours to days to propagate, which meant conversation threads would go off in multiple directions at once.

Technology that was transformative during the BBS-to-early-internet transition became a liability ten years on.

Slashdot, PHPBB and later on Reddit offered synchronous, low-latency, single-point-of-truth updates to thread/post order.

The promise of self-configured "agents" for content filtering never really caught on - they were too time-consuming for anyone bar a few geeks and academics to manage. The social media "feed" is an evolution of the same idea, but there's not really any self-configuration - instead the system is programmed by the platform's owners to adapt to your behavioural feedback.


I disagree: between 1994-5 and about 2005 I made heavy use of a "private" NNTP server (for the Edge Magazine forums).

Compared to early web forums it was super fast - you could sync all of the messages in about two minutes. Then you could disconnect your modem, read + reply etc. Then go online an hour later to send reply + update the feed.

Broadband changed the game: once the internet was fast and always-on then web forums pretty much took over.

We lost something though: the beauty of NNTP was that you could easily keep your own archive of discussions for the groups you were interested in. Search worked and it was offline. Oh, and no adverts or tracking!
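That kind of personal archive is still only a few lines of code; a rough sketch with Python's standard nntplib (deprecated and removed in newer Python versions), with the server and group names as placeholders:

    # Rough offline-archive sketch using the stdlib nntplib module (removed in
    # Python 3.13); server name and group are placeholders.
    import os
    import nntplib

    os.makedirs("archive", exist_ok=True)
    with nntplib.NNTP("news.example.org") as server:
        resp, count, first, last, name = server.group("comp.misc")
        # Overview data gives subject/author/date/message-id for a range of articles.
        resp, overviews = server.over((max(first, last - 99), last))
        for number, fields in overviews:
            print(number, fields.get("subject", ""))
            # Store the full article locally for an offline, searchable archive.
            resp, info = server.article(str(number))
            with open(f"archive/{number}.txt", "wb") as f:
                f.write(b"\r\n".join(info.lines))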


I'm talking about server to server message propagation, not client/server - agreed, the Usenet clients were much faster and had better feature-sets than any web forum running on Netscape or the early versions of IE. That part was great, but messages could take hours to even days to propagate across the whole network. The sync-reply-upload work flow was indeed useful - especially so in countries where you had to pay by the minute to be online.


Propagation time was never the problem. I ran a major European node in the 90s and our propagation times were in the seconds. We absolutely obsessed about it. Only dialup leaf nodes took longer, but that was and remains a feature, not a bug.

I've been using NNTP as a Gossip-type protocol for eventually-consistent pub/sub telemetry, monitoring, and notifications between partially-connected nodes in an unstable mesh ever since. Just not in public.

Let’s be careful to separate the message protocol from the system it supported. Usenet’s carcinogenic inability to self-regulate at scale arose from the absence of distributed authority and negative feedback loops in the control plane, something we’d aim to build in algebraically and cryptographically if designed today. Alas, VC backing directs most research in this subsector towards systems enabling consumer surveillance.


Can confirm. I ran a small node for a reasonably sized university in the early 90s. Replication times were in the tens of seconds. Could be in the minutes if a perfect storm of congestion and server overloading occurred. But my memory was more than 99% of the time latency was pretty low (seconds to tens of seconds.) Not bad for a single VMS cluster that serviced 20k+ clients.


That reminds me of the days of FidoNet [0] and Barren Realms Elite [1] where the inter-BBS propagation delay made your BBS's attacks on another BBS even more tense and exciting, and well, it felt realistic that sending your forces to another "planet" took time to happen and for results to be reported.

[0]: https://en.wikipedia.org/wiki/FidoNet

[1]: https://www.johndaileysoftware.com/products/bbsdoors/barrenr...


Woah, I remember playing BRE, which must have been close to 30 years ago. Impressed the author still has a site for it much less that you can purchase it!


I'm going to call "mostly wrong" on the server-to-server propagation.

By 2000 the NNTP mesh was very interconnected and fast. Sites competed[1] to get messages propagated fast. Especially text messages.

Any site hooked into the mesh was going to be offered any article multiple times within seconds of it being posted.

Slow articles would happen but only during outages or with sites that were not following good practice.

[1] Via http://top1000.anthologeek.net/


Absolutely - the federated case (I was rather active on the tolkien, c64 and various computing groups) was terrible. We used to use email lists for real discussion with plain text messages (shame that isn't a thing anymore).

It was an excellent experience in the non-federated case.


> Then go online an hour later to send reply + update the feed.

Usenet was truly special and an awesome design. Every iteration of something newer has been strictly worse. Mailing lists, forums, and now proprietary platforms controlled by a single corporation. At every step, features and functionality have been lost.

In the early 90s I used to travel coast to coast a lot, so I pulled all the newsgroups I was following to my laptop and spent the flights reading and responding to posts. Then I'd sync at the other coast before flying back.


What really killed it was alt.binaries.*. Vastly bloated bandwidth/storage requirements, so ISPs either killed NNTP altogether, or stopped giving a crap and you'd start, like, randomly dropping 10% of messages, which amounts to the same thing.


I don't think this was true. It was entirely common practice for an ISP to ignore alt.binaries but carry all the rest of Usenet fairly reliably. To the point where users would rebuke each other for posting binary content in a non alt.binaries hierarchy, lest the ISPs detect that as a stealth binary group and drop that too.

What killed Usenet was as the parent said - users are better served by the centralized management and reliability of a web forum, they don't need or care about decentralization and peering. The same thing happened for Discord and Slack replacing IRC.


Discord/Slack replaced IRC because IRC ran over a set of ports which could (and would) be blocked by almost every corporate firewall. I still remember the day that my friendly Systems Administrator left and the new guy nuked our IRC access :(


Probably the reason that took over is they are well funded and have marketing and/or salespeople. And they have emojis, file uploads, etc and they are just generally easier to use. Not that IRC is hard, but it is geeky enough.


> Not that IRC is hard

This totally understates the idiocy of IRC.

I was a full-blown operator/system administrator across both mainframes and Unix workstations for years. And I never truly felt like I grokked IRC.

Whenever I had to go into a channel, it was always a screwy dance of mishmash of how to get a directory, how to appease the channel bot, how to even bloody just log in, etc.

And God help you if you didn't have the preferred client for the channel. "Bye bye, N00b."

Sorry, I weep for the centralization of these kinds of services. I don't weep even in the slightest for how IRC actually got implemented.


Yeah, I really don't get how people feel like IRC is somehow the pinnacle of chat. Even if you think a lot of the features added by newer applications and protocols are unnecessary or even a negative, it's really hard to argue that IRC isn't a patchwork of incredible cruft built on top of something so barebones that it just did not survive contact with reality.


Plenty of techies love idealized versions of open protocols, even horrible ones like FTP.

Similar story for ubiquitous tools. Shell is a horrible tool with a few lucky features (pipes, focus on text) that's everywhere so it wins by default. And techies turn a blind eye to its million defects.


And of course authentication, when it existed, literally involved sending your password in plaintext to some user you _hoped_ was a system bot and not some impersonator.


> Whenever I had to go into a channel, it was always a screwy dance of mishmash of how to get a directory, how to appease the channel bot, how to even bloody just log in, etc.

> And God help you if you didn't have the preferred client for the channel. "Bye bye, N00b."

These things have not really gone away on Discord. On the other hand, it's hard to choose the wrong client when there's only one you're allowed to use.


Aren't discord/slack products of the last ~5 yrs? Were IRC channels actually active until then? The last I spent time on efnet was maybe early/mid 2000s.


Discord started in 2013. Campfire from 37Signals started in 2006 or so. Web chats with immediate message availability started to spring up immediately after `document.write()` became a thing in browsers, around 1994; by 1996 I personally participated in several, all with different software stacks.

This is not to mention instant messengers like ICQ / AOL / MSN, which largely replaced private IRC chats.


Last time I heard regular people use "Mirc" was 15 years ago.


IRC is too inconvenient unless you are enough of a nerd to operate an IRC bouncer. It's natural that it went away.


I'm pretty sure there is a heck of a lot more to the success of discord over IRC than that.


Yep, and the entirety of Usenet became associated in the minds of ISPs with piracy.


> Usenet was - by the standards of the early 00's, which is when it began to decline - slow, and asynchronous. Posts could take hours to days to propagate, which meant conversation threads would go off in multiple directions at once.

Maybe that was your particular experience but generally that wasn't the case unless we're talking about a server connected via UUCP to the rest of the world, possibly with a dialup link between the servers and transferring news articles every few hours or once a day.

NNTP employs a flooding algorithm whereby every server offers all messages received to every peer except the ones showing up in the Path Header of the message. Peers were first offered the unique Message-Id each message has and could then decline or accept the transfer. Programs like innfeed implemented this conversation as a bidirectional streaming protocol (RFC4644). We're talking seconds for a text posting to be propagated to every half decently connected news server on the planet.
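A tiny sketch of that flooding rule (the data structures and callbacks are invented; real feeders such as innfeed speak CHECK/TAKETHIS per RFC 4644, this only shows the Path-based suppression of re-offers):

    # Flood one article to peers, skipping any peer already listed in its Path
    # header, i.e. any site that has already handled it. Structures are illustrative.
    def relay(article, my_name, peers, offer, send_article):
        headers = article["headers"]
        headers["Path"] = my_name + "!" + headers["Path"]   # record that we handled it
        already_seen = set(headers["Path"].split("!"))
        for peer in peers:
            if peer in already_seen:
                continue
            # Real servers first offer only the Message-ID (CHECK or IHAVE); the
            # peer declines if it already has that article, otherwise we stream it.
            if offer(peer, headers["Message-ID"]):
                send_article(peer, article)

    article = {"headers": {"Path": "upstream!poster", "Message-ID": "<abc@example.org>"}}
    relay(article, "mynode", ["upstream", "peer-a", "peer-b"],
          offer=lambda peer, msgid: True,
          send_article=lambda peer, art: print("sent to", peer))  # sent to peer-a, peer-b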


In addition to being slow, I would also add another factor: lack of notification method, which exacerbated the problem of its slowness.

Another thing is that, in effect, Usenet is a tried-to-be globally governed forum service that runs on servers configured differently. Some servers don't allow large attachments, some don't allow HTML messages, some moderators like to delete while others don't...

The protocol doesn't really help either. NNTP, for example, is neither a highly efficient nor a resilient protocol. If your message is too large, or your link is unstable, you'll have some nightmares trying to send it out. NNTP is also not a well-designed protocol, which makes client-side quality-of-life optimization very hard; as a result, some NNTP clients (Thunderbird and Outlook) will just let the user handle every error encountered, which is very annoying for the user.

So for me, Usenet is just simply outdated. That's all there is to it.


> lack of notification method

The way usenet clients worked was that when new message headers were downloaded, the client would highlight unread messages. It's pretty similar to email, where you can tell which new messages you haven't read yet. Websites like Hacker News, Reddit, Facebook, etc. lack this feature.

Facebook, for instance, does have a notification system, but makes it very difficult to find the specific reply to your post or comment because it frequently filters out comments from being displayed unless "all comments" is selected from the drop down. Once that selection is made, you have to expand every thread to find the reply to your comment.

> Usenet is a tried-to-be globally governed forum service

Globally governed by whom? The only real governance was the creation of new groups in the Big 8 hierarchy. And, unless the group was moderated, anyone could post or reply to a comment.

> Some servers don't allow large attachments, some don't allow HTML messages

Text groups had conventions disallowing HTML or attachments, so very few people used them (unless their client wasn't configured properly).

> The protocol doesn't really help it either. NNTP, for example, is neither a highly efficient nor a resilient protocol. If your message gets too large, or your link is unstable, you'll have some nightmares trying to send it out.

Message size isn't an issue for plain text messages and all major usenet services have no issues handling articles up to 750 Kb in size (for binary groups).

> NNTP is also not a well-designed protocol

What specifically is not well designed?

> NNTP clients (Thunderbird and Outlook)

Outlook never supported NNTP

> will just let the user handle every error encountered

What errors are you specifically referring to?

> So for me, Usenet is just simply outdated. That's all there is to it.

Based on what you've posted, it doesn't seem like you have much if any first hand experience using a NNTP client or have read any of the related RFC documents documenting the NNTP application level protocol.


BTW: Since we're already here and I'm feeling nostalgic and all, I Googled my old posts from back then, and found a [post] that we wrote and posted on microsoft.public.cn.msn.messenger as the moderators of the group.

[post]: https://groups.google.com/g/microsoft.public.cn.msn.messenge...

I'm pretty sure Alex Zhang, who was a rank higher up than us, was probably annoyed by all the requests for guidance that we constantly needed. But... still, good times.

I also remember how it ended: the moderator tool that Microsoft provided to us stopped working one day; we asked about it and waited for a few months, and it didn't come back. So we assumed that Microsoft had changed its plans.

Also, now that I've got my timeline clear, my lunch money was not donated when I was 13 years old (and just to be clear, it was not to the Microsoft newsgroup, of course); I was much older than that. But I can't find any record of it since the server is private, so, sad I guess :(

Reading the old posts now, judging by the words we wrote back then, we... were really acting like corporate freaks. Glad I've grown more human since :)


> it doesn't seem like you have much if any first hand

Well, I didn't really set up any newsgroup server (other than the one in IIS, for fun), if that's what you mean by "first hand". But I did donate a week's worth of my lunch money to one when I was 13 years old, because I frequented it and they needed some funding for their broken RAID card + drives.

> Outlook never supported NNTP

If you know a bit more, you should know that Outlook Express does, and then Windows Live Mail, etc. The reason I use Thunderbird as my email client is probably because it has (buggy but) functional NNTP support. I also know Thunderbird has a bug which prevents it from using the correct charset defined by the user, because I guess I've spent too much time on these newsgroups.

Also, reading your reply, I realized that you are basing a lot, if not all, of your own experience on "plain text messages", and you're thinking "this is fine"? Then wake up. On today's Internet, if your forum service cannot host memes and other types of rich multimedia, it's already out of public view. So,

> "have no issues handling articles up to 750 Kb in size"

Dude, really? Just 750 Kb? :)

> the client would highlight unread messages

It requires some setup that most people don't know they can do. "Filter" in Outlook Express, if I remember correctly, gives you a different color when a matching mail is received. But guess what, it wasn't the default, at least during the time I was using it. If you don't do that, you'll only get a "notification" (actually, just the count refreshing) every time new mails are downloaded, even if none of those mails are meant for you.

Your overall reply gives me the feeling that you're deep inside your own good-old-time memories, which leads you to these apologetic words. It's understandable if you were a user; after all, it was indeed a good time for many of us. But one should not be blinded by nostalgia, because those who are run the risk of stunting their own creativity for something far better. Seeing flaws doesn't prevent you from loving something, you know?


>> Outlook never supported NNTP

> If you know a bit more, you should know that Outlook Express does

I know that Outlook Express does and it's a distinct product from Outlook.

> The reason I use Thunderbird as my email client is probably because it has (buggy but) functional NNTP support. I also know Thunderbird has a bug which prevents it from using the correct charset defined by the user, because I guess I've spent too much time on these newsgroups.

I haven't used Thunderbird to post to newsgroups recently, but I'm pretty sure that it sets the charset part of the MIME header to utf-8. When I used to post regularly, it was set to us-ascii, based on what I had configured in Thunderbird, Seamonkey, Mozilla Mail & News, and Netscape Communicator.

> Also, reading your reply, I realized that you are basing a lot, if not all, of your own experience on "plain text messages", and you're thinking "this is fine"?

It works for this forum.

> Then wake up. On today's Internet, if your forum service cannot host memes and other types of rich multimedia, it's already out of public view.

Practically every comment on this website is essentially in plain text. Even most comments on reddit stick to plain text and neither website is out of public view.

> Dude, really? Just 750 Kb? :)

None of the comments you or I have written here come anywhere close to 750 Kb in size. Most emails I type aren't anywhere near that either. In fact, if you look at mailing lists like the ones used for the Linux Kernel and various subsystems and the one used by git project, you'll see that their inline patch messages come nowhere near that size limit. So you can easily read through any of those mailing lists using a NNTP gateway.

> It requires some setup that most people don't know they can do

Like what exactly? The default configuration[1] shows message subject, author and date in bold text in the message list pane if you haven't read it and in regular text if you have.

> every time new mails are downloaded, even if none of those mails are meant for you.

Usenet messages formed a multi-threaded discussion per topic. Messages posted as direct replies to messages you posted are meant for you and can be easily spotted by scrolling through the message list pane, where direct replies to your posts appear in bold text. Even in threads that had thousands of messages, I could find replies to messages I posted in less than a minute. In contrast, I have to click through several screens of my comments on this website to see if any of them have replies and then remember whether I had already read them or not.

> Your overall reply gives me the feeling that you're deep inside your own good-old-time memories

I still occasionally use usenet today and frequently use NNTP gateway services to browse mailing lists. It's not a recollection of "good-old-time memories". Your remark about being 13 years of age and donating lunch money makes me think that your recollection isn't very precise, given your young age. I started using usenet in the mid '90s when I was an adult in my 20s. I used it daily, posting and reading groups, up till around 2015 or so, when the groups I posted in were basically abandoned by all the regulars I used to interact with.

[1] https://i.imgur.com/i226q2c.png


> Slashdot, PHPBB and later on Reddit offered synchronous, low-latency, single-point-of-truth updates to thread/post order.

...and reputation/karma, avatars, post counts, moderation, search, categories, emojis, anonymity (email address not exposed), notifications...


> anonymity (email address not exposed)

Quite a few people posted under pseudonyms with fake email addresses on usenet.

> notifications

When you downloaded new headers from the server, the client would highlight the new messages in a way that made it very easy to see which messages you haven't seen and whether you got any replies to your message. Try doing that in Hacker News, reddit, Facebook, Twitter, etc.


Don't Reddit and Twitter email you when you get a reply -- or at least have the option to get an email? With Usenet you had to go check.


> Don't Reddit and Twitter email you when you get a reply -- or at least have the option to get an email?

Yes, but you would have to check a completely different service to see whether you got any replies. Most people would probably refresh the comments page or click on the notification icon directly in the page.

> With Usenet you had to go check.

Clients could be configured to check for message every X minutes/hours, etc, but just clicking on "Get messages" would show unread messages. And you could easily find replies to your comment(s) since the thread pane would essentially provide an indexed view of the entire thread including every subthread that you could expand/collapse. The other advantage was that you could see new messages in context.

On the other hand, when clicking the notification icon in reddit, you see new messages in isolation and would have to click on the context link to see what they're referring to.


Web was just the new shiny.

People like the new shiny.

Forum posts also had embedded images, and didn't require installing/configuring/maintaining a separate application.. all you needed was your web browser.


  "I should briefly mention Google buying the Deja News archive, promising to revitalise Usenet, and then promptly abandoning it. Cheers Google. Choogle."
Oh wow, I totally forgot about that. I remember it being a really big deal at the time, and now I barely even remember.


It's even worse than that.

Here, Google pulled one from the Microsoft playbook: embrace, extend, extinguish.

Google bought DejaNews, rebranded it into Google Groups (embrace) and then proceeded to Googlify it, making it less and less Usenet-y and more and more Google-specific (extend), until only the Google-specific mailing lists remained (extinguish).

Edit: see https://web.archive.org/web/20010226023947/http://groups.goo...


Umm according to Wikipedia, Deja News was turning into a shopping site, going bankrupt and shutting down the Usenet archive.

So Google came in and saved it.

https://en.m.wikipedia.org/wiki/Google_Groups#Deja_News


Yeah. At the time I remember the sentiment being more "woah, this is amazing" and less "boo, corporate overlords taking over".


Google was pretty much considered the "good guys" back then. It only started with AdWords a year earlier and was barely breaking even at the time. Loong way from "corporate overlords".


Yeah, and then they let it sink as they have with so many other cool things they built.


It certainly smacks of being someone's promo project where they moved on to other things after the migration and the project fizzled out without leadership.


It's even worse than that... USENET was one of the places that Google used to seed its search engine. Links from USENET were high quality links.

It's like reddit as the source for LLMs today.

So Google neglecting USENET could be about preventing competitors.


Didn't Deja become the basis for Google Groups, which still remains today?


Oh wow, that still exists! What a sparse and abysmal end.

I wonder if they even know it's still there? Probably doesn't use enough bandwidth for anyone to notice. Probably just a few people taking a trip down memory lane to re-read things posted 10-20 years ago.


> Cheers Google. Choogle.

This appears to be inspired by the BBC show 'Look Around You'.

https://en.wikiquote.org/wiki/Look_Around_You


Seems like nobody has mentioned a huge reason Usenet failed:

It needed a client program.

Digg/slashdot didn't succeed because they were flashier or moderated or had less illegal content or were privately developed or any of these other reasons given; it was because people could just follow a link to a web page and they're in. Even today reddit or discord do everything they can to convert web users to app users for obvious reasons, but they must have that web version for casual users or they'll die off to whoever does have one.

So ultimately what killed Usenet was Section 230. Without it, services could do simple copying/caching like NNTP (protected under telecom rules), but the actual presentation had to be done on the user's computer. And consequently, were 230 repealed, it wouldn't be the end of the social media world, but rather the resurgence of Usenet-like models where all the presentation is done by a client program.


> it was because people could just follow a link to a web page and they're in

Discord has entered the chat. (wait a minute...)

Seriously though, Discord is precisely an example of this. I was talking to someone about how Skype had degraded, and I wanted to build a competitor; then he showed me Discord. I've been using Discord since (and in my free time thinking about what I would do differently from Guilded / Discord). Slack was cute, but Discord was much better. There's lots of room for improvement with Discord, but it's just good enough.


This is one of the big reasons, I think. I hear about Usenet all the time, but whenever I try to go looking for it in the past to figure it out, it just makes no sense to me and I cannot figure out how to get to it. To me it's just this nebulous thing that exists somewhere, but I have no idea where.


You can get a free account here (text groups only, not binaries):

https://www.eternal-september.org/

You can access Usenet via Thunderbird (or Pan, mentioned in the article).


Some ISPs still provide Usenet feeds, or there are commercial Usenet providers in many countries, similar to email.

Usenet can be accessed via the OSS Thunderbird client, or OSS Usenet clients.

https://www.fastusenet.org/thunderbird-windows-tutorial.html


Go to telehack.com and type 'notes'.

It's not comprehensive, but will give you a taste of the flavor of what Usenet was like.


There were, and still are, many web interfaces to USENET. So, IANAL, but I doubt section 230 had much to do with it.


> Seems like nobody has mentioned a huge reason Usenet failed:

> It needed a client program.

Many services required client programs to use. Email did and still does unless you use webmail. And there are services that allow you to read and post to usenet through one's browser.


Section 230 had nothing to do with it. Offering usenet retention is the same thing as being a common carrier. Section 230 would only come in to play if you actively moderated your usenet retention to prohibit perfectly legal speech.


> Section 230 would only come in to play if you actively moderated your usenet retention to prohibit perfectly legal speech.

https://www.techdirt.com/2020/06/23/hello-youve-been-referre...


What is your argument?


Sorry, I thought you would actually read it, forgot where I was for a second - Section 230 protection doesn't require a platform to act as a "common carrier" or to only moderate strictly illegal content, so it would not "come into play" under the circumstances you described.


It seems that "failure" is a rather strong word.

Usenet served its purpose at a point in time and then faded into obscurity, though it keeps chugging away in a niche, appealing to some.

It's like saying that IRC failed.

What forum community, social network or media platform has been permanent and outlasted everything else? It seems that on every social network, people migrate, people abandon platforms, and losing critical mass can be fatal to the platform, and then it just passes on to the next one, having served its purpose; did MySpace "fail" or did it simply cede the crown to Facebook?

I remember Usenet fondly, though perhaps through rose-colored glasses. It was already full of trolls, flamewars, disruption and discord. Yet I cut my teeth there as a young sysadmin. I listened at the feet of experts, literal prominent experts in the field, who had spare time to just freely share their knowledge, because the Internet was an R&E community! I learned a lot about reputation and trust in the digital age. I gained "Internet street smarts" there.

I believe that it shaped a lot of people's lives, for better or worse, and that's not a failure by any measure.


Yep. Protocol-backed stuff on the Internet "fails" the way civilizations "go into decline" in the board game Small World; they aren't removed from the board, but they no longer serve any purpose other than taking up space that the active civilizations haven't taken over yet.


Exactly, it's protocol-backed, and that's the beauty of free, open source software and standards. As long as anyone cares to maintain the clients and/or servers, or even if very old code still works, you have a viable service. It's not a monolith that's beholden to the whims and fortunes of a single corporation that can just shut it down. Consider the enormous outcry when a single company has to shut down an entire platform, like a popular MMO or something, and then compare that to what happens when an IRC network has a schism, like the Freenode controversy: yep, there was drama, but everyone took their toys and migrated to an identical network, and now we have two.

I come from MUDding communities myself, and there is no shortage of old and crusty wizards who cling to their MUD servers and have kept them open, going on 30 years now! Clients come and go, Internet standards change, and social network fads happen, and we're still playing a game that doesn't bother supporting TLS, MFA, Unicode, or graphics. MUDs never failed, they simply achieved posterity and faded from sight.


I've said this a bunch of times before, but I don't see this take in a lot of Usenet postmortems, and I think it's truer than most of what people say about Usenet.

Piracy killed Usenet.

I ran a Freenix-competitive NNTP installation for a very popular ISP in Chicago in the 1990s. Usenet was by far the most demanding infrastructure we ran: we ran striped RAID arrays, read replicas†, transit servers --- all of this on the only expensive Sun hardware we had in our fleet. At one point, we independently reinvented the INN history cache, and there were a couple sites near the top of the Freenix leaderboard in part due to IRC conversations about our dumb patch. All this is just to say I feel like I have some bona fides here.

Keeping up with the Usenet everybody laments today was not this demanding. You could easily have done it with a Pentium FreeBSD server and a fast disk. But customers wouldn't let you. The open secret about Usenet was that it was a binary distribution system first and a discussion forum second.

Unlike discussion groups, keeping up with full-feed binary groups was a fucking nightmare. Usenet is the stupidest mechanism for distributing binaries you can possibly imagine: just imagine BitTorrent, but entirely Base64'd, and without the forward error correction --- your file was in 100 pieces, and if you lost one of them, you were just fucked. If users couldn't reliably download pirated games or porn from your Usenet servers, they'd loudly complain and tank your reviews for running a bad Usenet setup. And woe betide any ISP that claimed to do NNTP on anything less than a binary full feed.

The rest of the story is obvious. ISP operators had better things to do with their time and money than cater to the small minority of angry cost-center users who used Usenet this way, and so they outsourced Usenet altogether. Usenet centralized because binary groups forced it to, and once it was centralized there was no reason not to just stick it on a website.

Web forums are better than Usenet was back in the day, so arguably, over the long term, not much was really lost. Reddit today is very much like Usenet was like in the mid-1990s, and they at least have a reliable archive. Apart from the archives of my old comp.security.unix posts, all I really miss about Usenet was the competitive systems engineering aspect of it. I'd love to go back in time and take another whack at Usenet knowing some of what I know now; I didn't realize how fun a problem it was at the time: a real system people actually cared about with a leaderboard.

These things sound very boring now but were most definitely not boring in 1995.


> your file was in 100 pieces, and if you lost one of them, you were just fucked.

That is, until Parchives were invented.

https://en.wikipedia.org/wiki/Parchive


And the yEncoding scheme also got rid of the 33% data size overhead inherent in base64 encoding. It's too bad that never made it over to email.
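
For anyone curious where the 33% comes from: base64 maps every 3 raw bytes onto 4 ASCII characters, while yEnc passes most bytes through untouched and escapes only a few critical ones (NUL, CR, LF, '='), so its overhead is typically a percent or two. A quick sanity check in Python (ignoring line breaks and headers):

  import base64

  raw = bytes(range(256)) * 4096              # ~1 MiB of sample binary data
  encoded = base64.b64encode(raw)
  print(len(encoded) / len(raw))              # ~1.33, i.e. one-third overhead
  # yEnc instead shifts each byte and escapes only the handful of characters
  # that would break the transport, so the same file grows by only ~1-2%.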


By the time anybody knew what this was, Usenet was already fucked.


Some providers (namely ISPs) simply dropped the binary groups and kept the text ones. So piracy alone is clearly not what killed it. In fact, Usenet really only exists today for piracy, so if anything, piracy actually saved Usenet. Just for a different use case than originally intended.


I mentioned this in my comment. As a technical matter, you could simply not take binaries. But as a commercial matter, you could not; if you didn't have binary groups, you weren't offering real Usenet (and if your binary feeds weren't ultra-reliable, you were offering shitty Usenet). The result was a consolidation of the platform down to a small number of providers willing to invest large amounts of resources to placate binaries users.

If lots of people aren't running NNTP servers, there's not much reason to use Usenet. By the time Reddit and Digg rolled in, it was no contest.


> But as a commercial matter, you could not; if you didn't have binary groups, you weren't offering real Usenet (and if your binary feeds weren't ultra-reliable, you were offering shitty Usenet).

None of the ISPs I had from the mid '90s through 2010 ever had reliable binary news feeds. Most would be missing too many articles, or articles would expire within hours and the download could never be reassembled.

Anyone who really wanted binaries started using paid services and just used their ISP's usenet feed for text discussion.


This is exactly my point. It's why Usenet consolidated and then died: because smaller firms lost the ability to provide the whole service, which was overwhelmingly abused as a file sharing network.


> Unlike discussion groups, keeping up with full-feed binary groups was a fucking nightmare.

The nice thing about NNTP was you didn't have to carry any group you didn't want to. We stopped carrying the binary hierarchy and some of the dedicated pr0n groups because of just this sort of administrative pain.


Some providers simply dropped the binary groups and kept the text ones. So piracy alone is clearly not what killed it. In fact, Usenet really only exists today for piracy, so if anything, piracy actually saved Usenet. Just for a different use case than originally intended.


That's effectively been a large part of my view / understanding, though from (mostly) outside the service-provider viewpoint.


Nowadays usenet really is nothing but a binary transfer system. Such a shame.


It didn't fail. It was a smashing, world-changing success. Success, though, doesn't always mean immortality.


Came to say the same. Usenet was amazing - in its time. If that's "failure" then lots of apps would be happy to fail.


"Just resting."


> Thirdly was the user interface. Usenet looked dull. In a world of animated GIFs and MySpace colour schemes, Usenet didn't even have avatar images! Sure, the spartan nature meant that you could focus on a conversation - but it didn't feel as modern and exciting as the web did

"Modern and exciting" beat "being able to focus on a conversation" :(

Being able to focus is right at the top of my list of priorities. Surely I'm not the only one?


There are subjects where the UI matters.

E.g., I don't see the HN UI ever being used for anything graphics-related, because there's no image embedding.

This is fine for many subjects, but greatly constrains others.

It also has social effects. HN makes usernames very vague compared to say, Reddit or Slashdot. Which means that after more than 2 years on this site I still don't know any. The site is extremely impersonal to me, compared to Reddit where I can recognize regulars in some of the subreddits. Whether this is a good thing or not varies, but it certainly affects the dynamics.


Usernames being vague is a feature. Comments should be judged by their content and not by the person who posted them.

Also, if you use old reddit, usernames on reddit are as impersonal as they are here because there are no discernible avatars.


Pseudonymity beats anonymity. People's track records and comment history should be a factor when judging what someone says, unless we expect every comment to be a lengthy thesis.

If I see a comment from someone I know to be a relative expert on a field based on my previous interactions with them, I want to judge it as more authoritative and interesting than a rando who registered two weeks ago and has mostly commented "lol".


> It also has social effects. HN makes usernames very vague compared to say, Reddit or Slashdot. Which means that after more than 2 years on this site I still don't know any. The site is extremely impersonal to me, compared to Reddit where I can recognize regulars in some of the subreddits.

I actually like this approach as it reduces the issue of some users dominating the conversation, thereby overshadowing others. I think it's essential to foster a level playing field for everyone when discussing topics of interest.


Claiming you could focus on the conversation with Usenet is putting on some rose tinted glasses.

By the late 90s Usenet was well into the Eternal September and inundated with spam. Unmoderated groups were a mess. There was also just the simple problem of formatting. IIRC Outlook Express defaulted to MIME/multipart posts which would cause all sorts of weirdness if someone with a text-only client quoted them and the OE user replied. Then of course there was the issue of top and bottom quoting. Many Usenet clients had truly awful UIs so a user could inadvertently quote a message's headers in a reply.

Compared to even a simple WWWBoard the experience was often lacking in terms of focus.


> By the late 90s Usenet was well into the Eternal September and inundated with spam. Unmoderated groups were a mess. There was also just the simple problem of formatting. IIRC Outlook Express defaulted to MIME/multipart posts which would cause all sorts of weirdness if someone with a text-only client quoted them and the OE user replied. Then of course there was the issue of top and bottom quoting. Many Usenet clients had truly awful UIs so a user could inadvertently quote a message's headers in a reply.

I didn't really start lurking and posting on usenet until the late '90s and while those problems occasionally cropped up, they really didn't interfere with the main discussion threads in the groups I was a regular in. I only stopped regularly using it by the early/mid 2010s because all the other regulars in the groups I was in ceased posting.


I mean, I used `tin` as my newsreader until somewhere between 2010-2015. It wasn't quite a hellscape of unusability if one was using a text only client.


I should have clarified the GUI newsreaders. I'd be willing to bet a very small amount of money that around 2000 the majority of newsreaders in use by real humans (not spam bots) were Outlook Express and Netscape.

There were some good GUI newsreaders but Outlook Express was a shit show. I remember OE was what my ISP had NNTP documentation for and nothing else.


> Outlook Express was a shit show

The main problem OE had was the quote wrapping problem that other clients didn't have. The other was that it defaulted to top posting, IIRC.


Yeah, I think Thunderbird was still pretty popular around then as well


> Thunderbird was still pretty popular around then as well

Thunderbird didn't come out until the early 2000s. The predecessors were Mozilla Mail & News and Netscape Communicator. Seamonkey[1] is an offshoot of the former from when Mozilla split the suite into Firefox and Thunderbird.

[1] https://www.seamonkey-project.org/


You're definitely not the only one. I also endorse the idea of function over form, because catering to the masses has historically stood adjacent to 'lowering the bar' (choose any metric for this hypothetical).

Every place that doesn't cater to the masses seems to always be a higher quality community.

Slightly off topic, but dang, if you're reading this, I recommend making HN invite-only for a month or two after the incoming reddit fiasco on July 1st. A ton of people on the major subreddits are already parroting things like "Just go to HN, it's basically the new reddit".

I don't think I'm making a hot take when I say reddit has a pretty awful community outside of the tech focused subs. But maybe I'm too concerned - I'm just worried that one of the last comfy parts of the internet will be invaded by plebs and emoji spammers.


There are plenty of good subreddits outside tech. Unfortunately, telling anyone about them risks ruining them due to how easy it is for a subreddit to blow up.


I would have gone with discoverability/UX. Finding and figuring out how to use a Usenet group was a complete pain. Once you’d managed the learning curve and found a group that interested you, it was great. But you can’t beat the discoverability of a Google indexed web forum.


As HN's success shows, there's a significant niche for text-only forums.


How can I focus on a threaded conversation if I can't visually see the difference between the participants? Quickly looking at an avatar is easier than reading an email address.

Having even basic HTML formatting helps lay out a post - which makes it easier to concentrate on it. But that was something eschewed by most Usenet clients and posters.

Similarly, inline images are great for illustrating a point. I don't have to lose focus by clicking on a link and going to another tab to see what's being discussed.


> How can I focus on a threaded conversation if I can't visually see the difference between the participants? Quickly looking at an avatar is easier than reading an email address.

You post this on a threaded discussion forum without avatars, one you've frequented for a decade


Yes - and I find it rather tedious. Why do you think that every other major SN has avatar images? It's because most users find it a better experience.

I also have my own CSS for HN which makes it much more pleasant (for me), and I've started using https://news.ycombinator.com/item?id=30668137


The nice thing about Usenet: it's an open standard, and clients could show avatars.

For one, simple generic blobs could be generated based on the sender's address, using a service like Gravatar or the X-Face header.
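
As a rough sketch of how little a client would need (Gravatar really does key avatars off an MD5 hash of the trimmed, lowercased address; everything else here is just illustrative):

  import hashlib

  def avatar_url(email, size=48):
      # Gravatar's convention: MD5 of the trimmed, lowercased email address.
      digest = hashlib.md5(email.strip().lower().encode("utf-8")).hexdigest()
      # d=identicon asks the service to generate a generic blob if no image exists.
      return f"https://www.gravatar.com/avatar/{digest}?s={size}&d=identicon"

  print(avatar_url("Someone@Example.org "))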


You could have your Usenet client do the same thing: generate some sort of image out of the email address.


On hacker news I mostly never look at the usernames. It is all just hive mind. Unless someone is claiming to have won a famous math contest or something. Or if I am really mad at a post, I might look for additional context in their posting history.


Usenet had avatars: XFaces

https://www.emacswiki.org/emacs/XfacesSupport

I remember them being used by some Usenet posters in the 1990s.


> How can I focus on a threaded conversation if I can't visually see the difference between the participants? Quickly looking at an avatar is easier than reading an email address.

The from column in the thread pane view in my client[1] makes it easy to see the difference between participants, and also gives an indexed view of the entire thread. You don't get that with Hacker News or Reddit.

[1] https://i.imgur.com/i226q2c.png


> Thirdly was the user interface. Usenet looked dull. In a world of animated GIFs and MySpace colour schemes, Usenet didn't even have avatar images! Sure, the spartan nature meant that you could focus on a conversation - but it didn't feel as modern and exciting as the web did. NNTP software was fragile.

Forté Agent was a masterpiece. Change my mind.


> Usenet didn't even have avatar images!

Now let's not forget my 40-line ASCII-art sigfile.


alt.fan.warlord would like a word


and BIFF.


>> Thirdly was the user interface. Usenet looked dull. In a world of animated GIFs and MySpace colour schemes, Usenet didn't even have avatar images!

https://en.wikipedia.org/wiki/X-Face

I think there was even a color version of x-face? Oh yeah, found it. A 48x48 base64 encoded PNG image.

https://www.emacswiki.org/emacs/GnusFace

I don't know what other Usenet clients supported that. The x-face header was already not widely supported.
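
If I remember right, that colour version (the Face header) was literally just the PNG base64-encoded into a header line; roughly something like this (illustrative only, hypothetical file name, and ignoring the size limits and header folding real clients enforce):

  import base64

  # Build a "Face:" header value from an existing 48x48 PNG.
  with open("avatar-48x48.png", "rb") as f:
      face_value = base64.b64encode(f.read()).decode("ascii")

  print("Face: " + face_value[:60] + "...")   # truncated here for display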


Around the time frame of the beta versions of Agent coming out in early 1995, the Windows world really surpassed Unix for internet usability. Obviously Netscape was Netscape, but Agent vs tin, mIRC vs ircII, or Eudora vs Pine: the Windows software getting churned out was peak usability as far as I am concerned.


There was also Outlook Express.

It could do NNTP feeds integrated with e-mail, an outgrowth of previous work on IE3-era Windows News and Mail (which was an architectural masterpiece - a Windows Explorer view of a folder full of messages, populated from the mail and news feeds)

I wish we had something like that in Gnome - processes that sync my IMAP mails to local Maildir-like folders (dynamically populated, with download-on-open, perhaps) and a view that shows e-mails. Same for iCal feeds - make it a way to view specific kinds of content.


> There was also Outlook Express.

It was the one client that never fixed its quoted-text wrapping bug, IIRC.


Oh well... It's not like they tried hard.


> Forté Agent was a masterpiece.

Yes.

Also, my social network feeds could benefit from a killfile feature ...


https://docs.joinmastodon.org/user/moderating/

There's a wide variety of ways to self-moderate content. And you can block whole servers from your account as well.

And if stuff is terrible, you can report to your instance's admin (if you're not running your own). And barring that, you can also report to their instance's admin.

Given enough harassment or badness, servers can block harassing servers.


Agent was the very first piece of software I paid for. That and my Extra Newsguy subscription opened an amazing world for me. Then yEnc came along!


It didn't fail, it's just that GUI chat UIs like AOL and later web forums were more visually attractive, and were able to incorporate graphics. Usenet relied on uuencode for any sort of binary attachment which was a pain in the ass. There was also a high degree of snobbishness that made it unwelcoming to new users, who naturally went to other fora where they weren't treated like children.

The other problem was spam. Usenet had excellent moderation/blocking capabilities for the time ('welcome to my killfile' was the last thing many trolls heard from a person), but relied on a general consensus that spam is cancer. Then Laurence Canter and Martha Siegel single-handedly broke usenet in one afternoon nearly 30 years ago, and I still feel annoyed whenever I think of them. People like this exemplify everything that is toxic about capitalism.

https://en.wikipedia.org/wiki/Laurence_Canter_and_Martha_Sie...


There was an AOL newsgroup reader, courtesy of Jeff Warrington. The AOL posters were mostly still treated like children, but not unfairly.


Largely agree with the author's viewpoint. Compared to the Web and Web forums it was very hard for new people to pick up.

Even for people who had mastered the software, the inability to get rid of bad people was a problem. One "kook" could single-handedly jam up groups (e.g. https://www.geocities.w/donfool/ ).

The interface, however, is still pretty much unmatched to this day. You could and did get threads of 1000 posts from 100 people that were perfectly threaded, readable, and referred back to points in previous posts.

The requirement for each ISP to run their own server was a problem. If you wanted binaries then you would need a relatively expensive server (or group of servers) plus quite a lot of bandwidth.


ISPs had binaries until around 2008, when NYS AG Andrew Cuomo did a favor for big media under the headline of combating 43 child pics.


There was another reason Usenet died -- people want other people to do the work. Oh, they'll pay for it to a degree, but Usenet wasn't plug-and-play. To make Usenet work, everyone had to cooperate, and most people just don't want the work. AOL wasn't even close, but it was someone else's problem to run and maintain. Honestly, my own view here, but if Reddit just had an NNTP bridge, people would flock to it and never know the difference. I remember running a node -- it was a constant battle to maintain space on the spools and deal with my own users. And, for what? It cost us money, and while everyone said they wanted it, no one wanted to help with the work or pay for it. Add to that the number of groups that just went "zombie" on us, and it died.


> If Reddit just had an NNTP bridge, people would flock to it and never know the difference.

I never really looked into it, but if someone were to make a NNTP bridge for reddit, Hacker News, and other similar websites, would it be against their respective TOSs?


I’d add barriers of entry. Not high, but way higher than anything that is publicly readable on the internet and therefore indexed on Google.

I can discover interesting subreddits by pure chance by following a google search on a question I have. Or following a link on a blog

There’s no chance of that happening on Usenet. For most users, search/discovery is a web thing (web search, links) and there’s no link from the web to usenet. Therefore, if you don't know about anything interesting on usenet, you can't easily discover it. Any interesting post/discussion cannot be simply linked in a tweet (or, for that matter, on IRC). Usenet is completely insular. A web forum is not.

Were I a Usenet enthusiast, that would be my 1st priority. Public archives of most important hierarchies (comp.* for example), indexed on Google, easily linkable.


When I first started using usenet in the early 90s, I spent days going through the entire list of newsgroups my provider had. IIRC it was somewhere around 6000 groups at the time. Found all sorts of interesting things that way. To your point, that's not exactly a scalable solution and not one that most people would do :)

I'd counter that though with the aspect of one-stop shopping, which e.g. Reddit also has. One thing that kept me on Usenet far longer than most was that with one app (my newsreader) I could access limitless topic-specific forums, instead of having to go to a bunch of different sites, each with their own logins & quirks, etc. To this day I still prefer applications like that; for instance, it is something I really appreciated about Twitter until the recent changes.


> I can discover interesting subreddits by pure chance by following a google search on a question I have. Or following a link on a blog

> There’s no chance of that happening on Usenet.

You could search the list of newsgroups for phrases that could lead you to an interesting group. Other times, someone in one group might mention another group if you mentioned your interest in a comment. At least, that's how I found a lot of different newsgroups when I was regularly reading and posting to usenet.


One of the reasons HTTP ate the world is simply the fact that everyone already has a user interface program installed to use an HTTP-vended service.

In this day and age, that need to install one more thing (unless you're on mobile) is itself a huge barrier to entry.


Binaries, imo. At some point Usenet became synonymous with illegal downloads for people, and it became too costly to support.


Warez, and porn. Neither of which are easy to defend when an ISP is looking to address the costs of a Usenet feed. Towards the end, we had close to 100 Mbps full time just for the feed, as I recall. Then we outsourced it to cut that cost. But then management decided none of our customers needed alt.binaries and we dropped it altogether. The effect on churn was negligible.


Totally agree with this. Why would any commercial ISP want to offer this as a service?

There was very little upside and massive potential liabilities because they were storing and supplying unmoderated content that in the very best case was pirated music and movies.


I'm not sure binaries were too big of an issue - most servers simply didn't carry groups that allowed binaries.

The spam issue was the real deal breaker imho - you either had a fully moderated group, or you had a fully unmoderated one with spam left and right.

Also, for me it was social media - web groups, however lacking the interface was, had more variety in their subjects.


I suspect it was a market failure.

As home internet moved from a competitive dialup market to a handful of DSL/Cable players, all ancillary services dropped in importance. People weren't willing to take 128kbps ISDN instead of 10Mbps cable because they had better Usenet coverage or retention policies, or more personal home page storage, or better email services.

These services, in turn, became obvious places to economize and scale back. Nobody uses them because we don't invest in them, so we shouldn't invest in them. All anyone wants is port 80 and the largest "up to" Mbps number we can promise.

So eventually ISP-infrastructure Usenet falls apart, and you're left with having to find an external service provider, which usually was competing for an entirely different market-- "we have the mostest and bestest alt.binaries, and we're going to charge prices close to what we pay for a streaming service in 2023, because that's an expensive customer to service" rather than "we have all the text-based communities you love, and realistically, each user clicking three banner ads a month would finance the bandwidth and disc space those use."

I loved Usenet as a teen, but the family stayed on a crappy provider for too long, and they killed it. I got an account on a server being offered as a public service a few years later, only to find it shuttered within weeks of doing so.

I just want to go back to 1997, writing terrible cringe RP on alt.fan.dragons.


In my experience the main factors that killed Usenet were 1) the World Wide Web and 2) spam.

1) Usenet was at one point a huge portion of internet traffic. But as the rest of the web ballooned in popularity, what was originally the big fish in a small pond became a small fish in a huge ocean. Eventually newer internet users skipped the Usenet experience entirely. And the zeitgeist went with them. People who spent time putting new information online stopped making newsgroups their first choice and started doing it on their own weblogs or on siloed websites.

2) As newsgroups hollowed out, the S/N ratio went way down. I think once they were about 30% spam, people just started giving up en masse.

There's also the contributing factor of the old ISP services dying or morphing into media providers. Usenet was dialup friendly, and the old 300/600/1200/2400/4800/9600/14400/19200/38400/57600 bps modem-serving ISPs would advertise their Usenet feed as a perk. It was 80-character, monochrome-screen friendly. Once dialup was superseded by broadband, computers turned into media and 3D gaming devices, and Napster/Limewire/eD2k made binary distribution so much more convenient for the masses, Usenet lost its raison d'être.


> Usenet was at one point a huge portion of internet traffic.

Usenet posts were just text that may have been several kilobytes in size at most. If you look at binary newsgroups, the current level of internet traffic is quite large[1] (though that was 12 years ago and is by now multiple petabytes per day).

[1] https://www.ghacks.net/2011/01/26/usenet-traffic-growth-to-a...


The size of the daily Usenet feed is up to over 200TB a day now, even more than the 9TB number in your link. Although I imagine the total daily amount transferred by users is going to dwarf the size of the feed, one might still note for reference that total daily internet traffic is about 3EB which is around 15,000x that of a single full news service.

In other words, good point! I hadn't realized that in nominal terms, Usenet was still growing like topsy (albeit likely shrinking as a total proportion of global traffic).

[0] https://www.newsdemon.com/usenet-newsgroup-feed-size

[1] https://techjury.net/blog/how-much-data-is-created-every-day...


ISPs decommissioned their news servers. That's what killed netnews.


>ISPs decommissioned their news servers

Yes, this is the number 1 reason.

But USENET is still being used and there are many people still using it. And there are places where you can signup for free access, this is one but there are many others:

http://www.eternal-september.org/

One thing to remember: you never see a post from your teenage self on USENET being dragged up by your employer, causing you to be fired or lose out on a job. Why? Privacy still exists on USENET.

Also I guess he never heard of killfiles :) These days I have a rather large one compared to what I had many years ago. See a post you do not want to see anymore? Add it to your killfile, and maybe the person who sent it too.
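
For anyone who never saw one: a killfile was essentially just a list of patterns matched against headers like From and Subject, and anything that matched was silently dropped before you ever saw it. A toy version (the entries and articles here are made up, purely to show the idea):

  import re

  # One pattern per "killfile" entry, matched against From and Subject headers.
  KILLFILE = [
      r"spammer@example\.com",
      r"(?i)replica watches",
  ]

  def is_killed(article):
      haystack = article.get("From", "") + " " + article.get("Subject", "")
      return any(re.search(pattern, haystack) for pattern in KILLFILE)

  articles = [
      {"From": "friend@example.org", "Subject": "Re: why did usenet fail?"},
      {"From": "spammer@example.com", "Subject": "Cheap replica watches!!!"},
  ]
  for a in articles:
      if not is_killed(a):
          print(a["Subject"])     # only the first article is shown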


> One thing to remember: you never see a post from your teenage self on USENET being dragged up by your employer, causing you to be fired or lose out on a job. Why? Privacy still exists on USENET.

Google at one point did a very good job at indexing usenet posts and including them in their search results. If I searched for the name I used when posting to usenet, I would get hundreds of results of posts I made. But if I try searching now, very few results from usenet come up (perhaps because of the age of the posts).

> Also I guess he never heard of killfiles :)

I'm actually surprised he didn't mention that in his article.


This is (part of) the correct answer, yet it is way down the list of upvoted answers.

The story is:

1) There were a number of independent ISPs, but oligarchic forces, in addition to the Verizon/AT&T last-mile monopoly, consolidated things so that Verizon/AT&T became the ISP for most people, with limited competition.

2) Once consolidated, Verizon and AT&T killed their news servers. This mostly happened in 2008. The old Internet which was more open and peer-to-peer became a platform with end users limited mostly to web clients, consuming from big corporate upstream web servers.

3) This was helped along by various places in the US government, which blessed or even applied pressure to kill off the old peer-to-peer nature of the Internet, particularly Usenet, for the current model of big corporate web servers to end user customers.

Of course, the public relations announcements said all of this was done for the litany of reasons companies and the government do anything: to protect customers/children/whatever, fight against the forces of evil, etc.


That seems like it’s reversing cause and effect, though. If lots of ISP customers were using their nntp servers, ISPs would have been quite likely to maintain them in order to be competitive.


They didn't. What seems logical doesn't have to be true. Maintaining servers was hard, so they decided not to compete on that.

They were trying to minimize any kind of customer support by humans, the number one cost center for any digital company.

Lucky for them that the likes of Google and Reddit externalized mail and forum services.


Because it was expensive to run such servers. Imagine dedicating 50-100 Mbps of constant bandwidth just to Usenet, and then explaining to management that 99.9% of Usenet was alt.binaries. We outsourced for a while, but even that cost was difficult to justify and none of our customers would admit to wanting it.


Which is a lesson for those saying "well just federate it!"


I had a lot of fun on Usenet (you can find many of my posts on comp.sys.amiga.advocacy); I learned a lot and wrote a lot in English for the first time.

There was another post some months ago here that argued success is not tied to survival.

To me Usenet was a huge success and didn't fail in any way, just because people are no longer using it doesn't mean it failed.


In my opinion, the main reason[1][2] for its decline in terms of text discussions was due to the deal that Andrew Cuomo (the attorney general of New York at the time) made with the major ISPs to restrict access to child pornography. This resulted in many ISPs discontinuing their usenet offering and cutting off their customers from it.

Unless those customers made an effort to access usenet some other way, they stopped using it.

[1] https://www.cnet.com/tech/tech-industry/n-y-attorney-general...

[2] https://www.cnet.com/tech/tech-industry/cuomo-strong-arms-co...


Usenet didn't fail at all, it just went out of fashion after Google killed the usenet archive. Honestly, I think if we took everything we've learned about HTTP over the past thirty years and applied it to an updated NNTP protocol, it would be quite formidable.

The biggest issue that was never resolved: on one hand you have the people who don't want anyone controlling what they read and say, and on the other hand you have the jerks who want to abuse that. If you've ever logged into a darknet forum and someone has uploaded hundreds of autopsy photos anywhere they could upload photos except the "autopsy photos" area, then that is exactly what used to happen on Usenet. It's probably the same guy doing it. Though personally I think having to do the occasional bulk delete is worth preserving freedom of expression.


I used it quite a lot around '97/'98; it was a great way to meet people and discuss serious and not-so-serious topics with people around the world. It was technically interesting and a very powerful tool.

Then it got flooded with "replica watches" and "girls nextdoor porn", huge attachments and shady stuff.


I really hate to see the word "fail" used for something that was so incredibly successful, just because technology advanced enough to provide something better.


I agree. By that metric, some other notable failures from history include:

- Steam locomotives

- Gutenberg's printing press

- The IBM PC

- The original iPhone

- The Ford Model T

- The telegraph, and landline telephones

- The cotton gin

- Black powder, cannons, etc

- Wind-powered boats


> Dropbox is FTP with a better UX

Nope. Dropbox is NFS with an asynchronous client with lots of local persistent cache.

> WhatsApp groups are just ListServes

That's a huge oversimplification.

All in all, I wish I had built something that was the backbone of internet discussions for decades. Truth is, we all moved on. At one time, newsgroups were the only space where we could have threaded discussions. Now we have lots of other, more targeted, options.

All the issues the author mentioned could be fixed. We can use AI to rate comments (at the client), and build a decentralized reputation system (maybe a good use for blockchain, finally).


Why did usenet fail:

1) It couldn't serve ads in a controlled manner. The lack of monetization meant that most people had to pay for usenet access, allowing 'free' HTTP-driven apps to take over.

2) Most ISPs didn't provide much information about whether they had an NNTP server users could use, and NNTP readers didn't come with most users' computers. HTTP did.

3) It wasn't exciting to new users. Web went from mostly text to multimedia as internet connection speeds increased.


> Most ISPs didn't provide much information about whether they had an NNTP server users could use, and NNTP readers didn't come with most users' computers. HTTP did.

Until Outlook Express was replaced by Windows Mail, most users' computers did come with an NNTP client.


I don't really agree, but I also don't have a better explanation. Moderation and clients would have been solved had Usenet continued to be huge. Email has had its fair share of problems, including spam, but has always managed to survive because nothing has been able to replace it and it's still valuable. You could argue that development of attractive clients has died down in the past decade for email as well. Panic kept their Unison client alive for way longer than they needed to, and that looked pretty.

Personally I would love to see Usenet replace localized Facebook groups. It seems like every small town has a Facebook group, and if you have no presence on Facebook, you get no information. Usenet would be a perfect fit: no central authority to monetize and control the group, and you'd be able to read about activities in your neighborhood while having no account, or an account with an organisation you trust.

The ISPs might not want to pick up the tab for running the servers, but they kept doing it for years after Usenet became a ghost town.


> Personally I would love to see Usenet replace localized Facebook groups.

It actually happened the other way around. Local ISPs had newsgroups that had limited distribution where people local to you would post about upcoming events, things they were looking to rent or buy, things they wanted to sell, etc.

> The ISPs might not want to pick up the tab for running the servers

There's nothing really stopping a local ISP from running a server that's not connected to an upstream feed and requiring credentials to post. That would be localized and a lot more relevant to those in the community if they really wanted a local discussion forum.


Possibly slightly off-topic: Are there any free (as in speech) web-based NNTP front-ends that are relatively up to date and still maintained? On my home server I can pull up my email, texts, and RSS feeds in a web browser. It would be nice to do the same with newsgroups, instead of having to run Pan (even though I get the same feeling of nostalgia from its UI as the author does).


Yes--the DLang forums are what I'd consider the best.

> About (https://forum.dlang.org/help#about)

> This website is powered by DFeed, an NNTP / mailing list web frontend / forum software, news aggregator and IRC bot. DFeed was written mostly by Vladimir Panteleev. The source code is available under the GNU Affero General Public License on GitHub: https://github.com/CyberShadow/DFeed

> This DFeed instance (forum.dlang.org) is a frontend to the DigitalMars NNTP server and mailing lists. Portions of the web interface (including style and graphics) are Copyright © by Digital Mars.


Anything wrong with Mozilla Thunderbird?


Only that it's not installed on every computer I use :)


Usenet didn't fail. It is still around, but nowhere near as vibrant as when it was 'the only game in town'. Yes it is slow, yes it is spartan. But like AM radio, it should still be kept as a no-frills alternative when things have to depend on low-bandwidth.

Like FTP, it still has a use.

But people are still attracted to "oooh, bright shiny thing!" rather than the spartan.


> Yes it is slow

People using it for binary downloads can pull down data at gigabit speeds.

> it should still be kept as a no-frills alternative when things have to depend on low-bandwidth.

It's unfortunate that most websites and other services require significant amounts of bandwidth. If I'm using a GPRS connection on my phone, websites other than Hacker News are not really usable. IRC and Usenet still work though.


> Yes it is slow

> People using it for binary downloads can pull down data at gigabit speeds.

Sorry. I wasn't clear. I meant that being a 'store and forward' medium rather than a broadcast medium, it can take days to propagate to all clients, rather than talking specifically about the speed of transmission.


I presumed it was spam. It was a system designed for maybe 1 million non-commercial users. The presence of large numbers of users and commercial users was unmanageable. I used it quite extensively in 1992-96, but after that it was mostly inferior, and the surviving groups I followed became closed mailing lists.


Nobody made money on Usenet. It was a "cost" for all ISPs and eventually Google.

So maybe that's why it failed.


I enjoy reading old usenet posts of mine. The confident yet naive bravado almost makes me feel young again.


Usenet did not fail. It was killed. It's different.


"so CompuServe was my gateway to the Internet."

I don't even remember CompuServe having Internet access. AOL, which came well after CompuServe, didn't even have it until they tacked it on as an afterthought (with a button on their toolbar).


Two points completely unrelated between them:

1. A large part of the world completely ignored Usenet. ISPs began offering Net connectivity in my country in the late 1990s, but back then no one (IIRC) had a Usenet server or archive; and if they had one, it was largely unadvertised. Connecting to an NNTP server not offered by your ISP might be doable, but it could be a paid-for thing, not very easy and not very reliable. So, too many people around the world never got their Eternal September.

2. There are good NNTP clients (and free software ones). Claws Mail has good NNTP support and is actively developed, and this is just one example.


Did Usenet fail or did it succeed until something better came up?

Nobody would say that telegraph failed, do we agree on that? And yet we started using landline phones, fax, email, SMS and all sort of messaging systems on mobile phones.


alt.binaries killed usenet.

Apart from pron clogging up bandwidth, alt.binaries became a good vector to distribute viruses, child pron, and a bunch of other things that ISPs, and their dial-up customers, were becoming increasingly concerned about. Its openness in an increasingly more dangerous, scammy, and criminal world was always going to be its own undoing. There must have been a stage where Pamela Anderson took up most of the world's storage and bandwidth just from synchronising usenet feeds.


> alt.binaries killed usenet.

In 2023 Usenet is solely commercially viable for alt.binaries.


I'm surprised people aren't mentioning this more. Usenet may be dead for discussion, but it certainly isn't a dead technology.

edit: I guess they are. Took a second to search around the comments.


From what I've read, they're transferring multiple petabytes of data a day between different backbone NNTP server farms.


This is true. Pretty much all alt.binaries though and there is an interesting ecosystem of tools, forums and providers that you need to be aware of in order to access content.

