After fifteen years of downtime, the MetaFilter gopher server is back (metafilter.com)
401 points by ohjeez on Feb 26, 2016 | 162 comments



I'd love to go back to the days where I didn't have to endure the 20+ external scripts (mostly ads, analytics, social and "optimization") that some crappy dev has crammed into their markup, just to read an article or browse a store.

Maybe we need a content oriented alternative to the web browser.

Maybe a modern Gopher.

Display preferences are handled by the client. Sure, support video and audio. Support tables. But keep it minimal. Put all the scripting on the server side, as it should be. Lock down the specification - do everything to stop it becoming web browser 2.0.

Then we wouldn't need junk like AMP.

Maybe it would take off in some niche to start with.


External scripts are not a requirement, just an option, and it's quite possible to optimise pages without something like amp or FB instant articles.

The web is really well suited to content heavy sites, it's just being used to do all the things humans like to do (spam,ads,control over access), as gopher would be if it were popular. I think your problem is more with the behaviour than the medium and if you restrict the medium people would just look elsewhere.


The whole point is that people who wanted to do spam, ads and paywalls would look elsewhere.


The problem is with humans, not the medium; even plaintext can easily carry spam, pointless content or links, etc., and it's very hard to separate from genuinely valuable content. In fact, the same material may be considered both by different people.


Core problem is that creators of value want to be paid for it. The delivery medium doesn't change that.


Much value is created by people who don't want to be bothered with payment because they are motivated by the social capital accruing from contribution to a community. Trying to pay for such gifts actually drives these contributors away, by devaluing their efforts - as though they were only doing it for the money. There is labor you literally cannot buy.


That class of people is not as large as you might think it is.


There's a strong selection bias, of course, given that I have structured my social life around communities of people who value community. Still, my impression is that it's less about wanting to be paid for creative work and more about needing to be paid for it, else they can't afford the time to do it. If we had a basic income system, I suspect you'd see a lot more work done for community benefit, without the overhead and distraction of payment.


> Core problem is that

[some]

> creators of value want to be paid for it.

The WWW has been a proof of concept of how much people will do _without_ direct monetary compensation. If you're thinking who's going to build Kayak on a glorified Gopher though, maybe you should start by asking where audiences will go when they find something that provides a better experience than Chrome.


I suspect the problem domain is more along the lines of work requiring capital expense - like remote war reporting which is tough to do from your bedroom in your pajamas. It takes real money.


No, I don't. I'm sure many other creators feel the same.


You and all the other freebie guys are welcome to create content without all the extra scripts, then. Pretty simple, no? It would appear though, seeing as this is considered a pretty big problem, that the vast majority of people do want to be compensated.


I can see at least 5 viewpoints someone might have about ads on their site:

* They don't care about the money and don't want ads on the site.

* They're someone whose content is the advertising. A database consultant might write an article on using indices to speed up low-selectivity queries. He doesn't need to plaster his site with ads, but his articles do serve a secondary purpose of advertising himself. Or something like Angie's List, where people literally visit the site with the goal of being advertised to.

* They don't mind unintrusive ads, and these help provide money and motivation to work more on the site. Google ads would be acceptable here.

* In addition to their site's primary content, they also write sponsored content for companies that give them enough money.

* They only care about extracting as much value from their site as possible. They might hire ghostwriters on Fiverr to write cheap articles, apply SEO techniques to get them to rank higher than they should, and then plaster noisy ads and popups everywhere. They might buy cheap traffic on low-competition keywords and redirect it to sites with more expensive ads to arbitrage the traffic. They might have a form to collect your name, email address, and phone number, and then sell your information to a mortgage reselling company. They might have loud ads that play music on page load, that automatically play video, that pretend to be a Windows error dialog, that look like download buttons, etc. Some of these are arguably the ad networks' fault, but the maintainer of the website is ultimately responsible for anything that appears on his website. They might release news articles with clickbaity headlines just to drive traffic to their ads.

I think most people take offense to #5. A minority also take offense to #3, since these track location and can form profiles of you between pages. I don't think anybody really minds the #2 or #1 people.

If you banned ads entirely, you'd still be able to monetize using method #2.


Yes, the second option is the best; however, it's also the one that scales the least, since the content creator has to spend their time finding companies interested in sponsorship and relevant to their line of work. Time spent not creating quality content.

This is where I think tracking went wrong. Advertisers were so happy they could track users that they (kind of) forgot to track the content. I think basing ads on the page content is, in the end, safer and more beneficial for everybody.


Value is a property of a market. If there is no market (like in free software because it's "free") or if people don't want to pay for it, there is no value.

In other words, creating things is not enough.


There are three independent concepts: cost, use value (I usually abbreviate this as "value"), and price, or exchange value (which you are terming value).

There's rather more discussion of this in economic literature than might be expected, both early (pre-Smith and 19th century) and contemporary.

Trying to make all three of these agree is a bit of a problem. Economic orthodoxy variously tries to do this / pretends they do (and mind, "agree" != "equal").

Content has cost. There's time and effort necessary to create it.

Content has value. It can improve, even drastically change, the lives of those it reaches. That value may be negative as well -- disinformation or distraction, for example.

What content doesn't have much of is exchange value -- it's difficult to draw walls, fences, boxes, or whatever else it is that prevents free access -- around content such that people should pay to get it. And doing that ... has its own sets of problems (generally: under-provisioning of content, particularly to the economically disadvantaged).

Advertising essentially hijacks one content stream (that the audience seeks) with another (that the producer wants distributed). Advertising has value to the author, not audience, and so the author is willing to pay for it to be disseminated. This ... produces numerous perverse incentives. Crappy Web pages among them.

But claiming that no market value (exchange value / price) means no use value, or no cost basis, is simply false. Common, but false.


It was in the context of the comment.

"Therefore, in the course of the economic transactions (buying and selling) that constitute market exchange, people ascribe subjective values to the commodities (goods and services), which the buyers and the sellers then perceive as objective values, the market-exchange prices that people will pay for the commodities."

https://en.wikipedia.org/wiki/Commodity_fetishism#Objectifie...

It's not a personal point of view but it describes well the reality for most people.


You can just disable javascript. That seems to fix 99% of the problems. I use the scriptblock addon for chrome.

Give me HTML. Let my browser render it. If I need some fancy-sauce like websockets, then I will selectively enable them.


> You can just disable javascript

And break an increasing number of websites. What I'm suggesting is a platform where this isn't even a thing. I open my client and I'm guaranteed that all services on the network won't blast me with ads, won't lazy-load content, won't have page weights of 1MB+ (for a 50KB article), and won't let third-party advertisers track me via their JavaScript. And they all just work.

Every time I open the uBlock panel, I see even more new domains pop up that aren't on block lists yet. One big UK retailer had over 50 external scripts. Worst is when a site's main layout JS relies on a function in some third-party ad JS (trackThisGuy()) and completely fails when that code isn't present - block the ad provider, the site breaks.


I hate that sites fail to render without JS and dozens of calls to third party sites. At least degrade gracefully back to a readable page!


I've often thought that RSS would fit this role well, if only there was a way to display and reply to comments on the content as well. RSS readers are great for consuming content without all the BS, but I just miss being able to interact with the reactions other people have about the story so I find myself clicking through to the 'real' article about half the time anyway.


You basically described HTTP and web browsers, but with restrictions on the type of content that can be created based on nothing more than your preferences. The market in general has spoken pretty clearly that they want more capability, so I just can't grant that you have a real point. The popularity of your concept on HN is no real surprise given the conservative bent of some techies, but it's the sort of opinion that gets no traction out in the world because it relies on retarding progress.


Weren't all of your needs met by RSS? Why did everyone stop using it after Aaron died?


I use RSS daily.

That said, it doesn't prevent ads or tracking scripts. It's just HTML stuffed in XML.


I've been flirting with NoScript on my desktop computer. I'm actually pleasantly surprised by how much actually works. Give it a shot if you use Mozilla Firefox.


Absolutely; nearly since its inception it's been the first addon I install.


The first thing I do after firing up a new copy of firefox is install NoScript, uBlock Origin, and Ghostery. I set all three to block everything and then set a) NoScript to temporarily allow top-level sites by default b) Ghostery to allow disqus (because lots of the sites I read use disqus). I rarely allow anything to bypass uBlock, and give sites three chances unblocking script sources in NoScript (mostly ajax.googleapis.com). If unblocking three sources in NoScript still doesn't get me the content I want, I generally just close that tab and don't go back.


Don't quote me on this but I think Ghostery is supposed to whitelist the benign stuff on Google domains such as googleapis.com automatically. I don't know enough about technology in general or NoScript in particular to tell whether this was a recent change, and if it was, whether NoScript would want to mess with the configuration of existing installations (my gut reaction is probably not, as this is a little different from just getting an updated Fanboy's list for ad blockers).


What? Everyone stopped using it when Google Reader died.


Reddit/HN are the same interface but with relevant content and community interaction.


I'd really love (and have toyed around with) a modern reimagining of Gopher, targeted toward mobile devices and maybe e-readers, and with a focus on accessibility. Just the content, please, and as quick as you can get it on my screen. I'll pick the styling options that make it easiest for me to read.


That was pretty much the idea behind Rebol IOS or X-Internet [0, 1]. I love the idea of a half-megabyte, zero-install, cross-platform runtime that can run on Linux, Android, HaikuOS, Windows and Mac.

Here's a Rebol one-liner that opens a GUI, reads a web page and sends it as email:

  view layout [u: field "user@rebol.com" h: field "http://" btn "Send" [send to-email u/text read to-url h/text alert "Sent"]]

Too bad it never caught on. Anyone who's ever dealt with JS will identify with Rebol's motto:

Most software systems have become needlessly complex. We rebel against that complexity, fighting it with the most powerful tool available, language itself.

[0] http://www.drdobbs.com/architecture-and-design/the-rebol-ios...

[1] http://www.rebol.com/ios-intro.html


Rebol is not dead yet. I expect a new bounce with the Red language.


I love Firefox "reader mode" and wish there were some way to turn it on by default, because it always provides a better experience than whatever the page designer intended.



Alternatively, a keyboard shortcut would be great.


If you're willing to install an addon, there is: https://addons.mozilla.org/en-US/firefox/addon/autoreadervie... (I'm in no way affiliated with the author of the addon, nor do I use it — with Vimperator, I'm just a "gr" away from reader mode anyway.)


Thanks for the suggestion! I'll give it a try.


Unfortunately most sites I visit don't offer reader view as an option.


Is it something the site has to offer? I thought it was just something the browser was doing on its own, for any page meeting some (undocumented?) criteria.


You're correct, the browser extracts and reformats the content. Firefox only shows the Reader View icon for pages where it thinks it can do a good job of that.


I find myself wondering if it relies on the 'for printing' reformatting of various sites. This because I was browsing one site I could have sworn had the icon earlier using desktop Firefox, but it didn't show up. That is, until I enabled the JS of a third-party printing framework in NoScript. After that it showed up.


It's using this code:

https://github.com/mozilla/readability/blob/master/Readabili...

which is a descendant of the original Readability.js published by Arc90, before they decided to turn it into a proprietary Instapaper clone.

It extracts what looks like article text from the page markup. DOM elements included for a printer-friendly view could possibly be helping, but it doesn't target that directly.
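
If you're curious, the general idea is simple enough to sketch. Here's a toy Python heuristic, not Mozilla's actual scoring, just an illustration of "find the container with the most article-looking text"; the container tag list and the scoring are my own assumptions:

    # Toy readability heuristic -- NOT the real Readability.js algorithm.
    # Scores each container by how much <p> text it directly holds,
    # then picks the highest-scoring one as the "article".
    from html.parser import HTMLParser
    from collections import defaultdict

    CONTAINERS = ("div", "article", "section", "td")

    class ArticleFinder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.stack = []                 # open container tags + positions
            self.scores = defaultdict(int)  # container -> paragraph text length
            self.in_p = False

        def handle_starttag(self, tag, attrs):
            if tag in CONTAINERS:
                self.stack.append((tag,) + self.getpos())
            elif tag == "p":
                self.in_p = True

        def handle_endtag(self, tag):
            if tag in CONTAINERS and self.stack:
                self.stack.pop()
            elif tag == "p":
                self.in_p = False

        def handle_data(self, data):
            if self.in_p and self.stack:
                # credit paragraph text to the innermost open container
                self.scores[self.stack[-1]] += len(data.strip())

    def best_container(html):
        finder = ArticleFinder()
        finder.feed(html)
        return max(finder.scores, key=finder.scores.get, default=None)

The real thing layers link-density penalties, class-name hints, and sibling merging on top of that skeleton, but that's the gist.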


I'm not entirely sure what is going on with reader view on Firefox mobile, but sometimes it doesn't show the icon in the URL bar. However, if you open the hamburger menu and add the page to your reading list (the "book+" icon), then switch to the reading list and open the page from there, you can sometimes get reader view when it's not otherwise available.


Scrolling slightly down the page then refreshing also does the trick without needing to clear it from your reading list. Just a refresh sometimes works but it's inconsistent.

I suspect it's looking at the page before it's fully rendered and determining that it doesn't meet some criterion for reader mode.


That sentiment brings me back to the old days of unix geeks writing structural html with no concern for appearance.

The separation of structure and formatting is not practical for serious media of any kind. You can't separate form and content. Nor would most people want to, because (visual and other) complexity is an inevitable and rewarding part of human experience...


Instapaper? Reader View? RSS readers? Ebooks? Separation of form from content is both practical and commonplace.

Your blog post may be beautifully styled, but that makes no difference if I can't even load it, or the fonts are so thin I have to squint to try to read, or the glaring white background makes me see floaters. A blind person doesn't care about the whizz-bang of $framework if your JS-only site breaks their screen reader.


Likewise I read all Hacker News links and comments over an iPhone HN app. Absolutely don't miss the form at all except in the cases where the source HTML is not semantic and so the reader mucks it up. But that's an issue with structure relying on form.


Most screen readers look at the output of the DOM. They aren't trying to parse raw HTML from the server. Really wish this myth would go away :/


The problem for authors is accommodating all the different modes which readers may consume their media. Well-styled hypertext presented in the way the author intends/imagines is never going away, there will always be a place for that and it is very important.

For other use-cases... that's why we have HTML and stylesheets. The problem has been solved, mostly sort-of.


> The separation of structure and formatting is not practical for serious media of any kind.

As already mentioned, this is demonstrably false. I'm choosing to give you the benefit of the doubt and assume you don't understand how a great amount of serious media is served nowadays, since the alternative means you are attempting to score internet points with a second-level contrarian post. But I have to wonder where you've been.


This has nothing to do with technology. A movie can't be accurately summarized textually. A book of poetry is profoundly coloured by the typeface chosen. A corporate logo is a visual artifact that cannot be communicated any other way.


You're applying an absurdly narrow definition to the term "serious media" to support your false and equally absurd narrative. Congrats, you got your points.


The AP's articles are printed in dozens of newspapers and websites. The content is completely separated from the form. Wire services have been doing this forever.


"Serious media"? What does that even mean? I guess all those articles I read with mostly text and a few inline images/videos aren't serious.


"The separation of structure and formatting is not practical for serious media of any kind."

The separation of structure and formatting is not practical for serious advertising platforms of any kind.


"You can't separate form and content." -- as the author, but you should think about the reader.

They may not be on the device you chose to support.


That would be terrific. Can you imagine the decrease in data costs vs. increase in usability / experience once a few popular services were available?


Or, you know, people could just write webpages that worked without loading 10s of megs of (useless) assets.


Agreed. I had to create a 'modern' design website for a client with all the fullscreen images included. I hate those websites but had to do the job. So I decided to do it all myself. And guess what: 80% compression for a fullscreen image is enough, people don't notice it. All animations were done in CSS. And then I wrote some small JavaScript for the hide-on-scroll, parallax images and other effects.

Kept the page size below 1MB, server time around 0.3s (ProcessWire), onload below 3s.

It's still not optimal, but when you dump all those bloated JavaScript libraries and compress images, a page should not be 10MB.


"Love" the latest fad there, where JS is used to turn a site into a flip book as you scroll down. This by having new "content" come up and cover the old stuff as if you were flipping pages.


Accelerated Mobile Pages is kind of along those lines. https://www.ampproject.org/


Beat me to it; I'm working on an implementation of this myself currently. I like the idea so far.


AFAIK there really isn't a story for clients to configure how to view the content. It is really just a way of removing bloat on pages, so they render faster on mobile.


Given the way that AMP standardizes markup, it's easy for a client to do whatever it wants instead of using the "official" javascript.


The question that all the writers are asking is: in such a world, how do I make a living wage and put food on the table?


By writing things that are worth paying for?


Did you pay for any content recently? What kind of content was it?


I paid for download links to Louis C.K.'s latest thing, Horace and Pete.

And too many Kindle books...

I haven't ever paid for any blog-type content, but I probably would if someone marketed it right.

With advertising, I don't see why everyone's assuming it would be impossible on a plain text medium.

I listen to podcasts that have sponsors. They just talk about their sponsors in the middle of the show. They try to make it sincere ("I use this product myself" etc). It seems to me like this is a very high quality type of advertising, compared to blockable little ad server banners.


I pay for ebooks and audiobooks mostly. I've never been interested in any of that blog-type content and don't really consume any of it for free, other than what's posted to HN. It may as well stop existing for all I care to be honest. And even without ads there'll always be all kinds of blogs because sharing is caring.


I don't think a streamlined content delivery model would necessarily preclude ad serving.


Capitalism's default answer is retraining...


The correct phrase is "how do I put food on my family?". You'd think "a writer" would know better.


The protocol is very simple (delimited with CR+LF). It's a kind of extended "ls".
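
For illustration, a menu response is just CR+LF-delimited lines of tab-separated fields -- item type plus display string, selector, host, port -- terminated by a line containing a single dot. Something like this (tabs shown as <TAB>; the selectors here are made up):

    1Ask MetaFilter<TAB>/ask<TAB>gopher.metafilter.com<TAB>70
    0FAQ<TAB>/faq.txt<TAB>gopher.metafilter.com<TAB>70
    .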


Isn't that what CSS is supposed to achieve?


You're trying to solve a social problem with technology. That can work sometimes, but I seriously doubt it will work here.

The problem with this is that the content producers have incentives to add content they get paid for, and people who pay to have others shove content in front of readers have an incentive to make that content as eye-catching as possible.

So first it's just text. Then it's bold and underline, both of which have legitimate uses in real content but both of which are also obviously useful for ads. Then it's blinking, and to hell with light-sensitive epileptics. Then it's all over.

The alternative is the old radio trick of weaving all of the content together as tightly as possible, so it all gets consumed in one big gulp. That has implications for long-term credibility, but that's a long-term problem beyond the time horizon of the bills your financial people just got today, and then there's payroll...


You could just use 1993-style HTML...


So, html then?


With tables and frames for layout!


Tables? Frames? That stuff didn't exist in 1993.

(Tables first arrived in Mosaic 2.0 Alpha 8 in December 1994, according to http://www.barrypearson.co.uk/articles/layout_tables/history...)


That pales in comparison to the real innovation in ~94: the blink tag. So legit.


It works :-)

http://gopher.floodgap.com/gopher/gw?a=gopher%3A%2F%2Fgopher...

A post from Cameron Kaiser explaining some features of gopher:

http://gopher.floodgap.com/overbite/relevance.html

Something lightweight like gopher and markdown is enough for a lot of people.


From the relevance link:

"by divorcing interface from information, Gopher sites stand and shine on the strength of their content and not the glitz of their bling."

Forcing content creators to focus on the strength of their content is pretty much a non-starter for most modern commercial enterprises. Which is why it's amazingly useful, but likely futile.


The departure of the modern web from text-with-links to full heterogeneous application suites that run in an elaborate VM has been wonderful for users of desktop applications, in that they no longer need to maintain several stacks of software, but just their browser.

However - those who enjoyed text + links were left behind. I really just wanted text with some links. Thinking about setting up my own site as a gopher site, just don't know the best way to proxy it back to HTTP.

I think Pocket, Instapaper, Readability etc. were all essentially peeks into a style of web we could have had, where content presentation was something a user has decent control over.


There's Lynx--the text web-browser. [1]

Someone wrote up their experience using Lynx on the web around 2012. [2] Conclusion: "Not all the sites are usable with Lynx, but many of them offer at least basic functionality using the text-only web browser."

As long as there's HTML output, a more humane web browser could allow better customization of the information; it could infer the usual human organizational methods (lists, metrics, groupings, buttons, etc.) and present them with your favorite background color, font size, line length, image size, videos, etc., based on the hierarchical structure. The other camp is web apps, which is a different use case.

[1] http://lynx.invisible-island.net/

[2] http://royal.pingdom.com/2012/06/25/using-web-browser-lynx-v...


IMO, elinks[1] does a much better job of rendering web content in a terminal. Google actually looks pretty good, and gmail is not too bad either.

[1] http://elinks.or.cz/


Not only is gmail not too bad, in some ways, it's better[0], imo.

[0] https://news.ycombinator.com/item?id=11023851


Elinks is my preferred browser in the terminal; it also has gopher support. DuckDuckGo looks very good with Elinks by the way!


Has elinks development stopped? I vastly preferred it to lynx, but it seems unmaintained now.


Looks like there's limited activity still ongoing.

http://repo.or.cz/w/elinks.git


>However - those who enjoyed text + links were left behind. I really just wanted text with some links.

I'm not sure why this can't be done with HTTP? I have a Neocities page which is nothing but text and links, with just enough formatting to be slightly better than the default.

Examples (NSFW language as obvious from the URLs but otherwise SFW):

http://motherfuckingwebsite.com/

http://bettermotherfuckingwebsite.com/

(Note: Above two examples aren't my sites.)


This is actually addressed in the 'relevance' link above, and was the preamble to the text that I quoted.

"On the Web, even if such a group of confederated webmasters existed, it requires their active and willful participation to maintain such a hierarchical style and the seamlessness of that joint interface breaks down abruptly as soon as one leaves for another page. Within Gopherspace, all Gophers work the same way and all Gophers organize themselves around similar menus and interface conceits. It is not only easy and fast to create gopher content in this structured and organized way, it is mandatory by its nature. Resulting from this mandate is the ability for users to navigate every Gopher installation in the same way they navigated the one they came from, and the next one they will go to."

Simply put, it can't reasonably be done with HTTP/HTML because it requires active management (which as I pointed out is basically a non-starter), instead of being 'baked in' like it is with Gopher.


You could use a browser plug-in that required pages with a particular attribute to conform to a Gopher-like schema.


> ... full heterogeneous application suites that run in an elaborate VM has been wonderful for users of desktop applications...

Since the announcement of WebAssembly, I have a scenario in mind. As you say, the VM can become a complete hypervisor that can run a full OS. At that point, the browser will become useless and we could get rid of all the complexity of CSS and HTML and close that episode of the internet.

Another point: for AI and robots, beautiful design is not a priority.


...and then inside that full OS, what are you going to use to replace what CSS and HTML do now?


Gopher :-)


:) Might want to put SSL around it, though. And then strictly forbid anyone from reimplementing an HTML browser inside the browser OS!


I'm sure that the convenience of a single application is attractive, but it doesn't seem to be worth it. With standalone applications, I never need to worry about a company yanking it out from under me, or forcing an update that I don't want.


On the other hand, you do have to worry about installing it on Windows/Mac/Debian/Fedora/Arch/FreeBSD/Darwin.

And worry about it deleting your root file system (https://github.com/MrMEEE/bumblebee-Old-and-abbandoned/commi...).

I agree with your points, but we should remember why the web got so popular: a safe, seamless software distribution mechanism. Recent OS's have approached this. (https://xkcd.com/1367/)


Gopher + text based advertisements? Maybe the answer to ad-blocking is a system with restricted display options.


Yes, you're right. It's not sexy for business but for Wikipedia (for example) it can be enough.

It can also be a tool for programmers or geeks (like markdown). It's more of a niche.


Also, hard to show ads there. This is why RSS failed too.


Hmm, I was just thinking about a return to gopher, in the past couple of years. Guess I wasn't the only one.

I've also been thinking about NNTP. Usenet never went away, entirely, but maybe it will be having a little renaissance, as well. Old-school distributed, independent networking. And again, where content rules over graphic design.

P.S. That second part is NOT a poke at MetaFilter, which I rather like -- although, I guess I've been away from it for a while, now.

But, I'm interested in communities that are orthogonal to any one particular platform, or perhaps I should say, host. Among other things.


Usenet never went away, entirely, but maybe it will be having a little renaissance, as well.

It sorta came back as Reddit. I was a huge Usenet junkie from the first time I saw it, but it eventually turned into 99% binaries and junk. All the interesting discussions in the areas I was involved in went to sites like HN, Metafilter, Reddit, Web-based forums, etc.


And IRC came back as Slack.

It seems like we are going through a period where everything that used to be its own protocol is now recreated using HTTP and JSON.

Hell, it may well be that the web browser has become the new X server.


I don't think IRC went away. More like IRC was an old crusty alcoholic down on its luck, Ms. XMPP (once a cool chick) was having a need-a-pregnancy crisis in middle age, so went to a sperm bank, made some bad jokes about plugins, and wound up getting pregnant. Shortly after that, XMPP met a VC who thought the child was its own. Thus we have the test tube baby called Slack, who takes after its father in capitalist tendencies, and mother in pragmatism.


IRC is still heavily in use in open source development and gaming circles. You can even use it to implement things like Twitch.tv chat. The great thing with open protocols is that everyone can write clients and servers for them.

I've never used Slack but their front page alone is enough for me to turn away. The first two words I see are "Product" and "Pricing". Nothing on the page hints at an open protocol, so I guess it is proprietary.


I've never used Slack, so I have no idea why people would use it instead of IRC, but they do have IRC and XMPP gateways available: https://get.slack.help/hc/en-us/articles/201727913-Connectin...


Reddit's not the same, though -- it's centralised, for a start.

A really interesting (and probably really hard) project would be a proper decentralised Usenet replacement, with proper identity and reputation control. It might actually be something that blockchains could be good at, if you could avoid the requirement to replicate the entire database on every machine.

It might be feasible to have a single blockchain for identity and reputation, and then have each 'newsgroup' have its own, referring to it; that way I should be able to reduce the disk and bandwidth footprint to something reasonable.

Coming up with an appropriate costing system so that spammers were priced out without also requiring real money to actually use the thing would be the trickiest part, I think.


Usenet was arguably more centralised, limited, and vastly smaller in scale than Reddit. (I know whereof I speak: I was there.)

I've been trying to come up with some metrics for the size of "traditional" Usenet -- Big 8 hierarchy, say, early 1990s. Gene "spaf" Spafford thought that 50k - 500k users was probably in the right ballpark.

Note that to gain access you needed to be student or faculty at a research university, or work for one of a handful of tech companies (and almost certainly within their engineering divisions), or for a government agency with access. Very limited independent options existed.

The Usenet Cabal who managed things, such as they were managed, was about the same size as Reddit's technical staff, if that.

And the system proved highly vulnerable not only to spam and crap, but to users not acculturated into the system itself, and behavior protocols. The Eternal September was a thing for a reason.


When Reddit was suffering through its censorship/Ellen Pao phase, I couldn't help but think what everyone really wanted was usenet. Decentralized, not owned by anyone, not censored, use any client you want, be anonymous, etc.


I think what really killed Usenet in the US was Andrew Cuomo's crusade against it. During the period from 1993 to 2008, I had Usenet from most of the ISPs I used. The ISPs started pulling the plug in 2008 or 2009. I was using text groups up until then.

These days, I've found a free provider for my text group access, of which there are several.


There are still more than 10,000 public NNTP servers around, though I haven't looked at the trajectory to know whether the numbers are increasing or decreasing:

https://www.shodan.io/report/kNLypDbO

Here's the actual search query:

https://www.shodan.io/search?query=port%3A119


A Seattle City Council candidate was running a gopher site last year. It is incredible. It feels like the road we should have evolved along, rather than HTML.

Ah well. I should turn my own site into gopher!


That might have been my bad :-) I'm still paying Digital Ocean $5/mo for its hosting AND REGRET NOTHING. Alon was a really great sport for letting me put it up. We even put an Easter egg up where anyone who read the open invitation on the gopher site would be treated to free drinks at a post Reddit AMA party. Sadly, we had few takers. Oh well, we and some nerdy friends poured a few out.

I really hope Alon will run again at some point so I can jump back into action.


I'm a volunteer for The MADE, and as part of our preservation efforts, we maintain a (simplistic) gopher server (thanks to pyGopher). It's pretty neat bringing a computer old enough to drive back to life, getting an IP address, and downloading our logo via gopher.

http interface: http://themade.org:70/ (there's a proper gopher server as well, of course)

Gopherspace ain't dead!


Gopher was the coolest thing on the internet when I first came online with the advice of The Whole Internet Catalog [1]. Eventually, I found Project Gutenberg and gopher became the second coolest thing.

[1]: a printed book!


I have my copy of the Whole Internet Catalog here with me at all times. You never know when you may want to reminisce about The Well.


https://tools.ietf.org/search/rfc1436

They may not quite suit the modern world very well, but I like protocols that I can use via telnet/nc

   nc gopher.metafilter.com 70
   <hit enter>


It's just the navigation that's been converted. Even though metafilter pages are almost all just text + links, they haven't been converted from HTML. So you still need a web browser.


I'm not sure but I think floodgap.com can convert the web pages to text with elinks or w3m.


Lynx can render gopher pages itself, as well as the <html> format content linked to lower levels of the MetaFilter site mentioned in the article.

Below is an 'ascii screenshot'

                                    Gopher Menu



            __  __      _        _____ _ _ _
           |  \/  | ___| |_ __ _|  ___(_) | |_ ___ _ __
           | |\/| |/ _ \ __/ _` | |_  | | | __/ _ \ '__|
           | |  | |  __/ || (_| |  _| | | | ||  __/ |
           |_|  |_|\___|\__\__,_|_|   |_|_|\__\___|_|

     (DIR) MetaFilter
           sharing and discussing neat stuff on the web
     (DIR) Ask MetaFilter
           asking questions and getting answers
     (DIR) FanFare
           pop culture discussion -- TV, movies, podcast, books
     (DIR) Projects
           creative work by MetaFilter community members
     (DIR) Music
           original musical and audio recordings by MeFites
     (DIR) Jobs
           employment opportunities and member availabilities
     (DIR) IRL
           organizing meetups and community events in real life
     (DIR) MetaTalk
           where the community talks about MetaFilter itself
     (DIR) FAQ
           frequently asked questions


    Commands: Use arrow keys to move, '?' for help, 'q' to quit, '<-' to go back.
      Arrow keys: Up and Down to move.  Right to follow a link; Left to go back.
     H)elp O)ptions P)rint G)o M)ain screen Q)uit /=search [delete]=history list

Back in the mid 90s I was dialling into a GreenNet shell account and running lynx to surf gopher and http content seamlessly, all for the price of a phone call to a London number.


Sure, but lynx is a web browser. If you use a real gopher client, clicking on any of those links will open into a browser rather than in the gopher client.


Yes, for the real retro experience, people might want to try a graphical gopher client, complete with the little folder icons. Lynx just happened to already be installed.


Next up on HN... "GOfurr, a gopher client written in Go" ;)


I'm actually halfway done implementing a gopher library in Node.js. I think it works, but I haven't really tested it yet (was going to build an actual client and try to use it, so I can see how well it works and how well the interfaces work in practice). Have at it, if you want:

https://github.com/twhaples/gopher-client

Not that I believe it'll really be useful for anything. :b

My real plan is to write a web-to-gopher gateway that actually looks nice (because the existing ones look like they're from 1995) and then use it to host my homepage / blog / CV before my next job search (in London!) for the retro-hilarity factor.


Heh, was actually looking for something like this and found some on an old reddit thread: https://www.reddit.com/r/golang/comments/jrl9d/a_gopher_serv...


Gopher was already a historical curiosity fifteen years ago. So this is a nostalgic recreation of a retro project.

On another note, I think I remember web browsers used to support gopher natively. I wonder when it was removed from the last major browser.


It was removed from Firefox in Firefox 4:

https://bugzilla.mozilla.org/show_bug.cgi?id=388195


And it was me who did it!


Well, apparently now there's demand to bring it back with this new resurgence of Gopher :)


Any hint of guilt there?


Why are you limiting my functionality in 2016!!! You mean I have to download an ADDON to make this work?

I'm submitting a bug report right now.


At least you don't have to download an add-on to use Pocket ;)


In the discussion, Paul Bausch (the site coder) notes they are using PyGopherd:

http://gopher.quux.org:70/devel/gopher/pygopherd

https://github.com/jgoerzen/pygopherd


It's so fast!!

Now, if only there was a way to browse Reddit and HN using 'tin' or 'nn'.


I have too many projects; one that I shelved was "Reddit-as-a-filesystem", because I was messing around with Dokany [1] at the time.

The same kind of thinking (Reddit->FUSE->Files) could be applied to a Reddit -> Gopher proxy. To reduce the cost of the project, I'd make it self-host the Gopher server on the user's PC and make the API calls from there, rather than setting up gopher-hackernews.xyz:70

My gut feeling is that the slowest part of this would be the API call to HN [2] or Reddit [3].

[2]: https://github.com/HackerNews/API

[3]: https://www.reddit.com/dev/api

[1]: https://github.com/dokan-dev/dokany
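
To make that concrete, here's a rough Python sketch of the self-hosted approach for HN. The endpoints come from the API docs in [2]; the type-h "URL:" selector convention for web links is a de facto client convention rather than part of RFC 1436, so treat that bit as an assumption:

    # Sketch: serve the HN front page as a gopher menu on localhost.
    import json, socket, urllib.request

    API = "https://hacker-news.firebaseio.com/v0"

    def fetch(path):
        # One HTTP round trip per call -- this is the slow part, as predicted.
        with urllib.request.urlopen(f"{API}/{path}") as r:
            return json.load(r)

    def menu(n=15):
        lines = []
        for sid in fetch("topstories.json")[:n]:
            item = fetch(f"item/{sid}.json")
            url = item.get("url", f"https://news.ycombinator.com/item?id={sid}")
            # "h" items with a "URL:" selector point at web links by convention
            lines.append(f"h{item['title']}\tURL:{url}\tlocalhost\t7070")
        return "\r\n".join(lines) + "\r\n.\r\n"

    server = socket.create_server(("127.0.0.1", 7070))
    while True:
        conn, _ = server.accept()
        with conn:
            conn.recv(1024)               # read the selector; one menu fits all
            conn.sendall(menu().encode())

Point a gopher client at gopher://localhost:7070/ and the front page comes back as a menu.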


Instead of using API calls for user navigation, you can keep a synchronized local mirror of whatever subreddits the user is interested in.

I have a repository somewhere with a thing that does that, through Reddit's .json URLs, mirroring subreddits into Git repositories full of JSON files, and then serving a minimal local web interface with real time updates. I'll make a note to clean it up and publish it...


That's a very clever idea.

I've been playing with FreeBSD's ports system this afternoon. There's something to be said for using the filesystem instead of a database, API calls or whatever.



How do people read HN comments on their phone? On Android, neither Firefox nor Chrome seem to do this correctly. Each line of text scrolls off to the right, there is no reformatting to the screen. So I zoom in to a readable size, and now half the text is not visible off the right so I have to navigate right to read then left then right then left. It's terrible. Why is this so awful?


I just use the https://cheeaun.github.io/hackerweb url. It formats the whole thing pretty nicely.



Report this to the HN staff.

HN used to not be mobile friendly at all, but recently they added the viewport meta tags and now it works very well on iOS. I now prefer vanilla HN over any other HN client on iOS.


There's an awesome app for Windows Phone called 'Hacky News'. I guess there are similar apps for Android and iPhone.


It sometimes reflows for me. But only if I hold it sideways.


<3


Why are people so obsessed with Gopher? The protocol is abysmal.


It's because even a shitty protocol can result in a qualitatively great experience. Whether that's because of, or in spite of, the protocol is open to discussion.


Explain


There's an RFC that goes over it but basically, you open a connection, send a pathname, and then the server responds with the file, which is basically the original HTTP spec (without the GET/DELETE/etc verbs).

Oh, but it also differentiates between text and binary files. Binary files are sent as-is, text files are sent line by line with a \r\n terminator. .\r\n indicates the end of file. Leading .s must therefore be doubled. And then directories are sent in another format (type/name, path, hostname, port -- tab separated).

Which means the client has to know beforehand if it's a text or binary file or a directory. How does it know that? Well, there's a 1-character code at the start of the filename which gives the filetype. 0 is a file, 1 is a directory, 9 is a binary file. There are also other filetypes which will make you reminisce about the 80s (uuencoded file, BinHexed file, tn3270 session, etc.).

HTTP (once headers and MIME types were added) is a better protocol. Maybe people pine for the days when you didn't need 2 megabytes of javascript to punch the monkey, but that's not due to the underlying transport protocol.

https://www.ietf.org/rfc/rfc1436.txt
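
For the curious, the client side fits in a few lines. A Python sketch of a fetch with the dot-handling described above (menus and text files both end with the lone-dot line, so this works for either; binary files would just be read raw):

    import socket

    def gopher_fetch(host, selector="", port=70):
        # The entire protocol: connect, send selector + CRLF, read to EOF.
        with socket.create_connection((host, port)) as s:
            s.sendall(selector.encode("ascii") + b"\r\n")
            data = b""
            while chunk := s.recv(4096):
                data += chunk
        lines = []
        for line in data.decode("utf-8", "replace").split("\r\n"):
            if line == ".":              # a lone dot ends the reply
                break
            if line.startswith(".."):    # leading dots arrive doubled
                line = line[1:]
            lines.append(line)
        return "\n".join(lines)

    # Root menu of the server from the article:
    print(gopher_fetch("gopher.metafilter.com"))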


Sounds like in this day and age you can tag everything as either 0 or 9, and leave it to the client to figure it all out.

The whole text vs binary thing reminds me of FTP, where most clients default to requesting everything as binary transfers.

Edit: MIME is no cure-all; I wonder how many times I have seen Firefox get royally confused because the MIME type was wrong.


Even if you limit it to text/binary 0/9 you still have problems.

1. your url looks like gopher://dipstick.io/0file or gopher://watchingpaintdry.museum/9folder/file. The filetype is part of the url, but only the client is aware of it -- it's not passed to the server.

2. When using a URL, the client has one idea of the file type but it does not necessarily match what the server thinks the file type is.

3. Error messages. HTTP has a status code to indicate the file doesn't exist (or was moved, etc). Gopher can send the error message back as the payload but... is that in text format or binary format? The server has no idea what format the client expects.

Always sending binary data and using out-of-band status codes and file type just keeps life simpler.
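
Point 1 is easy to see in code. A sketch of the client-side split -- the gopher URL layout is gopher://host[:port]/<type-char><selector>, per RFC 4266 -- using the example URL above:

    from urllib.parse import urlparse

    def split_gopher_url(url):
        # gopher://host[:port]/<type-char><selector>
        u = urlparse(url)
        path = u.path or "/1"
        item_type = path[1:2] or "1"   # a bare host means "give me the menu"
        selector = path[2:]            # the only part the server ever sees
        return u.hostname, u.port or 70, item_type, selector

    print(split_gopher_url("gopher://dipstick.io/0file"))
    # ('dipstick.io', 70, '0', 'file')

The type character never leaves the client, which is exactly the mismatch problem in point 2.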


I don't even know what the Gopher protocol is. I know Go.


Go is not a protocol. Go is a language.


I know it is not. I am saying most of the time Gopher reminds me of the language Go.


I guess MetaFilter kind of misses the point of gopher here, since they just serve HTML documents on the gopher server, with <a href> links to the normal web.

Example:

http://gopher.floodgap.com/gopher/gw?gopher://gopher.metafil...

What is the point of serving that on gopher? :)


Gopher is the ultimate separation of content from presentation. You write the content, I'll read it in the style of my choosing.

Flashy ads and social network dropins need to go away. A good content author can figure out how to work sponsorships and other revenue generation into their work. I don't mind reading through someone's text based plug for a product, it creates a stronger focus on writing quality content rather than link-baiting.


I'm waiting for someone to kick up Archie.


Gopher really does feel like it would be a match made in heaven for Freenet or some kind of Bittorrent FS.


I say gopher it!


But is the connection encrypted? :)



